I am scraping a website which returns a list of URLs, writing them out with:

scrapy crawl xyz_spider -o urls.csv

How can I get a fresh urls.csv on each run instead of having scrapy append to the existing file?
Unfortunately scrapy can't do this at the moment.
There is a proposed enhancement on github though: https://github.com/scrapy/scrapy/issues/547
However, you can easily redirect the output to stdout and then send that to a file with the shell:
scrapy crawl myspider -t json --nolog -o - > output.json
-o - tells scrapy to write the feed to "-", which it interprets as stdout; the shell's > redirect then writes stdout into output.json, truncating the file on each run.
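Since > truncates rather than appends, every run starts with an empty file. A quick demonstration of that shell behavior, with no scrapy involved:

```shell
# '>' truncates the target on every redirect, unlike '>>' which appends.
echo "first run"  > output.txt
echo "second run" > output.txt
cat output.txt    # prints only: second run
```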
You can also define an alias that deletes the old file before running scrapy, something like:

alias sc='rm -f output.csv && scrapy crawl myspider -o output.csv'

(rm -f keeps the command from failing on the first run, when output.csv does not exist yet.)
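If you want this for more than one spider, an alias with a hard-coded file name is awkward. A small shell function generalizes the same delete-then-run idea (the name "fresh" is my own helper, not a scrapy feature):

```shell
# Remove a stale output file (ignoring a missing one), then run the given command.
# Usage: fresh output.csv scrapy crawl myspider -o output.csv
fresh() {
    rm -f -- "$1" && shift && "$@"
}
```

The scrapy invocation itself is unchanged; fresh just wraps the "rm -f && command" pattern so it works for any output file and any spider.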