I'll try to explain this; hopefully it makes sense.
> I had installed a virtual env a while ago in the directory:
You can create a virtualenv in any directory:
virtualenv some/directory
# or on Windows:
virtualenv some\directory
Then you can activate that virtualenv, which prepends its bin directory to your $PATH environment variable so that Python-related commands resolve to the virtual environment's copies instead of your system's:
source some/directory/bin/activate
# or on Windows:
some\directory\Scripts\activate
and to deactivate, type:

deactivate
See the official virtualenv documentation for more.
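The create/activate/deactivate cycle above can be sketched end to end. This is a minimal example using Python 3's built-in venv module, which behaves like virtualenv for this purpose; the temporary directory name is just an illustration:

```shell
# Create a virtual environment in a throwaway directory
# (python3 -m venv works like virtualenv here).
VENV_DIR="$(mktemp -d)/demo-env"
python3 -m venv "$VENV_DIR"

# Activate it: the venv's bin/ directory is now first on $PATH,
# so `python` resolves to the venv's interpreter.
. "$VENV_DIR/bin/activate"
command -v python    # prints a path inside $VENV_DIR

# Deactivate to restore the original $PATH.
deactivate
```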
Once the virtualenv is activated, calling scrapy will run the virtualenv's copy of scrapy instead of the system one, and any packages you install via pip (assuming the Python version in your virtualenv includes it) will be installed into the virtual environment.
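You can confirm that pip itself comes from the virtualenv, which is why its installs land in the venv's site-packages rather than the system's. A small sketch, again using the built-in venv module in a temporary directory:

```shell
VENV_DIR="$(mktemp -d)/scrapy-env"
python3 -m venv "$VENV_DIR"
. "$VENV_DIR/bin/activate"

# pip now resolves to the venv's copy...
command -v pip

# ...and the install target is the venv's site-packages directory.
python -c "import sysconfig; print(sysconfig.get_paths()['purelib'])"

deactivate
```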
You can always check which executable will run by using:

$ which scrapy
some/directory/bin/scrapy

# or on Windows:
$ where scrapy
some\directory\bin\scrapy