GCSFeedStorage requires google-cloud-storage
Package: scrapy

Exception class: unittest.SkipTest
Raise code
import unittest

from scrapy.extensions.feedexport import GCSFeedStorage
from scrapy.utils.test import get_crawler


class GCSFeedStorageTest(unittest.TestCase):

    def test_parse_settings(self):
        try:
            from google.cloud.storage import Client  # noqa
        except ImportError:
            # Skip the test when the optional google-cloud-storage
            # dependency is not installed in the environment.
            raise unittest.SkipTest("GCSFeedStorage requires google-cloud-storage")

        settings = {'GCS_PROJECT_ID': '123', 'FEED_STORAGE_GCS_ACL': 'publicRead'}
        crawler = get_crawler(settings_dict=settings)
        storage = GCSFeedStorage.from_crawler(crawler, 'gs://mybucket/export.csv')
        assert storage.project_id == '123'
        assert storage.acl == 'publicRead'
        assert storage.bucket_name == 'mybucket'
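
The skip is the test guarding itself against a missing optional dependency: google.cloud.storage is probed with an import, and unittest.SkipTest is raised when that import fails, so the runner reports the test as skipped rather than failed. Below is a minimal sketch of the same guard written once at module level with the unittest.skipUnless decorator; the class and test names are illustrative, not scrapy's own code.

import unittest

# Probe the optional dependency a single time at import.
try:
    from google.cloud.storage import Client  # noqa: F401
    HAS_GCS_CLIENT = True
except ImportError:
    HAS_GCS_CLIENT = False


class GCSGuardExample(unittest.TestCase):

    @unittest.skipUnless(HAS_GCS_CLIENT,
                         "GCSFeedStorage requires google-cloud-storage")
    def test_needs_gcs(self):
        # Reached only when google-cloud-storage is importable; otherwise
        # the runner marks the test as skipped instead of failed.
        self.assertTrue(HAS_GCS_CLIENT)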
Links to the raise (3):
https://github.com/scrapy/scrapy/blob/ee682af3b06d48815dbdaa27c1177b94aaf679e1/tests/test_feedexport.py#L422
https://github.com/scrapy/scrapy/blob/ee682af3b06d48815dbdaa27c1177b94aaf679e1/tests/test_feedexport.py#L436
https://github.com/scrapy/scrapy/blob/ee682af3b06d48815dbdaa27c1177b94aaf679e1/tests/test_feedexport.py#L453
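
Installing the missing dependency (pip install google-cloud-storage) is what lets GCSFeedStorage be exercised instead of skipped. The following is a hedged sketch of project settings that would route a feed export through GCS, reusing the setting names that appear in the test above; the bucket name, export path, and project id are placeholders, not values from the source.

# settings.py sketch; assumes google-cloud-storage is installed and that
# the gs:// path and project id below are replaced with real values.
FEEDS = {
    "gs://mybucket/export.csv": {"format": "csv"},
}
GCS_PROJECT_ID = "123"
FEED_STORAGE_GCS_ACL = "publicRead"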