Hello,

I went through the DEMO and thought I could download all the queries in a group, but in practice I got this error:

('No valid queries ids could be extracted', 'MyGroup')

In bwresources.py, the docstring for get_mention says:

def get_mention(self, **kwargs): Retrieves a single mention by url or resource id. This is ONLY a valid function for queries (not groups), which is why it isn't split out into bwdata.
Well, is there a way to download the data from all queries in a group in one call? Or at least interact with a list of queries?
Thanks!
To download the data from all queries in a group in one call, use the get_mentions() function. You can achieve this in the DEMO by editing the first get_mentions() call in the 'Downloading Data' section to run on groups instead of queries, passing in a group name.
There are two similarly named functions, which I think you've confused above:

- get_mentions() retrieves a list of mentions, works on both queries and groups, and is defined in bwdata.py.
- get_mention() retrieves a single mention, looked up by url or resource id, works only on queries, and is defined in bwresources.py.
Here, you want to use the get_mentions() function.
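A minimal sketch of what that looks like, assuming the BWProject/BWGroups setup from the DEMO; the credentials, project name, group name, and dates below are placeholders you'd replace with your own:

```python
# Minimal sketch, assuming the api_sdk layout shown in the DEMO.
# The username, password, project, group name, and dates are placeholders.
from bwproject import BWProject
from bwresources import BWGroups

project = BWProject(username="user@example.com",
                    password="YOUR_PASSWORD",
                    project="YOUR_PROJECT")

# BWGroups shares get_mentions() via bwdata, so a single call covers
# every query in the named group.
groups = BWGroups(project)

mentions = groups.get_mentions(name="MyGroup",
                               startDate="2017-01-01",
                               endDate="2017-02-01")

print(len(mentions), "mentions downloaded")
```

If the group name can't be resolved to a set of query ids, you'll get an error like the one you saw above, so it's worth double-checking that the group exists in the project you authenticated against.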