id2nframe.json file not found #2
Here are the logs I got (with only the slowfast_1.5 folder downloaded in video_db):
I see you shared the azcopy log file. Can you also share the stdout from running the command? Something like this:
I want to check the total bytes transferred to see if it matches what's on the cloud.
Actually `~/azcopy/azcopy cp $BLOB/video_db/vlep.tar $DOWNLOAD/video_db/vlep.tar` gets killed:

```
0 Done, 0 Failed, 1 Pending, 0 Skipped, 1 Total, 2-sec Throughput (Mb/s): 2770.2691
scripts/download_vlep_videodb.sh: line 25: 3710 Killed    ~/azcopy/azcopy cp $BLOB/video_db/vlep.tar $DOWNLOAD/video_db/vlep.tar
```
I suspect the process was killed by the system, possibly an out-of-memory kill. An alternative is to download with wget instead of azcopy:

```shell
DOWNLOAD=$1

for FOLDER in 'video_db' 'txt_db' 'pretrained' 'finetune'; do
    if [ ! -d $DOWNLOAD/$FOLDER ] ; then
        mkdir -p $DOWNLOAD/$FOLDER
    fi
done

BLOB='https://datarelease.blob.core.windows.net/value-leaderboard/starter_code_data'

# video dbs
if [ ! -d $DOWNLOAD/video_db/vlep/ ] ; then
    wget -P $DOWNLOAD/video_db/ $BLOB/video_db/vlep.tar
    tar -xvf $DOWNLOAD/video_db/vlep.tar -C $DOWNLOAD/video_db
    rm $DOWNLOAD/video_db/vlep.tar
fi
```
Actually, the downloading script worked fine on another cluster, probably due to different CPU resources. Thanks for helping!
Good to know! All dev/test results (with CLIP-ViT + Slowfast or ResNet + Slowfast) are included in our paper. ST is the model trained on a single task; AT -> ST means performing all-task training first, then finetuning on the single task.
Closed due to inactivity. |
Hi,
I downloaded the data for one dataset (VLEP) with the corresponding script, but when running train_qa.py with the base config, data loading fails: the file loaded in data.py (lines 60-65), `{img_dir}/id2nframe_{frame_interval:g}.json` or `{img_dir}/id2nframe.json`, seems to be missing (I couldn't find it in the downloaded folders). Would you mind providing it, please?
PS1: it seems that when replacing name2nframe by None, the _compute_nframe function then fails, as the downloaded database seems corrupted: `fnames = json.loads(self.txn.get(key=b'keys').decode('utf-8'))` raises `lmdb.CorruptedError: mdb_get: MDB_CORRUPTED: Located page was wrong type`.
PS2: it also seems that the default config uses `"vfeat_version": "resnet_slowfast"`, but the downloading script only downloads a "slowfast" folder in video_db, not a "resnet_slowfast" one.
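For what it's worth, pointing the config at the features that were actually downloaded might be a temporary workaround (assuming the `vfeat_version` key above accepts the folder name, which I haven't verified):

```json
{
    "vfeat_version": "slowfast"
}
```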
Best,
Antoine Yang