
data_loader #4

Closed
duducode opened this issue Dec 20, 2019 · 18 comments

Comments

@duducode
[screenshot of the error]

@WenmuZhou
Owner

your root path should be DBNet.pytorch

@zhouguanghui001

Hello, I'm running into the following problem. Could you help take a look?

```
File "/home/zhouguanghui/code/DB1/train.py", line 76, in <module>
    main(config)
File "/home/zhouguanghui/code/DB1/train.py", line 60, in main
    trainer.train()
File "/home/zhouguanghui/code/DB1/base/base_trainer.py", line 103, in train
    self.epoch_result = self._train_epoch(epoch)
File "/home/zhouguanghui/code/DB1/trainer/trainer.py", line 46, in _train_epoch
    for i, batch in enumerate(self.train_loader):
File "/home/zhouguanghui/anaconda3/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 560, in __next__
    batch = self.collate_fn([self.dataset[i] for i in indices])
TypeError: 'NoneType' object is not callable
```
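For context, this message is exactly what Python raises when the loader's `collate_fn` ends up as `None` and is then called on a batch. A minimal sketch (`build_batch` and the sample dicts are hypothetical stand-ins for the DataLoader internals, just to reproduce the message):

```python
# Hypothetical stand-in for the DataLoader internals: if the config leaves
# collate_fn as None and the loader calls it anyway, Python raises the
# same TypeError seen in the traceback above.
def build_batch(collate_fn, samples):
    return collate_fn(samples)  # collate_fn may be None here

try:
    build_batch(None, [{"img": 1}, {"img": 2}])
except TypeError as e:
    print(e)  # 'NoneType' object is not callable
```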

@WenmuZhou
Owner

Your dataset is empty.

@zhouguanghui001
The data is there; it's the iterator that throws the error.

@zhouguanghui001
```python
        data['text_polys'] = data['text_polys'].tolist()
        # return data

        if len(self.filter_keys):
            data_dict = {}
            for k, v in data.items():
                if k not in self.filter_keys:
                    data_dict[k] = v
            return data_dict
        else:
            return data
```
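For readers following along: the `filter_keys` branch above just drops the listed keys before returning the sample. A quick illustration with made-up keys and data:

```python
# Illustration only: mimic the filter_keys logic above with made-up data.
filter_keys = ['img_path', 'texts']
data = {'img_path': 'a.jpg', 'texts': ['hi'], 'img': [[0, 1]]}

# Equivalent to the loop in the snippet: keep only keys not in filter_keys.
data_dict = {k: v for k, v in data.items() if k not in filter_keys}
print(data_dict)  # {'img': [[0, 1]]}
```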

@zhouguanghui001
This is where it errors out, right when returning the value.

@zhouguanghui001
zhouguanghui001 commented Dec 23, 2019

When I debug, the image data is all there, but it just can't be returned.

@duducode
Author

[screenshot]
I ran predict.py from the root directory.

@WenmuZhou
Owner

WenmuZhou commented Dec 24, 2019

@duducode Try the predict script I just uploaded; a quick web search would also solve this kind of problem.

@TALQinYong
> When I debug, the image data is all there, but it just can't be returned.

So how did you solve this problem?


@TALQinYong
@zhouguanghui001 Did you ever solve this problem?

@TALQinYong
@WenmuZhou

```
Traceback (most recent call last):
  File "tools/train.py", line 77, in <module>
    main(config)
  File "tools/train.py", line 61, in main
    trainer.train()
  File "/share/yongqin/DBNet.pytorch/base/base_trainer.py", line 103, in train
    self.epoch_result = self._train_epoch(epoch)
  File "/share/yongqin/DBNet.pytorch/trainer/trainer.py", line 46, in _train_epoch
dict_keys(['img_path', 'img_name', 'text_polys', 'texts', 'ignore_tags', 'img', 'shape', 'threshold_map', 'threshold_mask', 'shrink_map', 'shrink_mask'])
    for i, batch in enumerate(self.train_loader):
  File "/root/miniconda3/envs/py36/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 582, in __next__
    return self._process_next_batch(batch)
  File "/root/miniconda3/envs/py36/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 608, in _process_next_batch
dict_keys(['img_path', 'img_name', 'text_polys', 'texts', 'ignore_tags', 'img', 'shape', 'threshold_map', 'threshold_mask', 'shrink_map', 'shrink_mask'])
    raise batch.exc_type(batch.exc_msg)
TypeError: Traceback (most recent call last):
  File "/root/miniconda3/envs/py36/lib/python3.6/site-packages/torch/utils/data/_utils/worker.py", line 99, in _worker_loop
    samples = collate_fn([dataset[i] for i in batch_indices])
TypeError: 'NoneType' object is not callable
```

Is this problem caused by the PyTorch version?

@TALQinYong
@zhouguanghui001 Not sure if you've solved it yet; is it a PyTorch version problem?

@zhouguanghui001
@TALQinYong It's solved already.

@zhouguanghui001

zhouguanghui001 commented Dec 30, 2019 via email

@chensiyuanlove

chensiyuanlove commented Jan 16, 2020

> (quoting @TALQinYong's traceback above, ending in: TypeError: 'NoneType' object is not callable. "Is this problem caused by the PyTorch version?")

Bro, you need to modify the code below:
```python
def get_dataloader(module_config, distributed=False):
    if module_config is None:
        return None
    config = copy.deepcopy(module_config)
    dataset_args = config['dataset']['args']
    if 'transforms' in dataset_args:
        img_transfroms = get_transforms(dataset_args.pop('transforms'))
    else:
        img_transfroms = None
    # create the dataset
    dataset_name = config['dataset']['type']
    data_path = dataset_args.pop('data_path')

    if 'collate_fn' not in config['loader'] or config['loader']['collate_fn'] is None or len(config['loader']['collate_fn']) == 0:
        # config['loader']['collate_fn'] = None  # this has to change to the line
        # below, otherwise None gets passed straight through to the DataLoader
        config['loader']['collate_fn'] = torch.utils.data.dataloader.default_collate
    else:
        config['loader']['collate_fn'] = eval(config['loader']['collate_fn'])()

    _dataset = get_dataset(data_path=data_path, module_name=dataset_name, transform=img_transfroms, dataset_args=dataset_args)
    sampler = None
    if distributed:
        from torch.utils.data.distributed import DistributedSampler
        # 3) use DistributedSampler
        sampler = DistributedSampler(_dataset)
        config['loader']['shuffle'] = False
        config['loader']['pin_memory'] = True
    loader = DataLoader(dataset=_dataset, sampler=sampler, **config['loader'])
    return loader
```

I'm using pytorch 1.1.0 and cuda 10.0.
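The essential part of that fix can be isolated: never let `None` reach the DataLoader's `collate_fn`; fall back to a default instead. A minimal sketch (`resolve_collate_fn` is a hypothetical helper, and `default_collate` here is a plain stub standing in for `torch.utils.data.dataloader.default_collate`):

```python
# Stub standing in for torch's default_collate (the real one stacks tensors).
def default_collate(batch):
    return batch

def resolve_collate_fn(loader_cfg):
    """Return a callable collate_fn, never None (hypothetical helper)."""
    fn = loader_cfg.get('collate_fn')
    if not fn:  # covers missing key, None, and empty string
        return default_collate
    # The repo instantiates a named collate class here; omitted in this sketch.
    raise ValueError(f'unknown collate_fn: {fn}')

collate = resolve_collate_fn({'collate_fn': None})
print(collate([{'img': 0}]))  # [{'img': 0}]
```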

@WenmuZhou
Owner

I think I can close this
