[Feat] Add Otter to OpenCompass MMBench Evaluation #232
Conversation
# model settings
otter_9b_mmbench_model = dict(
    type="otter-9b",
    model_path="luodian/OTTER-Image-MPT7B",  # noqa
This path should be changed
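A sketch of how the config might look after this change, with a neutral placeholder path instead of a user-specific HuggingFace repo id. The `prompt_constructor` and `post_processor` entries are assumptions modeled on other OpenCompass multimodal configs such as MiniGPT-4's, not the actual merged config:

```python
# Hypothetical revision of the snippet above: a neutral local path
# replaces the user-specific HuggingFace repo id. The constructor and
# post-processor type names are assumptions, not the real registry keys.
otter_9b_mmbench_model = dict(
    type="otter-9b",
    model_path="/path/to/OTTER-Image-MPT7B/",  # local checkpoint directory
    prompt_constructor=dict(type="OTTERMMBenchPromptConstructor"),
    post_processor=dict(type="OTTERMMBenchPostProcessor"),
)
```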
configs/multimodal/tasks.py (Outdated)
# evaluators = [minigpt_4_mmbench_evaluator, otter_9b_mmbench_evaluator]
# load_froms = [minigpt_4_mmbench_load_from, otter_9b_mmbench_load_from]
models = [otter_9b_mmbench_model]
please revert these changes
configs/multimodal/tasks.py (Outdated)
evaluators = [otter_9b_mmbench_evaluator]
load_froms = [otter_9b_mmbench_load_from]
num_gpus = 2
These configurations should also be reverted.
        return output_text

    def generate(self, batch):
        images = [image.unsqueeze(0) for image in batch["inputs"]]
You should write a custom prompt constructor and post processor, following those in MiniGPT-4.
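Following that suggestion, here is a minimal sketch of what such a prompt constructor might look like, modeled loosely on the MiniGPT-4 one. The class name, label strings, and sample field names are all assumptions for illustration, not the actual OpenCompass API:

```python
class OTTERMMBenchPromptConstructor:
    """Hypothetical prompt constructor for Otter on MMBench.

    Builds an Otter-style instruction prompt from a data sample's
    question/options/context fields. The label strings and field
    names are assumptions based on typical MMBench samples.
    """

    def __init__(self,
                 user_label: str = "<image>User:",
                 ai_label: str = "GPT:<answer>"):
        self.user_label = user_label
        self.ai_label = ai_label

    def __call__(self, data_sample: dict) -> str:
        question = data_sample.get("question", "")
        options = data_sample.get("options", "")
        context = data_sample.get("context")
        parts = [self.user_label]
        if context:
            parts.append(context)
        parts.extend([question, options, self.ai_label])
        # Join the non-empty pieces into one instruction-style prompt.
        return " ".join(p for p in parts if p)
```

For example, `OTTERMMBenchPromptConstructor()({"question": "What animal is shown?", "options": "A. cat B. dog"})` yields a single prompt string wrapped in the user/answer labels.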
configs/multimodal/tasks.py (Outdated)
datasets = [minigpt_4_mmbench_dataloader]
evaluators = [minigpt_4_mmbench_evaluator]
load_froms = [minigpt_4_mmbench_load_from]
models = [minigpt_4_mmbench_model, otter_9b_mmbench_model]
Please do not make any changes to this file after you evaluate your model
        self.prompt_constructor = mmengine.registry.build_from_cfg(
            prompt_constructor, MM_MODELS)

    def post_process(self, output_text):
You should create a new post processor class; please refer to the implementation of MiniGPT-4.
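A minimal sketch of what that post processor class might look like: extract the answer option letter from the raw generated text, as MiniGPT-4-style post processors do. The class name, the end-of-chunk marker, and the regex-based extraction are assumptions for illustration:

```python
import re


class OTTERMMBenchPostProcessor:
    """Hypothetical post processor: pull the answer option letter
    out of the raw generated text. The regex-based extraction and
    the end-of-chunk marker are assumptions modeled on
    MiniGPT-4-style post processors, not the actual implementation.
    """

    def __call__(self, output_text: str) -> str:
        # Drop everything after an end-of-turn marker, if present.
        output_text = output_text.split("<|endofchunk|>")[0].strip()
        # Look for an option letter such as "A", "B." or "(C)".
        match = re.search(r"\(?([A-D])\)?\.?\b", output_text)
        return match.group(1) if match else output_text
```

For example, the processor maps a raw generation like `"B. a dog"` down to the option letter `"B"` for MMBench scoring.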
Please rebase this branch onto the latest main branch, and fix the comments above.
Hi, I changed it, please check again!
# model settings
otter_9b_mmbench_model = dict(
    type="otter-9b",
    model_path="luodian/OTTER-Image-MPT7B",  # noqa
Please change it to /path/to/OTTER-Image-MPT7B/; don't write a path containing your username.
configs/multimodal/tasks.py (Outdated)
models = [minigpt_4_mmbench_model]
datasets = [minigpt_4_mmbench_dataloader]
evaluators = [minigpt_4_mmbench_evaluator]
load_froms = [minigpt_4_mmbench_load_from]
Please don't modify this file; it is just an example.
        self.post_processor = mmengine.registry.build_from_cfg(
            post_processor, MM_MODELS)

    def generate(self, batch):
Please follow the other multimodal algorithm code and write a forward method to control the mode.
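A minimal sketch of the requested forward dispatch, following the pattern other OpenCompass multimodal wrappers use: a mode attribute that chooses between generation and loss computation. The slimmed-down class below is hypothetical and only illustrates the dispatch; the real wrapper would call into the Otter model:

```python
class Otter:
    """Hypothetical slimmed-down wrapper illustrating mode dispatch,
    not the real OpenCompass Otter class."""

    def __init__(self, mode: str = "generation"):
        assert mode in ("generation", "loss")
        self.mode = mode

    def forward(self, batch):
        # Dispatch on mode, as other multimodal wrappers in the repo do.
        if self.mode == "generation":
            return self.generate(batch)
        return self.loss(batch)

    def generate(self, batch):
        # Placeholder: the real method would run Otter's generate().
        return ["stub answer" for _ in batch["inputs"]]

    def loss(self, batch):
        raise NotImplementedError("loss mode is not used for MMBench eval")
```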
Hi, I made the changes accordingly!
Sorry! I've resolved the lint problem, please run the workflow again~
* add otter model for opencompass mmbench
* add docs
* add readme docs
* debug for otter opencompass eval
* delete unused folders
* change to default data path
* remove unused files
* remove unused files
* update
* update config file
* flake8 lint formatted and add prompt generator
* add prompt generator to config
* add a specific postprocess
* add post processor
* add post processor
* add post processor
* update according to suggestions
* remove unused redefinition
Thanks for your contribution; we appreciate it a lot. The following instructions will make your pull request healthier and help it get feedback more easily. If you do not understand some items, don't worry; just make the pull request and seek help from the maintainers.
Motivation
The main motivation behind this PR is to integrate the Otter model into the OpenCompass MMBench evaluation, making it usable and testable within the platform. Additionally, efforts were made to provide clearer documentation for potential users and contributors.
Modification
The following modifications were made:
Added the Otter model for OpenCompass MMBench.
Provided documentation for the added features.
Added a README to give users a comprehensive understanding of the new addition.
Debugged the evaluation process for Otter in OpenCompass.
Removed unnecessary folders to streamline the repository structure.
BC-breaking (Optional)
As of now, there's no indication that these changes break backward compatibility. Downstream projects should continue to function as expected.
Use cases (Optional)
Users wanting to leverage the Otter model within OpenCompass MMBench for benchmarking or other relevant tasks.
Developers or contributors seeking a clear understanding of the Otter model's integration through the provided documentation.
Checklist
Before PR:
After PR: