
Essential Techniques for Deep Learning to avoid Overfitting #4

Open · 3 tasks done
sumankanukollu opened this issue Apr 26, 2023 · 6 comments
Labels
talk-proposal (New Talk for Bangpypers Meetup) · virtual (Available Virtually)

Comments

@sumankanukollu

Title of the talk

Essential Techniques for Deep Learning to avoid Overfitting

Description

In deep learning, overfitting, where a model fits its training data too closely and fails to generalize to unseen data, is a common problem that can hinder model performance. To address this issue, practitioners regularize their models with a variety of techniques, including dropout, data augmentation, early stopping, L1 and L2 regularization, and batch normalization.

In this talk, we will explore the essential techniques for avoiding overfitting in deep learning, and discuss their benefits and limitations.

By the end of this session, participants will have a solid understanding of the essential techniques for avoiding overfitting in deep learning, and will be able to apply these techniques to their own projects and research.
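
As a concrete taste of the techniques listed above, here is a minimal sketch of how three of them are typically wired into a model. It assumes PyTorch (the proposal does not name a framework, so this is only one possible setting), and the layer sizes and hyperparameter values are illustrative, not recommendations.

```python
# Minimal sketch (PyTorch assumed): dropout, batch normalization,
# and L2 regularization in a small feed-forward classifier.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(784, 256),
    nn.BatchNorm1d(256),  # normalizes activations across the batch
    nn.ReLU(),
    nn.Dropout(p=0.5),    # randomly zeroes 50% of activations during training
    nn.Linear(256, 10),
)

# In PyTorch, L2 regularization is applied via the optimizer's weight_decay.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)

# An L1 penalty, by contrast, is usually added to the loss by hand, e.g.:
#     l1 = sum(p.abs().sum() for p in model.parameters())
#     loss = task_loss + 1e-5 * l1

model.train()  # enables dropout and per-batch statistics in batch norm
# ... training loop goes here ...
model.eval()   # disables dropout; batch norm switches to running statistics
```

Note the train/eval switch: dropout and batch normalization behave differently at training and inference time, which is a frequent source of bugs.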

What format do you have in mind for your talk?

Talk

Table of contents

Some potential questions to explore during the session include:

  1. What is overfitting, and how does it affect the performance and generalization of deep learning models?
  2. How does dropout work, and what are some best practices for using it effectively?
  3. What are some common data augmentation techniques, and how can they help improve the performance and generalization of models?
  4. What is early stopping, and how can it be used to prevent overfitting during training? (A sketch combining this with data augmentation follows this list.)
  5. How do L1 and L2 regularization work, and how do they differ from each other?
  6. What is batch normalization, and how can it help prevent overfitting in deep learning models?
  7. How do these techniques fit into the broader landscape of deep learning regularization, and what are some emerging trends and challenges in this area?
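
To make questions 3 and 4 concrete ahead of the session, below is a small sketch of a data-augmentation pipeline plus an early-stopping loop. It again assumes PyTorch/torchvision; `fit_with_early_stopping`, `train_one_epoch`, and `validate` are hypothetical names introduced here (one epoch of training, and mean loss on a held-out validation set), and the patience value is illustrative.

```python
# Sketch (PyTorch/torchvision assumed; `train_one_epoch` and `validate`
# are hypothetical placeholders for the caller's training/validation code).
import copy
from torchvision import transforms

# Data augmentation: random transforms make each epoch see slightly
# different inputs, which discourages memorization of the training set.
# This would be passed to a dataset, e.g. torchvision.datasets.CIFAR10(
# root, train=True, transform=train_transform).
train_transform = transforms.Compose([
    transforms.RandomHorizontalFlip(),
    transforms.RandomCrop(32, padding=4),  # e.g. for 32x32 CIFAR-style images
    transforms.ToTensor(),
])

def fit_with_early_stopping(model, train_one_epoch, validate,
                            max_epochs=100, patience=5):
    """Stop once validation loss has not improved for `patience` epochs."""
    best_loss = float("inf")
    best_state = None
    epochs_without_improvement = 0
    for epoch in range(max_epochs):
        train_one_epoch(model)       # one pass over the (augmented) training data
        val_loss = validate(model)   # loss on a held-out validation set
        if val_loss < best_loss:
            best_loss = val_loss
            best_state = copy.deepcopy(model.state_dict())
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                break                # validation loss has plateaued: stop
    if best_state is not None:
        model.load_state_dict(best_state)  # roll back to the best checkpoint
    return model
```

The key design choice in early stopping is that the decision is driven by held-out validation loss, never training loss; restoring the best checkpoint rather than the last one is what actually undoes the overfitting.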

What domain would you say your talk falls under?

Data Science & Machine Learning

Duration in minutes (including Q&A)

60

Prerequisites

  • Technical requirements: Familiarity with deep learning concepts
  • Target audience: Deep learning practitioners and researchers seeking to improve model performance and generalization; above-intermediate level.

Speaker bio

https://www.linkedin.com/in/suman-kanukollu/

The talk/workshop speaker agrees to

sumankanukollu added the talk-proposal (New Talk for Bangpypers Meetup) label on Apr 26, 2023
anistark (Member) commented May 8, 2023

Hi @sumankanukollu
Are you based in Bangalore, or would you need to travel?
If you can travel, would you be available to present on the morning of the 20th this month?
We're also designing a poster this time as an experiment. Would you be open to sharing your picture to be put up on the poster?

sumankanukollu (Author) commented May 8, 2023 via email

anistark (Member) commented Jul 4, 2023

Hi @sumankanukollu
Are you willing to present this at the July meetup?

sumankanukollu (Author) commented Jul 5, 2023 via email

anistark (Member) commented Jul 5, 2023

This one's a physical meetup, unfortunately. If we do a virtual event, we'll reach out. Thanks.

anistark added the virtual (Available Virtually) label on Jul 5, 2023
sumankanukollu (Author) commented Jul 5, 2023 via email
