"description": "Enabling AI consumes a great deal of energy. Many companies are now\nturning their attention toward productionizing machine learning models,\nyet the energy efficiency of these models is rarely treated as a main\nquality requirement. In this talk, we outline some energy efficiency\nguidelines that you can start using today!\n\nDeploying machine learning models into production can carry a high\nenergy cost. For instance, training and hyper-parameter tuning occupy\nsubstantial hardware resources for long periods. Even so, most companies\ndo not consider energy efficiency metrics among the main quality\nattributes of machine learning pipelines.\n\nIn this talk, we walk you through some example machine learning\npipelines and use them to demonstrate a couple of energy efficiency\napproaches. We explore inefficiency sweet spots across the life cycle of\nthe models by examining the utilization rates of cloud resources.\nFinally, we evaluate the effectiveness of our energy efficiency\napproaches with regard to other efficiency metrics, such as performance.\n\nData scientists, machine learning engineers, and data engineers can\nbenefit from this talk by reusing the guidelines in similar use cases.\nThis talk aims to emphasize the importance of \"energy efficiency by\ndesign\" in the data science field.\n",