Stock-Analysis-using-Map-Reduce-and-Hadoop

Used MapReduce on a Hadoop environment at CCR (University at Buffalo) to compute the monthly volatility of about 3,000 stocks, each with roughly three years of data, for a total of about 40,000 input files. Volatility is widely used by traders to identify stocks with higher earning potential. You can find more details about the topic at: http://stockcharts.com/school/doku.php?id=chart_school:technical_indicators:standard_deviation_volatility. The image below shows the performance analysis of small, medium, and large datasets on machines with 12, 24, and 48 cores.

Performance Analysis (figure)
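As a rough illustration of the approach (not the repository's actual code), the job can be sketched in a Hadoop-Streaming style: a mapper keys each price record by (ticker, month), and a reducer computes the standard deviation of the daily returns within each group. The CSV layout `ticker,date,close`, date-sorted input, and the use of daily-return standard deviation as "monthly volatility" are all assumptions for this sketch.

```python
from collections import defaultdict
from math import sqrt

def mapper(lines):
    # Assumed input format: "ticker,YYYY-MM-DD,close", already sorted by date.
    # Emit ((ticker, month), close) pairs, as a streaming mapper would.
    for line in lines:
        ticker, date, close = line.strip().split(",")
        yield (ticker, date[:7]), float(close)

def reducer(pairs):
    # Group closing prices by (ticker, month); Hadoop's shuffle phase
    # would normally do this grouping between mapper and reducer.
    groups = defaultdict(list)
    for key, close in pairs:
        groups[key].append(close)
    for (ticker, month), closes in sorted(groups.items()):
        # Daily rate of return between consecutive closes.
        returns = [(b - a) / a for a, b in zip(closes, closes[1:])]
        if len(returns) < 2:
            yield ticker, month, 0.0
            continue
        # Sample standard deviation of the daily returns = monthly volatility.
        mean = sum(returns) / len(returns)
        var = sum((r - mean) ** 2 for r in returns) / (len(returns) - 1)
        yield ticker, month, sqrt(var)

# Tiny local run standing in for the cluster pipeline:
records = [
    "AAPL,2014-01-02,100.0",
    "AAPL,2014-01-03,110.0",
    "AAPL,2014-01-06,99.0",
]
for ticker, month, vol in reducer(mapper(records)):
    print(ticker, month, round(vol, 4))
```

In a real Hadoop deployment the mapper and reducer would run as separate processes reading stdin and writing stdout, with the framework handling the shuffle, partitioning, and the 40,000-file input split.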
