Big Data has traditionally been associated with the high cost of managing it, as well as users’ inability to extract meaningful or timely insight. However, forward-looking organizations are flipping this paradigm on its head: they’re using faster processing, new techniques, and larger datasets to better understand past events and more effectively forecast the future. As CTOs, we must learn and deploy these new technologies to leverage Big Data.
Michael Brown, CTO at comScore, shared how comScore manages and processes large volumes of data. Mike walked through the implementation and the changes that were made to support the rollout of comScore’s third-generation product.
Michael Brown was a founding member of comScore, Inc. in 1999. He leads the company’s technology efforts to measure Internet and digital activities. In this position, he helped the company build one of the world’s largest decision support systems, which currently holds over 1.5 trillion rows of data online and captures over 120 billion new rows every week to measure the Internet in over 45 countries. He has also been responsible for seventeen patent applications at comScore, three of which have already been issued by the US Patent and Trademark Office.