Intel Investing in Big Data to Make Chip Testing Faster

Posted on Mar 8 2013 - 1:57pm by Martin Gicheru

Big Data is clearly going to play a huge role in delivering working products to customers, where data from research and customer feedback needs to be structured quickly and correctly. Intel is one company that relies on processing this data to stay ahead of its competition, and it aims to cut costs while at it. Big Data will be the enabler: a platform built around Apache Hadoop is expected to see Intel reduce processor test costs by $30 million.

Chip testing time will also go down by 25%, contributing to the reduced production costs. Quality checks on a single chip are rigorous, with a whole series of tests to be run. The new platform has been used to combine historical manufacturing information with new sources of data that were previously too unmanageable to use, enabling a five-person team to cut testing costs by $3 million on a single line of Intel Core processors.
Extracting insight from unstructured data can be quite a task, often requiring the help of a data scientist, and Intel is working with the Apache community to develop a hybrid cloud infrastructure around Hadoop. With this, the Apache project gains an understanding of how to utilise Intel hardware features, while Intel collects feedback on Hadoop support requirements.
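
The article does not describe Intel's actual jobs, but a typical workload on such a platform is a batch MapReduce pass over raw test logs. Below is a minimal sketch in Java (Hadoop's native API) that averages the duration of each test step across all logged runs; the input layout (chipId,testName,durationMs,result) and all class names here are hypothetical assumptions for illustration, not Intel's real schema.

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Hypothetical example: average chip-test duration per test step.
public class TestDurationJob {

    // Mapper: parses "chipId,testName,durationMs,result" log lines
    // and emits (testName, durationMs) for each record.
    public static class DurationMapper
            extends Mapper<LongWritable, Text, Text, LongWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context ctx)
                throws IOException, InterruptedException {
            String[] f = value.toString().split(",");
            if (f.length < 4) return;          // skip malformed lines
            try {
                ctx.write(new Text(f[1]), new LongWritable(Long.parseLong(f[2])));
            } catch (NumberFormatException e) {
                // ignore records with a non-numeric duration field
            }
        }
    }

    // Reducer: averages the durations per test so slow steps stand out.
    public static class AvgReducer
            extends Reducer<Text, LongWritable, Text, LongWritable> {
        @Override
        protected void reduce(Text key, Iterable<LongWritable> vals, Context ctx)
                throws IOException, InterruptedException {
            long sum = 0, n = 0;
            for (LongWritable v : vals) { sum += v.get(); n++; }
            ctx.write(key, new LongWritable(sum / n));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "avg chip-test duration");
        job.setJarByClass(TestDurationJob.class);
        job.setMapperClass(DurationMapper.class);
        job.setReducerClass(AvgReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(LongWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

Compiled against the Hadoop 2.x mapreduce API, a job like this would be submitted with something along the lines of "hadoop jar chiptests.jar TestDurationJob /logs/raw /logs/avg-durations" (paths illustrative). The appeal is that the same few dozen lines scale from one test line to a whole fab's worth of logs, which is where savings on the scale the article describes become plausible.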

Source: Techworld

About the Author

Martin Gicheru is the Managing Editor at Techweez. He has a passion for consumer technology and the ecosystems around it. You will find him interested mostly in mobile phones, mobile operating systems, application stores, social media and internet business, and he does the mobile phone reviews on this site. He loves watching animation movies and series like Garfield. Follow him at @martingicheru on Twitter and Google+; he doesn't bite.
