For the past two decades or so, we have lived in an era of 'Big Data'. With storage capacity and computational power becoming increasingly cheap, we could store and process massive amounts of data to generate new insights. Fueled by the successes of Google, Amazon and Facebook, substantial breakthroughs in large-scale data analysis have been made, and data-driven decision-making has become a top priority for many enterprises.

We have witnessed gigantic neural networks with millions of parameters to tune. Vast streams of social media feeds, processed in real time. Petabytes of fine-grained information extracted from high-frequency sensor and user logs, stored in enormous server farms. The breakthroughs have been plentiful and exhilarating. Such big data trends will persist, no doubt. As long as there is more data to collect, we will find new ways to utilize it. For instance, natural language processing has matured, yet video analysis remains largely a green field, still awaiting technological advances to propel developments.

Nonetheless, there is a world outside of Silicon Valley that tends to be overlooked. Millions of SMEs (and other organizations) deal with problems that beg for comprehensive data solutions. These organizations just want to extract valuable insights from their small data sets - leveraging the state of the art in machine learning - without relying on bizarrely large datasets. To make the potential applications a bit more concrete, consider the following examples:

- Cost accounting: predicting costs for custom-built machines.
- Health care: identifying tumors on X-ray images.
- Manufacturing: automatically detecting defects on a production line.

The relevance of such examples is evident, as is the pivotal role that data plays in them. However, these are not necessarily tasks for which billions of data points are readily available, especially when considering rare defects or diseases. To make the most of modern machine learning, a different angle is needed.

A paradigm shift?

Bring in Andrew Ng. Having founded Google Brain, taught at Stanford, co-founded the online learning platform Coursera (including the extremely popular 'Machine Learning' course), and pioneered the use of GPUs for machine learning, it is safe to say he has some credibility. When he identifies an emerging trend in data science, it pays off to listen.

Andrew argues that - in order to unlock the full potential of artificial intelligence - it is time to start focusing on the quality of data, dubbing the movement data-centric AI.

Manufacturing is one of the fields that may benefit from data-driven defect detection, yet the number of relevant defect examples is often too small for effective machine learning.

In the past years, the community's focus has been model-centric, with an emphasis on designing, fine-tuning and improving algorithms suitable for various tasks (text mining, image recognition, etc.). Model-centric research has been very fruitful, culminating in many high-quality architectures. However, to maintain momentum, designing and improving algorithms alone is not enough. For true progress, the quality of the model's input should match the quality of the transformation. We will revisit data-centric AI in more depth, but first we must address the model-centric AI that currently dominates the field.

In model-centric AI, data is assumed to be a given. Traditionally, data is considered fixed input for the algorithms; the focus is on improving the model, trying to get the best possible performance out of the fixed data set. The main question is which machine learning algorithm squeezes the most out of the data. Do we need gradient-boosted trees or neural networks? How many layers, which activation functions, which gradient descent algorithm? The plethora of options has posed many challenges in identifying suitable architectures. Large data sets allowed models to overcome noisy and missing data, and Andrew Ng postulates that model-centric AI has now reached a point of saturation.
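The model-centric loop - keeping the data set fixed and iterating over candidate models - can be sketched as follows. This is a minimal illustration, not anyone's actual method: the six-point data set is made up, and the two toy "models" (a majority-class baseline and a 1-nearest-neighbour classifier) stand in for real candidates such as gradient-boosted trees or neural networks.

```python
# Model-centric iteration: the data set stays fixed, and we swap models
# to see which one squeezes the most out of it.

# Fixed (hypothetical) data set: (feature vector, label) pairs.
DATA = [
    ((1.0, 1.0), 0), ((1.2, 0.8), 0), ((0.9, 1.1), 0),
    ((3.0, 3.2), 1), ((2.8, 3.1), 1), ((3.1, 2.9), 1),
]

def majority_baseline(train, x):
    """Predict the most common label in the training data (ignores x)."""
    labels = [y for _, y in train]
    return max(set(labels), key=labels.count)

def nearest_neighbour(train, x):
    """Predict the label of the closest training point (1-NN)."""
    def sq_dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(train, key=lambda pair: sq_dist(pair[0], x))[1]

def leave_one_out_accuracy(model, data):
    """Score a model on the fixed data set via leave-one-out evaluation."""
    hits = 0
    for i, (x, y) in enumerate(data):
        train = data[:i] + data[i + 1:]
        hits += model(train, x) == y
    return hits / len(data)

# Same data, different models: the whole tuning effort goes into the model.
for model in (majority_baseline, nearest_neighbour):
    print(model.__name__, leave_one_out_accuracy(model, DATA))
```

On this toy data the nearest-neighbour model wins outright; in practice the loop is the same, only with far more candidate architectures and hyperparameters, which is exactly the search the paragraph above describes.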