
Serokell

Originally published at serokell.io

Machine Learning Trends for 2023

Machine learning and artificial intelligence drive major innovations across many industries. The AI market is predicted to reach $500 billion in 2023 and $1,597.1 billion by 2030, which means that machine learning technologies will remain in high demand for the foreseeable future.

However, the machine learning industry evolves rapidly: new technologies and scientific research define how new products and services are built. At the end of 2022, everyone from machine learning engineers to startup founders is on the lookout for the most promising trends for the coming year. If you want to learn about some of the hottest among them, read on.

Machine learning technology trends

We can never predict with 100% certainty which technologies will be in demand next year, since innovations appear every day. But here are some of the most promising machine learning trends for 2023, based on what we saw in 2022.

1. Foundation models

Large language models are an important recent innovation that has gained popularity and is likely to stay with us in the near future. Foundation models are artificial intelligence models trained on immense amounts of data, even compared to regular neural networks.

Engineers try to achieve a new level of understanding by teaching machines not just to search for patterns but also to accumulate knowledge. Foundation models are incredibly helpful in content generation and summarization, coding and translation, and customer support. Well-known examples of foundation models are GPT-3 and MidJourney.

An amazing thing about foundation models is that they scale quickly and can work with data they have never seen before, hence their impressive generative capabilities. Leading providers of these solutions are NVIDIA and OpenAI.
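
For instance, here is a minimal sketch of generating text with GPT-3 through the OpenAI Python client (the pre-1.0 Completion interface that was current when this article was written); the API key, prompt, and parameters below are placeholders rather than a recommended setup:

```python
import openai

# A minimal sketch of text generation with GPT-3 via the OpenAI Python client
# (pre-1.0 interface). The API key and prompt are placeholders.
openai.api_key = "YOUR_API_KEY"

response = openai.Completion.create(
    model="text-davinci-003",
    prompt="Summarize the main benefits of foundation models in two sentences.",
    max_tokens=80,
)
print(response.choices[0].text.strip())
```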

2. Multimodal machine learning

In tasks such as computer vision or natural language processing, where the model interacts with the real world, it often has to rely on only one type of data, be it images or text. But in real life, we perceive the world around us through many senses: sight, hearing, smell, taste, and touch.

Multimodal machine learning exploits the fact that the world around us can be experienced in multiple ways (called modalities) to build better models. The term “multimodal” in AI describes ML models that can perceive an event in several modalities at once, just like humans do.

A multimodal model can be built by combining different types of information and using them together in training, for example, matching images with audio and text labels to make them easier to recognize. Multimodal machine learning is still a young field that will continue to develop in 2023, but many believe that it could be key to achieving general AI.
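
As an illustration, here is a minimal late-fusion sketch in PyTorch: it assumes the image and text features have already been produced by separate encoders (the dimensions and class count are made up), projects each modality into a shared space, and concatenates them for a joint prediction:

```python
import torch
import torch.nn as nn

class LateFusionClassifier(nn.Module):
    """Toy multimodal model: fuse pre-computed image and text embeddings."""

    def __init__(self, image_dim=512, text_dim=256, hidden_dim=128, num_classes=10):
        super().__init__()
        self.image_proj = nn.Linear(image_dim, hidden_dim)
        self.text_proj = nn.Linear(text_dim, hidden_dim)
        self.classifier = nn.Sequential(
            nn.ReLU(),
            nn.Linear(hidden_dim * 2, num_classes),
        )

    def forward(self, image_emb, text_emb):
        # Project each modality into a shared space, then concatenate and classify.
        fused = torch.cat([self.image_proj(image_emb), self.text_proj(text_emb)], dim=-1)
        return self.classifier(fused)

model = LateFusionClassifier()
logits = model(torch.randn(4, 512), torch.randn(4, 256))  # a batch of 4 examples
print(logits.shape)  # torch.Size([4, 10])
```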

3. Transformers

Transformers are a neural network architecture that performs transduction (or transformation) of an input sequence of data into another sequence, using an encoder and a decoder. Many foundation models are also built on transformers, but we wanted to point them out separately since they are used in many other applications. In fact, it is reported that transformers are taking the AI world by storm.

A kind of sequence-to-sequence (Seq2Seq) model, transformers are widely used in translation and other natural language processing tasks. Because they analyze whole sequences of words rather than individual words in isolation, they generally show better results than ordinary artificial neural networks.

Rather than simply taking all the words in a sentence and translating them one by one, a transformer model assigns weights that reflect the importance of each word in the sequence. The model then produces a sentence in the target language that takes the assigned weights into account. Some of the leading solutions that can help you build transformer pipelines are Hugging Face and Amazon Comprehend.
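
To give a feel for how little code this takes, here is a small sketch using the Hugging Face transformers library; the translation pipeline below downloads a default English-to-French checkpoint, and both the task and the example sentence are purely illustrative:

```python
from transformers import pipeline

# English-to-French translation with a pre-trained transformer.
# The "translation_en_to_fr" task uses a default seq2seq checkpoint;
# a specific model can be passed via the `model=` argument instead.
translator = pipeline("translation_en_to_fr")
result = translator("Transformers weigh the importance of every word in the sentence.")
print(result[0]["translation_text"])
```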

4. Embedded machine learning

Embedded machine learning (or TinyML) is a subfield of machine learning that enables machine learning models to run directly on resource-constrained devices.

TinyML is used in household appliances, smartphones, laptops, smart home systems, and more. As Lian Jye Su, AI & ML Principal Analyst at ABI Research, explains:

The proliferation and democratization of AI has fueled the growth of Internet of Things (IoT) analytics. Data collected from IoT devices are used to train Machine Learning (ML) models, generating valuable new insights into the IoT overall. These applications require powerful and expensive solutions that rely on complex chipsets.

The rising popularity of embedded machine learning systems is one of the major drivers of the chipset manufacturing industry. Ten years ago, the number of transistors on a chip doubled roughly every two years, in line with Moore’s law, which also made the growth in computational power predictable; in the last few years, we have instead seen leaps of 40-60% per year. We believe this tendency will persist in the coming years.

With the wider proliferation of IoT technologies and robotics, embedded systems have gained even more importance. TinyML poses its own unique challenges that are yet to be resolved in 2023, since it requires maximum optimization and efficiency while conserving resources.
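
As a rough illustration of the kind of workflow involved, here is a sketch that converts a toy Keras model to TensorFlow Lite with default post-training quantization so it can be deployed on a small device; the model itself is a placeholder, not a recommended architecture:

```python
import tensorflow as tf

# A toy Keras model standing in for whatever network you want to deploy.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Convert to TensorFlow Lite with default post-training quantization,
# which shrinks the model and speeds up inference on constrained hardware.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```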

5. Low-code and no-code solutions

Machine learning and AI have penetrated practically every field, from agriculture to marketing to banking. Making ML solutions easy for non-technical employees to use is often seen by managers as key to maintaining the efficiency of the whole organization.

However, instead of putting staff through the long and costly process of learning to program, it’s much easier to simply choose apps that require zero or close-to-zero coding skills. And this is not the only issue no-code solutions are likely to solve.

Gartner has found that the demand for high-quality solutions outstrips the market’s capacity to deliver them: demand “grows at least 5x faster than IT capacity to deliver them”. No-code and low-code solutions can help bridge this gap and satisfy the demand. Similarly, low-code solutions enable tech teams to come up with and test their hypotheses faster, reducing time-to-delivery and development costs. Whereas 10 years ago it took a whole team to build an application or launch a website, today a single person can do the same, and do it fast.

Moreover, 82% of organizations have difficulty attracting and retaining software engineers in the quality and quantity they need, and are therefore willing to build and maintain their apps with the help of no-code and low-code techniques.

While many low-code and no-code solutions have appeared in recent years, they are generally still inferior in quality to regular development. Startups that manage to improve this situation will win in the AI market.

Finally, it is worth mentioning that, with the rapidly increasing computational power required to train ML models (especially for real-time ML in large organizations), cloud computing remains an important technology underpinning these innovations. According to statistics, about 60% of the world’s corporate data is stored in the cloud, and this share is likely to grow. In 2023, we will see increased investment in cloud security and resilience to satisfy the growing needs of the ML industry.

Top technological segments for ML in 2023

Gartner has identified the technological segments expected to see the greatest machine learning presence over the next 7-8 years. Among the leading areas they mention are:

  • Creative artificial intelligence. AI used to generate text, code, and even images and video gained wide popularity in 2022, especially with the release of state-of-the-art image generation models such as MidJourney, DALL-E 2, and Stable Diffusion, and the new text-davinci-003 model by OpenAI. Products and services that use generative AI for fashion, creativity, and marketing will be in high demand in 2023.
  • Distributed enterprise management. With remote work becoming the norm, companies have had to look for new ways to manage their workforce and maintain efficiency. According to Gartner, ML will help distributed companies grow and increase their income.
  • Automation. Autonomous software systems that can take on increasingly complicated tasks and adapt to quickly changing conditions are in high demand in many industries, from security to banking. New innovations enabling smarter automation will appear in 2023.
  • Cybersecurity. The importance of cybersecurity grows every year with the increasing digitalization of various areas of life and the need to protect sensitive information. ML and AI are believed to be crucial for protecting private data and securing organizations.

Conclusion

In 2023, machine learning will continue to be a promising and rapidly growing field that will present many interesting innovations. Large language models, multimodal machine learning, transformers, TinyML, and no-code and low-code solutions are among the emerging technologies that will gain considerable importance in the near future.

Some of the technological segments that will increasingly use ML in 2023 are creative AI, autonomous systems, distributed enterprise management, and cybersecurity. Gartner predicts that in 2023, ML will penetrate even more business fields, helping to increase efficiency and the security of work.

If you want to keep up with the latest news and gain inspiration from leading professionals in the ML industry, stay tuned to our blog and follow us on Twitter.
