Modular Closes $30M Seed Round to Simplify AI Development Process

AI has transformative potential. But if you ask the co-founders of Modular, a startup that emerged from stealth today, the software used to develop it is “monolithic”: fractured into silos heaped with layers of complexity. Major tech companies have made helpful contributions through frameworks such as TensorFlow and PyTorch, AI development platforms backed by Google and Facebook, respectively. But according to Modular’s co-founders, these companies are prioritizing their own tools and infrastructure at the expense of advances in AI.

Modular aims to change that. Founded by former Apple and Google engineers and executives, the company today closed a $30 million seed round led by GV (formerly Google Ventures), with participation from Greylock, The Factory and SV Angel, to realize its vision of a streamlined, vendor-independent platform for developing AI systems.

“The industry struggles to support and scale fragmented, custom tool chains that vary across research, training and deployment, server and edge,” Modular CEO Chris Lattner told TechCrunch in an email interview. “Many tech companies, large and small, naively believe that the open source community and the open source infrastructure owned by Google, Meta and Nvidia will eventually deliver, even when those companies’ priorities and constraints show otherwise.”

Lattner has an impressive résumé: he spearheaded the creation of Swift, the programming language on which much of the Apple ecosystem is built. He was previously vice president of Tesla’s Autopilot self-driving division and president of engineering and product at SiFive, which supplies intellectual property to chip design companies. During his time at Google, Lattner managed and built a number of AI-related products, including TPUs at Google Brain, one of Google’s AI research divisions, and TensorFlow.

Modular’s other co-founder, Tim Davis, has a strong track record of his own, having helped define the vision, strategy and roadmaps for Google’s machine learning products across small research teams and large production systems. From 2020 to early 2022, Davis led development of Google’s machine learning APIs, compilers and runtime infrastructure for servers and edge devices.

Image Credits: Modular

“The most pressing issue facing companies outside of ‘big tech’ is how to bring AI into production given the productivity, cost, time and talent involved. The opportunity cost here is enormous. For individual companies, it means innovations that never make it to market, a lower-quality product and, ultimately, a hit to their bottom line,” said Lattner. “AI can change the world, but only once fragmentation is eliminated and the global developer community can focus on solving real problems rather than on the infrastructure itself.”

Modular’s solution is a platform that unifies popular AI framework interfaces through modular, “composable” common components. The details are still a bit murky (this is just the beginning, Lattner cautioned), but the goal is to let developers plug in dedicated hardware to train AI systems, deploy those systems to edge devices or servers, and otherwise “seamlessly scale [the systems]” across hardware, so that moving the latest AI research into production “just works,” Lattner said.

By one description, Modular fits into the emerging category of MLOps vendors, which provide tools for collecting, labeling and transforming the data needed to train AI systems, along with workflows for developing, deploying and monitoring AI. MLOps, short for “machine learning operations,” seeks to streamline the AI lifecycle by automating and standardizing development workflows, much as DevOps did for software.

“The founders of Modular are on a familiar path – reimagining the underlying infrastructure with a modular programming layer, prioritizing simplicity and usability,” he told TechCrunch when contacted for comment. “The hardware and software ecosystems that support AI have reached a familiar inflection point, where enthusiasm for new capabilities has bred complexity and fragmentation that are ripe for simplification.”

Driven by the rapid adoption of artificial intelligence, the global market for MLOps solutions will be worth $4 billion by 2025, up from $350 million in 2019, predicts analyst firm Cognilytica. A Forrester survey found that 73% of companies believe adopting MLOps will keep them competitive, and 24% believe it will make them industry leaders.

“Modular’s main competitor is the mindset that dominates AI software development within Big Tech, and Big Tech itself,” Lattner said. “These companies succeed at implementing AI because they assemble armies of developers and incredibly talented AI experts, and use their massive computing and financial resources to advance their own efforts and products, including their own AI clouds and hardware. Despite their incredible contributions to the field, their self-confidence highlights the deep chasm in AI and limits the rest of the world’s ability to use this technology to solve some of our biggest socioeconomic and environmental challenges.”

Lattner, without naming names, says Modular is already working with “some of the biggest [firms] in technology.” In the near term, the focus is on growing Modular’s 25-person team and preparing the platform for launch in the coming months.

“Changing economic conditions mean the world’s largest AI companies, which are spending billions on AI, are focused on productionizing and monetizing it rather than polishing it,” Lattner said. “Many of the best and brightest people in computer science (essentially 100x engineers in organizations where 10x engineers are the norm) struggle just to keep these systems working for major use cases, most of them revenue-optimization projects rather than world-changing ones. As a result, technical decision makers are looking for more usable, flexible and performant infrastructure that makes it easier to develop and deploy AI end to end and to move AI research into production faster. In effect, they just want much more value from AI at a lower deployment cost.”


Credit: techcrunch.com
