As companies move to leverage machine learning to run their businesses more efficiently, the fact remains that it takes a lot of energy to build, test, and run models in production. Neu.ro, an early-stage, full-stack MLOps startup, is building a greener approach.
Today, the company announced a zero-emissions AI cloud solution built in collaboration with North, a Finnish cloud infrastructure provider.
The company says North is providing a Tier 3, ISO 27001-certified data center running NVIDIA A100-powered DGX and HGX systems. The facility offers 80MW of power capacity and runs entirely on geothermal and hydro power. In addition, thanks to its location near the Arctic, it gets essentially free cooling, giving neu.ro an energy-efficient foundation for customers building machine learning models on its platform.
The company’s co-founder Max Prasolov says that after researching the problem, he found that computing and telecommunications account for about 9 percent of total energy consumption worldwide, a figure his research suggests could double over the next decade. He believes machine learning model building will be a growing part of that, which is why he decided to work closely with North to reduce the company’s carbon footprint.
“We decided to move all of our operations and all of our experiments to a zero-emission cloud. And the goal is not to be carbon neutral, because we [know that we could] buy credits and compensate [our usage]; the question is how [to achieve] zero emissions. We [realized] we spent so much energy and so much computing power training our models for customers, and we understand that’s definitely the biggest carbon footprint [we are producing],” Prasolov said.
Along the way, the company also found ways to build models more efficiently through its software, which in turn reduces the amount of energy required and lets it offer an even more sustainable solution.
In terms of product, the company offers a flexible, cloud-native service: it provides some of the tooling, but leaves enough room for companies to fill in the pieces they think work best for them.
“The way we approach this is instead of trying to build every single tool that’s needed, from data ingestion to monitoring to pipeline engines, etc., we are all about interoperability. We build what hasn’t been built, and we connect with the universe of Kubernetes-based tools that are already out there,” explained Arthur McCallum, the company’s co-founder.
The startup currently has a commercial solution, but it is working on an open source version of the stack that it plans to release soon, probably before the end of the year. The company also aims to bring cloud-based AI solutions to smaller cloud vendors beyond the big three of Amazon, Microsoft and Google, including regional vendors from around the world.
Neu.ro launched in 2019 and released the first version of its solution last year. According to the company, it has raised $2.3 million in seed funding so far.