Microsoft announces SynapseML for .NET for large-scale machine learning

Microsoft announced SynapseML for .NET, building on its open-source project for large-scale machine learning that debuted last November.

This open-source project in turn leverages Apache Spark and SparkML to simplify the creation of scalable machine learning pipelines while enabling new types of machine learning, analysis, and model deployment workflows. Formerly called MMLSpark, it brings many deep learning and data science tools to the Spark ecosystem, such as seamless integration of Spark Machine Learning pipelines with the Open Neural Network Exchange (ONNX), LightGBM, Cognitive Services, Vowpal Wabbit and OpenCV. Microsoft said these tools enable powerful and highly scalable predictive and analytical models for a variety of data sources.

As part of the new SynapseML v0.10 release, Microsoft announced a new set of .NET APIs for massively scalable machine learning.

“It allows you to build, train, and use any SynapseML model from C#, F#, or other .NET family languages with our .NET language bindings for Apache Spark,” the company said in an Aug. 9 blog post.

[Figure] SynapseML in animated action (source: Microsoft).
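
To make that quote concrete, here is a minimal C# sketch of what scoring data with a SynapseML Cognitive Services model might look like through the new bindings. The SparkSession and DataFrame pieces come from the Microsoft.Spark package; the Synapse.ML.Cognitive namespace, the TextSentiment stage and its Set* methods are assumptions based on the bindings' fluent pattern, so exact names may differ from the shipped release.

```csharp
// Minimal sketch (not from the article): scoring text sentiment with a
// SynapseML Cognitive Services transformer from C#.
// Assumes the Microsoft.Spark package plus the SynapseML .NET bindings;
// the Synapse.ML.Cognitive namespace and Set* method names follow the
// bindings' fluent pattern and should be checked against the current release.
using System.Collections.Generic;
using Microsoft.Spark.Sql;
using Microsoft.Spark.Sql.Types;
using Synapse.ML.Cognitive;   // assumed namespace for the Cognitive Services stages

class SentimentExample
{
    static void Main()
    {
        // Start (or attach to) a Spark session via .NET for Apache Spark.
        SparkSession spark = SparkSession.Builder()
            .AppName("SynapseMLSentiment")
            .GetOrCreate();

        // A tiny DataFrame of text to score.
        DataFrame df = spark.CreateDataFrame(
            new List<GenericRow>
            {
                new GenericRow(new object[] { "The services are great!", "en-US" }),
                new GenericRow(new object[] { "This was a terrible day.", "en-US" })
            },
            new StructType(new List<StructField>
            {
                new StructField("text", new StringType()),
                new StructField("language", new StringType())
            }));

        // Configure the Cognitive Services sentiment transformer.
        // Key and region values are placeholders.
        var sentiment = new TextSentiment()
            .SetSubscriptionKey("<cognitive-services-key>")
            .SetLocation("eastus")
            .SetTextCol("text")
            .SetLanguageCol("language")
            .SetOutputCol("sentiment");

        // Score the DataFrame and show the results.
        sentiment.Transform(df).Show();
    }
}
```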

The tool can help developers build scalable and intelligent systems across a wide variety of Microsoft product areas.

“A unified API standardizes many of today’s tools, frameworks, and algorithms, streamlining the distributed ML experience,” Microsoft said last November when announcing the open-source project. “This allows developers to quickly compose disparate ML frameworks for use cases that require more than one framework, such as web-based supervised learning, search engine building, and many more. It can also train and evaluate models on single-node, multi-node, and elastically scalable computer clusters, so developers can scale their work without wasting resources.”
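
The composition story that quote describes can be sketched in the same style. In the hypothetical example below, standard SparkML featurizers from .NET for Apache Spark feed a LightGBM learner exposed through the SynapseML bindings; the Synapse.ML.Lightgbm class and setter names are assumed rather than confirmed, and the same code would run on a single node or an elastically sized cluster depending only on how the Spark job is submitted.

```csharp
// Minimal sketch (not from the article): composing a SparkML featurizer with a
// SynapseML LightGBM learner from C#. The Tokenizer/HashingTF stages come from
// Microsoft.Spark; the LightGBMClassifier class and its Set*/Fit surface are
// assumed from the SynapseML .NET bindings and may differ in the actual release.
using Microsoft.Spark.ML.Feature;
using Microsoft.Spark.Sql;
using Synapse.ML.Lightgbm;    // assumed namespace for the LightGBM stages

static class PipelineSketch
{
    // labeledText is expected to carry "text" (string) and "label" (double) columns.
    public static DataFrame TrainAndScore(DataFrame labeledText)
    {
        // Standard SparkML text featurization via .NET for Apache Spark.
        DataFrame words = new Tokenizer()
            .SetInputCol("text")
            .SetOutputCol("words")
            .Transform(labeledText);

        DataFrame features = new HashingTF()
            .SetInputCol("words")
            .SetOutputCol("features")
            .SetNumFeatures(1 << 18)
            .Transform(words);

        // A distributed LightGBM learner exposed through the SynapseML bindings
        // (class and method names assumed).
        var model = new LightGBMClassifier()
            .SetFeaturesCol("features")
            .SetLabelCol("label")
            .SetNumIterations(100)
            .Fit(features);

        // The cluster size is a deployment concern (spark-submit configuration),
        // not something baked into this code.
        return model.Transform(features);
    }
}
```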

About the Author


David Ramel is an editor and writer for Converge360.