ServiceNow, Hugging Face, and NVIDIA have released StarCoder2, a new family of large language models for code generation designed for developers.

“The state-of-the-art open-access model improves on prior generative AI performance to increase developer productivity and provides developers equal access to the benefits of code generation AI, which in turn enables organizations of any size to more easily meet their full business potential,” said Harm de Vries, lead of ServiceNow’s StarCoder2 development team and co-lead of BigCode.

StarCoder2 was trained on more than 600 programming languages from The Stack v2, along with natural-language sources such as GitHub issues and text from Wikipedia and Arxiv. The models use grouped-query attention and a context window of 16,384 tokens, with a sliding attention window of 4,096 tokens. The 3B and 7B models were trained on over 3 trillion tokens, and the 15B model on over 4 trillion tokens.
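With sliding-window attention, each token attends only to the most recent positions rather than the full context. A minimal sketch of such a causal sliding-window attention mask, for illustration only (toy sizes, not StarCoder2’s actual implementation):

```python
import numpy as np

def sliding_window_mask(seq_len: int, window: int) -> np.ndarray:
    # True where query token i may attend to key token j:
    # causal (j <= i) and within the window (j > i - window).
    i = np.arange(seq_len)[:, None]
    j = np.arange(seq_len)[None, :]
    return (j <= i) & (j > i - window)

# Toy sizes; StarCoder2 uses a 4,096-token window over a 16,384-token context.
mask = sliding_window_mask(seq_len=8, window=4)
print(mask.sum(axis=1))  # each row attends to at most 4 positions
```

Capping attention to a fixed window keeps the per-token attention cost constant as the context grows, which is what makes the long context window practical.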

According to its creators, StarCoder2 aims to provide developers with features such as code generation, workflow generation, and text summarization.

The project to develop StarCoder2 was led by ServiceNow and Hugging Face. StarCoder2 is available in three sizes: ServiceNow built the 3 billion-parameter model, Hugging Face the 7 billion-parameter model, and NVIDIA the 15 billion-parameter model.

The smaller models are designed to deliver strong performance while requiring far less compute. According to the companies, the 3 billion-parameter model matches the performance of the 15 billion-parameter model from the original StarCoder release.
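For readers who want to try the models, a minimal sketch of prompting the 3 billion-parameter model through the Hugging Face transformers library (this assumes the checkpoint is published under the `bigcode/starcoder2-3b` repo id and that transformers and torch are installed; it is not an official quickstart from the release):

```python
# Sketch: code completion with StarCoder2 via Hugging Face transformers.
# The repo id and generation settings below are assumptions for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "bigcode/starcoder2-3b"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
completion = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(completion)
```

The 7B and 15B checkpoints can be swapped in by changing the repo id; the 3B model is the most practical starting point on modest hardware.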


Editor @ DevStyleR