We’ve known it for a while, but now it’s certain: the generative AI race is as much a competition for developers as it is for end users.
Case in point: today, Elon Musk’s xAI, a spin-off startup of the social network X that uses its data to train new large language models (LLMs) such as the Grok family, declared its application programming interface (API) open to the public, with $25 per month in free API credits through the end of the year.
Since it’s already November, that’s just 2 months of free credits, or $50 total.
Musk announced that the xAI API was open in beta three weeks ago, but apparently uptake wasn’t enough for his liking, hence the added incentive of free developer credits.
Is $25 per month, with only two months left in the year, really that much of a carrot?
It doesn’t sound like much coming from the world’s richest man, either on a per-user basis or in aggregate, but it may be just enough to entice some developers to at least check out xAI’s tools and platform and start building apps on top of the Grok models.
In particular, xAI’s API costs $5 per million input tokens and $15 per million output tokens, compared to $2.50/$10 for OpenAI’s GPT-4o model and $3/$15 for Anthropic’s Claude 3.5 Sonnet model. Ultimately, this means xAI’s $25 credit won’t get a developer very far: only about two million tokens in and a million out. For reference, a million tokens is roughly the equivalent of 7-8 novels’ worth of words.
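To put the credit in concrete terms, here is a quick back-of-the-envelope calculation at those list prices; the 2M-in / 1M-out split is just an illustrative assumption, not a usage pattern xAI recommends.

```python
# Back-of-the-envelope estimate at xAI's listed grok-beta prices.
INPUT_PRICE_PER_M = 5.00    # dollars per million input tokens
OUTPUT_PRICE_PER_M = 15.00  # dollars per million output tokens
MONTHLY_CREDIT = 25.00      # free credit per month, in dollars

def cost(input_tokens: int, output_tokens: int) -> float:
    """Return the dollar cost for a given number of input and output tokens."""
    return (input_tokens / 1e6) * INPUT_PRICE_PER_M + (output_tokens / 1e6) * OUTPUT_PRICE_PER_M

# Example split: roughly 2M tokens in and 1M out exhausts the $25 credit.
print(cost(2_000_000, 1_000_000))  # -> 25.0
```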
The context limit, or how many tokens can be handled in a single interaction via the API, is around 128,000, the same as OpenAI’s GPT-4o, below Anthropic’s 200,000-token window, and well below the 1-million-token context window of Google’s Gemini 1.5 Flash.
Also, in my brief test of the xAI API, I was only able to access grok-beta, and in text only, with none of the image generation capabilities found in Grok 2 (powered by Black Forest Labs’ Flux.1 model).
New Grok models are coming soon.
According to xAI’s blog post, this is actually “a preview of a new Grok model currently in the final stages of development,” and a new Grok “Vision model will be available next week.”
In addition, xAI notes that grok-beta supports “function calling,” or the ability for the LLM to take commands from the user and access and execute functions of other connected apps and services on the user’s behalf (if the associated app allows such access).
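As a rough illustration, here is a minimal sketch of what that could look like through an OpenAI-compatible Python client pointed at xAI’s endpoint. The base URL, the get_weather tool, and the environment variable name are assumptions for demonstration, not details confirmed in xAI’s announcement.

```python
# Hypothetical sketch: function calling with grok-beta via the OpenAI Python SDK.
# The base URL, tool definition, and env var name below are illustrative assumptions.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["XAI_API_KEY"],  # assumed environment variable name
    base_url="https://api.x.ai/v1",     # assumed xAI endpoint
)

# Describe a function the model is allowed to request a call to.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical app function
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="grok-beta",
    messages=[{"role": "user", "content": "What's the weather in Memphis?"}],
    tools=tools,
)

# If the model chose to call the function, the call (name plus JSON arguments)
# shows up here; the developer then runs the real function and returns the result.
print(response.choices[0].message.tool_calls)
```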
In line with the competition
Furthermore, the xAI account on the social network X posted that the xAI API is “compatible with OpenAI and Anthropic SDKs,” the software development kits used by developers building on those platforms, meaning switching from those models to grok-beta or others on the xAI platform should be relatively easy.
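In practice, that compatibility suggests a switch can be as small as changing the client’s base URL and model name. Below is a minimal sketch using the Anthropic Python SDK under that assumption; the xAI base URL shown is a guess, not a documented value.

```python
# Hypothetical sketch: reusing the Anthropic Python SDK against xAI's API.
# Only the base_url and model name differ from a normal Anthropic call;
# the base URL here is an assumption for illustration.
import os
from anthropic import Anthropic

client = Anthropic(
    api_key=os.environ["XAI_API_KEY"],  # assumed environment variable name
    base_url="https://api.x.ai",        # assumed xAI endpoint
)

message = client.messages.create(
    model="grok-beta",                  # swapped in place of a Claude model name
    max_tokens=256,
    messages=[{"role": "user", "content": "Summarize xAI's API pricing in one sentence."}],
)

# Print the text of the first content block in the response.
print(message.content[0].text)
```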
Musk’s xAI recently turned on its “Colossus” supercluster of 100,000 Nvidia H100 GPUs in Memphis, Tennessee, the world’s largest or among the largest, which is being used to train its new models, so apparently the facility is already up to the task.
Credit: venturebeat.com