
feat: add AkashML provider and models #1347

Open
fenilmodi00 wants to merge 1 commit into anomalyco:dev from fenilmodi00:feat/add-akashml-provider

Conversation

@fenilmodi00

feat: add AkashML managed inference provider

AkashML is a managed AI inference service built on the Akash Network decentralized GPU supercloud. This PR adds the provider configuration, a refined logo, and five core models.

What's included

1. Provider Core (providers/akashml/provider.toml)

  • Uses @ai-sdk/openai-compatible for drop-in integration.
  • API base: https://api.akashml.com/v1.
  • Authentication via AKASHML_API_KEY.
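
For reference, a minimal sketch of what `providers/akashml/provider.toml` could look like. The field names here are illustrative assumptions, not the repository's actual schema; the URL, package name, and env var are taken from this PR:

```toml
# Hypothetical provider.toml sketch -- field names are illustrative
# and may differ from the repository's real schema.
name = "AkashML"
npm  = "@ai-sdk/openai-compatible"   # drop-in OpenAI-compatible SDK
api  = "https://api.akashml.com/v1"  # API base from this PR
env  = ["AKASHML_API_KEY"]           # auth key read from the environment
```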

2. Refined Logo (providers/akashml/logo.svg)

  • Extracted graphical mark from official branding (akashml_logo_long.svg).
  • Stripped text for a clean icon-only presentation.
  • Optimized with currentColor for seamless light/dark theme support.
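
The `currentColor` technique means the icon inherits whatever text color the surrounding UI uses, so one file serves both themes. An illustrative fragment (the path data here is a placeholder, not the actual AkashML mark):

```xml
<!-- Illustrative only: an icon-style SVG using the same currentColor
     technique as the AkashML logo; the real path data differs. -->
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24" fill="currentColor">
  <path d="M4 20 12 4l8 16h-4l-4-8-4 8z"/>
</svg>
```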

3. Model Definitions (5 models)

Added five models in vendor-specific subdirectories, with specifications verified as of April 2026:

| Model ID | Family | Context | Modalities | Attributes |
| --- | --- | --- | --- | --- |
| meta-llama/Llama-3.3-70B-Instruct | llama | 131k | Text | tool_call |
| deepseek-ai/DeepSeek-V3.2 | deepseek | 131k | Text | reasoning, tool_call |
| google/gemma-4-31b-it | gemma | 131k | Text, Image, Video | reasoning, attachment, tool_call |
| qwen/qwen3.5-35b-a3b | qwen | 131k | Text, Image, Video | reasoning, attachment, tool_call |
| minimax/minimax-m2-5 | minimax | 131k | Text | reasoning, tool_call |
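
Because the provider is OpenAI-compatible, clients talk to it with a standard chat-completions request. A minimal sketch of building such a request against the base URL and a model ID from this PR (the payload shape follows the OpenAI chat API; the helper function and placeholder key are for illustration only, and no network call is made):

```python
import json
import os

# Base URL from this PR's provider configuration.
BASE_URL = "https://api.akashml.com/v1"

def build_chat_request(model: str, prompt: str):
    """Return (url, headers, payload) for an OpenAI-style
    /chat/completions call against the AkashML endpoint."""
    # AKASHML_API_KEY is the env var this PR configures; a placeholder
    # is used here so the sketch runs without credentials.
    api_key = os.environ.get("AKASHML_API_KEY", "sk-placeholder")
    url = f"{BASE_URL}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, headers, payload

url, headers, payload = build_chat_request(
    "meta-llama/Llama-3.3-70B-Instruct", "Hello!"
)
print(url)
print(json.dumps(payload, indent=2))
```

Any of the five model IDs in the table above can be passed as `model`; the request body is identical across them because they share the same OpenAI-compatible surface.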

Key Notes

  • Capabilities Verified: Re-verified reasoning ("thinking" mode) and multimodal support for the 2026 frontier models.
  • Pricing: Initial competitive pricing sourced from AkashML; subject to upstream updates.
  • Organization: Followed repository best practices for nested model folders.
