Getting Started with Gemma 3 (secondstate.io)
2 points by thunderbong 22 days ago | past

Run DeepSeek R1 on Mac with Simple Commands (secondstate.io)
3 points by 3Sophons 84 days ago | past | 1 comment

RustCoder: Code and Build with Rust, Zero Headaches (secondstate.io)
4 points by 3Sophons 3 months ago | past

Run Qwen2.5-14B Locally – An OpenAI API Alternative for Chatbots and Embeddings (secondstate.io)
1 point by 3Sophons 6 months ago | past | 1 comment

Contribute to Open Source and Win a $600 Linux Foundation Exam/Course Voucher (secondstate.io)
1 point by 3Sophons 6 months ago | past | 1 comment

Running FLUX.1 [Schnell] Locally on a MacBook – Fast and Easy Setup (secondstate.io)
1 point by 3Sophons 6 months ago | past | 1 comment

Write Code/Articles Around WasmEdge to Get a Free LFX Exam/Course (secondstate.io)
1 point by 3Sophons 7 months ago | past | 1 comment

Show HN: Try Yi Coder with Cursor to Write a Search Webpage (secondstate.io)
1 point by 3Sophons 7 months ago | past

Run Phi-3.5-mini Across GPUs, CPUs, and OSes (secondstate.io)
1 point by 3Sophons 7 months ago | past | 1 comment

KubeCon + CloudNativeCon + Open Source Summit + AI_dev China 2024 (secondstate.io)
1 point by 3Sophons 8 months ago | past | 1 comment

Run Llama 3.1 on Any Device. Embed Llama 3.1 in Any App (secondstate.io)
1 point by 3Sophons 8 months ago | past | 1 comment

Embed LLMs in Your App Easily – Each Model Tackles the Agentic Tasks It Was Fine-Tuned For (secondstate.io)
2 points by 3Sophons 9 months ago | past | 1 comment

[flagged] Can the New Mathstral LLM Accurately Compare 9.11 and 9.9? (secondstate.io)
23 points by 3Sophons 9 months ago | past | 32 comments

Run Alibaba Cloud's Qwen LLM with a 2MB Cross-GPU/CPU Rust App (secondstate.io)
1 point by 3Sophons on Feb 28, 2024 | past | 1 comment

Self-Host Google's Gemma LLMs Across GPUs/CPUs with a 2MB App (secondstate.io)
3 points by 3Sophons on Feb 22, 2024 | past | 1 comment

Free Linux Foundation Training/Certification for New Open Source Contributors (secondstate.io)
1 point by 3Sophons on Feb 22, 2024 | past

Complete Tutorial: Run Qwen LLM Across Your Local Machines Without Python/C++ (secondstate.io)
3 points by 3Sophons on Feb 22, 2024 | past

Self-Host Any GGUF LLM on Hugging Face and Run It Across Devices (secondstate.io)
1 point by 3Sophons on Feb 4, 2024 | past

Run the Leaked Mistral Medium, miqu-1-70B, Across GPUs, CPUs, and OSes (secondstate.io)
3 points by 3Sophons on Feb 2, 2024 | past | 3 comments

Self-Host StableLM-2-Zephyr-1.6B. Portable Across GPUs, CPUs, and OSes (secondstate.io)
2 points by 3Sophons on Jan 31, 2024 | past | 1 comment

WASM as the Runtime for LLMs and AGI (secondstate.io)
1 point by ben_s on Jan 30, 2024 | past

LFX Mentorship 2024 Spring LLM Projects: Build Open Source AI Inference Infra (secondstate.io)
1 point by 3Sophons on Jan 30, 2024 | past | 1 comment

Run Nous-Hermes-2-Mixtral-8x7B with One Command on Mac, Jetson, and More (secondstate.io)
1 point by 3Sophons on Jan 19, 2024 | past | 1 comment

Self-Host the SOLAR-10.7B-Instruct-v1.0 LLM with a Portable 2MB AI Inference App (secondstate.io)
1 point by 3Sophons on Jan 3, 2024 | past

Self-Host LLMs Like Mixtral 8x7B on the Edge and Across Devices (secondstate.io)
1 point by 3Sophons on Jan 3, 2024 | past | 1 comment

Easy Setup: Self-Host Mixtral-8x7B Across Devices with a 2MB Inference App (secondstate.io)
2 points by 3Sophons on Jan 2, 2024 | past | 1 comment

Run the Japanese LLM CALM2-7B on Mac with a Portable 2MB Inference App and Create an API (secondstate.io)
4 points by 3Sophons on Dec 29, 2023 | past | 1 comment

StarlingLM Runs Across Devices – No Python Hassles (secondstate.io)
1 point by 3Sophons on Nov 29, 2023 | past | 1 comment

Getting Started with Orca-2-13B (secondstate.io)
68 points by 3Sophons on Nov 27, 2023 | past | 17 comments

4 Command Lines to Run Open-Source LLMs Across Devices with a 2MB Inference App (secondstate.io)
1 point by 3Sophons on Nov 21, 2023 | past