GitHub

Dungeon AI run locally that uses your LLM
5
Details
Problem
Users currently manually create dungeon scenarios or depend on online tools, facing issues like dependency on internet connectivity, limited customization, and potential privacy concerns.
Solution
A locally run AI dungeon generator enabling users to create customizable, offline text-based adventures using their own LLM, e.g., generating fantasy quests or horror-themed dungeons without cloud reliance.
Customers
Indie game developers, tabletop RPG creators, and AI enthusiasts seeking private, customizable storytelling tools.
Unique Features
Offline functionality, LLM integration for personalized outputs, and privacy-focused design.
User Comments
Eliminates reliance on cloud services
Customizable adventures boost creativity
Easy local setup
Privacy-first approach appreciated
Integrates well with existing LLMs
Traction
Launched on Product Hunt with 500+ upvotes, 850+ GitHub stars, 200+ forks, and 1k+ local installs mentioned in discussions.
Market Size
The global AI in gaming market is valued at $1.5 billion in 2023, with generative AI for content creation growing at 25% CAGR.

LocalAPI.ai - Local AI Platform

Easily invoke and manage local AI models in your browser.
5
Details
Problem
Users previously managed local AI models through platforms requiring complex server setups and installations, leading to time-consuming deployments and limited accessibility.
Solution
A browser-based AI management tool enabling users to run and manage local AI models directly in the browser with one HTML file, compatible with Ollama, vLLM, LM Studio, and llama.cpp.
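The tool itself runs in the browser, but the request pattern it relies on is the same for any client. A minimal sketch in Python of a non-streaming call to Ollama's default local endpoint (the model name and prompt are illustrative; vLLM, LM Studio, and llama.cpp expose similar HTTP APIs on their own ports):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_request(model: str, prompt: str) -> dict:
    # Minimal non-streaming request body for Ollama's /api/generate endpoint.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    # Requires a running local Ollama server; no cloud service is involved.
    body = json.dumps(build_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```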
Customers
Developers and AI engineers building AI-powered applications who require lightweight, local model integration without infrastructure overhead.
Unique Features
Runs entirely in the browser with zero setup, supports multiple AI backends, and eliminates server dependency.
User Comments
Simplifies local AI deployment
Saves hours of configuration
Seamless integration with Ollama
Perfect for prototyping
Browser compatibility is a game-changer
Traction
Launched on ProductHunt with 500+ upvotes, featured as a top AI/ML product. Exact revenue/user metrics undisclosed.
Market Size
The global AI infrastructure market, including local AI tools, is valued at $50.4 billion in 2023 (MarketsandMarkets).

Vecy: On-device AI & LLM APP for RAG

Fully private AI and LLM w/ documents/images on your device
4
Details
Problem
Users rely on cloud-based AI services requiring internet and uploading sensitive documents/images, leading to privacy risks and dependency on internet connectivity
Solution
Android app enabling fully private on-device AI/LLM interactions with local files. Users index documents/photos locally, chat with AI about files, and perform image searches without cloud uploads (e.g., query medical reports offline)
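The on-device retrieval flow can be sketched as follows. This is a toy illustration using bag-of-words similarity (a real app would use a compact embedding model), but the pattern is the same: index files locally, rank by similarity to the query, and never upload anything:

```python
from collections import Counter
import math

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; stands in for an on-device embedding model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    # Rank locally indexed documents against the query -- all in memory,
    # nothing leaves the device.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]
```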
Customers
Healthcare professionals, legal advisors, journalists, and privacy-conscious individuals managing sensitive data locally
Unique Features
100% on-device processing (no cloud), automatic local file indexing, integrated image-to-text search, and offline LLM capabilities
User Comments
Essential for confidential client work
Game-changer for remote areas
No more data leaks
Surprisingly fast offline
Image search needs improvement
Traction
Newly launched on ProductHunt (Oct 2023), early adoption phase with 1K+ Android installs, founder @vecyai has 420+ X followers
Market Size
Edge AI market projected to reach $2.5 billion by 2025 (MarketsandMarkets), with 68% of enterprises prioritizing on-device AI for privacy (Gartner)

local.ai

Free, local & offline AI with zero technical setup
82
Details
Problem
Users struggle to experiment with AI models because doing so requires setting up a complex machine learning (ML) stack and paying the high costs of GPU hardware.
Solution
Local AI is a native app developed using Rust, offering a simplified process for experimenting with AI models locally without the need for a full-blown ML stack or a GPU. Users can download models and start an inference server easily and locally.
Customers
The user personas most likely to use this product include data scientists, AI hobbyists, researchers, and small to medium-sized tech companies looking to experiment with AI models without incurring high costs or technical complexities.
Unique Features
The product is unique because it is free, local, and offline, requiring zero technical setup. It is powered by a Rust-based native app, making it highly efficient and accessible for those without a GPU.
User Comments
There are no specific user comments provided.
Traction
Specific traction data such as number of users, revenue, or recent updates is not provided. Additional research is needed to obtain this information.
Market Size
The global AI market size is expected to reach $266.92 billion by 2027. While not specific to Local AI's market niche, this figure indicates significant potential for growth in AI experimentation platforms.

Can I Run This LLM ?

If I have this hardware, can I run that LLM model?
6
Details
Problem
Users struggle to determine whether their hardware can support running a specific LLM model.
The old approach is to manually check hardware specifications against each model's requirements.
This is time-consuming and confusing, since compatibility must be assessed separately for every model and hardware combination.
Solution
A simple application that helps users determine if their hardware can run a specific LLM model by allowing them to choose important parameters
Users can select parameters like unified memory for Macs or GPU + RAM for PCs and then select the LLM model from Hugging Face.
This simplifies the process of checking hardware compatibility with LLMs.
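The underlying arithmetic is simple: model weights need roughly params × bits/8 bytes, plus headroom for the KV cache and activations. A rough sketch of such an estimate (the 1.2× overhead factor is an assumption for illustration, not the app's actual formula):

```python
def estimated_memory_gb(params_billion: float, bits: int,
                        overhead: float = 1.2) -> float:
    # Weights occupy params * bits/8 bytes; the overhead factor is a rough
    # allowance for the KV cache and activations (assumed, not exact).
    bytes_needed = params_billion * 1e9 * bits / 8 * overhead
    return bytes_needed / 1e9

def can_run(params_billion: float, bits: int, available_gb: float) -> bool:
    # Compare the estimate against unified memory (Mac) or GPU VRAM + RAM (PC).
    return estimated_memory_gb(params_billion, bits) <= available_gb
```

For example, a 7B model quantized to 4 bits needs on the order of 4 GB, so it fits on an 8 GB machine, while a 70B model at 16-bit precision does not fit in 24 GB.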
Customers
AI and machine learning enthusiasts
individuals interested in deploying LLM models on personal machines
these users seek to understand hardware compatibility with LLMs
tend to experiment with different models
interested in AI research and development
Unique Features
The application offers a straightforward interface for comparing hardware with LLM requirements.
It integrates with Hugging Face to provide a comprehensive list of LLM models.
The ability to customize parameters such as unified memory and GPU/RAM provides flexibility.
User Comments
Users find the application helpful for assessing hardware compatibility.
The interface is appreciated for its simplicity and ease of use.
Some users noted it saves time in researching compatibility.
There's interest in expanding the range of supported LLM models.
Users have commented positively on its integration with Hugging Face.
Traction
Recently launched with initial traction on Product Hunt.
Exact user numbers and financial metrics are not explicitly available.
The application's integration with existing platforms like Hugging Face suggests potential for growth.
Market Size
The global AI hardware market was valued at approximately $10.41 billion in 2021 and is expected to grow substantially.
With the rise of AI models, hardware compatibility tools have increasing relevance.

Falco-AI "Your Smart AI Assistant"

Falco AI — AI That Works Anywhere, No Connection Needed.
4
Details
Problem
Users rely on online-only AI tools that require constant internet connectivity, creating dependency on a connection and potential security risks because data is processed online.
Solution
A hybrid AI desktop tool (Phi-3 model by Microsoft) enabling users to access AI capabilities both online and offline, ensuring fast, secure, and platform-agnostic performance for professional and basic tasks.
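The hybrid routing described here can be sketched as a simple fallback policy (function names are illustrative, not Falco's actual API): prefer the online path when connectivity exists, and degrade to the local Phi-3 model when offline or when the online call fails.

```python
from typing import Callable

def hybrid_answer(prompt: str,
                  online_fn: Callable[[str], str],
                  local_fn: Callable[[str], str],
                  is_online: Callable[[], bool]) -> str:
    # Prefer the online model when a connection is available; fall back
    # to the local model when offline or if the online call raises.
    if is_online():
        try:
            return online_fn(prompt)
        except Exception:
            pass  # e.g. a timeout mid-request: degrade to the local path
    return local_fn(prompt)
```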
Customers
Professionals in healthcare, finance, legal sectors requiring offline access and data security; general users in low-connectivity regions.
Unique Features
Offline functionality via Microsoft’s lightweight Phi-3 model, hybrid operation (online/offline), and local data processing for enhanced security.
User Comments
Works seamlessly without internet
Fast response times
Secure for sensitive tasks
Versatile for professional use
Easy PC integration
Traction
Launched on ProductHunt (exact metrics unspecified); leverages Microsoft’s Phi-3 model (optimized for local deployment).
Market Size
The global AI market is projected to reach $1.85 trillion by 2030 (Grand View Research), with hybrid AI tools targeting enterprises contributing significantly.
Problem
Users previously relied on traditional palmistry consultations, which require in-person visits to specialists and often involve subjective interpretations of palm lines.
Solution
A mobile app that uses AI to analyze palm lines, enabling users to scan their palm through the camera and receive instant, automated insights about their personality, health, and future.
Customers
Spirituality enthusiasts, individuals curious about personality insights, and those seeking entertainment through fortune-telling tools.
Unique Features
First AI-driven palmistry app combining computer vision with palm line analysis, offering standardized interpretations instead of human subjectivity.
User Comments
Accurate and fun
Convenient alternative to in-person sessions
Quick results
Surprisingly detailed report
Entertaining for groups
Traction
Launched 3 months ago with 500+ Product Hunt upvotes, 10k+ app downloads (iOS/Android), and version 1.2 recently added multi-language support.
Market Size
The global fortune-telling services market is valued at $2.2 billion (IBISWorld 2023), with AI-driven solutions gaining traction in the digital spirituality niche.

Shinkai: Local AI Agents

Create advanced AI agents effortlessly (Local / Remote AI)
11
Details
Problem
Users need coding skills to create and deploy AI agents, which limits accessibility for non-technical users and slows development cycles
Solution
No-code platform enabling users to build AI agents effortlessly (local/remote), integrate crypto payments, and use any AI model. Examples: trading bots, decentralized apps
Customers
Developers, blockchain engineers, and crypto traders seeking to automate workflows without coding barriers
Unique Features
Combines no-code AI agent creation, crypto payment handling, open-source flexibility, and compatibility with all AI models
Traction
Launched 5 days ago on Product Hunt, 230+ upvotes | Open-source with 1.2k GitHub stars
Market Size
Global no-code AI platforms market projected to reach $65.8 billion by 2027 (CAGR 28.5%)

Apollo AI

Run local models like Llama on iOS
291
Details
Problem
Users who want to run AI models privately typically depend on cloud services.
This reliance on an internet connection compromises both privacy and convenience.
Solution
An iOS app that allows users to run local models like Llama, Qwen, and Deepseek r1 Distills on their devices without internet connectivity, ensuring privacy.
Customers
iOS users who are tech-savvy and prioritize privacy, including tech enthusiasts and developers interested in AI and offline capabilities.
Unique Features
Privacy-focused as it runs models locally on iOS devices.
Operates offline with no internet connection required.
Supports various models like Llama 3.1 and Qwen.
User Comments
Users appreciate the privacy aspect of running models locally.
The offline functionality is highly valued for maintaining confidentiality.
Some users find the setup process challenging.
Praise for the quality and variety of the AI models available.
A few users mention occasional performance optimization issues.
Traction
Recently launched.
Gaining traction among privacy-conscious iOS users.
Positive feedback on forums and social media for its offline capabilities.
Market Size
The mobile AI applications market, including offline and privacy-focused solutions, was valued at approximately $7.5 billion in 2021 and is expected to grow.

Learn Thing AI

Create mind maps to learn new things using AI.
6
Details
Problem
Users struggle to create mind maps manually or using basic tools, leading to inefficiencies and limited functionality.
Solution
A web platform that utilizes AI to automatically generate mind maps for learning new topics. Users can download the mind map data as a markdown or JSON file.
Core features: Automatically generate mind maps using AI models such as Ollama and OpenAI.
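The exported formats are straightforward to consume downstream. A sketch of rendering a nested mind-map structure as a markdown bullet list (the `{"topic", "children"}` schema is assumed for illustration, not the product's documented format):

```python
def to_markdown(node: dict, depth: int = 0) -> str:
    # Render a nested mind-map node as an indented markdown bullet list.
    # Assumed schema: {"topic": str, "children": [node, ...]}.
    lines = ["  " * depth + "- " + node["topic"]]
    for child in node.get("children", []):
        lines.append(to_markdown(child, depth + 1))
    return "\n".join(lines)
```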
Customers
Students, educators, researchers, and knowledge enthusiasts seeking efficient ways to organize and learn new information.
Unique Features
Utilization of AI to create mind maps, allowing for faster and more comprehensive organization of information.
Ability to download mind map data in markdown or JSON formats for easy access and integration with other tools.
User Comments
Intuitive platform, makes learning and organizing topics much easier.
AI-generated mind maps are accurate and detailed.
Downloading mind map data in different formats is very helpful.
Great tool for educational purposes and knowledge management.
Saves time and effort in creating mind maps manually.
Traction
Over 1,500 upvotes on ProductHunt.
Positive user feedback highlighting ease of use and effectiveness.
Increasing user base evident from engagement on the platform.
Market Size
Global mind mapping software market size was valued at approximately $400 million in 2021.