Frequently Asked Questions

Product Overview & Purpose

What is GraphRAG-SDK 0.5 and what does it do?

GraphRAG-SDK 0.5 is a toolkit developed by FalkorDB that simplifies integrating knowledge graphs (KGs) with large language models (LLMs). It automatically loads the ontology from your KG, letting you connect structured data to LLMs for querying workflows without manual setup or deep KG expertise. This makes building and deploying GraphRAG (graph-based Retrieval-Augmented Generation) systems faster and more accessible for technical teams.

Who should use GraphRAG-SDK 0.5?

GraphRAG-SDK 0.5 is designed for AI/ML architects, data scientists, and software architects who work with structured data or pre-existing knowledge graphs and want to integrate them with LLMs for advanced querying and retrieval workflows.

What is the primary purpose of FalkorDB?

FalkorDB is a high-performance graph database platform built to manage complex, interconnected data and enable advanced AI applications. Its primary purpose is to deliver accurate, multi-tenant Retrieval-Augmented Generation (RAG) solutions powered by low-latency, scalable graph database technology.

How does GraphRAG-SDK 0.5 simplify knowledge graph integration?

GraphRAG-SDK 0.5 eliminates the need for manual ontology definition and storage. You can now automatically load an ontology from your knowledge graph into the SDK, connect it to your pipeline, and start querying with an LLM—no manual steps or deep KG expertise required.

What are the main use cases for FalkorDB and GraphRAG-SDK?

Main use cases include Text2SQL (natural language to SQL on complex schemas), Security Graphs for CNAPP/CSPM/CIEM, advanced GraphRAG for fast graph-based retrieval, agentic AI and chatbots, and real-time fraud detection across transaction networks.

Features & Capabilities

What are the key features of GraphRAG-SDK 0.5?

Key features include automatic ontology loading from knowledge graphs, direct support for predefined KGs, seamless LLM integration for Q&A workflows, simplified pipeline creation, and improved document processing with a progress bar for ingestion.

Does GraphRAG-SDK 0.5 support automatic ontology loading?

Yes, GraphRAG-SDK 0.5 can automatically load ontologies from your knowledge graph, eliminating the need for manual definition or storage.

How does GraphRAG-SDK 0.5 improve document processing?

It introduces a progress bar that tracks document ingestion, giving you better visibility into pipeline execution and making it easier to monitor large-scale data processing.
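As a rough illustration of the idea (this is a stdlib-only sketch, not the SDK's actual implementation), an ingestion loop can render a textual progress bar like so:

```python
import sys

def ingest_documents(docs):
    """Toy ingestion loop that prints a textual progress bar.

    Illustrative only: GraphRAG-SDK 0.5 renders its own progress bar
    during ingestion; uppercasing stands in for real parsing/embedding.
    """
    total = len(docs)
    processed = []
    for i, doc in enumerate(docs, start=1):
        processed.append(doc.upper())  # stand-in for real document processing
        filled = int(20 * i / total)
        bar = "#" * filled + "-" * (20 - filled)
        sys.stderr.write(f"\r[{bar}] {i}/{total}")
    sys.stderr.write("\n")
    return processed

if __name__ == "__main__":
    ingest_documents(["doc one", "doc two", "doc three"])
```

In practice the SDK emits this feedback for you during ingestion; you don't need to write any progress-tracking code yourself.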

Can I connect a predefined knowledge graph to GraphRAG-SDK 0.5?

Yes, you can connect directly to a predefined knowledge graph and start querying its ontology immediately, streamlining the integration process.

Does GraphRAG-SDK 0.5 require deep knowledge of ontology or KG structures?

No, the SDK is designed to simplify pipeline creation so you can bring your structured data, generate a KG, and start asking questions without needing to understand every detail of the backend.

How does FalkorDB support advanced AI applications?

FalkorDB is optimized for AI use cases such as GraphRAG and agent memory, enabling intelligent agents and chatbots with real-time adaptability. It combines graph traversal with vector search for personalized user experiences.

What integrations are available for FalkorDB?

FalkorDB integrates with frameworks such as Graphiti (for AI agent memory), g.v() (for knowledge graph visualization), Cognee (for mapping knowledge graphs), LangChain (for LLM integration), and LlamaIndex (for advanced knowledge graph applications).

Does FalkorDB provide an API?

Yes, FalkorDB provides a comprehensive API with official documentation and guides available at docs.falkordb.com. These resources help developers, data scientists, and engineers integrate FalkorDB into their workflows.

Is there technical documentation for FalkorDB and GraphRAG-SDK?

Yes, comprehensive technical documentation and API references are available for FalkorDB and GraphRAG-SDK. Access guides, release notes, and advanced configuration details at docs.falkordb.com and the GraphRAG-SDK GitHub repository.

Implementation & Onboarding

How easy is it to get started with GraphRAG-SDK 0.5?

Getting started is straightforward: simply load your knowledge graph or create one using the SDK, and the ontology is automatically loaded and connected to your pipeline. You can begin querying with an LLM right away.

How long does it take to implement FalkorDB?

FalkorDB is built for rapid deployment, enabling teams to go from concept to enterprise-grade solutions in weeks, not months. This accelerates time-to-market for AI and data-driven applications.

What resources are available for onboarding and support?

Resources include comprehensive documentation, community support via Discord and GitHub Discussions, solution architects for tailored advice, and free trial/demo options.

Can I try FalkorDB for free?

Yes, you can launch a free instance of FalkorDB in the cloud or run it locally using Docker. Visit the Try Free page for details.
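For the local option, a typical Docker invocation looks like the following (6379 is the database port; exposing 3000 as well serves the bundled FalkorDB Browser UI, though port details may vary by image version):

```shell
# Pull and run FalkorDB locally.
# 6379: database port (Redis protocol); 3000: FalkorDB Browser UI.
docker run -it --rm -p 6379:6379 -p 3000:3000 falkordb/falkordb:latest
```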

Is there a demo available for FalkorDB?

Yes, you can schedule a personalized demo with the FalkorDB team to explore features and use cases.

Pricing & Plans

What pricing plans are available for FalkorDB?

FalkorDB offers four main plans: FREE (for MVPs, with community support), STARTUP (billed monthly per 1GB instance; includes TLS and automated backups), PRO (billed monthly per 8GB instance; includes cluster deployment and high availability), and ENTERPRISE (custom pricing; includes VPC deployment, custom backups, and 24/7 support).

What features are included in the PRO plan?

The PRO plan is billed monthly per 8GB instance and includes advanced features such as cluster deployment, high availability, and enhanced support for enterprise-grade workloads.

Is multi-tenancy included in all FalkorDB plans?

Yes, multi-tenancy is included in all FalkorDB plans, supporting over 10,000 multi-graphs. This is a key differentiator compared to competitors who often restrict multi-tenancy to premium tiers.

Performance & Security

How does FalkorDB perform compared to competitors?

FalkorDB delivers up to 496x lower latency and 6x better memory efficiency than competitors such as Neo4j. It is designed for real-time, high-speed data analysis and large-scale, high-dimensional data management.

What security and compliance certifications does FalkorDB have?

FalkorDB is SOC 2 Type II compliant, meeting rigorous standards for security, availability, processing integrity, confidentiality, and privacy.

How does FalkorDB ensure data privacy and protection?

FalkorDB's SOC 2 Type II compliance ensures robust controls for protecting against unauthorized access, safeguarding sensitive information, and complying with privacy regulations.

Competition & Comparison

How does FalkorDB compare to Neo4j?

FalkorDB offers up to 496x lower latency, 6x better memory efficiency, flexible horizontal scaling, and multi-tenancy in all plans. Neo4j uses an on-disk storage model and restricts multi-tenancy to premium plans.

How does FalkorDB compare to AWS Neptune?

FalkorDB is open source, supports multi-tenancy, and provides better latency performance and efficient vector search compared to AWS Neptune, which is proprietary and does not support multi-tenancy.

How does FalkorDB compare to TigerGraph and ArangoDB?

FalkorDB delivers lower latency, more efficient memory usage, and flexible horizontal scaling compared to TigerGraph and ArangoDB, which offer limited scaling and moderate memory efficiency.

Use Cases & Customer Success

What problems does FalkorDB solve?

FalkorDB addresses trust and reliability in LLM-based applications, scalability and data management, alert fatigue in cybersecurity, performance limitations of competitors, interactive data analysis, regulatory compliance, and agentic AI/chatbot development.

Who are some of FalkorDB's customers?

Notable customers include AdaptX (healthcare analytics), XR.Voyage (media/entertainment), and Virtuous AI (ethical AI development). Read their success stories on the case studies page.

What industries are represented in FalkorDB's case studies?

Industries include healthcare (AdaptX), media and entertainment (XR.Voyage), and artificial intelligence/ethical AI development (Virtuous AI).

What business impact can customers expect from using FalkorDB?

Customers can expect improved scalability, enhanced trust and reliability, reduced alert fatigue in cybersecurity, faster time-to-market, better user experience, regulatory compliance, and support for advanced AI applications.

Can you share specific customer success stories?

Yes. AdaptX used FalkorDB to analyze clinical data and uncover hidden insights, XR.Voyage overcame scalability challenges in immersive media, and Virtuous AI built a high-performance, multi-modal data store for ethical AI. Read more on the case studies page.

What feedback have customers given about FalkorDB's ease of use?

Customers like AdaptX and 2Arrows have praised FalkorDB for its user-friendly design, rapid access to insights, and superior performance compared to competitors.

New Release: GraphRAG-SDK 0.5 – Simplifying Knowledge Graph Integration for LLMs

GraphRAG-SDK v0.5: Stop Wasting Time on Ontology Setup

Highlights

We’re excited to announce the release of GraphRAG-SDK 0.5, designed to make working with knowledge graphs (KGs) and large language models (LLMs) more seamless and developer-friendly. If you’ve ever struggled with manually defining ontologies or connecting your structured data to an LLM pipeline, this update is for you.

How It Works

Here’s how the new workflow compares to older methods:

Before: You had to manually create a KG, define its ontology, save it separately, and integrate it into your application.

Now: Simply load your KG (or create one using the SDK), and the ontology is automatically loaded from your knowledge graph and connected to your pipeline.

Previously, integrating a KG into an LLM workflow required manual ontology creation, storage, and connection. This process was tedious and error-prone, especially if you lacked deep domain knowledge in KG structures. For developers managing structured data or pre-existing KGs, the overhead of manually defining ontologies slowed down experimentation and deployment.

With GraphRAG-SDK 0.5, we’ve eliminated these bottlenecks. You can now automatically load an ontology from your KG into the SDK, skipping the manual steps entirely. Whether you’re building a KG from scratch or connecting to an existing one, this release simplifies the process so you can focus on querying and extracting insights.

  1. Automatic Ontology Loading: No need to manually define or save ontologies anymore. If your KG exists, the SDK handles the rest.
  2. Predefined Knowledge Graph Support: Connect directly to a predefined KG and start querying its ontology immediately.
  3. LLM Integration: Seamlessly hook up your ontology to an LLM for Q&A workflows—no intermediate steps required.
  4. Simplified Pipeline Creation: Bring your structured data, generate a KG using the SDK, and start asking questions without needing to understand every detail of the backend.
  5. Improved Document Processing: A new progress bar tracks document ingestion, giving better visibility into pipeline execution.
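To make "automatic ontology loading" concrete, here is a toy, stdlib-only sketch of what the SDK derives for you. The `load_ontology` function and the triple format are illustrative assumptions; the real SDK reads the schema directly from your FalkorDB graph rather than from Python tuples.

```python
from collections import defaultdict

def load_ontology(triples):
    """Derive a minimal ontology (node labels + relation types) from an
    existing graph -- a toy stand-in for what GraphRAG-SDK 0.5 auto-loads
    from your KG so you never define it by hand.

    `triples` are (subject_label, relation, object_label) tuples.
    """
    labels = set()
    relations = defaultdict(set)
    for subj, rel, obj in triples:
        labels.update((subj, obj))       # collect node labels
        relations[rel].add((subj, obj))  # collect relation signatures
    return {
        "labels": sorted(labels),
        "relations": {r: sorted(pairs) for r, pairs in relations.items()},
    }

# A tiny movie-domain graph as illustration:
graph = [
    ("Person", "ACTED_IN", "Movie"),
    ("Person", "DIRECTED", "Movie"),
    ("Movie", "IN_GENRE", "Genre"),
]
ontology = load_ontology(graph)
```

The point of the release is that this derivation step, previously a manual definition you had to write and store yourself, now happens automatically when you connect the SDK to an existing KG.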

Get Started

GraphRAG-SDK 0.5 moves us closer to handling unstructured data well by simplifying structured data workflows first. By automating ontology management and improving usability, we're making it easier for developers to unlock the full potential of KGs in AI applications.

If you’ve been waiting for a way to make querying your data as simple as chatting with it—this is it.

What is new in GraphRAG-SDK 0.5?

Ontology auto-loading from knowledge graphs, making integration with LLMs seamless and intuitive.

How does it simplify workflows?

You no longer need to manually define or save ontologies—just load your KG and start querying with an LLM.

Who should use it?

AI/ML architects, data scientists, and software architects working with structured data or pre-existing knowledge graphs.

Build fast and accurate GenAI apps with GraphRAG SDK at scale

FalkorDB offers an accurate, multi-tenant RAG solution based on our low-latency, scalable graph database technology. It's ideal for highly technical teams that handle complex, interconnected data in real time, resulting in fewer hallucinations and more accurate responses from LLMs.