Quick Start

Get up and running with Dalang in under 5 minutes.

0. Fast Path (Docker Compose)

Use this if you want the quickest route to a working environment with pre-bundled tools.

```bash
git clone https://github.com/sangkan-dev/dalang.git
cd dalang

cat > .env << 'EOF'
LLM_PROVIDER=openai
LLM_API_KEY=your-api-key-here
# Optional:
# LLM_BASE_URL=https://your-endpoint.example.com/v1
EOF

docker compose up --build
```

Then open http://localhost:4000 and start from the Web UI.
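
Before bringing the stack up, a quick sanity check on `.env` can save a failed boot. This is a sketch, not a Dalang feature: it writes an example `.env` and verifies the placeholder key was actually replaced.

```shell
# Pre-flight check (a sketch, not part of Dalang): make sure .env exists
# and LLM_API_KEY is no longer the placeholder before `docker compose up`.
cat > .env << 'EOF'
LLM_PROVIDER=openai
LLM_API_KEY=sk-example-replace-me
EOF

if grep -q '^LLM_API_KEY=your-api-key-here$' .env; then
  echo "Edit .env first: LLM_API_KEY is still the placeholder" >&2
else
  echo ".env looks configured"
fi
```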

CLI Path (Source Install)

If you installed Dalang directly on your host (non-Docker), follow the steps below.

1. Log in to an LLM Provider

Dalang supports multiple AI providers. Choose one:

```bash
dalang login --provider gemini
# Prompts for your Google AI Studio API key
```

```bash
dalang login --provider openai
# Prompts for your OpenAI API key
```

```bash
dalang login --provider anthropic
# Prompts for your Anthropic API key
```

Alternatively, skip the interactive login by setting the key in your environment:

```bash
export LLM_API_KEY="your-api-key-here"
# No login required
```

After login, you'll be prompted to select your preferred AI model.
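
For CI or other non-interactive environments, the variables from the Docker `.env` can be exported before invoking the CLI. Note the assumption: `LLM_API_KEY` is documented above, but whether a source install also honors `LLM_PROVIDER` and `LLM_BASE_URL` is not confirmed by these docs.

```shell
# Non-interactive setup sketch. LLM_API_KEY is documented; reading
# LLM_PROVIDER / LLM_BASE_URL outside Docker is an assumption.
export LLM_PROVIDER=openai
export LLM_API_KEY="your-api-key-here"
# export LLM_BASE_URL="https://your-endpoint.example.com/v1"  # optional

echo "Using provider: ${LLM_PROVIDER}"
```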

2. Run a Simple Scan

```bash
# Scan a target with a specific skill
dalang scan --target 192.168.1.1 --skills nmap_scanner

# Run multiple skills
dalang scan --target https://example.com --skills nmap_scanner,web-audit
```
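
The same two flags extend naturally to a list of targets. The loop below is a sketch that only previews each command (printing instead of executing); `targets.txt` is a hypothetical file name.

```shell
# Batch-scan sketch: one target per line in targets.txt (hypothetical file).
# Echoes each command as a dry run; drop the `echo` to actually execute.
cat > targets.txt << 'EOF'
192.168.1.1
https://example.com
EOF

while IFS= read -r target; do
  echo dalang scan --target "${target}" --skills nmap_scanner
done < targets.txt
```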

3. Enable Auto-Pilot Mode

Let the AI decide which tools to use:

```bash
dalang scan --target https://example.com --auto
```

The AI will:

  1. Analyze the target
  2. Select appropriate skills from the library
  3. Execute tools and observe results
  4. Chain new actions based on observations
  5. Generate a vulnerability report
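
Auto-pilot also lends itself to unattended runs. Below is a minimal sketch of a nightly scan via cron, using only the `--target` and `--auto` flags shown above; the wrapper script name and schedule are illustrative, not from the docs.

```shell
# Nightly auto-pilot sketch. Flags are from the docs; the wrapper script
# and the cron schedule are examples only.
cat > nightly-scan.sh << 'EOF'
#!/bin/sh
dalang scan --target https://example.com --auto
EOF
chmod +x nightly-scan.sh

# Example crontab entry (runs at 02:00 every night):
# 0 2 * * * /path/to/nightly-scan.sh
```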

4. Interactive Mode

Start a conversational security session:

```bash
dalang interact --target https://example.com
```

Type commands in natural language:

```
dalang> scan for open ports on the target
dalang> check if port 80 has any web vulnerabilities
dalang> look for SQL injection on the login form
dalang> exit
```
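
If `dalang interact` reads commands from standard input (an assumption; the docs only show interactive typing), a session could be scripted rather than typed. The actual invocation is left commented out.

```shell
# Scripted-session sketch. Assumes `dalang interact` accepts commands on
# stdin, which is NOT confirmed by the docs above.
cat > session.txt << 'EOF'
scan for open ports on the target
check if port 80 has any web vulnerabilities
exit
EOF

# dalang interact --target https://example.com < session.txt
```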

Released under the MIT License.