# Quick Start
Get up and running with Dalang in under 5 minutes.
## 0. Fast Path (Docker Compose)
Use this if you want the quickest route to a working environment with pre-bundled tools.
```bash
git clone https://github.com/sangkan-dev/dalang.git
cd dalang
cat > .env << 'EOF'
LLM_PROVIDER=openai
LLM_API_KEY=your-api-key-here
# Optional:
# LLM_BASE_URL=https://your-endpoint.example.com/v1
EOF
docker compose up --build
```

Then open http://localhost:4000 and start from the Web UI.
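An easy mistake at this step is a typo in `.env`. The snippet below is a plain-shell sanity check, not a Dalang command; it re-creates the example `.env` from above so it is self-contained, then verifies both required variables are present before you start Compose:

```shell
# Plain-shell sanity check (not part of Dalang). Re-creates the example
# .env so the snippet is self-contained, then confirms both required
# variables exist before `docker compose up`.
cat > .env << 'EOF'
LLM_PROVIDER=openai
LLM_API_KEY=your-api-key-here
EOF
if grep -q '^LLM_PROVIDER=' .env && grep -q '^LLM_API_KEY=' .env; then
  echo ".env looks complete"
else
  echo "missing LLM_PROVIDER or LLM_API_KEY in .env" >&2
fi
```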
## CLI Path (Source Install)
If you installed Dalang directly on your host (non-Docker), follow the steps below.
### 1. Log In to an LLM Provider
Dalang supports multiple AI providers. Choose one:
```bash
dalang login --provider gemini
# Prompts for your Google AI Studio API key
```

```bash
dalang login --provider openai
# Prompts for your OpenAI API key
```

```bash
dalang login --provider anthropic
# Prompts for your Anthropic API key
```

```bash
export LLM_API_KEY="your-api-key-here"
# No login required
```

After login, you'll be prompted to select your preferred AI model.
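Whichever route you choose, scans need credentials. A minimal pre-flight check in plain shell; the variable name `LLM_API_KEY` comes from the export example above, and the placeholder value is only for illustration:

```shell
# Pre-flight sketch (plain shell, not a Dalang command).
# LLM_API_KEY is the variable from the export example above;
# the value here is a placeholder for illustration only.
export LLM_API_KEY="your-api-key-here"
if [ -n "${LLM_API_KEY:-}" ]; then
  echo "API key detected"
else
  echo "Run 'dalang login' or export LLM_API_KEY first" >&2
fi
```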
### 2. Run a Simple Scan
```bash
# Scan a target with a specific skill
dalang scan --target 192.168.1.1 --skills nmap_scanner

# Run multiple skills
dalang scan --target https://example.com --skills nmap_scanner,web-audit
```

### 3. Enable Auto-Pilot Mode
Let the AI decide which tools to use:
```bash
dalang scan --target https://example.com --auto
```

The AI will:
- Analyze the target
- Select appropriate skills from the library
- Execute tools and observe results
- Chain new actions based on observations
- Generate a vulnerability report
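Whether you pick skills by hand (step 2) or use `--auto` (step 3), the same command shape repeats per target, so batching over several targets is ordinary shell. In this dry-run sketch `echo` prints each command instead of invoking Dalang; drop the `echo` to run the real scans:

```shell
# Dry-run sketch: print one scan command per target. The flags are the
# ones shown in steps 2 and 3; `echo` keeps this from invoking dalang.
skills="nmap_scanner,web-audit"
for target in 192.168.1.1 https://example.com; do
  echo dalang scan --target "${target}" --skills "${skills}"
done
```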
### 4. Interactive Mode
Start a conversational security session:
```bash
dalang interact --target https://example.com
```

Type commands in natural language:
```
dalang> scan for open ports on the target
dalang> check if port 80 has any web vulnerabilities
dalang> look for SQL injection on the login form
dalang> exit
```
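Interactive sessions can also be scripted with a here-document, assuming `dalang interact` accepts commands on standard input; that is an assumption, not something the steps above confirm. In this sketch a `cat`-based stand-in replaces the real binary so the snippet runs as a dry run:

```shell
# Dry-run sketch: `session` stands in for the real command
# (dalang interact --target https://example.com), which is assumed,
# not confirmed, to read commands from stdin.
session() { cat; }
session << 'EOF'
scan for open ports on the target
exit
EOF
```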