API Reference
CSV Import API
Bulk import findings from CSV files using the PentestPad API
Overview
The CSV Import API allows you to bulk import findings from CSV files, making it easy to migrate data from other tools or import large numbers of findings at once. The API provides field mapping, validation, and detailed error reporting.
Endpoints
Import Findings from CSV
Upload and import findings from a CSV file into a project.
POST /api/v1/projects/{project_uuid}/findings/import-csv
Request:
- Content-Type: multipart/form-data
- Body: form data with a "file" field containing the CSV file
File Requirements:
- Format: CSV (.csv) or text (.txt) files
- Size Limit: 10MB maximum
- Encoding: UTF-8 recommended
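A quick pre-flight check of these requirements before uploading saves a rejected round trip. A minimal sketch, assuming a local file path; check_csv_file is a hypothetical helper, not part of the API:

```python
import os

MAX_SIZE = 10 * 1024 * 1024  # server-side limit: 10MB

def check_csv_file(path):
    """Pre-flight checks mirroring the API's file requirements."""
    # Format: only .csv or .txt files are accepted
    if not path.lower().endswith(('.csv', '.txt')):
        raise ValueError('File must be .csv or .txt')
    # Size: reject anything over the 10MB limit
    if os.path.getsize(path) > MAX_SIZE:
        raise ValueError('File exceeds the 10MB limit')
    # Encoding: confirm the file decodes as UTF-8 (the recommended encoding);
    # UnicodeDecodeError is a subclass of ValueError
    with open(path, 'rb') as f:
        f.read().decode('utf-8')
    return True
```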
Response:
{
"success": true,
"message": "CSV import completed. 8 findings imported, 2 errors.",
"data": {
"success_count": 8,
"error_count": 2,
"errors": [
{
"row": 5,
"error": "Invalid impact level: extreme. Must be one of: informational, low, medium, high, critical",
"data": {
"title": "SQL Injection",
"impact": "extreme",
"probability": "high"
}
},
{
"row": 7,
"error": "The title field is required.",
"data": {
"title": "",
"impact": "high",
"probability": "medium"
}
}
],
"imported_findings": [
{
"row": 2,
"id": 458,
"uuid": "finding-uuid-1",
"title": "SQL Injection in Login Form"
},
{
"row": 3,
"id": 459,
"uuid": "finding-uuid-2",
"title": "Cross-Site Scripting (XSS)"
}
]
}
}
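Because each entry in the errors array echoes the rejected row's data, failed rows can be written back out to a fresh CSV, corrected, and re-imported. A sketch, assuming the parsed JSON response; write_error_rows is a hypothetical helper:

```python
import csv

def write_error_rows(import_result, out_path='failed_rows.csv'):
    """Collect the rows the API rejected into a new CSV for correction.

    import_result is the parsed JSON response from the import endpoint;
    each error entry echoes the offending row's field values in "data".
    """
    errors = import_result['data']['errors']
    if not errors:
        return 0
    # Union of field names across all rejected rows, in a stable order
    fieldnames = sorted({k for e in errors for k in e['data']})
    with open(out_path, 'w', newline='', encoding='utf-8') as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        for e in errors:
            writer.writerow(e['data'])
    return len(errors)
```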
Get CSV Field Mapping
Get information about required and optional fields for CSV import.
GET /api/v1/csv/field-mapping
Response:
{
"success": true,
"data": {
"field_mapping": {
"title": "Finding title/name",
"impact": "Impact level (informational|low|medium|high|critical)",
"probability": "Probability level (informational|low|medium|high|critical)",
"description": "Detailed description of the finding",
"poc": "Proof of concept details",
"risks": "Risk assessment and business impact",
"remediation": "Remediation steps and recommendations",
"cvss": "CVSS vector string",
"cvss_score": "CVSS score (0-10)",
"status": "Finding status (draft|in-progress|done)",
"http_excerpts": "HTTP request/response examples",
"affected_hosts": "JSON array or comma-separated endpoints",
"categories": "Comma-separated vulnerability category IDs",
"template_id": "Template ID to base finding on",
"extra_fields": "JSON object with custom fields"
},
"sample_csv": "title,impact,probability,description,poc,affected_hosts,categories\n\"SQL Injection in Login Form\",high,medium,\"Login form vulnerable to SQL injection\",\"admin' OR 1=1 --\",\"https://example.com/login,https://api.example.com/auth\",\"1,2\"\n\"Cross-Site Scripting (XSS)\",medium,high,\"Reflected XSS in search parameter\",\"<script>alert(1)</script>\",\"{\"\"endpoint\"\": \"\"https://example.com/search\"\", \"\"port\"\": 443}\",\"3\""
}
}
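The field_mapping object can be used to sanity-check a CSV's header row before uploading: any column name not present in the mapping will be ignored or rejected. A sketch, assuming the mapping has already been fetched (e.g. with the curl call above); unknown_columns is a hypothetical helper:

```python
def unknown_columns(csv_header_row, field_mapping):
    """Return CSV header names the import endpoint won't recognise.

    field_mapping is the "field_mapping" object from the
    GET /api/v1/csv/field-mapping response (field name -> description).
    """
    return sorted(set(csv_header_row) - set(field_mapping))
```

An empty result means every column maps to a known import field.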
CSV Format
Required Fields
Every CSV must include these columns:
Field | Type | Description | Valid Values |
---|---|---|---|
title | string | Finding name/title | Any string (required) |
impact | string | Impact severity | informational, low, medium, high, critical |
probability | string | Likelihood of exploitation | informational, low, medium, high, critical |
Optional Fields
Field | Type | Description | Format |
---|---|---|---|
description | string | Detailed description | Plain text |
poc | string | Proof of concept | Plain text |
risks | string | Business risks | Plain text |
remediation | string | Fix recommendations | Plain text |
cvss | string | CVSS vector | CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:N |
cvss_score | number | CVSS score | 0 to 10 |
status | string | Finding status | draft, in-progress, done |
http_excerpts | string | HTTP examples | Plain text |
affected_hosts | string | Affected endpoints | See Affected Hosts Format |
categories | string | Vulnerability categories | Comma-separated IDs: 1,2,3 |
template_id | number | Template reference | Template ID number |
extra_fields | string | Custom fields | JSON string |
Affected Hosts Format
The affected_hosts field supports three formats:
Comma-Separated URLs
"https://example.com/login,https://api.example.com/auth,10.1.1.1:8080"
JSON Format
"[{\"endpoint\": \"https://example.com/login\", \"port\": 443, \"protocol\": \"HTTPS\", \"description\": \"Main login page\"}]"
Single JSON Object
"{\"endpoint\": \"https://example.com/search\", \"port\": 443}"
Extra Fields Format
Custom fields should be provided as JSON:
"{\"cwe_id\": \"CWE-89\", \"owasp_category\": \"A03:2021\", \"custom_field\": \"value\"}"
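When generating these cells programmatically, it is safer to let json.dumps and the csv module handle the quote doubling than to escape by hand. A sketch with illustrative field values:

```python
import csv
import io
import json

# Build JSON-valued cells with json.dumps; csv handles the quote doubling.
row = {
    'title': 'Cross-Site Scripting (XSS)',
    'impact': 'medium',
    'probability': 'high',
    'affected_hosts': json.dumps(
        [{'endpoint': 'https://example.com/search', 'port': 443}]
    ),
    'extra_fields': json.dumps({'cwe_id': 'CWE-79', 'owasp_category': 'A03:2021'}),
}

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(row))
writer.writeheader()
writer.writerow(row)
print(buf.getvalue())
```

Round-tripping the output through csv.DictReader and json.loads recovers the original structures, which confirms the escaping is correct.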
CSV Examples
Basic CSV
title,impact,probability,description
"SQL Injection in Login",high,medium,"Authentication bypass via SQL injection"
"XSS in Search",medium,high,"Reflected XSS in search functionality"
"Weak Password Policy",low,high,"Passwords can be 4 characters minimum"
Complete CSV
title,impact,probability,description,poc,risks,remediation,cvss,cvss_score,status,http_excerpts,affected_hosts,categories,extra_fields
"SQL Injection in Login Form",high,medium,"Login form vulnerable to SQL injection","admin' OR 1=1 --","Data breach potential","Use parameterized queries","CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:N",8.1,draft,"POST /login HTTP/1.1...","https://example.com/login","1,2","{""cwe_id"": ""CWE-89""}"
"Cross-Site Scripting",medium,high,"Reflected XSS in search","<script>alert(1)</script>","Session hijacking","Implement output encoding","CVSS:3.1/AV:N/AC:L/PR:N/UI:R/S:C/C:L/I:L/A:N",6.1,draft,"GET /search?q=<script>alert(1)</script>","[{""endpoint"": ""https://example.com/search"", ""port"": 443}]","3","{""owasp"": ""A03:2021""}"
Examples
Import CSV File
curl -X POST \
-H "Authorization: Bearer your_api_key" \
-F "file=@findings.csv" \
https://your-instance.pentestpad.com/api/v1/projects/project-uuid/findings/import-csv
Get Field Mapping
curl -H "Authorization: Bearer your_api_key" \
https://your-instance.pentestpad.com/api/v1/csv/field-mapping
JavaScript Example
// Import CSV file
const formData = new FormData();
formData.append('file', csvFile); // csvFile is a File object
const importResult = await fetch(`/api/v1/projects/${projectUuid}/findings/import-csv`, {
method: 'POST',
headers: {
'Authorization': 'Bearer your_api_key'
},
body: formData
}).then(r => r.json());
console.log(`Imported ${importResult.data.success_count} findings`);
if (importResult.data.error_count > 0) {
console.log('Errors:', importResult.data.errors);
}
Python Example
import requests
import csv
# Create CSV data
csv_data = [
['title', 'impact', 'probability', 'description'],
['SQL Injection', 'high', 'medium', 'Database injection vulnerability'],
['XSS Attack', 'medium', 'high', 'Cross-site scripting issue']
]
# Write to file
with open('findings.csv', 'w', newline='') as file:
writer = csv.writer(file)
writer.writerows(csv_data)
# Upload CSV
with open('findings.csv', 'rb') as file:
response = requests.post(
'https://your-instance.pentestpad.com/api/v1/projects/project-uuid/findings/import-csv',
headers={'Authorization': 'Bearer your_api_key'},
files={'file': file}
)
result = response.json()
print(f"Success: {result['data']['success_count']} findings imported")
Error Handling
Common Import Errors
Error Type | Description | Solution |
---|---|---|
Invalid File Format | File is not CSV/TXT | Use .csv or .txt files |
File Too Large | File exceeds 10MB limit | Split into smaller files |
Missing Required Fields | CSV missing title, impact, or probability | Add required columns |
Invalid Enum Values | Impact/probability not in valid list | Use: informational, low, medium, high, critical |
Invalid JSON | Malformed JSON in affected_hosts or extra_fields | Validate JSON format |
Invalid Template ID | Template doesn't exist | Use valid template ID or leave empty |
Invalid Category IDs | Category doesn't exist | Use valid category IDs |
Validation Rules
- Title: Required, maximum 255 characters
- Impact/Probability: Must be one of: informational, low, medium, high, critical
- CVSS Score: Must be between 0 and 10
- Status: Must be one of: draft, in-progress, done
- Categories: Must be valid category IDs (integers)
- Template ID: Must be valid template ID (integer)
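These rules can also be applied client-side so errors surface per row without spending an API call. A minimal sketch covering a subset of the rules; validate_row is a hypothetical helper, not part of the API:

```python
LEVELS = {'informational', 'low', 'medium', 'high', 'critical'}
STATUSES = {'draft', 'in-progress', 'done'}

def validate_row(row):
    """Check one CSV row (as a dict) against the validation rules; return a list of error strings."""
    errors = []
    # Title: required, max 255 characters
    title = (row.get('title') or '').strip()
    if not title:
        errors.append('title is required')
    elif len(title) > 255:
        errors.append('title exceeds 255 characters')
    # Impact/probability: must be a valid level
    for field in ('impact', 'probability'):
        if row.get(field) not in LEVELS:
            errors.append(f'invalid {field}: {row.get(field)!r}')
    # Status: optional, but must be valid when present
    if row.get('status') and row['status'] not in STATUSES:
        errors.append(f'invalid status: {row["status"]!r}')
    # CVSS score: optional, numeric, 0-10
    if row.get('cvss_score'):
        try:
            if not 0 <= float(row['cvss_score']) <= 10:
                errors.append('cvss_score must be between 0 and 10')
        except ValueError:
            errors.append('cvss_score must be numeric')
    return errors
```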
Best Practices
Import Tips
- Test First: Import a small sample CSV to verify format
- Use UTF-8: Ensure proper character encoding
- Validate Data: Check for required fields before import
- Handle Errors: Review error messages for failed rows
- Batch Size: Keep files under 1000 rows for best performance
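To stay under the suggested row count, a large export can be split into chunks, with the header row repeated in each file. A sketch; split_csv and the _partN naming are hypothetical, not part of the API:

```python
import csv

def split_csv(path, rows_per_file=1000):
    """Split a large findings CSV into chunk files, repeating the header in each.

    Returns the list of chunk file paths (e.g. findings_part1.csv, ...).
    """
    with open(path, newline='', encoding='utf-8') as f:
        reader = csv.reader(f)
        header = next(reader)
        rows = list(reader)
    paths = []
    for i in range(0, len(rows), rows_per_file):
        out = f'{path.rsplit(".", 1)[0]}_part{i // rows_per_file + 1}.csv'
        with open(out, 'w', newline='', encoding='utf-8') as f:
            writer = csv.writer(f)
            writer.writerow(header)  # every chunk keeps the required columns
            writer.writerows(rows[i:i + rows_per_file])
        paths.append(out)
    return paths
```

Each resulting file can then be uploaded with the same import call shown in the examples above.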
Preparing Your CSV
- Export from Source: Export data from your current tool
- Map Fields: Match columns to PentestPad field names
- Clean Data: Remove invalid characters and ensure proper formatting
- Test Import: Try a small sample first
- Full Import: Import the complete dataset
After Import
- Review imported findings for accuracy
- Update any findings that need manual correction
- Assign findings to team members if needed
- Update project status and generate reports