Support for different LLM hosts (remote or local, compatible with OpenAI API interface) #6

Merged: 1 commit, May 30, 2025
10 changes: 9 additions & 1 deletion README.md
@@ -93,7 +93,15 @@ vibesafe scan -r
vibesafe scan --report
```

**Generate AI Report (Requires API Key):**
*Using a local LLM host for the report (the host must support the OpenAI API):*
```bash
# Example: Ollama on localhost with the default Ollama port
vibesafe scan --url http://127.0.0.1:11434 --model gemma3:27b-it-q8_0
```

If the `--url` flag is not specified, the report is generated by OpenAI (you will need an OpenAI API key; see below).

**Generate AI Report from OpenAI (Requires API Key):**

To generate fix suggestions in the Markdown report, you need an OpenAI API key.

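For the default OpenAI path, the workflow above reduces to the following (a sketch of assumed usage; the key value is a placeholder, the flags are as defined in this PR):

```bash
# Default path: no --url, so suggestions come from https://api.openai.com
# with the default model gpt-4.1-nano. Requires an OpenAI API key.
export OPENAI_API_KEY="your-key-here"   # placeholder value
vibesafe scan --report
```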
4 changes: 3 additions & 1 deletion src/index.ts
@@ -53,6 +53,8 @@ program.command('scan')
.option('-o, --output <file>', 'Specify JSON output file path (e.g., report.json)')
.option('-r, --report [file]', 'Specify Markdown report file path (defaults to VIBESAFE-REPORT.md)')
.option('--high-only', 'Only report high severity issues')
.option('-m, --model <model>', 'Specify the model to use for AI suggestions (default: gpt-4.1-nano)', 'gpt-4.1-nano')
.option('-u, --url <url>', 'Base URL of an OpenAI-compatible API to use for AI suggestions (e.g. http://localhost:11434 for Ollama). If not specified, the OpenAI API is used.', 'https://api.openai.com')
.action(async (directory, options) => {
const rootDir = path.resolve(directory);
console.log(`Scanning directory: ${rootDir}`);
@@ -310,7 +312,7 @@ program.command('scan')
infoSecretFindings: infoSecretFindings
};
try {
const markdownContent = await generateMarkdownReport(reportData);
const markdownContent = await generateMarkdownReport(reportData, options.url, options.model);
fs.writeFileSync(reportPath, markdownContent);
console.log(chalk.green(`\nMarkdown report generated successfully at ${reportPath}`));
} catch (error: any) {
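The new `--url` value is passed straight through to the report generator. A small helper like the following (hypothetical, not part of this PR) could validate and normalize the flag before use, so trailing slashes and malformed URLs fail early rather than at request time:

```typescript
// Hypothetical helper (not in this PR): normalize the --url value so that
// "http://localhost:11434" and "http://localhost:11434/" behave the same.
function normalizeBaseUrl(url: string): string {
  const trimmed = url.replace(/\/+$/, ''); // drop trailing slashes
  if (!/^https?:\/\//.test(trimmed)) {
    throw new Error(`--url must start with http:// or https://, got "${url}"`);
  }
  return trimmed;
}
```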
9 changes: 5 additions & 4 deletions src/reporting/aiSuggestions.ts
@@ -51,11 +51,12 @@ const MAX_FINDINGS_PER_TYPE = 10;
/**
* Generates AI-powered suggestions for fixing findings.
* @param reportData The aggregated findings.
* @param apiKey OpenAI API key.
 * @param openaiConf OpenAI client configuration: base URL and API key.
 * @param model The model name to use for completions.
* @returns A promise resolving to a Markdown string with suggestions.
*/
export async function generateAISuggestions(reportData: ReportData, apiKey: string): Promise<string> {
const openai = new OpenAI({ apiKey });
export async function generateAISuggestions(reportData: ReportData, openaiConf: { baseURL: string, apiKey: string }, model: string): Promise<string> {
const openai = new OpenAI(openaiConf);

// Prepare a simplified list of findings for the prompt
const simplifiedFindings: SimplifiedFinding[] = [
@@ -92,7 +93,7 @@ export async function generateAISuggestions(reportData: ReportData, apiKey: stri

try {
const completion = await openai.chat.completions.create({
model: "gpt-4.1-nano",
model: model,
messages: [
{ role: "system", content: "You are a helpful security assistant providing fix suggestions for code vulnerabilities." },
{ role: "user", content: prompt }
29 changes: 14 additions & 15 deletions src/reporting/markdown.ts
@@ -53,7 +53,7 @@ function getSeverityInfo(severity: FindingSeverity | SecretFinding['severity'] |
 * @param reportData The aggregated findings.
 * @param url Base URL of the OpenAI-compatible API used for suggestions.
 * @param model The model name to use for suggestions (defaults to gpt-4.1-nano).
 * @returns A Markdown formatted string.
*/
export async function generateMarkdownReport(reportData: ReportData): Promise<string> {
export async function generateMarkdownReport(reportData: ReportData, url: string, model: string = 'gpt-4.1-nano'): Promise<string> {
let markdown = `# VibeSafe Security Scan Report ✨🛡️\n\n`;
markdown += `Generated: ${new Date().toISOString()}\n\n`;

@@ -198,20 +198,19 @@ export async function generateMarkdownReport(reportData: ReportData): Promise<st
}
}

// --- AI Suggestions ---
const apiKey = process.env.OPENAI_API_KEY;
if (apiKey && apiKey !== 'YOUR_API_KEY_PLACEHOLDER') {
const spinner = ora('Generating AI suggestions (using OpenAI GPT-4o-mini)... ').start();
try {
const aiSuggestions = await generateAISuggestions(reportData, apiKey);
spinner.succeed('AI suggestions generated.');
markdown += aiSuggestions; // Append the suggestions section
} catch (error: any) {
spinner.fail('AI suggestion generation failed.');
markdown += `\n## AI Suggestions\n\n*Error generating suggestions: ${error.message}*\n`; // Append error message
}
} else {
markdown += `\n*AI suggestions skipped. Set the OPENAI_API_KEY environment variable to enable.*\n`;
// --- AI Suggestions ---
let apiKey = process.env.OPENAI_API_KEY;
if (!apiKey) { // Ollama doesn't need an API key, but the OpenAI client requires a non-empty value
apiKey = 'YOUR_API_KEY_PLACEHOLDER';
}
const spinner = ora(`Generating AI suggestions (using API from ${url}/v1 with model: ${model})... `).start();
try {
const aiSuggestions = await generateAISuggestions(reportData, { baseURL: url + '/v1', apiKey: apiKey }, model);
spinner.succeed('AI suggestions generated.');
markdown += aiSuggestions; // Append the suggestions section
} catch (error: any) {
spinner.fail('AI suggestion generation failed.');
markdown += `\n## AI Suggestions\n\n*Error generating suggestions: ${error.message}*\n`; // Append error message
}

return markdown;
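The change above builds the client configuration by appending `/v1` to the user-supplied URL and substituting a placeholder key for keyless hosts like Ollama. A minimal sketch of that wiring (the helper name is an assumption, not code from the PR):

```typescript
// Sketch of the config logic above: the OpenAI SDK requires a non-empty
// apiKey even when the server (e.g. Ollama) ignores it.
function buildOpenAIConfig(url: string, apiKey?: string): { baseURL: string; apiKey: string } {
  return {
    baseURL: url + '/v1', // e.g. http://127.0.0.1:11434/v1
    apiKey: apiKey && apiKey.length > 0 ? apiKey : 'YOUR_API_KEY_PLACEHOLDER',
  };
}
```

Because the placeholder is only substituted client-side, a real key in `OPENAI_API_KEY` still takes precedence when the default OpenAI URL is used.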