Allow prompt templates to be overridden in the zed configuration directory (#15887)

I need this to refine our prompts on the fly as I work.

Release Notes:

- Templates for the prompts that drive inline transformations in editors and the
  terminal can now be overridden by placing Handlebars (`.hbs`) files in the
  `~/.config/zed/prompts/templates` directory. This is an advanced feature:
  overriding a template prevents you from receiving upstream changes to it. It's
  intended for use by Zed developers; a sketch of the override mechanism follows.
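
The override mechanism is plain Handlebars re-registration: the built-in templates are registered from the embedded assets, and any `.hbs` file found in the templates directory is registered again under the same name, replacing the built-in version (and re-registered whenever it changes on disk). Below is a minimal standalone sketch of that behavior, using the `handlebars` and `serde_json` crates with illustrative template strings rather than the real assets:

```rust
use handlebars::Handlebars;
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let mut handlebars = Handlebars::new();

    // Built-in template, normally registered from the embedded assets.
    handlebars.register_template_string("content_prompt", "Edit this {{content_type}}.")?;

    // An override found at ~/.config/zed/prompts/templates/content_prompt.hbs is
    // registered under the same name, replacing the built-in version.
    handlebars.register_template_string("content_prompt", "Please edit this {{content_type}}!")?;

    let rendered = handlebars.render("content_prompt", &json!({ "content_type": "code" }))?;
    assert_eq!(rendered, "Please edit this code!");
    Ok(())
}
```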
Nathan Sobo, 2024-08-06 19:30:48 -06:00, committed by GitHub
parent 6065db174a
commit c8f1358629
12 changed files with 569 additions and 168 deletions

Cargo.lock

@@ -391,6 +391,7 @@ dependencies = [
"futures 0.3.30",
"fuzzy",
"gpui",
"handlebars",
"heed",
"html_to_markdown 0.1.0",
"http_client",
@@ -4958,6 +4959,20 @@ dependencies = [
"crunchy",
]
[[package]]
name = "handlebars"
version = "4.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "faa67bab9ff362228eb3d00bd024a4965d8231bbb7921167f0cfa66c6626b225"
dependencies = [
"log",
"pest",
"pest_derive",
"serde",
"serde_json",
"thiserror",
]
[[package]]
name = "hashbrown"
version = "0.12.3"
@@ -7595,6 +7610,51 @@ version = "2.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e3148f5046208a5d56bcfc03053e3ca6334e51da8dfb19b6cdc8b306fae3283e"
[[package]]
name = "pest"
version = "2.7.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cd53dff83f26735fdc1ca837098ccf133605d794cdae66acfc2bfac3ec809d95"
dependencies = [
"memchr",
"thiserror",
"ucd-trie",
]
[[package]]
name = "pest_derive"
version = "2.7.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2a548d2beca6773b1c244554d36fcf8548a8a58e74156968211567250e48e49a"
dependencies = [
"pest",
"pest_generator",
]
[[package]]
name = "pest_generator"
version = "2.7.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3c93a82e8d145725dcbaf44e5ea887c8a869efdcc28706df2d08c69e17077183"
dependencies = [
"pest",
"pest_meta",
"proc-macro2",
"quote",
"syn 2.0.59",
]
[[package]]
name = "pest_meta"
version = "2.7.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a941429fea7e08bedec25e4f6785b6ffaacc6b755da98df5ef3e7dcf4a124c4f"
dependencies = [
"once_cell",
"pest",
"sha2",
]
[[package]]
name = "petgraph"
version = "0.6.4"
@@ -11791,6 +11851,12 @@ version = "1.17.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "42ff0bf0c66b8238c6f3b578df37d0b7848e55df8577b3f74f92a69acceeb825"
[[package]]
name = "ucd-trie"
version = "0.1.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ed646292ffc8188ef8ea4d1e0e0150fb15a5c2e12ad9b8fc191ae7a8a7f3c4b9"
[[package]]
name = "uds_windows"
version = "1.1.0"


@@ -209,6 +209,7 @@ go_to_line = { path = "crates/go_to_line" }
google_ai = { path = "crates/google_ai" }
gpui = { path = "crates/gpui" }
gpui_macros = { path = "crates/gpui_macros" }
handlebars = "4.3"
headless = { path = "crates/headless" }
html_to_markdown = { path = "crates/html_to_markdown" }
http_client = { path = "crates/http_client" }


@@ -0,0 +1,61 @@
{{#if language_name}}
Here's a file of {{language_name}} that I'm going to ask you to make an edit to.
{{else}}
Here's a file of text that I'm going to ask you to make an edit to.
{{/if}}
{{#if is_insert}}
The point you'll need to insert at is marked with <insert_here></insert_here>.
{{else}}
The section you'll need to rewrite is marked with <rewrite_this></rewrite_this> tags.
{{/if}}
<document>
{{{document_content}}}
</document>
{{#if is_truncated}}
The context around the relevant section has been truncated (possibly in the middle of a line) for brevity.
{{/if}}
{{#if is_insert}}
You can't replace {{content_type}}, your answer will be inserted in place of the `<insert_here></insert_here>` tags. Don't include the insert_here tags in your output.
Generate {{content_type}} based on the following prompt:
<prompt>
{{{user_prompt}}}
</prompt>
Match the indentation in the original file in the inserted {{content_type}}, don't include any indentation on blank lines.
Immediately start with the following format with no remarks:
```
{{INSERTED_CODE}}
```
{{else}}
Edit the section of {{content_type}} in <rewrite_this></rewrite_this> tags based on the following prompt:
<prompt>
{{{user_prompt}}}
</prompt>
{{#if rewrite_section}}
And here's the section to rewrite based on that prompt again for reference:
<rewrite_this>
{{{rewrite_section}}}
</rewrite_this>
{{/if}}
Only make changes that are necessary to fulfill the prompt, leave everything else as-is. All surrounding {{content_type}} will be preserved.
Start at the indentation level in the original file in the rewritten {{content_type}}. Don't stop until you've rewritten the entire section, even if you have no more changes to make, always write out the whole section with no unnecessary elisions.
Immediately start with the following format with no remarks:
```
{{REWRITTEN_CODE}}
```
{{/if}}


@@ -0,0 +1,18 @@
You are an expert terminal user.
You will be given a description of a command and you need to respond with a command that matches the description.
Do not include markdown blocks or any other text formatting in your response, always respond with a single command that can be executed in the given shell.
Current OS name is '{{os}}', architecture is '{{arch}}'.
{{#if shell}}
Current shell is '{{shell}}'.
{{/if}}
{{#if working_directory}}
Current working directory is '{{working_directory}}'.
{{/if}}
{{#if latest_output}}
Latest non-empty terminal output:
{{#each latest_output as |line|}}
{{line}}
{{/each}}
{{/if}}
Here is the description of the command:
{{{user_prompt}}}


@@ -11,6 +11,7 @@ use rust_embed::RustEmbed;
#[include = "themes/**/*"]
#[exclude = "themes/src/*"]
#[include = "sounds/**/*"]
#[include = "prompts/**/*"]
#[include = "*.md"]
#[exclude = "*.DS_Store"]
pub struct Assets;


@@ -39,6 +39,7 @@ fs.workspace = true
futures.workspace = true
fuzzy.workspace = true
gpui.workspace = true
handlebars.workspace = true
heed.workspace = true
html_to_markdown.workspace = true
http_client.workspace = true


@@ -35,6 +35,7 @@ use slash_command::{
};
use std::sync::Arc;
pub(crate) use streaming_diff::*;
use util::ResultExt;
actions!(
assistant,
@@ -197,8 +198,17 @@ pub fn init(fs: Arc<dyn Fs>, client: Arc<Client>, cx: &mut AppContext) {
assistant_slash_command::init(cx);
register_slash_commands(cx);
assistant_panel::init(cx);
inline_assistant::init(fs.clone(), client.telemetry().clone(), cx);
terminal_inline_assistant::init(fs.clone(), client.telemetry().clone(), cx);
if let Some(prompt_builder) = prompts::PromptBuilder::new(Some((fs.clone(), cx))).log_err() {
let prompt_builder = Arc::new(prompt_builder);
inline_assistant::init(
fs.clone(),
prompt_builder.clone(),
client.telemetry().clone(),
cx,
);
terminal_inline_assistant::init(fs.clone(), prompt_builder, client.telemetry().clone(), cx);
}
IndexedDocsRegistry::init_global(cx);
CommandPaletteFilter::update_global(cx, |filter, _cx| {


@@ -1,5 +1,5 @@
use crate::{
humanize_token_count, prompts::generate_content_prompt, AssistantPanel, AssistantPanelEvent,
humanize_token_count, prompts::PromptBuilder, AssistantPanel, AssistantPanelEvent,
CharOperation, LineDiff, LineOperation, ModelSelector, StreamingDiff,
};
use anyhow::{anyhow, Context as _, Result};
@@ -51,8 +51,13 @@ use ui::{prelude::*, CheckboxWithLabel, IconButtonShape, Popover, Tooltip};
use util::{RangeExt, ResultExt};
use workspace::{notifications::NotificationId, Toast, Workspace};
pub fn init(fs: Arc<dyn Fs>, telemetry: Arc<Telemetry>, cx: &mut AppContext) {
cx.set_global(InlineAssistant::new(fs, telemetry));
pub fn init(
fs: Arc<dyn Fs>,
prompt_builder: Arc<PromptBuilder>,
telemetry: Arc<Telemetry>,
cx: &mut AppContext,
) {
cx.set_global(InlineAssistant::new(fs, prompt_builder, telemetry));
}
const PROMPT_HISTORY_MAX_LEN: usize = 20;
@@ -64,6 +69,7 @@ pub struct InlineAssistant {
assists_by_editor: HashMap<WeakView<Editor>, EditorInlineAssists>,
assist_groups: HashMap<InlineAssistGroupId, InlineAssistGroup>,
prompt_history: VecDeque<String>,
prompt_builder: Arc<PromptBuilder>,
telemetry: Option<Arc<Telemetry>>,
fs: Arc<dyn Fs>,
}
@@ -71,7 +77,11 @@ pub struct InlineAssistant {
impl Global for InlineAssistant {}
impl InlineAssistant {
pub fn new(fs: Arc<dyn Fs>, telemetry: Arc<Telemetry>) -> Self {
pub fn new(
fs: Arc<dyn Fs>,
prompt_builder: Arc<PromptBuilder>,
telemetry: Arc<Telemetry>,
) -> Self {
Self {
next_assist_id: InlineAssistId::default(),
next_assist_group_id: InlineAssistGroupId::default(),
@@ -79,6 +89,7 @@ impl InlineAssistant {
assists_by_editor: HashMap::default(),
assist_groups: HashMap::default(),
prompt_history: VecDeque::default(),
prompt_builder,
telemetry: Some(telemetry),
fs,
}
@@ -155,6 +166,7 @@ impl InlineAssistant {
range.clone(),
None,
self.telemetry.clone(),
self.prompt_builder.clone(),
cx,
)
});
@@ -260,6 +272,7 @@ impl InlineAssistant {
range.clone(),
initial_transaction_id,
self.telemetry.clone(),
self.prompt_builder.clone(),
cx,
)
});
@@ -2021,6 +2034,7 @@ pub struct Codegen {
diff: Diff,
telemetry: Option<Arc<Telemetry>>,
_subscription: gpui::Subscription,
builder: Arc<PromptBuilder>,
}
pub enum CodegenStatus {
@@ -2050,6 +2064,7 @@ impl Codegen {
range: Range<Anchor>,
initial_transaction_id: Option<TransactionId>,
telemetry: Option<Arc<Telemetry>>,
builder: Arc<PromptBuilder>,
cx: &mut ModelContext<Self>,
) -> Self {
let snapshot = buffer.read(cx).snapshot(cx);
@@ -2087,6 +2102,7 @@ impl Codegen {
telemetry,
_subscription: cx.subscribe(&buffer, Self::handle_buffer_event),
initial_transaction_id,
builder,
}
}
@@ -2118,7 +2134,10 @@ impl Codegen {
) -> BoxFuture<'static, Result<usize>> {
if let Some(model) = LanguageModelRegistry::read_global(cx).active_model() {
let request = self.build_request(user_prompt, assistant_panel_context, edit_range, cx);
model.count_tokens(request, cx)
match request {
Ok(request) => model.count_tokens(request, cx),
Err(error) => futures::future::ready(Err(error)).boxed(),
}
} else {
future::ready(Err(anyhow!("no active model"))).boxed()
}
@@ -2152,7 +2171,8 @@ impl Codegen {
async { Ok(stream::empty().boxed()) }.boxed_local()
} else {
let request =
self.build_request(user_prompt, assistant_panel_context, edit_range.clone(), cx);
self.build_request(user_prompt, assistant_panel_context, edit_range.clone(), cx)?;
let chunks =
cx.spawn(|_, cx| async move { model.stream_completion(request, &cx).await });
async move { Ok(chunks.await?.boxed()) }.boxed_local()
@@ -2167,7 +2187,7 @@ impl Codegen {
assistant_panel_context: Option<LanguageModelRequest>,
edit_range: Range<Anchor>,
cx: &AppContext,
) -> LanguageModelRequest {
) -> Result<LanguageModelRequest> {
let buffer = self.buffer.read(cx).snapshot(cx);
let language = buffer.language_at(edit_range.start);
let language_name = if let Some(language) = language.as_ref() {
@@ -2202,12 +2222,15 @@ impl Codegen {
if start_buffer.remote_id() == end_buffer.remote_id() {
(start_buffer.clone(), start_buffer_offset..end_buffer_offset)
} else {
panic!("invalid transformation range");
return Err(anyhow::anyhow!("invalid transformation range"));
}
} else {
panic!("invalid transformation range");
return Err(anyhow::anyhow!("invalid transformation range"));
};
let prompt = generate_content_prompt(user_prompt, language_name, buffer, range);
let prompt = self
.builder
.generate_content_prompt(user_prompt, language_name, buffer, range)
.map_err(|e| anyhow::anyhow!("Failed to generate content prompt: {}", e))?;
let mut messages = Vec::new();
if let Some(context_request) = assistant_panel_context {
@@ -2219,11 +2242,11 @@ impl Codegen {
content: prompt,
});
LanguageModelRequest {
Ok(LanguageModelRequest {
messages,
stop: vec!["|END|>".to_string()],
temperature,
}
})
}
pub fn handle_stream(
@@ -2752,8 +2775,17 @@ mod tests {
let snapshot = buffer.snapshot(cx);
snapshot.anchor_before(Point::new(1, 0))..snapshot.anchor_after(Point::new(4, 5))
});
let codegen =
cx.new_model(|cx| Codegen::new(buffer.clone(), range.clone(), None, None, cx));
let prompt_builder = Arc::new(PromptBuilder::new(None).unwrap());
let codegen = cx.new_model(|cx| {
Codegen::new(
buffer.clone(),
range.clone(),
None,
None,
prompt_builder,
cx,
)
});
let (chunks_tx, chunks_rx) = mpsc::unbounded();
codegen.update(cx, |codegen, cx| {
@@ -2815,8 +2847,17 @@ mod tests {
let snapshot = buffer.snapshot(cx);
snapshot.anchor_before(Point::new(1, 6))..snapshot.anchor_after(Point::new(1, 6))
});
let codegen =
cx.new_model(|cx| Codegen::new(buffer.clone(), range.clone(), None, None, cx));
let prompt_builder = Arc::new(PromptBuilder::new(None).unwrap());
let codegen = cx.new_model(|cx| {
Codegen::new(
buffer.clone(),
range.clone(),
None,
None,
prompt_builder,
cx,
)
});
let (chunks_tx, chunks_rx) = mpsc::unbounded();
codegen.update(cx, |codegen, cx| {
@@ -2881,8 +2922,17 @@ mod tests {
let snapshot = buffer.snapshot(cx);
snapshot.anchor_before(Point::new(1, 2))..snapshot.anchor_after(Point::new(1, 2))
});
let codegen =
cx.new_model(|cx| Codegen::new(buffer.clone(), range.clone(), None, None, cx));
let prompt_builder = Arc::new(PromptBuilder::new(None).unwrap());
let codegen = cx.new_model(|cx| {
Codegen::new(
buffer.clone(),
range.clone(),
None,
None,
prompt_builder,
cx,
)
});
let (chunks_tx, chunks_rx) = mpsc::unbounded();
codegen.update(cx, |codegen, cx| {
@@ -2946,8 +2996,17 @@ mod tests {
let snapshot = buffer.snapshot(cx);
snapshot.anchor_before(Point::new(0, 0))..snapshot.anchor_after(Point::new(4, 2))
});
let codegen =
cx.new_model(|cx| Codegen::new(buffer.clone(), range.clone(), None, None, cx));
let prompt_builder = Arc::new(PromptBuilder::new(None).unwrap());
let codegen = cx.new_model(|cx| {
Codegen::new(
buffer.clone(),
range.clone(),
None,
None,
prompt_builder,
cx,
)
});
let (chunks_tx, chunks_rx) = mpsc::unbounded();
codegen.update(cx, |codegen, cx| {


@@ -1,153 +1,239 @@
use assets::Assets;
use fs::Fs;
use futures::StreamExt;
use handlebars::{Handlebars, RenderError, TemplateError};
use language::BufferSnapshot;
use std::{fmt::Write, ops::Range};
use parking_lot::Mutex;
use serde::Serialize;
use std::{ops::Range, sync::Arc, time::Duration};
use util::ResultExt;
pub fn generate_content_prompt(
user_prompt: String,
language_name: Option<&str>,
buffer: BufferSnapshot,
range: Range<usize>,
) -> String {
let mut prompt = String::new();
#[derive(Serialize)]
pub struct ContentPromptContext {
pub content_type: String,
pub language_name: Option<String>,
pub is_insert: bool,
pub is_truncated: bool,
pub document_content: String,
pub user_prompt: String,
pub rewrite_section: Option<String>,
}
let content_type = match language_name {
None | Some("Markdown" | "Plain Text") => {
writeln!(
prompt,
"Here's a file of text that I'm going to ask you to make an edit to."
)
.unwrap();
"text"
#[derive(Serialize)]
pub struct TerminalAssistantPromptContext {
pub os: String,
pub arch: String,
pub shell: Option<String>,
pub working_directory: Option<String>,
pub latest_output: Vec<String>,
pub user_prompt: String,
}
pub struct PromptBuilder {
handlebars: Arc<Mutex<Handlebars<'static>>>,
}
impl PromptBuilder {
pub fn new(
fs_and_cx: Option<(Arc<dyn Fs>, &gpui::AppContext)>,
) -> Result<Self, Box<TemplateError>> {
let mut handlebars = Handlebars::new();
Self::register_templates(&mut handlebars)?;
let handlebars = Arc::new(Mutex::new(handlebars));
if let Some((fs, cx)) = fs_and_cx {
Self::watch_fs_for_template_overrides(fs, cx, handlebars.clone());
}
Some(language_name) => {
writeln!(
prompt,
"Here's a file of {language_name} that I'm going to ask you to make an edit to."
Ok(Self { handlebars })
}
fn watch_fs_for_template_overrides(
fs: Arc<dyn Fs>,
cx: &gpui::AppContext,
handlebars: Arc<Mutex<Handlebars<'static>>>,
) {
let templates_dir = paths::prompt_templates_dir();
cx.background_executor()
.spawn(async move {
// Create the prompt templates directory if it doesn't exist
if !fs.is_dir(templates_dir).await {
if let Err(e) = fs.create_dir(templates_dir).await {
log::error!("Failed to create prompt templates directory: {}", e);
return;
}
}
// Initial scan of the prompts directory
if let Ok(mut entries) = fs.read_dir(templates_dir).await {
while let Some(Ok(file_path)) = entries.next().await {
if file_path.to_string_lossy().ends_with(".hbs") {
if let Ok(content) = fs.load(&file_path).await {
let file_name = file_path.file_stem().unwrap().to_string_lossy();
match handlebars.lock().register_template_string(&file_name, content) {
Ok(_) => {
log::info!(
"Successfully registered template override: {} ({})",
file_name,
file_path.display()
);
},
Err(e) => {
log::error!(
"Failed to register template during initial scan: {} ({})",
e,
file_path.display()
);
},
}
}
}
}
}
// Watch for changes
let (mut changes, watcher) = fs.watch(templates_dir, Duration::from_secs(1)).await;
while let Some(changed_paths) = changes.next().await {
for changed_path in changed_paths {
if changed_path.extension().map_or(false, |ext| ext == "hbs") {
log::info!("Reloading template: {}", changed_path.display());
if let Some(content) = fs.load(&changed_path).await.log_err() {
let file_name = changed_path.file_stem().unwrap().to_string_lossy();
let file_path = changed_path.to_string_lossy();
match handlebars.lock().register_template_string(&file_name, content) {
Ok(_) => log::info!(
"Successfully reloaded template: {} ({})",
file_name,
file_path
),
Err(e) => log::error!(
"Failed to register template: {} ({})",
e,
file_path
),
}
}
}
}
}
drop(watcher);
})
.detach();
}
fn register_templates(handlebars: &mut Handlebars) -> Result<(), Box<TemplateError>> {
let content_prompt = Assets::get("prompts/content_prompt.hbs")
.expect("Content prompt template not found")
.data;
let terminal_assistant_prompt = Assets::get("prompts/terminal_assistant_prompt.hbs")
.expect("Terminal assistant prompt template not found")
.data;
handlebars
.register_template_string("content_prompt", String::from_utf8_lossy(&content_prompt))
.map_err(Box::new)?;
handlebars
.register_template_string(
"terminal_assistant_prompt",
String::from_utf8_lossy(&terminal_assistant_prompt),
)
.unwrap();
"code"
.map_err(Box::new)?;
Ok(())
}
pub fn generate_content_prompt(
&self,
user_prompt: String,
language_name: Option<&str>,
buffer: BufferSnapshot,
range: Range<usize>,
) -> Result<String, RenderError> {
let content_type = match language_name {
None | Some("Markdown" | "Plain Text") => "text",
Some(_) => "code",
};
const MAX_CTX: usize = 50000;
let is_insert = range.is_empty();
let mut is_truncated = false;
let before_range = 0..range.start;
let truncated_before = if before_range.len() > MAX_CTX {
is_truncated = true;
range.start - MAX_CTX..range.start
} else {
before_range
};
let after_range = range.end..buffer.len();
let truncated_after = if after_range.len() > MAX_CTX {
is_truncated = true;
range.end..range.end + MAX_CTX
} else {
after_range
};
let mut document_content = String::new();
for chunk in buffer.text_for_range(truncated_before) {
document_content.push_str(chunk);
}
};
const MAX_CTX: usize = 50000;
let mut is_truncated = false;
if range.is_empty() {
prompt.push_str("The point you'll need to insert at is marked with <insert_here></insert_here>.\n\n<document>");
} else {
prompt.push_str("The section you'll need to rewrite is marked with <rewrite_this></rewrite_this> tags.\n\n<document>");
}
// Include file content.
let before_range = 0..range.start;
let truncated_before = if before_range.len() > MAX_CTX {
is_truncated = true;
range.start - MAX_CTX..range.start
} else {
before_range
};
let mut non_rewrite_len = truncated_before.len();
for chunk in buffer.text_for_range(truncated_before) {
prompt.push_str(chunk);
}
if !range.is_empty() {
prompt.push_str("<rewrite_this>\n");
for chunk in buffer.text_for_range(range.clone()) {
prompt.push_str(chunk);
}
prompt.push_str("\n<rewrite_this>");
} else {
prompt.push_str("<insert_here></insert_here>");
}
let after_range = range.end..buffer.len();
let truncated_after = if after_range.len() > MAX_CTX {
is_truncated = true;
range.end..range.end + MAX_CTX
} else {
after_range
};
non_rewrite_len += truncated_after.len();
for chunk in buffer.text_for_range(truncated_after) {
prompt.push_str(chunk);
}
write!(prompt, "</document>\n\n").unwrap();
if is_truncated {
writeln!(prompt, "The context around the relevant section has been truncated (possibly in the middle of a line) for brevity.\n").unwrap();
}
if range.is_empty() {
writeln!(
prompt,
"You can't replace {content_type}, your answer will be inserted in place of the `<insert_here></insert_here>` tags. Don't include the insert_here tags in your output.",
)
.unwrap();
writeln!(
prompt,
"Generate {content_type} based on the following prompt:\n\n<prompt>\n{user_prompt}\n</prompt>",
)
.unwrap();
writeln!(prompt, "Match the indentation in the original file in the inserted {content_type}, don't include any indentation on blank lines.\n").unwrap();
prompt.push_str("Immediately start with the following format with no remarks:\n\n```\n{{INSERTED_CODE}}\n```");
} else {
writeln!(prompt, "Edit the section of {content_type} in <rewrite_this></rewrite_this> tags based on the following prompt:'").unwrap();
writeln!(prompt, "\n<prompt>\n{user_prompt}\n</prompt>\n").unwrap();
let rewrite_len = range.end - range.start;
if rewrite_len < 20000 && rewrite_len * 2 < non_rewrite_len {
writeln!(prompt, "And here's the section to rewrite based on that prompt again for reference:\n\n<rewrite_this>\n").unwrap();
if is_insert {
document_content.push_str("<insert_here></insert_here>");
} else {
document_content.push_str("<rewrite_this>\n");
for chunk in buffer.text_for_range(range.clone()) {
prompt.push_str(chunk);
document_content.push_str(chunk);
}
writeln!(prompt, "\n</rewrite_this>\n").unwrap();
document_content.push_str("\n</rewrite_this>");
}
writeln!(prompt, "Only make changes that are necessary to fulfill the prompt, leave everything else as-is. All surrounding {content_type} will be preserved.\n").unwrap();
write!(
prompt,
"Start at the indentation level in the original file in the rewritten {content_type}. "
)
.unwrap();
prompt.push_str("Don't stop until you've rewritten the entire section, even if you have no more changes to make, always write out the whole section with no unnecessary elisions.");
prompt.push_str("\n\nImmediately start with the following format with no remarks:\n\n```\n{{REWRITTEN_CODE}}\n```");
for chunk in buffer.text_for_range(truncated_after) {
document_content.push_str(chunk);
}
let rewrite_section = if !is_insert {
let mut section = String::new();
for chunk in buffer.text_for_range(range.clone()) {
section.push_str(chunk);
}
Some(section)
} else {
None
};
let context = ContentPromptContext {
content_type: content_type.to_string(),
language_name: language_name.map(|s| s.to_string()),
is_insert,
is_truncated,
document_content,
user_prompt,
rewrite_section,
};
self.handlebars.lock().render("content_prompt", &context)
}
prompt
}
pub fn generate_terminal_assistant_prompt(
&self,
user_prompt: &str,
shell: Option<&str>,
working_directory: Option<&str>,
latest_output: &[String],
) -> Result<String, RenderError> {
let context = TerminalAssistantPromptContext {
os: std::env::consts::OS.to_string(),
arch: std::env::consts::ARCH.to_string(),
shell: shell.map(|s| s.to_string()),
working_directory: working_directory.map(|s| s.to_string()),
latest_output: latest_output.to_vec(),
user_prompt: user_prompt.to_string(),
};
pub fn generate_terminal_assistant_prompt(
user_prompt: &str,
shell: Option<&str>,
working_directory: Option<&str>,
latest_output: &[String],
) -> String {
let mut prompt = String::new();
writeln!(&mut prompt, "You are an expert terminal user.").unwrap();
writeln!(&mut prompt, "You will be given a description of a command and you need to respond with a command that matches the description.").unwrap();
writeln!(&mut prompt, "Do not include markdown blocks or any other text formatting in your response, always respond with a single command that can be executed in the given shell.").unwrap();
writeln!(
&mut prompt,
"Current OS name is '{}', architecture is '{}'.",
std::env::consts::OS,
std::env::consts::ARCH,
)
.unwrap();
if let Some(shell) = shell {
writeln!(&mut prompt, "Current shell is '{shell}'.").unwrap();
self.handlebars
.lock()
.render("terminal_assistant_prompt", &context)
}
if let Some(working_directory) = working_directory {
writeln!(
&mut prompt,
"Current working directory is '{working_directory}'."
)
.unwrap();
}
if !latest_output.is_empty() {
writeln!(
&mut prompt,
"Latest non-empty {} lines of the terminal output: {:?}",
latest_output.len(),
latest_output
)
.unwrap();
}
writeln!(&mut prompt, "Here is the description of the command:").unwrap();
prompt.push_str(user_prompt);
prompt
}


@@ -1,6 +1,6 @@
use crate::{
humanize_token_count, prompts::generate_terminal_assistant_prompt, AssistantPanel,
AssistantPanelEvent, ModelSelector, DEFAULT_CONTEXT_LINES,
humanize_token_count, prompts::PromptBuilder, AssistantPanel, AssistantPanelEvent,
ModelSelector, DEFAULT_CONTEXT_LINES,
};
use anyhow::{Context as _, Result};
use client::telemetry::Telemetry;
@@ -32,8 +32,13 @@ use ui::{prelude::*, IconButtonShape, Tooltip};
use util::ResultExt;
use workspace::{notifications::NotificationId, Toast, Workspace};
pub fn init(fs: Arc<dyn Fs>, telemetry: Arc<Telemetry>, cx: &mut AppContext) {
cx.set_global(TerminalInlineAssistant::new(fs, telemetry));
pub fn init(
fs: Arc<dyn Fs>,
prompt_builder: Arc<PromptBuilder>,
telemetry: Arc<Telemetry>,
cx: &mut AppContext,
) {
cx.set_global(TerminalInlineAssistant::new(fs, prompt_builder, telemetry));
}
const PROMPT_HISTORY_MAX_LEN: usize = 20;
@@ -55,18 +60,24 @@ pub struct TerminalInlineAssistant {
prompt_history: VecDeque<String>,
telemetry: Option<Arc<Telemetry>>,
fs: Arc<dyn Fs>,
prompt_builder: Arc<PromptBuilder>,
}
impl Global for TerminalInlineAssistant {}
impl TerminalInlineAssistant {
pub fn new(fs: Arc<dyn Fs>, telemetry: Arc<Telemetry>) -> Self {
pub fn new(
fs: Arc<dyn Fs>,
prompt_builder: Arc<PromptBuilder>,
telemetry: Arc<Telemetry>,
) -> Self {
Self {
next_assist_id: TerminalInlineAssistId::default(),
assists: HashMap::default(),
prompt_history: VecDeque::default(),
telemetry: Some(telemetry),
fs,
prompt_builder,
}
}
@@ -246,7 +257,7 @@ impl TerminalInlineAssistant {
None
};
let prompt = generate_terminal_assistant_prompt(
let prompt = self.prompt_builder.generate_terminal_assistant_prompt(
&assist
.prompt_editor
.clone()
@@ -256,7 +267,7 @@ impl TerminalInlineAssistant {
shell.as_deref(),
working_directory.as_deref(),
&latest_output,
);
)?;
let mut messages = Vec::new();
if let Some(context_request) = context_request {


@@ -184,6 +184,20 @@ pub fn prompts_dir() -> &'static PathBuf {
})
}
/// Returns the path to the prompt templates directory.
///
/// This is where the built-in prompt templates for core features can be overridden with custom templates.
pub fn prompt_templates_dir() -> &'static PathBuf {
static PROMPT_TEMPLATES_DIR: OnceLock<PathBuf> = OnceLock::new();
PROMPT_TEMPLATES_DIR.get_or_init(|| {
if cfg!(target_os = "macos") {
config_dir().join("prompts").join("templates")
} else {
support_dir().join("prompts").join("templates")
}
})
}
/// Returns the path to the semantic search's embeddings directory.
///
/// This is where the embeddings used to power semantic search are stored.


@@ -151,3 +151,76 @@ To create a custom keybinding that prefills a prompt, you can add the following
}
]
```
## Advanced: Overriding prompt templates
Zed allows you to override the default prompts used for various assistant features by placing custom Handlebars (.hbs) templates in your `~/.config/zed/prompts/templates` directory. The following templates can be overridden:
1. `content_prompt.hbs`: Used for generating content in the editor.
Format:
```handlebars
You are an AI programming assistant. Your task is to
{{#if is_insert}}insert{{else}}rewrite{{/if}}
{{content_type}}{{#if language_name}} in {{language_name}}{{/if}}
based on the following context and user request. Context:
{{#if is_truncated}}
[Content truncated...]
{{/if}}
{{document_content}}
{{#if is_truncated}}
[Content truncated...]
{{/if}}
User request:
{{user_prompt}}
{{#if rewrite_section}}
Please rewrite the section enclosed in
<rewrite_this></rewrite_this>
tags.
{{else}}
Please insert your response at the
<insert_here></insert_here>
tag.
{{/if}}
Provide only the
{{content_type}}
content in your response, without any additional explanation.
```
2. `terminal_assistant_prompt.hbs`: Used for the terminal assistant feature.
Format:
```handlebars
You are an AI assistant for a terminal emulator. Provide helpful responses to
user queries about terminal commands, file systems, and general computer
usage. System information: - Operating System:
{{os}}
- Architecture:
{{arch}}
{{#if shell}}
- Shell:
{{shell}}
{{/if}}
{{#if working_directory}}
- Current Working Directory:
{{working_directory}}
{{/if}}
Latest terminal output:
{{#each latest_output}}
{{this}}
{{/each}}
User query:
{{user_prompt}}
Provide a clear and concise response to the user's query, considering the
given system information and latest terminal output if relevant.
```
You can customize these templates to better suit your needs while maintaining the core structure and variables used by Zed. Zed will automatically reload your prompt overrides when they change on disk.
Be sure you want to override these, as you'll miss out on iteration on our built-in features. This should primarily be used when developing Zed.
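
To sanity-check an override before dropping it into `~/.config/zed/prompts/templates`, you can render it locally with the same crate Zed uses. Below is a minimal sketch, assuming the `handlebars` and `serde_json` crates and a hypothetical local copy of `terminal_assistant_prompt.hbs`; the variable names match the fields of `TerminalAssistantPromptContext` that Zed serializes:

```rust
use handlebars::Handlebars;
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Hypothetical local copy of the override you're editing.
    let template = std::fs::read_to_string("terminal_assistant_prompt.hbs")?;

    let mut handlebars = Handlebars::new();
    handlebars.register_template_string("terminal_assistant_prompt", template)?;

    // The same variables Zed fills in from TerminalAssistantPromptContext.
    let context = json!({
        "os": "linux",
        "arch": "x86_64",
        "shell": "zsh",
        "working_directory": "/home/me/project",
        "latest_output": ["cargo: command not found"],
        "user_prompt": "install rust with rustup"
    });

    println!("{}", handlebars.render("terminal_assistant_prompt", &context)?);
    Ok(())
}
```

If Handlebars reports a syntax error here, Zed will hit the same error when it tries to register the override and will log it rather than applying the template.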