45 changes: 45 additions & 0 deletions src/tokenizer.rs
@@ -1920,6 +1920,21 @@ impl<'a> Tokenizer<'a> {

chars.next();

// Handle ${placeholder} syntax
Contributor: Could we include a reference here to the dialect that defines this syntax, maybe?

Author: This is a tough one, as it's not defined by a specific dialect, but it is used in some of our other integrations around parametrised queries. Namely, we're using this with perses.

if matches!(chars.peek(), Some('{')) {
Contributor: Wondering how this works with the `self.dialect.supports_dollar_placeholder()` dialects. Is it possible that e.g. `${abc}$` is part of a dollar string or other potentially conflicting syntax?

Author (@cetra3, Feb 27, 2026): I could possibly add this as another flag to the dialect if that makes sense? I just thought it would be yet another flag, and hoped we could fold it into the existing one, but you're right: it means this will parse as valid in some dialects even when it isn't valid for that dialect.
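A minimal sketch of what such an opt-in flag could look like, mirroring the existing `supports_dollar_placeholder()` pattern. The `Dialect` trait below is a stand-in for illustration, not the actual sqlparser-rs trait, and `supports_dollar_brace_placeholder` is a hypothetical name:

```rust
#![allow(dead_code)]

// Hypothetical stand-in for the sqlparser-rs Dialect trait.
trait Dialect {
    // Existing-style flag for `$1` / `$name` placeholders.
    fn supports_dollar_placeholder(&self) -> bool {
        false
    }
    // Proposed flag for `${name}` brace placeholders (assumed name).
    fn supports_dollar_brace_placeholder(&self) -> bool {
        false
    }
}

struct GenericDialect;
struct PostgreSqlDialect;

impl Dialect for GenericDialect {
    fn supports_dollar_placeholder(&self) -> bool {
        true
    }
    fn supports_dollar_brace_placeholder(&self) -> bool {
        true
    }
}

// A dialect that keeps the defaults would leave `${...}` untokenized
// as a placeholder.
impl Dialect for PostgreSqlDialect {}

fn main() {
    assert!(GenericDialect.supports_dollar_brace_placeholder());
    assert!(!PostgreSqlDialect.supports_dollar_brace_placeholder());
}
```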

Contributor: I think if the syntax potentially conflicts we might need to ensure that it doesn't, because in this case `${abc}$` in the example will be parsed incorrectly as something that it is not (a placeholder vs. a string).

Author: There is some existing behaviour whereby `$placeholder` currently works with Postgres and other dialects even though `supports_dollar_placeholder()` is false: it falls through to the placeholder fallback when the next character after the identifier isn't `$`.

The `${placeholder}` branch is consistent with that existing behaviour, and there's no conflict with dollar-quoted strings, since `{` is not a valid tag character: dollar-quoted string tags only allow alphanumeric characters and underscores (e.g., `$tag_1$...$tag_1$`), so `${abc}$` could never be interpreted as a dollar-quoted string.
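The disambiguation argument above can be sketched as follows. This is an illustrative helper with an assumed name, not a function from the sqlparser-rs codebase:

```rust
// Dollar-quoted string tags (PostgreSQL-style `$tag$ ... $tag$`) allow
// only alphanumerics and underscores.
fn is_dollar_tag_char(c: char) -> bool {
    c.is_alphanumeric() || c == '_'
}

fn main() {
    // `${abc}$` has `{` right after the `$`, which can never start a
    // dollar-quote tag, so it is unambiguously the brace-placeholder form.
    assert!(!is_dollar_tag_char('{'));
    // `$tag_1$` starts with valid tag characters and may open a string.
    assert!(is_dollar_tag_char('t'));
    assert!(is_dollar_tag_char('_'));
    assert!(is_dollar_tag_char('1'));
}
```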

    chars.next(); // consume '{'
    let placeholder = peeking_take_while(chars, |ch| ch != '}');
    if matches!(chars.peek(), Some('}')) {
        chars.next(); // consume '}'
        return Ok(Token::Placeholder(format!("${{{placeholder}}}")));
    } else {
        return self.tokenizer_error(
            chars.location(),
            "Unterminated dollar-brace placeholder, expected '}'",
        );
    }
}

// If the dialect does not support dollar-quoted strings, then `$$` is rather a placeholder.
if matches!(chars.peek(), Some('$')) && !self.dialect.supports_dollar_placeholder() {
chars.next();
@@ -3218,6 +3233,36 @@ mod tests {
);
}

#[test]
fn tokenize_dollar_brace_placeholder() {
let sql = String::from("SELECT ${name}, ${1}");
let dialect = GenericDialect {};
let tokens = Tokenizer::new(&dialect, &sql).tokenize().unwrap();
assert_eq!(
tokens,
vec![
Token::make_keyword("SELECT"),
Token::Whitespace(Whitespace::Space),
Token::Placeholder("${name}".into()),
Token::Comma,
Token::Whitespace(Whitespace::Space),
Token::Placeholder("${1}".into()),
]
);
}

#[test]
fn tokenize_dollar_brace_placeholder_unterminated() {
let sql = String::from("SELECT ${name");
let dialect = GenericDialect {};
let result = Tokenizer::new(&dialect, &sql).tokenize();
assert!(result.is_err());
let err = result.unwrap_err();
assert!(err
.to_string()
.contains("Unterminated dollar-brace placeholder"));
}

#[test]
fn tokenize_nested_dollar_quoted_strings() {
let sql = String::from("SELECT $tag$dollar $nested$ string$tag$");
5 changes: 4 additions & 1 deletion tests/sqlparser_common.rs
@@ -10415,7 +10415,7 @@ fn test_placeholder() {
})
);

let sql = "SELECT $fromage_français, :x, ?123";
let sql = "SELECT $fromage_français, :x, ?123, ${placeholder}";
let ast = dialects.verified_only_select(sql);
assert_eq!(
ast.projection,
@@ -10429,6 +10429,9 @@
UnnamedExpr(Expr::Value(
(Value::Placeholder("?123".into())).with_empty_span()
)),
UnnamedExpr(Expr::Value(
(Value::Placeholder("${placeholder}".into())).with_empty_span()
)),
]
);
}