As many macros as possible are moved to Macros.h, while the
macros to create a test case are moved to TestCase.h. TestCase is now
the only user-facing header for creating a test case. TestSuite and its
helpers have moved into a .cpp file. Instead of requiring a TEST_MAIN
macro to be instantiated in the test file, a TestMain.cpp file is
provided and linked into each test executable. This has the nice side
effect that splitting test cases across multiple files is as simple as
adding them all to the same executable.
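For illustration, a test file under this layout reduces to something like
the following (the include path is whichever location TestCase.h ends up
at; TEST_CASE and the EXPECT macros come from the split-out headers):

#include <LibTest/TestCase.h>

// main() is not defined here; it comes from the shared TestMain.cpp that
// the build links into every test executable.
TEST_CASE(addition)
{
    int sum = 1 + 2;
    EXPECT_EQ(sum, 3);
    EXPECT(sum > 0);
}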
The test main should be portable to kernel mode as well, so if
there's a set of tests that should be run in self-test mode in kernel
space, we can accommodate that.
A new serenity_test CMake function streamlines adding a new test, with
arguments for the test source file, the subdirectory under /usr/Tests
into which the test application is installed, and an optional list of
libraries to link against the test application. To accommodate future
tests where the provided TestMain.cpp is not suitable (e.g. test-js), a
CUSTOM_MAIN parameter can be passed to the function to avoid linking
against the boilerplate main function.
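As a usage sketch (the argument spelling here is illustrative of the
description above, not a verbatim copy of the function's signature):

# Build TestFoo.cpp, install it under /usr/Tests/AK, and link LibRegex in
# addition to the defaults (including the shared TestMain.cpp).
serenity_test(TestFoo.cpp AK LIBS LibRegex)

# A test that supplies its own main(), e.g. test-js, passes CUSTOM_MAIN
# (exact form assumed here) so the boilerplate main is not linked in.
serenity_test(test-js.cpp LibJS LIBS LibJS CUSTOM_MAIN test-js.cpp)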
The EXISTS expression is a bit of an odd-man-out because it can appear
in any of the following forms:
EXISTS (select-stmt)
NOT EXISTS (select-stmt)
(select-stmt)
This makes it the only keyword expression that doesn't require its
keyword to actually be used. The consequence is that we might come
across an EXISTS expression while parsing another expression type:
NOT would have triggered a unary operator expression, and an opening
parenthesis would have triggered an expression chain.
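One way to factor this (the names below are an illustrative sketch, not
necessarily what the patch uses) is a single EXISTS parser that all three
entry points can hand off to:

// Called with invert_expression = true from the unary-operator path after
// NOT has been consumed, and with false from the expression-chain path
// when '(' is followed by a SELECT, or when EXISTS itself is matched.
NonnullRefPtr<Expression> Parser::parse_exists_expression(bool invert_expression)
{
    // Only the first two forms actually carry the EXISTS keyword.
    consume_if(TokenType::Exists);

    consume(TokenType::ParenOpen);
    auto select_statement = parse_select_statement();
    consume(TokenType::ParenClose);

    return create_ast_node<ExistsExpression>(move(select_statement), invert_expression);
}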
Currently, every parse_*_statement method ends by parsing a semi-colon.
This prevents statements from being nested, e.g. a SELECT statement
nested inside a CREATE TABLE statement. Move the semi-colon expectation
up and out of the individual statement parsers.
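A sketch of the resulting shape (method names illustrative): the top-level
dispatcher is the only place that expects the trailing semi-colon, so a
statement parser can be reused for nested statements such as
CREATE TABLE ... AS SELECT.

NonnullRefPtr<Statement> Parser::next_statement()
{
    auto statement = parse_statement();
    // The semi-colon lives here now, not in each parse_*_statement().
    consume(TokenType::SemiColon);
    return statement;
}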
Another common semantic is parsing an identifier of the form
"schema_name.table_name" / "table_name". Add a helper to do this work.
This helper does not parse any optional alias after the table name:
some syntaxes specify an alias using the AS keyword, some let the AS
keyword be optional, and others just parse the alias as a bare
identifier. So callers of this helper continue parsing the alias
however their grammar requires.
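A sketch of what such a helper could look like (name and signature are
illustrative):

// Parses "schema_name.table_name" or just "table_name"; any alias that
// follows is left for the caller, since alias syntax differs per statement.
void Parser::parse_schema_and_table_name(String& schema_name, String& table_name)
{
    String first_identifier = consume(TokenType::Identifier).value();

    if (consume_if(TokenType::Period)) {
        schema_name = move(first_identifier);
        table_name = consume(TokenType::Identifier).value();
    } else {
        table_name = move(first_identifier);
    }
}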
A quite common semantic emerged for parsing comma-separated expressions:
consume(TokenType::ParenOpen);
do {
    // do something
    if (!match(TokenType::Comma))
        break;
    consume(TokenType::Comma);
} while (!match(TokenType::Eof));
consume(TokenType::ParenClose);
Add a helper to do the bulk of the while loop.
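One shape the helper could take (a sketch; the real helper may differ) is
a member that owns the comma handling and calls back once per element:

template<typename ParseElement>
void Parser::parse_comma_separated_list(ParseElement&& parse_element)
{
    do {
        parse_element();

        if (!match(TokenType::Comma))
            break;
        consume(TokenType::Comma);
    } while (!match(TokenType::Eof));
}

// The caller keeps the surrounding parentheses, e.g. for a column list:
Vector<String> column_names;
consume(TokenType::ParenOpen);
parse_comma_separated_list([&] { column_names.append(consume(TokenType::Identifier).value()); });
consume(TokenType::ParenClose);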
Statements like SELECT, INSERT, and UPDATE may also include this
common-table-expression list, so move its parsing out of
parse_delete_statement(). Since the list appears before the actual
statement, parse it first in next_statement(), then pass it only to the
statement parsers that are allowed to include the list.
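Extending the earlier next_statement() sketch, the change could look
roughly like this (helper names are illustrative):

NonnullRefPtr<Statement> Parser::next_statement()
{
    // The WITH clause, if present, always precedes the statement itself.
    RefPtr<CommonTableExpressionList> common_table_expression_list;
    if (match(TokenType::With))
        common_table_expression_list = parse_common_table_expression_list();

    // Only the statements that may carry the list (DELETE, SELECT, INSERT,
    // UPDATE) are reachable from the second dispatcher.
    auto statement = common_table_expression_list
        ? parse_statement_with_expression_list(common_table_expression_list.release_nonnull())
        : parse_statement();

    consume(TokenType::SemiColon);
    return statement;
}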
Misread the graph: In the "WITH [RECURSIVE] common-table-expression"
section, common-table-expression is actually a repeating list. This
changes the parser to correctly parse this section as a list. Create a
new AST node, CommonTableExpressionList, to store both this list and the
boolean RECURSIVE attribute (because every statement that uses this list
also includes the RECURSIVE attribute beforehand).
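A sketch of what the new node might hold (member names illustrative):

// Bundles the RECURSIVE flag with the repeated common-table-expressions.
class CommonTableExpressionList : public ASTNode {
public:
    CommonTableExpressionList(bool recursive, NonnullRefPtrVector<CommonTableExpression> common_table_expressions)
        : m_recursive(recursive)
        , m_common_table_expressions(move(common_table_expressions))
    {
    }

    bool recursive() const { return m_recursive; }
    NonnullRefPtrVector<CommonTableExpression> const& common_table_expressions() const { return m_common_table_expressions; }

private:
    bool m_recursive { false };
    NonnullRefPtrVector<CommonTableExpression> m_common_table_expressions;
};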
SPDX License Identifiers are a more compact / standardized
way of representing file license information.
See: https://spdx.dev/resources/use/#identifiers
This was done with the `ambr` search and replace tool.
ambr --no-parent-ignore --key-from-file --rep-from-file key.txt rep.txt *
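For example, a typical file header now reads (the copyright line varies
per file; SerenityOS code is BSD-2-Clause licensed):

/*
 * Copyright (c) 2021, the SerenityOS developers.
 *
 * SPDX-License-Identifier: BSD-2-Clause
 */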
See: https://sqlite.org/lang_expr.html
The entry point for expression parsing, parse_expression(), is not used
by SQL::Parser in this commit, but there's so much here that it's easier
to grok as its own commit.
The following is a common (and soon to be *very* common) expression:
if (match(token_type))
    consume();
Using consume_if() makes this a bit simpler and makes it less likely to
forget to invoke consume() after the match().
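A minimal version of the helper (sketch) just folds the two calls together:

// Consume the current token only if it has the expected type; the return
// value says whether anything was consumed.
bool Parser::consume_if(TokenType expected_type)
{
    if (!match(expected_type))
        return false;

    consume();
    return true;
}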
LibSQL aims to be a SQLite clone for SerenityOS. Step 1 is creating a
tokenizer to lex SQL tokens. This lexer is heavily influenced by the
LibJS lexer.
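As a usage sketch (the token API here is assumed, modeled on the LibJS
lexer's next()/value() shape):

#include <AK/Format.h>
#include <AK/StringView.h>
#include <LibSQL/Lexer.h>

void dump_tokens(StringView sql)
{
    SQL::Lexer lexer(sql);
    // Pull tokens until the lexer reports end of input.
    for (auto token = lexer.next(); token.type() != SQL::TokenType::Eof; token = lexer.next())
        dbgln("{}", token.value());
}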