- Fix bug where `DB_Table` data quality indicators broke deserialization in the table viz.
- Memoize the untrimmed data quality indicator and move it to being an operation and column function.
- If there are more than 10,000 rows, use a sample for the untrimmed check (sketched after this list).
- Add ALIASes for the blank functions.
- Fix Snowflake drill-down.
- Fix a bug with Long and Double columns that have Nothings at the end.
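A minimal sketch of the memoized, sampled check, assuming hypothetical names (a plain `List<String>` stands in for the column; the real logic lives in the Table library):

```java
import java.util.List;

// Illustrative only: memoize the untrimmed indicator, computing it on a
// sample when the column exceeds 10,000 rows.
final class UntrimmedIndicator {
    private Long untrimmedCount; // memoized result, computed at most once

    long get(List<String> values) {
        if (untrimmedCount == null) {
            List<String> toScan = values.size() > 10_000
                ? values.subList(0, 10_000) // large column: scan a sample
                : values;
            untrimmedCount = toScan.stream()
                .filter(v -> v != null && !v.equals(v.trim()))
                .count();
        }
        return untrimmedCount;
    }
}
```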
Replaces the regex-based number parser with a new parser that produces the same results by working out each part of the format as it sees an example of it; a sketch of the idea follows.
Closes #7398 - reading the large CSV now takes about 2s (down from 15-20s).
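A minimal sketch of the example-driven idea, with hypothetical names (`NumberFormatGuesser` is not the actual class):

```java
// Illustrative only: each part of the format is fixed the first time an
// unambiguous example is seen, instead of matching every value against one
// large regex.
final class NumberFormatGuesser {
    Character thousands; // e.g. ',' in "1,234.56"
    Character decimal;   // e.g. '.' in "1,234.56"

    void learnFrom(String example) {
        int comma = example.lastIndexOf(',');
        int dot = example.lastIndexOf('.');
        if (comma >= 0 && dot >= 0) {
            // Both separators present: the later one must be the decimal point.
            decimal = dot > comma ? '.' : ',';
            thousands = dot > comma ? ',' : '.';
        } else if (comma >= 0 && example.indexOf(',') != comma) {
            // A repeated separator can only be grouping thousands.
            thousands = ',';
        }
        // Ambiguous examples (e.g. "1,5") leave the format undecided for now;
        // once decided, later values parse directly against the known format.
    }
}
```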
- Part of #11311
- Adds ability to read a list of files (Vector, Column, Table) into a Vector.
- Reading into a Table of objects, or merged, will come in a follow-up PR.
- Closes #11227
- Additionally, it should fix #11278 by ensuring that every scheduled message goes to the desired endpoint, splitting each batch by endpoint as sketched below.
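A minimal sketch of the splitting, using hypothetical `LogMessage` and `send` names:

```java
import java.net.URI;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Illustrative only: group pending messages by their target endpoint so that
// a single batch never mixes destinations.
record LogMessage(URI endpoint, String payload) {}

final class BatchSender {
    void flush(List<LogMessage> pending) {
        Map<URI, List<LogMessage>> byEndpoint =
            pending.stream().collect(Collectors.groupingBy(LogMessage::endpoint));
        byEndpoint.forEach(this::send); // one batch per endpoint
    }

    private void send(URI endpoint, List<LogMessage> batch) {
        // ... POST the batch to `endpoint` ...
    }
}
```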
- Allow `InterruptedException`s to propagate out of the web requests.
- Use `Type_Error` rather than `Any` when catching auto-scoping resolution errors.
- Rename `Java_Exception` to `JException`
- Fix debug logging for the #11088 case: an attempt to create an exception that is its own cause fails.
- In case the parser is used after closing, throw an `IllegalStateException` instead of undefined behaviour. (This case is not known to occur and doesn't seem to be behind #11121, but we should handle it more safely if it does; see the sketch below.)
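A minimal sketch of such a guard, with assumed field and method names:

```java
// Illustrative only: once closed, any further use fails fast with
// IllegalStateException instead of undefined behaviour.
final class GuardedParser implements AutoCloseable {
    private boolean closed = false;

    Object parse(String input) {
        ensureOpen();
        // ... actual parsing ...
        return null;
    }

    private void ensureOpen() {
        if (closed) {
            throw new IllegalStateException("Parser used after close()");
        }
    }

    @Override
    public void close() {
        closed = true;
        // ... release underlying resources ...
    }
}
```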
* Add today, now and time to Expression.
* Move running and compute into Column as that allows them to be used in expressions.
* Fix bug.
* Fix exports.
* Java fmt.
This implements `DB_Column.with`, which uses `WITH ... AS` SQL clauses to remove duplicates in the generated SQL.
After a discussion with @radeusgd, we concluded that we will probably want a more complete CTE implementation, so this one is useful for now to deal with big queries (like `round`).
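A minimal sketch of the deduplication idea (a hypothetical helper, not the actual Enso SQL generator): identical subquery texts are hoisted into a single `WITH ... AS` binding and referenced by name.

```java
import java.util.LinkedHashMap;
import java.util.Map;

final class CteBuilder {
    // Insertion-ordered, so CTEs appear in first-use order in the WITH clause.
    private final Map<String, String> nameBySql = new LinkedHashMap<>();

    /** Returns the CTE name for this subquery, creating it on first use. */
    String refer(String subquerySql) {
        return nameBySql.computeIfAbsent(subquerySql, sql -> "cte_" + nameBySql.size());
    }

    /** Prepends the WITH clause to the final query. */
    String wrap(String mainQuery) {
        if (nameBySql.isEmpty()) return mainQuery;
        StringBuilder sb = new StringBuilder("WITH ");
        nameBySql.forEach((sql, name) ->
            sb.append(name).append(" AS (").append(sql).append("), "));
        sb.setLength(sb.length() - 2); // drop trailing ", "
        return sb.append(' ').append(mainQuery).toString();
    }
}
```

Keeping the map insertion-ordered matters if later CTEs reference earlier ones.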
# Important Notes
Still to do in this PR:
- [x] Rename `with` to `let` (or something similar)
- [x] tests
- [x] documentation
- [x] remove `State` hack by moving query generation into a class and using a `Ref` field for scoping
Results on `round_float`:
|  | SQL length in characters (unprettified) | SQL length in lines (prettified) |
| --- | --- | --- |
| Without CTEs | 13193 | 851 |
| With CTEs | 3644 | 187 |
Compare the SQL:
[without-ctes.sql.txt](https://github.com/user-attachments/files/16629356/without-ctes.sql.txt)
[with-ctes.sql.txt](https://github.com/user-attachments/files/16629357/with-ctes.sql.txt)
Update, with name shortening:
|  | SQL length in characters (unprettified) | SQL length in lines (prettified) |
| --- | --- | --- |
| Without CTEs | 13193 | 853 |
| With CTEs | 2427 | 176 |
[without-cte.txt](https://github.com/user-attachments/files/16694328/without-cte.txt)
[with-cte.txt](https://github.com/user-attachments/files/16694327/with-cte.txt)
- Enables the `..` autoscoping style for creating Atoms in expressions.
- Adds type checking to methods in columns.
- Auto-wraps values returned from methods in expressions into a column as needed.
- Removes `Time_Period.Day` to remove confusion.
- Adds `Hyper_File`, allowing reading a Tableau Hyper file.
- Can read the schema and table list.
- Can read the structure of a table.
- Can read data into an Enso Table.
Fixes #10609 by rewriting all our upload-related operations to rely on `DDL_Transaction` - an abstraction that handles the 'transactionality' of `CREATE TABLE` statements depending on whether a given backend allows DDL inside transactions (if not, it emulates transactionality by creating the tables outside of the transaction and dropping them on rollback, as sketched below).
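A minimal sketch of the emulation path, with assumed names (not the real `DDL_Transaction` API):

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;

// Illustrative only: create the tables outside the transaction, then run the
// work; on failure, drop the created tables to emulate a DDL rollback.
final class EmulatedDdlTransaction {
    static void run(Connection conn, List<String> createStatements,
                    List<String> tableNames, Runnable work) throws SQLException {
        List<String> created = new ArrayList<>();
        try (Statement st = conn.createStatement()) {
            for (int i = 0; i < createStatements.size(); i++) {
                st.execute(createStatements.get(i));
                created.add(tableNames.get(i));
            }
            work.run(); // the actual data operations, inside the transaction
        } catch (RuntimeException | SQLException e) {
            try (Statement st = conn.createStatement()) {
                for (String t : created) st.execute("DROP TABLE " + t);
            }
            throw e;
        }
    }
}
```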
Enables `engine.TruffleCompilation` in `std-benchmarks`, collects the logs and dumps compilation info to `System.err` when a benchmark is influenced by dynamic compilation.
- Related to #9486
- Ensures that even though an integer column in Snowflake is represented by the `Decimal` type, if the values are small enough, they are materialized as `Integer` (see the sketch after this list).
- If the values are larger, they are still read in as `Decimal`.
- Adds tests for some other `Decimal` edge cases (various precisions and scales), and for `Float`.
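A minimal sketch of the materialization rule, assuming a `long`-range cut-off (the actual threshold lives in the Snowflake dialect):

```java
import java.math.BigDecimal;

// Illustrative only: Snowflake reports integer columns as DECIMAL(38,0), so a
// value is materialized as Integer when it fits in a long, else kept as Decimal.
final class DecimalMaterializer {
    static Object materialize(BigDecimal value) {
        boolean integral = value.stripTrailingZeros().scale() <= 0;
        if (integral
                && value.compareTo(BigDecimal.valueOf(Long.MIN_VALUE)) >= 0
                && value.compareTo(BigDecimal.valueOf(Long.MAX_VALUE)) <= 0) {
            return value.longValueExact(); // small enough: Integer
        }
        return value; // too large (or fractional): stays Decimal
    }
}
```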
- Related to #9486
- Fixes types in literal tables that are used throughout the tests
- Tries to make testing faster by disabling some edge cases, batching some queries, re-using the main connection, and re-using tables more
- Implements date/time type mapping and operations for Snowflake
- Updates type mapping to correctly reflect what Snowflake does
- Disables warnings for Integer->Decimal coercion as that's too annoying and implicitly understood in Snowflake
- Allows selecting a Decimal column with `..By_Type ..Integer` (only in the Snowflake backend), because the Decimal column there is the de-facto replacement for an Integer column.
* treat scale nothing as unspecified
* cast to decimal
* float int biginteger
* conversion failure ints
* loss of decimal precision
* precision loss for mixed column to float
* mixed columns
* loss of precision on inexact float conversion
* cleanup, reuse
* changelog
* review
* no fits bd
* no warning on 0.1 conversion
* fmt
* big_decimal_fetcher
* default fetcher and statement setting
* round-trip d
* fix warning
* expr +10
* double builder retype to bigdecimal
* Use BD fetcher for underspecified postgres numeric column, not inferred builder, and do not use biginteger builder for integral bigdecimal values
* fix tests
* fix test
* cast_op_type
* no-ops for other dialects
* Types
* sum + avg
* avg + sum test
* fix test
* update agg type inference test
* wip
* is_int8, stddev
* more doc, overflow check
* fmt
* finish round-trip test
* wip
* Update existing behaviour to match new
* Add signatures
* Red test
* First test green
* sbt javafmtAll
* In-Memory working
* Not implemented for In-Db
* Docs
* Disable tests for in-db
* Changelog
* Code review changes
* Fix
* Fix
* Fix tests
- Includes the HTTP method in the error message.
- Does not do special handling for the `403` status code - this was wrong and led to an `Unauthorized` error when the real cause was lack of permissions in the Cloud. The errors should be more understandable now.
- Adds `projectSessionId` to audit log metadata.
- Fixes a test (`Secrets_Spec`) that did not have unique names and would fail if cleanup of previous runs failed (or if run in parallel).