Private Beta 1
We are thrilled to launch the first private beta of Timeplus Cloud. It comes with a lot of cool features and unlimited possibilities. We will update the beta version from time to time and list key enhancements on this page.
(in year 2022)
Week of 8/1
Last weekly release in Private Beta 1. Starting from August 8, we are transitioning to Private Beta 2. Customers will be migrated to the new environments in batches. URLs to access beta tenants change from https://TENANT.beta.timeplus.com to https://beta.timeplus.cloud/TENANT
- Streaming engine
- Added 2 geo-related functions: point_in_polygon and geo_distance. See the example after this list.
- Source, sink, API and SDK
- Updated the layout of the "Sources" page to leave more space for the source definitions.
- UI improvements
- Added a visual indicator on the query tab if the query is running.
- Updated the error and page-not-found screens.
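As a quick illustration of the two new geo functions, here is a minimal sketch. The coordinates are arbitrary, and the argument order (a point plus a polygon as an array of points for point_in_polygon, and two longitude/latitude pairs for geo_distance) is an assumption rather than a confirmed signature:

```sql
-- Assumed signatures: point_in_polygon((x, y), [(x1, y1), ...]) and
-- geo_distance(lon1, lat1, lon2, lat2); coordinates are arbitrary examples.
SELECT
  point_in_polygon((-73.99, 40.73),
    [(-74.02, 40.70), (-74.02, 40.75), (-73.95, 40.75), (-73.95, 40.70)]) AS inside_area,
  geo_distance(-73.99, 40.73, -0.12, 51.50) AS nyc_to_london_meters
```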
Week of 7/25
- Streaming engine
- Enhanced the json_extract_array function to return clean string values. `select '{"a":1,"tags":["x","y"]}' as raw, json_extract_array(raw:tags)` now returns `[ "x", "y" ]`, instead of `[ "\"x\"", "\"y\"" ]` as in previous releases.
- Added a new shortcut to access JSON arrays without having to use the json_extract_array function. The above query can be simplified as `select '{"a":1,"tags":["x","y"]}' as raw, raw:tags[*]`
- Refined the typing system: logical comparisons now return `bool` instead of `uint8`.
- Source, sink, API and SDK
- All Timeplus sinks now use the `{{.columnName}}` syntax to access column values in the sink title or body. Numbers and other primitive types are now supported (previously only string columns were supported). See the example template after this list.
- Fixed an issue where canceled queries could be marked as finished.
- Fixed an issue where EPS (events per second) was not shown if the query finished too fast.
- UI improvements
- Added a new option in the 'Send data to..' dialog to send the results to a stream in the Timeplus tenant.
- Show the number of running queries when you create a new query tab.
- Enhanced the font colors.
- Enhanced the chart colors.
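For illustration, a sink title or body template could reference columns like this; the column names below (order_id, amount, is_priority) are hypothetical:

```
New order {{.order_id}}: amount={{.amount}}, priority={{.is_priority}}
```

With this release, numeric and boolean columns such as amount and is_priority can be referenced directly, without casting them to strings first.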
Week of 7/18
- Streaming engine
- Refined the behavior of materialized views to keep it consistent with other Timeplus queries: `SELECT * FROM table(a_materialized_view)` will get all past results, instead of only the most recent one.
- Added the count_if and unique_exact_if functions to count the number of rows, or of unique values, matching certain conditions (see the combined example after this list).
- Added the json_extract_keys function to get the keys of a JSON map object.
- Added the to_bool function to convert other types to `bool`.
- Added is_nan, is_infinite, is_finite functions to detect the edge cases when a number column contains infinite numbers, etc.
- Added to_type_name function to show the data type name, mainly for troubleshooting purposes.
- Source, sink, API and SDK
- Updated the Python SDK to show the metrics
- UI improvements
- Added new visualization types: bar chart and streaming table
- Enhanced the management page of sinks, to show the sink status, number of messages sent, failure count, etc.
- Enhanced the SQL editor to highlight matching opening/closing brackets: (), []. This could be very helpful for complex SQL with nested function calls.
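To illustrate several of the functions added this week, here is a minimal sketch; the stream `sensors` and its columns are hypothetical, and the exact outputs are not guaranteed:

```sql
-- Hypothetical stream and columns, for illustration only
SELECT
  count_if(temperature > 30) AS hot_readings,                   -- rows matching a condition
  unique_exact_if(device_id, temperature > 30) AS hot_devices   -- unique values matching a condition
FROM table(sensors);

-- The new type helpers
SELECT
  to_bool(1) AS b,                           -- convert other types to bool
  is_nan(0.0 / 0.0) AS has_nan,              -- detect NaN values
  is_finite(1.0) AS finite_ok,               -- detect finite numbers
  to_type_name(now()) AS type_of_now,        -- show the data type name
  json_extract_keys('{"a":1,"b":2}') AS keys -- keys of a JSON map object
```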
Week of 7/11
- Streaming engine
- Source, sink, API and SDK
- Updated the Python SDK to support the new source API and add authentication to websockets
- Added the optional description field for sinks
- UI improvements
- Newly-opened query tabs are shown as 'Tab 1', 'Tab 2', etc., instead of 'Untitled'
- Able to delete the first query tab
- Consolidated various streamlit demos to a single URL
- Replaced alert API with sink API. ACTION-REQUIRED: please recreate the alert/sink after the upgrade
Week of 7/4
- Streaming engine
- Enhanced UDF (User-Defined Function) support to allow empty authentication headers
- Source, sink, API and SDK
- Updated the Python SDK to enable the stream retention settings and to create/manage the views
- UI improvements
- For streaming query results, we now show the percentages for categorical fields
- A new visual editor for complex data types is available in the new stream/source/function wizards
- We improved the UI for the event time selector when previewing messages from a Kafka source
- When you create a stream while defining a Kafka source, you can now define the retention policy (how old data will be purged automatically)
Week of 6/27
- Streaming engine
- At this point, session window aggregation only supports streaming queries. The query is now properly rejected if `session(..)` is used together with the `table(..)` function. See the sketch after this list.
- Source, sink and API
- The tenant/workspace-level access token has been removed from the REST API and UI. Please create API keys per user.
- Greatly improved type inference when you create a stream from a Kafka source: more accurate data types for primitive types, plus added array/map support.
- UI improvements
- Now you can register UDFs (User-Defined Functions) in the Workspace menu (only available to workspace admins)
- Greatly improved the data type selector in the stream creation page
- Improved the display of columns of boolean and date types, and better display of null values
- Refined the look & feel for the navigation menu
- Improved the display of error messages
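A minimal sketch of the distinction, assuming a stream named `clicks` and an idle-timeout style `session(stream, timeout)` signature (both are assumptions for illustration):

```sql
-- Streaming session aggregation: supported
SELECT window_start, window_end, count() AS events
FROM session(clicks, 5m)
GROUP BY window_start, window_end;

-- Mixing session(..) with the table(..) function, e.g. something like
--   SELECT * FROM table(session(clicks, 5m))
-- is now properly rejected, since session windows only support streaming queries.
```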
Week of 6/20
- Streaming engine
- Session windows now support millisecond precision for window_start/window_end.
- Added a new lags function to get a range of past values. This can help with pure-SQL-based ML/prediction. See the sketch after this list.
- Added a new grok function to parse a line of text into key/value pairs, without having to write regular expressions.
- Source and sink
- Updated datapm to use a personal API key instead of the API token
- UI improvements
- Refined the 'Create New Stream' dialog. Now you can specify the max age or size for the stream.
- You can click the user icon in the bottom-left corner to open 'Personal Settings'. We will add more settings there. You can create and manage personal API keys on this settings page. The tenant-level access token UI will be removed soon.
- API
- Updated the REST API doc, adding the experimental stream-level retention policies.
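A rough sketch of the two new functions; the stream `trades`, the column `price`, and the exact signatures (lags(column, begin_offset, end_offset) and grok(text, pattern)) are assumptions:

```sql
-- lags: fetch a range of past values as an array (assumed signature)
SELECT price, lags(price, 1, 3) AS last_3_prices
FROM trades;

-- grok: parse a line of text into key/value pairs without writing a regular expression
SELECT grok('Mike is 25 years old', '%{WORD:name} is %{INT:age} years old') AS kv
-- expected to return a map along the lines of {'name':'Mike','age':'25'}
```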
Week of 6/13
- Streaming engine
- Added a new function moving_sum to calculate the moving sum of a column. This unlocks more use cases of stateful streaming processing, such as streaming over. See the sketch after this list.
- Added other functions for array processing, such as array_sum, array_avg
- Source and sink
- Kafka source supports local schema registry without authentication
- UI improvements
- Added validation while creating streams. The stream name should start with a letter, and the rest of the name can only include numbers, letters, or _
- Disable the Next button in the source wizard if the data is not previewed
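A small sketch of the new functions; the stream `orders`, the column `amount`, and the assumption that moving_sum is an aggregate returning the running sums of a column are illustrative only:

```sql
-- Assumed behavior: moving_sum aggregates a column into its running sums
SELECT moving_sum(amount) AS running_totals
FROM table(orders);

-- Array processing helpers added in the same week
SELECT array_sum([1, 2, 3]) AS total, array_avg([1, 2, 3]) AS average
```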
Week of 6/6
- Streaming engine
- More math functions are exposed. This can help you run simple SQL-based ML/prediction models.
- (Experimental) Stream-to-stream join no longer requires a `date_diff_within(..)` condition, although it's still recommended to add timestamp constraints to improve performance. See the sketch after this list.
- (Experimental) You can set a retention policy for each stream, either time-based (say, only keep the most recent 7 days of data) or size-based (say, only keep the most recent 1 GB of data).
- Source and sink
- (Experimental) Support Personal Access Tokens (PAT) in the REST API, which are per-user and long-lived (or can be given an expiration date). The tenant-level access token will be deprecated.
- UI improvements
- After the sources or sinks are created, now you can edit them without having to delete and recreate them.
- Great performance improvement for live tables/charts.
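A minimal sketch of a stream-to-stream join without `date_diff_within(..)`; the streams `orders` and `payments` and their columns are hypothetical:

```sql
-- No date_diff_within(..) required anymore, though a timestamp constraint
-- is still recommended for better performance.
SELECT o.id, o.amount, p.status
FROM orders AS o
INNER JOIN payments AS p ON o.id = p.order_id
```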
Week of 5/30
- Streaming engine
- Source and sink
- Worked with datapm to send live Twitter data to https://demo.timeplus.com
- Updated the REST API doc: the `/exec` endpoint has been removed. Send `POST` requests to `/queries` instead.
- UI improvements
- Able to drag-n-drop to change the order of query tabs
Week of 5/23
- Streaming engine
- (Experimental) new UI and API to create and query external streams. You can query real-time data in Confluent Cloud, Apache Kafka or Redpanda immediately, without loading the data into Timeplus.
- (Experimental) Stream-to-stream join is ready to test for beta customers, e.g. `SELECT .. FROM stream1 INNER JOIN stream2 ON stream1.id=stream2.id AND date_diff_within(10s)`
- New function date_diff_within to determine whether 2 datetime values are within the specified range. This is necessary for stream-to-stream join. You can also use more flexible expressions like `date_diff('second',left.time,right.time) between -3 and 5`
- Source and sink
- Enhanced the datapm Timeplus sink to support loading JSON data from PostgreSQL.
- When you are previewing data from Kafka, you can choose the timezone if the time zone is not included in the raw data.
- UI improvements
- Able to edit the panel titles on dashboards.
- Improved UI consistency.
Week of 5/16
- Streaming engine
- (Experimental) Greatly simplified how to query JSON documents. Now you can use `json_doc:json_path` as a shortcut to extract a value from a JSON document. For example, with `select '{"c":"hello"}' as j, j:c` you will get "hello" as the value. In the past, you had to call `json_extract_string(j,'c')`. Nested structures are supported too: `select '{"a":{"b":1}}' as j, j:a.b` will get `1` as a string value. To convert it to an int, you can use `::` as the shortcut, e.g. `select '{"a":{"b":1}}' as j, j:a.b::int`
- Added a function is_valid_json to check whether the specified string is a valid JSON document, e.g. `select j:a where is_valid_json(j)`
- Added `varchar` as an alias for the `string` data type. This could improve compatibility with other SQL tools.
- Source and sink
- Enhanced the authentication for the Kafka/Redpanda source: added new SASL mechanisms (scram-sha-256 and scram-sha-512), added a config to disable TLS, added a config to skip server verification when TLS is enabled, and fixed a bug where the source would hang if authentication failed.
- Enhanced the REST API to specify the timezone for event timestamps in the raw event.
- UI improvements
- Refresh stream list after creating streams or views.
- Show the source configuration JSON in a better tree view.