Changelog¶
0.65 (2024-10-07)¶
Upgrade for compatibility with Python 3.13 (by vendoring the Pint dependency). (#2434)
Dropped support for Python 3.8.
1.0a16 (2024-09-05)¶
This release focuses on performance, in particular against large tables, and introduces some minor breaking changes for CSS styling in Datasette plugins.
Removed the unit conversions feature and its dependency, Pint. This means Datasette is now compatible with the upcoming Python 3.13. (#2400, #2320)
The datasette --pdb option now uses the ipdb debugger if it is installed. You can install it using datasette install ipdb. Thanks, Tiago Ilieve. (#2342)
Fixed a confusing error that occurred if metadata.json contained nested objects. (#2403)
Fixed a bug with ?_trace=1 where it returned a blank page if the response was larger than 256KB. (#2404)
Tracing mechanism now also displays SQL queries that returned errors or ran out of time. datasette-pretty-traces 0.5 includes support for displaying this new type of trace. (#2405)
Fixed a text spacing issue with table descriptions on the homepage. (#2399)
- Performance improvements for large tables:
Suggested facets now only consider the first 1000 rows. (#2406)
Improved performance of date facet suggestion against large tables. (#2407)
Row counts stop at 10,000 rows when listing tables. (#2398)
On the table page the count also stops at 10,000 rows, with a "count all" button to execute the full count. (#2408)
New .dicts() internal method on Results that returns a list of dictionaries representing the results from a SQL query. (#2414)
rows = (await db.execute("select * from t")).dicts()
Default Datasette core CSS that styles inputs and buttons now requires a class of "core" on the element or a containing element, for example <form class="core">. (#2415)
Similarly, default table styles now only apply to <table class="rows-and-columns">. (#2420)
1.0a15 (2024-08-15)¶
Datasette now defaults to hiding SQLite "shadow" tables, as seen in extensions such as SQLite FTS and sqlite-vec. Virtual tables that it makes sense to display, such as FTS core tables, are no longer hidden. Thanks, Alex Garcia. (#2296)
Fixed bug where running Datasette with one or more -s/--setting options could over-ride settings that were present in datasette.yml. (#2389)
The Datasette homepage is now duplicated at /-/, using the default index.html template. This ensures that the information on that page is still accessible even if the Datasette homepage has been customized using a custom index.html template, for example on sites like datasette.io. (#2393)
Failed CSRF checks now display a more user-friendly error page. (#2390)
Fixed a bug where the json1 extension was not correctly detected on the /-/versions page. Thanks, Seb Bacon. (#2326)
Fixed a bug where the Datasette write API did not correctly accept Content-Type: application/json; charset=utf-8. (#2384)
Fixed a bug where Datasette would fail to start if metadata.yml contained a queries block. (#2386)
1.0a14 (2024-08-05)¶
This alpha introduces significant changes to Datasette's Metadata system, some of which represent breaking changes in advance of the full 1.0 release. The new Upgrade guide document provides detailed coverage of those breaking changes and how they affect plugin authors and Datasette API consumers.
The /databasename?sql= interface and JSON API for executing arbitrary SQL queries can now be found at /databasename/-/query?sql=. Requests with a ?sql= parameter to the old endpoints will be redirected. Thanks, Alex Garcia. (#2360)
Metadata about tables, databases, instances and columns is now stored in Datasette's internal database. Thanks, Alex Garcia. (#2341)
Database write connections now execute using the IMMEDIATE isolation level for SQLite. This should help avoid a rare SQLITE_BUSY error that could occur when a transaction upgraded to a write mid-flight. (#2358)
Fix for a bug where canned queries with named parameters could fail against SQLite 3.46. (#2353)
Datasette now serves E-Tag headers for static files. Thanks, Agustin Bacigalup. (#2306)
Dropdown menus now use a z-index that should avoid them being hidden by plugins. (#2311)
Incorrect table and row names are no longer reflected back on the resulting 404 page. (#2359)
Improved documentation for async usage of the track_event(datasette, event) hook. (#2319)
Fixed some HTTPX deprecation warnings. (#2307)
Datasette now serves a <html lang="en"> attribute. Thanks, Charles Nepote. (#2348)
Datasette's automated tests now run against the maximum and minimum supported versions of SQLite: 3.25 (from September 2018) and 3.46 (from May 2024). Thanks, Alex Garcia. (#2352)
Fixed an issue where clicking twice on the URL output by datasette --root produced a confusing error. (#2375)
0.64.8 (2024-06-21)¶
Security improvement: 404 pages used to reflect content from the URL path, which could be used to display misleading information to Datasette users. 404 errors no longer display additional information from the URL. (#2359)
Backported a better fix for correctly extracting named parameters from canned query SQL against SQLite 3.46.0. (#2353)
0.64.7 (2024-06-12)¶
Fixed a bug where canned queries with named parameters threw an error when run against SQLite 3.46.0. (#2353)
1.0a13 (2024-03-12)¶
Each of the key concepts in Datasette now has an actions menu, which plugins can use to add additional functionality targeting that entity.
Plugin hook: view_actions() for actions that can be applied to a SQL view. (#2297)
Plugin hook: homepage_actions() for actions that apply to the instance homepage. (#2298)
Plugin hook: row_actions() for actions that apply to the row page. (#2299)
Action menu items for all of the *_actions() plugin hooks can now return an optional "description" key, which will be displayed in the menu below the action label. (#2294)
Plugin hooks documentation page is now organized with additional headings. (#2300)
Improved the display of action buttons on pages that also display metadata. (#2286)
The header and footer of the page now uses a subtle gradient effect, and options in the navigation menu are better visually defined. (#2302)
Table names that start with an underscore now default to hidden. (#2104)
pragma_table_list has been added to the allow-list of SQLite pragma functions supported by Datasette. select * from pragma_table_list() is no longer blocked. (#2104)
1.0a12 (2024-02-29)¶
New query_actions() plugin hook, similar to table_actions() and database_actions(). Can be used to add a menu of actions to the canned query or arbitrary SQL query page. (#2283)
New design for the button that opens the query, table and database actions menu. (#2281)
"does not contain" table filter for finding rows that do not contain a string. (#2287)
Fixed a bug in the makeColumnActions(columnDetails) JavaScript plugin mechanism where the column action menu was not fully reset in between each interaction. (#2289)
1.0a11 (2024-02-19)¶
The "replace": true argument to the /db/table/-/insert API now requires the actor to have the update-row permission. (#2279)
Fixed some UI bugs in the interactive permissions debugging tool. (#2278)
The column action menu now aligns better with the cog icon, and positions itself taking into account the width of the browser window. (#2263)
1.0a10 (2024-02-17)¶
The only changes in this alpha correspond to the way Datasette handles database transactions. (#2277)
The database.execute_write_fn() method has a new transaction=True parameter. This defaults to True, which means all functions executed using this method are now automatically wrapped in a transaction - previously the functions needed to roll their own transaction handling, and many did not.
Pass transaction=False to execute_write_fn() if you want to manually handle transactions in your function.
Several internal Datasette features, including parts of the JSON write API, had been failing to wrap their operations in a transaction. This has been fixed by the new transaction=True default.
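A minimal sketch of the new parameter, assuming an existing Database object named db and an illustrative log table:

# Runs inside the automatic transaction that is now the default
def bulk_insert(conn):
    conn.executemany("insert into log (message) values (?)", [("a",), ("b",)])

await db.execute_write_fn(bulk_insert)

# Opt out and manage transactions manually inside the function
def manual(conn):
    with conn:  # explicit transaction scope
        conn.execute("delete from log")

await db.execute_write_fn(manual, transaction=False)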
1.0a9 (2024-02-16)¶
This alpha release adds basic alter table support to the Datasette Write API and fixes a permissions bug relating to the /upsert API endpoint.
Alter table support for create, insert, upsert and update¶
The JSON write API can now be used to apply simple alter table schema changes, provided the acting actor has the new alter-table permission. (#2101)
The only alter operation supported so far is adding new columns to an existing table.
The /db/-/create API now adds new columns during large operations to create a table based on incoming example "rows", in the case where one of the later rows includes columns that were not present in the earlier batches. This requires the create-table but not the alter-table permission.
When /db/-/create is called with rows in a situation where the table may have been already created, an "alter": true key can be included to indicate that any missing columns from the new rows should be added to the table. This requires the alter-table permission.
/db/table/-/insert and /db/table/-/upsert and /db/table/row-pks/-/update all now also accept "alter": true, depending on the alter-table permission.
Operations that alter a table now fire the new alter-table event.
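As a rough, hedged illustration of the new "alter": true option (the database name, table name, rows and token here are assumptions, using the documented datasette.client internal request helper):

# Create the table if needed, and add any columns that new rows introduce
response = await datasette.client.post(
    "/data/-/create",
    json={
        "table": "docs",
        "pk": "id",
        "rows": [{"id": 1, "title": "One", "score": 0.5}],
        "alter": True,  # requires the alter-table permission
    },
    headers={"Authorization": "Bearer {}".format(token)},  # token: a signed API token (assumption)
)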
Permissions fix for the upsert API¶
The /database/table/-/upsert API had a minor permissions bug, only affecting Datasette instances that had configured the insert-row and update-row permissions to apply to a specific table rather than the database or instance as a whole. Full details in issue #2262.
To avoid similar mistakes in the future the datasette.permission_allowed() method now specifies default= as a keyword-only argument.
Permission checks now consider opinions from every plugin¶
The datasette.permission_allowed() method previously consulted every plugin that implemented the permission_allowed() plugin hook and obeyed the opinion of the last plugin to return a value. (#2275)
Datasette now consults every plugin and checks to see if any of them returned False (the veto rule), and if none of them did, it then checks to see if any of them returned True.
This is explained at length in the new documentation covering How permissions are resolved.
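A simplified sketch of that resolution rule (not Datasette's actual implementation), where each opinion is the value returned by one permission_allowed() hook implementation:

def resolve(opinions, default=False):
    # opinions: True, False or None (no opinion) from each plugin
    opinions = [o for o in opinions if o is not None]
    if any(o is False for o in opinions):
        return False  # a single veto wins
    if any(o is True for o in opinions):
        return True
    return default

resolve([None, True, False])  # False - the veto rule
resolve([None, True, None])   # True
resolve([None, None])         # False - falls back to the default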
Other changes¶
The new DATASETTE_TRACE_PLUGINS=1 environment variable turns on detailed trace output for every executed plugin hook, useful for debugging and understanding how the plugin system works at a low level. (#2274)
Datasette on Python 3.9 or above marks its non-cryptographic uses of the MD5 hash function as usedforsecurity=False, for compatibility with FIPS systems. (#2270)
SQL relating to Datasette's internal database now executes inside a transaction, avoiding a potential database locked error. (#2273)
The /-/threads debug page now identifies the database in the name associated with each dedicated write thread. (#2265)
The /db/-/create API now fires an insert-rows event if rows were inserted after the table was created. (#2260)
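The MD5 change above amounts to the following standard-library pattern (available on Python 3.9 and later):

import hashlib

# Non-cryptographic use, such as building a cache key, marked so FIPS builds allow it
digest = hashlib.md5(b"some cache key", usedforsecurity=False).hexdigest()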
1.0a8 (2024-02-07)¶
This alpha release continues the migration of Datasette's configuration from metadata.yaml to the new datasette.yaml configuration file, introduces a new system for JavaScript plugins and adds several new plugin hooks.
See Datasette 1.0a8: JavaScript plugins, new plugin hooks and plugin configuration in datasette.yaml for an annotated version of these release notes.
Configuration¶
Plugin configuration now lives in the datasette.yaml configuration file, passed to Datasette using the -c/--config option. Thanks, Alex Garcia. (#2093)
datasette -c datasette.yaml
Where datasette.yaml contains configuration that looks like this:
plugins:
  datasette-cluster-map:
    latitude_column: xlat
    longitude_column: xlon
Previously plugins were configured in metadata.yaml, which was confusing as plugin settings were unrelated to database and table metadata.
The -s/--setting option can now be used to set plugin configuration as well. See Configuration via the command-line for details. (#2252)
The above YAML configuration example using -s/--setting looks like this:
datasette mydatabase.db \
  -s plugins.datasette-cluster-map.latitude_column xlat \
  -s plugins.datasette-cluster-map.longitude_column xlon
The new /-/config page shows the current instance configuration, after redacting keys that could contain sensitive data such as API keys or passwords. (#2254)
Existing Datasette installations may already have configuration set in metadata.yaml that should be migrated to datasette.yaml. To avoid breaking these installations, Datasette will silently treat table configuration, plugin configuration and allow blocks in metadata as if they had been specified in configuration instead. (#2247) (#2248) (#2249)
Note that the datasette publish command has not yet been updated to accept a datasette.yaml configuration file. This will be addressed in #2195 but for the moment you can include those settings in metadata.yaml instead.
JavaScript plugins¶
Datasette now includes a JavaScript plugins mechanism, allowing JavaScript to customize Datasette in a way that can collaborate with other plugins.
This provides two initial hooks, with more to come in the future:
makeAboveTablePanelConfigs() can add additional panels to the top of the table page.
makeColumnActions() can add additional actions to the column menu.
Thanks Cameron Yick for contributing this feature. (#2052)
Plugin hooks¶
New jinja2_environment_from_request(datasette, request, env) plugin hook, which can be used to customize the current Jinja environment based on the incoming request. This can be used to modify the template lookup path based on the incoming request hostname, among other things. (#2225)
New family of template slot plugin hooks: top_homepage, top_database, top_table, top_row, top_query, top_canned_query. Plugins can use these to provide additional HTML to be injected at the top of the corresponding pages. (#1191)
- New track_event() mechanism for plugins to emit and receive events when certain events occur within Datasette. (#2240)
Plugins can register additional event classes using register_events(datasette).
They can then trigger those events with the datasette.track_event(event) internal method.
Plugins can subscribe to notifications of events using the track_event(datasette, event) plugin hook.
Datasette core now emits login, logout, create-token, create-table, drop-table, insert-rows, upsert-rows, update-row, delete-row events, documented here.
New internal function for plugin authors: await db.execute_isolated_fn(fn), for creating a new SQLite connection, executing code and then closing that connection, all while preventing other code from writing to that particular database. This connection will not have the prepare_connection() plugin hook executed against it, allowing plugins to perform actions that might otherwise be blocked by existing connection configuration. (#2218)
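A minimal sketch of execute_isolated_fn(), assuming a Database object named db; the PRAGMA is just an example of connection-level work a plugin might want to run in isolation:

def enable_wal(conn):
    # Runs against a fresh connection that is closed afterwards
    conn.execute("PRAGMA journal_mode=wal;")

await db.execute_isolated_fn(enable_wal)

And a hedged sketch of a plugin subscribing to the events described above using the track_event(datasette, event) hook; the event attribute accessed here is an assumption for illustration:

from datasette import hookimpl

@hookimpl
def track_event(datasette, event):
    # event.name is assumed to identify the event type, e.g. "insert-rows"
    print("event received:", event.name)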
Documentation¶
Documentation describing how to write tests that use signed actor cookies using datasette.client.actor_cookie(). (#1830) (A test sketch appears below this list.)
Documentation on how to register a plugin for the duration of a test. (#2234)
The configuration documentation now shows examples of both YAML and JSON for each setting.
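A hedged sketch of the signed actor cookie testing pattern mentioned above, assuming a ds fixture providing a Datasette instance, pytest-asyncio, and the default ds_actor cookie name:

import pytest

@pytest.mark.asyncio
async def test_page_visible_to_root(ds):
    # Sign a cookie for the actor {"id": "root"} and send it with the request
    cookies = {"ds_actor": ds.client.actor_cookie({"id": "root"})}
    response = await ds.client.get("/-/actor.json", cookies=cookies)
    assert response.json()["actor"]["id"] == "root"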
Minor fixes¶
Datasette no longer attempts to run SQL queries in parallel when rendering a table page, as this was leading to some rare crashing bugs. (#2189)
Fixed warning: DeprecationWarning: pkg_resources is deprecated as an API (#2057)
Fixed bug where ?_extra=columns parameter returned an incorrectly shaped response. (#2230)
0.64.6 (2023-12-22)¶
Fixed a bug where CSV export with expanded labels could fail if a foreign key reference did not correctly resolve. (#2214)
0.64.5 (2023-10-08)¶
Dropped dependency on click-default-group-wheel, which could cause a dependency conflict. (#2197)
1.0a7 (2023-09-21)¶
Fix for a crashing bug caused by viewing the table page for a named in-memory database. (#2189)
0.64.4 (2023-09-21)¶
Fix for a crashing bug caused by viewing the table page for a named in-memory database. (#2189)
1.0a6 (2023-09-07)¶
New plugin hook: actors_from_ids(datasette, actor_ids) and an internal method to accompany it, await .actors_from_ids(actor_ids). This mechanism is intended to be used by plugins that may need to display the actor who was responsible for something managed by that plugin: they can now resolve the recorded IDs of actors into the full actor objects. (#2181)
DATASETTE_LOAD_PLUGINS environment variable for controlling which plugins are loaded by Datasette. (#2164)
Datasette now checks if the user has permission to view a table linked to by a foreign key before turning that foreign key into a clickable link. (#2178)
The execute-sql permission now implies that the actor can also view the database and instance. (#2169)
Documentation describing a pattern for building plugins that themselves define further hooks for other plugins. (#1765)
Datasette is now tested against the Python 3.12 preview. (#2175)
1.0a5 (2023-08-29)¶
When restrictions are applied to API tokens, those restrictions now behave slightly differently: applying the view-table restriction will imply the ability to view-database for the database containing that table, and both view-table and view-database will imply view-instance. Previously you needed to create a token with restrictions that explicitly listed view-instance and view-database and view-table in order to view a table without getting a permission denied error. (#2102)
New datasette.yaml (or .json) configuration file, which can be specified using datasette -c path-to-file. The goal here is to consolidate settings, plugin configuration, permissions, canned queries, and other Datasette configuration into a single file, separate from metadata.yaml. The legacy settings.json config file used for Configuration directory mode has been removed, and datasette.yaml has a "settings" section where the same settings key/value pairs can be included. In a future alpha release, more configuration such as plugins/permissions/canned queries will be moved to the datasette.yaml file. See #2093 for more details. Thanks, Alex Garcia.
The -s/--setting option can now take dotted paths to nested settings. These will then be used to set or over-ride the same options as are present in the new configuration file. (#2156)
New --actor '{"id": "json-goes-here"}' option for use with datasette --get to treat the simulated request as being made by a specific actor, see datasette --get. (#2153)
The Datasette _internal database has had some changes. It no longer shows up in the datasette.databases list by default, and is now instead available to plugins using the datasette.get_internal_database() method. Plugins are invited to use this as a private database to store configuration and settings and secrets that should not be made visible through the default Datasette interface. Users can pass the new --internal internal.db option to persist that internal database to disk. Thanks, Alex Garcia. (#2157)
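A hedged sketch of the pattern this enables for plugins; the table name and values are illustrative:

internal_db = datasette.get_internal_database()
await internal_db.execute_write(
    "create table if not exists my_plugin_state (key text primary key, value text)"
)
await internal_db.execute_write(
    "insert or replace into my_plugin_state (key, value) values (?, ?)",
    ["last_run", "2023-08-29"],
)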
1.0a4 (2023-08-21)¶
This alpha fixes a security issue with the /-/api API explorer. On authenticated Datasette instances (instances protected using plugins such as datasette-auth-passwords) the API explorer interface could reveal the names of databases and tables within the protected instance. The data stored in those tables was not revealed.
For more information and workarounds, read the security advisory. The issue has been present in every previous alpha version of Datasette 1.0: versions 1.0a0, 1.0a1, 1.0a2 and 1.0a3.
Also in this alpha:
The new datasette plugins --requirements option outputs a list of currently installed plugins in Python requirements.txt format, useful for duplicating that installation elsewhere. (#2133)
Writable canned queries can now define an on_success_message_sql field in their configuration, containing a SQL query that should be executed upon successful completion of the write operation in order to generate a message to be shown to the user. (#2138)
The automatically generated border color for a database is now shown in more places around the application. (#2119)
Every instance of example shell script code in the documentation should now include a working copy button, free from additional syntax. (#2140)
1.0a3 (2023-08-09)¶
This alpha release previews the updated design for Datasette's default JSON API. (#782)
The new default JSON representation for both table pages (/dbname/table.json) and arbitrary SQL queries (/dbname.json?sql=...) is now shaped like this:
{
"ok": true,
"rows": [
{
"id": 3,
"name": "Detroit"
},
{
"id": 2,
"name": "Los Angeles"
},
{
"id": 4,
"name": "Memnonia"
},
{
"id": 1,
"name": "San Francisco"
}
],
"truncated": false
}
Tables will include an additional "next" key for pagination, which can be passed to ?_next= to fetch the next page of results.
The various ?_shape= options continue to work as before - see Different shapes for details.
A new ?_extra= mechanism is available for tables, but has not yet been stabilized or documented. Details on that are available in #262.
Smaller changes¶
Datasette documentation now shows YAML examples for Metadata by default, with a tab interface for switching to JSON. (#1153)
register_output_renderer(datasette) plugins now have access to error and truncated arguments, allowing them to display error messages and take into account truncated results. (#2130)
render_cell() plugin hook now also supports an optional request argument. (#2007)
New Justfile to support development workflows for Datasette using Just.
datasette.render_template() can now accept a datasette.views.Context subclass as an alternative to a dictionary. (#2127)
datasette install -e path option for editable installations, useful while developing plugins. (#2106)
When started with the --cors option Datasette now serves an Access-Control-Max-Age: 3600 header, ensuring CORS OPTIONS requests are repeated no more than once an hour. (#2079)
Fixed a bug where the _internal database could display None instead of null for in-memory databases. (#1970)
0.64.2 (2023-03-08)¶
Fixed a bug with datasette publish cloudrun where deploys all used the same Docker image tag. This was mostly inconsequential as the service is deployed as soon as the image has been pushed to the registry, but could result in the incorrect image being deployed if two different deploys for two separate services ran at exactly the same time. (#2036)
0.64.1 (2023-01-11)¶
0.64 (2023-01-09)¶
Datasette now strongly recommends against allowing arbitrary SQL queries if you are using SpatiaLite. SpatiaLite includes SQL functions that could cause the Datasette server to crash. See SpatiaLite for more details.
New default_allow_sql setting, providing an easier way to disable all arbitrary SQL execution by end users: datasette --setting default_allow_sql off. See also Controlling the ability to execute arbitrary SQL. (#1409)
Building a location to time zone API with SpatiaLite is a new Datasette tutorial showing how to safely use SpatiaLite to create a location to time zone API.
New documentation about how to debug problems loading SQLite extensions. The error message shown when an extension cannot be loaded has also been improved. (#1979)
Fixed an accessibility issue: the <select> elements in the table filter form now show an outline when they are currently focused. (#1771)
0.63.3 (2022-12-17)¶
Fixed a bug where datasette --root, when running in Docker, would only output the URL to sign in as root when the server shut down, not when it started up. (#1958)
You no longer need to ensure await datasette.invoke_startup() has been called in order for Datasette to start correctly serving requests - this is now handled automatically the first time the server receives a request. This fixes a bug experienced when Datasette is served directly by an ASGI application server such as Uvicorn or Gunicorn. It also fixes a bug with the datasette-gunicorn plugin. (#1955)
1.0a2 (2022-12-14)¶
The third Datasette 1.0 alpha release adds upsert support to the JSON API, plus the ability to specify finely grained permissions when creating an API token.
See Datasette 1.0a2: Upserts and finely grained permissions for an extended, annotated version of these release notes.
New /db/table/-/upsert API, documented here. Upsert is an update-or-insert: existing rows will have specified keys updated, but if no row matches the incoming primary key a brand new row will be inserted instead. (#1878) (An example appears below this list.)
New register_permissions(datasette) plugin hook. Plugins can now register named permissions, which will then be listed in various interfaces that show available permissions. (#1940)
The /db/-/create API for creating a table now accepts "ignore": true and "replace": true options when called with the "rows" property that creates a new table based on an example set of rows. This means the API can be called multiple times with different rows, setting rules for what should happen if a primary key collides with an existing row. (#1927)
Arbitrary permissions can now be configured at the instance, database and resource (table, SQL view or canned query) level in Datasette's Metadata JSON and YAML files. The new "permissions" key can be used to specify which actors should have which permissions. See Other permissions in datasette.yaml for details. (#1636)
The /-/create-token page can now be used to create API tokens which are restricted to just a subset of actions, including against specific databases or resources. See API Tokens for details. (#1947)
Likewise, the datasette create-token CLI command can now create tokens with a subset of permissions. (#1855)
New datasette.create_token() API method for programmatically creating signed API tokens. (#1951)
The /db/-/create API now requires the actor to have insert-row permission in order to use the "row" or "rows" properties. (#1937)
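A hedged example of the new upsert endpoint referenced above; the database, table, rows and token are assumptions, using the documented datasette.client internal request helper:

# Update-or-insert rows keyed on the table's primary key
response = await datasette.client.post(
    "/data/docs/-/upsert",
    json={"rows": [{"id": 1, "title": "New title"}, {"id": 2, "title": "Another"}]},
    headers={"Authorization": "Bearer {}".format(api_token)},
)
assert response.status_code == 200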
1.0a1 (2022-12-01)¶
Write APIs now serve correct CORS headers if Datasette is started in --cors mode. See the full list of CORS headers in the documentation. (#1922)
Fixed a bug where the _memory database could be written to even though writes were not persisted. (#1917)
The https://latest.datasette.io/ demo instance now includes an ephemeral database which can be used to test Datasette's write APIs, using the new datasette-ephemeral-tables plugin to drop any created tables after five minutes. This database is only available if you sign in as the root user using the link on the homepage. (#1915)
Fixed a bug where hitting the write endpoints with a GET request returned a 500 error. It now returns a 405 (method not allowed) error instead. (#1916)
The list of endpoints in the API explorer now lists mutable databases first. (#1918)
The "ignore": true and "replace": true options for the insert API are now documented. (#1924)
1.0a0 (2022-11-29)¶
This first alpha release of Datasette 1.0 introduces a brand new collection of APIs for writing to the database (#1850), as well as a new API token mechanism baked into Datasette core. Previously, API tokens have only been supported by installing additional plugins.
This is very much a preview: expect many more backwards incompatible API changes prior to the full 1.0 release.
Feedback enthusiastically welcomed, either through issue comments or via the Datasette Discord community.
Signed API tokens¶
New /-/create-token page allowing authenticated users to create signed API tokens that can act on their behalf, see API Tokens. (#1852)
New datasette create-token command for creating tokens from the command line: datasette create-token.
New allow_signed_tokens setting which can be used to turn off signed token support. (#1856)
New max_signed_tokens_ttl setting for restricting the maximum allowed duration of a signed token. (#1858)
Write API¶
New API explorer at /-/api for trying out the API. (#1871)
/db/-/create API for Creating a table. (#1882)
/db/table/-/insert API for Inserting rows. (#1851)
/db/table/-/drop API for Dropping tables. (#1874)
/db/table/pk/-/update API for Updating a row. (#1863)
/db/table/pk/-/delete API for Deleting a row. (#1864)
0.63.2 (2022-11-18)¶
Fixed a bug in datasette publish heroku where deployments failed due to an older version of Python being requested. (#1905)
New datasette publish heroku --generate-dir <dir> option for generating a Heroku deployment directory without deploying it.
0.63.1 (2022-11-10)¶
Fixed a bug where Datasette's table filter form would not redirect correctly when run behind a proxy using the base_url setting. (#1883)
SQL query is now shown wrapped in a <textarea> if a query exceeds a time limit. (#1876)
Fixed an intermittent "Too many open files" error while running the test suite. (#1843)
New db.close() internal method.
0.63 (2022-10-27)¶
See Datasette 0.63: The annotated release notes for more background on the changes in this release.
Features¶
Now tested against Python 3.11. Docker containers used by datasette publish and datasette package both now use that version of Python. (#1853)
--load-extension option now supports entrypoints. Thanks, Alex Garcia. (#1789)
Facet size can now be set per-table with the new facet_size table metadata option. (#1804)
The truncate_cells_html setting now also affects long URLs in columns. (#1805)
The non-JavaScript SQL editor textarea now increases height to fit the SQL query. (#1786)
Facets are now displayed with better line-breaks in long values. Thanks, Daniel Rech. (#1794)
The settings.json file used in Configuration directory mode is now validated on startup. (#1816)
SQL queries can now include leading SQL comments, using /* ... */ or -- ... syntax. Thanks, Charles Nepote. (#1860)
SQL query is now re-displayed when terminated with a time limit error. (#1819)
The inspect data mechanism is now used to speed up server startup - thanks, Forest Gregg. (#1834)
In Configuration directory mode databases with filenames ending in .sqlite or .sqlite3 are now automatically added to the Datasette instance. (#1646)
Breadcrumb navigation display now respects the current user's permissions. (#1831)
Plugin hooks and internals¶
The prepare_jinja2_environment(env, datasette) plugin hook now accepts an optional datasette argument. Hook implementations can also now return an async function which will be awaited automatically. (#1809) (A sketch appears below this list.)
Database(is_mutable=) now defaults to True. (#1808)
The datasette.check_visibility() method now accepts an optional permissions= list, allowing it to take multiple permissions into account at once when deciding if something should be shown as public or private. This has been used to correctly display padlock icons in more places in the Datasette interface. (#1829)
Datasette no longer enforces upper bounds on its dependencies. (#1800)
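A hedged sketch of the updated prepare_jinja2_environment() hook mentioned at the start of this list, showing both the new datasette argument and an async implementation; the template global it adds is purely illustrative:

from datasette import hookimpl

@hookimpl
def prepare_jinja2_environment(env, datasette):
    async def inner():
        # Look something up asynchronously, then expose it to every template
        db = datasette.get_database()
        result = await db.execute("select count(*) from sqlite_master")
        env.globals["schema_object_count"] = result.first()[0]
    return inner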
Documentation¶
New tutorial: Cleaning data with sqlite-utils and Datasette.
Screenshots in the documentation are now maintained using shot-scraper, as described in Automating screenshots for the Datasette documentation using shot-scraper. (#1844)
More detailed command descriptions on the CLI reference page. (#1787)
New documentation on Running Datasette using OpenRC - thanks, Adam Simpson. (#1825)
0.62 (2022-08-14)¶
Datasette can now run entirely in your browser using WebAssembly. Try out Datasette Lite, take a look at the code or read more about it in Datasette Lite: a server-side Python web application running in a browser.
Datasette now has a Discord community for questions and discussions about Datasette and its ecosystem of projects.
Features¶
Datasette is now compatible with Pyodide. This is the enabling technology behind Datasette Lite. (#1733)
Database file downloads now implement conditional GET using ETags. (#1739)
HTML for facet results and suggested results has been extracted out into new templates _facet_results.html and _suggested_facets.html. Thanks, M. Nasimul Haque. (#1759)
Datasette now runs some SQL queries in parallel. This has limited impact on performance, see this research issue for details.
New --nolock option for ignoring file locks when opening read-only databases. (#1744)
Spaces in the database names in URLs are now encoded as + rather than ~20. (#1701)
<Binary: 2427344 bytes> is now displayed as <Binary: 2,427,344 bytes> and is accompanied by a tooltip showing "2.3MB". (#1712)
The base Docker image used by datasette publish cloudrun, datasette package and the official Datasette image has been upgraded to 3.10.6-slim-bullseye. (#1768)
Canned writable queries against immutable databases now show a warning message. (#1728)
datasette publish cloudrun has a new --timeout option which can be used to increase the time limit applied by the Google Cloud build environment. Thanks, Tim Sherratt. (#1717)
datasette publish cloudrun has new --min-instances and --max-instances options. (#1779)
Plugin hooks¶
New plugin hook: handle_exception(), for custom handling of exceptions caught by Datasette. (#1770)
The render_cell() plugin hook is now also passed a row argument, representing the sqlite3.Row object that is being rendered. (#1300)
The configuration directory is now stored in datasette.config_dir, making it available to plugins. Thanks, Chris Amico. (#1766)
Bug fixes¶
Don't show the facet option in the cog menu if faceting is not allowed. (#1683)
?_sort and ?_sort_desc now work if the column that is being sorted has been excluded from the query using ?_col= or ?_nocol=. (#1773)
Fixed bug where ?_sort_desc was duplicated in the URL every time the Apply button was clicked. (#1738)
Documentation¶
Examples in the documentation now include a copy-to-clipboard button. (#1748)
Code examples in the documentation are now all formatted using Black. (#1718)
Request.fake() method is now documented, see Request object.
New documentation for plugin authors: Registering a plugin for the duration of a test. (#903)
0.61.1 (2022-03-23)¶
Fixed a bug where databases with a different route from their name (as used by the datasette-hashed-urls plugin) returned errors when executing custom SQL queries. (#1682)
0.61 (2022-03-23)¶
In preparation for Datasette 1.0, this release includes two potentially backwards-incompatible changes. Hashed URL mode has been moved to a separate plugin, and the way Datasette generates URLs to databases and tables with special characters in their name such as / and . has changed.
Datasette also now requires Python 3.7 or higher.
URLs within Datasette now use a different encoding scheme for tables or databases that include "special" characters outside of the range of a-zA-Z0-9_-. This scheme is explained here: Tilde encoding. (#1657)
Removed hashed URL mode from Datasette. The new datasette-hashed-urls plugin can be used to achieve the same result, see datasette-hashed-urls for details. (#1661)
Databases can now have a custom path within the Datasette instance that is independent of the database name, using the db.route property. (#1668)
Datasette is now covered by a Code of Conduct. (#1654)
Python 3.6 is no longer supported. (#1577)
Tests now run against Python 3.11-dev. (#1621)
New datasette.ensure_permissions(actor, permissions) internal method for checking multiple permissions at once. (#1675)
New datasette.check_visibility(actor, action, resource=None) internal method for checking if a user can see a resource that would otherwise be invisible to unauthenticated users. (#1678)
Table and row HTML pages now include a <link rel="alternate" type="application/json+datasette" href="..."> element and return a Link: URL; rel="alternate"; type="application/json+datasette" HTTP header pointing to the JSON version of those pages. (#1533)
Access-Control-Expose-Headers: Link is now added to the CORS headers, allowing remote JavaScript to access that header.
Canned queries are now shown at the top of the database page, directly below the SQL editor. Previously they were shown at the bottom, below the list of tables. (#1612)
Datasette now has a default favicon. (#1603)
sqlite_stat tables are now hidden by default. (#1587)
SpatiaLite tables data_licenses, KNN and KNN2 are now hidden by default. (#1601)
SQL query tracing mechanism now works for queries executed in asyncio sub-tasks, such as those created by asyncio.gather(). (#1576)
datasette.tracer mechanism is now documented.
Common Datasette symbols can now be imported directly from the top-level datasette package, see Import shortcuts. Those symbols are Response, Forbidden, NotFound, hookimpl, actor_matches_allow. (#957)
/-/versions page now returns additional details for libraries used by SpatiaLite. (#1607)
Documentation now links to the Datasette Tutorials.
Datasette will now also look for SpatiaLite in /opt/homebrew - thanks, Dan Peterson. (#1649)
Fixed bug where custom pages did not work on Windows. Thanks, Robert Christie. (#1545)
Fixed error caused when a table had a column named n. (#1228)
0.60.2 (2022-02-07)¶
Fixed a bug where Datasette would open the same file twice with two different database names if you ran datasette file.db file.db. (#1632)
0.60.1 (2022-01-20)¶
Fixed a bug where installation on Python 3.6 stopped working due to a change to an underlying dependency. This release can now be installed on Python 3.6, but is the last release of Datasette that will support anything less than Python 3.7. (#1609)
0.60 (2022-01-13)¶
Plugins and internals¶
New plugin hook: filters_from_request(request, database, table, datasette), which runs on the table page and can be used to support new custom query string parameters that modify the SQL query. (#473)
Added two additional methods for writing to the database: await db.execute_write_script(sql, block=True) and await db.execute_write_many(sql, params_seq, block=True). (#1570)
The db.execute_write() internal method now defaults to blocking until the write operation has completed. Previously it defaulted to queuing the write and then continuing to run code while the write was in the queue. (#1579)
Database write connections now execute the prepare_connection(conn, database, datasette) plugin hook. (#1564)
The Datasette() constructor no longer requires the files= argument, and is now documented at Datasette class. (#1563)
The tracing feature now traces write queries, not just read queries. (#1568)
The query string variables exposed by request.args will now include blank strings for arguments such as foo in ?foo=&bar=1 rather than ignoring those parameters entirely. (#1551)
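A hedged sketch of the two new write methods mentioned above (execute_write_script() and execute_write_many()), assuming a Database object named db and an illustrative log table:

# Run several statements in a single call
await db.execute_write_script(
    """
    create table if not exists log (message text);
    create index if not exists idx_log_message on log(message);
    """
)

# Run the same statement once per sequence of parameters
await db.execute_write_many(
    "insert into log (message) values (?)",
    [("one",), ("two",), ("three",)],
)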
Faceting¶
The number of unique values in a facet is now always displayed. Previously it was only displayed if the user specified ?_facet_size=max. (#1556)
Facets of type date or array can now be configured in metadata.json, see Facets in metadata. Thanks, David Larlet. (#1552)
New ?_nosuggest=1 parameter for table views, which disables facet suggestion. (#1557)
Fixed bug where ?_facet_array=tags&_facet=tags would only display one of the two selected facets. (#625)
Other small fixes¶
Made several performance improvements to the database schema introspection code that runs when Datasette first starts up. (#1555)
Label columns detected for foreign keys are now case-insensitive, so Name or TITLE will be detected in the same way as name or title. (#1544)
Upgraded Pluggy dependency to 1.0. (#1575)
Now using Plausible analytics for the Datasette documentation.
explain query plan is now allowed with varying amounts of whitespace in the query. (#1588)
New CLI reference page showing the output of --help for each of the datasette sub-commands. This led to several small improvements to the help copy. (#1594)
Fixed bug where writable canned queries could not be used with custom templates. (#1547)
Improved fix for a bug where columns with an underscore prefix could result in unnecessary hidden form fields. (#1527)
0.59.4 (2021-11-29)¶
Fixed bug where columns with a leading underscore could not be removed from the interactive filters list. (#1527)
Fixed bug where columns with a leading underscore were not correctly linked to by the "Links from other tables" interface on the row page. (#1525)
Upgraded dependencies aiofiles, black and janus.
0.59.3 (2021-11-20)¶
Fixed numerous bugs when running Datasette behind a proxy with a prefix URL path using the base_url setting. A live demo of this mode is now available at datasette-apache-proxy-demo.datasette.io/prefix/. (#1519, #838)
?column__arraycontains= and ?column__arraynotcontains= table parameters now also work against SQL views. (#448)
?_facet_array=column no longer returns incorrect counts if columns contain the same value more than once.
0.59.2 (2021-11-13)¶
Column names with a leading underscore now work correctly when used as a facet. (#1506)
Applying ?_nocol= to a column no longer removes that column from the filtering interface. (#1503)
Official Datasette Docker container now uses Debian Bullseye as the base image. (#1497)
Datasette is four years old today! Here's the original release announcement from 2017.
0.59.1 (2021-10-24)¶
Fix compatibility with Python 3.10. (#1482)
Documentation on how to use Named parameters with integer and floating point values. (#1496)
0.59 (2021-10-14)¶
Columns can now have associated metadata descriptions in metadata.json, see Column descriptions. (#942)
New register_commands() plugin hook allows plugins to register additional Datasette CLI commands, e.g. datasette mycommand file.db. (#1449)
Adding ?_facet_size=max to a table page now shows the number of unique values in each facet. (#1423)
Upgraded dependency httpx 0.20 - the undocumented allow_redirects= parameter to datasette.client is now follow_redirects=, and defaults to False where it previously defaulted to True. (#1488)
The --cors option now causes Datasette to return the Access-Control-Allow-Headers: Authorization header, in addition to Access-Control-Allow-Origin: *. (#1467)
Code that figures out which named parameters a SQL query takes in order to display form fields for them is no longer confused by strings that contain colon characters. (#1421)
Renamed --help-config option to --help-settings. (#1431)
datasette.databases property is now a documented API. (#1443)
The base.html template now wraps everything other than the <footer> in a <div class="not-footer"> element, to help with advanced CSS customization. (#1446)
The render_cell() plugin hook can now return an awaitable function. This means the hook can execute SQL queries. (#1425)
register_routes(datasette) plugin hook now accepts an optional datasette argument. (#1404)
New hide_sql canned query option for defaulting to hiding the SQL query used by a canned query, see Additional canned query options. (#1422)
New --cpu option for datasette publish cloudrun. (#1420)
If Rich is installed in the same virtual environment as Datasette, it will be used to provide enhanced display of error tracebacks on the console. (#1416)
datasette.utils parse_metadata(content) function, used by the new datasette-remote-metadata plugin, is now a documented API. (#1405)
Fixed bug where ?_next=x&_sort=rowid could throw an error. (#1470)
Column cog menu no longer shows the option to facet by a column that is already selected by the default facets in metadata. (#1469)
0.58.1 (2021-07-16)¶
Fix for an intermittent race condition caused by the refresh_schemas() internal function. (#1231)
0.58 (2021-07-14)¶
New datasette --uds /tmp/datasette.sock option for binding Datasette to a Unix domain socket, see proxy documentation. (#1388)
"searchmode": "raw" table metadata option for defaulting a table to executing SQLite full-text search syntax without first escaping it, see Advanced SQLite search queries. (#1389)
New plugin hook: get_metadata(), for returning custom metadata for an instance, database or table. Thanks, Brandon Roberts! (#1384)
New plugin hook: skip_csrf(datasette, scope), for opting out of CSRF protection based on the incoming request. (#1377)
The menu_links(), table_actions() and database_actions() plugin hooks all gained a new optional request argument providing access to the current request. (#1371)
Major performance improvement for Datasette faceting. (#1394)
Improved documentation for Running Datasette behind a proxy to recommend using ProxyPreserveHost On with Apache. (#1387)
POST requests to endpoints that do not support that HTTP verb now return a 405 error.
db.path can now be provided as a pathlib.Path object, useful when writing unit tests for plugins. Thanks, Chris Amico. (#1365)
0.57.1 (2021-06-08)¶
0.57 (2021-06-05)¶
Warning
This release fixes a reflected cross-site scripting security hole with the ?_trace=1 feature. You should upgrade to this version, or to Datasette 0.56.1, as soon as possible. (#1360)
In addition to the security fix, this release includes ?_col= and ?_nocol= options for controlling which columns are displayed for a table, ?_facet_size= for increasing the number of facet results returned, re-display of your SQL query should an error occur, and numerous bug fixes.
New features¶
If an error occurs while executing a user-provided SQL query, that query is now re-displayed in an editable form along with the error message. (#619)
New ?_col= and ?_nocol= parameters to show and hide columns in a table, plus an interface for hiding and showing columns in the column cog menu. (#615)
A new ?_facet_size= parameter for customizing the number of facet results returned on a table or view page. (#1332)
?_facet_size=max sets that to the maximum, which defaults to 1,000 and is controlled by the max_returned_rows setting. If facet results are truncated the … at the bottom of the facet list now links to this parameter. (#1337)
?_nofacet=1 option to disable all facet calculations on a page, used as a performance optimization for CSV exports and ?_shape=array/object. (#1349, #263)
?_nocount=1 option to disable full query result counts. (#1353)
?_trace=1 debugging option is now controlled by the new trace_debug setting, which is turned off by default. (#1359)
Bug fixes and other improvements¶
Custom pages now work correctly when combined with the base_url setting. (#1238)
Fixed intermittent error displaying the index page when the user did not have permission to access one of the tables. Thanks, Guy Freeman. (#1305)
Columns with the name "Link" are no longer incorrectly displayed in bold. (#1308)
Fixed error caused by tables with a single quote in their names. (#1257)
Updated dependencies: pytest-asyncio, Black, jinja2, aiofiles, click, and itsdangerous.
The official Datasette Docker image now supports apt-get install. (#1320)
The Heroku runtime used by datasette publish heroku is now python-3.8.10.
0.56.1 (2021-06-05)¶
Warning
This release fixes a reflected cross-site scripting security hole with the ?_trace=1 feature. You should upgrade to this version, or to Datasette 0.57, as soon as possible. (#1360)
0.56 (2021-03-28)¶
Documentation improvements, bug fixes and support for SpatiaLite 5.
The SQL editor can now be resized by dragging a handle. (#1236)
Fixed a bug with JSON faceting and the __arraycontains filter caused by tables with spaces in their names. (#1239)
Upgraded httpx dependency. (#1005)
JSON faceting is now suggested even if a column contains blank strings. (#1246)
New datasette.add_memory_database() method. (#1247)
The Response.asgi_send() method is now documented. (#1266)
The official Datasette Docker image now bundles SpatiaLite version 5. (#1278)
Fixed a no such table: pragma_database_list bug when running Datasette against SQLite versions prior to SQLite 3.16.0. (#1276)
HTML lists displayed in table cells are now styled correctly. Thanks, Bob Whitelock. (#1141, #1252)
Configuration directory mode now correctly serves immutable databases that are listed in inspect-data.json. Thanks Campbell Allen and Frankie Robertson. (#1031, #1229)
0.55 (2021-02-18)¶
Support for cross-database SQL queries and built-in support for serving via HTTPS.
The new --crossdb command-line option causes Datasette to attach up to ten database files to the same /_memory database connection. This enables cross-database SQL queries, including the ability to use joins and unions to combine data from tables that exist in different database files. See Cross-database queries for details. (#283)
--ssl-keyfile and --ssl-certfile options can be used to specify a TLS certificate, allowing Datasette to serve traffic over https:// without needing to run it behind a separate proxy. (#1221)
The /:memory: page has been renamed (and redirected) to /_memory for consistency with the new /_internal database introduced in Datasette 0.54. (#1205)
Added plugin testing documentation on Using pdb for errors thrown inside Datasette. (#1207)
The official Datasette Docker image now uses Python 3.7.10, applying the latest security fix for that Python version. (#1235)
0.54.1 (2021-02-02)¶
Fixed a bug where ?_search= and ?_sort= parameters were incorrectly duplicated when the filter form on the table page was re-submitted. (#1214)
0.54 (2021-01-25)¶
The two big new features in this release are the _internal SQLite in-memory database storing details of all connected databases and tables, and support for JavaScript modules in plugins and additional scripts.
For additional commentary on this release, see Datasette 0.54, the annotated release notes.
The _internal database¶
As part of ongoing work to help Datasette handle much larger numbers of connected databases and tables (see Datasette Library) Datasette now maintains an in-memory SQLite database with details of all of the attached databases, tables, columns, indexes and foreign keys. (#1150)
This will support future improvements such as a searchable, paginated homepage of all available tables.
You can explore an example of this database by signing in as root to the latest.datasette.io demo instance and then navigating to latest.datasette.io/_internal.
Plugins can use these tables to introspect attached data in an efficient way. Plugin authors should note that this is not yet considered a stable interface, so any plugins that use this may need to make changes prior to Datasette 1.0 if the _internal table schemas change.
Named in-memory database support¶
As part of the work building the _internal database, Datasette now supports named in-memory databases that can be shared across multiple connections. This allows plugins to create in-memory databases which will persist data for the lifetime of the Datasette server process. (#1151)
The new memory_name= parameter to the Database class can be used to create named, shared in-memory databases.
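A hedged sketch of creating a named shared in-memory database from plugin code; the name is illustrative and the registration call is an assumption that may differ between Datasette versions:

from datasette.database import Database

# The same memory_name can be opened from multiple connections and persists
# for the lifetime of the Datasette server process
db = Database(datasette, memory_name="my_plugin_cache")
datasette.add_database(db, name="my_plugin_cache")  # registration signature is an assumption
await db.execute_write("create table if not exists cache (key text primary key, value text)")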
JavaScript modules¶
JavaScript modules were introduced in ECMAScript 2015 and provide native browser support for the import and export keywords.
To use modules, JavaScript needs to be included in <script> tags with a type="module" attribute.
Datasette now has the ability to output <script type="module"> in places where you may wish to take advantage of modules. The extra_js_urls option described in Custom CSS and JavaScript can now be used with modules, and module support is also available for the extra_body_script() plugin hook. (#1186, #1187)
datasette-leaflet-freedraw is the first example of a Datasette plugin that takes advantage of the new support for JavaScript modules. See Drawing shapes on a map to query a SpatiaLite database for more on this plugin.
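A hedged sketch of a plugin serving one of its own scripts as a module through the extra_js_urls() hook mentioned above; the plugin and file names are illustrative:

from datasette import hookimpl

@hookimpl
def extra_js_urls():
    return [
        {
            "url": "/-/static-plugins/my_plugin/app.js",
            "module": True,  # emitted as <script type="module">
        }
    ]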
Code formatting with Black and Prettier¶
Datasette adopted Black for opinionated Python code formatting in June 2019. Datasette now also embraces Prettier for JavaScript formatting, which like Black is enforced by tests in continuous integration. Instructions for using these two tools can be found in the new section on Code formatting in the contributors documentation. (#1167)
Other changes¶
Datasette can now open multiple database files with the same name, e.g. if you run datasette path/to/one.db path/to/other/one.db. (#509)
datasette publish cloudrun now sets force_https_urls for every deployment, fixing some incorrect http:// links. (#1178)
Fixed a bug in the example nginx configuration in Running Datasette behind a proxy. (#1091)
The Datasette Ecosystem documentation page has been reduced in size in favour of the datasette.io tools and plugins directories. (#1182)
The request object now provides a request.full_path property, which returns the path including any query string. (#1184)
Better error message for disallowed PRAGMA clauses in SQL queries. (#1185)
datasette publish heroku now deploys using python-3.8.7.
New plugin testing documentation on Testing outbound HTTP calls with pytest-httpx. (#1198)
All ?_* query string parameters passed to the table page are now persisted in hidden form fields, so parameters such as ?_size=10 will be correctly passed to the next page when query filters are changed. (#1194)
Fixed a bug loading a database file called test-database (1).sqlite. (#1181)
0.53 (2020-12-10)¶
Datasette has an official project website now, at https://datasette.io/. This release mainly updates the documentation to reflect the new site.
New ?column__arraynotcontains= table filter. (#1132)
datasette serve has a new --create option, which will create blank database files if they do not already exist rather than exiting with an error. (#1135)
New ?_header=off option for CSV export which omits the CSV header row, documented here. (#1133)
"Powered by Datasette" link in the footer now links to https://datasette.io/. (#1138)
Project news no longer lives in the README - it can now be found at https://datasette.io/news. (#1137)
0.52.5 (2020-12-09)¶
Fix for error caused by combining the _searchmode=raw and ?_search_COLUMN parameters. (#1134)
0.52.4 (2020-12-05)¶
0.52.3 (2020-12-03)¶
Fixed bug where static assets would 404 for Datasette installed on ARM Amazon Linux. (#1124)
0.52.2 (2020-12-02)¶
Generated columns from SQLite 3.31.0 or higher are now correctly displayed. (#1116)
Error message if you attempt to open a SpatiaLite database now suggests using --load-extension=spatialite if it detects that the extension is available in a common location. (#1115)
OPTIONS requests against the /database page no longer raise a 500 error. (#1100)
Databases larger than 32MB that are published to Cloud Run can now be downloaded. (#749)
Fix for misaligned cog icon on table and database pages. Thanks, Abdussamet Koçak. (#1121)
0.52.1 (2020-11-29)¶
Documentation on Testing plugins now recommends using datasette.client. (#1102)
Fix bug where compound foreign keys produced broken links. (#1098)
datasette --load-module=spatialite now also checks for /usr/local/lib/mod_spatialite.so. Thanks, Dan Peterson. (#1114)
0.52 (2020-11-28)¶
This release includes a number of changes relating to an internal rebranding effort: Datasette's configuration mechanism (things like datasette --config default_page_size:10) has been renamed to settings.
New --setting default_page_size 10 option as a replacement for --config default_page_size:10 (note the lack of a colon). The --config option is deprecated but will continue working until Datasette 1.0. (#992)
The /-/config introspection page is now /-/settings, and the previous page redirects to the new one. (#1103)
The config.json file in Configuration directory mode is now called settings.json. (#1104)
The undocumented datasette.config() internal method has been replaced by a documented .setting(key) method. (#1107)
Also in this release:
New plugin hook: database_actions(datasette, actor, database, request), which adds menu items to a new cog menu shown at the top of the database page. (#1077)
datasette publish cloudrun has a new --apt-get-install option that can be used to install additional Ubuntu packages as part of the deployment. This is useful for deploying the new datasette-ripgrep plugin. (#1110)
Swept the documentation to remove words that minimize involved difficulty. (#1089)
And some bug fixes:
Foreign keys linking to rows with blank label columns now display as a hyphen, allowing those links to be clicked. (#1086)
Fixed bug where row pages could sometimes 500 if the underlying queries exceeded a time limit. (#1088)
Fixed a bug where the table action menu could appear partially obscured by the edge of the page. (#1084)
0.51.1 (2020-10-31)¶
Improvements to the new Binary data documentation page.
0.51 (2020-10-31)¶
A new visual design, plugin hooks for adding navigation options, better handling of binary data, URL building utility methods and better support for running Datasette behind a proxy.
New visual design¶
Datasette is no longer white and grey with blue and purple links! Natalie Downe has been working on a visual refresh, the first iteration of which is included in this release. (#1056)
Plugins can now add links within Datasette¶
A number of existing Datasette plugins add new pages to the Datasette interface, providing tools for things like uploading CSVs, editing table schemas or configuring full-text search.
Plugins like this can now link to themselves from other parts of Datasette interface. The menu_links(datasette, actor, request) hook (#1064) lets plugins add links to Datasette's new top-right application menu, and the table_actions(datasette, actor, database, table, request) hook (#1066) adds links to a new "table actions" menu on the table page.
The demo at latest.datasette.io now includes some example plugins. To see the new table actions menu first sign into that demo as root and then visit the facetable table to see the new cog icon menu at the top of the page.
Binary data¶
SQLite tables can contain binary data in BLOB
columns. Datasette now provides links for users to download this data directly from Datasette, and uses those links to make binary data available from CSV exports. See Binary data for more details. (#1036 and #1034).
URL building¶
The new datasette.urls family of methods can be used to generate URLs to key pages within the Datasette interface, both within custom templates and Datasette plugins. See Building URLs within plugins for more details. (#904)
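For example, a plugin might use these methods to hand URLs to a custom template, with each method honouring the base_url prefix if one is configured (a minimal sketch; the database and table names are illustrative):

from datasette import hookimpl


@hookimpl
def extra_template_vars(datasette):
    # Expose a few generated URLs to custom templates.
    return {
        "home_url": datasette.urls.instance(),
        "fixtures_url": datasette.urls.database("fixtures"),
        "facetable_url": datasette.urls.table("fixtures", "facetable"),
    }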
Running Datasette behind a proxy¶
The base_url configuration option is designed to help run Datasette on a specific path behind a proxy - for example if you want to run an instance of Datasette at /my-datasette/
within your existing site's URL hierarchy, proxied behind nginx or Apache.
Support for this configuration option has been greatly improved (#1023), and guidelines for using it are now available in a new documentation section on Running Datasette behind a proxy. (#1027)
Smaller changes¶
Wide tables shown within Datasette now scroll horizontally (#998). This is achieved using a new <div class="table-wrapper"> element which may impact the implementation of some plugins (for example this change to datasette-cluster-map).
New debug-menu permission. (#1068)
Removed --debug option, which didn't do anything. (#814)
Link: HTTP header pagination. (#1014)
x button for clearing filters. (#1016)
Edit SQL button on canned queries. (#1019)
--load-extension=spatialite shortcut. (#1028)
scale-in animation for column action menu. (#1039)
Option to pass a list of templates to .render_template() is now documented. (#1045)
New datasette.urls.static_plugins() method. (#1033)
datasette -o option now opens the most relevant page. (#976)
datasette --cors option now enables access to /database.db downloads. (#1057)
Database file downloads now implement cascading permissions, so you can download a database if you have view-database-download permission even if you do not have permission to access the Datasette instance. (#1058)
New documentation on Designing URLs for your plugin. (#1053)
0.50.2 (2020-10-09)¶
Fixed another bug introduced in 0.50 where column header links on the table page were broken. (#1011)
0.50.1 (2020-10-09)¶
Fixed a bug introduced in 0.50 where the export as JSON/CSV links on the table, row and query pages were broken. (#1010)
0.50 (2020-10-09)¶
The key new feature in this release is the column actions menu on the table page (#891). This can be used to sort a column in ascending or descending order, facet data by that column or filter the table to just rows that have a value for that column.
Plugin authors can use the new datasette.client object to make internal HTTP requests from their plugins, allowing them to make use of Datasette's JSON API. (#943)
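A minimal sketch of what that looks like from plugin code (the database and table names are illustrative):

# datasette.client wraps Datasette's own ASGI app, so no network round-trip
# is involved; the response behaves like an httpx.Response.
async def count_facetable_rows(datasette):
    response = await datasette.client.get("/fixtures/facetable.json?_shape=array")
    return len(response.json())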
New Deploying Datasette documentation with guides for deploying Datasette on a Linux server using systemd or to hosting providers that support buildpacks. (#514, #997)
Other improvements in this release:
Publishing to Google Cloud Run documentation now covers Google Cloud SDK options. Thanks, Geoffrey Hing. (#995)
New datasette -o option which opens your browser as soon as Datasette starts up. (#970)
Datasette now sets sqlite3.enable_callback_tracebacks(True) so that errors in custom SQL functions will display tracebacks. (#891)
Fixed two rendering bugs with column headers in portrait mobile view. (#978, #980)
New db.table_column_details(table) introspection method for retrieving full details of the columns in a specific table, see Database introspection.
Fixed a routing bug with custom page wildcard templates. (#996)
datasette publish heroku now deploys using Python 3.8.6.
New datasette publish heroku --tar= option. (#969)
OPTIONS requests against HTML pages no longer return a 500 error. (#1001)
Datasette now supports Python 3.9.
0.49.1 (2020-09-15)¶
Fixed a bug with writable canned queries that use magic parameters but accept no non-magic arguments. (#967)
0.49 (2020-09-14)¶
See also Datasette 0.49: The annotated release notes.
Writable canned queries now expose a JSON API, see JSON API for writable canned queries. (#880)
New mechanism for defining page templates with custom path parameters - a template file called pages/about/{slug}.html will be used to render any requests to /about/something. See Path parameters for pages. (#944)
register_output_renderer() render functions can now return a Response. (#953)
New --upgrade option for datasette install. (#945)
New datasette --pdb option. (#962)
datasette --get exit code now reflects the internal HTTP status code. (#947)
New raise_404() template function for returning 404 errors. (#964)
datasette publish heroku now deploys using Python 3.8.5.
Upgraded CodeMirror to 5.57.0. (#948)
Upgraded code style to Black 20.8b1. (#958)
Fixed bug where selected facets were not correctly persisted in hidden form fields on the table page. (#963)
Renamed the default error template from 500.html to error.html.
Custom error pages are now documented, see Custom error pages. (#965)
0.48 (2020-08-16)¶
Datasette documentation now lives at docs.datasette.io.
db.is_mutable property is now documented and tested, see Database introspection.
The extra_template_vars, extra_css_urls, extra_js_urls and extra_body_script plugin hooks now all accept the same arguments. See extra_template_vars(template, database, table, columns, view_name, request, datasette) for details. (#939)
Those hooks now accept a new columns argument detailing the table columns that will be rendered on that page. (#938)
Fixed bug where plugins calling db.execute_write_fn() could hang Datasette if the connection failed. (#935)
Fixed bug with the ?_nl=on output option and binary data. (#914)
0.47.3 (2020-08-15)¶
The datasette --get command-line mechanism now ensures any plugins using the startup() hook are correctly executed. (#934)
0.47.2 (2020-08-12)¶
Fixed an issue with the Docker image published to Docker Hub. (#931)
0.47.1 (2020-08-11)¶
Fixed a bug where the sdist distribution of Datasette was not correctly including the template files. (#930)
0.47 (2020-08-11)¶
Datasette now has a GitHub discussions forum for conversations about the project that go beyond just bug reports and issues.
Datasette can now be installed on macOS using Homebrew! Run brew install simonw/datasette/datasette. See Using Homebrew. (#335)
Two new commands: datasette install name-of-plugin and datasette uninstall name-of-plugin. These are equivalent to pip install and pip uninstall but automatically run in the same virtual environment as Datasette, so users don't have to figure out where that virtual environment is - useful for installations created using Homebrew or pipx. See Installing plugins. (#925)
A new command-line option, datasette --get, accepts a path to a URL within the Datasette instance. It will run that request through Datasette (without starting a web server) and print out the response. See datasette --get for an example. (#926)
0.46 (2020-08-09)¶
Warning
This release contains a security fix related to authenticated writable canned queries. If you are using this feature you should upgrade as soon as possible.
Security fix: CSRF tokens were incorrectly included in read-only canned query forms, which could allow them to be leaked to a sophisticated attacker. See issue 918 for details.
Datasette now supports GraphQL via the new datasette-graphql plugin - see GraphQL in Datasette with the new datasette-graphql plugin.
Principal git branch has been renamed from master to main. (#849)
New debugging tool: /-/allow-debug tool (demo here) helps test allow blocks against actors, as described in Defining permissions with "allow" blocks. (#908)
New logo for the documentation, and a new project tagline: "An open source multi-tool for exploring and publishing data".
Whitespace in column values is now respected on display, using white-space: pre-wrap. (#896)
New await request.post_body() method for accessing the raw POST body, see Request object. (#897)
Database file downloads now include a content-length HTTP header, enabling download progress bars. (#905)
File downloads now also correctly set the suggested file name using a content-disposition HTTP header. (#909)
tests are now excluded from the Datasette package properly - thanks, abeyerpath. (#456)
The Datasette package published to PyPI now includes sdist as well as bdist_wheel.
Better titles for canned query pages. (#887)
Now only loads Python files from a directory passed using the --plugins-dir option - thanks, Amjith Ramanujam. (#890)
New documentation section on Publishing to Vercel.
0.45 (2020-07-01)¶
See also Datasette 0.45: The annotated release notes.
Magic parameters for canned queries, a log out feature, improved plugin documentation and four new plugin hooks.
Magic parameters for canned queries¶
Canned queries now support Magic parameters, which can be used to insert or select automatically generated values. For example:
insert into logs
(user_id, timestamp)
values
(:_actor_id, :_now_datetime_utc)
This inserts the currently authenticated actor ID and the current datetime. (#842)
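Configured in metadata.json, a canned query using those magic parameters might look something like this (a sketch; the database, table and query names are illustrative):

{
    "databases": {
        "mydatabase": {
            "queries": {
                "add_log_entry": {
                    "sql": "insert into logs (user_id, timestamp) values (:_actor_id, :_now_datetime_utc)",
                    "write": true
                }
            }
        }
    }
}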
Log out¶
The ds_actor cookie can be used by plugins (or by Datasette's --root mechanism) to authenticate users. The new /-/logout
page provides a way to clear that cookie.
A "Log out" button now shows in the global navigation provided the user is authenticated using the ds_actor
cookie. (#840)
Better plugin documentation¶
The plugin documentation has been re-arranged into four sections, including a brand new section on testing plugins. (#687)
Plugins introduces Datasette's plugin system and describes how to install and configure plugins.
Writing plugins describes how to author plugins, from one-off single file plugins to packaged plugins that can be published to PyPI. It also describes how to start a plugin using the new datasette-plugin cookiecutter template.
Plugin hooks is a full list of detailed documentation for every Datasette plugin hook.
Testing plugins describes how to write tests for Datasette plugins, using pytest and HTTPX.
New plugin hooks¶
register_magic_parameters(datasette) can be used to define new types of magic canned query parameters.
startup(datasette) can run custom code when Datasette first starts up. datasette-init is a new plugin that uses this hook to create database tables and views on startup if they have not yet been created. (#834)
canned_queries(datasette, database, actor) lets plugins provide additional canned queries beyond those defined in Datasette's metadata. See datasette-saved-queries for an example of this hook in action. (#852)
forbidden(datasette, request, message) is a hook for customizing how Datasette responds to 403 forbidden errors. (#812)
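As a rough sketch, a plugin using the canned_queries hook might look like this (the database name and SQL are illustrative):

from datasette import hookimpl


@hookimpl
def canned_queries(datasette, database):
    # Offer an extra query on one specific database; return None otherwise.
    if database == "fixtures":
        return {
            "recent_rows": {
                "sql": "select * from facetable order by rowid desc limit 10"
            }
        }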
Smaller changes¶
Cascading view permissions - so if a user has view-table they can view the table page even if they do not have view-database or view-instance. (#832)
CSRF protection no longer applies to Authentication: Bearer token requests or requests without cookies. (#835)
datasette.add_message() now works inside plugins. (#864)
Workaround for "Too many open files" error in test runs. (#846)
Respect existing scope["actor"] if already set by ASGI middleware. (#854)
New process for shipping Alpha and beta releases. (#807)
{{ csrftoken() }} now works when plugins render a template using datasette.render_template(..., request=request). (#863)
Datasette now creates a single Request object and uses it throughout the lifetime of the current HTTP request. (#870)
0.44 (2020-06-11)¶
See also Datasette 0.44: The annotated release notes.
Authentication and permissions, writable canned queries, flash messages, new plugin hooks and more.
Authentication¶
Prior to this release the Datasette ecosystem has treated authentication as exclusively the realm of plugins, most notably through datasette-auth-github.
0.44 introduces Authentication and permissions as core Datasette concepts (#699). This enables different plugins to share responsibility for authenticating requests - you might have one plugin that handles user accounts and another one that allows automated access via API keys, for example.
You'll need to install plugins if you want full user accounts, but default Datasette can now authenticate a single root user with the new --root
command-line option, which outputs a one-time use URL to authenticate as a root actor (#784):
datasette fixtures.db --root
http://127.0.0.1:8001/-/auth-token?token=5b632f8cd44b868df625f5a6e2185d88eea5b22237fd3cc8773f107cc4fd6477
INFO: Started server process [14973]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit)
Plugins can implement new ways of authenticating users using the new actor_from_request(datasette, request) hook.
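A minimal sketch of a plugin implementing that hook (the token value and actor shape are illustrative; real secrets should come from configuration):

from datasette import hookimpl

API_TOKEN = "dev-token-example"  # illustrative only


@hookimpl
def actor_from_request(datasette, request):
    # Authenticate requests that present a known bearer token.
    if request.headers.get("authorization") == "Bearer {}".format(API_TOKEN):
        return {"id": "api-bot"}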
Permissions¶
Datasette also now has a built-in concept of Permissions. The permissions system answers the following question:
Is this actor allowed to perform this action, optionally against this particular resource?
You can use the new "allow"
block syntax in metadata.json
(or metadata.yaml
) to set required permissions at the instance, database, table or canned query level. For example, to restrict access to the fixtures.db
database to the "root"
user:
{
"databases": {
"fixtures": {
"allow": {
"id" "root"
}
}
}
}
See Defining permissions with "allow" blocks for more details.
Plugins can implement their own custom permission checks using the new permission_allowed(datasette, actor, action, resource) hook.
A new debug page at /-/permissions
shows recent permission checks, to help administrators and plugin authors understand exactly what checks are being performed. This tool defaults to only being available to the root user, but can be exposed to other users by plugins that respond to the permissions-debug
permission. (#788)
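A sketch of a plugin-level permission check might look like this (the action and actor id are illustrative; returning None defers to other plugins and Datasette's defaults):

from datasette import hookimpl


@hookimpl
def permission_allowed(actor, action, resource):
    # Allow a "staff" actor to view every database; stay silent otherwise.
    if action == "view-database" and actor and actor.get("id") == "staff":
        return True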
Writable canned queries¶
Datasette's Canned queries feature lets you define SQL queries in metadata.json
which can then be executed by users visiting a specific URL. https://latest.datasette.io/fixtures/neighborhood_search for example.
Canned queries were previously restricted to SELECT
, but Datasette 0.44 introduces the ability for canned queries to execute INSERT
or UPDATE
queries as well, using the new "write": true
property (#800):
{
"databases": {
"dogs": {
"queries": {
"add_name": {
"sql": "INSERT INTO names (name) VALUES (:name)",
"write": true
}
}
}
}
}
See Writable canned queries for more details.
Flash messages¶
Writable canned queries needed a mechanism to let the user know that the query has been successfully executed. The new flash messaging system (#790) allows messages to persist in signed cookies which are then displayed to the user on the next page that they visit. Plugins can use this mechanism to display their own messages, see .add_message(request, message, type=datasette.INFO) for details.
You can try out the new messages using the /-/messages
debug tool, for example at https://latest.datasette.io/-/messages
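From plugin code, posting one of these messages is a single call, roughly like this (the message text is illustrative; datasette and request are the objects already available to the view):

# Queue a message to show the user on the next page they visit.
datasette.add_message(request, "Query executed successfully", datasette.INFO)
# Other available levels include datasette.WARNING and datasette.ERROR.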
Signed values and secrets¶
Both flash messages and user authentication needed a way to sign values and set signed cookies. Two new methods are now available for plugins to take advantage of this mechanism: .sign(value, namespace="default") and .unsign(value, namespace="default").
Datasette will generate a secret automatically when it starts up, but to avoid resetting the secret (and hence invalidating any cookies) every time the server restarts you should set your own secret. You can pass a secret to Datasette using the new --secret
option or with a DATASETTE_SECRET
environment variable. See Configuring the secret for more details.
You can also set a secret when you deploy Datasette using datasette publish
or datasette package
- see Using secrets with datasette publish.
Plugins can now sign values and verify their signatures using the datasette.sign() and datasette.unsign() methods.
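A minimal sketch of signing a value and reading it back (the namespace is illustrative):

# Sign a value with Datasette's secret, then verify and decode it again;
# unsign() raises itsdangerous.BadSignature if the value was tampered with.
token = datasette.sign({"id": "root"}, namespace="example")
actor = datasette.unsign(token, namespace="example")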
CSRF protection¶
Since writable canned queries are built using POST forms, Datasette now ships with CSRF protection (#798). This applies automatically to any POST request, which means plugins need to include a csrftoken
in any POST forms that they render. They can do that like so:
<input type="hidden" name="csrftoken" value="{{ csrftoken() }}">
register_routes() plugin hooks¶
Plugins can now register new views and routes via the register_routes(datasette) plugin hook (#819). View functions can be defined that accept any of the current datasette
object, the current request
, or the ASGI scope
, send
and receive
objects.
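A minimal sketch of a plugin registering a route (the path and response are illustrative):

from datasette import hookimpl
from datasette.utils.asgi import Response


@hookimpl
def register_routes():
    async def hello(request):
        # "name" is captured by the named group in the route regex below.
        return Response.text("Hello, {}".format(request.url_vars["name"]))

    return [(r"^/-/hello/(?P<name>.+)$", hello)]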
Smaller changes¶
New internals documentation for Request object and Response class. (#706)
request.url now respects the force_https_urls config setting. Closes (#781)
request.args.getlist() returns [] if missing. Removed request.raw_args entirely. (#774)
New datasette.get_database() method.
Added _ prefix to many private, undocumented methods of the Datasette class. (#576)
Removed the db.get_outbound_foreign_keys() method which duplicated the behaviour of db.foreign_keys_for_table().
New await datasette.permission_allowed() method.
/-/actor debugging endpoint for viewing the currently authenticated actor.
New request.cookies property.
/-/plugins endpoint now shows a list of hooks implemented by each plugin, e.g. https://latest.datasette.io/-/plugins?all=1
request.post_vars() method no longer discards empty values.
New "params" canned query key for explicitly setting named parameters, see Canned query parameters. (#797)
request.args is now a MultiParams object.
Fixed a bug with the datasette plugins command. (#802)
Nicer pattern for using make_app_client() in tests. (#395)
New request.actor property.
Fixed broken CSS on nested 404 pages. (#777)
New request.url_vars property. (#822)
Fixed a bug with the python tests/fixtures.py command for outputting Datasette's testing fixtures database and plugins. (#804)
datasette publish heroku now deploys using Python 3.8.3.
Added a warning that the register_facet_classes() hook is unstable and may change in the future. (#830)
The {"$env": "ENVIRONMENT_VARIABLE"} mechanism (see Secret configuration values) now works with variables inside nested lists. (#837)
The road to Datasette 1.0¶
I've assembled a milestone for Datasette 1.0. The focus of the 1.0 release will be the following:
Signify confidence in the quality/stability of Datasette
Give plugin authors confidence that their plugins will work for the whole 1.x release cycle
Provide the same confidence to developers building against Datasette JSON APIs
If you have thoughts about what you would like to see for Datasette 1.0 you can join the conversation on issue #519.
0.43 (2020-05-28)¶
The main focus of this release is a major upgrade to the register_output_renderer(datasette) plugin hook, which allows plugins to provide new output formats for Datasette such as datasette-atom and datasette-ics.
Redesign of register_output_renderer(datasette) to provide more context to the render callback and support an optional "can_render" callback that controls if a suggested link to the output format is provided. (#581, #770)
Visually distinguish float and integer columns - useful for figuring out why order-by-column might be returning unexpected results. (#729)
The Request object, which is passed to several plugin hooks, is now documented. (#706)
New metadata.json option for setting a custom default page size for specific tables and views, see Setting a custom page size. (#751)
Canned queries can now be configured with a default URL fragment hash, useful when working with plugins such as datasette-vega, see Additional canned query options. (#706)
Fixed a bug in datasette publish when running on operating systems where the /tmp directory lives in a different volume, using a backport of the Python 3.8 shutil.copytree() function. (#744)
Every plugin hook is now covered by the unit tests, and a new unit test checks that each plugin hook has at least one corresponding test. (#771, #773)
0.42 (2020-05-08)¶
A small release which provides improved internal methods for use in plugins, along with documentation. See #685.
Added documentation for db.execute(), see await db.execute(sql, ...).
Renamed db.execute_against_connection_in_thread() to db.execute_fn() and made it a documented method, see await db.execute_fn(fn).
New results.first() and results.single_value() methods, plus documentation for the Results class - see Results.
0.41 (2020-05-06)¶
You can now create custom pages within your Datasette instance using a custom template file. For example, adding a template file called templates/pages/about.html
will result in a new page being served at /about
on your instance. See the custom pages documentation for full details, including how to return custom HTTP headers, redirects and status codes. (#648)
Configuration directory mode (#731) allows you to define a custom Datasette instance as a directory. So instead of running the following:
datasette one.db two.db \
--metadata=metadata.json \
--template-dir=templates/ \
--plugins-dir=plugins \
--static css:css
You can instead arrange your files in a single directory called my-project
and run this:
datasette my-project/
Also in this release:
New NOT LIKE table filter: ?colname__notlike=expression. (#750)
Datasette now has a pattern portfolio at /-/patterns - e.g. https://latest.datasette.io/-/patterns. This is a page that shows every Datasette user interface component in one place, to aid core development and people building custom CSS themes. (#151)
SQLite PRAGMA functions such as pragma_table_info(tablename) are now allowed in Datasette SQL queries. (#761)
Datasette pages now consistently return a content-type of text/html; charset=utf-8. (#752)
Datasette now handles an ASGI raw_path value of None, which should allow compatibility with the Mangum adapter for running ASGI apps on AWS Lambda. Thanks, Colin Dellow. (#719)
Installation documentation now covers Using pipx. (#756)
Improved the documentation for Full-text search. (#748)
0.40 (2020-04-21)¶
Datasette Metadata can now be provided as a YAML file as an optional alternative to JSON. (#713)
Removed support for datasette publish now, which used the now-retired Zeit Now v1 hosting platform. A new plugin, datasette-publish-now, can be installed to publish data to Zeit (now Vercel) Now v2. (#710)
Fixed a bug where the extra_template_vars(request, view_name) plugin hook was not receiving the correct view_name. (#716)
Variables added to the template context by the extra_template_vars() plugin hook are now shown in the ?_context=1 debugging mode (see template_debug). (#693)
Fixed a bug where the "templates considered" HTML comment was no longer being displayed. (#689)
Fixed a datasette publish bug where --plugin-secret would over-ride plugin configuration in the provided metadata.json file. (#724)
Added a new CSS class for customizing the canned query page. (#727)
0.39 (2020-03-24)¶
New base_url configuration setting for serving up the correct links while running Datasette under a different URL prefix. (#394)
New metadata settings "sort" and "sort_desc" for setting the default sort order for a table. See Setting a default sort order. (#702)
Sort direction arrow now displays by default on the primary key. This means you only have to click once (not twice) to sort in reverse order. (#677)
New await Request(scope, receive).post_vars() method for accessing POST form variables. (#700)
Plugin hooks documentation now links to example uses of each plugin. (#709)
0.38 (2020-03-08)¶
The Docker build of Datasette now uses SQLite 3.31.1, upgraded from 3.26. (#695)
datasette publish cloudrun now accepts an optional --memory=2Gi flag for setting the Cloud Run allocated memory to a value other than the default (256Mi). (#694)
Fixed bug where templates that shipped with plugins were sometimes not being correctly loaded. (#697)
0.37.1 (2020-03-02)¶
Don't attempt to count table rows to display on the index page for databases > 100MB. (#688)
Print exceptions if they occur in the write thread rather than silently swallowing them.
Handle the possibility of scope["path"] being a string rather than bytes
Better documentation for the extra_template_vars(template, database, table, columns, view_name, request, datasette) plugin hook.
0.37 (2020-02-25)¶
Plugins now have a supported mechanism for writing to a database, using the new .execute_write() and .execute_write_fn() methods. Documentation. (#682) (See the sketch after this list.)
Immutable databases that have had their rows counted using the inspect command now use the calculated count more effectively - thanks, Kevin Keogh. (#666)
--reload no longer restarts the server if a database file is modified, unless that database was opened in immutable mode with -i. (#494)
New ?_searchmode=raw option turns off escaping for FTS queries in ?_search= allowing full use of SQLite's FTS5 query syntax. (#676)
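A rough sketch of the write methods mentioned above (the table and column names are illustrative, and db stands for a Datasette Database object):

# Queue a write against the database's single write connection and wait
# for it to complete before continuing.
await db.execute_write(
    "insert into logs (message) values (?)", ["deploy finished"], block=True
)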
0.36 (2020-02-21)¶
The datasette object passed to plugins now has API documentation: Datasette class. (#576)
New methods on datasette: .add_database() and .remove_database() - documentation. (#671)
prepare_connection() plugin hook now takes optional datasette and database arguments - prepare_connection(conn, database, datasette). (#678)
Added three new plugins and one new conversion tool to The Datasette Ecosystem.
0.35 (2020-02-04)¶
Added five new plugins and one new conversion tool to The Datasette Ecosystem.
The Datasette class has a new render_template() method which can be used by plugins to render templates using Datasette's pre-configured Jinja templating library.
You can now execute SQL queries that start with a -- comment - thanks, Jay Graves (#653)
0.34 (2020-01-29)¶
_search= queries are now correctly escaped using a new escape_fts() custom SQL function. This means you can now run searches for strings like park. without seeing errors. (#651)
Google Cloud Run is no longer in beta, so datasette publish cloudrun has been updated to work even if the user has not installed the gcloud beta components package. Thanks, Katie McLaughlin (#660)
datasette package now accepts a --port option for specifying which port the resulting Docker container should listen on. (#661)
0.33 (2019-12-22)¶
rowid is now included in dropdown menus for filtering tables (#636)
Columns are now only suggested for faceting if they have at least one value with more than one record (#638)
Queries with no results now display "0 results" (#637)
Improved documentation for the --static option (#641)
asyncio task information is now included on the /-/threads debug page
Bumped Uvicorn dependency to 0.11
You can now use --port 0 to listen on an available port
New template_debug setting for debugging templates, e.g. https://latest.datasette.io/fixtures/roadside_attractions?_context=1 (#654)
0.32 (2019-11-14)¶
Datasette now renders templates using Jinja async mode. This means plugins can provide custom template functions that perform asynchronous actions, for example the new datasette-template-sql plugin which allows custom templates to directly execute SQL queries and render their results. (#628)
0.31.2 (2019-11-13)¶
0.31.1 (2019-11-12)¶
Deployments created using datasette publish now use python:3.8 base Docker image (#629)
0.31 (2019-11-11)¶
This version adds compatibility with Python 3.8 and breaks compatibility with Python 3.5.
If you are still running Python 3.5 you should stick with 0.30.2
, which you can install like this:
pip install datasette==0.30.2
Format SQL button now works with read-only SQL queries - thanks, Tobias Kunze (#602)
New ?column__notin=x,y,z filter for table views (#614)
Table view now uses select col1, col2, col3 instead of select *
Database filenames can now contain spaces - thanks, Tobias Kunze (#590)
Removed obsolete ?_group_count=col feature (#504)
Improved user interface and documentation for datasette publish cloudrun (#608)
Tables with indexes now show the CREATE INDEX statements on the table page (#618)
Current version of uvicorn is now shown on /-/versions
Python 3.8 is now supported! (#622)
Python 3.5 is no longer supported.
0.30.2 (2019-11-02)¶
/-/plugins page now uses distribution name e.g. datasette-cluster-map instead of the name of the underlying Python package (datasette_cluster_map) (#606)
Array faceting is now only suggested for columns that contain arrays of strings (#562)
Better documentation for the --host argument (#574)
Don't show None with a broken link for the label on a nullable foreign key (#406)
0.30.1 (2019-10-30)¶
0.30 (2019-10-18)¶
Added /-/threads debugging page
Allow EXPLAIN WITH... (#583)
Button to format SQL - thanks, Tobias Kunze (#136)
Sort databases on homepage by argument order - thanks, Tobias Kunze (#585)
Display metadata footer on custom SQL queries - thanks, Tobias Kunze (#589)
Use --platform=managed for publish cloudrun (#587)
Fixed bug returning non-ASCII characters in CSV (#584)
Fix for /foo v.s. /foo-bar bug (#601)
0.29.3 (2019-09-02)¶
Fixed implementation of CodeMirror on database page (#560)
Documentation typo fixes - thanks, Min ho Kim (#561)
Mechanism for detecting if a table has FTS enabled now works if the table name used alternative escaping mechanisms (#570) - for compatibility with a recent change to sqlite-utils.
0.29.2 (2019-07-13)¶
0.29.1 (2019-07-11)¶
0.29 (2019-07-07)¶
ASGI, new plugin hooks, facet by date and much, much more...
ASGI¶
ASGI is the Asynchronous Server Gateway Interface standard. I've been wanting to convert Datasette into an ASGI application for over a year - Port Datasette to ASGI #272 tracks thirteen months of intermittent development - but with Datasette 0.29 the change is finally released. This also means Datasette now runs on top of Uvicorn and no longer depends on Sanic.
I wrote about the significance of this change in Porting Datasette to ASGI, and Turtles all the way down.
The most exciting consequence of this change is that Datasette plugins can now take advantage of the ASGI standard.
New plugin hook: asgi_wrapper¶
The asgi_wrapper(datasette) plugin hook allows plugins to entirely wrap the Datasette ASGI application in their own ASGI middleware. (#520)
Two new plugins take advantage of this hook:
datasette-auth-github adds an authentication layer: users will have to sign in using their GitHub account before they can view data or interact with Datasette. You can also use it to restrict access to specific GitHub users, or to members of specified GitHub organizations or teams.
datasette-cors allows you to configure CORS headers for your Datasette instance. You can use this to enable JavaScript running on a whitelisted set of domains to make
fetch()
calls to the JSON API provided by your Datasette instance.
New plugin hook: extra_template_vars¶
The extra_template_vars(template, database, table, columns, view_name, request, datasette) plugin hook allows plugins to inject their own additional variables into the Datasette template context. This can be used in conjunction with custom templates to customize the Datasette interface. datasette-auth-github uses this hook to add custom HTML to the new top navigation bar (which is designed to be modified by plugins, see #540).
Secret plugin configuration options¶
Plugins like datasette-auth-github need a safe way to set secret configuration options. Since the default mechanism for configuring plugins exposes those settings in /-/metadata
a new mechanism was needed. Secret configuration values describes how plugins can now specify that their settings should be read from a file or an environment variable:
{
"plugins": {
"datasette-auth-github": {
"client_secret": {
"$env": "GITHUB_CLIENT_SECRET"
}
}
}
}
These plugin secrets can be set directly using datasette publish
. See Custom metadata and plugins for details. (#538 and #543)
Facet by date¶
If a column contains datetime values, Datasette can now facet that column by date. (#481)
Easier custom templates for table rows¶
If you want to customize the display of individual table rows, you can do so using a _table.html
template include that looks something like this:
{% for row in display_rows %}
<div>
<h2>{{ row["title"] }}</h2>
<p>{{ row["description"] }}</p>
<p>Category: {{ row.display("category_id") }}</p>
</div>
{% endfor %}
This is a backwards incompatible change. If you previously had a custom template called _rows_and_columns.html
you need to rename it to _table.html
.
See Custom templates for full details.
?_through= for joins through many-to-many tables¶
The new ?_through={json}
argument to the Table view allows records to be filtered based on a many-to-many relationship. See Special table arguments for full documentation - here's an example. (#355)
This feature was added to help support facet by many-to-many, which isn't quite ready yet but will be coming in the next Datasette release.
Small changes¶
Databases published using datasette publish now open in Immutable mode. (#469)
?col__date= now works for columns containing spaces
Automatic label detection (for deciding which column to show when linking to a foreign key) has been improved. (#485)
Fixed bug where pagination broke when combined with an expanded foreign key. (#489)
Contributors can now run pip install -e .[docs] to get all of the dependencies needed to build the documentation, including cd docs && make livehtml support.
Datasette's dependencies are now all specified using the ~= match operator. (#532)
white-space: pre-wrap now used for table creation SQL. (#505)
Full list of commits between 0.28 and 0.29.
0.28 (2019-05-19)¶
A salmagundi of new features!
Supporting databases that change¶
From the beginning of the project, Datasette has been designed with read-only databases in mind. If a database is guaranteed not to change it opens up all kinds of interesting opportunities - from taking advantage of SQLite immutable mode and HTTP caching to bundling static copies of the database directly in a Docker container. The interesting ideas in Datasette explores this idea in detail.
As my goals for the project have developed, I realized that read-only databases are no longer the right default. SQLite actually supports concurrent access very well provided only one thread attempts to write to a database at a time, and I keep encountering sensible use-cases for running Datasette on top of a database that is processing inserts and updates.
So, as-of version 0.28 Datasette no longer assumes that a database file will not change. It is now safe to point Datasette at a SQLite database which is being updated by another process.
Making this change was a lot of work - see tracking tickets #418, #419 and #420. It required new thinking around how Datasette should calculate table counts (an expensive operation against a large, changing database) and also meant reconsidering the "content hash" URLs Datasette has used in the past to optimize the performance of HTTP caches.
Datasette can still run against immutable files and gains numerous performance benefits from doing so, but this is no longer the default behaviour. Take a look at the new Performance and caching documentation section for details on how to make the most of Datasette against data that you know will be staying read-only and immutable.
Faceting improvements, and faceting plugins¶
Datasette Facets provide an intuitive way to quickly summarize and interact with data. Previously the only supported faceting technique was column faceting, but 0.28 introduces two powerful new capabilities: facet-by-JSON-array and the ability to define further facet types using plugins.
Facet by array (#359) is only available if your SQLite installation provides the json1
extension. Datasette will automatically detect columns that contain JSON arrays of values and offer a faceting interface against those columns - useful for modelling things like tags without needing to break them out into a new table. See Facet by JSON array for more.
The new register_facet_classes() plugin hook (#445) can be used to register additional custom facet classes. Each facet class should provide two methods: suggest()
which suggests facet selections that might be appropriate for a provided SQL query, and facet_results()
which executes a facet operation and returns results. Datasette's own faceting implementations have been refactored to use the same API as these plugins.
datasette publish cloudrun¶
Google Cloud Run is a brand new serverless hosting platform from Google, which allows you to build a Docker container which will run only when HTTP traffic is received and will shut down (and hence cost you nothing) the rest of the time. It's similar to Zeit's Now v1 Docker hosting platform which sadly is no longer accepting signups from new users.
The new datasette publish cloudrun
command was contributed by Romain Primet (#434) and publishes selected databases to a new Datasette instance running on Google Cloud Run.
See Publishing to Google Cloud Run for full documentation.
register_output_renderer plugins¶
Russ Garrett implemented a new Datasette plugin hook called register_output_renderer (#441) which allows plugins to create additional output renderers in addition to Datasette's default .json
and .csv
.
Russ's in-development datasette-geo plugin includes an example of this hook being used to output .geojson
automatically converted from SpatiaLite.
Medium changes¶
Datasette now conforms to the Black coding style (#449) - and has a unit test to enforce this in the future
- New Special table arguments:
?columnname__in=value1,value2,value3 filter for executing SQL IN queries against a table, see Table arguments (#433)
?columnname__date=yyyy-mm-dd filter which returns rows where the specified datetime column falls on the specified date (583b22a)
?tags__arraycontains=tag filter which acts against a JSON array contained in a column (78e45ea)
?_where=sql-fragment filter for the table view (#429)
?_fts_table=mytable and ?_fts_pk=mycolumn query string options can be used to specify which FTS table to use for a search query - see Configuring full-text search for a table or view (#428)
You can now pass the same table filter multiple times - for example, ?content__not=world&content__not=hello will return all rows where the content column is neither hello nor world (#288)
You can now specify about and about_url metadata (in addition to source and license) linking to further information about a project - see Source, license and about
New ?_trace=1 parameter now adds debug information showing every SQL query that was executed while constructing the page (#435)
datasette inspect now just calculates table counts, and does not introspect other database metadata (#462)
Removed /-/inspect page entirely - this will be replaced by something similar in the future, see #465
Datasette can now run against an in-memory SQLite database. You can do this by starting it without passing any files or by using the new --memory option to datasette serve. This can be useful for experimenting with SQLite queries that do not access any data, such as SELECT 1+1 or SELECT sqlite_version().
Small changes¶
We now show the size of the database file next to the download link (#172)
New /-/databases introspection page shows currently connected databases (#470)
Binary data is no longer displayed on the table and row pages (#442 - thanks, Russ Garrett)
New show/hide SQL links on custom query pages (#415)
The extra_body_script plugin hook now accepts an optional view_name argument (#443 - thanks, Russ Garrett)
Bumped Jinja2 dependency to 2.10.1 (#426)
All table filters are now documented, and documentation is enforced via unit tests (2c19a27)
New project guideline: master should stay shippable at all times! (31f36e1)
Fixed a bug where sqlite_timelimit() occasionally failed to clean up after itself (bac4e01)
We no longer load additional plugins when executing pytest (#438)
Homepage now links to database views if there are less than five tables in a database (#373)
The --cors option is now respected by error pages (#453)
datasette publish heroku now uses the --include-vcs-ignore option, which means it works under Travis CI (#407)
datasette publish heroku now publishes using Python 3.6.8 (666c374)
Renamed datasette publish now to datasette publish nowv1 (#472)
datasette publish nowv1 now accepts multiple --alias parameters (09ef305)
Removed the datasette skeleton command (#476)
The documentation on how to build the documentation now recommends sphinx-autobuild
0.27.1 (2019-05-09)¶
Tiny bugfix release: don't install tests/ in the wrong place. Thanks, Veit Heller.
0.27 (2019-01-31)¶
New command: datasette plugins (documentation) shows you the currently installed list of plugins.
Datasette can now output newline-delimited JSON using the new ?_shape=array&_nl=on query string option.
Added documentation on The Datasette Ecosystem.
Now using Python 3.7.2 as the base for the official Datasette Docker image.
0.26.1 (2019-01-10)¶
/-/versions now includes SQLite compile_options (#396)
datasetteproject/datasette Docker image now uses SQLite 3.26.0 (#397)
Cleaned up some deprecation warnings under Python 3.7
0.26 (2019-01-02)¶
datasette serve --reload now restarts Datasette if a database file changes on disk.
datasette publish now now takes an optional --alias mysite.now.sh argument. This will attempt to set an alias after the deploy completes.
Fixed a bug where the advanced CSV export form failed to include the currently selected filters (#393)
0.25.2 (2018-12-16)¶
datasette publish heroku now uses the python-3.6.7 runtime
Added documentation on how to build the documentation
Added documentation covering our release process
Upgraded to pytest 4.0.2
0.25.1 (2018-11-04)¶
Documentation improvements plus a fix for publishing to Zeit Now.
datasette publish now
now uses Zeit's v1 platform, to work around the new 100MB image limit. Thanks, @slygent - closes #366.
0.25 (2018-09-19)¶
New plugin hooks, improved database view support and an easier way to use more recent versions of SQLite.
New publish_subcommand plugin hook. A plugin can now add additional datasette publish publishers in addition to the default now and heroku, both of which have been refactored into default plugins. publish_subcommand documentation. Closes #349
New render_cell plugin hook. Plugins can now customize how values are displayed in the HTML tables produced by Datasette's browsable interface. datasette-json-html and datasette-render-images are two new plugins that use this hook. render_cell documentation. Closes #352
New extra_body_script plugin hook, enabling plugins to provide additional JavaScript that should be added to the page footer. extra_body_script documentation.
extra_css_urls and extra_js_urls hooks now take additional optional parameters, allowing them to be more selective about which pages they apply to. Documentation.
You can now use the sortable_columns metadata setting to explicitly enable sort-by-column in the interface for database views, as well as for specific tables.
The new fts_table and fts_pk metadata settings can now be used to explicitly configure full-text search for a table or a view, even if that table is not directly coupled to the SQLite FTS feature in the database schema itself.
Datasette will now use pysqlite3 in place of the standard library sqlite3 module if it has been installed in the current environment. This makes it much easier to run Datasette against a more recent version of SQLite, including the just-released SQLite 3.25.0 which adds window function support. More details on how to use this in #360
New mechanism that allows plugin configuration options to be set using metadata.json.
0.24 (2018-07-23)¶
A number of small new features:
datasette publish heroku now supports --extra-options, fixes #334
Custom error message if SpatiaLite is needed for specified database, closes #331
New config option: truncate_cells_html for truncating long cell values in HTML view - closes #330
Documentation for datasette publish and datasette package, closes #337
Fixed compatibility with Python 3.7
datasette publish heroku now supports app names via the -n option, which can also be used to overwrite an existing application [Russ Garrett]
Title and description metadata can now be set for canned SQL queries, closes #342
New force_https_on config option, fixes https:// API URLs when deploying to Zeit Now - closes #333
?_json_infinity=1 query string argument for handling Infinity/-Infinity values in JSON, closes #332
URLs displayed in the results of custom SQL queries are now URLified, closes #298
0.23.2 (2018-07-07)¶
Minor bugfix and documentation release.
CSV export now respects --cors, fixes #326
Installation instructions, including docker image - closes #328
Fix for row pages for tables with / in, closes #325
0.23.1 (2018-06-21)¶
Minor bugfix release.
Correctly display empty strings in HTML table, closes #314
Allow "." in database filenames, closes #302
404s ending in slash redirect to remove that slash, closes #309
Fixed incorrect display of compound primary keys with foreign key references. Closes #319
Docs + example of canned SQL query using || concatenation. Closes #321
Correctly display facets with value of 0 - closes #318
Default 'expand labels' to checked in CSV advanced export
0.23 (2018-06-18)¶
This release features CSV export, improved options for foreign key expansions, new configuration settings and improved support for SpatiaLite.
See datasette/compare/0.22.1...0.23 for a full list of commits added since the last release.
CSV export¶
Any Datasette table, view or custom SQL query can now be exported as CSV.
Check out the CSV export documentation for more details, or try the feature out on https://fivethirtyeight.datasettes.com/fivethirtyeight/bechdel%2Fmovies
If your table has more than max_returned_rows (default 1,000) Datasette provides the option to stream all rows. This option takes advantage of async Python and Datasette's efficient pagination to iterate through the entire matching result set and stream it back as a downloadable CSV file.
Foreign key expansions¶
When Datasette detects a foreign key reference it attempts to resolve a label for that reference (automatically or using the Specifying the label column for a table metadata option) so it can display a link to the associated row.
This expansion is now also available for JSON and CSV representations of the
table, using the new _labels=on
query string option. See
Expanding foreign key references for more details.
New configuration settings¶
Datasette's Settings now also supports boolean settings. A number of new configuration options have been added:
num_sql_threads - the number of threads used to execute SQLite queries. Defaults to 3.
allow_facet - enable or disable custom Facets using the _facet= parameter. Defaults to on.
suggest_facets - should Datasette suggest facets? Defaults to on.
allow_download - should users be allowed to download the entire SQLite database? Defaults to on.
allow_sql - should users be allowed to execute custom SQL queries? Defaults to on.
default_cache_ttl - Default HTTP caching max-age header in seconds. Defaults to 365 days - caching can be disabled entirely by setting this to 0.
cache_size_kb - Set the amount of memory SQLite uses for its per-connection cache, in KB.
allow_csv_stream - allow users to stream entire result sets as a single CSV file. Defaults to on.
max_csv_mb - maximum size of a returned CSV file in MB. Defaults to 100MB, set to 0 to disable this limit.
Control HTTP caching with ?_ttl=¶
You can now customize the HTTP max-age header that is sent on a per-URL basis, using the new ?_ttl=
query string parameter.
You can set this to any value in seconds, or you can set it to 0 to disable HTTP caching entirely.
Consider for example this query which returns a randomly selected member of the Avengers:
select * from [avengers/avengers] order by random() limit 1
If you hit the following page repeatedly you will get the same result, due to HTTP caching:
/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1
By adding ?_ttl=0 to the URL you can ensure the page will not be cached and get back a different super hero every time:
/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1&_ttl=0
Improved support for SpatiaLite¶
The SpatiaLite module for SQLite adds robust geospatial features to the database.
Getting SpatiaLite working can be tricky, especially if you want to use the most recent alpha version (with support for K-nearest neighbor).
Datasette now includes extensive documentation on SpatiaLite, and thanks to Ravi Kotecha our GitHub repo includes a Dockerfile that can build the latest SpatiaLite and configure it for use with Datasette.
The datasette publish
and datasette package
commands now accept a new
--spatialite
argument which causes them to install and configure SpatiaLite
as part of the container they deploy.
latest.datasette.io¶
Every commit to Datasette master is now automatically deployed by Travis CI to https://latest.datasette.io/ - ensuring there is always a live demo of the latest version of the software.
The demo uses the fixtures from our unit tests, ensuring it demonstrates the same range of functionality that is covered by the tests.
You can see how the deployment mechanism works in our .travis.yml file.
Miscellaneous¶
Got JSON data in one of your columns? Use the new ?_json=COLNAME argument to tell Datasette to return that JSON value directly rather than encoding it as a string.
If you just want an array of the first value of each row, use the new ?_shape=arrayfirst option - example.
0.22.1 (2018-05-23)¶
Bugfix release, plus we now use versioneer for our version numbers.
Faceting no longer breaks pagination, fixes #282
Add __version_info__ derived from __version__ [Robert Gieseke]
This might be a tuple of more than two values (major and minor version) if commits have been made after a release.
Add version number support with Versioneer. [Robert Gieseke]
Versioneer Licence: Public Domain (CC0-1.0)
Closes #273
Refactor inspect logic [Russ Garrett]
0.22 (2018-05-20)¶
The big new feature in this release is Facets. Datasette can now apply faceted browse to any column in any table. It will also suggest possible facets. See the Datasette Facets announcement post for more details.
In addition to the work on facets:
New
--config
option, added--help-config
, closes #274Removed the
--page_size=
argument todatasette serve
in favour of:datasette serve --config default_page_size:50 mydb.db
Added new help section:
datasette --help-config
Config options:
  default_page_size            Default page size for the table view (default=100)
  max_returned_rows            Maximum rows that can be returned from a table or custom query (default=1000)
  sql_time_limit_ms            Time limit for a SQL query in milliseconds (default=1000)
  default_facet_size           Number of values to return for requested facets (default=30)
  facet_time_limit_ms          Time limit for calculating a requested facet (default=200)
  facet_suggest_time_limit_ms  Time limit for calculating a suggested facet (default=50)
Only apply responsive table styles to
.rows-and-column
Otherwise they interfere with tables in the description, e.g. on https://fivethirtyeight.datasettes.com/fivethirtyeight/nba-elo%2Fnbaallelo
Refactored views into new views/ modules, refs #256
Documentation for SQLite full-text search support, closes #253
/-/versions now includes SQLite fts_versions, closes #252
0.21 (2018-05-05)¶
New JSON _shape=
options, the ability to set table _size=
and a mechanism for searching within specific columns.
Default tests to using a longer timelimit
Every now and then a test will fail in Travis CI on Python 3.5 because it hit the default 20ms SQL time limit.
Test fixtures now default to a 200ms time limit, and we only use the 20ms time limit for the specific test that tests query interruption. This should make our tests on Python 3.5 in Travis much more stable.
Support _search_COLUMN=text searches, closes #237
Show version on /-/plugins page, closes #248
?_size=max option, closes #249
Added /-/versions and /-/versions.json, closes #244
Sample output:
{ "python": { "version": "3.6.3", "full": "3.6.3 (default, Oct 4 2017, 06:09:38) \n[GCC 4.2.1 Compatible Apple LLVM 9.0.0 (clang-900.0.37)]" }, "datasette": { "version": "0.20" }, "sqlite": { "version": "3.23.1", "extensions": { "json1": null, "spatialite": "4.3.0a" } } }
Renamed ?_sql_time_limit_ms= to ?_timelimit, closes #242
New ?_shape=array option + tweaks to _shape, closes #245
Default is now ?_shape=arrays (renamed from lists)
New ?_shape=array returns an array of objects as the root object
Changed ?_shape=object to return the object as the root
Updated docs
FTS tables now detected by inspect(), closes #240
New ?_size=XXX query string parameter for table view, closes #229
Also added documentation for all of the _special arguments.
Plus deleted some duplicate logic implementing _group_count.
If max_returned_rows==page_size, increment max_returned_rows - fixes #230
New hidden: True option for table metadata, closes #239
Hide idx_* tables if spatialite detected, closes #228
Added class=rows-and-columns to custom query results table
Added CSS class rows-and-columns to main table
label_column option in metadata.json - closes #234
0.20 (2018-04-20)¶
Mostly new work on the Plugins mechanism: plugins can now bundle static assets and custom templates, and datasette publish
has a new --install=name-of-plugin
option.
Add col-X classes to HTML table on custom query page
Fixed out-dated template in documentation
Plugins can now bundle custom templates, #224
Added /-/metadata /-/plugins /-/inspect, #225
Documentation for --install option, refs #223
Datasette publish/package --install option, #223
Fix for plugins in Python 3.5, #222
New plugin hooks: extra_css_urls() and extra_js_urls(), #214
/-/static-plugins/PLUGIN_NAME/ now serves static/ from plugins
<th> now gets class="col-X" - plus added col-X documentation
Use to_css_class for table cell column classes
This ensures that columns with spaces in the name will still generate usable CSS class names. Refs #209
Add column name classes to <td>s, make PK bold [Russ Garrett]
Don't duplicate simple primary keys in the link column [Russ Garrett]
When there's a simple (single-column) primary key, it looks weird to duplicate it in the link column.
This change removes the second PK column and treats the link column as if it were the PK column from a header/sorting perspective.
Correct escaping for HTML display of row links [Russ Garrett]
Longer time limit for test_paginate_compound_keys
It was failing intermittently in Travis - see #209
Use application/octet-stream for downloadable databases
Updated PyPI classifiers
Updated PyPI link to pypi.org
0.19 (2018-04-16)¶
This is the first preview of the new Datasette plugins mechanism. Only two plugin hooks are available so far - for custom SQL functions and custom template filters. There's plenty more to come - read the documentation and get involved in the tracking ticket if you have feedback on the direction so far.
Fix for _sort_desc=sortable_with_nulls test, refs #216
Fixed #216 - paginate correctly when sorting by nullable column
Initial documentation for plugins, closes #213
New --plugins-dir=plugins/ option (#212)
New option causing Datasette to load and evaluate all of the Python files in the specified directory and register any plugins that are defined in those files.
This new option is available for the following commands:
datasette serve mydb.db --plugins-dir=plugins/
datasette publish now/heroku mydb.db --plugins-dir=plugins/
datasette package mydb.db --plugins-dir=plugins/
Start of the plugin system, based on pluggy (#210)
Uses https://pluggy.readthedocs.io/ originally created for the py.test project
We're starting with two plugin hooks:
prepare_connection(conn)
This is called when a new SQLite connection is created. It can be used to register custom SQL functions.
prepare_jinja2_environment(env)
This is called with the Jinja2 environment. It can be used to register custom template tags and filters.
An example plugin which uses these two hooks can be found at https://github.com/simonw/datasette-plugin-demos or installed using
pip install datasette-plugin-demos
Refs #14
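A minimal sketch of a plugin implementing both hooks, loosely in the spirit of that demo plugin (the function and filter here are illustrative):

from datasette import hookimpl
import random


@hookimpl
def prepare_connection(conn):
    # Adds a custom SQL function: select random_integer(1, 10)
    conn.create_function("random_integer", 2, random.randint)


@hookimpl
def prepare_jinja2_environment(env):
    # Adds a custom template filter: {{ "hello"|uppercase }}
    env.filters["uppercase"] = lambda s: s.upper()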
Return HTTP 405 on InvalidUsage rather than 500. [Russ Garrett]
This also stops it filling up the logs. This happens for HEAD requests at the moment - which perhaps should be handled better, but that's a different issue.
0.18 (2018-04-14)¶
This release introduces support for units,
contributed by Russ Garrett (#203).
You can now optionally specify the units for specific columns using metadata.json
.
Once specified, units will be displayed in the HTML view of your table. They also become
available for use in filters - if a column is configured with a unit of distance, you can
request all rows where that column is less than 50 meters or more than 20 feet for example.
Link foreign keys which don't have labels. [Russ Garrett]
This renders unlabeled FKs as simple links.
Also includes bonus fixes for two minor issues:
In foreign key link hrefs the primary key was escaped using HTML escaping rather than URL escaping. This broke some non-integer PKs.
Print tracebacks to console when handling 500 errors.
Fix SQLite error when loading rows with no incoming FKs. [Russ Garrett]
This fixes an error caused by an invalid query when loading incoming FKs.
The error was ignored due to async but it still got printed to the console.
Allow custom units to be registered with Pint. [Russ Garrett]
Support units in filters. [Russ Garrett]
Tidy up units support. [Russ Garrett]
Add units to exported JSON
Units key in metadata skeleton
Docs
Initial units support. [Russ Garrett]
Add support for specifying units for a column in metadata.json and rendering them on display using pint
0.17 (2018-04-13)¶
Release 0.17 to fix issues with PyPI
0.16 (2018-04-13)¶
Better mechanism for handling errors; 404s for missing table/database
New error mechanism closes #193
404s for missing tables/databases closes #184
long_description in markdown for the new PyPI
Hide SpatiaLite system tables. [Russ Garrett]
Allow explain select / explain query plan select #201
Datasette inspect now finds primary_keys #195
Ability to sort using form fields (for mobile portrait mode) #199
We now display sort options as a select box plus a descending checkbox, which means you can apply sort orders even in portrait mode on a mobile phone where the column headers are hidden.
0.15 (2018-04-09)¶
The biggest new feature in this release is the ability to sort by column. On the table page the column headers can now be clicked to apply sort (or descending sort), or you can specify ?_sort=column or ?_sort_desc=column directly in the URL.
table_rows => table_rows_count, filtered_table_rows => filtered_table_rows_count
Renamed properties. Closes #194
New sortable_columns option in metadata.json to control sort options.
You can now explicitly set which columns in a table can be used for sorting using the _sort and _sort_desc arguments using metadata.json:
{ "databases": { "database1": { "tables": { "example_table": { "sortable_columns": [ "height", "weight" ] } } } } }
Refs #189
Column headers now link to sort/desc sort - refs #189
_sort and _sort_desc parameters for table views
Allows for paginated sorted results based on a specified column.
Refs #189
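For example (database, table and column names here are hypothetical):
/mydatabase/mytable?_sort=height
/mydatabase/mytable?_sort_desc=height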
Total row count now correct even if _next applied
Use .custom_sql() for _group_count implementation (refs #150)
Make HTML title more readable in query template (#180) [Ryan Pitts]
New ?_shape=objects/object/lists param for JSON API (#192)
New _shape= parameter replacing old .jsono extension
Now instead of this:
/database/table.jsono
We use the _shape parameter like this:
/database/table.json?_shape=objects
Also introduced a new _shape called object which looks like this:
/database/table.json?_shape=object
Returning an object for the rows key:
... "rows": { "pk1": { ... }, "pk2": { ... } }
Refs #122
Utility for writing test database fixtures to a .db file
python tests/fixtures.py /tmp/hello.db
This is useful for making a SQLite database of the test fixtures for interactive exploration.
Compound primary key _next= now plays well with extra filters
Closes #190
Fixed bug with keyset pagination over compound primary keys
Refs #190
Database/Table views inherit source/license/source_url/license_url metadata
If you set the source_url/license_url/source/license fields in your root metadata those values will now be inherited all the way down to the database and table templates.
The title/description are NOT inherited.
Also added unit tests for the HTML generated by the metadata.
Refs #185
Add metadata, if it exists, to heroku temp dir (#178) [Tony Hirst]
Initial documentation for pagination
Broke up test_app into test_api and test_html
Fixed bug with .json path regular expression
I had a table called geojson and it caused an exception because the regex was matching .json and not \.json
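A quick Python illustration of the behaviour behind the fix (not the actual Datasette code, just the regex difference):
import re

# An unescaped "." matches any character, so ".json$" wrongly matches "geojson"
re.search(r".json$", "geojson")    # matches ("ojson" is any-char + "json")
re.search(r"\.json$", "geojson")   # no match - requires a literal ".json" suffix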
Deploy to Heroku with Python 3.6.3
0.14 (2017-12-09)¶
The theme of this release is customization: Datasette now allows every aspect of its presentation to be customized either using additional CSS or by providing entirely new templates.
Datasette's metadata.json format has also been expanded, to allow per-database and per-table metadata. A new datasette skeleton command can be used to generate a skeleton JSON file ready to be filled in with per-database and per-table details.
The metadata.json file can also be used to define canned queries, as a more powerful alternative to SQL views.
extra_css_urls / extra_js_urls in metadata
A mechanism in the metadata.json format for adding custom CSS and JS urls.
Create a metadata.json file that looks like this:
{ "extra_css_urls": [ "https://simonwillison.net/static/css/all.bf8cd891642c.css" ], "extra_js_urls": [ "https://code.jquery.com/jquery-3.2.1.slim.min.js" ] }
Then start datasette like this:
datasette mydb.db --metadata=metadata.json
The CSS and JavaScript files will be linked in the <head> of every page.
You can also specify a SRI (subresource integrity hash) for these assets:
{ "extra_css_urls": [ { "url": "https://simonwillison.net/static/css/all.bf8cd891642c.css", "sri": "sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI" } ], "extra_js_urls": [ { "url": "https://code.jquery.com/jquery-3.2.1.slim.min.js", "sri": "sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g=" } ] }
Modern browsers will only execute the stylesheet or JavaScript if the SRI hash matches the content served. You can generate hashes using https://www.srihash.org/
Auto-link column values that look like URLs (#153)
CSS styling hooks as classes on the body (#153)
Every template now gets CSS classes in the body designed to support custom styling.
The index template (the top level page at /) gets this:
<body class="index">
The database template (/dbname/) gets this:
<body class="db db-dbname">
The table template (/dbname/tablename) gets:
<body class="table db-dbname table-tablename">
The row template (/dbname/tablename/rowid) gets:
<body class="row db-dbname table-tablename">
The db-x and table-x classes use the database or table names themselves IF they are valid CSS identifiers. If they aren't, we strip any invalid characters out and append a 6 character md5 digest of the original name, in order to ensure that multiple tables which resolve to the same stripped character version still have different CSS classes.
Some examples (extracted from the unit tests):
"simple" => "simple" "MixedCase" => "MixedCase" "-no-leading-hyphens" => "no-leading-hyphens-65bea6" "_no-leading-underscores" => "no-leading-underscores-b921bc" "no spaces" => "no-spaces-7088d7" "-" => "336d5e" "no $ characters" => "no--characters-59e024"
datasette --template-dir=mytemplates/ argument
You can now pass an additional argument specifying a directory to look for custom templates in.
Datasette will fall back on the default templates if a template is not found in that directory.
Ability to over-ride templates for individual tables/databases.
It is now possible to over-ride templates on a per-database, per-row or per-table basis.
When you access e.g. /mydatabase/mytable Datasette will look for the following:
table-mydatabase-mytable.html
table.html
If you provided a --template-dir argument to datasette serve it will look in that directory first.
The lookup rules are as follows:
Index page (/): index.html
Database page (/mydatabase): database-mydatabase.html database.html
Table page (/mydatabase/mytable): table-mydatabase-mytable.html table.html
Row page (/mydatabase/mytable/id): row-mydatabase-mytable.html row.html
If a table name has spaces or other unexpected characters in it, the template filename will follow the same rules as our custom <body> CSS classes - for example, a table called "Food Trucks" will attempt to load the following templates:
table-mydatabase-Food-Trucks-399138.html
table.html
It is possible to extend the default templates using Jinja template inheritance. If you want to customize EVERY row template with some additional content you can do so by creating a row.html template like this:
{% extends "default:row.html" %}
{% block content %}
<h1>EXTRA HTML AT THE TOP OF THE CONTENT BLOCK</h1>
<p>This line renders the original block:</p>
{{ super() }}
{% endblock %}
--static option for datasette serve (#160)
You can now tell Datasette to serve static files from a specific location at a specific mountpoint.
For example:
datasette serve mydb.db --static extra-css:/tmp/static/css
Now if you visit this URL:
http://localhost:8001/extra-css/blah.css
The following file will be served:
/tmp/static/css/blah.css
Canned query support.
Named canned queries can now be defined in metadata.json like this:
{ "databases": { "timezones": { "queries": { "timezone_for_point": "select tzid from timezones ..." } } } }
These will be shown in a new "Queries" section beneath "Views" on the database page.
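Each canned query also gets its own page where it can be executed - for the example above that would be a URL like this (the exact URL pattern is stated here as an assumption):
/timezones/timezone_for_point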
New datasette skeleton command for generating metadata.json (#164)
metadata.json support for per-table/per-database metadata (#165)
Also added support for descriptions and HTML descriptions.
Here's an example metadata.json file illustrating custom per-database and per-table metadata:
{ "title": "Overall datasette title", "description_html": "This is a <em>description with HTML</em>.", "databases": { "db1": { "title": "First database", "description": "This is a string description & has no HTML", "license_url": "http://example.com/", "license": "The example license", "queries": { "canned_query": "select * from table1 limit 3;" }, "tables": { "table1": { "title": "Custom title for table1", "description": "Tables can have descriptions too", "source": "This has a custom source", "source_url": "http://example.com/" } } } } }
Renamed datasette build command to datasette inspect (#130)
Upgrade to Sanic 0.7.0 (#168)
Package and publish commands now accept --static and --template-dir
Example usage:
datasette package --static css:extra-css/ --static js:extra-js/ \
    sf-trees.db --template-dir templates/ --tag sf-trees --branch master
This creates a local Docker image that includes copies of the templates/, extra-css/ and extra-js/ directories. You can then run it like this:
docker run -p 8001:8001 sf-trees
For publishing to Zeit now:
datasette publish now --static css:extra-css/ --static js:extra-js/ \
    sf-trees.db --template-dir templates/ --name sf-trees --branch master
HTML comment showing which templates were considered for a page (#171)
0.13 (2017-11-24)¶
Search now applies to current filters.
Combined search into the same form as filters.
Closes #133
Much tidier design for table view header.
Closes #147
Added ?column__not=blah filter.
Closes #148
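For example (table and column names are hypothetical):
/mydatabase/mytable?status__not=archived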
Row page now resolves foreign keys.
Closes #132
Further tweaks to select/input filter styling.
Refs #86 - thanks for the help, @natbat!
Show linked foreign key in table cells.
Added UI for editing table filters.
Refs #86
Hide FTS-created tables on index pages.
Closes #129
Add publish to heroku support [Jacob Kaplan-Moss]
datasette publish heroku mydb.db
Pull request #104
Initial implementation of ?_group_count=column.
URL shortcut for counting rows grouped by one or more columns.
?_group_count=column1&_group_count=column2 works as well.
SQL generated looks like this:
select "qSpecies", count(*) as "count" from Street_Tree_List group by "qSpecies" order by "count" desc limit 100
Or for two columns like this:
select "qSpecies", "qSiteInfo", count(*) as "count" from Street_Tree_List group by "qSpecies", "qSiteInfo" order by "count" desc limit 100
Refs #44
Added --build=master option to datasette publish and package.
The datasette publish and datasette package commands both now accept an optional --build argument. If provided, this can be used to specify a branch published to GitHub that should be built into the container.
This makes it easier to test code that has not yet been officially released to PyPI, e.g.:
datasette publish now mydb.db --branch=master
Implemented ?_search=XXX + UI if a FTS table is detected.
Closes #131
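Example usage against a table that has an associated FTS table (database and table names are hypothetical):
/mydatabase/documents?_search=sea+otters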
Added datasette --version support.
Table views now show expanded foreign key references, if possible.
If a table has foreign key columns, and those foreign key tables have label_columns, the TableView will now query those other tables for the corresponding values and display those values as links in the corresponding table cells.
label_columns are currently detected by the inspect() function, which looks for any table that has just two columns - an ID column and one other - and sets the label_column to be that second non-ID column.
Don't prevent tabbing to "Run SQL" button (#117) [Robert Gieseke]
See comment in #115
Add keyboard shortcut to execute SQL query (#115) [Robert Gieseke]
Allow --load-extension to be set via environment variable.
Add support for ?field__isnull=1 (#107) [Ray N]
Add spatialite, switch to debian and local build (#114) [Ariel Núñez]
Added --load-extension argument to datasette serve.
Allows loading of SQLite extensions. Refs #110.
0.12 (2017-11-16)¶
Added __version__, now displayed as tooltip in page footer (#108).
Added initial docs, including a changelog (#99).
Turned on auto-escaping in Jinja.
Added a UI for editing named parameters (#96).
You can now construct a custom SQL statement using SQLite named parameters (e.g. :name) and datasette will display form fields for editing those parameters. Here’s an example which lets you see the most popular names for dogs of different species registered through various dog registration schemes in Australia.
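A minimal sketch of such a query - Datasette renders a form field for each :parameter; the table and column names are made up:
select name, count(*) as n from dog_registrations where animal_type = :animal_type group by name order by n desc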
Pin to specific Jinja version. (#100).
Default to 127.0.0.1 not 0.0.0.0. (#98).
Added extra metadata options to publish and package commands. (#92).
You can now run these commands like so:
datasette publish now mydb.db \
    --title="My Title" \
    --source="Source" \
    --source_url="http://www.example.com/" \
    --license="CC0" \
    --license_url="https://creativecommons.org/publicdomain/zero/1.0/"
This will write those values into the metadata.json that is packaged with the app. If you also pass --metadata=metadata.json that file will be updated with the extra values before being written into the Docker image.
Added production-ready Dockerfile (#94) [Andrew Cutler]
New ?_sql_time_limit_ms=10 argument to database and table page (#95)
SQL syntax highlighting with Codemirror (#89) [Tom Dyson]
0.11 (2017-11-14)¶
Added datasette publish now --force option.
This calls now with --force - useful as it means you get a fresh copy of datasette even if Now has already cached that docker layer.
Enable --cors by default when running in a container.
0.10 (2017-11-14)¶
Fixed #83 - 500 error on individual row pages.
Stop using sqlite WITH RECURSIVE in our tests.
The version of Python 3 running in Travis CI doesn't support this.
0.9 (2017-11-13)¶
Added --sql_time_limit_ms and --extra-options.
The serve command now accepts --sql_time_limit_ms for customizing the SQL time limit.
The publish and package commands now accept --extra-options which can be used to specify additional options to be passed to the datasette serve command when it executes inside the resulting Docker containers.
0.8 (2017-11-13)¶
V0.8 - added PyPI metadata, ready to ship.
Implemented offset/limit pagination for views (#70).
Improved pagination. (#78)
Limit on max rows returned, controlled by --max_returned_rows option. (#69)
If someone executes 'select * from table' against a table with a million rows in it, we could run into problems: just serializing that much data as JSON is likely to lock up the server.
Solution: we now have a hard limit on the maximum number of rows that can be returned by a query. If that limit is exceeded, the server will return a "truncated": true field in the JSON.
This limit can be optionally controlled by the new --max_returned_rows option. Setting that option to 0 disables the limit entirely.
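A truncated JSON response looks roughly like this - only the "truncated" key is the point here, the rest of the structure is illustrative:
{ "rows": [ ... ], "truncated": true }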