Migration¶
Listed here are all versions that necessitate migration. Depending on the version you are migrating from, you might need to follow multiple migration steps.
For database migrations it is usually sufficient to dump the database with marv dump using the version you are currently running and to restore it with marv restore using the latest version; marv can dump its own database, and marv restore accepts dumps created by any older version. If this does not hold true, marv restore will complain and provide instructions on what to do.
20.12.0¶
Database migration¶
Extended RPC filters, among other changes, required changes to the database schema; a migration of the MARV database is necessary. Export the database with your current version of MARV:
marv dump dump-2008.json
mv db/db.sqlite db/db.sqlite.2008
After updating MARV run:
marv init
marv restore dump-2008.json
Code migration (CE)¶
The node marv_robotics.trajectory.navsatfix() now returns timestamps as integers in nanoseconds instead of floats in seconds. Custom nodes consuming it directly need migration; all nodes shipping with marv have already been adjusted accordingly.
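As a conceptual sketch (not marv's API), a custom node that still expects the old float-seconds representation can convert the new integer nanosecond values like this; the helper name is hypothetical:

```python
def navsatfix_time_seconds(timestamp_ns: int) -> float:
    """Convert an integer nanosecond timestamp to float seconds.

    Hypothetical helper for custom nodes that were written against the
    old float-seconds timestamps of navsatfix().
    """
    return timestamp_ns / 1e9


# A timestamp that used to arrive as 1.5 seconds now arrives as
# 1_500_000_000 nanoseconds.
print(navsatfix_time_seconds(1_500_000_000))  # → 1.5
```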
Config migration (EE)¶
To implement partial downloads, the Enterprise Edition now uses a dedicated connections section. Please adjust your config accordingly and rerun the corresponding node:
- marv_robotics.detail:connections_section
+ marv_ee_nodes.detail:connections_section
marv run --force --node connections_section --col=\*
20.08.0¶
Session key file¶
A marv site contains a session key file used to authenticate marv users. This file was previously readable by all users of the host system. We recommend deleting the file and restarting marv to recreate it automatically. All marv users will have to log in again.
rm site/sessionkey
Table column sorting¶
Table columns containing links now use dictionaries instead of plain values, and the sort order of these dictionaries will seem random. Use sortkey on table columns to define the dictionary key to sort by (see Table for more information).
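The effect can be illustrated in plain Python (this is an analogy, not marv's table API): a cell that used to be a plain string is now a dictionary, and sorting needs to be told which key to compare:

```python
# Link cells used to be plain values; now each cell is a dictionary.
cells = [
    {'title': 'b.bag', 'href': '/dataset/2'},
    {'title': 'a.bag', 'href': '/dataset/1'},
]

# Without an explicit key the ordering of dictionaries is effectively
# arbitrary; picking the 'title' key restores a meaningful order, which
# is what sortkey does for marv table columns.
cells.sort(key=lambda cell: cell['title'])
print([cell['title'] for cell in cells])  # → ['a.bag', 'b.bag']
```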
In case you are using marv_robotics.detail.bagmeta_table() and have datasets with multiple files, you might want to run:
marv run --node bagmeta_table --force --col=*
Likewise for marv_nodes.meta_table().
20.06.0¶
Changes to access control¶
MARV supports access control profiles (see acl) that dictate who is permitted to perform what. Profiles map actions like comment or tag to lists of user groups that are permitted to perform these actions. This version of MARV cleans up and streamlines the list of supported verbs.
If you created a custom profile, please refer to marv_webapi.acls
as an example and make sure to adjust your custom profile to use the new action verbs.
No migration is necessary for installations that use one of the two default profiles (authenticated or public).
Database migration¶
Changes to collection management required changes to database schemas. A migration of the MARV database is necessary. Export the database with your current version of MARV:
marv dump dump-2004.json
mv db/db.sqlite db/db.sqlite.2004
After updating MARV run:
marv init
marv restore dump-2004.json
20.04.0¶
Database migration¶
An updated version of tortoise-orm required changes to the database schemas. A migration of the MARV database is necessary. Export the database with your current version of MARV:
marv dump dump-1911.json
mv db/db.sqlite db/db.sqlite.1911
After updating MARV run:
marv init
marv restore dump-1911.json
19.11.0¶
The gunicorn configuration was simplified. Instead of providing gunicorn_cfg.py and running gunicorn manually, the marv serve cli was added. Check it out with marv serve --help.
If you were overriding the dburi path in your marv.conf, there is no need for the odd sqlalchemy URIs with four slashes anymore. If your custom dburi starts with sqlite:////, please remove one slash.
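For example, assuming a site database at /path/to/site/db/db.sqlite (the path is illustrative), the change in marv.conf looks like:

```
- dburi = sqlite:////path/to/site/db/db.sqlite
+ dburi = sqlite:///path/to/site/db/db.sqlite
```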
This version makes the switch from sqlalchemy to tortoise as the underlying ORM, which makes a migration of the MARV database necessary. Export the database with your current version of MARV:
marv dump dump-1909.json
mv db/db.sqlite db/db.sqlite.1909
After updating MARV run:
marv init
marv restore dump-1909.json
19.09.0¶
uWSGI was replaced in favour of Gunicorn. In your site directory, replace uwsgi.conf with a gunicorn_cfg.py:
"""Gunicorn configuration for MARV."""
import multiprocessing
import pathlib
# pylint: disable=invalid-name
bind = ':8000'
proc_name = 'marvweb'
raw_env = {
f'MARV_CONFIG={pathlib.Path(__file__).parent.resolve() / "marv.conf"}',
}
workers = multiprocessing.cpu_count() * 2 + 1
worker_class = 'aiohttp.GunicornUVLoopWebWorker'
If you made any changes to your old uwsgi.conf, please adjust the above config accordingly. If you are using an nginx reverse proxy, you also have to adjust its configuration. Replace any uwsgi_pass directives with a proxy_pass:
- uwsgi_pass 127.0.0.1:8000;
- include uwsgi_params;
+ proxy_set_header Host $host;
+ proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
+ proxy_pass http://127.0.0.1:8000;
19.02.0¶
Support for ipdb has been dropped in favour of pdb++. Use PDB=1 marv run instead of marv-ipdb run. For more information see Debugging.
18.07¶
The way inputs are handled has changed. Inputs selecting an individual topic are now optional. See Optional inputs for more information.
18.04¶
The list of system dependencies is updated and the installation has significantly changed. We recommend that you re-read the Installation instructions. The database has not changed and existing sites continue to function without migration.
MARV now supports offloading the delivery of files to nginx. If you are not using nginx as a reverse proxy yet, you should seriously consider changing that now. See reverse_proxy and Gunicorn behind NGINX.
MARV now supports access control lists (ACLs). The default ACL requires authentication to read anything, tag, and comment; only members of the group admin may discard datasets. For users of the Enterprise Edition this corresponds to the same behaviour as before. The marv_webapi.acls.public() profile closely resembles the previous Community Edition default, apart from requiring admin group membership to discard datasets. See acl to change the effective ACL.
18.03¶
With this release the geojson property object has changed. To update the store to the new format, rerun the trajectory nodes using:
marv run --node trajectory --force --force-dependent --collection=*
18.02¶
With this release:
message definitions are read from bag files instead of being expected on the system
marv allows mixing message types per topic.
As part of this the topics section has been renamed to connections_section
and the bagmeta
node has changed a bit. Please update your configuration along the lines of:
diff --git a/docs/config/marv.conf b/docs/config/marv.conf
index 0e44b19..31dfe57 100644
--- a/docs/config/marv.conf
+++ b/docs/config/marv.conf
@@ -24,7 +24,7 @@ nodes =
marv_robotics.detail:summary_keyval
marv_robotics.detail:bagmeta_table
# detail sections
- marv_robotics.detail:topics_section
+ marv_robotics.detail:connections_section
marv_robotics.detail:images_section
marv_robotics.detail:video_section
marv_robotics.detail:gnss_section
@@ -43,8 +43,8 @@ filters =
start_time | Start time | lt le eq ne ge gt | datetime | (get "bagmeta.start_time")
end_time | End time | lt le eq ne ge gt | datetime | (get "bagmeta.end_time")
duration | Duration | lt le eq ne ge gt | timedelta | (get "bagmeta.duration")
- topics | Topics | any all | subset | (get "bagmeta.topics[:].name")
- msg_types | Message types | any all | subset | (get "bagmeta.msg_types[:].name")
+ topics | Topics | any all | subset | (get "bagmeta.topics")
+ msg_types | Message types | any all | subset | (get "bagmeta.msg_types")
listing_columns =
# id | Heading | formatter | value function
@@ -69,7 +69,7 @@ detail_summary_widgets =
bagmeta_table
detail_sections =
- topics_section
+ connections_section
images_section
video_section
gnss_section
And then rerun the bagmeta node and the new connections section.
marv run --node bagmeta --node connections_section --force --collection=*
Nodes that were previously missing message type definitions can now be run. gnss_plots, for example, works differently if it cannot find navsat orientations. To rerun it and all nodes depending on it:
marv run --node gnss_plots --force --force-dependent --collection=*
17.11¶
In the old site:
Make sure you have a backup of your old site!
Dump database of old marv:
curl -LO https://gist.githubusercontent.com/chaoflow/02a1be706cf4948a9f4d7f1fd66d6c73/raw/de4feab88bcfa756abfb6c7f5a8ccaef7f25b36d/marv-16.10-dump.py
python2 marv-16.10-dump.py > /tmp/dump.json
In the new instance:
Follow Installation and Tutorial: Setup basic site to set up a basic site.
Replace marv.conf with the default Configuration and adjust as needed (e.g. scanroot).
Initialize the site with the new configuration:
marv init
If your scanroot has moved, adjust paths as needed:
sed -i -e 's,/old/scanroot/,/path/to/new/scanroot/,g' /tmp/dump.json
Restore database in new marv:
marv restore /tmp/dump.json
Set password for each user:
marv user pw <username>
Run nodes:
marv query -0 --collection=bags |xargs -0 -L25 -P4 marv run --keep-going
Run again sequentially to see whether any nodes produce errors:
marv run --collection=bags