Compare commits


53 commits

Author SHA1 Message Date
Markus Unterwaditzer
535911c9fd Remove unsupported zesty 2018-02-14 19:44:53 +01:00
Markus Unterwaditzer
8f2734c33e
Singlefile storage in rust (#698)
* Singlefile storage in rust

* add NOW

* Avoid global item
2018-02-14 19:15:11 +01:00
Markus Unterwaditzer
4d3860d449
Test radicale and xandikos again (#715) 2018-02-10 16:11:06 +01:00
Markus Unterwaditzer
9c3a2b48e9 Unify badges 2018-02-09 20:53:14 +01:00
Markus Unterwaditzer
2a2457e364
CI refactor (#713)
* Switch to CircleCI

* add circleci badge
2018-02-09 20:50:48 +01:00
Hugo Osvaldo Barrera
855f29cc35 Update link to official Arch package (#710)
There's now an official Arch package
2018-02-06 09:25:33 +01:00
Markus Unterwaditzer
cc37e6a312 Merge branch '0.16-maintenance' 2018-02-05 17:01:46 +01:00
Markus Unterwaditzer
01573f0d66 Merge branch '0.16-maintenance' 2018-02-05 15:54:17 +01:00
Markus Unterwaditzer
c1aec4527c Remove useless path change 2018-01-23 23:16:37 +01:00
Markus Unterwaditzer
b1ec9c26c7 Fix unused formatting string 2018-01-22 01:02:44 +01:00
Markus Unterwaditzer
82f47737a0 Revert use of hypothesis 2018-01-21 23:23:08 +01:00
Markus Unterwaditzer
45d76c889c Remove remotestorage leftovers 2018-01-21 20:51:30 +01:00
Markus Unterwaditzer
c92b4f38eb Update copyright year 2018-01-21 00:11:24 +01:00
Markus Unterwaditzer
47b2a43a0e Disable davical 2018-01-19 11:18:46 +01:00
Markus Unterwaditzer
2d0527ecf0 Skip davical test skipper 2018-01-19 11:17:58 +01:00
Markus Unterwaditzer
991076d12a stylefixes 2018-01-18 23:30:47 +01:00
Markus Unterwaditzer
f58f06d2b5 Remove hypothesis from system test 2018-01-18 23:25:49 +01:00
Markus Unterwaditzer
b1cddde635 Remove baikal and owncloud from docs, see #489 2018-01-18 23:18:42 +01:00
Markus Unterwaditzer
41f64e2dca
Dockerize nextcloud (#704)
* Dockerize nextcloud

* Remove ownCloud and baikal, fix #489

* Remove branch from travis conf
2018-01-18 23:10:53 +01:00
Markus Unterwaditzer
401c441acb Add slowest tests to testrun 2018-01-15 21:23:09 +01:00
Markus Unterwaditzer
f1310883b9 Screw git hooks 2018-01-05 18:25:00 +01:00
Markus Unterwaditzer
afa8031eec Improve handling of malformed items 2018-01-05 18:14:32 +01:00
Markus Unterwaditzer
50604f24f1 Add simple doc for todoman 2018-01-05 16:34:26 +01:00
Amanda Hickman
cd6cb92b59 Little spelling fix (#695)
* Fixed spelling of "occurred"

* Fix spelling of occurred.

* fixed one lingering misspelling
2018-01-03 15:52:55 +01:00
Markus Unterwaditzer
39c2df99eb Update legalities 2017-12-25 21:50:29 +01:00
Markus Unterwaditzer
7fdff404e6 No wheels 2017-12-04 20:16:29 +01:00
Markus Unterwaditzer
1bdde25c0c Fix etesync build 2017-12-04 19:52:02 +01:00
Markus Unterwaditzer
b32932bd13 Relax recurrence tests 2017-12-03 14:00:21 +01:00
Markus Unterwaditzer
22d009b824 Remove unnecessary filter 2017-11-27 19:52:15 +01:00
Markus Unterwaditzer
792dbc171f Fix missing XML header, see #688 2017-11-25 14:15:14 +01:00
Markus Unterwaditzer
5700c4688b
rustup (#686)
* rustup

* rust-vobject upgrade
2017-11-07 21:58:17 +01:00
Markus Unterwaditzer
3984f547ce
Update nextcloud (#684) 2017-11-05 15:59:42 +01:00
Markus Unterwaditzer
9769dab02e
Update owncloud (#685) 2017-11-05 15:59:34 +01:00
Markus Unterwaditzer
bd2e09a84b Small refactor in dav.py 2017-10-26 02:22:18 +02:00
Markus Unterwaditzer
f7b6e67095 Ignore new flake8 linters 2017-10-26 01:41:43 +02:00
Markus Unterwaditzer
a2c509adf5 rustup, fix broken struct export 2017-10-25 22:36:28 +02:00
Markus Unterwaditzer
28fdf42238 Fix #681 2017-10-21 17:23:41 +02:00
Markus Unterwaditzer
0d3b028b17 Cache rust artifacts 2017-10-19 23:47:20 +02:00
Markus Unterwaditzer
f8e65878d8 Update rust installation instructions 2017-10-19 23:41:43 +02:00
Markus Unterwaditzer
75e83cd0f6 Commit cargo.lock 2017-10-19 23:27:29 +02:00
Malte Kiefer
96a8ab35c3 fixed typo (#678)
fixed typo
2017-10-13 19:34:37 +02:00
Markus Unterwaditzer
619373a8e8 Rust: new item module 2017-10-11 13:53:10 +02:00
Markus Unterwaditzer
cbb15e1895 Move all target to top again 2017-10-11 13:28:00 +02:00
Markus Unterwaditzer
325304c50f Lazy-load component in item 2017-10-11 12:01:52 +02:00
Markus Unterwaditzer
bdbfc360ff Move item hashing into rust 2017-10-10 00:52:58 +02:00
Markus Unterwaditzer
c17fa308fb Adapt virtualenv steps to always select python3 2017-10-06 18:32:17 +02:00
Markus Unterwaditzer
81f7472e3a Update installation instructions for Rust dependencies 2017-10-06 18:30:10 +02:00
Markus Unterwaditzer
69543b8615 Install rust on readthedocs 2017-10-05 17:45:19 +02:00
Markus Unterwaditzer
1b7cb4e656 Use rust-vobject (#675)
Use rust-vobject
2017-10-04 22:41:18 +02:00
Markus Unterwaditzer
7bdb22a207 Fix Ubuntu package name of Python 3. 2017-10-03 22:48:13 +02:00
Markus Unterwaditzer
cb41a9df28 Add fast_finish to Travis 2017-10-03 20:59:43 +02:00
Markus Unterwaditzer
33f96f5eca Fix broken link 2017-10-03 13:13:44 +02:00
Markus Unterwaditzer
178ac237ad Fix installation link 2017-10-03 11:29:51 +02:00
70 changed files with 2590 additions and 925 deletions

.circleci/config.yml (new file, 199 lines)

@@ -0,0 +1,199 @@
version: 2
references:
basic_env: &basic_env
CI: true
DAV_SERVER: xandikos
restore_caches: &restore_caches
restore_cache:
keys:
- cache-{{ arch }}-{{ .Branch }}
save_caches: &save_caches
save_cache:
key: cache-{{ arch }}-{{ .Branch }}
paths:
- "rust/target/"
- "~/.cargo/"
- "~/.cache/pip/"
- "~/.rustup/"
basic_setup: &basic_setup
run: . scripts/circleci-install.sh
jobs:
nextcloud:
docker:
- image: circleci/python:3.6
environment:
<<: *basic_env
NEXTCLOUD_HOST: localhost:80
DAV_SERVER: nextcloud
- image: nextcloud
environment:
SQLITE_DATABASE: nextcloud
NEXTCLOUD_ADMIN_USER: asdf
NEXTCLOUD_ADMIN_PASSWORD: asdf
steps:
- checkout
- *restore_caches
- *basic_setup
- *save_caches
- run: wget -O - --retry-connrefused http://localhost:80/
- run: make -e storage-test
fastmail:
docker:
- image: circleci/python:3.6
environment:
<<: *basic_env
DAV_SERVER: fastmail
steps:
- checkout
- *restore_caches
- *basic_setup
- *save_caches
- run: make -e storage-test
style:
docker:
- image: circleci/python:3.6
environment:
<<: *basic_env
steps:
- checkout
- *restore_caches
- *basic_setup
- *save_caches
- run: make -e style
py34-minimal:
docker:
- image: circleci/python:3.4
environment:
<<: *basic_env
REQUIREMENTS: minimal
steps:
- checkout
- *restore_caches
- *basic_setup
- *save_caches
- run: make -e test
py34-release:
docker:
- image: circleci/python:3.4
environment:
<<: *basic_env
REQUIREMENTS: release
steps:
- checkout
- *restore_caches
- *basic_setup
- *save_caches
- run: make -e test
py34-devel:
docker:
- image: circleci/python:3.4
environment:
<<: *basic_env
REQUIREMENTS: devel
steps:
- checkout
- *restore_caches
- *basic_setup
- *save_caches
- run: make -e test
py36-minimal:
docker:
- image: circleci/python:3.6
environment:
<<: *basic_env
REQUIREMENTS: minimal
steps:
- checkout
- *restore_caches
- *basic_setup
- *save_caches
- run: make -e test
py36-release:
docker:
- image: circleci/python:3.6
environment:
<<: *basic_env
REQUIREMENTS: release
steps:
- checkout
- *restore_caches
- *basic_setup
- *save_caches
- run: make -e test
py36-release-radicale:
docker:
- image: circleci/python:3.6
environment:
<<: *basic_env
REQUIREMENTS: release
DAV_SERVER: radicale
steps:
- checkout
- *restore_caches
- *basic_setup
- *save_caches
- run: make -e test
py36-devel:
docker:
- image: circleci/python:3.6
environment:
<<: *basic_env
REQUIREMENTS: devel
steps:
- checkout
- *restore_caches
- *basic_setup
- *save_caches
- run: make -e test
rust:
docker:
- image: circleci/python:3.6
environment:
<<: *basic_env
REQUIREMENTS: release
steps:
- checkout
- *restore_caches
- *basic_setup
- *save_caches
- run: make -e rust-test
workflows:
version: 2
test_all:
jobs:
- nextcloud
- fastmail
- style
- py34-minimal
- py34-release
- py34-devel
- py36-minimal
- py36-release
- py36-devel
- rust
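The config above leans on YAML anchors and merge keys: `&basic_env` defines a mapping once under `references:`, and each job pulls it in with `<<: *basic_env` before adding its own variables, which later keys override. A minimal sketch of how that merge resolves, simulated with plain Python dicts (the names mirror the config; real CI parses the YAML itself):

```python
# Sketch of how YAML merge keys (`<<: *anchor`) resolve in the config above.
# Simulated with plain dicts; CircleCI parses the YAML directly.

basic_env = {"CI": "true", "DAV_SERVER": "xandikos"}  # the &basic_env anchor

def merge_env(overrides):
    """Job-specific keys win over the merged anchor, as with `<<: *basic_env`."""
    env = dict(basic_env)   # start from the anchor's mapping
    env.update(overrides)   # later keys override merged ones
    return env

# The `nextcloud` job overrides DAV_SERVER and adds NEXTCLOUD_HOST:
nextcloud_env = merge_env({"DAV_SERVER": "nextcloud",
                           "NEXTCLOUD_HOST": "localhost:80"})
print(nextcloud_env["DAV_SERVER"])  # nextcloud
print(nextcloud_env["CI"])          # true
```

The same pattern is what lets every job share the `restore_caches`/`save_caches` steps without repeating them.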

.gitignore (vendored, 2 changed lines)

@@ -13,4 +13,6 @@ env
dist
docs/_build/
vdirsyncer/version.py
vdirsyncer/_native*
.hypothesis
codecov.sh

.gitmodules (vendored, deleted: 9 lines)

@@ -1,9 +0,0 @@
[submodule "tests/storage/servers/baikal"]
path = tests/storage/servers/baikal
url = https://github.com/vdirsyncer/baikal-testserver
[submodule "tests/storage/servers/owncloud"]
path = tests/storage/servers/owncloud
url = https://github.com/vdirsyncer/owncloud-testserver
[submodule "tests/storage/servers/nextcloud"]
path = tests/storage/servers/nextcloud
url = https://github.com/vdirsyncer/nextcloud-testserver

@@ -1,120 +0,0 @@
{
"branches": {
"only": [
"auto",
"master",
"/^.*-maintenance$/"
]
},
"cache": "pip",
"dist": "trusty",
"git": {
"submodules": false
},
"install": [
". scripts/travis-install.sh",
"pip install -U pip setuptools",
"pip install wheel",
"make -e install-dev",
"make -e install-$BUILD"
],
"language": "python",
"matrix": {
"include": [
{
"env": "BUILD=style",
"python": "3.6"
},
{
"env": "BUILD=test DAV_SERVER=radicale REQUIREMENTS=devel ",
"python": "3.4"
},
{
"env": "BUILD=test DAV_SERVER=xandikos REQUIREMENTS=devel ",
"python": "3.4"
},
{
"env": "BUILD=test DAV_SERVER=radicale REQUIREMENTS=release ",
"python": "3.4"
},
{
"env": "BUILD=test DAV_SERVER=xandikos REQUIREMENTS=release ",
"python": "3.4"
},
{
"env": "BUILD=test DAV_SERVER=radicale REQUIREMENTS=minimal ",
"python": "3.4"
},
{
"env": "BUILD=test DAV_SERVER=xandikos REQUIREMENTS=minimal ",
"python": "3.4"
},
{
"env": "BUILD=test DAV_SERVER=radicale REQUIREMENTS=devel ",
"python": "3.5"
},
{
"env": "BUILD=test DAV_SERVER=xandikos REQUIREMENTS=devel ",
"python": "3.5"
},
{
"env": "BUILD=test DAV_SERVER=radicale REQUIREMENTS=release ",
"python": "3.5"
},
{
"env": "BUILD=test DAV_SERVER=xandikos REQUIREMENTS=release ",
"python": "3.5"
},
{
"env": "BUILD=test DAV_SERVER=radicale REQUIREMENTS=minimal ",
"python": "3.5"
},
{
"env": "BUILD=test DAV_SERVER=xandikos REQUIREMENTS=minimal ",
"python": "3.5"
},
{
"env": "BUILD=test DAV_SERVER=radicale REQUIREMENTS=devel ",
"python": "3.6"
},
{
"env": "BUILD=test DAV_SERVER=xandikos REQUIREMENTS=devel ",
"python": "3.6"
},
{
"env": "BUILD=test DAV_SERVER=radicale REQUIREMENTS=release ",
"python": "3.6"
},
{
"env": "BUILD=test DAV_SERVER=xandikos REQUIREMENTS=release ",
"python": "3.6"
},
{
"env": "BUILD=test DAV_SERVER=fastmail REQUIREMENTS=release ",
"if": "NOT (type IN (pull_request))",
"python": "3.6"
},
{
"env": "BUILD=test DAV_SERVER=radicale REQUIREMENTS=minimal ",
"python": "3.6"
},
{
"env": "BUILD=test DAV_SERVER=xandikos REQUIREMENTS=minimal ",
"python": "3.6"
},
{
"env": "BUILD=test ETESYNC_TESTS=true REQUIREMENTS=latest",
"python": "3.6"
},
{
"env": "BUILD=test",
"language": "generic",
"os": "osx"
}
]
},
"script": [
"make -e $BUILD"
],
"sudo": true
}

@@ -9,6 +9,13 @@ Package maintainers and users who have to manually update their installation
may want to subscribe to `GitHub's tag feed
<https://github.com/pimutils/vdirsyncer/tags.atom>`_.
Version 0.17.0
==============
- Fix bug where collection discovery under DAV-storages would produce invalid
XML. See :gh:`688`.
- ownCloud and Baikal are no longer tested.
Version 0.16.4
==============

@@ -1,4 +1,4 @@
Copyright (c) 2014-2016 by Markus Unterwaditzer & contributors. See
Copyright (c) 2014-2018 by Markus Unterwaditzer & contributors. See
AUTHORS.rst for more details.
Some rights reserved.
@@ -31,3 +31,10 @@ LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE AND DOCUMENTATION, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH
DAMAGE.
== etesync ==
I, Tom Hacohen, hereby grant a license for EteSync's journal-manager
(https://github.com/etesync/journal-manager) to be used as a dependency in
vdirsyncer's test suite for the purpose of testing vdirsyncer without having
the copyleft section of the AGPL apply to it (vdirsyncer).

@@ -1,7 +1,7 @@
# See the documentation on how to run the tests:
# https://vdirsyncer.pimutils.org/en/stable/contributing.html
# Which DAV server to run the tests against (radicale, xandikos, skip, owncloud, nextcloud, ...)
# Which DAV server to run the tests against (radicale, xandikos, skip, nextcloud, ...)
export DAV_SERVER := skip
# release (install release versions of dependencies)
@@ -36,8 +36,7 @@ ifeq ($(COVERAGE), true)
endif
ifeq ($(ETESYNC_TESTS), true)
TEST_EXTRA_PACKAGES += git+https://github.com/etesync/journal-manager
TEST_EXTRA_PACKAGES += django djangorestframework wsgi_intercept drf-nested-routers
TEST_EXTRA_PACKAGES += django-etesync-journal django djangorestframework wsgi_intercept drf-nested-routers
endif
PYTEST = py.test $(PYTEST_ARGS)
@@ -45,23 +44,34 @@ PYTEST = py.test $(PYTEST_ARGS)
export TESTSERVER_BASE := ./tests/storage/servers/
CODECOV_PATH = /tmp/codecov.sh
ifeq ($(CI), true)
test:
curl -s https://codecov.io/bash > $(CODECOV_PATH)
$(PYTEST) tests/unit/
bash $(CODECOV_PATH) -c -F unit
$(PYTEST) tests/system/
bash $(CODECOV_PATH) -c -F system
$(PYTEST) tests/storage/
bash $(CODECOV_PATH) -c -F storage
else
test:
$(PYTEST)
endif
all:
$(error Take a look at https://vdirsyncer.pimutils.org/en/stable/tutorial.html#installation)
ifeq ($(CI), true)
codecov.sh:
curl -s https://codecov.io/bash > $@
else
codecov.sh:
echo > $@
endif
rust-test:
cd rust/ && cargo test --release
test: unit-test system-test storage-test
unit-test: codecov.sh
$(PYTEST) tests/unit/
bash codecov.sh -c -F unit
system-test: codecov.sh
$(PYTEST) tests/system/
bash codecov.sh -c -F system
storage-test: codecov.sh
$(PYTEST) tests/storage/
bash codecov.sh -c -F storage
install-servers:
set -ex; \
for server in $(DAV_SERVER); do \
@@ -82,17 +92,19 @@ install-test: install-servers
[ -z "$(TEST_EXTRA_PACKAGES)" ] || pip install $(TEST_EXTRA_PACKAGES)
install-style: install-docs
pip install -U flake8 flake8-import-order 'flake8-bugbear>=17.3.0' autopep8
pip install -U flake8 flake8-import-order 'flake8-bugbear>=17.3.0'
which cargo-install-update || cargo +nightly install cargo-update
cargo +nightly install-update -i clippy
cargo +nightly install-update -i rustfmt-nightly
cargo +nightly install-update -i cargo-update
style:
flake8
! git grep -i syncroniz */*
! git grep -i 'text/icalendar' */*
sphinx-build -W -b html ./docs/ ./docs/_build/html/
python3 scripts/make_travisconf.py | diff -b .travis.yml -
travis-conf:
python3 scripts/make_travisconf.py > .travis.yml
cd rust/ && cargo +nightly clippy
cd rust/ && cargo fmt
install-docs:
pip install -Ur docs-requirements.txt
@@ -104,17 +116,16 @@ linkcheck:
sphinx-build -W -b linkcheck ./docs/ ./docs/_build/linkcheck/
release:
python setup.py sdist bdist_wheel upload
python setup.py sdist upload
release-deb:
sh scripts/release-deb.sh debian jessie
sh scripts/release-deb.sh debian stretch
sh scripts/release-deb.sh ubuntu trusty
sh scripts/release-deb.sh ubuntu xenial
sh scripts/release-deb.sh ubuntu zesty
install-dev:
pip install -e .
pip install -ve .
[ "$(ETESYNC_TESTS)" = "false" ] || pip install -Ue .[etesync]
set -xe && if [ "$(REQUIREMENTS)" = "devel" ]; then \
pip install -U --force-reinstall \
@@ -124,13 +135,6 @@ install-dev:
pip install -U --force-reinstall $$(python setup.py --quiet minimal_requirements); \
fi
install-git-hooks: install-style
echo "make style-autocorrect" > .git/hooks/pre-commit
chmod +x .git/hooks/pre-commit
style-autocorrect:
git diff --cached --name-only | egrep '\.py$$' | xargs --no-run-if-empty autopep8 -ri
ssh-submodule-urls:
git submodule foreach "\
echo -n 'Old: '; \
@@ -139,4 +143,11 @@ ssh-submodule-urls:
echo -n 'New URL: '; \
git remote get-url origin"
install-rust:
curl https://sh.rustup.rs -sSf | sh -s -- -y --default-toolchain nightly
rust-ext:
[ "$$READTHEDOCS" != "True" ] || $(MAKE) install-rust
cd ./rust && cargo build --release
.PHONY: docs
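The reworked Makefile splits the old monolithic CI `test` target into `unit-test`, `system-test` and `storage-test`, each depending on a `codecov.sh` file target that downloads the real uploader only when `CI` is true and otherwise writes an empty stub. A rough Python sketch of that conditional, under the assumption that the fetch step is pluggable (the `fetch` callable stands in for the `curl` download):

```python
import os

def ensure_codecov_sh(path, ci=None, fetch=None):
    """Mirror the Makefile's conditional `codecov.sh` target: on CI the
    uploader script is fetched; locally an empty stub is written instead,
    so targets depending on codecov.sh work in both environments."""
    if ci is None:
        ci = os.environ.get("CI") == "true"
    body = fetch() if ci else "\n"  # `echo > $@` produces a lone newline
    with open(path, "w") as f:
        f.write(body)
    return body
```

On CI, each suite target then runs `bash codecov.sh -c -F <flag>` after its pytest invocation; locally the stub is a harmless no-op.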

@@ -20,8 +20,8 @@ It aims to be for calendars and contacts what `OfflineIMAP
.. _programs: https://vdirsyncer.pimutils.org/en/latest/tutorials/
.. image:: https://travis-ci.org/pimutils/vdirsyncer.svg?branch=master
:target: https://travis-ci.org/pimutils/vdirsyncer
.. image:: https://circleci.com/gh/pimutils/vdirsyncer.svg?style=shield
:target: https://circleci.com/gh/pimutils/vdirsyncer
.. image:: https://codecov.io/github/pimutils/vdirsyncer/coverage.svg?branch=master
:target: https://codecov.io/github/pimutils/vdirsyncer?branch=master

@@ -43,7 +43,7 @@ fileext = ".vcf"
[storage bob_contacts_remote]
type = "carddav"
url = "https://owncloud.example.com/remote.php/carddav/"
url = "https://nextcloud.example.com/"
#username =
# The password can also be fetched from the system password storage, netrc or a
# custom command. See http://vdirsyncer.pimutils.org/en/stable/keyring.html
@@ -65,6 +65,6 @@ fileext = ".ics"
[storage bob_calendar_remote]
type = "caldav"
url = "https://owncloud.example.com/remote.php/caldav/"
url = "https://nextcloud.example.com/"
#username =
#password =

docker-compose.yml (new file, 11 lines)

@@ -0,0 +1,11 @@
version: '2'
services:
nextcloud:
image: nextcloud
ports:
- '8080:80'
environment:
- SQLITE_DATABASE=nextcloud
- NEXTCLOUD_ADMIN_USER=asdf
- NEXTCLOUD_ADMIN_PASSWORD=asdf
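This compose file mirrors the Nextcloud container the CircleCI `nextcloud` job starts, and the CI waits for the server with `wget -O - --retry-connrefused http://localhost:80/`. A hedged Python equivalent of that wait loop (host and port are illustrative; with this compose file the mapped port would be 8080):

```python
import socket
import time

def wait_for_port(host, port, timeout=60.0, interval=0.5):
    """Poll until something accepts TCP connections on host:port, like the
    CI's `wget --retry-connrefused` step. Returns True once reachable,
    False if the deadline passes first."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=interval):
                return True
        except OSError:
            time.sleep(interval)
    return False

# e.g. after `docker-compose up -d`, wait for the mapped Nextcloud port:
# wait_for_port("localhost", 8080)
```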

@@ -10,9 +10,9 @@ OS/distro packages
The following packages are user-contributed and were up-to-date at the time of
writing:
- `ArchLinux (AUR) <https://aur.archlinux.org/packages/vdirsyncer>`_
- `ArchLinux <https://www.archlinux.org/packages/community/any/vdirsyncer/>`_
- `Ubuntu and Debian, x86_64-only
<https://packagecloud.io/pimutils/vdirsyncer/install>`_ (packages also exist
<https://packagecloud.io/pimutils/vdirsyncer>`_ (packages also exist
in the official repositories but may be out of date)
- `GNU Guix <https://www.gnu.org/software/guix/package-list.html#vdirsyncer>`_
- `OS X (homebrew) <http://braumeister.org/formula/vdirsyncer>`_
@@ -44,12 +44,17 @@ following things are installed:
- Python 3.4+ and pip.
- ``libxml`` and ``libxslt``
- ``zlib``
- Linux or OS X. **Windows is not supported, see :gh:`535`.**
- `Rust <https://www.rust-lang.org/>`, the programming language, together with
its package manager ``cargo``.
- Linux or OS X. **Windows is not supported**, see :gh:`535`.
On Linux systems, using the distro's package manager is the best
way to do this, for example, using Ubuntu::
On Linux systems, using the distro's package manager is the best way to do
this, for example, using Ubuntu (last tried on Trusty)::
sudo apt-get install libxml2 libxslt1.1 zlib1g python
sudo apt-get install python3 python3-pip libffi-dev
Rust may need to be installed separately, as the packages in Ubuntu are usually
out-of-date. I recommend `rustup <https://rustup.rs/>`_ for that.
Then you have several options. The following text applies for most Python
software by the way.
@@ -59,11 +64,14 @@ The dirty, easy way
The easiest way to install vdirsyncer at this point would be to run::
pip install --user --ignore-installed vdirsyncer
pip3 install -v --user --ignore-installed vdirsyncer
- ``--user`` is to install without root rights (into your home directory)
- ``--ignore-installed`` is to work around Debian's potentially broken packages
(see :ref:`debian-urllib3`).
(see :ref:`debian-urllib3`). You can try to omit it if you run into other
problems related to certificates, for example.
Your executable is then in ``~/.local/bin/``.
This method has a major flaw though: Pip doesn't keep track of the files it
installs. Vdirsyncer's files would be located somewhere in
@@ -79,9 +87,9 @@ There is a way to install Python software without scattering stuff across
your filesystem: virtualenv_. There are a lot of resources on how to use it,
the simplest possible way would look something like::
virtualenv ~/vdirsyncer_env
~/vdirsyncer_env/bin/pip install vdirsyncer
alias vdirsyncer="~/vdirsyncer_env/bin/vdirsyncer
virtualenv --python python3 ~/vdirsyncer_env
~/vdirsyncer_env/bin/pip install -v vdirsyncer
alias vdirsyncer="$HOME/vdirsyncer_env/bin/vdirsyncer"
You'll have to put the last line into your ``.bashrc`` or ``.bash_profile``.

@@ -66,7 +66,3 @@ For such purposes you can set the ``partial_sync`` parameter to ``ignore``::
partial_sync = ignore
See :ref:`the config docs <partial_sync_def>` for more information.
.. _nextCloud: https://nextcloud.com/
.. _Baikal: http://sabre.io/baikal/
.. _DAViCal: http://www.davical.org/

@@ -53,7 +53,8 @@ pairs of storages should actually be synchronized is defined in :ref:`pair
section <pair_config>`. This format is copied from OfflineIMAP, where storages
are called repositories and pairs are called accounts.
The following example synchronizes ownCloud's addressbooks to ``~/.contacts/``::
The following example synchronizes addressbooks from a :doc:`NextCloud
<tutorials/nextcloud>` to ``~/.contacts/``::
[pair my_contacts]
@@ -70,7 +71,7 @@ The following example synchronizes ownCloud's addressbooks to ``~/.contacts/``::
type = "carddav"
# We can simplify this URL here as well. In theory it shouldn't matter.
url = "https://owncloud.example.com/remote.php/carddav/"
url = "https://nextcloud.example.com/"
username = "bob"
password = "asdf"
@@ -162,13 +163,13 @@ let's switch to a different base example. This time we'll synchronize calendars:
[storage my_calendars_remote]
type = "caldav"
url = "https://owncloud.example.com/remote.php/caldav/"
url = "https://nextcloud.example.com/"
username = "bob"
password = "asdf"
Run ``vdirsyncer discover`` for discovery. Then you can use ``vdirsyncer
metasync`` to synchronize the ``color`` property between your local calendars
in ``~/.calendars/`` and your ownCloud. Locally the color is just represented
in ``~/.calendars/`` and your NextCloud. Locally the color is just represented
as a file called ``color`` within the calendar folder.
.. _collections_tutorial:

@@ -1,10 +0,0 @@
======
Baikal
======
Vdirsyncer is continuously tested against the latest version of Baikal_.
- Baikal up to ``0.2.7`` also uses an old version of SabreDAV, with the same
issue as ownCloud, see :gh:`160`. This issue is fixed in later versions.
.. _Baikal: http://baikal-server.com/

@@ -86,7 +86,7 @@ Crontab
On the end we create a crontab, so that vdirsyncer syncs automatically
every 30 minutes our contacts::
contab -e
crontab -e
On the end of that file enter this line::

@@ -52,12 +52,10 @@ Servers
.. toctree::
:maxdepth: 1
baikal
davmail
fastmail
google
icloud
nextcloud
owncloud
radicale
xandikos

@@ -1,8 +1,8 @@
=========
nextCloud
NextCloud
=========
Vdirsyncer is continuously tested against the latest version of nextCloud_::
Vdirsyncer is continuously tested against the latest version of NextCloud_::
[storage cal]
type = "caldav"
@@ -17,4 +17,4 @@ Vdirsyncer is continuously tested against the latest version of nextCloud_::
- WebCAL-subscriptions can't be discovered by vdirsyncer. See `this relevant
issue <https://github.com/nextcloud/calendar/issues/63>`_.
.. _nextCloud: https://nextcloud.com/
.. _NextCloud: https://nextcloud.com/

@@ -1,26 +0,0 @@
.. _owncloud_setup:
========
ownCloud
========
Vdirsyncer is continuously tested against the latest version of ownCloud_::
[storage cal]
type = "caldav"
url = "https://example.com/remote.php/dav/"
username = ...
password = ...
[storage card]
type = "carddav"
url = "https://example.com/remote.php/dav/"
username = ...
password = ...
- *Versions older than 7.0.0:* ownCloud uses SabreDAV, which had problems
detecting collisions and race-conditions. The problems were reported and are
fixed in SabreDAV's repo, and the corresponding fix is also in ownCloud since
7.0.0. See :gh:`16` for more information.
.. _ownCloud: https://owncloud.org/

@@ -10,4 +10,61 @@ todoman_ is a CLI task manager supporting :doc:`vdir </vdir>`. Its interface is
similar to the ones of Taskwarrior or the todo.txt CLI app. You can use
:storage:`filesystem` with it.
.. _todoman: https://hugo.barrera.io/journal/2015/03/30/introducing-todoman/
.. _todoman: http://todoman.readthedocs.io/
Setting up vdirsyncer
=====================
For this tutorial we will use NextCloud.
Assuming a config like this::
[general]
status_path = "~/.vdirsyncer/status/"
[pair calendars]
conflict_resolution = "b wins"
a = "calendars_local"
b = "calendars_dav"
collections = ["from b"]
metadata = ["color", "displayname"]
[storage calendars_local]
type = "filesystem"
path = "~/.calendars/"
fileext = ".ics"
[storage calendars_dav]
type = "caldav"
url = "https://nextcloud.example.net/"
username = ...
password = ...
``vdirsyncer sync`` will then synchronize the calendars of your NextCloud_
instance to subfolders of ``~/.calendar/``.
.. _NextCloud: https://nextcloud.com/
Setting up todoman
==================
Write this to ``~/.config/todoman/todoman.conf``::
[main]
path = ~/.calendars/*
The glob_ pattern in ``path`` will match all subfolders in ``~/.calendars/``,
which is exactly the tasklists we want. Now you can use ``todoman`` as
described in its documentation_ and run ``vdirsyncer sync`` to synchronize the changes to NextCloud.
.. _glob: https://en.wikipedia.org/wiki/Glob_(programming)
.. _documentation: http://todoman.readthedocs.io/
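The `~/.calendars/*` value works because todoman expands it with shell-style globbing, one match per collection subfolder that `vdirsyncer sync` created. A small self-contained demonstration of that expansion, using a temporary directory in place of the real `~/.calendars/`:

```python
import glob
import os
import tempfile

# Demonstration of how a path like `~/.calendars/*` resolves: the glob
# matches each collection subfolder, i.e. one tasklist per directory.
with tempfile.TemporaryDirectory() as root:
    for name in ("personal", "work"):  # stand-ins for synced collections
        os.mkdir(os.path.join(root, name))
    matches = sorted(glob.glob(os.path.join(root, "*")))
    print([os.path.basename(m) for m in matches])  # ['personal', 'work']
```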
Other clients
=============
The following client applications also synchronize over CalDAV:
- The Tasks-app found on iOS
- `OpenTasks for Android <https://github.com/dmfs/opentasks>`_
- The `Tasks <https://apps.nextcloud.com/apps/tasks>`_-app for NextCloud's web UI

rust/.gitignore (vendored, new file, 2 lines)

@@ -0,0 +1,2 @@
target/
src/storage/exports.rs

rust/Cargo.lock (generated, new file, 501 lines)

@@ -0,0 +1,501 @@
[[package]]
name = "ansi_term"
version = "0.10.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "atomicwrites"
version = "0.1.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"kernel32-sys 0.2.2 (registry+https://github.com/rust-lang/crates.io-index)",
"nix 0.9.0 (registry+https://github.com/rust-lang/crates.io-index)",
"tempdir 0.3.5 (registry+https://github.com/rust-lang/crates.io-index)",
"winapi 0.2.8 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "atty"
version = "0.2.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"libc 0.2.35 (registry+https://github.com/rust-lang/crates.io-index)",
"termion 1.5.1 (registry+https://github.com/rust-lang/crates.io-index)",
"winapi 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "backtrace"
version = "0.3.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"backtrace-sys 0.1.16 (registry+https://github.com/rust-lang/crates.io-index)",
"cfg-if 0.1.2 (registry+https://github.com/rust-lang/crates.io-index)",
"libc 0.2.35 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc-demangle 0.1.5 (registry+https://github.com/rust-lang/crates.io-index)",
"winapi 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "backtrace-sys"
version = "0.1.16"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"cc 1.0.4 (registry+https://github.com/rust-lang/crates.io-index)",
"libc 0.2.35 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "bitflags"
version = "0.9.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "bitflags"
version = "1.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "cbindgen"
version = "0.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"clap 2.29.1 (registry+https://github.com/rust-lang/crates.io-index)",
"log 0.3.9 (registry+https://github.com/rust-lang/crates.io-index)",
"serde 1.0.27 (registry+https://github.com/rust-lang/crates.io-index)",
"serde_json 1.0.9 (registry+https://github.com/rust-lang/crates.io-index)",
"syn 0.11.11 (registry+https://github.com/rust-lang/crates.io-index)",
"tempdir 0.3.5 (registry+https://github.com/rust-lang/crates.io-index)",
"toml 0.4.5 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "cc"
version = "1.0.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "cfg-if"
version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "clap"
version = "2.29.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"ansi_term 0.10.2 (registry+https://github.com/rust-lang/crates.io-index)",
"atty 0.2.6 (registry+https://github.com/rust-lang/crates.io-index)",
"bitflags 1.0.1 (registry+https://github.com/rust-lang/crates.io-index)",
"strsim 0.6.0 (registry+https://github.com/rust-lang/crates.io-index)",
"textwrap 0.9.0 (registry+https://github.com/rust-lang/crates.io-index)",
"unicode-width 0.1.4 (registry+https://github.com/rust-lang/crates.io-index)",
"vec_map 0.8.0 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "coco"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"either 1.4.0 (registry+https://github.com/rust-lang/crates.io-index)",
"scopeguard 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "dtoa"
version = "0.4.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "either"
version = "1.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "error-chain"
version = "0.11.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"backtrace 0.3.5 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "fuchsia-zircon"
version = "0.3.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"bitflags 1.0.1 (registry+https://github.com/rust-lang/crates.io-index)",
"fuchsia-zircon-sys 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "fuchsia-zircon-sys"
version = "0.3.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "gcc"
version = "0.3.54"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "itoa"
version = "0.3.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "kernel32-sys"
version = "0.2.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"winapi 0.2.8 (registry+https://github.com/rust-lang/crates.io-index)",
"winapi-build 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "lazy_static"
version = "0.2.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "libc"
version = "0.2.35"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "log"
version = "0.3.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"log 0.4.1 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "log"
version = "0.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"cfg-if 0.1.2 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "nix"
version = "0.9.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"bitflags 0.9.1 (registry+https://github.com/rust-lang/crates.io-index)",
"cfg-if 0.1.2 (registry+https://github.com/rust-lang/crates.io-index)",
"libc 0.2.35 (registry+https://github.com/rust-lang/crates.io-index)",
"void 1.0.2 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "num-traits"
version = "0.1.41"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "num_cpus"
version = "1.8.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"libc 0.2.35 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "quote"
version = "0.3.15"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "rand"
version = "0.3.20"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"fuchsia-zircon 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)",
"libc 0.2.35 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "rayon"
version = "0.8.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"rayon-core 1.3.0 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "rayon-core"
version = "1.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"coco 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)",
"lazy_static 0.2.11 (registry+https://github.com/rust-lang/crates.io-index)",
"libc 0.2.35 (registry+https://github.com/rust-lang/crates.io-index)",
"num_cpus 1.8.0 (registry+https://github.com/rust-lang/crates.io-index)",
"rand 0.3.20 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "redox_syscall"
version = "0.1.37"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "redox_termios"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"redox_syscall 0.1.37 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "ring"
version = "0.12.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"gcc 0.3.54 (registry+https://github.com/rust-lang/crates.io-index)",
"lazy_static 0.2.11 (registry+https://github.com/rust-lang/crates.io-index)",
"libc 0.2.35 (registry+https://github.com/rust-lang/crates.io-index)",
"rayon 0.8.2 (registry+https://github.com/rust-lang/crates.io-index)",
"untrusted 0.5.1 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "rustc-demangle"
version = "0.1.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "scopeguard"
version = "0.3.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "serde"
version = "1.0.27"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"serde_derive 1.0.27 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "serde_derive"
version = "1.0.27"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"quote 0.3.15 (registry+https://github.com/rust-lang/crates.io-index)",
"serde_derive_internals 0.19.0 (registry+https://github.com/rust-lang/crates.io-index)",
"syn 0.11.11 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "serde_derive_internals"
version = "0.19.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"syn 0.11.11 (registry+https://github.com/rust-lang/crates.io-index)",
"synom 0.11.3 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "serde_json"
version = "1.0.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"dtoa 0.4.2 (registry+https://github.com/rust-lang/crates.io-index)",
"itoa 0.3.4 (registry+https://github.com/rust-lang/crates.io-index)",
"num-traits 0.1.41 (registry+https://github.com/rust-lang/crates.io-index)",
"serde 1.0.27 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "strsim"
version = "0.6.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "syn"
version = "0.11.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"quote 0.3.15 (registry+https://github.com/rust-lang/crates.io-index)",
"synom 0.11.3 (registry+https://github.com/rust-lang/crates.io-index)",
"unicode-xid 0.0.4 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "synom"
version = "0.11.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"unicode-xid 0.0.4 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "tempdir"
version = "0.3.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"rand 0.3.20 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "termion"
version = "1.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"libc 0.2.35 (registry+https://github.com/rust-lang/crates.io-index)",
"redox_syscall 0.1.37 (registry+https://github.com/rust-lang/crates.io-index)",
"redox_termios 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "textwrap"
version = "0.9.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"unicode-width 0.1.4 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "toml"
version = "0.4.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"serde 1.0.27 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "unicode-width"
version = "0.1.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "unicode-xid"
version = "0.0.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "untrusted"
version = "0.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "vdirsyncer_rustext"
version = "0.1.0"
dependencies = [
"atomicwrites 0.1.4 (registry+https://github.com/rust-lang/crates.io-index)",
"cbindgen 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)",
"error-chain 0.11.0 (registry+https://github.com/rust-lang/crates.io-index)",
"ring 0.12.1 (registry+https://github.com/rust-lang/crates.io-index)",
"vobject 0.4.2 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "vec_map"
version = "0.8.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "vobject"
version = "0.4.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"error-chain 0.11.0 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "void"
version = "1.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "winapi"
version = "0.2.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "winapi"
version = "0.3.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"winapi-i686-pc-windows-gnu 0.3.2 (registry+https://github.com/rust-lang/crates.io-index)",
"winapi-x86_64-pc-windows-gnu 0.3.2 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "winapi-build"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "winapi-i686-pc-windows-gnu"
version = "0.3.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "winapi-x86_64-pc-windows-gnu"
version = "0.3.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
[metadata]
"checksum ansi_term 0.10.2 (registry+https://github.com/rust-lang/crates.io-index)" = "6b3568b48b7cefa6b8ce125f9bb4989e52fbcc29ebea88df04cc7c5f12f70455"
"checksum atomicwrites 0.1.4 (registry+https://github.com/rust-lang/crates.io-index)" = "4560dd4eadad8c80a88e25426f96a74ad62c95d4ee424226803013c0ba94f1cf"
"checksum atty 0.2.6 (registry+https://github.com/rust-lang/crates.io-index)" = "8352656fd42c30a0c3c89d26dea01e3b77c0ab2af18230835c15e2e13cd51859"
"checksum backtrace 0.3.5 (registry+https://github.com/rust-lang/crates.io-index)" = "ebbbf59b1c43eefa8c3ede390fcc36820b4999f7914104015be25025e0d62af2"
"checksum backtrace-sys 0.1.16 (registry+https://github.com/rust-lang/crates.io-index)" = "44585761d6161b0f57afc49482ab6bd067e4edef48c12a152c237eb0203f7661"
"checksum bitflags 0.9.1 (registry+https://github.com/rust-lang/crates.io-index)" = "4efd02e230a02e18f92fc2735f44597385ed02ad8f831e7c1c1156ee5e1ab3a5"
"checksum bitflags 1.0.1 (registry+https://github.com/rust-lang/crates.io-index)" = "b3c30d3802dfb7281680d6285f2ccdaa8c2d8fee41f93805dba5c4cf50dc23cf"
"checksum cbindgen 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)" = "370c18a61741bd716377aba3fc42d78788df5d1af5e4bfbe22926013bd91d50a"
"checksum cc 1.0.4 (registry+https://github.com/rust-lang/crates.io-index)" = "deaf9ec656256bb25b404c51ef50097207b9cbb29c933d31f92cae5a8a0ffee0"
"checksum cfg-if 0.1.2 (registry+https://github.com/rust-lang/crates.io-index)" = "d4c819a1287eb618df47cc647173c5c4c66ba19d888a6e50d605672aed3140de"
"checksum clap 2.29.1 (registry+https://github.com/rust-lang/crates.io-index)" = "8f4a2b3bb7ef3c672d7c13d15613211d5a6976b6892c598b0fcb5d40765f19c2"
"checksum coco 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)" = "c06169f5beb7e31c7c67ebf5540b8b472d23e3eade3b2ec7d1f5b504a85f91bd"
"checksum dtoa 0.4.2 (registry+https://github.com/rust-lang/crates.io-index)" = "09c3753c3db574d215cba4ea76018483895d7bff25a31b49ba45db21c48e50ab"
"checksum either 1.4.0 (registry+https://github.com/rust-lang/crates.io-index)" = "740178ddf48b1a9e878e6d6509a1442a2d42fd2928aae8e7a6f8a36fb01981b3"
"checksum error-chain 0.11.0 (registry+https://github.com/rust-lang/crates.io-index)" = "ff511d5dc435d703f4971bc399647c9bc38e20cb41452e3b9feb4765419ed3f3"
"checksum fuchsia-zircon 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)" = "2e9763c69ebaae630ba35f74888db465e49e259ba1bc0eda7d06f4a067615d82"
"checksum fuchsia-zircon-sys 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)" = "3dcaa9ae7725d12cdb85b3ad99a434db70b468c09ded17e012d86b5c1010f7a7"
"checksum gcc 0.3.54 (registry+https://github.com/rust-lang/crates.io-index)" = "5e33ec290da0d127825013597dbdfc28bee4964690c7ce1166cbc2a7bd08b1bb"
"checksum itoa 0.3.4 (registry+https://github.com/rust-lang/crates.io-index)" = "8324a32baf01e2ae060e9de58ed0bc2320c9a2833491ee36cd3b4c414de4db8c"
"checksum kernel32-sys 0.2.2 (registry+https://github.com/rust-lang/crates.io-index)" = "7507624b29483431c0ba2d82aece8ca6cdba9382bff4ddd0f7490560c056098d"
"checksum lazy_static 0.2.11 (registry+https://github.com/rust-lang/crates.io-index)" = "76f033c7ad61445c5b347c7382dd1237847eb1bce590fe50365dcb33d546be73"
"checksum libc 0.2.35 (registry+https://github.com/rust-lang/crates.io-index)" = "96264e9b293e95d25bfcbbf8a88ffd1aedc85b754eba8b7d78012f638ba220eb"
"checksum log 0.3.9 (registry+https://github.com/rust-lang/crates.io-index)" = "e19e8d5c34a3e0e2223db8e060f9e8264aeeb5c5fc64a4ee9965c062211c024b"
"checksum log 0.4.1 (registry+https://github.com/rust-lang/crates.io-index)" = "89f010e843f2b1a31dbd316b3b8d443758bc634bed37aabade59c686d644e0a2"
"checksum nix 0.9.0 (registry+https://github.com/rust-lang/crates.io-index)" = "a2c5afeb0198ec7be8569d666644b574345aad2e95a53baf3a532da3e0f3fb32"
"checksum num-traits 0.1.41 (registry+https://github.com/rust-lang/crates.io-index)" = "cacfcab5eb48250ee7d0c7896b51a2c5eec99c1feea5f32025635f5ae4b00070"
"checksum num_cpus 1.8.0 (registry+https://github.com/rust-lang/crates.io-index)" = "c51a3322e4bca9d212ad9a158a02abc6934d005490c054a2778df73a70aa0a30"
"checksum quote 0.3.15 (registry+https://github.com/rust-lang/crates.io-index)" = "7a6e920b65c65f10b2ae65c831a81a073a89edd28c7cce89475bff467ab4167a"
"checksum rand 0.3.20 (registry+https://github.com/rust-lang/crates.io-index)" = "512870020642bb8c221bf68baa1b2573da814f6ccfe5c9699b1c303047abe9b1"
"checksum rayon 0.8.2 (registry+https://github.com/rust-lang/crates.io-index)" = "b614fe08b6665cb9a231d07ac1364b0ef3cb3698f1239ee0c4c3a88a524f54c8"
"checksum rayon-core 1.3.0 (registry+https://github.com/rust-lang/crates.io-index)" = "e64b609139d83da75902f88fd6c01820046840a18471e4dfcd5ac7c0f46bea53"
"checksum redox_syscall 0.1.37 (registry+https://github.com/rust-lang/crates.io-index)" = "0d92eecebad22b767915e4d529f89f28ee96dbbf5a4810d2b844373f136417fd"
"checksum redox_termios 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)" = "7e891cfe48e9100a70a3b6eb652fef28920c117d366339687bd5576160db0f76"
"checksum ring 0.12.1 (registry+https://github.com/rust-lang/crates.io-index)" = "6f7d28b30a72c01b458428e0ae988d4149c20d902346902be881e3edc4bb325c"
"checksum rustc-demangle 0.1.5 (registry+https://github.com/rust-lang/crates.io-index)" = "aee45432acc62f7b9a108cc054142dac51f979e69e71ddce7d6fc7adf29e817e"
"checksum scopeguard 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)" = "94258f53601af11e6a49f722422f6e3425c52b06245a5cf9bc09908b174f5e27"
"checksum serde 1.0.27 (registry+https://github.com/rust-lang/crates.io-index)" = "db99f3919e20faa51bb2996057f5031d8685019b5a06139b1ce761da671b8526"
"checksum serde_derive 1.0.27 (registry+https://github.com/rust-lang/crates.io-index)" = "f4ba7591cfe93755e89eeecdbcc668885624829b020050e6aec99c2a03bd3fd0"
"checksum serde_derive_internals 0.19.0 (registry+https://github.com/rust-lang/crates.io-index)" = "6e03f1c9530c3fb0a0a5c9b826bdd9246a5921ae995d75f512ac917fc4dd55b5"
"checksum serde_json 1.0.9 (registry+https://github.com/rust-lang/crates.io-index)" = "c9db7266c7d63a4c4b7fe8719656ccdd51acf1bed6124b174f933b009fb10bcb"
"checksum strsim 0.6.0 (registry+https://github.com/rust-lang/crates.io-index)" = "b4d15c810519a91cf877e7e36e63fe068815c678181439f2f29e2562147c3694"
"checksum syn 0.11.11 (registry+https://github.com/rust-lang/crates.io-index)" = "d3b891b9015c88c576343b9b3e41c2c11a51c219ef067b264bd9c8aa9b441dad"
"checksum synom 0.11.3 (registry+https://github.com/rust-lang/crates.io-index)" = "a393066ed9010ebaed60b9eafa373d4b1baac186dd7e008555b0f702b51945b6"
"checksum tempdir 0.3.5 (registry+https://github.com/rust-lang/crates.io-index)" = "87974a6f5c1dfb344d733055601650059a3363de2a6104819293baff662132d6"
"checksum termion 1.5.1 (registry+https://github.com/rust-lang/crates.io-index)" = "689a3bdfaab439fd92bc87df5c4c78417d3cbe537487274e9b0b2dce76e92096"
"checksum textwrap 0.9.0 (registry+https://github.com/rust-lang/crates.io-index)" = "c0b59b6b4b44d867f1370ef1bd91bfb262bf07bf0ae65c202ea2fbc16153b693"
"checksum toml 0.4.5 (registry+https://github.com/rust-lang/crates.io-index)" = "a7540f4ffc193e0d3c94121edb19b055670d369f77d5804db11ae053a45b6e7e"
"checksum unicode-width 0.1.4 (registry+https://github.com/rust-lang/crates.io-index)" = "bf3a113775714a22dcb774d8ea3655c53a32debae63a063acc00a91cc586245f"
"checksum unicode-xid 0.0.4 (registry+https://github.com/rust-lang/crates.io-index)" = "8c1f860d7d29cf02cb2f3f359fd35991af3d30bac52c57d265a3c461074cb4dc"
"checksum untrusted 0.5.1 (registry+https://github.com/rust-lang/crates.io-index)" = "f392d7819dbe58833e26872f5f6f0d68b7bbbe90fc3667e98731c4a15ad9a7ae"
"checksum vec_map 0.8.0 (registry+https://github.com/rust-lang/crates.io-index)" = "887b5b631c2ad01628bbbaa7dd4c869f80d3186688f8d0b6f58774fbe324988c"
"checksum vobject 0.4.2 (registry+https://github.com/rust-lang/crates.io-index)" = "6041995691036270fabeb41975ca858f3b5113b82eea19a4f276bfb8b32e9ae4"
"checksum void 1.0.2 (registry+https://github.com/rust-lang/crates.io-index)" = "6a02e4885ed3bc0f2de90ea6dd45ebcbb66dacffe03547fadbb0eeae2770887d"
"checksum winapi 0.2.8 (registry+https://github.com/rust-lang/crates.io-index)" = "167dc9d6949a9b857f3451275e911c3f44255842c1f7a76f33c55103a909087a"
"checksum winapi 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)" = "b09fb3b6f248ea4cd42c9a65113a847d612e17505d6ebd1f7357ad68a8bf8693"
"checksum winapi-build 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)" = "2d315eee3b34aca4797b2da6b13ed88266e6d612562a0c46390af8299fc699bc"
"checksum winapi-i686-pc-windows-gnu 0.3.2 (registry+https://github.com/rust-lang/crates.io-index)" = "ec6667f60c23eca65c561e63a13d81b44234c2e38a6b6c959025ee907ec614cc"
"checksum winapi-x86_64-pc-windows-gnu 0.3.2 (registry+https://github.com/rust-lang/crates.io-index)" = "98f12c52b2630cd05d2c3ffd8e008f7f48252c042b4871c72aed9dc733b96668"

rust/Cargo.toml (new file)
@@ -0,0 +1,18 @@
[package]
name = "vdirsyncer_rustext"
version = "0.1.0"
authors = ["Markus Unterwaditzer <markus@unterwaditzer.net>"]
build = "build.rs"
[lib]
name = "vdirsyncer_rustext"
crate-type = ["cdylib"]
[dependencies]
vobject = "0.4.2"
ring = "0.12.1"
error-chain = "0.11.0"
atomicwrites = "0.1.4"
[build-dependencies]
cbindgen = "0.4"
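The `crate-type = ["cdylib"]` above builds a C-compatible shared library instead of a Rust rlib, so the crate's symbols can be called from Python through the cbindgen-generated header. A minimal sketch of what such an export looks like; the function name here is hypothetical and not part of the actual crate:

```rust
use std::ffi::CString;
use std::os::raw::c_char;

// `#[no_mangle] extern "C"` keeps the symbol name unmangled and uses the
// C calling convention, so the cdylib exposes it to foreign callers.
#[no_mangle]
pub extern "C" fn vdirsyncer_example_version() -> *mut c_char {
    // Ownership of the buffer transfers to the caller, which must hand it
    // back (e.g. via CString::from_raw) to free it.
    CString::new("0.1.0").unwrap().into_raw()
}

fn main() {
    let ptr = vdirsyncer_example_version();
    // Reclaim ownership so the string is freed.
    let s = unsafe { CString::from_raw(ptr) };
    println!("{}", s.to_str().unwrap());
}
```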

rust/build.rs (new file)
@@ -0,0 +1,161 @@
extern crate cbindgen;
use std::env;
use std::fs::{remove_file, File};
use std::io::Write;
use std::path::Path;
const TEMPLATE_EACH: &'static str = r#"
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_{name}_list(
storage: *mut {path},
err: *mut VdirsyncerError
) -> *mut VdirsyncerStorageListing {
match (*storage).list() {
Ok(x) => Box::into_raw(Box::new(VdirsyncerStorageListing {
iterator: x,
href: None,
etag: None
})),
Err(e) => {
e.fill_c_err(err);
mem::zeroed()
}
}
}
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_{name}_get(
storage: *mut {path},
c_href: *const c_char,
err: *mut VdirsyncerError
) -> *mut VdirsyncerStorageGetResult {
let href = CStr::from_ptr(c_href);
match (*storage).get(href.to_str().unwrap()) {
Ok((item, href)) => {
Box::into_raw(Box::new(VdirsyncerStorageGetResult {
item: Box::into_raw(Box::new(item)),
etag: CString::new(href).unwrap().into_raw()
}))
},
Err(e) => {
e.fill_c_err(err);
mem::zeroed()
}
}
}
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_{name}_upload(
storage: *mut {path},
item: *mut Item,
err: *mut VdirsyncerError
) -> *mut VdirsyncerStorageUploadResult {
match (*storage).upload((*item).clone()) {
Ok((href, etag)) => {
Box::into_raw(Box::new(VdirsyncerStorageUploadResult {
href: CString::new(href).unwrap().into_raw(),
etag: CString::new(etag).unwrap().into_raw()
}))
},
Err(e) => {
e.fill_c_err(err);
mem::zeroed()
}
}
}
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_{name}_update(
storage: *mut {path},
c_href: *const c_char,
item: *mut Item,
c_etag: *const c_char,
err: *mut VdirsyncerError
) -> *const c_char {
let href = CStr::from_ptr(c_href);
let etag = CStr::from_ptr(c_etag);
match (*storage).update(href.to_str().unwrap(), (*item).clone(), etag.to_str().unwrap()) {
Ok(etag) => CString::new(etag).unwrap().into_raw(),
Err(e) => {
e.fill_c_err(err);
mem::zeroed()
}
}
}
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_{name}_delete(
storage: *mut {path},
c_href: *const c_char,
c_etag: *const c_char,
err: *mut VdirsyncerError
) {
let href = CStr::from_ptr(c_href);
let etag = CStr::from_ptr(c_etag);
match (*storage).delete(href.to_str().unwrap(), etag.to_str().unwrap()) {
Ok(()) => (),
Err(e) => e.fill_c_err(err)
}
}
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_{name}_buffered(storage: *mut {path}) {
(*storage).buffered();
}
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_{name}_flush(
storage: *mut {path},
err: *mut VdirsyncerError
) {
match (*storage).flush() {
Ok(_) => (),
Err(e) => e.fill_c_err(err)
}
}
"#;
fn export_storage(f: &mut File, name: &str, path: &str) {
// String formatting in Rust is resolved at compile time, which doesn't work for our
// case, so substitute the placeholders at runtime with str::replace instead.
write!(
f,
"{}",
TEMPLATE_EACH
.replace("{name}", name)
.replace("{path}", path)
).unwrap();
}
fn main() {
let crate_dir = env::var("CARGO_MANIFEST_DIR").unwrap();
let mut f = File::create(Path::new(&crate_dir).join("src/storage/exports.rs")).unwrap();
write!(f, "// Auto-generated, do not check in.\n").unwrap();
write!(f, "use std::os::raw::c_char;\n").unwrap();
write!(f, "use std::mem;\n").unwrap();
write!(f, "use std::ffi::{{CStr, CString}};\n").unwrap();
write!(f, "use errors::*;\n").unwrap();
write!(f, "use item::Item;\n").unwrap();
write!(f, "use super::VdirsyncerStorageListing;\n").unwrap();
write!(f, "use super::VdirsyncerStorageGetResult;\n").unwrap();
write!(f, "use super::VdirsyncerStorageUploadResult;\n").unwrap();
write!(f, "use super::Storage;\n").unwrap();
write!(f, "use super::singlefile;\n").unwrap();
export_storage(&mut f, "singlefile", "singlefile::SinglefileStorage");
drop(f);
let _ = remove_file(Path::new(&crate_dir).join("target/vdirsyncer_rustext.h"));
let res = cbindgen::Builder::new()
.with_crate(crate_dir)
.with_language(cbindgen::Language::C)
.generate();
match res {
Ok(x) => x.write_to_file("target/vdirsyncer_rustext.h"),
Err(e) => println!("FAILED TO GENERATE BINDINGS: {:?}", e),
}
}
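Because Rust's `format!` requires its format string to be a compile-time literal, the build script above fills in `{name}` and `{path}` placeholders with `str::replace` at build time instead. The mechanism in isolation, with the template text abbreviated to a single signature:

```rust
// Runtime placeholder substitution, the same shape as export_storage above.
const TEMPLATE_EACH: &str =
    "pub unsafe extern \"C\" fn vdirsyncer_{name}_list(storage: *mut {path}) { /* ... */ }";

fn render(name: &str, path: &str) -> String {
    TEMPLATE_EACH
        .replace("{name}", name)
        .replace("{path}", path)
}

fn main() {
    let code = render("singlefile", "singlefile::SinglefileStorage");
    // The generated signature names the concrete storage type directly.
    assert!(code.contains("vdirsyncer_singlefile_list"));
    assert!(code.contains("*mut singlefile::SinglefileStorage"));
    println!("{}", code);
}
```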

rust/src/item.rs (new file)
@@ -0,0 +1,258 @@
use vobject;
use ring;
use std::fmt::Write;
use errors::*;
#[derive(Clone)]
pub enum Item {
Parsed(vobject::Component),
Unparseable(String), // FIXME: maybe use https://crates.io/crates/terminated
}
impl Item {
pub fn from_raw(raw: String) -> Self {
match vobject::parse_component(&raw) {
Ok(x) => Item::Parsed(x),
// Don't chain vobject error here because it cannot be stored/cloned FIXME
_ => Item::Unparseable(raw),
}
}
pub fn from_component(component: vobject::Component) -> Self {
Item::Parsed(component)
}
/// Global identifier of the item across storages; it does not change when the item is
/// modified.
pub fn get_uid(&self) -> Option<String> {
// FIXME: Cache
if let Item::Parsed(ref c) = *self {
let mut stack: Vec<&vobject::Component> = vec![c];
while let Some(vobj) = stack.pop() {
if let Some(prop) = vobj.get_only("UID") {
return Some(prop.value_as_string());
}
stack.extend(vobj.subcomponents.iter());
}
}
None
}
pub fn with_uid(&self, uid: &str) -> Result<Self> {
if let Item::Parsed(ref component) = *self {
let mut new_component = component.clone();
change_uid(&mut new_component, uid);
Ok(Item::from_raw(vobject::write_component(&new_component)))
} else {
Err(ErrorKind::ItemUnparseable.into())
}
}
/// Raw unvalidated content of the item
pub fn get_raw(&self) -> String {
match *self {
Item::Parsed(ref component) => vobject::write_component(component),
Item::Unparseable(ref x) => x.to_owned(),
}
}
/// Component of item if parseable
pub fn get_component(&self) -> Result<&vobject::Component> {
match *self {
Item::Parsed(ref component) => Ok(component),
_ => Err(ErrorKind::ItemUnparseable.into()),
}
}
/// Component of item if parseable
pub fn into_component(self) -> Result<vobject::Component> {
match self {
Item::Parsed(component) => Ok(component),
_ => Err(ErrorKind::ItemUnparseable.into()),
}
}
/// Used for etags
pub fn get_hash(&self) -> Result<String> {
// FIXME: cache
if let Item::Parsed(ref component) = *self {
Ok(hash_component(component))
} else {
Err(ErrorKind::ItemUnparseable.into())
}
}
/// Used for generating hrefs and matching up items during synchronization. This is either the
/// UID or the hash of the item's content.
pub fn get_ident(&self) -> Result<String> {
if let Some(x) = self.get_uid() {
return Ok(x);
}
// We hash the item instead of directly using its raw content, because
// 1. The raw content might be really large, e.g. when it's a contact
// with a picture, which bloats the status file.
//
// 2. The status file would contain really sensitive information.
self.get_hash()
}
pub fn is_parseable(&self) -> bool {
if let Item::Parsed(_) = *self {
true
} else {
false
}
}
}
fn change_uid(c: &mut vobject::Component, uid: &str) {
let mut stack = vec![c];
while let Some(component) = stack.pop() {
match component.name.as_ref() {
"VEVENT" | "VTODO" | "VJOURNAL" | "VCARD" => {
if !uid.is_empty() {
component.set(vobject::Property::new("UID", uid));
} else {
component.remove("UID");
}
}
_ => (),
}
stack.extend(component.subcomponents.iter_mut());
}
}
fn hash_component(c: &vobject::Component) -> String {
let mut new_c = c.clone();
{
let mut stack = vec![&mut new_c];
while let Some(component) = stack.pop() {
// PRODID is changed by radicale for some reason after upload
component.remove("PRODID");
// Sometimes METHOD:PUBLISH is added by WebCAL providers; for us it doesn't make a difference
component.remove("METHOD");
// X-RADICALE-NAME is used by radicale, because hrefs don't really exist in their filesystem backend
component.remove("X-RADICALE-NAME");
// These properties are from the vCard specification and are supposed to change when the
// item does -- however, we can determine that ourselves
component.remove("REV");
component.remove("LAST-MODIFIED");
component.remove("CREATED");
// Some iCalendar HTTP calendars generate the DTSTAMP at request time, so this
// property changes even when the rest of the item didn't. Some do the same
// with the UID.
//
// - Google's read-only calendar links
// - http://www.feiertage-oesterreich.at/
component.remove("DTSTAMP");
component.remove("UID");
if component.name == "VCALENDAR" {
// CALSCALE's default value is gregorian
let calscale = component.get_only("CALSCALE").map(|x| x.value_as_string());
if let Some(x) = calscale {
if x == "GREGORIAN" {
component.remove("CALSCALE");
}
}
// Apparently this is set by Horde?
// https://github.com/pimutils/vdirsyncer/issues/318
// Also Google sets those properties
component.remove("X-WR-CALNAME");
component.remove("X-WR-TIMEZONE");
component.subcomponents.retain(|c| c.name != "VTIMEZONE");
}
stack.extend(component.subcomponents.iter_mut());
}
}
// FIXME: Possible optimization: Stream component to hasher instead of allocating new string
let raw = vobject::write_component(&new_c);
let mut lines: Vec<_> = raw.lines().collect();
lines.sort();
let digest = ring::digest::digest(&ring::digest::SHA256, lines.join("\r\n").as_bytes());
let mut rv = String::new();
for &byte in digest.as_ref() {
// Zero-pad each byte so e.g. 0x0a renders as "0a" rather than "a".
write!(&mut rv, "{:02x}", byte).unwrap();
}
rv
}
pub mod exports {
use super::Item;
use std::mem;
use std::ffi::{CStr, CString};
use std::os::raw::c_char;
use errors::*;
const EMPTY_STRING: *const c_char = b"\0" as *const u8 as *const c_char;
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_get_uid(c: *mut Item) -> *const c_char {
match (*c).get_uid() {
Some(x) => CString::new(x).unwrap().into_raw(),
None => EMPTY_STRING,
}
}
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_get_raw(c: *mut Item) -> *const c_char {
CString::new((*c).get_raw()).unwrap().into_raw()
}
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_item_from_raw(s: *const c_char) -> *mut Item {
let cstring = CStr::from_ptr(s);
Box::into_raw(Box::new(Item::from_raw(
cstring.to_str().unwrap().to_owned(),
)))
}
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_free_item(c: *mut Item) {
let _: Box<Item> = Box::from_raw(c);
}
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_with_uid(
c: *mut Item,
uid: *const c_char,
err: *mut VdirsyncerError,
) -> *mut Item {
let uid_cstring = CStr::from_ptr(uid);
match (*c).with_uid(uid_cstring.to_str().unwrap()) {
Ok(x) => Box::into_raw(Box::new(x)),
Err(e) => {
e.fill_c_err(err);
mem::zeroed()
}
}
}
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_get_hash(
c: *mut Item,
err: *mut VdirsyncerError,
) -> *const c_char {
match (*c).get_hash() {
Ok(x) => CString::new(x).unwrap().into_raw(),
Err(e) => {
e.fill_c_err(err);
mem::zeroed()
}
}
}
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_item_is_parseable(c: *mut Item) -> bool {
(*c).is_parseable()
}
}
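The `hash_component` function above is what backs etags: volatile properties are stripped, the remaining lines are sorted so serialization order doesn't matter, and the normalized text is hashed. A simplified, std-only sketch of that idea, with the std hasher standing in for ring's SHA-256 and plain line-prefix matching standing in for vobject parsing:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Strip volatile properties, sort the remaining lines, then hash.
fn normalized_hash(raw: &str, volatile: &[&str]) -> u64 {
    let mut lines: Vec<&str> = raw
        .lines()
        .filter(|l| !volatile.iter().any(|p| l.starts_with(p)))
        .collect();
    lines.sort();
    let mut h = DefaultHasher::new();
    lines.join("\r\n").hash(&mut h);
    h.finish()
}

fn main() {
    let a = "BEGIN:VEVENT\r\nUID:x\r\nDTSTAMP:20180214T000000Z\r\nSUMMARY:Test\r\nEND:VEVENT";
    let b = "BEGIN:VEVENT\r\nSUMMARY:Test\r\nDTSTAMP:20990101T000000Z\r\nUID:x\r\nEND:VEVENT";
    // Reordered properties and a changed DTSTAMP hash identically.
    assert_eq!(
        normalized_hash(a, &["DTSTAMP", "UID"]),
        normalized_hash(b, &["DTSTAMP", "UID"])
    );
}
```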

rust/src/lib.rs (new file)
@@ -0,0 +1,104 @@
extern crate atomicwrites;
#[macro_use]
extern crate error_chain;
extern crate ring;
extern crate vobject;
pub mod item;
pub mod storage;
mod errors {
use std::ffi::CString;
use std::os::raw::c_char;
use vobject;
use atomicwrites;
error_chain!{
links {
Vobject(vobject::error::VObjectError, vobject::error::VObjectErrorKind);
}
foreign_links {
Io(::std::io::Error);
}
errors {
ItemUnparseable {
description("ItemUnparseable: The item cannot be parsed."),
display("The item cannot be parsed."),
}
VobjectVersionMismatch(first: String, second: String) {
description("Incompatible vobject versions."),
display("Conflict between {} and {}", first, second),
}
UnexpectedVobject(found: String, expected: String) {
description("Unexpected component type"),
display("Found type {}, expected {}", found, expected),
}
ItemNotFound(href: String) {
description("ItemNotFound: The item could not be found"),
display("The item '{}' could not be found", href),
}
AlreadyExisting(href: String) {
description("AlreadyExisting: An item at this href already exists"),
display("The href '{}' is already taken", href),
}
WrongEtag(href: String) {
description("WrongEtag: A wrong etag was provided."),
display("A wrong etag for '{}' was provided. This indicates that two clients are writing data at the same time.", href),
}
MtimeMismatch(filepath: String) {
description("MtimeMismatch: Two programs access the same file."),
display("The mtime of {} has unexpectedly changed. Please close other programs accessing this file.", filepath),
}
}
}
impl From<atomicwrites::Error<Error>> for Error {
fn from(e: atomicwrites::Error<Error>) -> Error {
match e {
atomicwrites::Error::Internal(x) => x.into(),
atomicwrites::Error::User(x) => x,
}
}
}
pub trait ErrorExt: ::std::error::Error {
unsafe fn fill_c_err(&self, err: *mut VdirsyncerError) {
(*err).failed = true;
(*err).msg = CString::new(self.description()).unwrap().into_raw();
}
}
impl ErrorExt for Error {}
#[repr(C)]
pub struct VdirsyncerError {
pub failed: bool,
pub msg: *mut c_char,
}
}
pub mod exports {
use std::ffi::{CStr, CString};
use std::ptr;
use std::os::raw::c_char;
use errors::*;
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_free_str(s: *const c_char) {
// CStr::from_ptr only borrows and frees nothing; reclaim ownership of the
// CString::into_raw allocation so it is actually deallocated.
let _ = CString::from_raw(s as *mut c_char);
}
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_clear_err(e: *mut VdirsyncerError) {
CString::from_raw((*e).msg);
(*e).msg = ptr::null_mut();
}
}
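`VdirsyncerError` above is the out-parameter convention for the whole FFI surface: fallible exports take `err: *mut VdirsyncerError` and set `failed` plus an owned message on failure, which the caller later releases via `vdirsyncer_clear_err`. A self-contained sketch of both halves of that contract (helper names here mirror the originals but are local to the sketch):

```rust
use std::ffi::CString;
use std::os::raw::c_char;
use std::ptr;

#[repr(C)]
pub struct VdirsyncerError {
    pub failed: bool,
    pub msg: *mut c_char,
}

// Mirrors ErrorExt::fill_c_err: mark the error and hand over an owned string.
unsafe fn fill_c_err(err: *mut VdirsyncerError, msg: &str) {
    (*err).failed = true;
    (*err).msg = CString::new(msg).unwrap().into_raw();
}

// Mirrors vdirsyncer_clear_err: reclaim the message so it is freed.
unsafe fn clear_err(err: *mut VdirsyncerError) {
    drop(CString::from_raw((*err).msg));
    (*err).msg = ptr::null_mut();
}

fn main() {
    let mut err = VdirsyncerError { failed: false, msg: ptr::null_mut() };
    unsafe {
        fill_c_err(&mut err, "The item cannot be parsed.");
        assert!(err.failed);
        clear_err(&mut err);
        assert!(err.msg.is_null());
    }
}
```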

rust/src/storage/mod.rs (new file)
@@ -0,0 +1,174 @@
pub mod singlefile;
pub mod exports;
use std::ffi::CString;
use std::os::raw::c_char;
use errors::*;
use item::Item;
type ItemAndEtag = (Item, String);
pub trait Storage: Sized {
/// Returns an iterator of `(href, etag)`.
fn list<'a>(&'a mut self) -> Result<Box<Iterator<Item = (String, String)> + 'a>>;
/// Fetch a single item.
///
/// # Parameters
/// - `href`: href to fetch
/// - returns `(item, etag)`; errors if the item can't be found
fn get(&mut self, href: &str) -> Result<ItemAndEtag>;
/// Fetch multiple items. Duplicate hrefs must be ignored.
///
/// Functionally similar to `get`, but might bring performance benefits on some storages when
/// used cleverly.
///
/// # Parameters
/// - `hrefs`: list of hrefs to fetch
/// - returns an iterator of `(href, item, etag)`
fn get_multi<'a, I: Iterator<Item = String> + 'a>(
&'a mut self,
hrefs: I,
) -> Box<Iterator<Item = (String, Result<ItemAndEtag>)> + 'a> {
Box::new(DefaultGetMultiIterator {
storage: self,
href_iter: hrefs,
})
}
/// Upload a new item.
///
/// In cases where the new etag cannot be atomically determined (i.e. in the same
/// "transaction" as the upload itself), this method may return `None` as etag. This
/// special case only exists because of DAV. Avoid this situation whenever possible.
///
/// Returns `(href, etag)`
fn upload(&mut self, item: Item) -> Result<(String, String)>;
/// Update an item.
///
/// The etag may be none in some cases, see `upload`.
///
/// Returns `etag`
fn update(&mut self, href: &str, item: Item, etag: &str) -> Result<String>;
/// Delete an item by href.
fn delete(&mut self, href: &str, etag: &str) -> Result<()>;
/// Enter buffered mode for storages that support it.
///
/// Uploads, updates and deletions may not be effective until `flush` is explicitly called.
///
/// Use this if you will potentially write a lot of data to the storage; it improves
/// performance for storages that implement it.
fn buffered(&mut self) {}
/// Write back all changes to the collection.
fn flush(&mut self) -> Result<()> {
Ok(())
}
}
struct DefaultGetMultiIterator<'a, S: Storage + 'a, I: Iterator<Item = String>> {
storage: &'a mut S,
href_iter: I,
}
impl<'a, S, I> Iterator for DefaultGetMultiIterator<'a, S, I>
where
S: Storage,
I: Iterator<Item = String>,
{
type Item = (String, Result<ItemAndEtag>);
fn next(&mut self) -> Option<Self::Item> {
match self.href_iter.next() {
Some(x) => Some((x.to_owned(), self.storage.get(&x))),
None => None,
}
}
}
pub struct VdirsyncerStorageListing {
iterator: Box<Iterator<Item = (String, String)>>,
href: Option<String>,
etag: Option<String>,
}
impl VdirsyncerStorageListing {
pub fn advance(&mut self) -> bool {
match self.iterator.next() {
Some((href, etag)) => {
self.href = Some(href);
self.etag = Some(etag);
true
}
None => {
self.href = None;
self.etag = None;
false
}
}
}
pub fn get_href(&mut self) -> Option<String> {
self.href.take()
}
pub fn get_etag(&mut self) -> Option<String> {
self.etag.take()
}
}
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_free_storage_listing(listing: *mut VdirsyncerStorageListing) {
let _: Box<VdirsyncerStorageListing> = Box::from_raw(listing);
}
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_advance_storage_listing(
listing: *mut VdirsyncerStorageListing,
) -> bool {
(*listing).advance()
}
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_storage_listing_get_href(
listing: *mut VdirsyncerStorageListing,
) -> *const c_char {
CString::new((*listing).get_href().unwrap())
.unwrap()
.into_raw()
}
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_storage_listing_get_etag(
listing: *mut VdirsyncerStorageListing,
) -> *const c_char {
CString::new((*listing).get_etag().unwrap())
.unwrap()
.into_raw()
}
#[repr(C)]
pub struct VdirsyncerStorageGetResult {
pub item: *mut Item,
pub etag: *const c_char,
}
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_free_storage_get_result(res: *mut VdirsyncerStorageGetResult) {
let _: Box<VdirsyncerStorageGetResult> = Box::from_raw(res);
}
#[repr(C)]
pub struct VdirsyncerStorageUploadResult {
pub href: *const c_char,
pub etag: *const c_char,
}
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_free_storage_upload_result(
res: *mut VdirsyncerStorageUploadResult,
) {
let _: Box<VdirsyncerStorageUploadResult> = Box::from_raw(res);
}
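The doc comments on the `Storage` trait above describe two contracts worth making concrete: writes are guarded by etags, and `buffered()` defers persistence until `flush()`. A toy Python model of those semantics (illustrative only; the class, names, and data shapes are invented for this sketch, not vdirsyncer's actual API):

```python
import hashlib


class ToyStorage:
    """Toy model of the Storage trait's contract: etags guard against
    concurrent modification, and buffered() defers writes until flush()."""

    def __init__(self):
        self.items = {}        # href -> (raw_item, etag), the working cache
        self.persisted = {}    # simulated on-disk state
        self.buffered_mode = False
        self.dirty = False

    @staticmethod
    def _etag(raw):
        # Content hash stands in for item.get_hash()
        return hashlib.sha256(raw.encode()).hexdigest()

    def upload(self, href, raw):
        if href in self.items:
            raise KeyError('already existing: %s' % href)
        etag = self._etag(raw)
        self.items[href] = (raw, etag)
        self._write_back()
        return href, etag

    def update(self, href, raw, etag):
        _, current = self.items[href]  # KeyError models ItemNotFound
        if current != etag:
            raise ValueError('wrong etag for %s' % href)
        new_etag = self._etag(raw)
        self.items[href] = (raw, new_etag)
        self._write_back()
        return new_etag

    def buffered(self):
        self.buffered_mode = True

    def _write_back(self):
        self.dirty = True
        if not self.buffered_mode:
            self.flush()

    def flush(self):
        if self.dirty:
            self.persisted = dict(self.items)
            self.dirty = False
```

In buffered mode nothing reaches `persisted` until `flush()` is called, mirroring how the Rust `SinglefileStorage` below only rewrites the underlying file once per flush instead of once per change.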


@ -0,0 +1,349 @@
use std::ffi::CStr;
use std::os::raw::c_char;
use std::path::{Path, PathBuf};
use std::collections::{BTreeMap, BTreeSet};
use std::collections::btree_map::Entry::*;
use std::fs::{metadata, File};
use std::io::{Read, Write};
use std::time::SystemTime;
use super::Storage;
use errors::*;
use vobject;
use atomicwrites::{AllowOverwrite, AtomicFile};
use item::Item;
type ItemCache = BTreeMap<String, (Item, String)>;
pub struct SinglefileStorage {
path: PathBuf,
// href -> (item, etag)
items_cache: Option<(ItemCache, SystemTime)>,
buffered_mode: bool,
dirty_cache: bool,
}
impl SinglefileStorage {
pub fn new<P: AsRef<Path>>(path: P) -> Self {
SinglefileStorage {
path: path.as_ref().to_owned(),
items_cache: None,
buffered_mode: false,
dirty_cache: false,
}
}
fn get_items(&mut self) -> Result<&mut ItemCache> {
if self.items_cache.is_none() {
self.list()?;
}
Ok(&mut self.items_cache.as_mut().unwrap().0)
}
fn write_back(&mut self) -> Result<()> {
self.dirty_cache = true;
if self.buffered_mode {
return Ok(());
}
self.flush()?;
Ok(())
}
}
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_init_singlefile(path: *const c_char) -> *mut SinglefileStorage {
let cstring = CStr::from_ptr(path);
Box::into_raw(Box::new(SinglefileStorage::new(cstring.to_str().unwrap())))
}
#[no_mangle]
pub unsafe extern "C" fn vdirsyncer_free_singlefile(s: *mut SinglefileStorage) {
let _: Box<SinglefileStorage> = Box::from_raw(s);
}
impl Storage for SinglefileStorage {
fn list<'a>(&'a mut self) -> Result<Box<Iterator<Item = (String, String)> + 'a>> {
let mut new_cache = BTreeMap::new();
let mtime = metadata(&self.path)?.modified()?;
let mut f = File::open(&self.path)?;
let mut s = String::new();
f.read_to_string(&mut s)?;
for component in split_collection(&s)? {
let item = Item::from_component(component);
let hash = item.get_hash()?;
let ident = item.get_ident()?;
new_cache.insert(ident, (item, hash));
}
self.items_cache = Some((new_cache, mtime));
self.dirty_cache = false;
Ok(Box::new(self.items_cache.as_ref().unwrap().0.iter().map(
|(href, &(_, ref etag))| (href.clone(), etag.clone()),
)))
}
fn get(&mut self, href: &str) -> Result<(Item, String)> {
match self.get_items()?.get(href) {
Some(&(ref item, ref etag)) => Ok((item.clone(), etag.clone())),
None => Err(ErrorKind::ItemNotFound(href.to_owned()))?,
}
}
fn upload(&mut self, item: Item) -> Result<(String, String)> {
let hash = item.get_hash()?;
let href = item.get_ident()?;
match self.get_items()?.entry(href.clone()) {
Occupied(_) => Err(ErrorKind::AlreadyExisting(href.clone()))?,
Vacant(vc) => vc.insert((item, hash.clone())),
};
self.write_back()?;
Ok((href, hash))
}
fn update(&mut self, href: &str, item: Item, etag: &str) -> Result<String> {
let hash = match self.get_items()?.entry(href.to_owned()) {
Occupied(mut oc) => {
if oc.get().1 == etag {
let hash = item.get_hash()?;
oc.insert((item, hash.clone()));
Ok(hash)
} else {
Err(ErrorKind::WrongEtag(href.to_owned()))
}
}
Vacant(_) => Err(ErrorKind::ItemNotFound(href.to_owned())),
}?;
self.write_back()?;
Ok(hash)
}
fn delete(&mut self, href: &str, etag: &str) -> Result<()> {
match self.get_items()?.entry(href.to_owned()) {
Occupied(oc) => {
if oc.get().1 == etag {
oc.remove();
} else {
Err(ErrorKind::WrongEtag(href.to_owned()))?
}
}
Vacant(_) => Err(ErrorKind::ItemNotFound(href.to_owned()))?,
}
self.write_back()?;
Ok(())
}
fn buffered(&mut self) {
self.buffered_mode = true;
}
fn flush(&mut self) -> Result<()> {
if !self.dirty_cache {
return Ok(());
}
let (items, mtime) = self.items_cache.take().unwrap();
let af = AtomicFile::new(&self.path, AllowOverwrite);
let content = join_collection(items.into_iter().map(|(_, (item, _))| item))?;
af.write::<(), Error, _>(|f| {
f.write_all(content.as_bytes())?;
let real_mtime = metadata(&self.path)?.modified()?;
if mtime != real_mtime {
Err(ErrorKind::MtimeMismatch(
self.path.to_string_lossy().into_owned(),
))?;
}
Ok(())
})?;
self.dirty_cache = false;
Ok(())
}
}
/// Split the raw contents of a collection file into one component per item.
fn split_collection(mut input: &str) -> Result<Vec<vobject::Component>> {
let mut rv = vec![];
while !input.is_empty() {
let (component, remainder) = vobject::read_component(input)?;
input = remainder;
match component.name.as_ref() {
"VCALENDAR" => rv.extend(split_vcalendar(component)?),
"VCARD" => rv.push(component),
"VADDRESSBOOK" => for vcard in component.subcomponents {
if vcard.name != "VCARD" {
Err(ErrorKind::UnexpectedVobject(
vcard.name.clone(),
"VCARD".to_owned(),
))?;
}
rv.push(vcard);
},
_ => Err(ErrorKind::UnexpectedVobject(
component.name.clone(),
"VCALENDAR | VCARD | VADDRESSBOOK".to_owned(),
))?,
}
}
Ok(rv)
}
/// Split one VCALENDAR component into multiple VCALENDAR components, one per
/// UID, copying in the VTIMEZONEs each item references.
#[inline]
fn split_vcalendar(mut vcalendar: vobject::Component) -> Result<Vec<vobject::Component>> {
vcalendar.props.remove("METHOD");
let mut timezones = BTreeMap::new(); // tzid => component
let mut subcomponents = vec![];
for component in vcalendar.subcomponents.drain(..) {
match component.name.as_ref() {
"VTIMEZONE" => {
let tzid = match component.get_only("TZID") {
Some(x) => x.value_as_string().clone(),
None => continue,
};
timezones.insert(tzid, component);
}
"VTODO" | "VEVENT" | "VJOURNAL" => subcomponents.push(component),
_ => Err(ErrorKind::UnexpectedVobject(
component.name.clone(),
"VTIMEZONE | VTODO | VEVENT | VJOURNAL".to_owned(),
))?,
};
}
let mut by_uid = BTreeMap::new();
let mut no_uid = vec![];
for component in subcomponents {
let uid = component.get_only("UID").cloned();
let mut wrapper = match uid.as_ref()
.and_then(|u| by_uid.remove(&u.value_as_string()))
{
Some(x) => x,
None => vcalendar.clone(),
};
let mut required_tzids = BTreeSet::new();
for props in component.props.values() {
for prop in props {
if let Some(x) = prop.params.get("TZID") {
required_tzids.insert(x.to_owned());
}
}
}
for tzid in required_tzids {
if let Some(tz) = timezones.get(&tzid) {
wrapper.subcomponents.push(tz.clone());
}
}
wrapper.subcomponents.push(component);
match uid {
Some(p) => {
by_uid.insert(p.value_as_string(), wrapper);
}
None => no_uid.push(wrapper),
}
}
Ok(by_uid
.into_iter()
.map(|(_, v)| v)
.chain(no_uid.into_iter())
.collect())
}
/// Join multiple items into a single collection: a VADDRESSBOOK wrapping
/// VCARDs, or one merged VCALENDAR for calendar items.
fn join_collection<I: Iterator<Item = Item>>(item_iter: I) -> Result<String> {
let mut items = item_iter.peekable();
let item_name = match items.peek() {
Some(x) => x.get_component()?.name.clone(),
None => return Ok("".to_owned()),
};
let wrapper_name = match item_name.as_ref() {
"VCARD" => "VADDRESSBOOK",
"VCALENDAR" => "VCALENDAR",
_ => Err(ErrorKind::UnexpectedVobject(
item_name.clone(),
"VCARD | VCALENDAR".to_owned(),
))?,
};
let mut wrapper = vobject::Component::new(wrapper_name);
let mut version: Option<vobject::Property> = None;
for item in items {
let mut c = item.into_component()?;
if c.name != item_name {
return Err(ErrorKind::UnexpectedVobject(c.name, item_name.clone()).into());
}
if item_name == wrapper_name {
wrapper.subcomponents.extend(c.subcomponents.drain(..));
match (version.as_ref(), c.get_only("VERSION")) {
(Some(x), Some(y)) if x.raw_value != y.raw_value => {
return Err(ErrorKind::VobjectVersionMismatch(
x.raw_value.clone(),
y.raw_value.clone(),
).into());
}
(None, Some(_)) => (),
_ => continue,
}
version = c.get_only("VERSION").cloned();
} else {
wrapper.subcomponents.push(c);
}
}
if let Some(v) = version {
wrapper.set(v);
}
Ok(vobject::write_component(&wrapper))
}
#[cfg(test)]
mod tests {
use super::*;
fn check_roundtrip(raw: &str) {
let components = split_collection(raw).unwrap();
let raw2 = join_collection(components.into_iter().map(Item::from_component)).unwrap();
assert_eq!(
Item::from_raw(raw.to_owned()).get_hash().unwrap(),
Item::from_raw(raw2.to_owned()).get_hash().unwrap()
);
}
#[test]
fn test_wrapper_properties_roundtrip() {
let raw = r#"BEGIN:VCALENDAR
PRODID:-//Google Inc//Google Calendar 70.9054//EN
X-WR-CALNAME:markus.unterwaditzer@runtastic.com
X-WR-TIMEZONE:Europe/Vienna
VERSION:2.0
CALSCALE:GREGORIAN
BEGIN:VEVENT
DTSTART;TZID=Europe/Vienna:20171012T153000
DTEND;TZID=Europe/Vienna:20171012T170000
DTSTAMP:20171009T085029Z
UID:test@test.com
STATUS:CONFIRMED
SUMMARY:Test
TRANSP:OPAQUE
END:VEVENT
END:VCALENDAR"#;
check_roundtrip(raw);
}
}
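`split_vcalendar` above turns one monolithic VCALENDAR into one output component per UID, copying in only the VTIMEZONEs each item actually references. The grouping step, sketched over plain Python dicts (hypothetical data shapes for illustration, not the `vobject` API):

```python
def split_by_uid(events, timezones):
    """Group events by UID and attach the timezones each group references.

    events:    list of dicts like {'UID': ..., 'TZIDS': set_of_tzids}
    timezones: dict mapping tzid -> timezone definition
    Returns:   dict mapping uid -> {'events': [...], 'timezones': {...}}
    """
    by_uid = {}
    for ev in events:
        group = by_uid.setdefault(ev['UID'], {'events': [], 'timezones': {}})
        group['events'].append(ev)
        # Copy in only the timezone definitions this event refers to.
        for tzid in ev.get('TZIDS', ()):
            if tzid in timezones:
                group['timezones'][tzid] = timezones[tzid]
    return by_uid
```

Because recurring events (master plus overrides) share a UID, they land in the same output group; the `test_recurring_events` storage test relies on exactly this behavior.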


@ -0,0 +1,15 @@
make install-rust
echo "export PATH=$HOME/.cargo/bin/:$PATH" >> $BASH_ENV
sudo apt-get install -y cmake
pip install --user virtualenv
~/.local/bin/virtualenv ~/env
echo ". ~/env/bin/activate" >> $BASH_ENV
. $BASH_ENV
pip install docker-compose
make -e install-dev install-test
if python --version | grep -q 'Python 3.6'; then
make -e install-style
fi


@ -8,9 +8,12 @@ ARG distrover
RUN apt-get update
RUN apt-get install -y build-essential fakeroot debhelper git
RUN apt-get install -y python3-all python3-pip
RUN apt-get install -y python3-all python3-dev python3-pip
RUN apt-get install -y ruby ruby-dev
RUN apt-get install -y python-all python-pip
RUN curl https://sh.rustup.rs -sSf | sh -s -- -y
RUN apt-get install -y libssl-dev libffi-dev
ENV PATH="/root/.cargo/bin/:${PATH}"
RUN gem install fpm
@ -24,7 +27,7 @@ RUN mkdir /vdirsyncer/pkgs/
RUN basename *.tar.gz .tar.gz | cut -d'-' -f2 | sed -e 's/\.dev/~/g' | tee version
RUN (echo -n *.tar.gz; echo '[google]') | tee requirements.txt
RUN . /vdirsyncer/env/bin/activate; fpm -s virtualenv -t deb \
RUN . /vdirsyncer/env/bin/activate; fpm --verbose -s virtualenv -t deb \
-n "vdirsyncer-latest" \
-v "$(cat version)" \
--prefix /opt/venvs/vdirsyncer-latest \


@ -1,78 +0,0 @@
import itertools
import json
import sys
python_versions = ("3.4", "3.5", "3.6")
latest_python = "3.6"
cfg = {}
cfg['sudo'] = True
cfg['dist'] = 'trusty'
cfg['language'] = 'python'
cfg['cache'] = 'pip'
cfg['git'] = {
'submodules': False
}
cfg['branches'] = {
'only': ['auto', 'master', '/^.*-maintenance$/']
}
cfg['install'] = """
. scripts/travis-install.sh
pip install -U pip setuptools
pip install wheel
make -e install-dev
make -e install-$BUILD
""".strip().splitlines()
cfg['script'] = ["make -e $BUILD"]
matrix = []
cfg['matrix'] = {'include': matrix}
matrix.append({
'python': latest_python,
'env': 'BUILD=style'
})
for python, requirements in itertools.product(python_versions,
("devel", "release", "minimal")):
dav_servers = ("radicale", "xandikos")
if python == latest_python and requirements == "release":
dav_servers += ("fastmail",)
for dav_server in dav_servers:
job = {
'python': python,
'env': ("BUILD=test "
"DAV_SERVER={dav_server} "
"REQUIREMENTS={requirements} "
.format(dav_server=dav_server,
requirements=requirements))
}
build_prs = dav_server not in ("fastmail", "davical", "icloud")
if not build_prs:
job['if'] = 'NOT (type IN (pull_request))'
matrix.append(job)
matrix.append({
'python': latest_python,
'env': ("BUILD=test "
"ETESYNC_TESTS=true "
"REQUIREMENTS=latest")
})
matrix.append({
'language': 'generic',
'os': 'osx',
'env': 'BUILD=test'
})
json.dump(cfg, sys.stdout, sort_keys=True, indent=2)


@ -1,10 +0,0 @@
#!/bin/sh
# The OS X VM doesn't have any Python support at all
# See https://github.com/travis-ci/travis-ci/issues/2312
if [ "$TRAVIS_OS_NAME" = "osx" ]; then
brew update
brew install python3
virtualenv -p python3 $HOME/osx-py3
. $HOME/osx-py3/bin/activate
fi


@ -1,14 +1,11 @@
[wheel]
universal = 1
[tool:pytest]
norecursedirs = tests/storage/servers/*
addopts = --tb=short
addopts = --tb=short --durations=3
[flake8]
# E731: Use a def instead of lambda expr
# E743: Ambiguous function definition
ignore = E731, E743
select = C,E,F,W,B,B9
exclude = .eggs, tests/storage/servers/owncloud/, tests/storage/servers/nextcloud/, tests/storage/servers/baikal/, build/
exclude = .eggs/, tests/storage/servers/nextcloud/, build/, vdirsyncer/_native*
application-package-names = tests,vdirsyncer


@ -9,6 +9,7 @@ how to package vdirsyncer.
from setuptools import Command, find_packages, setup
milksnake = 'milksnake'
requirements = [
# https://github.com/mitsuhiko/click/issues/200
@ -32,10 +33,29 @@ requirements = [
'requests_toolbelt >=0.4.0',
# https://github.com/untitaker/python-atomicwrites/commit/4d12f23227b6a944ab1d99c507a69fdbc7c9ed6d # noqa
'atomicwrites>=0.1.7'
'atomicwrites>=0.1.7',
milksnake
]
def build_native(spec):
build = spec.add_external_build(
cmd=['make', 'rust-ext'],
path='.'
)
spec.add_cffi_module(
module_path='vdirsyncer._native',
dylib=lambda: build.find_dylib(
'vdirsyncer_rustext', in_path='rust/target/release'),
header_filename=lambda: build.find_header(
'vdirsyncer_rustext.h', in_path='rust/target'),
# Rust bug: If thread-local storage is used, this flag is necessary
# (mitsuhiko)
rtld_flags=['NOW', 'NODELETE']
)
class PrintRequirements(Command):
description = 'Prints minimal requirements'
user_options = []
@ -75,7 +95,10 @@ setup(
},
# Build dependencies
setup_requires=['setuptools_scm != 1.12.0'],
setup_requires=[
'setuptools_scm != 1.12.0',
milksnake,
],
# Other
packages=find_packages(exclude=['tests.*', 'tests']),
@ -101,4 +124,7 @@ setup(
'Topic :: Internet',
'Topic :: Utilities',
],
milksnake_tasks=[build_native],
zip_safe=False,
platforms='any'
)


@ -3,9 +3,11 @@
Test suite for vdirsyncer.
'''
import random
import hypothesis.strategies as st
from vdirsyncer.vobject import normalize_item
from vdirsyncer.vobject import Item
import urllib3
import urllib3.exceptions
@ -18,7 +20,7 @@ def blow_up(*a, **kw):
def assert_item_equals(a, b):
assert normalize_item(a) == normalize_item(b)
assert a.hash == b.hash
VCARD_TEMPLATE = u'''BEGIN:VCARD
@ -109,3 +111,10 @@ uid_strategy = st.text(
)),
min_size=1
).filter(lambda x: x.strip() == x)
def format_item(uid=None, item_template=VCARD_TEMPLATE):
# assert that special chars are handled correctly.
r = random.random()
uid = uid or r
return Item(item_template.format(r=r, uid=uid))


@ -1,14 +1,10 @@
# -*- coding: utf-8 -*-
import random
import uuid
import textwrap
from urllib.parse import quote as urlquote, unquote as urlunquote
import hypothesis.strategies as st
from hypothesis import given
import pytest
from vdirsyncer import exceptions
@ -16,7 +12,7 @@ from vdirsyncer.storage.base import normalize_meta_value
from vdirsyncer.vobject import Item
from .. import EVENT_TEMPLATE, TASK_TEMPLATE, VCARD_TEMPLATE, \
assert_item_equals, normalize_item, printable_characters_strategy
assert_item_equals, format_item
def get_server_mixin(server_name):
@ -25,12 +21,6 @@ def get_server_mixin(server_name):
return x.ServerMixin
def format_item(item_template, uid=None):
# assert that special chars are handled correctly.
r = random.random()
return Item(item_template.format(r=r, uid=uid or r))
class StorageTests(object):
storage_class = None
supports_collections = True
@ -62,7 +52,7 @@ class StorageTests(object):
'VCARD': VCARD_TEMPLATE,
}[item_type]
return lambda **kw: format_item(template, **kw)
return lambda **kw: format_item(item_template=template, **kw)
@pytest.fixture
def requires_collections(self):
@ -297,18 +287,16 @@ class StorageTests(object):
assert rv == x
assert isinstance(rv, str)
@given(value=st.one_of(
st.none(),
printable_characters_strategy
))
@pytest.mark.parametrize('value', [
'fööbör',
'ананасовое перо'
])
def test_metadata_normalization(self, requires_metadata, s, value):
x = s.get_meta('displayname')
assert x == normalize_meta_value(x)
if not getattr(self, 'dav_server', None):
# ownCloud replaces "" with "unnamed"
s.set_meta('displayname', value)
assert s.get_meta('displayname') == normalize_meta_value(value)
s.set_meta('displayname', value)
assert s.get_meta('displayname') == normalize_meta_value(value)
def test_recurring_events(self, s, item_type):
if item_type != 'VEVENT':
@ -354,4 +342,60 @@ class StorageTests(object):
href, etag = s.upload(item)
item2, etag2 = s.get(href)
assert normalize_item(item) == normalize_item(item2)
assert item2.raw.count('BEGIN:VEVENT') == 2
assert 'RRULE' in item2.raw
def test_buffered(self, get_storage_args, get_item, requires_collections):
args = get_storage_args()
s1 = self.storage_class(**args)
s2 = self.storage_class(**args)
s1.upload(get_item())
assert sorted(list(s1.list())) == sorted(list(s2.list()))
s1.buffered()
s1.upload(get_item())
s1.flush()
assert sorted(list(s1.list())) == sorted(list(s2.list()))
def test_retain_timezones(self, item_type, s):
if item_type != 'VEVENT':
pytest.skip('This storage instance doesn\'t support iCalendar.')
item = Item(textwrap.dedent('''
BEGIN:VCALENDAR
PRODID:-//ownCloud calendar v1.4.0
VERSION:2.0
CALSCALE:GREGORIAN
BEGIN:VEVENT
CREATED:20161004T110533
DTSTAMP:20161004T110533
LAST-MODIFIED:20161004T110533
UID:y2lmgz48mg
SUMMARY:Test
CLASS:PUBLIC
STATUS:CONFIRMED
DTSTART;TZID=Europe/Berlin:20161014T101500
DTEND;TZID=Europe/Berlin:20161014T114500
END:VEVENT
BEGIN:VTIMEZONE
TZID:Europe/Berlin
BEGIN:DAYLIGHT
DTSTART:20160327T030000
TZNAME:CEST
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
END:DAYLIGHT
BEGIN:STANDARD
DTSTART:20161030T020000
TZNAME:CET
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
END:STANDARD
END:VTIMEZONE
END:VCALENDAR
''').strip())
href, etag = s.upload(item)
item2, _ = s.get(href)
assert 'VTIMEZONE' in item2.raw
assert item2.hash == item.hash


@ -6,14 +6,8 @@ import os
import pytest
import requests
import requests.exceptions
from tests import assert_item_equals
from vdirsyncer import exceptions
from vdirsyncer.vobject import Item
from .. import StorageTests, get_server_mixin
@ -24,14 +18,6 @@ ServerMixin = get_server_mixin(dav_server)
class DAVStorageTests(ServerMixin, StorageTests):
dav_server = dav_server
@pytest.mark.skipif(dav_server == 'radicale',
reason='Radicale is very tolerant.')
def test_dav_broken_item(self, s):
item = Item(u'HAHA:YES')
with pytest.raises((exceptions.Error, requests.exceptions.HTTPError)):
s.upload(item)
assert not list(s.list())
def test_dav_empty_get_multi_performance(self, s, monkeypatch):
def breakdown(*a, **kw):
raise AssertionError('Expected not to be called.')


@ -28,7 +28,7 @@ class TestCalDAVStorage(DAVStorageTests):
s = self.storage_class(item_types=(item_type,), **get_storage_args())
try:
s.upload(format_item(VCARD_TEMPLATE))
s.upload(format_item(item_template=VCARD_TEMPLATE))
except (exceptions.Error, requests.exceptions.HTTPError):
pass
assert not list(s.list())
@ -64,7 +64,7 @@ class TestCalDAVStorage(DAVStorageTests):
s = self.storage_class(start_date=start_date, end_date=end_date,
**get_storage_args())
too_old_item = format_item(dedent(u'''
too_old_item = format_item(item_template=dedent(u'''
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//hacksw/handcal//NONSGML v1.0//EN
@ -78,7 +78,7 @@ class TestCalDAVStorage(DAVStorageTests):
END:VCALENDAR
''').strip())
too_new_item = format_item(dedent(u'''
too_new_item = format_item(item_template=dedent(u'''
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//hacksw/handcal//NONSGML v1.0//EN
@ -92,7 +92,7 @@ class TestCalDAVStorage(DAVStorageTests):
END:VCALENDAR
''').strip())
good_item = format_item(dedent(u'''
good_item = format_item(item_template=dedent(u'''
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//hacksw/handcal//NONSGML v1.0//EN
@ -136,8 +136,8 @@ class TestCalDAVStorage(DAVStorageTests):
@pytest.mark.skipif(dav_server == 'icloud',
reason='iCloud only accepts VEVENT')
def test_item_types_general(self, s):
event = s.upload(format_item(EVENT_TEMPLATE))[0]
task = s.upload(format_item(TASK_TEMPLATE))[0]
event = s.upload(format_item(item_template=EVENT_TEMPLATE))[0]
task = s.upload(format_item(item_template=TASK_TEMPLATE))[0]
s.item_types = ('VTODO', 'VEVENT')
def l():

@ -1 +0,0 @@
Subproject commit 6c8c379f1ee8bf4ab0ac54fc4eec3e4a6349c237


@ -10,13 +10,15 @@ try:
'url': 'https://brutus.lostpackets.de/davical-test/caldav.php/',
}
except KeyError as e:
pytestmark = pytest.mark.skip('Missing envkey: {}'.format(str(e)))
caldav_args = None
@pytest.mark.flaky(reruns=5)
class ServerMixin(object):
@pytest.fixture
def davical_args(self):
if caldav_args is None:
pytest.skip('Missing envkeys for davical')
if self.storage_class.fileext == '.ics':
return dict(caldav_args)
elif self.storage_class.fileext == '.vcf':


@ -1 +0,0 @@
mysteryshack


@ -1,75 +0,0 @@
# -*- coding: utf-8 -*-
import os
import subprocess
import time
import shutil
import pytest
import requests
testserver_repo = os.path.dirname(__file__)
make_sh = os.path.abspath(os.path.join(testserver_repo, 'make.sh'))
def wait():
for i in range(100):
try:
requests.get('http://127.0.0.1:6767/', verify=False)
except Exception as e:
# Don't know exact exception class, don't care.
# Also, https://github.com/kennethreitz/requests/issues/2192
if 'connection refused' not in str(e).lower():
raise
time.sleep(2 ** i)
else:
return True
return False
class ServerMixin(object):
@pytest.fixture(scope='session')
def setup_mysteryshack_server(self, xprocess):
def preparefunc(cwd):
return wait, ['sh', make_sh, 'testserver']
subprocess.check_call(['sh', make_sh, 'testserver-config'])
xprocess.ensure('mysteryshack_server', preparefunc)
return subprocess.check_output([
os.path.join(
testserver_repo,
'mysteryshack/target/debug/mysteryshack'
),
'-c', '/tmp/mysteryshack/config',
'user',
'authorize',
'testuser',
'https://example.com',
self.storage_class.scope + ':rw'
]).strip().decode()
@pytest.fixture
def get_storage_args(self, monkeypatch, setup_mysteryshack_server):
from requests import Session
monkeypatch.setitem(os.environ, 'OAUTHLIB_INSECURE_TRANSPORT', 'true')
old_request = Session.request
def request(self, method, url, **kw):
url = url.replace('https://', 'http://')
return old_request(self, method, url, **kw)
monkeypatch.setattr(Session, 'request', request)
shutil.rmtree('/tmp/mysteryshack/testuser/data', ignore_errors=True)
shutil.rmtree('/tmp/mysteryshack/testuser/meta', ignore_errors=True)
def inner(**kw):
kw['account'] = 'testuser@127.0.0.1:6767'
kw['access_token'] = setup_mysteryshack_server
if self.storage_class.fileext == '.ics':
kw.setdefault('collection', 'test')
return kw
return inner


@ -1,18 +0,0 @@
#!/bin/sh
set -ex
cd "$(dirname "$0")"
. ./variables.sh
if [ "$CI" = "true" ]; then
curl -sL https://static.rust-lang.org/rustup.sh -o ~/rust-installer/rustup.sh
sh ~/rust-installer/rustup.sh --prefix=~/rust --spec=stable -y --disable-sudo 2> /dev/null
fi
if [ ! -d mysteryshack ]; then
git clone https://github.com/untitaker/mysteryshack
fi
pip install pytest-xprocess
cd mysteryshack
make debug-build # such that first test doesn't hang too long w/o output


@ -1,9 +0,0 @@
#!/bin/sh
set -e
# pytest-xprocess doesn't allow us to CD into a particular directory before
# launching a command, so we do it here.
cd "$(dirname "$0")"
. ./variables.sh
cd mysteryshack
exec make "$@"


@ -1 +0,0 @@
export PATH="$PATH:$HOME/.cargo/bin/"

@ -1 +0,0 @@
Subproject commit a27144ddcf39a3283179a4f7ce1ab22b2e810205


@ -0,0 +1,29 @@
import os
import requests
import pytest
port = os.environ.get('NEXTCLOUD_HOST', None) or 'localhost:8080'
user = os.environ.get('NEXTCLOUD_USER', None) or 'asdf'
pwd = os.environ.get('NEXTCLOUD_PASS', None) or 'asdf'
class ServerMixin(object):
storage_class = None
wsgi_teardown = None
@pytest.fixture
def get_storage_args(self, item_type,
slow_create_collection):
def inner(collection='test'):
args = {
'username': user,
'password': pwd,
'url': 'http://{}/remote.php/dav/'.format(port)
}
if collection is not None:
args = slow_create_collection(self.storage_class, args,
collection)
return args
return inner

@ -1 +0,0 @@
Subproject commit bb4fcc6f524467d58c95f1dcec8470fdfcd65adf


@ -5,9 +5,9 @@ import subprocess
import pytest
from vdirsyncer.storage.filesystem import FilesystemStorage
from vdirsyncer.vobject import Item
from . import StorageTests
from tests import format_item
class TestFilesystemStorage(StorageTests):
@ -42,13 +42,13 @@ class TestFilesystemStorage(StorageTests):
def test_ident_with_slash(self, tmpdir):
s = self.storage_class(str(tmpdir), '.txt')
s.upload(Item(u'UID:a/b/c'))
s.upload(format_item('a/b/c'))
item_file, = tmpdir.listdir()
assert '/' not in item_file.basename and item_file.isfile()
def test_too_long_uid(self, tmpdir):
s = self.storage_class(str(tmpdir), '.txt')
item = Item(u'UID:' + u'hue' * 600)
item = format_item('hue' * 600)
href, etag = s.upload(item)
assert item.uid not in href
@ -60,7 +60,7 @@ class TestFilesystemStorage(StorageTests):
monkeypatch.setattr(subprocess, 'call', check_call_mock)
s = self.storage_class(str(tmpdir), '.txt', post_hook=None)
s.upload(Item(u'UID:a/b/c'))
s.upload(format_item('a/b/c'))
def test_post_hook_active(self, tmpdir, monkeypatch):
@ -75,7 +75,7 @@ class TestFilesystemStorage(StorageTests):
monkeypatch.setattr(subprocess, 'call', check_call_mock)
s = self.storage_class(str(tmpdir), '.txt', post_hook=exe)
s.upload(Item(u'UID:a/b/c'))
s.upload(format_item('a/b/c'))
assert calls
def test_ignore_git_dirs(self, tmpdir):


@ -4,10 +4,9 @@ import pytest
from requests import Response
from tests import normalize_item
from vdirsyncer.exceptions import UserError
from vdirsyncer.storage.http import HttpStorage, prepare_auth
from vdirsyncer.vobject import Item
def test_list(monkeypatch):
@ -56,9 +55,9 @@ def test_list(monkeypatch):
item, etag2 = s.get(href)
assert item.uid is not None
assert etag2 == etag
found_items[normalize_item(item)] = href
found_items[item.hash] = href
expected = set(normalize_item(u'BEGIN:VCALENDAR\n' + x + '\nEND:VCALENDAR')
expected = set(Item(u'BEGIN:VCALENDAR\n' + x + '\nEND:VCALENDAR').hash
for x in items)
assert set(found_items) == expected
@ -67,7 +66,7 @@ def test_list(monkeypatch):
item, etag2 = s.get(href)
assert item.uid is not None
assert etag2 == etag
assert found_items[normalize_item(item)] == href
assert found_items[item.hash] == href
def test_readonly_param():


@ -1,9 +1,7 @@
import pytest
import json
from textwrap import dedent
import hypothesis.strategies as st
from hypothesis import given
from vdirsyncer import exceptions
from vdirsyncer.storage.base import Storage
@ -176,7 +174,8 @@ def test_null_collection_with_named_collection(tmpdir, runner):
assert 'HAHA' in bar.read()
@given(a_requires=st.booleans(), b_requires=st.booleans())
@pytest.mark.parametrize('a_requires,b_requires',
[(x, y) for x in (0, 1) for y in (0, 1)])
def test_collection_required(a_requires, b_requires, tmpdir, runner,
monkeypatch):


@ -62,7 +62,7 @@ def test_repair_uids(storage, runner, repair_uids):
assert 'UID or href is unsafe, assigning random UID' in result.output
assert not f.exists()
new_f, = storage.listdir()
s = new_f.read()
s = new_f.read().strip()
assert s.startswith('BEGIN:VCARD')
assert s.endswith('END:VCARD')


@ -4,11 +4,10 @@ import json
import sys
from textwrap import dedent
import hypothesis.strategies as st
from hypothesis import example, given
import pytest
from tests import format_item
def test_simple_run(tmpdir, runner):
runner.write_with_general(dedent('''
@ -37,10 +36,12 @@ def test_simple_run(tmpdir, runner):
result = runner.invoke(['sync'])
assert not result.exception
tmpdir.join('path_a/haha.txt').write('UID:haha')
item = format_item('haha')
tmpdir.join('path_a/haha.txt').write(item.raw)
result = runner.invoke(['sync'])
assert 'Copying (uploading) item haha to my_b' in result.output
assert tmpdir.join('path_b/haha.txt').read() == 'UID:haha'
assert tmpdir.join('path_b/haha.txt').read().splitlines() == \
item.raw.splitlines()
def test_sync_inexistant_pair(tmpdir, runner):
@ -109,7 +110,8 @@ def test_empty_storage(tmpdir, runner):
result = runner.invoke(['sync'])
assert not result.exception
tmpdir.join('path_a/haha.txt').write('UID:haha')
item = format_item('haha')
tmpdir.join('path_a/haha.txt').write(item.raw)
result = runner.invoke(['sync'])
assert not result.exception
tmpdir.join('path_b/haha.txt').remove()
@ -152,7 +154,7 @@ def test_collections_cache_invalidation(tmpdir, runner):
collections = ["a", "b", "c"]
''').format(str(tmpdir)))
foo.join('a/itemone.txt').write('UID:itemone')
foo.join('a/itemone.txt').write(format_item('itemone').raw)
result = runner.invoke(['discover'])
assert not result.exception
@ -271,25 +273,13 @@ def test_multiple_pairs(tmpdir, runner):
# XXX: https://github.com/pimutils/vdirsyncer/issues/617
@pytest.mark.skipif(sys.platform == 'darwin',
reason='This test inexplicably fails')
@given(collections=st.sets(
st.text(
st.characters(
blacklist_characters=set(
u'./\x00' # Invalid chars on POSIX filesystems
),
# Surrogates can't be encoded to utf-8 in Python
blacklist_categories=set(['Cs'])
),
min_size=1,
max_size=50
),
min_size=1
))
@example(collections=[u'persönlich'])
@example(collections={'a', 'A'})
@example(collections={'\ufffe'})
@pytest.mark.xfail(sys.platform == 'darwin',
reason='This test inexplicably fails')
@pytest.mark.parametrize('collections', [
{'persönlich'},
{'a', 'A'},
{'\ufffe'},
])
def test_create_collections(subtest, collections):
@subtest
@ -347,9 +337,10 @@ def test_ident_conflict(tmpdir, runner):
foo = tmpdir.mkdir('foo')
tmpdir.mkdir('bar')
foo.join('one.txt').write('UID:1')
foo.join('two.txt').write('UID:1')
foo.join('three.txt').write('UID:1')
item = format_item('1')
foo.join('one.txt').write(item.raw)
foo.join('two.txt').write(item.raw)
foo.join('three.txt').write(item.raw)
result = runner.invoke(['discover'])
assert not result.exception
@ -403,17 +394,16 @@ def test_no_configured_pairs(tmpdir, runner, cmd):
assert result.exception.code == 5
@pytest.mark.parametrize('resolution,expect_foo,expect_bar', [
(['command', 'cp'], 'UID:lol\nfööcontent', 'UID:lol\nfööcontent')
])
def test_conflict_resolution(tmpdir, runner, resolution, expect_foo,
expect_bar):
def test_conflict_resolution(tmpdir, runner):
item_a = format_item('lol')
item_b = format_item('lol')
runner.write_with_general(dedent('''
[pair foobar]
a = "foo"
b = "bar"
collections = null
conflict_resolution = {val}
conflict_resolution = ["command", "cp"]
[storage foo]
type = "filesystem"
@ -424,14 +414,14 @@ def test_conflict_resolution(tmpdir, runner, resolution, expect_foo,
type = "filesystem"
fileext = ".txt"
path = "{base}/bar"
'''.format(base=str(tmpdir), val=json.dumps(resolution))))
'''.format(base=str(tmpdir))))
foo = tmpdir.join('foo')
bar = tmpdir.join('bar')
fooitem = foo.join('lol.txt').ensure()
fooitem.write('UID:lol\nfööcontent')
fooitem.write(item_a.raw)
baritem = bar.join('lol.txt').ensure()
baritem.write('UID:lol\nbööcontent')
baritem.write(item_b.raw)
r = runner.invoke(['discover'])
assert not r.exception
@ -439,8 +429,8 @@ def test_conflict_resolution(tmpdir, runner, resolution, expect_foo,
r = runner.invoke(['sync'])
assert not r.exception
assert fooitem.read() == expect_foo
assert baritem.read() == expect_bar
assert fooitem.read().splitlines() == item_a.raw.splitlines()
assert baritem.read().splitlines() == item_a.raw.splitlines()
@pytest.mark.parametrize('partial_sync', ['error', 'ignore', 'revert', None])
@ -471,11 +461,12 @@ def test_partial_sync(tmpdir, runner, partial_sync):
foo = tmpdir.mkdir('foo')
bar = tmpdir.mkdir('bar')
foo.join('other.txt').write('UID:other')
bar.join('other.txt').write('UID:other')
item = format_item('other')
foo.join('other.txt').write(item.raw)
bar.join('other.txt').write(item.raw)
baritem = bar.join('lol.txt')
baritem.write('UID:lol')
baritem.write(format_item('lol').raw)
r = runner.invoke(['discover'])
assert not r.exception


@ -1,6 +1,9 @@
import os
import pytest
from io import StringIO
from textwrap import dedent
from vdirsyncer.cli.config import _resolve_conflict_via_command
from vdirsyncer.cli.config import Config, _resolve_conflict_via_command
from vdirsyncer.vobject import Item
@ -22,3 +25,26 @@ def test_conflict_resolution_command():
a, b, ['~/command'], 'a', 'b',
_check_call=check_call
).raw == a.raw
def test_config_reader_invalid_collections():
s = StringIO(dedent('''
[general]
status_path = "foo"
[storage foo]
type = "memory"
[storage bar]
type = "memory"
[pair foobar]
a = "foo"
b = "bar"
collections = [["a", "b", "c", "d"]]
''').strip())
with pytest.raises(ValueError) as excinfo:
Config.from_fileobject(s)
assert 'Expected list of format' in str(excinfo.value)


@ -8,7 +8,7 @@ from hypothesis.stateful import Bundle, RuleBasedStateMachine, rule
import pytest
from tests import blow_up, uid_strategy
from tests import blow_up, format_item, uid_strategy
from vdirsyncer.storage.memory import MemoryStorage, _random_string
from vdirsyncer.sync import sync as _sync
@ -49,7 +49,7 @@ def test_missing_status():
a = MemoryStorage()
b = MemoryStorage()
status = {}
item = Item(u'asdf')
item = format_item('asdf')
a.upload(item)
b.upload(item)
sync(a, b, status)
@ -62,8 +62,8 @@ def test_missing_status_and_different_items():
b = MemoryStorage()
status = {}
item1 = Item(u'UID:1\nhaha')
item2 = Item(u'UID:1\nhoho')
item1 = format_item('1')
item2 = format_item('1')
a.upload(item1)
b.upload(item2)
with pytest.raises(SyncConflict):
@ -79,8 +79,8 @@ def test_read_only_and_prefetch():
b.read_only = True
status = {}
item1 = Item(u'UID:1\nhaha')
item2 = Item(u'UID:2\nhoho')
item1 = format_item('1')
item2 = format_item('2')
a.upload(item1)
a.upload(item2)
@ -95,7 +95,8 @@ def test_partial_sync_error():
b = MemoryStorage()
status = {}
a.upload(Item('UID:0'))
item = format_item('0')
a.upload(item)
b.read_only = True
with pytest.raises(PartialSync):
@ -107,13 +108,13 @@ def test_partial_sync_ignore():
b = MemoryStorage()
status = {}
item0 = Item('UID:0\nhehe')
item0 = format_item('0')
a.upload(item0)
b.upload(item0)
b.read_only = True
item1 = Item('UID:1\nhaha')
item1 = format_item('1')
a.upload(item1)
sync(a, b, status, partial_sync='ignore')
@ -128,23 +129,25 @@ def test_partial_sync_ignore2():
b = MemoryStorage()
status = {}
href, etag = a.upload(Item('UID:0'))
item = format_item('0')
href, etag = a.upload(item)
a.read_only = True
sync(a, b, status, partial_sync='ignore', force_delete=True)
assert items(b) == items(a) == {'UID:0'}
assert items(b) == items(a) == {item.raw}
b.items.clear()
sync(a, b, status, partial_sync='ignore', force_delete=True)
sync(a, b, status, partial_sync='ignore', force_delete=True)
assert items(a) == {'UID:0'}
assert items(a) == {item.raw}
assert not b.items
a.read_only = False
a.update(href, Item('UID:0\nupdated'), etag)
new_item = format_item('0')
a.update(href, new_item, etag)
a.read_only = True
sync(a, b, status, partial_sync='ignore', force_delete=True)
assert items(b) == items(a) == {'UID:0\nupdated'}
assert items(b) == items(a) == {new_item.raw}
def test_upload_and_update():
@ -152,22 +155,22 @@ def test_upload_and_update():
b = MemoryStorage(fileext='.b')
status = {}
item = Item(u'UID:1') # new item 1 in a
item = format_item('1') # new item 1 in a
a.upload(item)
sync(a, b, status)
assert items(b) == items(a) == {item.raw}
item = Item(u'UID:1\nASDF:YES') # update of item 1 in b
item = format_item('1') # update of item 1 in b
b.update('1.b', item, b.get('1.b')[1])
sync(a, b, status)
assert items(b) == items(a) == {item.raw}
item2 = Item(u'UID:2') # new item 2 in b
item2 = format_item('2') # new item 2 in b
b.upload(item2)
sync(a, b, status)
assert items(b) == items(a) == {item.raw, item2.raw}
item2 = Item(u'UID:2\nASDF:YES') # update of item 2 in a
item2 = format_item('2') # update of item 2 in a
a.update('2.a', item2, a.get('2.a')[1])
sync(a, b, status)
assert items(b) == items(a) == {item.raw, item2.raw}
@ -178,9 +181,9 @@ def test_deletion():
b = MemoryStorage(fileext='.b')
status = {}
item = Item(u'UID:1')
item = format_item('1')
a.upload(item)
item2 = Item(u'UID:2')
item2 = format_item('2')
a.upload(item2)
sync(a, b, status)
b.delete('1.b', b.get('1.b')[1])
@ -200,14 +203,14 @@ def test_insert_hash():
b = MemoryStorage()
status = {}
item = Item('UID:1')
item = format_item('1')
href, etag = a.upload(item)
sync(a, b, status)
for d in status['1']:
del d['hash']
a.update(href, Item('UID:1\nHAHA:YES'), etag)
a.update(href, format_item('1'), etag) # new item content
sync(a, b, status)
assert 'hash' in status['1'][0] and 'hash' in status['1'][1]
@ -215,7 +218,7 @@ def test_insert_hash():
def test_already_synced():
a = MemoryStorage(fileext='.a')
b = MemoryStorage(fileext='.b')
item = Item(u'UID:1')
item = format_item('1')
a.upload(item)
b.upload(item)
status = {
@ -243,14 +246,14 @@ def test_already_synced():
def test_conflict_resolution_both_etags_new(winning_storage):
a = MemoryStorage()
b = MemoryStorage()
item = Item(u'UID:1')
item = format_item('1')
href_a, etag_a = a.upload(item)
href_b, etag_b = b.upload(item)
status = {}
sync(a, b, status)
assert status
item_a = Item(u'UID:1\nitem a')
item_b = Item(u'UID:1\nitem b')
item_a = format_item('1')
item_b = format_item('1')
a.update(href_a, item_a, etag_a)
b.update(href_b, item_b, etag_b)
with pytest.raises(SyncConflict):
@ -264,13 +267,14 @@ def test_conflict_resolution_both_etags_new(winning_storage):
def test_updated_and_deleted():
a = MemoryStorage()
b = MemoryStorage()
href_a, etag_a = a.upload(Item(u'UID:1'))
item = format_item('1')
href_a, etag_a = a.upload(item)
status = {}
sync(a, b, status, force_delete=True)
(href_b, etag_b), = b.list()
b.delete(href_b, etag_b)
updated = Item(u'UID:1\nupdated')
updated = format_item('1')
a.update(href_a, updated, etag_a)
sync(a, b, status, force_delete=True)
@ -280,8 +284,8 @@ def test_updated_and_deleted():
def test_conflict_resolution_invalid_mode():
a = MemoryStorage()
b = MemoryStorage()
item_a = Item(u'UID:1\nitem a')
item_b = Item(u'UID:1\nitem b')
item_a = format_item('1')
item_b = format_item('1')
a.upload(item_a)
b.upload(item_b)
with pytest.raises(ValueError):
@ -291,7 +295,7 @@ def test_conflict_resolution_invalid_mode():
def test_conflict_resolution_new_etags_without_changes():
a = MemoryStorage()
b = MemoryStorage()
item = Item(u'UID:1')
item = format_item('1')
href_a, etag_a = a.upload(item)
href_b, etag_b = b.upload(item)
status = {'1': (href_a, 'BOGUS_a', href_b, 'BOGUS_b')}
@ -326,7 +330,7 @@ def test_uses_get_multi(monkeypatch):
a = MemoryStorage()
b = MemoryStorage()
item = Item(u'UID:1')
item = format_item('1')
expected_href, etag = a.upload(item)
sync(a, b, {})
@ -336,8 +340,8 @@ def test_uses_get_multi(monkeypatch):
def test_empty_storage_dataloss():
a = MemoryStorage()
b = MemoryStorage()
a.upload(Item(u'UID:1'))
a.upload(Item(u'UID:2'))
for i in '12':
a.upload(format_item(i))
status = {}
sync(a, b, status)
with pytest.raises(StorageEmpty):
@ -350,22 +354,24 @@ def test_empty_storage_dataloss():
def test_no_uids():
a = MemoryStorage()
b = MemoryStorage()
a.upload(Item(u'ASDF'))
b.upload(Item(u'FOOBAR'))
item_a = format_item('')
item_b = format_item('')
a.upload(item_a)
b.upload(item_b)
status = {}
sync(a, b, status)
assert items(a) == items(b) == {u'ASDF', u'FOOBAR'}
assert items(a) == items(b) == {item_a.raw, item_b.raw}
def test_changed_uids():
a = MemoryStorage()
b = MemoryStorage()
href_a, etag_a = a.upload(Item(u'UID:A-ONE'))
href_b, etag_b = b.upload(Item(u'UID:B-ONE'))
href_a, etag_a = a.upload(format_item('a1'))
href_b, etag_b = b.upload(format_item('b1'))
status = {}
sync(a, b, status)
a.update(href_a, Item(u'UID:A-TWO'), etag_a)
a.update(href_a, format_item('a2'), etag_a)
sync(a, b, status)
@ -383,34 +389,37 @@ def test_partial_sync_revert():
a = MemoryStorage(instance_name='a')
b = MemoryStorage(instance_name='b')
status = {}
a.upload(Item(u'UID:1'))
b.upload(Item(u'UID:2'))
item1 = format_item('1')
item2 = format_item('2')
a.upload(item1)
b.upload(item2)
b.read_only = True
sync(a, b, status, partial_sync='revert')
assert len(status) == 2
assert items(a) == {'UID:1', 'UID:2'}
assert items(b) == {'UID:2'}
assert items(a) == {item1.raw, item2.raw}
assert items(b) == {item2.raw}
sync(a, b, status, partial_sync='revert')
assert len(status) == 1
assert items(a) == {'UID:2'}
assert items(b) == {'UID:2'}
assert items(a) == {item2.raw}
assert items(b) == {item2.raw}
# Check that updates get reverted
a.items[next(iter(a.items))] = ('foo', Item('UID:2\nupdated'))
assert items(a) == {'UID:2\nupdated'}
item2_up = format_item('2')
a.items[next(iter(a.items))] = ('foo', item2_up)
assert items(a) == {item2_up.raw}
sync(a, b, status, partial_sync='revert')
assert len(status) == 1
assert items(a) == {'UID:2\nupdated'}
assert items(a) == {item2_up.raw}
sync(a, b, status, partial_sync='revert')
assert items(a) == {'UID:2'}
assert items(a) == {item2.raw}
# Check that deletions get reverted
a.items.clear()
sync(a, b, status, partial_sync='revert', force_delete=True)
sync(a, b, status, partial_sync='revert', force_delete=True)
assert items(a) == {'UID:2'}
assert items(a) == {item2.raw}
@pytest.mark.parametrize('sync_inbetween', (True, False))
@ -418,13 +427,16 @@ def test_ident_conflict(sync_inbetween):
a = MemoryStorage()
b = MemoryStorage()
status = {}
href_a, etag_a = a.upload(Item(u'UID:aaa'))
href_b, etag_b = a.upload(Item(u'UID:bbb'))
item_a = format_item('aaa')
item_b = format_item('bbb')
href_a, etag_a = a.upload(item_a)
href_b, etag_b = a.upload(item_b)
if sync_inbetween:
sync(a, b, status)
a.update(href_a, Item(u'UID:xxx'), etag_a)
a.update(href_b, Item(u'UID:xxx'), etag_b)
item_x = format_item('xxx')
a.update(href_a, item_x, etag_a)
a.update(href_b, item_x, etag_b)
with pytest.raises(IdentConflict):
sync(a, b, status)
@ -441,7 +453,8 @@ def test_moved_href():
a = MemoryStorage()
b = MemoryStorage()
status = {}
href, etag = a.upload(Item(u'UID:haha'))
item = format_item('haha')
href, etag = a.upload(item)
sync(a, b, status)
b.items['lol'] = b.items.pop('haha')
@ -454,7 +467,7 @@ def test_moved_href():
sync(a, b, status)
assert len(status) == 1
assert items(a) == items(b) == {'UID:haha'}
assert items(a) == items(b) == {item.raw}
assert status['haha'][1]['href'] == 'lol'
old_status = deepcopy(status)
@ -463,7 +476,7 @@ def test_moved_href():
sync(a, b, status)
assert old_status == status
assert items(a) == items(b) == {'UID:haha'}
assert items(a) == items(b) == {item.raw}
def test_bogus_etag_change():
@ -476,26 +489,31 @@ def test_bogus_etag_change():
a = MemoryStorage()
b = MemoryStorage()
status = {}
href_a, etag_a = a.upload(Item(u'UID:ASDASD'))
sync(a, b, status)
assert len(status) == len(list(a.list())) == len(list(b.list())) == 1
item = format_item('ASDASD')
href_a, etag_a = a.upload(item)
sync(a, b, status)
assert len(status) == 1
assert items(a) == items(b) == {item.raw}
new_item = format_item('ASDASD')
(href_b, etag_b), = b.list()
a.update(href_a, Item(u'UID:ASDASD'), etag_a)
b.update(href_b, Item(u'UID:ASDASD\nACTUALCHANGE:YES'), etag_b)
a.update(href_a, item, etag_a)
b.update(href_b, new_item, etag_b)
b.delete = b.update = b.upload = blow_up
sync(a, b, status)
assert len(status) == 1
assert items(a) == items(b) == {u'UID:ASDASD\nACTUALCHANGE:YES'}
assert items(a) == items(b) == {new_item.raw}
def test_unicode_hrefs():
a = MemoryStorage()
b = MemoryStorage()
status = {}
href, etag = a.upload(Item(u'UID:äää'))
item = format_item('äää')
href, etag = a.upload(item)
sync(a, b, status)
@ -565,7 +583,7 @@ class SyncMachine(RuleBasedStateMachine):
uid=uid_strategy,
etag=st.text())
def upload(self, storage, uid, etag):
item = Item(u'UID:{}'.format(uid))
item = Item('BEGIN:VCARD\r\nUID:{}\r\nEND:VCARD'.format(uid))
storage.items[uid] = (etag, item)
@rule(storage=Storage, href=st.text())
@ -643,8 +661,8 @@ def test_rollback(error_callback):
b = MemoryStorage()
status = {}
a.items['0'] = ('', Item('UID:0'))
b.items['1'] = ('', Item('UID:1'))
a.items['0'] = ('', format_item('0'))
b.items['1'] = ('', format_item('1'))
b.upload = b.update = b.delete = action_failure
@ -668,7 +686,7 @@ def test_duplicate_hrefs():
a = MemoryStorage()
b = MemoryStorage()
a.list = lambda: [('a', 'a')] * 3
a.items['a'] = ('a', Item('UID:a'))
a.items['a'] = ('a', format_item('a'))
status = {}
sync(a, b, status)


@ -2,7 +2,7 @@ from vdirsyncer import exceptions
def test_user_error_problems():
e = exceptions.UserError('A few problems occured', problems=[
e = exceptions.UserError('A few problems occurred', problems=[
'Problem one',
'Problem two',
'Problem three'
@ -11,4 +11,4 @@ def test_user_error_problems():
assert 'one' in str(e)
assert 'two' in str(e)
assert 'three' in str(e)
assert 'problems occured' in str(e)
assert 'problems occurred' in str(e)


@ -38,7 +38,7 @@ def test_repair_uids(uid):
@settings(perform_health_check=False) # Using the random module for UIDs
def test_repair_unsafe_uids(uid):
s = MemoryStorage()
item = Item(u'BEGIN:VCARD\nUID:{}\nEND:VCARD'.format(uid))
item = Item(u'BEGIN:VCARD\nUID:123\nEND:VCARD').with_uid(uid)
href, etag = s.upload(item)
assert s.get(href)[0].uid == uid
assert not href_safe(uid)


@ -9,12 +9,23 @@ from hypothesis.stateful import Bundle, RuleBasedStateMachine, rule
import pytest
from tests import BARE_EVENT_TEMPLATE, EVENT_TEMPLATE, \
EVENT_WITH_TIMEZONE_TEMPLATE, VCARD_TEMPLATE, normalize_item, \
EVENT_WITH_TIMEZONE_TEMPLATE, VCARD_TEMPLATE, \
uid_strategy
import vdirsyncer.vobject as vobject
@pytest.fixture
def check_roundtrip(benchmark):
def inner(split):
joined = benchmark(lambda: vobject.join_collection(split))
split2 = benchmark(lambda: list(vobject.split_collection(joined)))
assert [vobject.Item(item).hash for item in split] == \
[vobject.Item(item).hash for item in split2]
return inner
_simple_split = [
VCARD_TEMPLATE.format(r=123, uid=123),
VCARD_TEMPLATE.format(r=345, uid=345),
@ -28,11 +39,13 @@ _simple_joined = u'\r\n'.join(
)
def test_split_collection_simple(benchmark):
def test_split_collection_simple(benchmark, check_roundtrip):
check_roundtrip(_simple_split)
given = benchmark(lambda: list(vobject.split_collection(_simple_joined)))
assert [normalize_item(item) for item in given] == \
[normalize_item(item) for item in _simple_split]
assert [vobject.Item(item).hash for item in given] == \
[vobject.Item(item).hash for item in _simple_split]
assert [x.splitlines() for x in given] == \
[x.splitlines() for x in _simple_split]
@ -46,9 +59,10 @@ def test_split_collection_multiple_wrappers(benchmark):
for x in _simple_split
)
given = benchmark(lambda: list(vobject.split_collection(joined)))
check_roundtrip(given)
assert [normalize_item(item) for item in given] == \
[normalize_item(item) for item in _simple_split]
assert [vobject.Item(item).hash for item in given] == \
[vobject.Item(item).hash for item in _simple_split]
assert [x.splitlines() for x in given] == \
[x.splitlines() for x in _simple_split]
@ -56,7 +70,7 @@ def test_split_collection_multiple_wrappers(benchmark):
def test_join_collection_simple(benchmark):
given = benchmark(lambda: vobject.join_collection(_simple_split))
assert normalize_item(given) == normalize_item(_simple_joined)
assert vobject.Item(given).hash == vobject.Item(_simple_joined).hash
assert given.splitlines() == _simple_joined.splitlines()
@ -123,12 +137,12 @@ def test_split_collection_timezones():
[timezone, u'END:VCALENDAR']
)
given = set(normalize_item(item)
given = set(vobject.Item(item).hash
for item in vobject.split_collection(full))
expected = set(
normalize_item(u'\r\n'.join((
vobject.Item(u'\r\n'.join((
u'BEGIN:VCALENDAR', item, timezone, u'END:VCALENDAR'
)))
))).hash
for item in items
)
@ -146,11 +160,11 @@ def test_split_contacts():
with_wrapper.splitlines()
def test_hash_item():
def test_hash_item2():
a = EVENT_TEMPLATE.format(r=1, uid=1)
b = u'\n'.join(line for line in a.splitlines()
if u'PRODID' not in line)
assert vobject.hash_item(a) == vobject.hash_item(b)
assert vobject.Item(a).hash == vobject.Item(b).hash
def test_multiline_uid(benchmark):
@ -223,7 +237,7 @@ def test_replace_uid(template, uid):
item = vobject.Item(template.format(r=123, uid=123)).with_uid(uid)
assert item.uid == uid
if uid:
assert item.raw.count('\nUID:{}'.format(uid)) == 1
assert item.raw.count('\nUID:') == 1
else:
assert '\nUID:' not in item.raw
@ -235,7 +249,7 @@ def test_broken_item():
assert 'Parsing error at line 1' in str(excinfo.value)
item = vobject.Item('END:FOO')
assert item.parsed is None
assert not item.is_parseable
def test_multiple_items():
@ -351,3 +365,88 @@ def test_component_contains():
with pytest.raises(ValueError):
42 in item
def test_hash_item():
item1 = vobject.Item(
'BEGIN:FOO\r\n'
'X-RADICALE-NAME:YES\r\n'
'END:FOO\r\n'
)
item2 = vobject.Item(
'BEGIN:FOO\r\n'
'X-RADICALE-NAME:NO\r\n'
'END:FOO\r\n'
)
assert item1.hash == item2.hash
item2 = vobject.Item(
'BEGIN:FOO\r\n'
'X-RADICALE-NAME:NO\r\n'
'OTHER-PROP:YAY\r\n'
'END:FOO\r\n'
)
assert item1.hash != item2.hash
def test_hash_item_timezones():
item1 = vobject.Item(
'BEGIN:VCALENDAR\r\n'
'HELLO:HAHA\r\n'
'BEGIN:VTIMEZONE\r\n'
'PROP:YES\r\n'
'END:VTIMEZONE\r\n'
'END:VCALENDAR\r\n'
)
item2 = vobject.Item(
'BEGIN:VCALENDAR\r\n'
'HELLO:HAHA\r\n'
'END:VCALENDAR\r\n'
)
assert item1.hash == item2.hash
def test_hash_item_line_wrapping():
item1 = vobject.Item(
'BEGIN:VCALENDAR\r\n'
'PROP:a\r\n'
' b\r\n'
' c\r\n'
'END:VCALENDAR\r\n'
)
item2 = vobject.Item(
'BEGIN:VCALENDAR\r\n'
'PROP:abc\r\n'
'END:VCALENDAR\r\n'
)
assert item1.hash == item2.hash
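The line-wrapping test above relies on iCalendar "folding" (RFC 5545, section 3.1): a long content line may be split across physical lines, each continuation beginning with a space or tab, and hashing must unfold first so folded and unfolded forms hash the same. A minimal illustrative unfolding helper (not the actual implementation):

```python
import re

def unfold(text):
    # Join folded iCalendar lines: a (CR)LF followed by a space or tab
    # continues the previous content line.
    return re.sub(r'\r?\n[ \t]', '', text)

folded = 'BEGIN:VCALENDAR\r\nPROP:a\r\n b\r\n c\r\nEND:VCALENDAR\r\n'
assert unfold(folded) == 'BEGIN:VCALENDAR\r\nPROP:abc\r\nEND:VCALENDAR\r\n'
```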
def test_wrapper_properties(check_roundtrip):
raws = [dedent('''
BEGIN:VCALENDAR
PRODID:-//Google Inc//Google Calendar 70.9054//EN
VERSION:2.0
CALSCALE:GREGORIAN
X-WR-CALNAME:hans.gans@gmail.com
X-WR-TIMEZONE:Europe/Vienna
BEGIN:VEVENT
DTSTART;TZID=Europe/Vienna:20171012T153000
DTEND;TZID=Europe/Vienna:20171012T170000
DTSTAMP:20171009T085029Z
UID:test@test.com
STATUS:CONFIRMED
SUMMARY:Test
TRANSP:OPAQUE
END:VEVENT
END:VCALENDAR
''').strip()]
check_roundtrip(raws)


@ -63,8 +63,9 @@ def _validate_collections_param(collections):
elif isinstance(collection, list):
e = ValueError(
'Expected list of format '
'["config_name", "storage_a_name", "storage_b_name"]'
.format(len(collection)))
'["config_name", "storage_a_name", "storage_b_name"], but '
'found {!r} instead.'
.format(collection))
if len(collection) != 3:
raise e
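The improved message now includes the offending value. A standalone rendition of just this check (mirroring the hunk above; the real `_validate_collections_param` also handles other collection shapes):

```python
def validate_collection_entry(collection):
    # A list-shaped collection entry must name exactly three things:
    # the config name plus one collection name per storage side.
    if isinstance(collection, list) and len(collection) != 3:
        raise ValueError(
            'Expected list of format '
            '["config_name", "storage_a_name", "storage_b_name"], but '
            'found {!r} instead.'.format(collection)
        )

try:
    validate_collection_entry(["a", "b", "c", "d"])
    assert False, 'expected ValueError'
except ValueError as e:
    assert 'Expected list of format' in str(e)
    assert "['a', 'b', 'c', 'd']" in str(e)

validate_collection_entry(["a", "b", "c"])  # exactly three names: accepted
```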


@ -146,9 +146,9 @@ def handle_cli_error(status_name=None, e=None):
import traceback
tb = traceback.format_tb(tb)
if status_name:
msg = 'Unknown error occured for {}'.format(status_name)
msg = 'Unknown error occurred for {}'.format(status_name)
else:
msg = 'Unknown error occured'
msg = 'Unknown error occurred'
msg += ': {}\nUse `-vdebug` to see the full traceback.'.format(e)
@ -244,6 +244,9 @@ def save_status(base_path, pair, collection=None, data_type=None, data=None):
def storage_class_from_config(config):
config = dict(config)
if 'type' not in config:
raise exceptions.UserError('Missing parameter "type"')
storage_name = config.pop('type')
try:
cls = storage_names[storage_name]


@ -79,3 +79,7 @@ class UnsupportedMetadataError(Error, NotImplementedError):
class CollectionRequired(Error):
'''`collection = null` is not allowed.'''
class VobjectParseError(Error, ValueError):
'''The parsed vobject is invalid.'''

vdirsyncer/native.py Normal file

@ -0,0 +1,32 @@
from ._native import ffi, lib
from . import exceptions
def string_rv(c_str):
try:
return ffi.string(c_str).decode('utf-8')
finally:
lib.vdirsyncer_free_str(c_str)
def item_rv(c):
return ffi.gc(c, lib.vdirsyncer_free_item)
def check_error(e):
try:
if e.failed:
msg = ffi.string(e.msg).decode('utf-8')
if msg.startswith('ItemNotFound'):
raise exceptions.NotFoundError(msg)
elif msg.startswith('AlreadyExisting'):
raise exceptions.AlreadyExistingError(msg)
elif msg.startswith('WrongEtag'):
raise exceptions.WrongEtagError(msg)
elif msg.startswith('ItemUnparseable'):
raise ValueError(msg)
else:
raise Exception(msg)
finally:
if e.failed:
lib.vdirsyncer_clear_err(e)
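`check_error` maps Rust-side error strings back onto Python exception types by prefix. A pure-Python emulation of that dispatch, with plain arguments standing in for the cffi `VdirsyncerError *` struct (the real version decodes `e.msg` via `ffi.string` and clears the error afterwards):

```python
class NotFoundError(Exception): pass
class AlreadyExistingError(Exception): pass
class WrongEtagError(Exception): pass

_PREFIXES = [
    ('ItemNotFound', NotFoundError),
    ('AlreadyExisting', AlreadyExistingError),
    ('WrongEtag', WrongEtagError),
    ('ItemUnparseable', ValueError),
]

def check_error(failed, msg):
    # No-op unless the native call flagged a failure.
    if not failed:
        return
    for prefix, exc in _PREFIXES:
        if msg.startswith(prefix):
            raise exc(msg)
    raise Exception(msg)

check_error(False, 'ignored')  # success path does nothing
try:
    check_error(True, 'ItemNotFound: /foo.ics')
    assert False
except NotFoundError:
    pass
```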


@ -40,7 +40,7 @@ def repair_storage(storage, repair_unsafe_uid):
def repair_item(href, item, seen_uids, repair_unsafe_uid):
if item.parsed is None:
if not item.is_parseable:
raise IrreparableItem()
new_item = item


@ -0,0 +1,73 @@
from .. import native
from ..vobject import Item
from functools import partial
class RustStorageMixin:
_native_storage = None
def _native(self, name):
return partial(
getattr(native.lib,
'vdirsyncer_{}_{}'.format(self.storage_name, name)),
self._native_storage
)
def list(self):
e = native.ffi.new('VdirsyncerError *')
listing = self._native('list')(e)
native.check_error(e)
listing = native.ffi.gc(listing,
native.lib.vdirsyncer_free_storage_listing)
while native.lib.vdirsyncer_advance_storage_listing(listing):
href = native.string_rv(
native.lib.vdirsyncer_storage_listing_get_href(listing))
etag = native.string_rv(
native.lib.vdirsyncer_storage_listing_get_etag(listing))
yield href, etag
def get(self, href):
href = href.encode('utf-8')
e = native.ffi.new('VdirsyncerError *')
result = self._native('get')(href, e)
native.check_error(e)
result = native.ffi.gc(result,
native.lib.vdirsyncer_free_storage_get_result)
item = native.item_rv(result.item)
etag = native.string_rv(result.etag)
return Item(None, _native=item), etag
# FIXME: implement get_multi
def upload(self, item):
e = native.ffi.new('VdirsyncerError *')
result = self._native('upload')(item._native, e)
native.check_error(e)
result = native.ffi.gc(
result, native.lib.vdirsyncer_free_storage_upload_result)
href = native.string_rv(result.href)
etag = native.string_rv(result.etag)
return href, etag
def update(self, href, item, etag):
href = href.encode('utf-8')
etag = etag.encode('utf-8')
e = native.ffi.new('VdirsyncerError *')
etag = self._native('update')(href, item._native, etag, e)
native.check_error(e)
return native.string_rv(etag)
def delete(self, href, etag):
href = href.encode('utf-8')
etag = etag.encode('utf-8')
e = native.ffi.new('VdirsyncerError *')
self._native('delete')(href, etag, e)
native.check_error(e)
def buffered(self):
self._native('buffered')()
def flush(self):
e = native.ffi.new('VdirsyncerError *')
self._native('flush')(e)
native.check_error(e)
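`_native('list')` is just name mangling plus partial application: for a storage with `storage_name = 'singlefile'` it resolves `vdirsyncer_singlefile_list` on the FFI library and binds the native storage pointer as the first argument. A sketch with a stand-in for `native.lib` (the fake function and return value here are illustrative):

```python
from functools import partial
from types import SimpleNamespace

# Stand-in for native.lib; the real attribute is a cffi-generated function.
fake_lib = SimpleNamespace(
    vdirsyncer_singlefile_list=lambda storage, err: ('listing for', storage)
)

class Demo:
    storage_name = 'singlefile'
    _native_storage = '<native ptr>'

    def _native(self, name):
        # Same pattern as RustStorageMixin._native above.
        return partial(
            getattr(fake_lib,
                    'vdirsyncer_{}_{}'.format(self.storage_name, name)),
            self._native_storage,
        )

d = Demo()
assert d._native('list')(None) == ('listing for', '<native ptr>')
```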


@ -1,6 +1,5 @@
# -*- coding: utf-8 -*-
import contextlib
import functools
from .. import exceptions
@ -198,26 +197,6 @@ class Storage(metaclass=StorageMeta):
'''
raise NotImplementedError()
@contextlib.contextmanager
def at_once(self):
'''A contextmanager that buffers all writes.
Essentially, this::
s.upload(...)
s.update(...)
becomes this::
with s.at_once():
s.upload(...)
s.update(...)
Note that this removes guarantees about which exceptions are returned
when.
'''
yield
def get_meta(self, key):
'''Get metadata value for collection/storage.
@ -240,6 +219,14 @@ class Storage(metaclass=StorageMeta):
raise NotImplementedError('This storage does not support metadata.')
def buffered(self):
'''See documentation in rust/storage/mod.rs'''
pass
def flush(self):
'''See documentation in rust/storage/mod.rs'''
pass
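Callers that previously wrapped writes in `at_once()` now call `buffered()` before the writes and `flush()` after, as the new `sync()` does later in this diff. Assuming the semantics are otherwise unchanged, a storage that batches its writes could look roughly like this toy sketch:

```python
class BatchingStorage:
    """Toy storage: collects writes while buffered, commits on flush."""
    def __init__(self):
        self._pending = []
        self._committed = []
        self._buffering = False

    def buffered(self):
        self._buffering = True

    def flush(self):
        # Commit everything collected since buffered() in one go.
        self._committed.extend(self._pending)
        self._pending.clear()
        self._buffering = False

    def upload(self, item):
        if self._buffering:
            self._pending.append(item)
        else:
            self._committed.append(item)

s = BatchingStorage()
s.buffered()
s.upload('a')
s.upload('b')
assert s._committed == []   # nothing committed yet
s.flush()
assert s._committed == ['a', 'b']
```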
def normalize_meta_value(value):
# `None` is returned by iCloud for empty properties.


@ -146,7 +146,7 @@ class Discover(object):
_homeset_xml = None
_homeset_tag = None
_well_known_uri = None
_collection_xml = b"""
_collection_xml = b"""<?xml version="1.0" encoding="utf-8" ?>
<d:propfind xmlns:d="DAV:">
<d:prop>
<d:resourcetype />
@ -724,8 +724,8 @@ class CalDAVStorage(DAVStorage):
example, the following would synchronize the timerange from one year in the
past to one year in the future::
start_date = datetime.now() - timedelta(days=365)
end_date = datetime.now() + timedelta(days=365)
start_date = "datetime.now() - timedelta(days=365)"
end_date = "datetime.now() + timedelta(days=365)"
Either both or none have to be specified. The default is to synchronize
everything.
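The quoting added above matters because vdirsyncer config values are parsed as JSON-style literals: the Python expression must arrive as a string and is evaluated by the storage itself. A hedged config sketch (storage name and URL are illustrative):

```ini
[storage my_calendar]
type = "caldav"
url = "https://example.com/dav/"
start_date = "datetime.now() - timedelta(days=365)"
end_date = "datetime.now() + timedelta(days=365)"
```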
@ -792,32 +792,28 @@ class CalDAVStorage(DAVStorage):
@staticmethod
def _get_list_filters(components, start, end):
if components:
caldavfilter = '''
<C:comp-filter name="VCALENDAR">
<C:comp-filter name="{component}">
{timefilter}
</C:comp-filter>
caldavfilter = '''
<C:comp-filter name="VCALENDAR">
<C:comp-filter name="{component}">
{timefilter}
</C:comp-filter>
'''
</C:comp-filter>
'''
if start is not None and end is not None:
start = start.strftime(CALDAV_DT_FORMAT)
end = end.strftime(CALDAV_DT_FORMAT)
timefilter = ''
timefilter = ('<C:time-range start="{start}" end="{end}"/>'
.format(start=start, end=end))
else:
timefilter = ''
if start is not None and end is not None:
start = start.strftime(CALDAV_DT_FORMAT)
end = end.strftime(CALDAV_DT_FORMAT)
for component in components:
yield caldavfilter.format(component=component,
timefilter=timefilter)
else:
if start is not None and end is not None:
for x in CalDAVStorage._get_list_filters(('VTODO', 'VEVENT'),
start, end):
yield x
timefilter = ('<C:time-range start="{start}" end="{end}"/>'
.format(start=start, end=end))
if not components:
components = ('VTODO', 'VEVENT')
for component in components:
yield caldavfilter.format(component=component,
timefilter=timefilter)
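The refactor collapses the old recursive else-branch into a default component list: compute the optional time filter once, fall back to `('VTODO', 'VEVENT')` when no components were requested, and yield one filter per component. A simplified pure-Python rendition of the new control flow (the XML templating is elided to short strings):

```python
def get_list_filters(components, start, end):
    """Yield one (component, timefilter) pair per requested component."""
    if start is not None and end is not None:
        timefilter = '<C:time-range start="{}" end="{}"/>'.format(start, end)
    else:
        timefilter = ''
    if not components:
        # No explicit components: fall back to the defaults.
        components = ('VTODO', 'VEVENT')
    for component in components:
        yield (component, timefilter)

assert list(get_list_filters((), 's', 'e')) == [
    ('VTODO', '<C:time-range start="s" end="e"/>'),
    ('VEVENT', '<C:time-range start="s" end="e"/>'),
]
assert list(get_list_filters(('VEVENT',), None, None)) == [('VEVENT', '')]
```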
def list(self):
caldavfilters = list(self._get_list_filters(
@ -833,8 +829,8 @@ class CalDAVStorage(DAVStorage):
# instead?
#
# See https://github.com/dmfs/tasks/issues/118 for backstory.
for x in DAVStorage.list(self):
yield x
yield from DAVStorage.list(self)
return
data = '''<?xml version="1.0" encoding="utf-8" ?>
<C:calendar-query xmlns:D="DAV:"


@ -1,4 +1,3 @@
import contextlib
import functools
import logging
import os
@ -30,10 +29,10 @@ logger = logging.getLogger(__name__)
def _writing_op(f):
@functools.wraps(f)
def inner(self, *args, **kwargs):
if not self._at_once:
if not self._buffered:
self._sync_journal()
rv = f(self, *args, **kwargs)
if not self._at_once:
if not self._buffered:
self._sync_journal()
return rv
return inner
@ -110,7 +109,7 @@ class EtesyncStorage(Storage):
_collection_type = None
_item_type = None
_at_once = False
_buffered = False
def __init__(self, email, secrets_dir, server_url=None, db_path=None,
**kwargs):
@ -213,15 +212,11 @@ class EtesyncStorage(Storage):
except etesync.exceptions.DoesNotExist as e:
raise exceptions.NotFoundError(e)
@contextlib.contextmanager
def at_once(self):
def buffered(self):
self._buffered = True
def flush(self):
self._sync_journal()
self._at_once = True
try:
yield self
self._sync_journal()
finally:
self._at_once = False
class EtesyncContacts(EtesyncStorage):


@ -1,35 +1,18 @@
# -*- coding: utf-8 -*-
import collections
import contextlib
import functools
import glob
import logging
import os
from atomicwrites import atomic_write
from .base import Storage
from .. import exceptions
from ..utils import checkfile, expand_path, get_etag_from_file
from ..vobject import Item, join_collection, split_collection
from ._rust import RustStorageMixin
from .. import native
from ..utils import checkfile, expand_path
logger = logging.getLogger(__name__)
def _writing_op(f):
@functools.wraps(f)
def inner(self, *args, **kwargs):
if self._items is None or not self._at_once:
self.list()
rv = f(self, *args, **kwargs)
if not self._at_once:
self._write()
return rv
return inner
class SingleFileStorage(Storage):
class SingleFileStorage(RustStorageMixin, Storage):
'''Save data in single local ``.vcf`` or ``.ics`` file.
The storage basically guesses how items should be joined in the file.
@ -83,21 +66,20 @@ class SingleFileStorage(Storage):
storage_name = 'singlefile'
_repr_attributes = ('path',)
_write_mode = 'wb'
_append_mode = 'ab'
_read_mode = 'rb'
_items = None
_last_etag = None
def __init__(self, path, encoding='utf-8', **kwargs):
def __init__(self, path, **kwargs):
super(SingleFileStorage, self).__init__(**kwargs)
path = os.path.abspath(expand_path(path))
checkfile(path, create=False)
self.path = path
self.encoding = encoding
self._at_once = False
self._native_storage = native.ffi.gc(
native.lib.vdirsyncer_init_singlefile(path.encode('utf-8')),
native.lib.vdirsyncer_free_singlefile
)
@classmethod
def discover(cls, path, **kwargs):
@ -144,94 +126,3 @@ class SingleFileStorage(Storage):
kwargs['path'] = path
kwargs['collection'] = collection
return kwargs
def list(self):
self._items = collections.OrderedDict()
try:
self._last_etag = get_etag_from_file(self.path)
with open(self.path, self._read_mode) as f:
text = f.read().decode(self.encoding)
except OSError as e:
import errno
if e.errno != errno.ENOENT: # file not found
raise IOError(e)
text = None
if not text:
return ()
for item in split_collection(text):
item = Item(item)
etag = item.hash
self._items[item.ident] = item, etag
return ((href, etag) for href, (item, etag) in self._items.items())
def get(self, href):
if self._items is None or not self._at_once:
self.list()
try:
return self._items[href]
except KeyError:
raise exceptions.NotFoundError(href)
@_writing_op
def upload(self, item):
href = item.ident
if href in self._items:
raise exceptions.AlreadyExistingError(existing_href=href)
self._items[href] = item, item.hash
return href, item.hash
@_writing_op
def update(self, href, item, etag):
if href not in self._items:
raise exceptions.NotFoundError(href)
_, actual_etag = self._items[href]
if etag != actual_etag:
raise exceptions.WrongEtagError(etag, actual_etag)
self._items[href] = item, item.hash
return item.hash
@_writing_op
def delete(self, href, etag):
if href not in self._items:
raise exceptions.NotFoundError(href)
_, actual_etag = self._items[href]
if etag != actual_etag:
raise exceptions.WrongEtagError(etag, actual_etag)
del self._items[href]
def _write(self):
if self._last_etag is not None and \
self._last_etag != get_etag_from_file(self.path):
raise exceptions.PreconditionFailed(
'Some other program modified the file {r!}. Re-run the '
'synchronization and make sure absolutely no other program is '
'writing into the same file.'.format(self.path))
text = join_collection(
item.raw for item, etag in self._items.values()
)
try:
with atomic_write(self.path, mode='wb', overwrite=True) as f:
f.write(text.encode(self.encoding))
finally:
self._items = None
self._last_etag = None
@contextlib.contextmanager
def at_once(self):
self.list()
self._at_once = True
try:
yield self
self._write()
finally:
self._at_once = False
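The removed `_write` above guarded against concurrent edits with optimistic concurrency: remember an etag of the file at read time and refuse to write if it has changed since. With mtime-plus-size as a stand-in etag (the real `get_etag_from_file` may use a different scheme), the pattern is:

```python
import os
import tempfile

def file_etag(path):
    # Stand-in etag; the real get_etag_from_file may differ.
    st = os.stat(path)
    return (st.st_mtime_ns, st.st_size)

def guarded_write(path, last_etag, text):
    # Refuse to clobber a file some other program touched in the meantime.
    if last_etag is not None and last_etag != file_etag(path):
        raise RuntimeError('Some other program modified {!r}'.format(path))
    with open(path, 'w') as f:
        f.write(text)

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, 'cal.ics')
    with open(path, 'w') as f:
        f.write('old')
    etag = file_etag(path)
    guarded_write(path, etag, 'new')  # file unchanged since read: succeeds
    with open(path) as f:
        assert f.read() == 'new'
```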


@ -27,6 +27,7 @@ sync_logger = logging.getLogger(__name__)
class _StorageInfo(object):
'''A wrapper class that holds prefetched items, the status and other
things.'''
def __init__(self, storage, status):
self.storage = storage
self.status = status
@ -57,6 +58,12 @@ class _StorageInfo(object):
# Prefetch items
for href, item, etag in (self.storage.get_multi(prefetch)
if prefetch else ()):
if not item.is_parseable:
sync_logger.warning(
'Storage "{}": item {} is malformed. '
'Please try to repair it.'
.format(self.storage.instance_name, href)
)
_store_props(item.ident, ItemMetadata(
href=href,
hash=item.hash,
@ -143,20 +150,25 @@ def sync(storage_a, storage_b, status, conflict_resolution=None,
actions = list(_get_actions(a_info, b_info))
with storage_a.at_once(), storage_b.at_once():
for action in actions:
try:
action.run(
a_info,
b_info,
conflict_resolution,
partial_sync
)
except Exception as e:
if error_callback:
error_callback(e)
else:
raise
storage_a.buffered()
storage_b.buffered()
for action in actions:
try:
action.run(
a_info,
b_info,
conflict_resolution,
partial_sync
)
except Exception as e:
if error_callback:
error_callback(e)
else:
raise
storage_a.flush()
storage_b.flush()
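The error-handling strategy in the loop above is worth isolating: with an `error_callback`, one failing action does not abort the remaining ones; without one, the first exception propagates. A sketch where actions are plain callables (an assumption for illustration, not the `Action` class below):

```python
def run_actions(actions, error_callback=None):
    '''Run each action; route failures to the callback when given,
    otherwise re-raise and stop at the first error.'''
    for action in actions:
        try:
            action()
        except Exception as e:
            if error_callback:
                error_callback(e)
            else:
                raise
```

This is why a single malformed item cannot wedge a whole sync run when a callback is supplied: the error is recorded and the remaining actions still execute.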
class Action:

View file

@@ -1,37 +1,9 @@
# -*- coding: utf-8 -*-
import hashlib
from itertools import chain, tee
from .utils import cached_property, uniq
IGNORE_PROPS = (
# PRODID is changed by radicale for some reason after upload
'PRODID',
# Sometimes METHOD:PUBLISH is added by WebCAL providers, for us it doesn't
# make a difference
'METHOD',
# X-RADICALE-NAME is used by radicale, because hrefs don't really exist in
# their filesystem backend
'X-RADICALE-NAME',
# Apparently this is set by Horde?
# https://github.com/pimutils/vdirsyncer/issues/318
'X-WR-CALNAME',
# These are from the vCard specification and are supposed to change when
# the item does -- however, we can determine that ourselves
'REV',
'LAST-MODIFIED',
'CREATED',
# Some iCalendar HTTP calendars generate the DTSTAMP at request time, so
# this property always changes when the rest of the item didn't. Some do
# the same with the UID.
#
# - Google's read-only calendar links
# - http://www.feiertage-oesterreich.at/
'DTSTAMP',
'UID',
)
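The point of the list above is that two copies of an item differing only in volatile properties should compare equal. A sketch of that stripping step, using an illustrative subset of the properties (and simplified handling: folded lines and `;`-parameters are not treated specially here):

```python
IGNORED = ('PRODID', 'DTSTAMP', 'UID', 'REV', 'LAST-MODIFIED')


def strip_ignored(raw, ignored=IGNORED):
    '''Drop any content line whose property name is in `ignored`, so
    items that differ only in volatile properties normalize to the
    same text.'''
    keep = []
    for line in raw.splitlines():
        # Property name is everything before the first ':' or ';'.
        name = line.split(':', 1)[0].split(';', 1)[0].upper()
        if name not in ignored:
            keep.append(line.strip())
    return '\r\n'.join(filter(bool, keep))
```

With this, an event re-served with a fresh `DTSTAMP` normalizes to the same string as the original, so no spurious change is detected.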
from . import native
class Item(object):
@@ -39,101 +11,53 @@ class Item(object):
'''Immutable wrapper class for VCALENDAR (VEVENT, VTODO) and
VCARD'''
def __init__(self, raw):
def __init__(self, raw, _native=None):
if raw is None:
assert _native
self._native = _native
return
assert isinstance(raw, str), type(raw)
self._raw = raw
assert _native is None
self._native = native.item_rv(
native.lib.vdirsyncer_item_from_raw(raw.encode('utf-8'))
)
def with_uid(self, new_uid):
parsed = _Component.parse(self.raw)
stack = [parsed]
while stack:
component = stack.pop()
stack.extend(component.subcomponents)
new_uid = new_uid or ''
assert isinstance(new_uid, str), type(new_uid)
if component.name in ('VEVENT', 'VTODO', 'VJOURNAL', 'VCARD'):
del component['UID']
if new_uid:
component['UID'] = new_uid
e = native.ffi.new('VdirsyncerError *')
rv = native.lib.vdirsyncer_with_uid(self._native,
new_uid.encode('utf-8'),
e)
native.check_error(e)
return Item(None, _native=native.item_rv(rv))
return Item('\r\n'.join(parsed.dump_lines()))
@cached_property
def is_parseable(self):
return native.lib.vdirsyncer_item_is_parseable(self._native)
@cached_property
def raw(self):
'''Raw content of the item, as unicode string.
Vdirsyncer doesn't validate the content in any way.
'''
return self._raw
return native.string_rv(native.lib.vdirsyncer_get_raw(self._native))
@cached_property
def uid(self):
'''Global identifier of the item, across storages, doesn't change after
a modification of the item.'''
# Don't actually parse component, but treat all lines as single
# component, avoiding traversal through all subcomponents.
x = _Component('TEMP', self.raw.splitlines(), [])
try:
return x['UID'].strip() or None
except KeyError:
return None
rv = native.string_rv(native.lib.vdirsyncer_get_uid(self._native))
return rv or None
@cached_property
def hash(self):
'''Hash of self.raw, used for etags.'''
return hash_item(self.raw)
e = native.ffi.new('VdirsyncerError *')
rv = native.lib.vdirsyncer_get_hash(self._native, e)
native.check_error(e)
return native.string_rv(rv)
@cached_property
def ident(self):
'''Used for generating hrefs and matching up items during
synchronization. This is either the UID or the hash of the item's
content.'''
# We hash the item instead of directly using its raw content, because
#
# 1. The raw content might be really large, e.g. when it's a contact
# with a picture, which bloats the status file.
#
# 2. The status file would contain really sensitive information.
return self.uid or self.hash
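The `ident` fallback above can be sketched as a standalone function. This is an assumption-laden simplification: UID extraction is a plain line scan (no folding or subcomponent handling), and the hash is a bare sha256 of the raw text rather than the normalized hash used above.

```python
import hashlib


def ident(raw):
    '''Prefer the UID property; otherwise fall back to a hash of the
    content, which keeps the status file small and avoids storing the
    sensitive raw item.'''
    for line in raw.splitlines():
        if line.upper().startswith('UID:'):
            uid = line[4:].strip()
            if uid:
                return uid
    return hashlib.sha256(raw.encode('utf-8')).hexdigest()
```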
@property
def parsed(self):
'''Don't cache because the rv is mutable.'''
try:
return _Component.parse(self.raw)
except Exception:
return None
def normalize_item(item, ignore_props=IGNORE_PROPS):
'''Create syntactically invalid mess that is equal for similar items.'''
if not isinstance(item, Item):
item = Item(item)
item = _strip_timezones(item)
x = _Component('TEMP', item.raw.splitlines(), [])
for prop in ignore_props:
del x[prop]
x.props.sort()
return u'\r\n'.join(filter(bool, (line.strip() for line in x.props)))
def _strip_timezones(item):
parsed = item.parsed
if not parsed or parsed.name != 'VCALENDAR':
return item
parsed.subcomponents = [c for c in parsed.subcomponents
if c.name != 'VTIMEZONE']
return Item('\r\n'.join(parsed.dump_lines()))
def hash_item(text):
return hashlib.sha256(normalize_item(text).encode('utf-8')).hexdigest()
def split_collection(text):
assert isinstance(text, str)