Merge branch 'master' into feat/ldap

commit 4229cebd61

.github/FUNDING.yml (vendored, new file, +2)
@@ -0,0 +1,2 @@
+github: etesync
+custom: https://www.etesync.com/contribute/#donate

.gitignore (vendored, +3)
@@ -7,9 +7,10 @@ Session.vim
 /.coverage
 /tmp
 /media
+/.idea
 
 __pycache__
 .*.swp
 
 
 /etebase_server_settings.py
+/secret.txt

ChangeLog.md (+18)
@@ -1,5 +1,23 @@
 # Changelog
 
+## Version 0.7.0
+* Chunks: improve the chunk download endpoint to use sendfile extensions
+* Chunks: support not passing chunk content if it already exists
+* Chunks: fix the chunk upload media type to accept everything
+* Gracefully handle uploading the same revision
+* Pass a generic context to callbacks instead of the whole view
+* Fix handling of some validation errors
+
+## Version 0.6.1
+* Collection: save the UID on the model to use the db for enforcing uniqueness
+
+## Version 0.6.0
+* Fix stoken calculation performance - it was VERY slow in some rare cases
+* Fix issues with host verification failing with a custom port - part 2
+
+## Version 0.5.3
+* Add missing migration
+
 ## Version 0.5.2
 * Fix issues with host verification failing with a custom port
 * Add an env variable to change the configuration file path.

README.md
@@ -9,13 +9,17 @@ An [Etebase](https://www.etebase.com) (EteSync 2.0) server so you can run your o
 
 # Installation
 
+## Requirements
+
+Etebase requires Python 3.7 or newer and has a few Python dependencies (listed in `requirements.in/base.txt`).
+
 ## From source
 
 Before installing the Etebase server make sure you install `virtualenv` (for **Python 3**):
 
 * Arch Linux: `pacman -S python-virtualenv`
 * Debian/Ubuntu: `apt-get install python3-virtualenv`
-* Mac/Windows/Other Linux: install virtualenv or just skip the instructions mentioning virtualenv.
+* Mac/Windows (WSL)/Other Linux: install virtualenv or just skip the instructions mentioning virtualenv.
 
 Then just clone the git repo and set up this app:
 
@@ -23,11 +27,10 @@ Then just clone the git repo and set up this app:
 git clone https://github.com/etesync/server.git etebase
 
 cd etebase
-git checkout etebase
 
 # Set up the environment and deps
-virtualenv -p python3 venv  # If this doesn't work, try: virtualenv3 venv
-source venv/bin/activate
+virtualenv -p python3 .venv  # If this doesn't work, try: virtualenv3 .venv
+source .venv/bin/activate
 
 pip install -r requirements.txt
 ```
@@ -42,12 +45,14 @@ To use the easy configuration file rename it to `etebase-server.ini` and place i
 There is also a [wikipage](https://github.com/etesync/server/wiki/Basic-Setup-Etebase-(EteSync-v2)) detailing this basic setup.
 
 Some particular settings that should be edited are:
-* [`ALLOWED_HOSTS`](https://docs.djangoproject.com/en/1.11/ref/settings/#std:setting-ALLOWED_HOSTS)
+* [`ALLOWED_HOSTS`](https://docs.djangoproject.com/en/dev/ref/settings/#std:setting-ALLOWED_HOSTS)
   -- this is the list of host/domain names or addresses on which the app
-  will be served
-* [`DEBUG`](https://docs.djangoproject.com/en/1.11/ref/settings/#debug)
+  will be served. For example: `etebase.example.com`
+* [`DEBUG`](https://docs.djangoproject.com/en/dev/ref/settings/#debug)
   -- handy for debugging, set to `False` for production
-* [`SECRET_KEY`](https://docs.djangoproject.com/en/1.11/ref/settings/#std:setting-SECRET_KEY)
+* [`MEDIA_ROOT`](https://docs.djangoproject.com/en/dev/ref/settings/#media-root)
+  -- the path to the directory that will hold user data.
+* [`SECRET_KEY`](https://docs.djangoproject.com/en/dev/ref/settings/#std:setting-SECRET_KEY)
   -- an ephemeral secret used for various cryptographic signing and token
   generation purposes. See below for how the default configuration of
   `SECRET_KEY` works for this project.
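To make the settings above concrete, here is a minimal sketch of what a local override module could look like. The file name `etebase_server_settings.py` comes from the `.gitignore` entry in this diff; all values shown are placeholder assumptions, not part of the change itself:

```python
# etebase_server_settings.py -- hypothetical local override (illustrative only).
# Every value below is a placeholder; adjust for your own deployment.

DEBUG = False  # set to False for production

# Host/domain names this instance will answer to.
ALLOWED_HOSTS = ["etebase.example.com"]

# Directory that will hold user data; back this up together with the database.
MEDIA_ROOT = "/var/lib/etebase/media"
```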
@@ -61,22 +66,26 @@ Now you can initialise our django app.
 And you are done! You can now run the debug server just to see everything works as expected by running:
 
 ```
-./manage.py runserver 0.0.0.0:8000
+uvicorn etebase_server.asgi:application --host 0.0.0.0 --port 8000
 ```
 
 Using the debug server in production is not recommended, so please read the following section for a proper deployment.
 
 # Production deployment
 
-There are more details about a proper production setup using Daphne and Nginx in the [wiki](https://github.com/etesync/server/wiki/Production-setup-using-Daphne-and-Nginx).
+There are more details about a proper production setup using uvicorn and Nginx in the [wiki](https://github.com/etesync/server/wiki/Production-setup-using-Nginx).
 
-Etebase is based on Django so you should refer to one of the following
-* The instructions of the Django project [here](https://docs.djangoproject.com/en/2.2/howto/deployment/wsgi/).
-* Instructions from uwsgi [here](http://uwsgi-docs.readthedocs.io/en/latest/tutorials/Django_and_nginx.html).
-
 The webserver should also be configured to serve Etebase using TLS.
 A guide for doing so can be found in the [wiki](https://github.com/etesync/server/wiki/Setup-HTTPS-for-Etebase) as well.
 
+The Etebase server needs to be aware of the URL it's been served as, so make sure to forward the `Host` header to the server if using a reverse proxy. For example, you would need to use the following directive in nginx: `proxy_set_header Host $host;`.
+
+# Data locations and backups
+
+The server stores user data in two different locations that need to be backed up:
+1. The database - how to back up depends on which database you use.
+2. The `MEDIA_ROOT` - the path where user data is stored.
+
 # Usage
 
 Create yourself an admin user:
@@ -129,6 +138,23 @@ Here are the update steps:
 4. Run the migration tool to migrate all of your data.
 5. Add your new EteSync 2.0 accounts to all of your devices.
 
-# Supporting Etebase
+# License
+
+Etebase is free software: you can redistribute it and/or modify it under the terms of the GNU Affero General Public License version 3 as published by the Free Software Foundation. See the [LICENSE](./LICENSE) file for more information.
+
+A quick summary can be found [on tldrlegal](https://tldrlegal.com/license/gnu-affero-general-public-license-v3-(agpl-3.0)). In even simpler terms (not part of the license, and not legal advice): you can use it in whatever way you want, including self-hosting and commercial offerings, as long as you release the code of any modifications you have made to the server software (clients are not affected).
+
+## Commercial licensing
+
+For commercial licensing options, contact license@etebase.com
+
+# Financially Supporting Etebase
 
 Please consider registering an account even if you self-host in order to support the development of Etebase, or visit the [contribution](https://www.etesync.com/contribute/) page for more information on how to support the service.
+
+Become a financial contributor and help us sustain our community!
+
+## Contributors ($10 / month)
+
+[![ilovept](https://github.com/ilovept.png?size=40)](https://github.com/ilovept)
+[![ryanleesipes](https://github.com/ryanleesipes.png?size=40)](https://github.com/ryanleesipes)
@ -1 +1 @@
|
|||||||
from .app_settings import app_settings
|
from .app_settings_inner import app_settings
|
||||||
|

@@ -1,3 +0,0 @@
-from django.contrib import admin
-
-# Register your models here.

@@ -11,6 +11,8 @@
 #
 # You should have received a copy of the GNU General Public License
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 
+import typing as t
+
 from django.utils.functional import cached_property
 
 
@@ -32,22 +34,20 @@ class AppSettings:
         return getattr(settings, self.prefix + name, dflt)
 
     @cached_property
-    def API_PERMISSIONS(self):  # pylint: disable=invalid-name
-        perms = self._setting("API_PERMISSIONS", ("rest_framework.permissions.IsAuthenticated",))
+    def REDIS_URI(self) -> t.Optional[str]:  # pylint: disable=invalid-name
+        return self._setting("REDIS_URI", None)
+
+    @cached_property
+    def API_PERMISSIONS_READ(self):  # pylint: disable=invalid-name
+        perms = self._setting("API_PERMISSIONS_READ", tuple())
         ret = []
         for perm in perms:
             ret.append(self.import_from_str(perm))
         return ret
 
     @cached_property
-    def API_AUTHENTICATORS(self):  # pylint: disable=invalid-name
-        perms = self._setting(
-            "API_AUTHENTICATORS",
-            (
-                "rest_framework.authentication.TokenAuthentication",
-                "rest_framework.authentication.SessionAuthentication",
-            ),
-        )
+    def API_PERMISSIONS_WRITE(self):  # pylint: disable=invalid-name
+        perms = self._setting("API_PERMISSIONS_WRITE", tuple())
         ret = []
         for perm in perms:
             ret.append(self.import_from_str(perm))
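The `AppSettings` helper touched above resolves a prefixed Django setting and imports permission classes from dotted paths. A standalone sketch of that lookup pattern, simplified to run without Django (the prefix, dict source, and example setting are assumptions for illustration):

```python
from importlib import import_module


class AppSettings:
    """Minimal stand-in for the settings helper: prefixed lookup plus dotted-path import."""

    def __init__(self, prefix, source):
        self.prefix = prefix
        self._source = source  # a dict standing in for django.conf.settings

    def _setting(self, name, dflt):
        return self._source.get(self.prefix + name, dflt)

    @staticmethod
    def import_from_str(name):
        # "pkg.module.ClassName" -> the ClassName object
        path, class_name = name.rsplit(".", 1)
        return getattr(import_module(path), class_name)

    @property
    def API_PERMISSIONS_READ(self):
        # Default is an empty tuple, mirroring the diff above.
        return [self.import_from_str(p) for p in self._setting("API_PERMISSIONS_READ", tuple())]


# Hypothetical configuration: one dotted path that resolves to a real stdlib class.
app_settings = AppSettings("ETEBASE_", {"ETEBASE_API_PERMISSIONS_READ": ("collections.OrderedDict",)})
```

The same `_setting`/`import_from_str` pair serves `REDIS_URI`, `API_PERMISSIONS_READ`, and `API_PERMISSIONS_WRITE` in the real class.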

@@ -1,5 +0,0 @@
-from django.apps import AppConfig
-
-
-class DrfMsgpackConfig(AppConfig):
-    name = "drf_msgpack"

@@ -1,14 +0,0 @@
-import msgpack
-
-from rest_framework.parsers import BaseParser
-from rest_framework.exceptions import ParseError
-
-
-class MessagePackParser(BaseParser):
-    media_type = "application/msgpack"
-
-    def parse(self, stream, media_type=None, parser_context=None):
-        try:
-            return msgpack.unpackb(stream.read(), raw=False)
-        except Exception as exc:
-            raise ParseError("MessagePack parse error - %s" % str(exc))

@@ -1,15 +0,0 @@
-import msgpack
-
-from rest_framework.renderers import BaseRenderer
-
-
-class MessagePackRenderer(BaseRenderer):
-    media_type = "application/msgpack"
-    format = "msgpack"
-    render_style = "binary"
-    charset = None
-
-    def render(self, data, media_type=None, renderer_context=None):
-        if data is None:
-            return b""
-        return msgpack.packb(data, use_bin_type=True)
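The deleted `drf_msgpack` classes implement DRF's parser/renderer interface: `parse` turns the request body stream into Python data, `render` does the reverse. A dependency-free sketch of that same interface shape, substituting `json` for `msgpack` so it runs standalone (class names here are illustrative, not the removed code):

```python
import io
import json


class JSONishParser:
    # The removed MessagePackParser called msgpack.unpackb(stream.read(), raw=False) here.
    media_type = "application/json"

    def parse(self, stream, media_type=None, parser_context=None):
        try:
            return json.loads(stream.read().decode())
        except Exception as exc:
            raise ValueError("parse error - %s" % exc)


class JSONishRenderer:
    # The removed MessagePackRenderer called msgpack.packb(data, use_bin_type=True) here.
    media_type = "application/json"
    charset = None  # binary-style renderers report no charset

    def render(self, data, media_type=None, renderer_context=None):
        if data is None:
            return b""
        return json.dumps(data).encode()


# Round trip: render to bytes, then parse the bytes back.
body = JSONishRenderer().render({"uid": "abc"})
roundtrip = JSONishParser().parse(io.BytesIO(body))
```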

@@ -1,3 +0,0 @@
-from django.shortcuts import render
-
-# Create your views here.

@@ -1,9 +0,0 @@
-from rest_framework import serializers, status
-
-
-class EtebaseValidationError(serializers.ValidationError):
-    def __init__(self, code, detail, status_code=status.HTTP_400_BAD_REQUEST):
-        super().__init__(
-            {"code": code, "detail": detail,}
-        )
-        self.status_code = status_code

@@ -33,7 +33,9 @@ class Migration(migrations.Migration):
                 ("version", models.PositiveSmallIntegerField()),
                 ("owner", models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
             ],
-            options={"unique_together": {("uid", "owner")},},
+            options={
+                "unique_together": {("uid", "owner")},
+            },
         ),
         migrations.CreateModel(
             name="CollectionItem",
@@ -61,7 +63,9 @@ class Migration(migrations.Migration):
                     ),
                 ),
             ],
-            options={"unique_together": {("uid", "collection")},},
+            options={
+                "unique_together": {("uid", "collection")},
+            },
         ),
         migrations.CreateModel(
             name="CollectionItemChunk",
@@ -122,7 +126,9 @@ class Migration(migrations.Migration):
                     ),
                 ),
             ],
-            options={"unique_together": {("item", "current")},},
+            options={
+                "unique_together": {("item", "current")},
+            },
        ),
        migrations.CreateModel(
            name="RevisionChunkRelation",
@@ -145,7 +151,9 @@ class Migration(migrations.Migration):
                    ),
                ),
            ],
-            options={"ordering": ("id",),},
+            options={
+                "ordering": ("id",),
+            },
        ),
        migrations.CreateModel(
            name="CollectionMember",
@@ -170,6 +178,8 @@ class Migration(migrations.Migration):
                ),
                ("user", models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
            ],
-            options={"unique_together": {("user", "collection")},},
+            options={
+                "unique_together": {("user", "collection")},
+            },
        ),
    ]

@@ -54,6 +54,8 @@ class Migration(migrations.Migration):
                     ),
                 ),
             ],
-            options={"unique_together": {("user", "fromMember")},},
+            options={
+                "unique_together": {("user", "fromMember")},
+            },
         ),
     ]

@@ -11,6 +11,8 @@ class Migration(migrations.Migration):
 
     operations = [
         migrations.AddField(
-            model_name="collectioninvitation", name="version", field=models.PositiveSmallIntegerField(default=1),
+            model_name="collectioninvitation",
+            name="version",
+            field=models.PositiveSmallIntegerField(default=1),
         ),
     ]

@@ -10,5 +10,9 @@ class Migration(migrations.Migration):
     ]
 
     operations = [
-        migrations.RenameField(model_name="userinfo", old_name="pubkey", new_name="loginPubkey",),
+        migrations.RenameField(
+            model_name="userinfo",
+            old_name="pubkey",
+            new_name="loginPubkey",
+        ),
     ]

@@ -33,6 +33,8 @@ class Migration(migrations.Migration):
                 ),
                 ("user", models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
             ],
-            options={"unique_together": {("user", "collection")},},
+            options={
+                "unique_together": {("user", "collection")},
+            },
         ),
     ]

@@ -10,5 +10,9 @@ class Migration(migrations.Migration):
     ]
 
     operations = [
-        migrations.RenameField(model_name="userinfo", old_name="encryptedSeckey", new_name="encryptedContent",),
+        migrations.RenameField(
+            model_name="userinfo",
+            old_name="encryptedSeckey",
+            new_name="encryptedContent",
+        ),
     ]

@@ -11,6 +11,8 @@ class Migration(migrations.Migration):
 
     operations = [
         migrations.AddField(
-            model_name="collectionitemrevision", name="salt", field=models.BinaryField(default=b"", editable=True),
+            model_name="collectionitemrevision",
+            name="salt",
+            field=models.BinaryField(default=b"", editable=True),
         ),
     ]

@@ -21,7 +21,16 @@ class Migration(migrations.Migration):
                 to="django_etebase.CollectionItem",
             ),
         ),
-        migrations.AlterUniqueTogether(name="collection", unique_together=set(),),
-        migrations.RemoveField(model_name="collection", name="uid",),
-        migrations.RemoveField(model_name="collection", name="version",),
+        migrations.AlterUniqueTogether(
+            name="collection",
+            unique_together=set(),
+        ),
+        migrations.RemoveField(
+            model_name="collection",
+            name="uid",
+        ),
+        migrations.RemoveField(
+            model_name="collection",
+            name="version",
+        ),
     ]

@@ -10,5 +10,8 @@ class Migration(migrations.Migration):
     ]
 
     operations = [
-        migrations.RemoveField(model_name="collectionitemrevision", name="salt",),
+        migrations.RemoveField(
+            model_name="collectionitemrevision",
+            name="salt",
+        ),
     ]

@@ -10,5 +10,8 @@ class Migration(migrations.Migration):
     ]
 
     operations = [
-        migrations.AlterUniqueTogether(name="collectionitemchunk", unique_together={("item", "uid")},),
+        migrations.AlterUniqueTogether(
+            name="collectionitemchunk",
+            unique_together={("item", "uid")},
+        ),
     ]

@@ -18,6 +18,12 @@ class Migration(migrations.Migration):
                 on_delete=django.db.models.deletion.CASCADE, related_name="chunks", to="django_etebase.Collection"
             ),
         ),
-        migrations.AlterUniqueTogether(name="collectionitemchunk", unique_together={("collection", "uid")},),
-        migrations.RemoveField(model_name="collectionitemchunk", name="item",),
+        migrations.AlterUniqueTogether(
+            name="collectionitemchunk",
+            unique_together={("collection", "uid")},
+        ),
+        migrations.RemoveField(
+            model_name="collectionitemchunk",
+            name="item",
+        ),
     ]

@@ -10,6 +10,14 @@ class Migration(migrations.Migration):
     ]
 
     operations = [
-        migrations.RenameField(model_name="collectioninvitation", old_name="accessLevel", new_name="accessLevelOld",),
-        migrations.RenameField(model_name="collectionmember", old_name="accessLevel", new_name="accessLevelOld",),
+        migrations.RenameField(
+            model_name="collectioninvitation",
+            old_name="accessLevel",
+            new_name="accessLevelOld",
+        ),
+        migrations.RenameField(
+            model_name="collectionmember",
+            old_name="accessLevel",
+            new_name="accessLevelOld",
+        ),
     ]

@@ -10,6 +10,12 @@ class Migration(migrations.Migration):
     ]
 
     operations = [
-        migrations.RemoveField(model_name="collectioninvitation", name="accessLevelOld",),
-        migrations.RemoveField(model_name="collectionmember", name="accessLevelOld",),
+        migrations.RemoveField(
+            model_name="collectioninvitation",
+            name="accessLevelOld",
+        ),
+        migrations.RemoveField(
+            model_name="collectionmember",
+            name="accessLevelOld",
+        ),
     ]

@@ -13,6 +13,6 @@ class Migration(migrations.Migration):
         migrations.AlterField(
             model_name="collectiontype",
             name="uid",
-            field=models.BinaryField(db_index=True, editable=True, unique=True),
+            field=models.BinaryField(db_index=True, editable=True, max_length=1024, unique=True),
         ),
     ]

django_etebase/migrations/0033_collection_uid.py (new file, +19)
@@ -0,0 +1,19 @@
+# Generated by Django 3.1.1 on 2020-12-14 11:21
+
+import django.core.validators
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ('django_etebase', '0032_auto_20201013_1409'),
+    ]
+
+    operations = [
+        migrations.AddField(
+            model_name='collection',
+            name='uid',
+            field=models.CharField(db_index=True, max_length=43, null=True, validators=[django.core.validators.RegexValidator(message='Not a valid UID', regex='^[a-zA-Z0-9\\-_]{20,}$')]),
+        ),
+    ]
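The validator added in this migration accepts URL-safe identifiers of at least 20 characters drawn from letters, digits, `-`, and `_`. A quick standalone check of the same pattern, using made-up sample UIDs:

```python
import re

# Same pattern as the migration's RegexValidator.
UID_RE = re.compile(r"^[a-zA-Z0-9\-_]{20,}$")

# Sample values are invented for illustration.
valid = "aB3-_x" + "0" * 14   # exactly 20 chars, allowed alphabet
too_short = "abc123"          # under 20 chars
bad_char = "a" * 19 + "+"     # '+' is outside the allowed set
```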

django_etebase/migrations/0034_auto_20201214_1124.py (new file, +22)
@@ -0,0 +1,22 @@
+# Generated by Django 3.1.1 on 2020-12-14 11:24
+
+from django.db import migrations
+
+
+def update_collection_uid(apps, schema_editor):
+    Collection = apps.get_model("django_etebase", "Collection")
+
+    for collection in Collection.objects.all():
+        collection.uid = collection.main_item.uid
+        collection.save()
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ("django_etebase", "0033_collection_uid"),
+    ]
+
+    operations = [
+        migrations.RunPython(update_collection_uid),
+    ]

django_etebase/migrations/0035_auto_20201214_1126.py (new file, +19)
@@ -0,0 +1,19 @@
+# Generated by Django 3.1.1 on 2020-12-14 11:26
+
+import django.core.validators
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ('django_etebase', '0034_auto_20201214_1124'),
+    ]
+
+    operations = [
+        migrations.AlterField(
+            model_name='collection',
+            name='uid',
+            field=models.CharField(db_index=True, max_length=43, validators=[django.core.validators.RegexValidator(message='Not a valid UID', regex='^[a-zA-Z0-9\\-_]{20,}$')]),
+        ),
+    ]

django_etebase/migrations/0036_auto_20201214_1128.py (new file, +19)
@@ -0,0 +1,19 @@
+# Generated by Django 3.1.1 on 2020-12-14 11:28
+
+import django.core.validators
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ('django_etebase', '0035_auto_20201214_1126'),
+    ]
+
+    operations = [
+        migrations.AlterField(
+            model_name='collection',
+            name='uid',
+            field=models.CharField(db_index=True, max_length=43, unique=True, validators=[django.core.validators.RegexValidator(message='Not a valid UID', regex='^[a-zA-Z0-9\\-_]{20,}$')]),
+        ),
+    ]
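Migrations 0033-0036 follow the standard pattern for adding a non-null unique column to a table that already has rows: add it as nullable, backfill it in a data migration, then tighten it to non-null and unique. The backfill step (0034's `update_collection_uid`) can be simulated without Django roughly like this; `Item` and `Collection` here are plain stand-ins, not the real models:

```python
# Plain-Python simulation of the 0034 backfill; these classes stand in for the
# Django models and have no ORM behaviour.

class Item:
    def __init__(self, uid):
        self.uid = uid


class Collection:
    def __init__(self, main_item):
        self.main_item = main_item
        self.uid = None  # step 1 (0033): the column exists but is still nullable


rows = [Collection(Item("a" * 43)), Collection(Item("b" * 43))]

# Step 2 (0034): copy each main_item.uid onto the collection itself.
for collection in rows:
    collection.uid = collection.main_item.uid

# Step 3 (0035/0036) would now make the column non-null, then unique.
```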

django_etebase/migrations/0037_auto_20210127_1237.py (new file, +18)
@@ -0,0 +1,18 @@
+# Generated by Django 3.1.1 on 2021-01-27 12:37
+
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ("django_etebase", "0036_auto_20201214_1128"),
+    ]
+
+    operations = [
+        migrations.AlterField(
+            model_name="collectiontype",
+            name="uid",
+            field=models.BinaryField(db_index=True, editable=True, max_length=1024, unique=True),
+        ),
+    ]

@@ -12,76 +12,70 @@
 # You should have received a copy of the GNU General Public License
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 
+import typing as t
 from pathlib import Path
 
 from django.db import models, transaction
 from django.conf import settings
 from django.core.validators import RegexValidator
-from django.db.models import Q
+from django.db.models import Max, Value as V
+from django.db.models.functions import Coalesce, Greatest
 from django.utils.functional import cached_property
 from django.utils.crypto import get_random_string
 
-from rest_framework import status
-
 from . import app_settings
-from .exceptions import EtebaseValidationError
 
 
 UidValidator = RegexValidator(regex=r"^[a-zA-Z0-9\-_]{20,}$", message="Not a valid UID")
 
 
+def stoken_annotation_builder(stoken_id_fields: t.List[str]):
+    aggr_fields = [Coalesce(Max(field), V(0)) for field in stoken_id_fields]
+    return Greatest(*aggr_fields) if len(aggr_fields) > 1 else aggr_fields[0]
+
+
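The new `stoken_annotation_builder` combines per-relation maxima: each field contributes `Coalesce(Max(field), 0)` so an empty relation counts as 0 rather than NULL, and the per-field results are folded with `Greatest`. The equivalent logic in plain Python, with made-up stoken ids standing in for the related rows:

```python
# Pure-Python equivalent of Greatest(Coalesce(Max(f), 0), ...) over several fields;
# each inner list plays the role of one relation's stoken ids.
def max_stoken_id(field_values):
    # Coalesce(Max(field), 0): an empty relation yields 0, not None.
    aggregates = [max(values) if values else 0 for values in field_values]
    # Greatest(...) across the per-field aggregates.
    return max(aggregates)


# Hypothetical example: revisions carry stoken ids 3 and 7, members carry none.
result = max_stoken_id([[3, 7], []])
```

This is why the rewritten `Collection.stoken` below can treat a result of 0 as "no stoken found" instead of checking for NULL.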
 class CollectionType(models.Model):
     owner = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
-    uid = models.BinaryField(editable=True, blank=False, null=False, db_index=True, unique=True)
+    uid = models.BinaryField(editable=True, blank=False, null=False, db_index=True, unique=True, max_length=1024)
+
+    objects: models.manager.BaseManager["CollectionType"]
 
 
 class Collection(models.Model):
     main_item = models.OneToOneField("CollectionItem", related_name="parent", null=True, on_delete=models.SET_NULL)
+    # The same as main_item.uid, we just also save it here so we have DB constraints for uniqueness (and efficiency)
+    uid = models.CharField(db_index=True, unique=True, blank=False, max_length=43, validators=[UidValidator])
     owner = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
 
+    stoken_annotation = stoken_annotation_builder(["items__revisions__stoken", "members__stoken"])
+
+    objects: models.manager.BaseManager["Collection"]
+
     def __str__(self):
         return self.uid
 
-    @cached_property
-    def uid(self):
-        return self.main_item.uid
-
     @property
-    def content(self):
+    def content(self) -> "CollectionItemRevision":
+        assert self.main_item is not None
         return self.main_item.content
 
     @property
-    def etag(self):
+    def etag(self) -> str:
         return self.content.uid
 
     @cached_property
-    def stoken(self):
-        stoken = (
-            Stoken.objects.filter(
-                Q(collectionitemrevision__item__collection=self) | Q(collectionmember__collection=self)
-            )
-            .order_by("id")
-            .last()
-        )
-        if stoken is None:
+    def stoken(self) -> str:
+        stoken_id = (
+            self.__class__.objects.filter(main_item=self.main_item)
+            .annotate(max_stoken=self.stoken_annotation)
+            .values("max_stoken")
+            .first()["max_stoken"]
+        )
+
+        if stoken_id == 0:
             raise Exception("stoken is None. Should never happen")
-        return stoken.uid
 
-    def validate_unique(self, exclude=None):
-        super().validate_unique(exclude=exclude)
-        if exclude is None or "main_item" in exclude:
-            return
-
-        if (
-            self.__class__.objects.filter(owner=self.owner, main_item__uid=self.main_item.uid)
-            .exclude(id=self.id)
-            .exists()
-        ):
-            raise EtebaseValidationError(
-                "unique_uid", "Collection with this uid already exists", status_code=status.HTTP_409_CONFLICT
-            )
+        return Stoken.objects.get(id=stoken_id).uid
 
 
 class CollectionItem(models.Model):
@ -90,6 +84,10 @@ class CollectionItem(models.Model):
|
|||||||
version = models.PositiveSmallIntegerField()
|
version = models.PositiveSmallIntegerField()
|
||||||
encryptionKey = models.BinaryField(editable=True, blank=False, null=True)
|
encryptionKey = models.BinaryField(editable=True, blank=False, null=True)
|
||||||
|
|
||||||
|
stoken_annotation = stoken_annotation_builder(["revisions__stoken"])
|
||||||
|
|
||||||
|
objects: models.manager.BaseManager["CollectionItem"]
|
||||||
|
|
||||||
class Meta:
|
class Meta:
|
||||||
unique_together = ("uid", "collection")
|
unique_together = ("uid", "collection")
|
||||||
|
|
||||||
@ -97,23 +95,23 @@ class CollectionItem(models.Model):
|
|||||||
return "{} {}".format(self.uid, self.collection.uid)
|
return "{} {}".format(self.uid, self.collection.uid)
|
||||||
|
|
||||||
@cached_property
|
@cached_property
|
||||||
def content(self):
|
def content(self) -> "CollectionItemRevision":
|
||||||
return self.revisions.get(current=True)
|
return self.revisions.get(current=True)
|
||||||
|
|
||||||
@property
|
@property
|
||||||
def etag(self):
|
def etag(self) -> str:
|
||||||
return self.content.uid
|
return self.content.uid
|
||||||
|
|
||||||
|
|
||||||
def chunk_directory_path(instance, filename):
|
def chunk_directory_path(instance: "CollectionItemChunk", filename: str) -> Path:
|
||||||
custom_func = app_settings.CHUNK_PATH_FUNC
|
custom_func = app_settings.CHUNK_PATH_FUNC
|
||||||
if custom_func is not None:
|
if custom_func is not None:
|
||||||
return custom_func(instance, filename)
|
return custom_func(instance, filename)
|
||||||
|
|
||||||
col = instance.collection
|
col: Collection = instance.collection
|
||||||
user_id = col.owner.id
|
user_id: int = col.owner.id
|
||||||
uid_prefix = instance.uid[:2]
|
uid_prefix: str = instance.uid[:2]
|
||||||
uid_rest = instance.uid[2:]
|
uid_rest: str = instance.uid[2:]
|
||||||
return Path("user_{}".format(user_id), col.uid, uid_prefix, uid_rest)
|
return Path("user_{}".format(user_id), col.uid, uid_prefix, uid_rest)
|
||||||
|
|
||||||
|
|
||||||
@ -122,6 +120,8 @@ class CollectionItemChunk(models.Model):
|
|||||||
collection = models.ForeignKey(Collection, related_name="chunks", on_delete=models.CASCADE)
|
collection = models.ForeignKey(Collection, related_name="chunks", on_delete=models.CASCADE)
|
||||||
chunkFile = models.FileField(upload_to=chunk_directory_path, max_length=150, unique=True)
|
chunkFile = models.FileField(upload_to=chunk_directory_path, max_length=150, unique=True)
|
||||||
|
|
||||||
|
objects: models.manager.BaseManager["CollectionItemChunk"]
|
||||||
|
|
||||||
def __str__(self):
|
def __str__(self):
|
||||||
return self.uid
|
return self.uid
|
||||||
|
|
||||||
@ -134,6 +134,7 @@ def generate_stoken_uid():
|
|||||||
|
|
||||||
|
|
||||||
class Stoken(models.Model):
|
class Stoken(models.Model):
|
||||||
|
id: int
|
||||||
uid = models.CharField(
|
uid = models.CharField(
|
||||||
db_index=True,
|
db_index=True,
|
||||||
unique=True,
|
unique=True,
|
||||||
@ -144,6 +145,8 @@ class Stoken(models.Model):
|
|||||||
validators=[UidValidator],
|
validators=[UidValidator],
|
||||||
)
|
)
|
||||||
|
|
||||||
|
objects: models.manager.BaseManager["Stoken"]
|
||||||
|
|
||||||
|
|
||||||
class CollectionItemRevision(models.Model):
|
class CollectionItemRevision(models.Model):
|
||||||
stoken = models.OneToOneField(Stoken, on_delete=models.PROTECT)
|
stoken = models.OneToOneField(Stoken, on_delete=models.PROTECT)
|
||||||
@ -155,6 +158,8 @@ class CollectionItemRevision(models.Model):
|
|||||||
current = models.BooleanField(db_index=True, default=True, null=True)
|
current = models.BooleanField(db_index=True, default=True, null=True)
|
||||||
deleted = models.BooleanField(default=False)
|
deleted = models.BooleanField(default=False)
|
||||||
|
|
||||||
|
objects: models.manager.BaseManager["CollectionItemRevision"]
|
||||||
|
|
||||||
class Meta:
|
class Meta:
|
||||||
unique_together = ("item", "current")
|
unique_together = ("item", "current")
|
||||||
|
|
||||||
@ -166,6 +171,8 @@ class RevisionChunkRelation(models.Model):
|
|||||||
chunk = models.ForeignKey(CollectionItemChunk, related_name="revisions_relation", on_delete=models.CASCADE)
|
chunk = models.ForeignKey(CollectionItemChunk, related_name="revisions_relation", on_delete=models.CASCADE)
|
||||||
revision = models.ForeignKey(CollectionItemRevision, related_name="chunks_relation", on_delete=models.CASCADE)
|
revision = models.ForeignKey(CollectionItemRevision, related_name="chunks_relation", on_delete=models.CASCADE)
|
||||||
|
|
||||||
|
objects: models.manager.BaseManager["RevisionChunkRelation"]
|
||||||
|
|
||||||
class Meta:
|
class Meta:
|
||||||
ordering = ("id",)
|
ordering = ("id",)
|
||||||
|
|
||||||
@ -182,7 +189,14 @@ class CollectionMember(models.Model):
|
|||||||
user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
|
user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
|
||||||
encryptionKey = models.BinaryField(editable=True, blank=False, null=False)
|
encryptionKey = models.BinaryField(editable=True, blank=False, null=False)
|
||||||
collectionType = models.ForeignKey(CollectionType, on_delete=models.PROTECT, null=True)
|
collectionType = models.ForeignKey(CollectionType, on_delete=models.PROTECT, null=True)
|
||||||
accessLevel = models.IntegerField(choices=AccessLevels.choices, default=AccessLevels.READ_ONLY,)
|
accessLevel = models.IntegerField(
|
||||||
|
choices=AccessLevels.choices,
|
||||||
|
default=AccessLevels.READ_ONLY,
|
||||||
|
)
|
||||||
|
|
||||||
|
stoken_annotation = stoken_annotation_builder(["stoken"])
|
||||||
|
|
||||||
|
objects: models.manager.BaseManager["CollectionMember"]
|
||||||
|
|
||||||
class Meta:
|
class Meta:
|
||||||
unique_together = ("user", "collection")
|
unique_together = ("user", "collection")
|
||||||
@ -193,7 +207,11 @@ class CollectionMember(models.Model):
|
|||||||
def revoke(self):
|
def revoke(self):
|
||||||
with transaction.atomic():
|
with transaction.atomic():
|
||||||
CollectionMemberRemoved.objects.update_or_create(
|
CollectionMemberRemoved.objects.update_or_create(
|
||||||
collection=self.collection, user=self.user, defaults={"stoken": Stoken.objects.create(),},
|
collection=self.collection,
|
||||||
|
user=self.user,
|
||||||
|
defaults={
|
||||||
|
"stoken": Stoken.objects.create(),
|
||||||
|
},
|
||||||
)
|
)
|
||||||
|
|
||||||
self.delete()
|
self.delete()
|
||||||
@ -204,6 +222,8 @@ class CollectionMemberRemoved(models.Model):
|
|||||||
collection = models.ForeignKey(Collection, related_name="removed_members", on_delete=models.CASCADE)
|
collection = models.ForeignKey(Collection, related_name="removed_members", on_delete=models.CASCADE)
|
||||||
user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
|
user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
|
||||||
|
|
||||||
|
objects: models.manager.BaseManager["CollectionMemberRemoved"]
|
||||||
|
|
||||||
class Meta:
|
class Meta:
|
||||||
unique_together = ("user", "collection")
|
unique_together = ("user", "collection")
|
||||||
|
|
||||||
@ -220,7 +240,12 @@ class CollectionInvitation(models.Model):
|
|||||||
|
|
||||||
user = models.ForeignKey(settings.AUTH_USER_MODEL, related_name="incoming_invitations", on_delete=models.CASCADE)
|
user = models.ForeignKey(settings.AUTH_USER_MODEL, related_name="incoming_invitations", on_delete=models.CASCADE)
|
||||||
signedEncryptionKey = models.BinaryField(editable=False, blank=False, null=False)
|
signedEncryptionKey = models.BinaryField(editable=False, blank=False, null=False)
|
||||||
accessLevel = models.IntegerField(choices=AccessLevels.choices, default=AccessLevels.READ_ONLY,)
|
accessLevel = models.IntegerField(
|
||||||
|
choices=AccessLevels.choices,
|
||||||
|
default=AccessLevels.READ_ONLY,
|
||||||
|
)
|
||||||
|
|
||||||
|
objects: models.manager.BaseManager["CollectionInvitation"]
|
||||||
|
|
||||||
class Meta:
|
class Meta:
|
||||||
unique_together = ("user", "fromMember")
|
unique_together = ("user", "fromMember")
|
||||||
@ -229,7 +254,7 @@ class CollectionInvitation(models.Model):
|
|||||||
return "{} {}".format(self.fromMember.collection.uid, self.user)
|
return "{} {}".format(self.fromMember.collection.uid, self.user)
|
||||||
|
|
||||||
@cached_property
|
@cached_property
|
||||||
def collection(self):
|
def collection(self) -> Collection:
|
||||||
return self.fromMember.collection
|
return self.fromMember.collection
|
||||||
|
|
||||||
|
|
||||||
@ -241,5 +266,7 @@ class UserInfo(models.Model):
|
|||||||
encryptedContent = models.BinaryField(editable=True, blank=False, null=False)
|
encryptedContent = models.BinaryField(editable=True, blank=False, null=False)
|
||||||
salt = models.BinaryField(editable=True, blank=False, null=False)
|
salt = models.BinaryField(editable=True, blank=False, null=False)
|
||||||
|
|
||||||
|
objects: models.manager.BaseManager["UserInfo"]
|
||||||
|
|
||||||
def __str__(self):
|
def __str__(self):
|
||||||
return "UserInfo<{}>".format(self.owner)
|
return "UserInfo<{}>".format(self.owner)
|
||||||
|
@@ -1,16 +0,0 @@
-from rest_framework.parsers import FileUploadParser
-
-
-class ChunkUploadParser(FileUploadParser):
-    """
-    Parser for chunk upload data.
-    """
-
-    media_type = "application/octet-stream"
-
-    def get_filename(self, stream, media_type, parser_context):
-        """
-        Detects the uploaded file name.
-        """
-        view = parser_context["view"]
-        return parser_context["kwargs"][view.lookup_field]
@@ -1,93 +0,0 @@
-# Copyright © 2017 Tom Hacohen
-#
-# This program is free software: you can redistribute it and/or modify
-# it under the terms of the GNU Affero General Public License as
-# published by the Free Software Foundation, version 3.
-#
-# This library is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
-# GNU General Public License for more details.
-#
-# You should have received a copy of the GNU General Public License
-# along with this program.  If not, see <http://www.gnu.org/licenses/>.
-
-from rest_framework import permissions
-from django_etebase.models import Collection, AccessLevels
-
-
-def is_collection_admin(collection, user):
-    member = collection.members.filter(user=user).first()
-    return (member is not None) and (member.accessLevel == AccessLevels.ADMIN)
-
-
-class IsCollectionAdmin(permissions.BasePermission):
-    """
-    Custom permission to only allow owners of a collection to view it
-    """
-
-    message = {
-        "detail": "Only collection admins can perform this operation.",
-        "code": "admin_access_required",
-    }
-
-    def has_permission(self, request, view):
-        collection_uid = view.kwargs["collection_uid"]
-        try:
-            collection = view.get_collection_queryset().get(main_item__uid=collection_uid)
-            return is_collection_admin(collection, request.user)
-        except Collection.DoesNotExist:
-            # If the collection does not exist, we want to 404 later, not permission denied.
-            return True
-
-
-class IsCollectionAdminOrReadOnly(permissions.BasePermission):
-    """
-    Custom permission to only allow owners of a collection to edit it
-    """
-
-    message = {
-        "detail": "Only collection admins can edit collections.",
-        "code": "admin_access_required",
-    }
-
-    def has_permission(self, request, view):
-        collection_uid = view.kwargs.get("collection_uid", None)
-
-        # Allow creating new collections
-        if collection_uid is None:
-            return True
-
-        try:
-            collection = view.get_collection_queryset().get(main_item__uid=collection_uid)
-            if request.method in permissions.SAFE_METHODS:
-                return True
-
-            return is_collection_admin(collection, request.user)
-        except Collection.DoesNotExist:
-            # If the collection does not exist, we want to 404 later, not permission denied.
-            return True
-
-
-class HasWriteAccessOrReadOnly(permissions.BasePermission):
-    """
-    Custom permission to restrict write
-    """
-
-    message = {
-        "detail": "You need write access to write to this collection",
-        "code": "no_write_access",
-    }
-
-    def has_permission(self, request, view):
-        collection_uid = view.kwargs["collection_uid"]
-        try:
-            collection = view.get_collection_queryset().get(main_item__uid=collection_uid)
-            if request.method in permissions.SAFE_METHODS:
-                return True
-            else:
-                member = collection.members.get(user=request.user)
-                return member.accessLevel != AccessLevels.READ_ONLY
-        except Collection.DoesNotExist:
-            # If the collection does not exist, we want to 404 later, not permission denied.
-            return True
@@ -1,19 +0,0 @@
-from rest_framework.utils.encoders import JSONEncoder as DRFJSONEncoder
-from rest_framework.renderers import JSONRenderer as DRFJSONRenderer
-
-from .serializers import b64encode
-
-
-class JSONEncoder(DRFJSONEncoder):
-    def default(self, obj):
-        if isinstance(obj, bytes) or isinstance(obj, memoryview):
-            return b64encode(obj)
-        return super().default(obj)
-
-
-class JSONRenderer(DRFJSONRenderer):
-    """
-    Renderer which serializes to JSON with support for our base64
-    """
-
-    encoder_class = JSONEncoder
@@ -1,569 +0,0 @@
-# Copyright © 2017 Tom Hacohen
-#
-# This program is free software: you can redistribute it and/or modify
-# it under the terms of the GNU Affero General Public License as
-# published by the Free Software Foundation, version 3.
-#
-# This library is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
-# GNU General Public License for more details.
-#
-# You should have received a copy of the GNU General Public License
-# along with this program.  If not, see <http://www.gnu.org/licenses/>.
-
-import base64
-
-from django.core.files.base import ContentFile
-from django.core import exceptions as django_exceptions
-from django.contrib.auth import get_user_model
-from django.db import IntegrityError, transaction
-from rest_framework import serializers, status
-from . import models
-from .utils import get_user_queryset, create_user
-
-from .exceptions import EtebaseValidationError
-
-User = get_user_model()
-
-
-def process_revisions_for_item(item, revision_data):
-    chunks_objs = []
-    chunks = revision_data.pop("chunks_relation")
-
-    revision = models.CollectionItemRevision(**revision_data, item=item)
-    revision.validate_unique()  # Verify there aren't any validation issues
-
-    for chunk in chunks:
-        uid = chunk[0]
-        chunk_obj = models.CollectionItemChunk.objects.filter(uid=uid).first()
-        if len(chunk) > 1:
-            content = chunk[1]
-            # If the chunk already exists we assume it's fine. Otherwise, we upload it.
-            if chunk_obj is None:
-                chunk_obj = models.CollectionItemChunk(uid=uid, collection=item.collection)
-                chunk_obj.chunkFile.save("IGNORED", ContentFile(content))
-                chunk_obj.save()
-        else:
-            if chunk_obj is None:
-                raise EtebaseValidationError("chunk_no_content", "Tried to create a new chunk without content")
-
-        chunks_objs.append(chunk_obj)
-
-    stoken = models.Stoken.objects.create()
-    revision.stoken = stoken
-    revision.save()
-
-    for chunk in chunks_objs:
-        models.RevisionChunkRelation.objects.create(chunk=chunk, revision=revision)
-    return revision
-
-
-def b64encode(value):
-    return base64.urlsafe_b64encode(value).decode("ascii").strip("=")
-
-
-def b64decode(data):
-    data += "=" * ((4 - len(data) % 4) % 4)
-    return base64.urlsafe_b64decode(data)
-
-
-def b64decode_or_bytes(data):
-    if isinstance(data, bytes):
-        return data
-    else:
-        return b64decode(data)
-
-
-class BinaryBase64Field(serializers.Field):
-    def to_representation(self, value):
-        return value
-
-    def to_internal_value(self, data):
-        return b64decode_or_bytes(data)
-
-
-class CollectionEncryptionKeyField(BinaryBase64Field):
-    def get_attribute(self, instance):
-        request = self.context.get("request", None)
-        if request is not None:
-            return instance.members.get(user=request.user).encryptionKey
-        return None
-
-
-class CollectionTypeField(BinaryBase64Field):
-    def get_attribute(self, instance):
-        request = self.context.get("request", None)
-        if request is not None:
-            collection_type = instance.members.get(user=request.user).collectionType
-            return collection_type and collection_type.uid
-        return None
-
-
-class UserSlugRelatedField(serializers.SlugRelatedField):
-    def get_queryset(self):
-        view = self.context.get("view", None)
-        return get_user_queryset(super().get_queryset(), view)
-
-    def __init__(self, **kwargs):
-        super().__init__(slug_field=User.USERNAME_FIELD, **kwargs)
-
-    def to_internal_value(self, data):
-        return super().to_internal_value(data.lower())
-
-
-class ChunksField(serializers.RelatedField):
-    def to_representation(self, obj):
-        obj = obj.chunk
-        if self.context.get("prefetch") == "auto":
-            with open(obj.chunkFile.path, "rb") as f:
-                return (obj.uid, f.read())
-        else:
-            return (obj.uid,)
-
-    def to_internal_value(self, data):
-        if data[0] is None or data[1] is None:
-            raise EtebaseValidationError("no_null", "null is not allowed")
-        return (data[0], b64decode_or_bytes(data[1]))
-
-
-class BetterErrorsMixin:
-    @property
-    def errors(self):
-        nice = []
-        errors = super().errors
-        for error_type in errors:
-            if error_type == "non_field_errors":
-                nice.extend(self.flatten_errors(None, errors[error_type]))
-            else:
-                nice.extend(self.flatten_errors(error_type, errors[error_type]))
-        if nice:
-            return {"code": "field_errors", "detail": "Field validations failed.", "errors": nice}
-        return {}
-
-    def flatten_errors(self, field_name, errors):
-        ret = []
-        if isinstance(errors, dict):
-            for error_key in errors:
-                error = errors[error_key]
-                ret.extend(self.flatten_errors("{}.{}".format(field_name, error_key), error))
-        else:
-            for error in errors:
-                if hasattr(error, "detail"):
-                    message = error.detail[0]
-                elif hasattr(error, "message"):
-                    message = error.message
-                else:
-                    message = str(error)
-                ret.append(
-                    {"field": field_name, "code": error.code, "detail": message,}
-                )
-        return ret
-
-    def transform_validation_error(self, prefix, err):
-        if hasattr(err, "error_dict"):
-            errors = self.flatten_errors(prefix, err.error_dict)
-        elif not hasattr(err, "message"):
-            errors = self.flatten_errors(prefix, err.error_list)
-        else:
-            raise EtebaseValidationError(err.code, err.message)
-
-        raise serializers.ValidationError(
-            {"code": "field_errors", "detail": "Field validations failed.", "errors": errors,}
-        )
-
-
-class CollectionItemChunkSerializer(BetterErrorsMixin, serializers.ModelSerializer):
-    class Meta:
-        model = models.CollectionItemChunk
-        fields = ("uid", "chunkFile")
-
-
-class CollectionItemRevisionSerializer(BetterErrorsMixin, serializers.ModelSerializer):
-    chunks = ChunksField(
-        source="chunks_relation",
-        queryset=models.RevisionChunkRelation.objects.all(),
-        style={"base_template": "input.html"},
-        many=True,
-    )
-    meta = BinaryBase64Field()
-
-    class Meta:
-        model = models.CollectionItemRevision
-        fields = ("chunks", "meta", "uid", "deleted")
-        extra_kwargs = {
-            "uid": {"validators": []},  # We deal with it in the serializers
-        }
-
-
-class CollectionItemSerializer(BetterErrorsMixin, serializers.ModelSerializer):
-    encryptionKey = BinaryBase64Field(required=False, default=None, allow_null=True)
-    etag = serializers.CharField(allow_null=True, write_only=True)
-    content = CollectionItemRevisionSerializer(many=False)
-
-    class Meta:
-        model = models.CollectionItem
-        fields = ("uid", "version", "encryptionKey", "content", "etag")
-
-    def create(self, validated_data):
-        """Function that's called when this serializer creates an item"""
-        validate_etag = self.context.get("validate_etag", False)
-        etag = validated_data.pop("etag")
-        revision_data = validated_data.pop("content")
-        uid = validated_data.pop("uid")
-
-        Model = self.__class__.Meta.model
-
-        with transaction.atomic():
-            instance, created = Model.objects.get_or_create(uid=uid, defaults=validated_data)
-            cur_etag = instance.etag if not created else None
-
-            # If we are trying to update an up to date item, abort early and consider it a success
-            if cur_etag == revision_data.get("uid"):
-                return instance
-
-            if validate_etag and cur_etag != etag:
-                raise EtebaseValidationError(
-                    "wrong_etag",
-                    "Wrong etag. Expected {} got {}".format(cur_etag, etag),
-                    status_code=status.HTTP_409_CONFLICT,
-                )
-
-            if not created:
-                # We don't have to use select_for_update here because the unique constraint on current guards against
-                # the race condition. But it's a good idea because it'll lock and wait rather than fail.
-                current_revision = instance.revisions.filter(current=True).select_for_update().first()
-                current_revision.current = None
-                current_revision.save()
-
-            try:
-                process_revisions_for_item(instance, revision_data)
-            except django_exceptions.ValidationError as e:
-                self.transform_validation_error("content", e)
-
-        return instance
-
-    def update(self, instance, validated_data):
-        # We never update, we always update in the create method
-        raise NotImplementedError()
-
-
-class CollectionItemDepSerializer(BetterErrorsMixin, serializers.ModelSerializer):
-    etag = serializers.CharField()
-
-    class Meta:
-        model = models.CollectionItem
-        fields = ("uid", "etag")
-
-    def validate(self, data):
-        item = self.__class__.Meta.model.objects.get(uid=data["uid"])
-        etag = data["etag"]
-        if item.etag != etag:
-            raise EtebaseValidationError(
-                "wrong_etag",
-                "Wrong etag. Expected {} got {}".format(item.etag, etag),
-                status_code=status.HTTP_409_CONFLICT,
-            )
-
-        return data
-
-
-class CollectionItemBulkGetSerializer(BetterErrorsMixin, serializers.ModelSerializer):
-    etag = serializers.CharField(required=False)
-
-    class Meta:
-        model = models.CollectionItem
-        fields = ("uid", "etag")
-
-
-class CollectionListMultiSerializer(BetterErrorsMixin, serializers.Serializer):
-    collectionTypes = serializers.ListField(child=BinaryBase64Field())
-
-
-class CollectionSerializer(BetterErrorsMixin, serializers.ModelSerializer):
-    collectionKey = CollectionEncryptionKeyField()
-    collectionType = CollectionTypeField()
-    accessLevel = serializers.SerializerMethodField("get_access_level_from_context")
-    stoken = serializers.CharField(read_only=True)
-
-    item = CollectionItemSerializer(many=False, source="main_item")
-
-    class Meta:
-        model = models.Collection
-        fields = ("item", "accessLevel", "collectionKey", "collectionType", "stoken")
-
-    def get_access_level_from_context(self, obj):
-        request = self.context.get("request", None)
-        if request is not None:
-            return obj.members.get(user=request.user).accessLevel
-        return None
-
-    def create(self, validated_data):
-        """Function that's called when this serializer creates an item"""
-        collection_key = validated_data.pop("collectionKey")
-        collection_type = validated_data.pop("collectionType")
-
-        user = validated_data.get("owner")
-        main_item_data = validated_data.pop("main_item")
-        etag = main_item_data.pop("etag")
-        revision_data = main_item_data.pop("content")
-
-        instance = self.__class__.Meta.model(**validated_data)
-
-        with transaction.atomic():
-            _ = self.__class__.Meta.model.objects.select_for_update().filter(owner=user)
-            if etag is not None:
-                raise EtebaseValidationError("bad_etag", "etag is not null")
-
-            instance.save()
-            main_item = models.CollectionItem.objects.create(**main_item_data, collection=instance)
-
-            instance.main_item = main_item
-
-            instance.full_clean()
-            instance.save()
-
-            process_revisions_for_item(main_item, revision_data)
-
-            collection_type_obj, _ = models.CollectionType.objects.get_or_create(uid=collection_type, owner=user)
-
-            models.CollectionMember(
-                collection=instance,
-                stoken=models.Stoken.objects.create(),
-                user=user,
-                accessLevel=models.AccessLevels.ADMIN,
-                encryptionKey=collection_key,
-                collectionType=collection_type_obj,
-            ).save()
-
-        return instance
-
-    def update(self, instance, validated_data):
-        raise NotImplementedError()
-
-
-class CollectionMemberSerializer(BetterErrorsMixin, serializers.ModelSerializer):
-    username = UserSlugRelatedField(source="user", read_only=True, style={"base_template": "input.html"},)
-
-    class Meta:
-        model = models.CollectionMember
-        fields = ("username", "accessLevel")
-
-    def create(self, validated_data):
-        raise NotImplementedError()
-
-    def update(self, instance, validated_data):
-        with transaction.atomic():
-            # We only allow updating accessLevel
-            access_level = validated_data.pop("accessLevel")
-            if instance.accessLevel != access_level:
-                instance.stoken = models.Stoken.objects.create()
-                instance.accessLevel = access_level
-                instance.save()
-
-        return instance
-
-
-class CollectionInvitationSerializer(BetterErrorsMixin, serializers.ModelSerializer):
-    username = UserSlugRelatedField(source="user", queryset=User.objects, style={"base_template": "input.html"},)
-    collection = serializers.CharField(source="collection.uid")
-    fromUsername = BinaryBase64Field(source="fromMember.user.username", read_only=True)
-    fromPubkey = BinaryBase64Field(source="fromMember.user.userinfo.pubkey", read_only=True)
-    signedEncryptionKey = BinaryBase64Field()
-
-    class Meta:
-        model = models.CollectionInvitation
-        fields = (
-            "username",
-            "uid",
-            "collection",
-            "signedEncryptionKey",
-            "accessLevel",
-            "fromUsername",
-            "fromPubkey",
-            "version",
-        )
-
-    def validate_user(self, value):
-        request = self.context["request"]
-
-        if request.user.username == value.lower():
-            raise EtebaseValidationError("no_self_invite", "Inviting yourself is not allowed")
-        return value
-
-    def create(self, validated_data):
-        request = self.context["request"]
-        collection = validated_data.pop("collection")
-
-        member = collection.members.get(user=request.user)
-
-        with transaction.atomic():
-            try:
-                return type(self).Meta.model.objects.create(**validated_data, fromMember=member)
-            except IntegrityError:
-                raise EtebaseValidationError("invitation_exists", "Invitation already exists")
-
-    def update(self, instance, validated_data):
-        with transaction.atomic():
-            instance.accessLevel = validated_data.pop("accessLevel")
-            instance.signedEncryptionKey = validated_data.pop("signedEncryptionKey")
-            instance.save()
-
-        return instance
-
-
-class InvitationAcceptSerializer(BetterErrorsMixin, serializers.Serializer):
-    collectionType = BinaryBase64Field()
-    encryptionKey = BinaryBase64Field()
-
-    def create(self, validated_data):
-
-        with transaction.atomic():
-            invitation = self.context["invitation"]
-            encryption_key = validated_data.get("encryptionKey")
-            collection_type = validated_data.pop("collectionType")
-
-            user = invitation.user
-            collection_type_obj, _ = models.CollectionType.objects.get_or_create(uid=collection_type, owner=user)
-
-            member = models.CollectionMember.objects.create(
-                collection=invitation.collection,
-                stoken=models.Stoken.objects.create(),
-                user=user,
-                accessLevel=invitation.accessLevel,
-                encryptionKey=encryption_key,
-                collectionType=collection_type_obj,
-            )
-
-            models.CollectionMemberRemoved.objects.filter(
-                user=invitation.user, collection=invitation.collection
-            ).delete()
-
-            invitation.delete()
-
-        return member
-
-    def update(self, instance, validated_data):
-        raise NotImplementedError()
-
-
-class UserSerializer(BetterErrorsMixin, serializers.ModelSerializer):
-    pubkey = BinaryBase64Field(source="userinfo.pubkey")
-    encryptedContent = BinaryBase64Field(source="userinfo.encryptedContent")
-
-    class Meta:
-        model = User
-        fields = (User.USERNAME_FIELD, User.EMAIL_FIELD, "pubkey", "encryptedContent")
-
-
-class UserInfoPubkeySerializer(BetterErrorsMixin, serializers.ModelSerializer):
-    pubkey = BinaryBase64Field()
-
-    class Meta:
-        model = models.UserInfo
-        fields = ("pubkey",)
-
-
-class UserSignupSerializer(BetterErrorsMixin, serializers.ModelSerializer):
-    class Meta:
-        model = User
-        fields = (User.USERNAME_FIELD, User.EMAIL_FIELD)
-        extra_kwargs = {
-            "username": {"validators": []},  # We specifically validate in SignupSerializer
-        }
-
-
-class AuthenticationSignupSerializer(BetterErrorsMixin, serializers.Serializer):
|
|
||||||
"""Used both for creating new accounts and setting up existing ones for the first time.
|
|
||||||
When setting up existing ones the email is ignored."
|
|
||||||
"""
|
|
||||||
|
|
||||||
user = UserSignupSerializer(many=False)
|
|
||||||
salt = BinaryBase64Field()
|
|
||||||
loginPubkey = BinaryBase64Field()
|
|
||||||
pubkey = BinaryBase64Field()
|
|
||||||
encryptedContent = BinaryBase64Field()
|
|
||||||
|
|
||||||
def create(self, validated_data):
|
|
||||||
"""Function that's called when this serializer creates an item"""
|
|
||||||
user_data = validated_data.pop("user")
|
|
||||||
|
|
||||||
with transaction.atomic():
|
|
||||||
try:
|
|
||||||
view = self.context.get("view", None)
|
|
||||||
user_queryset = get_user_queryset(User.objects.all(), view)
|
|
||||||
instance = user_queryset.get(**{User.USERNAME_FIELD: user_data["username"].lower()})
|
|
||||||
except User.DoesNotExist:
|
|
||||||
# Create the user and save the casing the user chose as the first name
|
|
||||||
try:
|
|
||||||
instance = create_user(**user_data, password=None, first_name=user_data["username"], view=view)
|
|
||||||
instance.clean_fields()
|
|
||||||
except EtebaseValidationError as e:
|
|
||||||
raise e
|
|
||||||
except django_exceptions.ValidationError as e:
|
|
||||||
self.transform_validation_error("user", e)
|
|
||||||
except Exception as e:
|
|
||||||
raise EtebaseValidationError("generic", str(e))
|
|
||||||
|
|
||||||
if hasattr(instance, "userinfo"):
|
|
||||||
raise EtebaseValidationError("user_exists", "User already exists", status_code=status.HTTP_409_CONFLICT)
|
|
||||||
|
|
||||||
models.UserInfo.objects.create(**validated_data, owner=instance)
|
|
||||||
|
|
||||||
return instance
|
|
||||||
|
|
||||||
def update(self, instance, validated_data):
|
|
||||||
raise NotImplementedError()
|
|
||||||
|
|
||||||
|
|
||||||
class AuthenticationLoginChallengeSerializer(BetterErrorsMixin, serializers.Serializer):
|
|
||||||
username = serializers.CharField(required=True)
|
|
||||||
|
|
||||||
def create(self, validated_data):
|
|
||||||
raise NotImplementedError()
|
|
||||||
|
|
||||||
def update(self, instance, validated_data):
|
|
||||||
raise NotImplementedError()
|
|
||||||
|
|
||||||
|
|
||||||
class AuthenticationLoginSerializer(BetterErrorsMixin, serializers.Serializer):
|
|
||||||
response = BinaryBase64Field()
|
|
||||||
signature = BinaryBase64Field()
|
|
||||||
|
|
||||||
def create(self, validated_data):
|
|
||||||
raise NotImplementedError()
|
|
||||||
|
|
||||||
def update(self, instance, validated_data):
|
|
||||||
raise NotImplementedError()
|
|
||||||
|
|
||||||
|
|
||||||
class AuthenticationLoginInnerSerializer(AuthenticationLoginChallengeSerializer):
|
|
||||||
challenge = BinaryBase64Field()
|
|
||||||
host = serializers.CharField()
|
|
||||||
action = serializers.CharField()
|
|
||||||
|
|
||||||
def create(self, validated_data):
|
|
||||||
raise NotImplementedError()
|
|
||||||
|
|
||||||
def update(self, instance, validated_data):
|
|
||||||
raise NotImplementedError()
|
|
||||||
|
|
||||||
|
|
||||||
class AuthenticationChangePasswordInnerSerializer(AuthenticationLoginInnerSerializer):
|
|
||||||
loginPubkey = BinaryBase64Field()
|
|
||||||
encryptedContent = BinaryBase64Field()
|
|
||||||
|
|
||||||
class Meta:
|
|
||||||
model = models.UserInfo
|
|
||||||
fields = ("loginPubkey", "encryptedContent")
|
|
||||||
|
|
||||||
def create(self, validated_data):
|
|
||||||
raise NotImplementedError()
|
|
||||||
|
|
||||||
def update(self, instance, validated_data):
|
|
||||||
with transaction.atomic():
|
|
||||||
instance.loginPubkey = validated_data.pop("loginPubkey")
|
|
||||||
instance.encryptedContent = validated_data.pop("encryptedContent")
|
|
||||||
instance.save()
|
|
||||||
|
|
||||||
return instance
|
|
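The serializers above expose binary fields (salt, pubkey, signedEncryptionKey, encryptedContent, …) through `BinaryBase64Field`, which carries raw bytes over JSON/msgpack as base64 text. A minimal sketch of the idea, assuming urlsafe base64 with the padding stripped; the helper names `to_base64`/`from_base64` are illustrative, not necessarily the project's actual implementation:

```python
import base64


def to_base64(data: bytes) -> str:
    # Encode bytes as urlsafe base64 and strip the "=" padding for transport
    return base64.urlsafe_b64encode(data).decode("ascii").rstrip("=")


def from_base64(value: str) -> bytes:
    # Re-add the padding that was stripped before decoding back to bytes
    return base64.urlsafe_b64decode(value + "=" * (-len(value) % 4))
```

The round trip is lossless for arbitrary byte strings, which is what lets opaque key material pass through the serializer layer unchanged.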
@@ -1,3 +0,0 @@
-from django.test import TestCase
-
-# Create your tests here.
@@ -1,46 +0,0 @@
-from django.utils import timezone
-from django.utils.translation import gettext_lazy as _
-
-from rest_framework import exceptions
-from rest_framework.authentication import TokenAuthentication as DRFTokenAuthentication
-
-from .models import AuthToken, get_default_expiry
-
-
-AUTO_REFRESH = True
-MIN_REFRESH_INTERVAL = 60
-
-
-class TokenAuthentication(DRFTokenAuthentication):
-    keyword = "Token"
-    model = AuthToken
-
-    def authenticate_credentials(self, key):
-        msg = _("Invalid token.")
-        model = self.get_model()
-        try:
-            token = model.objects.select_related("user").get(key=key)
-        except model.DoesNotExist:
-            raise exceptions.AuthenticationFailed(msg)
-
-        if not token.user.is_active:
-            raise exceptions.AuthenticationFailed(_("User inactive or deleted."))
-
-        if token.expiry is not None:
-            if token.expiry < timezone.now():
-                token.delete()
-                raise exceptions.AuthenticationFailed(msg)
-
-            if AUTO_REFRESH:
-                self.renew_token(token)
-
-        return (token.user, token)
-
-    def renew_token(self, auth_token):
-        current_expiry = auth_token.expiry
-        new_expiry = get_default_expiry()
-        # Throttle refreshing of token to avoid db writes
-        delta = (new_expiry - current_expiry).total_seconds()
-        if delta > MIN_REFRESH_INTERVAL:
-            auth_token.expiry = new_expiry
-            auth_token.save(update_fields=("expiry",))
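The deleted `renew_token` above throttles expiry refreshes so a busy client does not trigger a database write on every authenticated request. The core check, pulled out as a self-contained sketch (`should_renew` is a hypothetical name, not part of the original module):

```python
from datetime import datetime, timedelta

MIN_REFRESH_INTERVAL = 60  # seconds, same constant as in the deleted module


def should_renew(current_expiry: datetime, new_expiry: datetime) -> bool:
    # Only persist the new expiry when it moves forward by more than the
    # throttle interval; otherwise skip the db write entirely.
    delta = (new_expiry - current_expiry).total_seconds()
    return delta > MIN_REFRESH_INTERVAL
```

With a one-hour default token lifetime, this means at most one expiry write per minute per token regardless of request rate.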
@@ -1,9 +1,9 @@
-from django.contrib.auth import get_user_model
 from django.db import models
 from django.utils import timezone
 from django.utils.crypto import get_random_string
+from myauth.models import get_typed_user_model

-User = get_user_model()
+User = get_typed_user_model()


 def generate_key():
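`generate_key` (whose body falls outside this hunk) produces the random key for each `AuthToken`. A hedged sketch of what such a helper typically does; the 40-character alphanumeric default is an assumption, and `secrets` stands in for Django's `get_random_string`:

```python
import secrets
import string


def generate_key(length: int = 40) -> str:
    # Hypothetical stand-in for django.utils.crypto.get_random_string:
    # draw `length` characters from an alphanumeric alphabet via a CSPRNG.
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))
```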
@@ -1,30 +0,0 @@
-from django.conf import settings
-from django.conf.urls import include
-from django.urls import path
-
-from rest_framework_nested import routers
-
-from django_etebase import views
-
-router = routers.DefaultRouter()
-router.register(r"collection", views.CollectionViewSet)
-router.register(r"authentication", views.AuthenticationViewSet, basename="authentication")
-router.register(r"invitation/incoming", views.InvitationIncomingViewSet, basename="invitation_incoming")
-router.register(r"invitation/outgoing", views.InvitationOutgoingViewSet, basename="invitation_outgoing")
-
-collections_router = routers.NestedSimpleRouter(router, r"collection", lookup="collection")
-collections_router.register(r"item", views.CollectionItemViewSet, basename="collection_item")
-collections_router.register(r"member", views.CollectionMemberViewSet, basename="collection_member")
-
-item_router = routers.NestedSimpleRouter(collections_router, r"item", lookup="collection_item")
-item_router.register(r"chunk", views.CollectionItemChunkViewSet, basename="collection_items_chunk")
-
-if settings.DEBUG:
-    router.register(r"test/authentication", views.TestAuthenticationViewSet, basename="test_authentication")
-
-app_name = "django_etebase"
-urlpatterns = [
-    path("v1/", include(router.urls)),
-    path("v1/", include(collections_router.urls)),
-    path("v1/", include(item_router.urls)),
-]
@@ -1,24 +1,35 @@
-from django.contrib.auth import get_user_model
+import typing as t
+from dataclasses import dataclass

+from django.db.models import QuerySet
 from django.core.exceptions import PermissionDenied
+from myauth.models import UserType, get_typed_user_model

 from . import app_settings


-User = get_user_model()
+User = get_typed_user_model()


-def get_user_queryset(queryset, view):
+@dataclass
+class CallbackContext:
+    """Class for passing extra context to callbacks"""
+
+    url_kwargs: t.Dict[str, t.Any]
+    user: t.Optional[UserType] = None
+
+
+def get_user_queryset(queryset: QuerySet[UserType], context: CallbackContext) -> QuerySet[UserType]:
     custom_func = app_settings.GET_USER_QUERYSET_FUNC
     if custom_func is not None:
-        return custom_func(queryset, view)
+        return custom_func(queryset, context)
     return queryset


-def create_user(*args, **kwargs):
+def create_user(context: CallbackContext, *args, **kwargs) -> UserType:
     custom_func = app_settings.CREATE_USER_FUNC
     if custom_func is not None:
-        return custom_func(*args, **kwargs)
+        return custom_func(context, *args, **kwargs)
-    _ = kwargs.pop("view")
     return User.objects.create_user(*args, **kwargs)
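This hunk implements the 0.7.0 changelog entry "Pass generic context to callbacks instead of the whole view": deployment-provided `GET_USER_QUERYSET_FUNC`/`CREATE_USER_FUNC` hooks now receive a small `CallbackContext` dataclass instead of the DRF view object. A self-contained sketch of how a hook changes; the hook body and the `organization` URL kwarg are hypothetical:

```python
import typing as t
from dataclasses import dataclass


@dataclass
class CallbackContext:
    """Generic context passed to callbacks, mirroring the new definition"""

    url_kwargs: t.Dict[str, t.Any]
    user: t.Optional[object] = None  # UserType in the real code


# A hypothetical CREATE_USER_FUNC: before this change it received the DRF
# view; now it only sees the generic context, so it no longer needs to
# know anything about rest_framework internals.
def create_user_hook(context: CallbackContext, username: str, **kwargs) -> dict:
    # The hook can still read URL parameters through the context
    return {"username": username, "organization": context.url_kwargs.get("organization")}
```

Decoupling the hooks from the view object is what also removed the old `_ = kwargs.pop("view")` workaround in the default `create_user` path.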
@ -1,868 +0,0 @@
|
|||||||
# Copyright © 2017 Tom Hacohen
|
|
||||||
#
|
|
||||||
# This program is free software: you can redistribute it and/or modify
|
|
||||||
# it under the terms of the GNU Affero General Public License as
|
|
||||||
# published by the Free Software Foundation, version 3.
|
|
||||||
#
|
|
||||||
# This library is distributed in the hope that it will be useful,
|
|
||||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
|
||||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
|
||||||
# GNU General Public License for more details.
|
|
||||||
#
|
|
||||||
# You should have received a copy of the GNU General Public License
|
|
||||||
# along with this program. If not, see <http://www.gnu.org/licenses/>.
|
|
||||||
|
|
||||||
import msgpack
|
|
||||||
|
|
||||||
from django.conf import settings
|
|
||||||
from django.contrib.auth import get_user_model, user_logged_in, user_logged_out
|
|
||||||
from django.core.exceptions import PermissionDenied
|
|
||||||
from django.db import transaction, IntegrityError
|
|
||||||
from django.db.models import Max, Value as V, Q
|
|
||||||
from django.db.models.functions import Coalesce, Greatest
|
|
||||||
from django.http import HttpResponseBadRequest, HttpResponse, Http404
|
|
||||||
from django.shortcuts import get_object_or_404
|
|
||||||
|
|
||||||
from rest_framework import status
|
|
||||||
from rest_framework import viewsets
|
|
||||||
from rest_framework.decorators import action as action_decorator
|
|
||||||
from rest_framework.response import Response
|
|
||||||
from rest_framework.parsers import JSONParser, FormParser, MultiPartParser
|
|
||||||
from rest_framework.renderers import BrowsableAPIRenderer
|
|
||||||
from rest_framework.exceptions import AuthenticationFailed
|
|
||||||
from rest_framework.permissions import IsAuthenticated
|
|
||||||
|
|
||||||
import nacl.encoding
|
|
||||||
import nacl.signing
|
|
||||||
import nacl.secret
|
|
||||||
import nacl.hash
|
|
||||||
|
|
||||||
from .token_auth.models import AuthToken
|
|
||||||
|
|
||||||
from .drf_msgpack.parsers import MessagePackParser
|
|
||||||
from .drf_msgpack.renderers import MessagePackRenderer
|
|
||||||
|
|
||||||
from . import app_settings, permissions
|
|
||||||
from .renderers import JSONRenderer
|
|
||||||
from .models import (
|
|
||||||
Collection,
|
|
||||||
CollectionItem,
|
|
||||||
CollectionItemRevision,
|
|
||||||
CollectionMember,
|
|
||||||
CollectionMemberRemoved,
|
|
||||||
CollectionInvitation,
|
|
||||||
Stoken,
|
|
||||||
UserInfo,
|
|
||||||
)
|
|
||||||
from .serializers import (
|
|
||||||
AuthenticationChangePasswordInnerSerializer,
|
|
||||||
AuthenticationSignupSerializer,
|
|
||||||
AuthenticationLoginChallengeSerializer,
|
|
||||||
AuthenticationLoginSerializer,
|
|
||||||
AuthenticationLoginInnerSerializer,
|
|
||||||
CollectionSerializer,
|
|
||||||
CollectionItemSerializer,
|
|
||||||
CollectionItemBulkGetSerializer,
|
|
||||||
CollectionItemDepSerializer,
|
|
||||||
CollectionItemRevisionSerializer,
|
|
||||||
CollectionItemChunkSerializer,
|
|
||||||
CollectionListMultiSerializer,
|
|
||||||
CollectionMemberSerializer,
|
|
||||||
CollectionInvitationSerializer,
|
|
||||||
InvitationAcceptSerializer,
|
|
||||||
UserInfoPubkeySerializer,
|
|
||||||
UserSerializer,
|
|
||||||
)
|
|
||||||
from .utils import get_user_queryset
|
|
||||||
from .exceptions import EtebaseValidationError
|
|
||||||
from .parsers import ChunkUploadParser
|
|
||||||
from .signals import user_signed_up
|
|
||||||
|
|
||||||
User = get_user_model()
|
|
||||||
|
|
||||||
|
|
||||||
def msgpack_encode(content):
|
|
||||||
return msgpack.packb(content, use_bin_type=True)
|
|
||||||
|
|
||||||
|
|
||||||
def msgpack_decode(content):
|
|
||||||
return msgpack.unpackb(content, raw=False)
|
|
||||||
|
|
||||||
|
|
||||||
class BaseViewSet(viewsets.ModelViewSet):
|
|
||||||
authentication_classes = tuple(app_settings.API_AUTHENTICATORS)
|
|
||||||
permission_classes = tuple(app_settings.API_PERMISSIONS)
|
|
||||||
renderer_classes = [JSONRenderer, MessagePackRenderer] + ([BrowsableAPIRenderer] if settings.DEBUG else [])
|
|
||||||
parser_classes = [JSONParser, MessagePackParser, FormParser, MultiPartParser]
|
|
||||||
stoken_id_fields = None
|
|
||||||
|
|
||||||
def get_serializer_class(self):
|
|
||||||
serializer_class = self.serializer_class
|
|
||||||
|
|
||||||
if self.request.method == "PUT":
|
|
||||||
serializer_class = getattr(self, "serializer_update_class", serializer_class)
|
|
||||||
|
|
||||||
return serializer_class
|
|
||||||
|
|
||||||
def get_collection_queryset(self, queryset=Collection.objects):
|
|
||||||
user = self.request.user
|
|
||||||
return queryset.filter(members__user=user)
|
|
||||||
|
|
||||||
def get_stoken_obj_id(self, request):
|
|
||||||
return request.GET.get("stoken", None)
|
|
||||||
|
|
||||||
def get_stoken_obj(self, request):
|
|
||||||
stoken = self.get_stoken_obj_id(request)
|
|
||||||
|
|
||||||
if stoken is not None:
|
|
||||||
try:
|
|
||||||
return Stoken.objects.get(uid=stoken)
|
|
||||||
except Stoken.DoesNotExist:
|
|
||||||
raise EtebaseValidationError("bad_stoken", "Invalid stoken.", status_code=status.HTTP_400_BAD_REQUEST)
|
|
||||||
|
|
||||||
return None
|
|
||||||
|
|
||||||
def filter_by_stoken(self, request, queryset):
|
|
||||||
stoken_rev = self.get_stoken_obj(request)
|
|
||||||
|
|
||||||
aggr_fields = [Coalesce(Max(field), V(0)) for field in self.stoken_id_fields]
|
|
||||||
max_stoken = Greatest(*aggr_fields) if len(aggr_fields) > 1 else aggr_fields[0]
|
|
||||||
queryset = queryset.annotate(max_stoken=max_stoken).order_by("max_stoken")
|
|
||||||
|
|
||||||
if stoken_rev is not None:
|
|
||||||
queryset = queryset.filter(max_stoken__gt=stoken_rev.id)
|
|
||||||
|
|
||||||
return queryset, stoken_rev
|
|
||||||
|
|
||||||
def get_queryset_stoken(self, queryset):
|
|
||||||
maxid = -1
|
|
||||||
for row in queryset:
|
|
||||||
rowmaxid = getattr(row, "max_stoken") or -1
|
|
||||||
maxid = max(maxid, rowmaxid)
|
|
||||||
new_stoken = (maxid >= 0) and Stoken.objects.get(id=maxid)
|
|
||||||
|
|
||||||
return new_stoken or None
|
|
||||||
|
|
||||||
def filter_by_stoken_and_limit(self, request, queryset):
|
|
||||||
limit = int(request.GET.get("limit", 50))
|
|
||||||
|
|
||||||
queryset, stoken_rev = self.filter_by_stoken(request, queryset)
|
|
||||||
|
|
||||||
result = list(queryset[: limit + 1])
|
|
||||||
if len(result) < limit + 1:
|
|
||||||
done = True
|
|
||||||
else:
|
|
||||||
done = False
|
|
||||||
result = result[:-1]
|
|
||||||
|
|
||||||
new_stoken_obj = self.get_queryset_stoken(result) or stoken_rev
|
|
||||||
|
|
||||||
return result, new_stoken_obj, done
|
|
||||||
|
|
||||||
# Change how our list works by default
|
|
||||||
def list(self, request, collection_uid=None, *args, **kwargs):
|
|
||||||
queryset = self.get_queryset()
|
|
||||||
serializer = self.get_serializer(queryset, many=True)
|
|
||||||
|
|
||||||
ret = {
|
|
||||||
"data": serializer.data,
|
|
||||||
"done": True, # we always return all the items, so it's always done
|
|
||||||
}
|
|
||||||
|
|
||||||
return Response(ret)
|
|
||||||
|
|
||||||
|
|
||||||
class CollectionViewSet(BaseViewSet):
|
|
||||||
allowed_methods = ["GET", "POST"]
|
|
||||||
permission_classes = BaseViewSet.permission_classes + (permissions.IsCollectionAdminOrReadOnly,)
|
|
||||||
queryset = Collection.objects.all()
|
|
||||||
serializer_class = CollectionSerializer
|
|
||||||
lookup_field = "main_item__uid"
|
|
||||||
lookup_url_kwarg = "uid"
|
|
||||||
stoken_id_fields = ["items__revisions__stoken__id", "members__stoken__id"]
|
|
||||||
|
|
||||||
def get_queryset(self, queryset=None):
|
|
||||||
if queryset is None:
|
|
||||||
queryset = type(self).queryset
|
|
||||||
return self.get_collection_queryset(queryset)
|
|
||||||
|
|
||||||
def get_serializer_context(self):
|
|
||||||
context = super().get_serializer_context()
|
|
||||||
prefetch = self.request.query_params.get("prefetch", "auto")
|
|
||||||
context.update({"request": self.request, "prefetch": prefetch})
|
|
||||||
return context
|
|
||||||
|
|
||||||
def destroy(self, request, uid=None, *args, **kwargs):
|
|
||||||
# FIXME: implement
|
|
||||||
return Response(status=status.HTTP_405_METHOD_NOT_ALLOWED)
|
|
||||||
|
|
||||||
def partial_update(self, request, uid=None, *args, **kwargs):
|
|
||||||
return Response(status=status.HTTP_405_METHOD_NOT_ALLOWED)
|
|
||||||
|
|
||||||
def update(self, request, *args, **kwargs):
|
|
||||||
return Response(status=status.HTTP_405_METHOD_NOT_ALLOWED)
|
|
||||||
|
|
||||||
def create(self, request, *args, **kwargs):
|
|
||||||
serializer = self.get_serializer(data=request.data)
|
|
||||||
serializer.is_valid(raise_exception=True)
|
|
||||||
serializer.save(owner=self.request.user)
|
|
||||||
|
|
||||||
return Response({}, status=status.HTTP_201_CREATED)
|
|
||||||
|
|
||||||
def list(self, request, *args, **kwargs):
|
|
||||||
queryset = self.get_queryset()
|
|
||||||
return self.list_common(request, queryset, *args, **kwargs)
|
|
||||||
|
|
||||||
@action_decorator(detail=False, methods=["POST"])
|
|
||||||
def list_multi(self, request, *args, **kwargs):
|
|
||||||
serializer = CollectionListMultiSerializer(data=request.data)
|
|
||||||
serializer.is_valid(raise_exception=True)
|
|
||||||
|
|
||||||
collection_types = serializer.validated_data["collectionTypes"]
|
|
||||||
|
|
||||||
queryset = self.get_queryset()
|
|
||||||
# FIXME: Remove the isnull part once we attach collection types to all objects ("collection-type-migration")
|
|
||||||
queryset = queryset.filter(
|
|
||||||
Q(members__collectionType__uid__in=collection_types) | Q(members__collectionType__isnull=True)
|
|
||||||
)
|
|
||||||
|
|
||||||
return self.list_common(request, queryset, *args, **kwargs)
|
|
||||||
|
|
||||||
def list_common(self, request, queryset, *args, **kwargs):
|
|
||||||
result, new_stoken_obj, done = self.filter_by_stoken_and_limit(request, queryset)
|
|
||||||
new_stoken = new_stoken_obj and new_stoken_obj.uid
|
|
||||||
|
|
||||||
serializer = self.get_serializer(result, many=True)
|
|
||||||
|
|
||||||
ret = {
|
|
||||||
"data": serializer.data,
|
|
||||||
"stoken": new_stoken,
|
|
||||||
"done": done,
|
|
||||||
}
|
|
||||||
|
|
||||||
stoken_obj = self.get_stoken_obj(request)
|
|
||||||
if stoken_obj is not None:
|
|
||||||
# FIXME: honour limit? (the limit should be combined for data and this because of stoken)
|
|
||||||
remed_qs = CollectionMemberRemoved.objects.filter(user=request.user, stoken__id__gt=stoken_obj.id)
|
|
||||||
if not ret["done"]:
|
|
||||||
# We only filter by the new_stoken if we are not done. This is because if we are done, the new stoken
|
|
||||||
# can point to the most recent collection change rather than most recent removed membership.
|
|
||||||
remed_qs = remed_qs.filter(stoken__id__lte=new_stoken_obj.id)
|
|
||||||
|
|
||||||
remed = remed_qs.values_list("collection__main_item__uid", flat=True)
|
|
||||||
if len(remed) > 0:
|
|
||||||
ret["removedMemberships"] = [{"uid": x} for x in remed]
|
|
||||||
|
|
||||||
return Response(ret)
|
|
||||||
|
|
||||||
|
|
||||||
class CollectionItemViewSet(BaseViewSet):
|
|
||||||
allowed_methods = ["GET", "POST", "PUT"]
|
|
||||||
permission_classes = BaseViewSet.permission_classes + (permissions.HasWriteAccessOrReadOnly,)
|
|
||||||
queryset = CollectionItem.objects.all()
|
|
||||||
serializer_class = CollectionItemSerializer
|
|
||||||
lookup_field = "uid"
|
|
||||||
stoken_id_fields = ["revisions__stoken__id"]
|
|
||||||
|
|
||||||
def get_queryset(self):
|
|
||||||
collection_uid = self.kwargs["collection_uid"]
|
|
||||||
try:
|
|
||||||
collection = self.get_collection_queryset(Collection.objects).get(main_item__uid=collection_uid)
|
|
||||||
except Collection.DoesNotExist:
|
|
||||||
raise Http404("Collection does not exist")
|
|
||||||
# XXX Potentially add this for performance: .prefetch_related('revisions__chunks')
|
|
||||||
queryset = type(self).queryset.filter(collection__pk=collection.pk, revisions__current=True)
|
|
||||||
|
|
||||||
return queryset
|
|
||||||
|
|
||||||
def get_serializer_context(self):
|
|
||||||
context = super().get_serializer_context()
|
|
||||||
prefetch = self.request.query_params.get("prefetch", "auto")
|
|
||||||
context.update({"request": self.request, "prefetch": prefetch})
|
|
||||||
return context
|
|
||||||
|
|
||||||
def create(self, request, collection_uid=None, *args, **kwargs):
|
|
||||||
# We create using batch and transaction
|
|
||||||
return Response(status=status.HTTP_405_METHOD_NOT_ALLOWED)
|
|
||||||
|
|
||||||
def destroy(self, request, collection_uid=None, uid=None, *args, **kwargs):
|
|
||||||
# We can't have destroy because we need to get data from the user (in the body) such as hmac.
|
|
||||||
return Response(status=status.HTTP_405_METHOD_NOT_ALLOWED)
|
|
||||||
|
|
||||||
def update(self, request, collection_uid=None, uid=None, *args, **kwargs):
|
|
||||||
return Response(status=status.HTTP_405_METHOD_NOT_ALLOWED)
|
|
||||||
|
|
||||||
def partial_update(self, request, collection_uid=None, uid=None, *args, **kwargs):
|
|
||||||
return Response(status=status.HTTP_405_METHOD_NOT_ALLOWED)
|
|
||||||
|
|
||||||
def list(self, request, collection_uid=None, *args, **kwargs):
|
|
||||||
queryset = self.get_queryset()
|
|
||||||
|
|
||||||
if not self.request.query_params.get("withCollection", False):
|
|
||||||
queryset = queryset.filter(parent__isnull=True)
|
|
||||||
|
|
||||||
result, new_stoken_obj, done = self.filter_by_stoken_and_limit(request, queryset)
|
|
||||||
new_stoken = new_stoken_obj and new_stoken_obj.uid
|
|
||||||
|
|
||||||
serializer = self.get_serializer(result, many=True)
|
|
||||||
|
|
||||||
ret = {
|
|
||||||
"data": serializer.data,
|
|
||||||
"stoken": new_stoken,
|
|
||||||
"done": done,
|
|
||||||
}
|
|
||||||
return Response(ret)
|
|
||||||
|
|
||||||
@action_decorator(detail=True, methods=["GET"])
|
|
||||||
def revision(self, request, collection_uid=None, uid=None, *args, **kwargs):
|
|
||||||
col = get_object_or_404(self.get_collection_queryset(Collection.objects), main_item__uid=collection_uid)
|
|
||||||
item = get_object_or_404(col.items, uid=uid)
|
|
||||||
|
|
||||||
limit = int(request.GET.get("limit", 50))
|
|
||||||
iterator = request.GET.get("iterator", None)
|
|
||||||
|
|
||||||
queryset = item.revisions.order_by("-id")
|
|
||||||
|
|
||||||
if iterator is not None:
|
|
||||||
iterator = get_object_or_404(queryset, uid=iterator)
|
|
||||||
queryset = queryset.filter(id__lt=iterator.id)
|
|
||||||
|
|
||||||
result = list(queryset[: limit + 1])
|
|
||||||
if len(result) < limit + 1:
|
|
||||||
done = True
|
|
||||||
else:
|
|
||||||
done = False
|
|
||||||
result = result[:-1]
|
|
||||||
|
|
||||||
serializer = CollectionItemRevisionSerializer(result, context=self.get_serializer_context(), many=True)
|
|
||||||
|
|
||||||
iterator = serializer.data[-1]["uid"] if len(result) > 0 else None
|
|
||||||
|
|
||||||
ret = {
|
|
||||||
"data": serializer.data,
|
|
||||||
"iterator": iterator,
|
|
||||||
"done": done,
|
|
||||||
}
|
|
||||||
return Response(ret)
|
|
||||||
|
|
||||||
# FIXME: rename to something consistent with what the clients have - maybe list_updates?
|
|
||||||
@action_decorator(detail=False, methods=["POST"])
|
|
||||||
def fetch_updates(self, request, collection_uid=None, *args, **kwargs):
|
|
||||||
queryset = self.get_queryset()
|
|
||||||
|
|
||||||
serializer = CollectionItemBulkGetSerializer(data=request.data, many=True)
|
|
||||||
serializer.is_valid(raise_exception=True)
|
|
||||||
# FIXME: make configurable?
|
|
||||||
item_limit = 200
|
|
||||||
|
|
||||||
if len(serializer.validated_data) > item_limit:
|
|
||||||
content = {"code": "too_many_items", "detail": "Request has too many items. Limit: {}".format(item_limit)}
|
|
||||||
return Response(content, status=status.HTTP_400_BAD_REQUEST)
|
|
||||||
|
|
||||||
queryset, stoken_rev = self.filter_by_stoken(request, queryset)
|
|
||||||
|
|
||||||
uids, etags = zip(*[(item["uid"], item.get("etag")) for item in serializer.validated_data])
|
|
||||||
revs = CollectionItemRevision.objects.filter(uid__in=etags, current=True)
|
|
||||||
queryset = queryset.filter(uid__in=uids).exclude(revisions__in=revs)
|
|
||||||
|
|
||||||
new_stoken_obj = self.get_queryset_stoken(queryset)
|
|
||||||
new_stoken = new_stoken_obj and new_stoken_obj.uid
|
|
||||||
stoken = stoken_rev and getattr(stoken_rev, "uid", None)
|
|
||||||
new_stoken = new_stoken or stoken
|
|
||||||
|
|
||||||
serializer = self.get_serializer(queryset, many=True)
|
|
||||||
|
|
||||||
ret = {
|
|
||||||
"data": serializer.data,
|
|
||||||
"stoken": new_stoken,
|
|
||||||
"done": True, # we always return all the items, so it's always done
|
|
||||||
}
|
|
||||||
return Response(ret)
|
|
||||||
|
|
||||||
@action_decorator(detail=False, methods=["POST"])
|
|
||||||
def batch(self, request, collection_uid=None, *args, **kwargs):
|
|
||||||
return self.transaction(request, collection_uid, validate_etag=False)
|
|
||||||
|
|
||||||
@action_decorator(detail=False, methods=["POST"])
|
|
||||||
def transaction(self, request, collection_uid=None, validate_etag=True, *args, **kwargs):
|
|
||||||
stoken = request.GET.get("stoken", None)
|
|
||||||
with transaction.atomic(): # We need this for locking on the collection object
|
|
||||||
collection_object = get_object_or_404(
|
|
||||||
self.get_collection_queryset(Collection.objects).select_for_update(), # Lock writes on the collection
|
|
||||||
main_item__uid=collection_uid,
|
|
||||||
)
|
|
||||||
|
|
||||||
if stoken is not None and stoken != collection_object.stoken:
|
|
||||||
content = {"code": "stale_stoken", "detail": "Stoken is too old"}
|
|
||||||
return Response(content, status=status.HTTP_409_CONFLICT)
|
|
||||||
|
|
||||||
items = request.data.get("items")
|
|
||||||
deps = request.data.get("deps", None)
|
|
||||||
# FIXME: It should just be one serializer
|
|
||||||
context = self.get_serializer_context()
|
|
||||||
context.update({"validate_etag": validate_etag})
|
|
||||||
serializer = self.get_serializer_class()(data=items, context=context, many=True)
|
|
||||||
deps_serializer = CollectionItemDepSerializer(data=deps, context=context, many=True)
|
|
||||||
|
|
||||||
ser_valid = serializer.is_valid()
|
|
||||||
deps_ser_valid = deps is None or deps_serializer.is_valid()
|
|
||||||
if ser_valid and deps_ser_valid:
|
|
||||||
items = serializer.save(collection=collection_object)
|
|
||||||
|
|
||||||
ret = {}
|
|
||||||
return Response(ret, status=status.HTTP_200_OK)
|
|
||||||
|
|
||||||
return Response(
|
|
||||||
{"items": serializer.errors, "deps": deps_serializer.errors if deps is not None else [],},
|
|
||||||
status=status.HTTP_409_CONFLICT,
|
|
||||||
)
|
|
||||||
|
|
||||||
|
|
||||||

class CollectionItemChunkViewSet(viewsets.ViewSet):
    allowed_methods = ["GET", "PUT"]
    authentication_classes = BaseViewSet.authentication_classes
    permission_classes = BaseViewSet.permission_classes
    renderer_classes = BaseViewSet.renderer_classes
    parser_classes = (ChunkUploadParser,)
    serializer_class = CollectionItemChunkSerializer
    lookup_field = "uid"

    def get_serializer_class(self):
        return self.serializer_class

    def get_collection_queryset(self, queryset=Collection.objects):
        user = self.request.user
        return queryset.filter(members__user=user)

    def update(self, request, *args, collection_uid=None, collection_item_uid=None, uid=None, **kwargs):
        col = get_object_or_404(self.get_collection_queryset(), main_item__uid=collection_uid)
        # IGNORED FOR NOW: col_it = get_object_or_404(col.items, uid=collection_item_uid)

        data = {
            "uid": uid,
            "chunkFile": request.data["file"],
        }

        serializer = self.get_serializer_class()(data=data)
        serializer.is_valid(raise_exception=True)
        try:
            serializer.save(collection=col)
        except IntegrityError:
            return Response(
                {"code": "chunk_exists", "detail": "Chunk already exists."}, status=status.HTTP_409_CONFLICT
            )

        return Response({}, status=status.HTTP_201_CREATED)

    @action_decorator(detail=True, methods=["GET"])
    def download(self, request, collection_uid=None, collection_item_uid=None, uid=None, *args, **kwargs):
        import os

        from django.views.static import serve

        col = get_object_or_404(self.get_collection_queryset(), main_item__uid=collection_uid)
        # IGNORED FOR NOW: col_it = get_object_or_404(col.items, uid=collection_item_uid)
        chunk = get_object_or_404(col.chunks, uid=uid)

        filename = chunk.chunkFile.path
        dirname = os.path.dirname(filename)
        basename = os.path.basename(filename)

        # FIXME: DO NOT USE! Use django-sendfile or similar instead.
        return serve(request, basename, dirname)

class CollectionMemberViewSet(BaseViewSet):
    allowed_methods = ["GET", "PUT", "DELETE"]
    our_base_permission_classes = BaseViewSet.permission_classes
    permission_classes = our_base_permission_classes + (permissions.IsCollectionAdmin,)
    queryset = CollectionMember.objects.all()
    serializer_class = CollectionMemberSerializer
    lookup_field = f"user__{User.USERNAME_FIELD}__iexact"
    lookup_url_kwarg = "username"
    stoken_id_fields = ["stoken__id"]

    # FIXME: need to make sure that there's always an admin, and maybe also don't let an owner remove admin access
    # (if we want to transfer ownership, we need to do that explicitly)

    def get_queryset(self, queryset=None):
        collection_uid = self.kwargs["collection_uid"]
        try:
            collection = self.get_collection_queryset(Collection.objects).get(main_item__uid=collection_uid)
        except Collection.DoesNotExist:
            raise Http404("Collection does not exist")

        if queryset is None:
            queryset = type(self).queryset

        return queryset.filter(collection=collection)

    # We override this method because we expect the stoken to be called "iterator"
    def get_stoken_obj_id(self, request):
        return request.GET.get("iterator", None)

    def list(self, request, collection_uid=None, *args, **kwargs):
        queryset = self.get_queryset().order_by("id")
        result, new_stoken_obj, done = self.filter_by_stoken_and_limit(request, queryset)
        new_stoken = new_stoken_obj and new_stoken_obj.uid
        serializer = self.get_serializer(result, many=True)

        ret = {
            "data": serializer.data,
            "iterator": new_stoken,  # Here we call it an iterator; it's only called an stoken for collections/items
            "done": done,
        }

        return Response(ret)

    def create(self, request, *args, **kwargs):
        return Response(status=status.HTTP_405_METHOD_NOT_ALLOWED)

    # FIXME: block leaving if we are the last admin - the membership should be deleted / reassigned in this case,
    # depending on whether there are other members.
    def perform_destroy(self, instance):
        instance.revoke()

    @action_decorator(detail=False, methods=["POST"], permission_classes=our_base_permission_classes)
    def leave(self, request, collection_uid=None, *args, **kwargs):
        collection_uid = self.kwargs["collection_uid"]
        col = get_object_or_404(self.get_collection_queryset(Collection.objects), main_item__uid=collection_uid)

        member = col.members.get(user=request.user)
        self.perform_destroy(member)

        return Response({})

class InvitationBaseViewSet(BaseViewSet):
    queryset = CollectionInvitation.objects.all()
    serializer_class = CollectionInvitationSerializer
    lookup_field = "uid"
    lookup_url_kwarg = "invitation_uid"

    def list(self, request, collection_uid=None, *args, **kwargs):
        limit = int(request.GET.get("limit", 50))
        iterator = request.GET.get("iterator", None)

        queryset = self.get_queryset().order_by("id")

        if iterator is not None:
            iterator = get_object_or_404(queryset, uid=iterator)
            queryset = queryset.filter(id__gt=iterator.id)

        result = list(queryset[: limit + 1])
        if len(result) < limit + 1:
            done = True
        else:
            done = False
            result = result[:-1]

        serializer = self.get_serializer(result, many=True)

        iterator = serializer.data[-1]["uid"] if len(result) > 0 else None

        ret = {
            "data": serializer.data,
            "iterator": iterator,
            "done": done,
        }

        return Response(ret)
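The `list` implementation above pages with a fetch-one-extra trick: it queries `limit + 1` rows and uses the presence of the extra row to decide `done`, returning the last UID as the next iterator. A standalone sketch of that logic, with plain lists standing in for the queryset (all names here are illustrative, not part of the server code):

```python
def paginate(rows, limit, after_id=None):
    # Fetch-one-extra: ask for limit + 1 rows so the extra row tells us
    # whether the client needs another page.
    if after_id is not None:
        rows = [r for r in rows if r > after_id]
    result = rows[: limit + 1]
    if len(result) < limit + 1:
        done = True
    else:
        done = False
        result = result[:-1]  # drop the sentinel row
    iterator = result[-1] if result else None  # next page starts after this
    return result, iterator, done

print(paginate([1, 2, 3, 4, 5], limit=2))              # ([1, 2], 2, False)
print(paginate([1, 2, 3, 4, 5], limit=2, after_id=4))  # ([5], 5, True)
```

Note that `done` is only True when fewer than `limit + 1` rows come back, so an exactly-full final page costs one extra (empty) round trip.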

class InvitationOutgoingViewSet(InvitationBaseViewSet):
    allowed_methods = ["GET", "POST", "PUT", "DELETE"]

    def get_queryset(self, queryset=None):
        if queryset is None:
            queryset = type(self).queryset

        return queryset.filter(fromMember__user=self.request.user)

    def create(self, request, *args, **kwargs):
        serializer = self.get_serializer(data=request.data)
        serializer.is_valid(raise_exception=True)
        collection_uid = serializer.validated_data.get("collection", {}).get("uid")

        try:
            collection = self.get_collection_queryset(Collection.objects).get(main_item__uid=collection_uid)
        except Collection.DoesNotExist:
            raise Http404("Collection does not exist")

        if request.user == serializer.validated_data.get("user"):
            content = {"code": "self_invite", "detail": "Inviting yourself is invalid"}
            return Response(content, status=status.HTTP_400_BAD_REQUEST)

        if not permissions.is_collection_admin(collection, request.user):
            raise PermissionDenied(
                {"code": "admin_access_required", "detail": "User is not an admin of this collection"}
            )

        serializer.save(collection=collection)

        return Response({}, status=status.HTTP_201_CREATED)

    @action_decorator(detail=False, allowed_methods=["GET"], methods=["GET"])
    def fetch_user_profile(self, request, *args, **kwargs):
        username = request.GET.get("username")
        kwargs = {User.USERNAME_FIELD: username.lower()}
        user = get_object_or_404(get_user_queryset(User.objects.all(), self), **kwargs)
        user_info = get_object_or_404(UserInfo.objects.all(), owner=user)
        serializer = UserInfoPubkeySerializer(user_info)
        return Response(serializer.data)

class InvitationIncomingViewSet(InvitationBaseViewSet):
    allowed_methods = ["GET", "DELETE"]

    def get_queryset(self, queryset=None):
        if queryset is None:
            queryset = type(self).queryset

        return queryset.filter(user=self.request.user)

    @action_decorator(detail=True, allowed_methods=["POST"], methods=["POST"])
    def accept(self, request, invitation_uid=None, *args, **kwargs):
        invitation = get_object_or_404(self.get_queryset(), uid=invitation_uid)
        context = self.get_serializer_context()
        context.update({"invitation": invitation})

        serializer = InvitationAcceptSerializer(data=request.data, context=context)
        serializer.is_valid(raise_exception=True)
        serializer.save()
        return Response(status=status.HTTP_201_CREATED)

class AuthenticationViewSet(viewsets.ViewSet):
    allowed_methods = ["POST"]
    authentication_classes = BaseViewSet.authentication_classes
    renderer_classes = BaseViewSet.renderer_classes
    parser_classes = BaseViewSet.parser_classes

    def get_encryption_key(self, salt):
        key = nacl.hash.blake2b(settings.SECRET_KEY.encode(), encoder=nacl.encoding.RawEncoder)
        return nacl.hash.blake2b(
            b"",
            key=key,
            salt=salt[: nacl.hash.BLAKE2B_SALTBYTES],
            person=b"etebase-auth",
            encoder=nacl.encoding.RawEncoder,
        )

    def get_queryset(self):
        return get_user_queryset(User.objects.all(), self)

    def get_serializer_context(self):
        return {"request": self.request, "format": self.format_kwarg, "view": self}

    def login_response_data(self, user):
        return {
            "token": AuthToken.objects.create(user=user).key,
            "user": UserSerializer(user).data,
        }

    def list(self, request, *args, **kwargs):
        return Response(status=status.HTTP_405_METHOD_NOT_ALLOWED)

    @action_decorator(detail=False, methods=["POST"])
    def signup(self, request, *args, **kwargs):
        serializer = AuthenticationSignupSerializer(data=request.data, context=self.get_serializer_context())
        serializer.is_valid(raise_exception=True)
        user = serializer.save()

        user_signed_up.send(sender=user.__class__, request=request, user=user)

        data = self.login_response_data(user)
        return Response(data, status=status.HTTP_201_CREATED)

    def get_login_user(self, username):
        kwargs = {User.USERNAME_FIELD + "__iexact": username.lower()}
        try:
            user = self.get_queryset().get(**kwargs)
            if not hasattr(user, "userinfo"):
                raise AuthenticationFailed({"code": "user_not_init", "detail": "User not properly init"})
            return user
        except User.DoesNotExist:
            raise AuthenticationFailed({"code": "user_not_found", "detail": "User not found"})

    def validate_login_request(self, request, validated_data, response_raw, signature, expected_action):
        from datetime import datetime

        username = validated_data.get("username")
        user = self.get_login_user(username)
        host = validated_data["host"]
        challenge = validated_data["challenge"]
        action = validated_data["action"]

        salt = bytes(user.userinfo.salt)
        enc_key = self.get_encryption_key(salt)
        box = nacl.secret.SecretBox(enc_key)

        challenge_data = msgpack_decode(box.decrypt(challenge))
        now = int(datetime.now().timestamp())
        if action != expected_action:
            content = {"code": "wrong_action", "detail": 'Expected "{}" but got something else'.format(expected_action)}
            return Response(content, status=status.HTTP_400_BAD_REQUEST)
        elif now - challenge_data["timestamp"] > app_settings.CHALLENGE_VALID_SECONDS:
            content = {"code": "challenge_expired", "detail": "Login challenge has expired"}
            return Response(content, status=status.HTTP_400_BAD_REQUEST)
        elif challenge_data["userId"] != user.id:
            content = {"code": "wrong_user", "detail": "This challenge is for the wrong user"}
            return Response(content, status=status.HTTP_400_BAD_REQUEST)
        elif not settings.DEBUG and host.split(":", 1)[0] != request.get_host():
            detail = 'Found wrong host name. Got: "{}" expected: "{}"'.format(host, request.get_host())
            content = {"code": "wrong_host", "detail": detail}
            return Response(content, status=status.HTTP_400_BAD_REQUEST)

        verify_key = nacl.signing.VerifyKey(bytes(user.userinfo.loginPubkey), encoder=nacl.encoding.RawEncoder)

        try:
            verify_key.verify(response_raw, signature)
        except nacl.exceptions.BadSignatureError:
            return Response(
                {"code": "login_bad_signature", "detail": "Wrong password for user."},
                status=status.HTTP_401_UNAUTHORIZED,
            )

        return None

    @action_decorator(detail=False, methods=["GET"])
    def is_etebase(self, request, *args, **kwargs):
        return Response({}, status=status.HTTP_200_OK)

    @action_decorator(detail=False, methods=["POST"])
    def login_challenge(self, request, *args, **kwargs):
        from datetime import datetime

        serializer = AuthenticationLoginChallengeSerializer(data=request.data)
        serializer.is_valid(raise_exception=True)
        username = serializer.validated_data.get("username")
        user = self.get_login_user(username)

        salt = bytes(user.userinfo.salt)
        enc_key = self.get_encryption_key(salt)
        box = nacl.secret.SecretBox(enc_key)

        challenge_data = {
            "timestamp": int(datetime.now().timestamp()),
            "userId": user.id,
        }
        challenge = box.encrypt(msgpack_encode(challenge_data), encoder=nacl.encoding.RawEncoder)

        ret = {
            "salt": salt,
            "challenge": challenge,
            "version": user.userinfo.version,
        }
        return Response(ret, status=status.HTTP_200_OK)

    @action_decorator(detail=False, methods=["POST"])
    def login(self, request, *args, **kwargs):
        outer_serializer = AuthenticationLoginSerializer(data=request.data)
        outer_serializer.is_valid(raise_exception=True)

        response_raw = outer_serializer.validated_data["response"]
        response = msgpack_decode(response_raw)
        signature = outer_serializer.validated_data["signature"]

        context = {"host": request.get_host()}
        serializer = AuthenticationLoginInnerSerializer(data=response, context=context)
        serializer.is_valid(raise_exception=True)

        bad_login_response = self.validate_login_request(
            request, serializer.validated_data, response_raw, signature, "login"
        )
        if bad_login_response is not None:
            return bad_login_response

        username = serializer.validated_data.get("username")
        user = self.get_login_user(username)

        data = self.login_response_data(user)

        user_logged_in.send(sender=user.__class__, request=request, user=user)

        return Response(data, status=status.HTTP_200_OK)

    @action_decorator(detail=False, methods=["POST"], permission_classes=[IsAuthenticated])
    def logout(self, request, *args, **kwargs):
        request.auth.delete()
        user_logged_out.send(sender=request.user.__class__, request=request, user=request.user)
        return Response(status=status.HTTP_204_NO_CONTENT)

    @action_decorator(detail=False, methods=["POST"], permission_classes=BaseViewSet.permission_classes)
    def change_password(self, request, *args, **kwargs):
        outer_serializer = AuthenticationLoginSerializer(data=request.data)
        outer_serializer.is_valid(raise_exception=True)

        response_raw = outer_serializer.validated_data["response"]
        response = msgpack_decode(response_raw)
        signature = outer_serializer.validated_data["signature"]

        context = {"host": request.get_host()}
        serializer = AuthenticationChangePasswordInnerSerializer(request.user.userinfo, data=response, context=context)
        serializer.is_valid(raise_exception=True)

        bad_login_response = self.validate_login_request(
            request, serializer.validated_data, response_raw, signature, "changePassword"
        )
        if bad_login_response is not None:
            return bad_login_response

        serializer.save()

        return Response({}, status=status.HTTP_200_OK)

    @action_decorator(detail=False, methods=["POST"], permission_classes=[IsAuthenticated])
    def dashboard_url(self, request, *args, **kwargs):
        get_dashboard_url = app_settings.DASHBOARD_URL_FUNC
        if get_dashboard_url is None:
            raise EtebaseValidationError(
                "not_supported", "This server doesn't have a user dashboard.", status_code=status.HTTP_400_BAD_REQUEST
            )

        ret = {
            "url": get_dashboard_url(request, *args, **kwargs),
        }
        return Response(ret)
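`get_encryption_key` above derives a deterministic per-user key: it hashes `SECRET_KEY`, then uses that as the BLAKE2b key over an empty message, salted with the user's stored salt and personalized with `b"etebase-auth"`. Because the derivation is deterministic, the server can later decrypt a returned challenge without keeping any per-challenge state. A stdlib sketch of the same derivation, using `hashlib.blake2b` in place of `nacl.hash.blake2b` (the secret and salt values below are made up):

```python
import hashlib

SECRET_KEY = "example-django-secret"  # stand-in for settings.SECRET_KEY
salt = bytes(range(32))               # stand-in for user.userinfo.salt

# Hash the secret down to a 32-byte BLAKE2b key.
key = hashlib.blake2b(SECRET_KEY.encode(), digest_size=32).digest()

# Keyed, salted, personalized hash of the empty message.
enc_key = hashlib.blake2b(
    b"",
    digest_size=32,
    key=key,
    salt=salt[:16],          # BLAKE2b accepts at most 16 salt bytes
    person=b"etebase-auth",
).digest()

# The derivation is deterministic, so the server needs no per-challenge state:
assert enc_key == hashlib.blake2b(
    b"", digest_size=32, key=key, salt=salt[:16], person=b"etebase-auth"
).digest()
print(len(enc_key))  # 32 bytes, a valid SecretBox key size
```

The personalization string keeps this key separate from any other key the server might derive from the same secret and salt.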

class TestAuthenticationViewSet(viewsets.ViewSet):
    allowed_methods = ["POST"]
    renderer_classes = BaseViewSet.renderer_classes
    parser_classes = BaseViewSet.parser_classes

    def get_serializer_context(self):
        return {"request": self.request, "format": self.format_kwarg, "view": self}

    def list(self, request, *args, **kwargs):
        return Response(status=status.HTTP_405_METHOD_NOT_ALLOWED)

    @action_decorator(detail=False, methods=["POST"])
    def reset(self, request, *args, **kwargs):
        # Only run when in DEBUG mode! It's only used for tests
        if not settings.DEBUG:
            return HttpResponseBadRequest("Only allowed in debug mode.")

        with transaction.atomic():
            user_queryset = get_user_queryset(User.objects.all(), self)
            user = get_object_or_404(user_queryset, username=request.data.get("user").get("username"))

            # Only allow test users for extra safety
            if not getattr(user, User.USERNAME_FIELD).startswith("test_user"):
                return HttpResponseBadRequest("Endpoint not allowed for user.")

            if hasattr(user, "userinfo"):
                user.userinfo.delete()

            serializer = AuthenticationSignupSerializer(data=request.data, context=self.get_serializer_context())
            serializer.is_valid(raise_exception=True)
            serializer.save()

            # Delete all of the journal data for this user for a clean test env
            user.collection_set.all().delete()
            user.collectionmember_set.all().delete()
            user.incoming_invitations.all().delete()

            # FIXME: also delete chunk files!!!

        return HttpResponse()
@@ -8,6 +8,7 @@ debug = false
;media_url = /user-media/
;language_code = en-us
;time_zone = UTC
;redis_uri = redis://localhost:6379

[allowed_hosts]
allowed_host1 = example.com
27 etebase_fastapi/db_hack.py Normal file
@@ -0,0 +1,27 @@
"""
FIXME: this whole module is a hack around Django db limitations due to how db connections are cached and cleaned.
Essentially Django assumes there's the Django request dispatcher to automatically clean up after the ORM.
"""
import typing as t
from functools import wraps

from django.db import close_old_connections, reset_queries


def django_db_cleanup():
    reset_queries()
    close_old_connections()


def django_db_cleanup_decorator(func: t.Callable[..., t.Any]):
    from inspect import iscoroutinefunction

    if iscoroutinefunction(func):
        return func

    @wraps(func)
    def wrapper(*args, **kwargs):
        django_db_cleanup()
        return func(*args, **kwargs)

    return wrapper
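The behavior of `django_db_cleanup_decorator` — coroutine functions are returned untouched, while sync callables get the cleanup call prepended — can be exercised without Django by swapping in a stand-in cleanup. Everything below is an illustrative sketch, not part of the server code:

```python
import typing as t
from functools import wraps
from inspect import iscoroutinefunction

calls = []

def fake_cleanup():
    # Stand-in for django_db_cleanup (reset_queries + close_old_connections).
    calls.append("cleanup")

def cleanup_decorator(func: t.Callable[..., t.Any]):
    if iscoroutinefunction(func):
        return func  # async paths must not run the sync cleanup

    @wraps(func)
    def wrapper(*args, **kwargs):
        fake_cleanup()
        return func(*args, **kwargs)

    return wrapper

@cleanup_decorator
def handler(x):
    return x * 2

@cleanup_decorator
async def async_handler(x):
    return x

print(handler(21))                         # 42, with one cleanup call first
print(calls)                               # ['cleanup']
print(iscoroutinefunction(async_handler))  # True: returned as-is
```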
88 etebase_fastapi/dependencies.py Normal file
@@ -0,0 +1,88 @@
import dataclasses

from fastapi import Depends
from fastapi.security import APIKeyHeader

from django.utils import timezone
from django.db.models import QuerySet

from django_etebase import models
from django_etebase.token_auth.models import AuthToken, get_default_expiry
from myauth.models import UserType, get_typed_user_model
from .exceptions import AuthenticationFailed
from .utils import get_object_or_404
from .db_hack import django_db_cleanup_decorator


User = get_typed_user_model()
token_scheme = APIKeyHeader(name="Authorization")
AUTO_REFRESH = True
MIN_REFRESH_INTERVAL = 60


@dataclasses.dataclass(frozen=True)
class AuthData:
    user: UserType
    token: AuthToken


def __renew_token(auth_token: AuthToken):
    current_expiry = auth_token.expiry
    new_expiry = get_default_expiry()
    # Throttle refreshing of token to avoid db writes
    delta = (new_expiry - current_expiry).total_seconds()
    if delta > MIN_REFRESH_INTERVAL:
        auth_token.expiry = new_expiry
        auth_token.save(update_fields=("expiry",))


def __get_authenticated_user(api_token: str):
    api_token = api_token.split()[1]
    try:
        token: AuthToken = AuthToken.objects.select_related("user").get(key=api_token)
    except AuthToken.DoesNotExist:
        raise AuthenticationFailed(detail="Invalid token.")
    if not token.user.is_active:
        raise AuthenticationFailed(detail="User inactive or deleted.")

    if token.expiry is not None:
        if token.expiry < timezone.now():
            token.delete()
            raise AuthenticationFailed(detail="Invalid token.")

        if AUTO_REFRESH:
            __renew_token(token)

    return token.user, token


@django_db_cleanup_decorator
def get_auth_data(api_token: str = Depends(token_scheme)) -> AuthData:
    user, token = __get_authenticated_user(api_token)
    return AuthData(user, token)


@django_db_cleanup_decorator
def get_authenticated_user(api_token: str = Depends(token_scheme)) -> UserType:
    user, _ = __get_authenticated_user(api_token)
    return user


@django_db_cleanup_decorator
def get_collection_queryset(user: UserType = Depends(get_authenticated_user)) -> QuerySet:
    default_queryset: QuerySet = models.Collection.objects.all()
    return default_queryset.filter(members__user=user)


@django_db_cleanup_decorator
def get_collection(collection_uid: str, queryset: QuerySet = Depends(get_collection_queryset)) -> models.Collection:
    return get_object_or_404(queryset, uid=collection_uid)


@django_db_cleanup_decorator
def get_item_queryset(collection: models.Collection = Depends(get_collection)) -> QuerySet:
    default_item_queryset: QuerySet = models.CollectionItem.objects.all()
    # XXX Potentially add this for performance: .prefetch_related('revisions__chunks')
    queryset = default_item_queryset.filter(collection__pk=collection.pk, revisions__current=True)

    return queryset
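The throttle in `__renew_token` only persists the new expiry when it would move by more than `MIN_REFRESH_INTERVAL`, so a busy client doesn't cause a db write on every request. A sketch of just that predicate with plain datetimes (the `should_refresh` name is made up for illustration):

```python
from datetime import datetime, timedelta

MIN_REFRESH_INTERVAL = 60  # seconds, as in the module

def should_refresh(current_expiry: datetime, new_expiry: datetime) -> bool:
    # Only worth a db write when the expiry moves past the throttle interval.
    return (new_expiry - current_expiry).total_seconds() > MIN_REFRESH_INTERVAL

now = datetime(2021, 1, 1, 12, 0, 0)
print(should_refresh(now, now + timedelta(seconds=30)))  # False: within throttle
print(should_refresh(now, now + timedelta(minutes=5)))   # True: worth a db write
```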
128 etebase_fastapi/exceptions.py Normal file
@@ -0,0 +1,128 @@
from fastapi import status, HTTPException
import typing as t

from pydantic import BaseModel
from django.core.exceptions import ValidationError as DjangoValidationError


class HttpErrorField(BaseModel):
    field: str
    code: str
    detail: str

    class Config:
        orm_mode = True


class HttpErrorOut(BaseModel):
    code: str
    detail: str
    errors: t.Optional[t.List[HttpErrorField]]

    class Config:
        orm_mode = True


class CustomHttpException(HTTPException):
    def __init__(self, code: str, detail: str, status_code: int = status.HTTP_400_BAD_REQUEST):
        self.code = code
        super().__init__(status_code, detail)

    @property
    def as_dict(self) -> dict:
        return {"code": self.code, "detail": self.detail}


class AuthenticationFailed(CustomHttpException):
    def __init__(
        self,
        code="authentication_failed",
        detail: str = "Incorrect authentication credentials.",
        status_code: int = status.HTTP_401_UNAUTHORIZED,
    ):
        super().__init__(code=code, detail=detail, status_code=status_code)


class NotAuthenticated(CustomHttpException):
    def __init__(
        self,
        code="not_authenticated",
        detail: str = "Authentication credentials were not provided.",
        status_code: int = status.HTTP_401_UNAUTHORIZED,
    ):
        super().__init__(code=code, detail=detail, status_code=status_code)


class PermissionDenied(CustomHttpException):
    def __init__(
        self,
        code="permission_denied",
        detail: str = "You do not have permission to perform this action.",
        status_code: int = status.HTTP_403_FORBIDDEN,
    ):
        super().__init__(code=code, detail=detail, status_code=status_code)


class NotSupported(CustomHttpException):
    def __init__(
        self,
        code="not_implemented",
        detail: str = "This server's configuration does not support this request.",
        status_code: int = status.HTTP_501_NOT_IMPLEMENTED,
    ):
        super().__init__(code=code, detail=detail, status_code=status_code)


class HttpError(CustomHttpException):
    def __init__(
        self,
        code: str,
        detail: str,
        status_code: int = status.HTTP_400_BAD_REQUEST,
        errors: t.Optional[t.List["HttpError"]] = None,
    ):
        self.errors = errors
        super().__init__(code=code or "generic_error", detail=detail, status_code=status_code)

    @property
    def as_dict(self) -> dict:
        return HttpErrorOut(code=self.code, errors=self.errors, detail=self.detail).dict()


class ValidationError(HttpError):
    def __init__(
        self,
        code: str,
        detail: str,
        status_code: int = status.HTTP_400_BAD_REQUEST,
        errors: t.Optional[t.List["HttpError"]] = None,
        field: t.Optional[str] = None,
    ):
        self.field = field
        super().__init__(code=code, detail=detail, errors=errors, status_code=status_code)


def flatten_errors(field_name: str, errors) -> t.List[HttpError]:
    ret: t.List[HttpError] = []
    if isinstance(errors, dict):
        for error_key in errors:
            error = errors[error_key]
            ret.extend(flatten_errors("{}.{}".format(field_name, error_key), error))
    else:
        for error in errors:
            if error.messages:
                message = error.messages[0]
            else:
                message = str(error)
            ret.append(ValidationError(code=error.code or "validation_error", detail=message, field=field_name))
    return ret


def transform_validation_error(prefix: str, err: DjangoValidationError):
    if hasattr(err, "error_dict"):
        errors = flatten_errors(prefix, err.error_dict)
    elif not hasattr(err, "message"):
        errors = flatten_errors(prefix, err.error_list)
    else:
        raise HttpError(err.code or "validation_error", err.message)
    raise HttpError(code="field_errors", detail="Field validations failed.", errors=errors)
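`flatten_errors` collapses Django's nested error dicts into a flat list keyed by dotted field names. A simplified stand-in using tuples instead of `ValidationError` instances, so it runs without Django, showing the same recursion:

```python
import typing as t

def flatten(field_name: str, errors) -> t.List[t.Tuple[str, str]]:
    # Dicts recurse with a dotted prefix; leaves are lists of messages.
    ret: t.List[t.Tuple[str, str]] = []
    if isinstance(errors, dict):
        for key, value in errors.items():
            ret.extend(flatten("{}.{}".format(field_name, key), value))
    else:
        for message in errors:
            ret.append((field_name, message))
    return ret

nested = {"collection": {"uid": ["invalid uid"]}, "name": ["too long"]}
print(flatten("item", nested))
# [('item.collection.uid', 'invalid uid'), ('item.name', 'too long')]
```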
77
etebase_fastapi/main.py
Normal file
77
etebase_fastapi/main.py
Normal file
@ -0,0 +1,77 @@
|
|||||||
|
from django.conf import settings
|
||||||
|
|
||||||
|
# Not at the top of the file because we first need to setup django
|
||||||
|
from fastapi import FastAPI, Request
|
||||||
|
from fastapi.middleware.cors import CORSMiddleware
|
||||||
|
from fastapi.middleware.trustedhost import TrustedHostMiddleware
|
||||||
|
|
||||||
|
from django_etebase import app_settings
|
||||||
|
|
||||||
|
from .exceptions import CustomHttpException
|
||||||
|
from .msgpack import MsgpackResponse
|
||||||
|
from .routers.authentication import authentication_router
|
||||||
|
from .routers.collection import collection_router, item_router
|
||||||
|
from .routers.member import member_router
|
||||||
|
from .routers.invitation import invitation_incoming_router, invitation_outgoing_router
|
||||||
|
from .routers.websocket import websocket_router


def create_application(prefix="", middlewares=[]):
    app = FastAPI(
        title="Etebase",
        description="The Etebase server API documentation",
        externalDocs={
            "url": "https://docs.etebase.com",
            "description": "Docs about the API specifications and clients.",
        }
        # FIXME: version="2.5.0",
    )
    VERSION = "v1"
    BASE_PATH = f"{prefix}/api/{VERSION}"
    COLLECTION_UID_MARKER = "{collection_uid}"
    app.include_router(authentication_router, prefix=f"{BASE_PATH}/authentication", tags=["authentication"])
    app.include_router(collection_router, prefix=f"{BASE_PATH}/collection", tags=["collection"])
    app.include_router(item_router, prefix=f"{BASE_PATH}/collection/{COLLECTION_UID_MARKER}", tags=["item"])
    app.include_router(member_router, prefix=f"{BASE_PATH}/collection/{COLLECTION_UID_MARKER}", tags=["member"])
    app.include_router(
        invitation_incoming_router, prefix=f"{BASE_PATH}/invitation/incoming", tags=["incoming invitation"]
    )
    app.include_router(
        invitation_outgoing_router, prefix=f"{BASE_PATH}/invitation/outgoing", tags=["outgoing invitation"]
    )
    app.include_router(websocket_router, prefix=f"{BASE_PATH}/ws", tags=["websocket"])

    if settings.DEBUG:
        from etebase_fastapi.routers.test_reset_view import test_reset_view_router

        app.include_router(test_reset_view_router, prefix=f"{BASE_PATH}/test/authentication")

    app.add_middleware(
        CORSMiddleware,
        allow_origin_regex="https?://.*",
        allow_credentials=True,
        allow_methods=["*"],
        allow_headers=["*"],
    )
    app.add_middleware(TrustedHostMiddleware, allowed_hosts=settings.ALLOWED_HOSTS)

    for middleware in middlewares:
        app.add_middleware(middleware)

    @app.on_event("startup")
    async def on_startup() -> None:
        from .redis import redisw

        await redisw.setup()

    @app.on_event("shutdown")
    async def on_shutdown():
        from .redis import redisw

        await redisw.close()

    @app.exception_handler(CustomHttpException)
    async def custom_exception_handler(request: Request, exc: CustomHttpException):
        return MsgpackResponse(status_code=exc.status_code, content=exc.as_dict)

    return app
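The factory above builds a fresh app per call, mounts the routers under a versioned prefix, and wraps the app in the caller-supplied middlewares. A minimal pure-Python sketch of that factory-plus-middleware idea, with all names illustrative rather than taken from the server:

```python
# Sketch of the app-factory pattern: each call returns a new handler,
# wrapped in whatever middlewares the caller passes in.
def create_app(prefix="", middlewares=()):
    def handler(path):
        # Stand-in for routing under the versioned base path.
        return f"handled {prefix}/api/v1{path}"

    app = handler
    for middleware in middlewares:
        app = middleware(app)  # each middleware wraps the previous app
    return app


def logging_middleware(inner):
    def wrapped(path):
        return "log:" + inner(path)

    return wrapped


app = create_app(prefix="/srv", middlewares=[logging_middleware])
print(app("/collection/"))  # -> log:handled /srv/api/v1/collection/
```

Because the app is constructed inside the function, tests can build isolated instances with different prefixes or middleware stacks, which is the main reason to prefer a factory over a module-level singleton.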

76 etebase_fastapi/msgpack.py Normal file
@@ -0,0 +1,76 @@
import typing as t

from fastapi.routing import APIRoute, get_request_handler
from pydantic import BaseModel
from starlette.requests import Request
from starlette.responses import Response

from .utils import msgpack_encode, msgpack_decode
from .db_hack import django_db_cleanup_decorator


class MsgpackRequest(Request):
    media_type = "application/msgpack"

    async def json(self) -> bytes:
        if not hasattr(self, "_json"):
            body = await super().body()
            self._json = msgpack_decode(body)
        return self._json


class MsgpackResponse(Response):
    media_type = "application/msgpack"

    def render(self, content: t.Optional[t.Any]) -> bytes:
        if content is None:
            return b""

        if isinstance(content, BaseModel):
            content = content.dict()
        return msgpack_encode(content)


class MsgpackRoute(APIRoute):
    # keep track of content-type -> request classes
    REQUESTS_CLASSES = {MsgpackRequest.media_type: MsgpackRequest}
    # keep track of content-type -> response classes
    ROUTES_HANDLERS_CLASSES = {MsgpackResponse.media_type: MsgpackResponse}

    def __init__(self, path: str, endpoint: t.Callable[..., t.Any], *args, **kwargs):
        endpoint = django_db_cleanup_decorator(endpoint)
        super().__init__(path, endpoint, *args, **kwargs)

    def _get_media_type_route_handler(self, media_type):
        return get_request_handler(
            dependant=self.dependant,
            body_field=self.body_field,
            status_code=self.status_code,
            # use custom response class or fallback on default self.response_class
            response_class=self.ROUTES_HANDLERS_CLASSES.get(media_type, self.response_class),
            response_field=self.secure_cloned_response_field,
            response_model_include=self.response_model_include,
            response_model_exclude=self.response_model_exclude,
            response_model_by_alias=self.response_model_by_alias,
            response_model_exclude_unset=self.response_model_exclude_unset,
            response_model_exclude_defaults=self.response_model_exclude_defaults,
            response_model_exclude_none=self.response_model_exclude_none,
            dependency_overrides_provider=self.dependency_overrides_provider,
        )

    def get_route_handler(self) -> t.Callable:
        async def custom_route_handler(request: Request) -> Response:

            content_type = request.headers.get("Content-Type")
            try:
                request_cls = self.REQUESTS_CLASSES[content_type]
                request = request_cls(request.scope, request.receive)
            except KeyError:
                # nothing registered to handle content_type, process given requests as-is
                pass

            accept = request.headers.get("Accept")
            route_handler = self._get_media_type_route_handler(accept)
            return await route_handler(request)

        return custom_route_handler
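The heart of `MsgpackRoute` is a media-type lookup: the `Content-Type` header picks the request class and the `Accept` header picks the response class, each falling back to a default when nothing is registered. A self-contained sketch of that dispatch (class names here are placeholder strings, not the real classes):

```python
# Mirrors MsgpackRoute's REQUESTS_CLASSES lookup: a dict keyed by media type,
# with a default used when the header names an unregistered type.
REQUEST_CLASSES = {"application/msgpack": "MsgpackRequest"}


def pick_request_class(content_type, default="Request"):
    # Equivalent to the try/except KeyError in custom_route_handler.
    return REQUEST_CLASSES.get(content_type, default)


print(pick_request_class("application/msgpack"))  # -> MsgpackRequest
print(pick_request_class("application/json"))     # -> Request
```

Keeping the mapping as a class attribute means a subclass can register additional media types by shadowing the dict, without touching the handler logic.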

27 etebase_fastapi/redis.py Normal file
@@ -0,0 +1,27 @@
import typing as t
import aioredis

from django_etebase import app_settings


class RedisWrapper:
    redis: aioredis.Redis

    def __init__(self, redis_uri: t.Optional[str]):
        self.redis_uri = redis_uri

    async def setup(self):
        if self.redis_uri is not None:
            self.redis = await aioredis.create_redis_pool(self.redis_uri)

    async def close(self):
        if self.redis is not None:
            self.redis.close()
            await self.redis.wait_closed()

    @property
    def is_active(self):
        return self.redis_uri is not None


redisw = RedisWrapper(app_settings.REDIS_URI)
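`RedisWrapper` makes Redis an optional backend: callers check `is_active` before publishing, so an unset `REDIS_URI` silently disables pub/sub rather than raising. A dependency-free sketch of that guard pattern (the class and URI below are illustrative, not from the server):

```python
# Optional-backend guard: the feature is active only when a URI was configured,
# and callers are expected to check is_active before using the connection.
class OptionalBackend:
    def __init__(self, uri):
        self.uri = uri
        self.connection = None

    @property
    def is_active(self):
        return self.uri is not None

    def setup(self):
        if self.is_active:
            self.connection = f"connected to {self.uri}"


backend = OptionalBackend(None)
backend.setup()
print(backend.is_active)  # -> False
```

Note that the real wrapper only annotates `redis` on the class, so calling `close()` without a prior successful `setup()` would hit an unset attribute; guarding on `is_active` avoids that path.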

263 etebase_fastapi/routers/authentication.py Normal file
@@ -0,0 +1,263 @@
import typing as t
from typing_extensions import Literal
from datetime import datetime

import nacl
import nacl.encoding
import nacl.hash
import nacl.secret
import nacl.signing
from django.conf import settings
from django.contrib.auth import user_logged_out, user_logged_in
from django.core import exceptions as django_exceptions
from django.db import transaction
from django.utils.functional import cached_property
from fastapi import APIRouter, Depends, status, Request

from django_etebase import app_settings, models
from django_etebase.token_auth.models import AuthToken
from django_etebase.models import UserInfo
from django_etebase.signals import user_signed_up
from django_etebase.utils import create_user, get_user_queryset, CallbackContext
from myauth.models import UserType, get_typed_user_model
from ..exceptions import AuthenticationFailed, transform_validation_error, HttpError
from ..msgpack import MsgpackRoute
from ..utils import BaseModel, permission_responses, msgpack_encode, msgpack_decode, get_user_username_email_kwargs
from ..dependencies import AuthData, get_auth_data, get_authenticated_user

User = get_typed_user_model()
authentication_router = APIRouter(route_class=MsgpackRoute)


class LoginChallengeIn(BaseModel):
    username: str


class LoginChallengeOut(BaseModel):
    salt: bytes
    challenge: bytes
    version: int


class LoginResponse(BaseModel):
    username: str
    challenge: bytes
    host: str
    action: Literal["login", "changePassword"]


class UserOut(BaseModel):
    username: str
    email: str
    pubkey: bytes
    encryptedContent: bytes

    @classmethod
    def from_orm(cls: t.Type["UserOut"], obj: UserType) -> "UserOut":
        return cls(
            username=obj.username,
            email=obj.email,
            pubkey=bytes(obj.userinfo.pubkey),
            encryptedContent=bytes(obj.userinfo.encryptedContent),
        )


class LoginOut(BaseModel):
    token: str
    user: UserOut

    @classmethod
    def from_orm(cls: t.Type["LoginOut"], obj: UserType) -> "LoginOut":
        token = AuthToken.objects.create(user=obj).key
        user = UserOut.from_orm(obj)
        return cls(token=token, user=user)


class Authentication(BaseModel):
    class Config:
        keep_untouched = (cached_property,)

    response: bytes
    signature: bytes


class Login(Authentication):
    @cached_property
    def response_data(self) -> LoginResponse:
        return LoginResponse(**msgpack_decode(self.response))


class ChangePasswordResponse(LoginResponse):
    loginPubkey: bytes
    encryptedContent: bytes


class ChangePassword(Authentication):
    @cached_property
    def response_data(self) -> ChangePasswordResponse:
        return ChangePasswordResponse(**msgpack_decode(self.response))


class UserSignup(BaseModel):
    username: str
    email: str


class SignupIn(BaseModel):
    user: UserSignup
    salt: bytes
    loginPubkey: bytes
    pubkey: bytes
    encryptedContent: bytes


def get_login_user(request: Request, challenge: LoginChallengeIn) -> UserType:
    username = challenge.username

    kwargs = get_user_username_email_kwargs(username)
    try:
        user_queryset = get_user_queryset(User.objects.all(), CallbackContext(request.path_params))
        user = user_queryset.get(**kwargs)
        if not hasattr(user, "userinfo"):
            raise AuthenticationFailed(code="user_not_init", detail="User not properly init")
        return user
    except User.DoesNotExist:
        raise AuthenticationFailed(code="user_not_found", detail="User not found")


def get_encryption_key(salt: bytes):
    key = nacl.hash.blake2b(settings.SECRET_KEY.encode(), encoder=nacl.encoding.RawEncoder)
    return nacl.hash.blake2b(
        b"",
        key=key,
        salt=salt[: nacl.hash.BLAKE2B_SALTBYTES],
        person=b"etebase-auth",
        encoder=nacl.encoding.RawEncoder,
    )
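`get_encryption_key` derives a per-user key from the server secret via keyed BLAKE2b with a salt and a personalization string. The same construction can be sketched with the standard library, since `hashlib.blake2b` also supports `key`, `salt` (max 16 bytes), and `person`; the 32-byte digest sizes here are an assumption for the sketch, not taken from the server code:

```python
import hashlib

# Stdlib sketch of the keyed-BLAKE2b derivation in get_encryption_key.
# Same server secret + same salt must always yield the same key.
def derive_key(secret: bytes, salt: bytes) -> bytes:
    # First hash the server secret down to a fixed-size MAC key.
    key = hashlib.blake2b(secret, digest_size=32).digest()
    # Then derive the challenge-encryption key, bound to the user's salt
    # and domain-separated by the personalization string.
    return hashlib.blake2b(
        b"", key=key, salt=salt[:16], person=b"etebase-auth", digest_size=32
    ).digest()


k1 = derive_key(b"server-secret", b"\x00" * 16)
k2 = derive_key(b"server-secret", b"\x00" * 16)
print(len(k1), k1 == k2)  # -> 32 True
```

The `person` parameter gives domain separation: a key derived for challenge encryption can never collide with one derived elsewhere from the same secret and salt.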


def save_changed_password(data: ChangePassword, user: UserType):
    response_data = data.response_data
    user_info: UserInfo = user.userinfo
    user_info.loginPubkey = response_data.loginPubkey
    user_info.encryptedContent = response_data.encryptedContent
    user_info.save()


def validate_login_request(
    validated_data: LoginResponse,
    challenge_sent_to_user: Authentication,
    user: UserType,
    expected_action: str,
    host_from_request: str,
):
    enc_key = get_encryption_key(bytes(user.userinfo.salt))
    box = nacl.secret.SecretBox(enc_key)
    challenge_data = msgpack_decode(box.decrypt(validated_data.challenge))
    now = int(datetime.now().timestamp())
    if validated_data.action != expected_action:
        raise HttpError("wrong_action", f'Expected "{expected_action}" but got something else')
    elif now - challenge_data["timestamp"] > app_settings.CHALLENGE_VALID_SECONDS:
        raise HttpError("challenge_expired", "Login challenge has expired")
    elif challenge_data["userId"] != user.id:
        raise HttpError("wrong_user", "This challenge is for the wrong user")
    elif not settings.DEBUG and validated_data.host.split(":", 1)[0] != host_from_request:
        raise HttpError(
            "wrong_host", f'Found wrong host name. Got: "{validated_data.host}" expected: "{host_from_request}"'
        )
    verify_key = nacl.signing.VerifyKey(bytes(user.userinfo.loginPubkey), encoder=nacl.encoding.RawEncoder)
    try:
        verify_key.verify(challenge_sent_to_user.response, challenge_sent_to_user.signature)
    except nacl.exceptions.BadSignatureError:
        raise HttpError("login_bad_signature", "Wrong password for user.", status.HTTP_401_UNAUTHORIZED)


@authentication_router.get("/is_etebase/")
async def is_etebase():
    pass


@authentication_router.post("/login_challenge/", response_model=LoginChallengeOut)
def login_challenge(user: UserType = Depends(get_login_user)):
    salt = bytes(user.userinfo.salt)
    enc_key = get_encryption_key(salt)
    box = nacl.secret.SecretBox(enc_key)
    challenge_data = {
        "timestamp": int(datetime.now().timestamp()),
        "userId": user.id,
    }
    challenge = bytes(box.encrypt(msgpack_encode(challenge_data), encoder=nacl.encoding.RawEncoder))
    return LoginChallengeOut(salt=salt, challenge=challenge, version=user.userinfo.version)


@authentication_router.post("/login/", response_model=LoginOut)
def login(data: Login, request: Request):
    user = get_login_user(request, LoginChallengeIn(username=data.response_data.username))
    host = request.headers.get("Host")
    validate_login_request(data.response_data, data, user, "login", host)
    ret = LoginOut.from_orm(user)
    user_logged_in.send(sender=user.__class__, request=None, user=user)
    return ret


@authentication_router.post("/logout/", status_code=status.HTTP_204_NO_CONTENT, responses=permission_responses)
def logout(auth_data: AuthData = Depends(get_auth_data)):
    auth_data.token.delete()
    user_logged_out.send(sender=auth_data.user.__class__, request=None, user=auth_data.user)


@authentication_router.post("/change_password/", status_code=status.HTTP_204_NO_CONTENT, responses=permission_responses)
def change_password(data: ChangePassword, request: Request, user: UserType = Depends(get_authenticated_user)):
    host = request.headers.get("Host")
    validate_login_request(data.response_data, data, user, "changePassword", host)
    save_changed_password(data, user)


@authentication_router.post("/dashboard_url/", responses=permission_responses)
def dashboard_url(request: Request, user: UserType = Depends(get_authenticated_user)):
    get_dashboard_url = app_settings.DASHBOARD_URL_FUNC
    if get_dashboard_url is None:
        raise HttpError("not_supported", "This server doesn't have a user dashboard.")

    ret = {
        "url": get_dashboard_url(CallbackContext(request.path_params, user=user)),
    }
    return ret


def signup_save(data: SignupIn, request: Request) -> UserType:
    user_data = data.user
    with transaction.atomic():
        try:
            user_queryset = get_user_queryset(User.objects.all(), CallbackContext(request.path_params))
            instance = user_queryset.get(**{User.USERNAME_FIELD: user_data.username.lower()})
        except User.DoesNotExist:
            # Create the user and save the casing the user chose as the first name
            try:
                instance = create_user(
                    CallbackContext(request.path_params),
                    **user_data.dict(),
                    password=None,
                    first_name=user_data.username,
                )
                instance.full_clean()
            except HttpError as e:
                raise e
            except django_exceptions.ValidationError as e:
                transform_validation_error("user", e)
            except Exception as e:
                raise HttpError("generic", str(e))

        if hasattr(instance, "userinfo"):
            raise HttpError("user_exists", "User already exists", status_code=status.HTTP_409_CONFLICT)

        models.UserInfo.objects.create(**data.dict(exclude={"user"}), owner=instance)
    return instance


@authentication_router.post("/signup/", response_model=LoginOut, status_code=status.HTTP_201_CREATED)
def signup(data: SignupIn, request: Request):
    user = signup_save(data, request)
    ret = LoginOut.from_orm(user)
    user_signed_up.send(sender=user.__class__, request=None, user=user)
    return ret
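`validate_login_request` rejects a signed challenge response for several independent reasons before ever checking the signature: wrong action, expired timestamp, wrong user, wrong host. The freshness and identity checks can be isolated into a tiny testable function; the 60-second window below is an assumed stand-in for `CHALLENGE_VALID_SECONDS`, and all names are illustrative:

```python
import time

# Assumed validity window; the real value comes from app_settings.
CHALLENGE_VALID_SECONDS = 60


def check_challenge(challenge, user_id, now=None):
    # Returns "ok" or the error code the server would raise for this input.
    now = int(now if now is not None else time.time())
    if now - challenge["timestamp"] > CHALLENGE_VALID_SECONDS:
        return "challenge_expired"
    if challenge["userId"] != user_id:
        return "wrong_user"
    return "ok"


challenge = {"timestamp": 1000, "userId": 7}
print(check_challenge(challenge, 7, now=1030))  # -> ok
print(check_challenge(challenge, 7, now=2000))  # -> challenge_expired
print(check_challenge(challenge, 8, now=1030))  # -> wrong_user
```

Because the challenge payload is encrypted with a key derived from the server secret and the user's salt, the server can trust these fields without storing any per-challenge state.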

631 etebase_fastapi/routers/collection.py Normal file
@@ -0,0 +1,631 @@
import typing as t

from asgiref.sync import sync_to_async
from django.core import exceptions as django_exceptions
from django.core.files.base import ContentFile
from django.db import transaction, IntegrityError
from django.db.models import Q, QuerySet
from fastapi import APIRouter, Depends, status, Request, BackgroundTasks

from django_etebase import models
from myauth.models import UserType
from .authentication import get_authenticated_user
from .websocket import get_ticket, TicketRequest, TicketOut
from ..exceptions import HttpError, transform_validation_error, PermissionDenied, ValidationError
from ..msgpack import MsgpackRoute
from ..stoken_handler import filter_by_stoken_and_limit, filter_by_stoken, get_stoken_obj, get_queryset_stoken
from ..utils import (
    get_object_or_404,
    Context,
    Prefetch,
    PrefetchQuery,
    is_collection_admin,
    msgpack_encode,
    BaseModel,
    permission_responses,
    PERMISSIONS_READ,
    PERMISSIONS_READWRITE,
)
from ..dependencies import get_collection_queryset, get_item_queryset, get_collection
from ..sendfile import sendfile
from ..redis import redisw
from ..db_hack import django_db_cleanup_decorator

collection_router = APIRouter(route_class=MsgpackRoute, responses=permission_responses)
item_router = APIRouter(route_class=MsgpackRoute, responses=permission_responses)
CollectionQuerySet = QuerySet[models.Collection]
CollectionItemQuerySet = QuerySet[models.CollectionItem]


class ListMulti(BaseModel):
    collectionTypes: t.List[bytes]


ChunkType = t.Tuple[str, t.Optional[bytes]]


class CollectionItemRevisionInOut(BaseModel):
    uid: str
    meta: bytes
    deleted: bool
    chunks: t.List[ChunkType]

    class Config:
        orm_mode = True

    @classmethod
    def from_orm_context(
        cls: t.Type["CollectionItemRevisionInOut"], obj: models.CollectionItemRevision, context: Context
    ) -> "CollectionItemRevisionInOut":
        chunks: t.List[ChunkType] = []
        for chunk_relation in obj.chunks_relation.all():
            chunk_obj = chunk_relation.chunk
            if context.prefetch == "auto":
                with open(chunk_obj.chunkFile.path, "rb") as f:
                    chunks.append((chunk_obj.uid, f.read()))
            else:
                chunks.append((chunk_obj.uid, None))
        return cls(uid=obj.uid, meta=bytes(obj.meta), deleted=obj.deleted, chunks=chunks)


class CollectionItemCommon(BaseModel):
    uid: str
    version: int
    encryptionKey: t.Optional[bytes]
    content: CollectionItemRevisionInOut


class CollectionItemOut(CollectionItemCommon):
    class Config:
        orm_mode = True

    @classmethod
    def from_orm_context(
        cls: t.Type["CollectionItemOut"], obj: models.CollectionItem, context: Context
    ) -> "CollectionItemOut":
        return cls(
            uid=obj.uid,
            version=obj.version,
            encryptionKey=obj.encryptionKey,
            content=CollectionItemRevisionInOut.from_orm_context(obj.content, context),
        )


class CollectionItemIn(CollectionItemCommon):
    etag: t.Optional[str]


class CollectionCommon(BaseModel):
    # FIXME: remove optional once we finish collection-type-migration
    collectionType: t.Optional[bytes]
    collectionKey: bytes


class CollectionOut(CollectionCommon):
    accessLevel: models.AccessLevels
    stoken: str
    item: CollectionItemOut

    @classmethod
    def from_orm_context(cls: t.Type["CollectionOut"], obj: models.Collection, context: Context) -> "CollectionOut":
        member: models.CollectionMember = obj.members.get(user=context.user)
        collection_type = member.collectionType
        assert obj.main_item is not None
        ret = cls(
            collectionType=collection_type and bytes(collection_type.uid),
            collectionKey=bytes(member.encryptionKey),
            accessLevel=member.accessLevel,
            stoken=obj.stoken,
            item=CollectionItemOut.from_orm_context(obj.main_item, context),
        )
        return ret


class CollectionIn(CollectionCommon):
    item: CollectionItemIn


class RemovedMembershipOut(BaseModel):
    uid: str


class CollectionListResponse(BaseModel):
    data: t.List[CollectionOut]
    stoken: t.Optional[str]
    done: bool

    removedMemberships: t.Optional[t.List[RemovedMembershipOut]]


class CollectionItemListResponse(BaseModel):
    data: t.List[CollectionItemOut]
    stoken: t.Optional[str]
    done: bool


class CollectionItemRevisionListResponse(BaseModel):
    data: t.List[CollectionItemRevisionInOut]
    iterator: t.Optional[str]
    done: bool


class CollectionItemBulkGetIn(BaseModel):
    uid: str
    etag: t.Optional[str]


class ItemDepIn(BaseModel):
    uid: str
    etag: str

    def validate_db(self):
        item = models.CollectionItem.objects.get(uid=self.uid)
        etag = self.etag
        if item.etag != etag:
            raise ValidationError(
                "wrong_etag",
                "Wrong etag. Expected {} got {}".format(item.etag, etag),
                status_code=status.HTTP_409_CONFLICT,
                field=self.uid,
            )


class ItemBatchIn(BaseModel):
    items: t.List[CollectionItemIn]
    deps: t.Optional[t.List[ItemDepIn]]

    def validate_db(self):
        if self.deps is not None:
            errors: t.List[HttpError] = []
            for dep in self.deps:
                try:
                    dep.validate_db()
                except ValidationError as e:
                    errors.append(e)
            if len(errors) > 0:
                raise ValidationError(
                    code="dep_failed",
                    detail="Dependencies failed to validate",
                    errors=errors,
                    status_code=status.HTTP_409_CONFLICT,
                )


async def report_items_changed(col_uid: str, stoken: str, items: t.List[CollectionItemIn]):
    if not redisw.is_active:
        return

    redis = redisw.redis
    content = msgpack_encode(CollectionItemListResponse(data=items, stoken=stoken, done=True).dict())
    await redis.publish(f"col.{col_uid}", content)


def collection_list_common(
    queryset: CollectionQuerySet,
    user: UserType,
    stoken: t.Optional[str],
    limit: int,
    prefetch: Prefetch,
) -> CollectionListResponse:
    result, new_stoken_obj, done = filter_by_stoken_and_limit(
        stoken, limit, queryset, models.Collection.stoken_annotation
    )
    new_stoken = new_stoken_obj and new_stoken_obj.uid
    context = Context(user, prefetch)
    data: t.List[CollectionOut] = [CollectionOut.from_orm_context(item, context) for item in result]

    ret = CollectionListResponse(data=data, stoken=new_stoken, done=done)

    stoken_obj = get_stoken_obj(stoken)
    if stoken_obj is not None:
        # FIXME: honour limit? (the limit should be combined for data and this because of stoken)
        remed_qs = models.CollectionMemberRemoved.objects.filter(user=user, stoken__id__gt=stoken_obj.id)
        if not done and new_stoken_obj is not None:
            # We only filter by the new_stoken if we are not done. This is because if we are done, the new stoken
            # can point to the most recent collection change rather than most recent removed membership.
            remed_qs = remed_qs.filter(stoken__id__lte=new_stoken_obj.id)

        remed = remed_qs.values_list("collection__uid", flat=True)
        if len(remed) > 0:
            ret.removedMemberships = [RemovedMembershipOut(uid=x) for x in remed]

    return ret
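`collection_list_common` pages through changes with an stoken cursor: the client passes the last token it saw, the server returns everything newer up to `limit` plus a new token and a `done` flag. A toy, list-backed version of that contract (the function name mirrors the real helper, but the implementation here is an illustrative sketch over in-memory dicts):

```python
# Toy stoken paging: entries carry monotonically increasing stoken ids.
# A page is everything strictly after the cursor, capped at `limit`.
def filter_by_stoken_and_limit(stoken, limit, entries):
    after = [e for e in entries if stoken is None or e["stoken"] > stoken]
    page = after[:limit]
    done = len(after) <= limit  # nothing left beyond this page
    new_stoken = page[-1]["stoken"] if page else stoken
    return page, new_stoken, done


entries = [{"uid": u, "stoken": i} for i, u in enumerate("abcde", start=1)]
page, cursor, done = filter_by_stoken_and_limit(None, 2, entries)
print([e["uid"] for e in page], cursor, done)  # -> ['a', 'b'] 2 False
page, cursor, done = filter_by_stoken_and_limit(cursor, 10, entries)
print([e["uid"] for e in page], cursor, done)  # -> ['c', 'd', 'e'] 5 True
```

The real helper additionally folds removed memberships into the same token stream, which is why the code above it caps the removed-membership query at `new_stoken` while the page is not yet done.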
|
||||||
|
|
||||||
|
|
||||||
|
# permissions
|
||||||
|
|
||||||
|
|
||||||
|
@django_db_cleanup_decorator
|
||||||
|
def verify_collection_admin(
|
||||||
|
collection: models.Collection = Depends(get_collection), user: UserType = Depends(get_authenticated_user)
|
||||||
|
):
|
||||||
|
if not is_collection_admin(collection, user):
|
||||||
|
raise PermissionDenied("admin_access_required", "Only collection admins can perform this operation.")
|
||||||
|
|
||||||
|
|
||||||
|
@django_db_cleanup_decorator
|
||||||
|
def has_write_access(
|
||||||
|
collection: models.Collection = Depends(get_collection), user: UserType = Depends(get_authenticated_user)
|
||||||
|
):
|
||||||
|
member = collection.members.get(user=user)
|
||||||
|
if member.accessLevel == models.AccessLevels.READ_ONLY:
|
||||||
|
raise PermissionDenied("no_write_access", "You need write access to write to this collection")
|
||||||
|
|
||||||
|
|
||||||
|
# paths
|
||||||
|
|
||||||
|
|
||||||
|
@collection_router.post(
|
||||||
|
"/list_multi/",
|
||||||
|
response_model=CollectionListResponse,
|
||||||
|
response_model_exclude_unset=True,
|
||||||
|
dependencies=PERMISSIONS_READ,
|
||||||
|
)
|
||||||
|
def list_multi(
|
||||||
|
data: ListMulti,
|
||||||
|
stoken: t.Optional[str] = None,
|
||||||
|
limit: int = 50,
|
||||||
|
queryset: CollectionQuerySet = Depends(get_collection_queryset),
|
||||||
|
user: UserType = Depends(get_authenticated_user),
|
||||||
|
prefetch: Prefetch = PrefetchQuery,
|
||||||
|
):
|
||||||
|
# FIXME: Remove the isnull part once we attach collection types to all objects ("collection-type-migration")
|
||||||
|
queryset = queryset.filter(
|
||||||
|
Q(members__collectionType__uid__in=data.collectionTypes) | Q(members__collectionType__isnull=True)
|
||||||
|
)
|
||||||
|
|
||||||
|
return collection_list_common(queryset, user, stoken, limit, prefetch)
|
||||||
|
|
||||||
|
|
||||||
|
@collection_router.get("/", response_model=CollectionListResponse, dependencies=PERMISSIONS_READ)
|
||||||
|
def collection_list(
|
||||||
|
stoken: t.Optional[str] = None,
|
||||||
|
limit: int = 50,
|
||||||
|
prefetch: Prefetch = PrefetchQuery,
|
||||||
|
user: UserType = Depends(get_authenticated_user),
|
||||||
|
queryset: CollectionQuerySet = Depends(get_collection_queryset),
|
||||||
|
):
|
||||||
|
return collection_list_common(queryset, user, stoken, limit, prefetch)
|
||||||
|
|
||||||
|
|
||||||
|
def process_revisions_for_item(item: models.CollectionItem, revision_data: CollectionItemRevisionInOut):
|
||||||
|
chunks_objs = []
|
||||||
|
|
||||||
|
revision = models.CollectionItemRevision(**revision_data.dict(exclude={"chunks"}), item=item)
|
||||||
|
revision.validate_unique() # Verify there aren't any validation issues
|
||||||
|
|
||||||
|
for chunk in revision_data.chunks:
|
||||||
|
uid = chunk[0]
|
||||||
|
chunk_obj = models.CollectionItemChunk.objects.filter(uid=uid).first()
|
||||||
|
content = chunk[1] if len(chunk) > 1 else None
|
||||||
|
# If the chunk already exists we assume it's fine. Otherwise, we upload it.
|
||||||
|
if chunk_obj is None:
|
||||||
|
if content is not None:
|
||||||
|
chunk_obj = models.CollectionItemChunk(uid=uid, collection=item.collection)
|
||||||
|
chunk_obj.chunkFile.save("IGNORED", ContentFile(content))
|
||||||
|
chunk_obj.save()
|
||||||
|
else:
|
||||||
|
raise ValidationError("chunk_no_content", "Tried to create a new chunk without content")
|
||||||
|
|
||||||
|
chunks_objs.append(chunk_obj)
|
||||||
|
|
||||||
|
stoken = models.Stoken.objects.create()
|
||||||
|
revision.stoken = stoken
|
||||||
|
revision.save()
|
||||||
|
|
||||||
|
for chunk2 in chunks_objs:
|
||||||
|
models.RevisionChunkRelation.objects.create(chunk=chunk2, revision=revision)
|
||||||
|
return revision
|
||||||
|
|
||||||
|
|
||||||
|
def _create(data: CollectionIn, user: UserType):
|
||||||
|
with transaction.atomic():
|
||||||
|
if data.item.etag is not None:
|
||||||
|
raise ValidationError("bad_etag", "etag is not null")
|
||||||
|
instance = models.Collection(uid=data.item.uid, owner=user)
|
||||||
|
try:
|
||||||
|
instance.validate_unique()
|
||||||
|
except django_exceptions.ValidationError:
|
||||||
|
raise ValidationError(
|
||||||
|
"unique_uid", "Collection with this uid already exists", status_code=status.HTTP_409_CONFLICT
|
||||||
|
)
|
||||||
|
instance.save()
|
||||||
|
|
||||||
|
main_item = models.CollectionItem.objects.create(
|
||||||
|
            uid=data.item.uid, version=data.item.version, collection=instance
        )

        instance.main_item = main_item
        instance.save()

        # TODO
        process_revisions_for_item(main_item, data.item.content)

        collection_type_obj, _ = models.CollectionType.objects.get_or_create(uid=data.collectionType, owner=user)

        models.CollectionMember(
            collection=instance,
            stoken=models.Stoken.objects.create(),
            user=user,
            accessLevel=models.AccessLevels.ADMIN,
            encryptionKey=data.collectionKey,
            collectionType=collection_type_obj,
        ).save()


@collection_router.post("/", status_code=status.HTTP_201_CREATED, dependencies=PERMISSIONS_READWRITE)
def create(data: CollectionIn, user: UserType = Depends(get_authenticated_user)):
    _create(data, user)


@collection_router.get("/{collection_uid}/", response_model=CollectionOut, dependencies=PERMISSIONS_READ)
def collection_get(
    obj: models.Collection = Depends(get_collection),
    user: UserType = Depends(get_authenticated_user),
    prefetch: Prefetch = PrefetchQuery,
):
    return CollectionOut.from_orm_context(obj, Context(user, prefetch))


def item_create(item_model: CollectionItemIn, collection: models.Collection, validate_etag: bool):
    """Create or update a single item; called by the batch and transaction endpoints."""
    etag = item_model.etag
    revision_data = item_model.content
    uid = item_model.uid

    Model = models.CollectionItem

    with transaction.atomic():
        instance, created = Model.objects.get_or_create(
            uid=uid, collection=collection, defaults=item_model.dict(exclude={"uid", "etag", "content"})
        )
        cur_etag = instance.etag if not created else None

        # If we are trying to update an up-to-date item, abort early and consider it a success
        if cur_etag == revision_data.uid:
            return instance

        if validate_etag and cur_etag != etag:
            raise ValidationError(
                "wrong_etag",
                "Wrong etag. Expected {} got {}".format(cur_etag, etag),
                status_code=status.HTTP_409_CONFLICT,
                field=uid,
            )

        if not created:
            # We don't have to use select_for_update here because the unique constraint on current guards against
            # the race condition. But it's a good idea because it'll lock and wait rather than fail.
            current_revision = instance.revisions.filter(current=True).select_for_update().first()
            assert current_revision is not None
            current_revision.current = None
            current_revision.save()

        try:
            process_revisions_for_item(instance, revision_data)
        except django_exceptions.ValidationError as e:
            transform_validation_error("content", e)

        return instance
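`item_create` enforces optimistic concurrency: the client sends the etag it last saw, and the write is rejected with HTTP 409 (`wrong_etag`) when the item has moved on in the meantime. A minimal stand-alone sketch of the same check; the `store`/`update_item` names are illustrative, not part of the server:

```python
from typing import Dict, Optional, Tuple

class WrongEtag(Exception):
    """Raised when the caller's etag no longer matches the stored one."""

def update_item(store: Dict[str, Tuple[str, str]], uid: str,
                new_content: str, expected_etag: Optional[str]) -> str:
    """Update store[uid] only if the caller has seen the latest revision.

    `store` maps uid -> (etag, content); the etag here is a simple revision
    counter standing in for the revision uid the server uses.
    """
    cur_etag = store[uid][0] if uid in store else None
    if cur_etag != expected_etag:
        # Mirrors the HTTP 409 "wrong_etag" response above.
        raise WrongEtag("Wrong etag. Expected {} got {}".format(cur_etag, expected_etag))
    new_etag = str(int(cur_etag or "0") + 1)
    store[uid] = (new_etag, new_content)
    return new_etag
```

A client that gets `WrongEtag` is expected to fetch the current revision, merge, and retry with the fresh etag.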
@item_router.get("/item/{item_uid}/", response_model=CollectionItemOut, dependencies=PERMISSIONS_READ)
def item_get(
    item_uid: str,
    queryset: CollectionItemQuerySet = Depends(get_item_queryset),
    user: UserType = Depends(get_authenticated_user),
    prefetch: Prefetch = PrefetchQuery,
):
    obj = queryset.get(uid=item_uid)
    return CollectionItemOut.from_orm_context(obj, Context(user, prefetch))


def item_list_common(
    queryset: CollectionItemQuerySet,
    user: UserType,
    stoken: t.Optional[str],
    limit: int,
    prefetch: Prefetch,
) -> CollectionItemListResponse:
    result, new_stoken_obj, done = filter_by_stoken_and_limit(
        stoken, limit, queryset, models.CollectionItem.stoken_annotation
    )
    new_stoken = new_stoken_obj and new_stoken_obj.uid
    context = Context(user, prefetch)
    data: t.List[CollectionItemOut] = [CollectionItemOut.from_orm_context(item, context) for item in result]
    return CollectionItemListResponse(data=data, stoken=new_stoken, done=done)


@item_router.get("/item/", response_model=CollectionItemListResponse, dependencies=PERMISSIONS_READ)
def item_list(
    queryset: CollectionItemQuerySet = Depends(get_item_queryset),
    stoken: t.Optional[str] = None,
    limit: int = 50,
    prefetch: Prefetch = PrefetchQuery,
    withCollection: bool = False,
    user: UserType = Depends(get_authenticated_user),
):
    if not withCollection:
        queryset = queryset.filter(parent__isnull=True)

    response = item_list_common(queryset, user, stoken, limit, prefetch)
    return response


@item_router.post("/item/subscription-ticket/", response_model=TicketOut, dependencies=PERMISSIONS_READ)
async def item_list_subscription_ticket(
    collection: models.Collection = Depends(get_collection),
    user: UserType = Depends(get_authenticated_user),
):
    """Get an authentication ticket that can be used with the websocket endpoint"""
    return await get_ticket(TicketRequest(collection=collection.uid), user)


def item_bulk_common(
    data: ItemBatchIn,
    user: UserType,
    stoken: t.Optional[str],
    uid: str,
    validate_etag: bool,
    background_tasks: BackgroundTasks,
):
    queryset = get_collection_queryset(user)
    with transaction.atomic():  # We need this for locking the collection object
        collection_object = queryset.select_for_update().get(uid=uid)

        if stoken and stoken != collection_object.stoken:
            raise HttpError("stale_stoken", "Stoken is too old", status_code=status.HTTP_409_CONFLICT)

        data.validate_db()

        errors: t.List[HttpError] = []
        for item in data.items:
            try:
                item_create(item, collection_object, validate_etag)
            except ValidationError as e:
                errors.append(e)

        if len(errors) > 0:
            raise ValidationError(
                code="item_failed",
                detail="Items failed to validate",
                errors=errors,
                status_code=status.HTTP_409_CONFLICT,
            )

        background_tasks.add_task(report_items_changed, collection_object.uid, collection_object.stoken, data.items)
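`item_bulk_common` runs the whole batch inside one transaction, collecting per-item `ValidationError`s and raising a combined error afterwards, so a single bad item rolls back every write. The shape of that collect-then-raise pattern, with hypothetical names (`apply_batch`, `BatchError`), is:

```python
from typing import Callable, List

class BatchError(Exception):
    """Aggregates the per-item failures of one batch."""
    def __init__(self, errors: List[Exception]) -> None:
        super().__init__("{} items failed".format(len(errors)))
        self.errors = errors

def apply_batch(items: List[int], apply_one: Callable[[int], None]) -> None:
    """Try every item, collect failures, and fail the whole batch if any occurred."""
    errors: List[Exception] = []
    for item in items:
        try:
            apply_one(item)
        except ValueError as e:
            errors.append(e)
    if len(errors) > 0:
        # In the endpoint above this raise unwinds transaction.atomic(),
        # so no partial writes survive a failed batch.
        raise BatchError(errors)
```

Collecting errors instead of failing fast lets the server report every invalid item to the client in one round trip.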
@item_router.get(
    "/item/{item_uid}/revision/", response_model=CollectionItemRevisionListResponse, dependencies=PERMISSIONS_READ
)
def item_revisions(
    item_uid: str,
    limit: int = 50,
    iterator: t.Optional[str] = None,
    prefetch: Prefetch = PrefetchQuery,
    user: UserType = Depends(get_authenticated_user),
    items: CollectionItemQuerySet = Depends(get_item_queryset),
):
    item = get_object_or_404(items, uid=item_uid)

    queryset = item.revisions.order_by("-id")

    if iterator is not None:
        iterator_obj = get_object_or_404(queryset, uid=iterator)
        queryset = queryset.filter(id__lt=iterator_obj.id)

    result = list(queryset[: limit + 1])
    if len(result) < limit + 1:
        done = True
    else:
        done = False
        result = result[:-1]

    context = Context(user, prefetch)
    ret_data = [CollectionItemRevisionInOut.from_orm_context(revision, context) for revision in result]
    iterator = ret_data[-1].uid if len(result) > 0 else None

    return CollectionItemRevisionListResponse(
        data=ret_data,
        iterator=iterator,
        done=done,
    )
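`item_revisions` (like `list_common` in the invitation router further down) pages with the fetch-limit-plus-one trick: query `limit + 1` rows; if the extra row shows up there is another page, and the extra row is dropped before returning. The same pattern over a plain list, with an illustrative `paginate` helper that is not part of the server:

```python
from typing import List, Optional, Tuple

def paginate(items: List[str], iterator: Optional[str], limit: int) -> Tuple[List[str], Optional[str], bool]:
    """Return (page, next_iterator, done) using the fetch-limit+1 trick."""
    if iterator is not None:
        # Resume right after the last uid the client saw.
        start = items.index(iterator) + 1
        items = items[start:]
    result = items[: limit + 1]
    if len(result) < limit + 1:
        done = True
    else:
        done = False
        result = result[:-1]  # drop the probe row
    next_iterator = result[-1] if result else None
    return result, next_iterator, done
```

This avoids a separate COUNT query: the presence of the probe row alone tells the server whether `done` is false.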
@item_router.post("/item/fetch_updates/", response_model=CollectionItemListResponse, dependencies=PERMISSIONS_READ)
def fetch_updates(
    data: t.List[CollectionItemBulkGetIn],
    stoken: t.Optional[str] = None,
    prefetch: Prefetch = PrefetchQuery,
    user: UserType = Depends(get_authenticated_user),
    queryset: CollectionItemQuerySet = Depends(get_item_queryset),
):
    # FIXME: make configurable?
    item_limit = 200

    if len(data) > item_limit:
        raise HttpError("too_many_items", "Request has too many items.", status_code=status.HTTP_400_BAD_REQUEST)

    queryset, stoken_rev = filter_by_stoken(stoken, queryset, models.CollectionItem.stoken_annotation)

    uids, etags = zip(*[(item.uid, item.etag) for item in data])
    revs = models.CollectionItemRevision.objects.filter(uid__in=etags, current=True)
    queryset = queryset.filter(uid__in=uids).exclude(revisions__in=revs)

    new_stoken_obj = get_queryset_stoken(queryset)
    new_stoken = new_stoken_obj and new_stoken_obj.uid
    stoken_rev_uid = stoken_rev and getattr(stoken_rev, "uid", None)
    new_stoken = new_stoken or stoken_rev_uid

    context = Context(user, prefetch)
    return CollectionItemListResponse(
        data=[CollectionItemOut.from_orm_context(item, context) for item in queryset],
        stoken=new_stoken,
        done=True,  # we always return all the items, so it's always done
    )
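`fetch_updates` is essentially a set difference: of the items the client asked about, return only those whose current revision uid is not among the etags the client sent. Note that the exclusion checks against the whole set of sent etags, mirroring `exclude(revisions__in=revs)` above rather than a per-uid comparison. A dict-based sketch of that filter; `server`, `client`, and `changed_items` are illustrative names:

```python
from typing import Dict, List, Tuple

def changed_items(server: Dict[str, str], client: List[Tuple[str, str]]) -> List[str]:
    """Return the uids the client asked about whose current etag differs.

    `server` maps uid -> current revision uid; `client` is a list of
    (uid, last-seen-etag) pairs, like the request body above.
    """
    uids = [uid for uid, _ in client]
    etags = {etag for _, etag in client}
    # Exclude items whose current revision uid is among *any* of the sent
    # etags, matching the queryset exclusion in fetch_updates.
    return [uid for uid in uids if uid in server and server[uid] not in etags]
```
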
@item_router.post("/item/transaction/", dependencies=[Depends(has_write_access), *PERMISSIONS_READWRITE])
def item_transaction(
    collection_uid: str,
    data: ItemBatchIn,
    background_tasks: BackgroundTasks,
    stoken: t.Optional[str] = None,
    user: UserType = Depends(get_authenticated_user),
):
    return item_bulk_common(data, user, stoken, collection_uid, validate_etag=True, background_tasks=background_tasks)


@item_router.post("/item/batch/", dependencies=[Depends(has_write_access), *PERMISSIONS_READWRITE])
def item_batch(
    collection_uid: str,
    data: ItemBatchIn,
    background_tasks: BackgroundTasks,
    stoken: t.Optional[str] = None,
    user: UserType = Depends(get_authenticated_user),
):
    return item_bulk_common(data, user, stoken, collection_uid, validate_etag=False, background_tasks=background_tasks)


# Chunks


@sync_to_async
def chunk_save(chunk_uid: str, collection: models.Collection, content_file: ContentFile):
    chunk_obj = models.CollectionItemChunk(uid=chunk_uid, collection=collection)
    chunk_obj.chunkFile.save("IGNORED", content_file)
    chunk_obj.save()
    return chunk_obj


@item_router.put(
    "/item/{item_uid}/chunk/{chunk_uid}/",
    dependencies=[Depends(has_write_access), *PERMISSIONS_READWRITE],
    status_code=status.HTTP_201_CREATED,
)
async def chunk_update(
    request: Request,
    chunk_uid: str,
    collection: models.Collection = Depends(get_collection),
):
    # IGNORED FOR NOW: col_it = get_object_or_404(col.items, uid=collection_item_uid)
    content_file = ContentFile(await request.body())
    try:
        await chunk_save(chunk_uid, collection, content_file)
    except IntegrityError:
        raise HttpError("chunk_exists", "Chunk already exists.", status_code=status.HTTP_409_CONFLICT)


@item_router.get(
    "/item/{item_uid}/chunk/{chunk_uid}/download/",
    dependencies=PERMISSIONS_READ,
)
def chunk_download(
    chunk_uid: str,
    collection: models.Collection = Depends(get_collection),
):
    chunk = get_object_or_404(collection.chunks, uid=chunk_uid)

    filename = chunk.chunkFile.path
    return sendfile(filename)
244	etebase_fastapi/routers/invitation.py	Normal file
@@ -0,0 +1,244 @@
import typing as t

from django.db import transaction, IntegrityError
from django.db.models import QuerySet
from fastapi import APIRouter, Depends, status, Request

from django_etebase import models
from django_etebase.utils import get_user_queryset, CallbackContext
from myauth.models import UserType, get_typed_user_model
from .authentication import get_authenticated_user
from ..exceptions import HttpError, PermissionDenied
from ..msgpack import MsgpackRoute
from ..utils import (
    get_object_or_404,
    get_user_username_email_kwargs,
    Context,
    is_collection_admin,
    BaseModel,
    permission_responses,
    PERMISSIONS_READ,
    PERMISSIONS_READWRITE,
)
from ..db_hack import django_db_cleanup_decorator

User = get_typed_user_model()
invitation_incoming_router = APIRouter(route_class=MsgpackRoute, responses=permission_responses)
invitation_outgoing_router = APIRouter(route_class=MsgpackRoute, responses=permission_responses)
InvitationQuerySet = QuerySet[models.CollectionInvitation]
default_queryset: InvitationQuerySet = models.CollectionInvitation.objects.all()


class UserInfoOut(BaseModel):
    pubkey: bytes

    class Config:
        orm_mode = True

    @classmethod
    def from_orm(cls: t.Type["UserInfoOut"], obj: models.UserInfo) -> "UserInfoOut":
        return cls(pubkey=bytes(obj.pubkey))


class CollectionInvitationAcceptIn(BaseModel):
    collectionType: bytes
    encryptionKey: bytes


class CollectionInvitationCommon(BaseModel):
    uid: str
    version: int
    accessLevel: models.AccessLevels
    username: str
    collection: str
    signedEncryptionKey: bytes


class CollectionInvitationIn(CollectionInvitationCommon):
    def validate_db(self, context: Context):
        user = context.user
        if user is not None and (user.username == self.username.lower()):
            raise HttpError("no_self_invite", "Inviting yourself is not allowed")


class CollectionInvitationOut(CollectionInvitationCommon):
    fromUsername: str
    fromPubkey: bytes

    class Config:
        orm_mode = True

    @classmethod
    def from_orm(cls: t.Type["CollectionInvitationOut"], obj: models.CollectionInvitation) -> "CollectionInvitationOut":
        return cls(
            uid=obj.uid,
            version=obj.version,
            accessLevel=obj.accessLevel,
            username=obj.user.username,
            collection=obj.collection.uid,
            fromUsername=obj.fromMember.user.username,
            fromPubkey=bytes(obj.fromMember.user.userinfo.pubkey),
            signedEncryptionKey=bytes(obj.signedEncryptionKey),
        )


class InvitationListResponse(BaseModel):
    data: t.List[CollectionInvitationOut]
    iterator: t.Optional[str]
    done: bool


@django_db_cleanup_decorator
def get_incoming_queryset(user: UserType = Depends(get_authenticated_user)):
    return default_queryset.filter(user=user)


@django_db_cleanup_decorator
def get_outgoing_queryset(user: UserType = Depends(get_authenticated_user)):
    return default_queryset.filter(fromMember__user=user)


def list_common(
    queryset: InvitationQuerySet,
    iterator: t.Optional[str],
    limit: int,
) -> InvitationListResponse:
    queryset = queryset.order_by("id")

    if iterator is not None:
        iterator_obj = get_object_or_404(queryset, uid=iterator)
        queryset = queryset.filter(id__gt=iterator_obj.id)

    result = list(queryset[: limit + 1])
    if len(result) < limit + 1:
        done = True
    else:
        done = False
        result = result[:-1]

    ret_data = result
    iterator = ret_data[-1].uid if len(result) > 0 else None

    return InvitationListResponse(
        data=ret_data,
        iterator=iterator,
        done=done,
    )


@invitation_incoming_router.get("/", response_model=InvitationListResponse, dependencies=PERMISSIONS_READ)
def incoming_list(
    iterator: t.Optional[str] = None,
    limit: int = 50,
    queryset: InvitationQuerySet = Depends(get_incoming_queryset),
):
    return list_common(queryset, iterator, limit)


@invitation_incoming_router.get(
    "/{invitation_uid}/", response_model=CollectionInvitationOut, dependencies=PERMISSIONS_READ
)
def incoming_get(
    invitation_uid: str,
    queryset: InvitationQuerySet = Depends(get_incoming_queryset),
):
    obj = get_object_or_404(queryset, uid=invitation_uid)
    return CollectionInvitationOut.from_orm(obj)


@invitation_incoming_router.delete(
    "/{invitation_uid}/", status_code=status.HTTP_204_NO_CONTENT, dependencies=PERMISSIONS_READWRITE
)
def incoming_delete(
    invitation_uid: str,
    queryset: InvitationQuerySet = Depends(get_incoming_queryset),
):
    obj = get_object_or_404(queryset, uid=invitation_uid)
    obj.delete()


@invitation_incoming_router.post(
    "/{invitation_uid}/accept/", status_code=status.HTTP_201_CREATED, dependencies=PERMISSIONS_READWRITE
)
def incoming_accept(
    invitation_uid: str,
    data: CollectionInvitationAcceptIn,
    queryset: InvitationQuerySet = Depends(get_incoming_queryset),
):
    invitation = get_object_or_404(queryset, uid=invitation_uid)

    with transaction.atomic():
        user = invitation.user
        collection_type_obj, _ = models.CollectionType.objects.get_or_create(uid=data.collectionType, owner=user)

        models.CollectionMember.objects.create(
            collection=invitation.collection,
            stoken=models.Stoken.objects.create(),
            user=user,
            accessLevel=invitation.accessLevel,
            encryptionKey=data.encryptionKey,
            collectionType=collection_type_obj,
        )

        models.CollectionMemberRemoved.objects.filter(user=invitation.user, collection=invitation.collection).delete()

        invitation.delete()


@invitation_outgoing_router.post("/", status_code=status.HTTP_201_CREATED, dependencies=PERMISSIONS_READWRITE)
def outgoing_create(
    data: CollectionInvitationIn,
    request: Request,
    user: UserType = Depends(get_authenticated_user),
):
    collection = get_object_or_404(models.Collection.objects, uid=data.collection)
    kwargs = get_user_username_email_kwargs(data.username)
    to_user = get_object_or_404(get_user_queryset(User.objects.all(), CallbackContext(request.path_params)), **kwargs)

    context = Context(user, None)
    data.validate_db(context)

    if not is_collection_admin(collection, user):
        raise PermissionDenied("admin_access_required", "User is not an admin of this collection")

    member = collection.members.get(user=user)

    with transaction.atomic():
        try:
            models.CollectionInvitation.objects.create(
                **data.dict(exclude={"collection", "username"}), user=to_user, fromMember=member
            )
        except IntegrityError:
            raise HttpError("invitation_exists", "Invitation already exists")


@invitation_outgoing_router.get("/", response_model=InvitationListResponse, dependencies=PERMISSIONS_READ)
def outgoing_list(
    iterator: t.Optional[str] = None,
    limit: int = 50,
    queryset: InvitationQuerySet = Depends(get_outgoing_queryset),
):
    return list_common(queryset, iterator, limit)


@invitation_outgoing_router.delete(
    "/{invitation_uid}/", status_code=status.HTTP_204_NO_CONTENT, dependencies=PERMISSIONS_READWRITE
)
def outgoing_delete(
    invitation_uid: str,
    queryset: InvitationQuerySet = Depends(get_outgoing_queryset),
):
    obj = get_object_or_404(queryset, uid=invitation_uid)
    obj.delete()


@invitation_outgoing_router.get("/fetch_user_profile/", response_model=UserInfoOut, dependencies=PERMISSIONS_READ)
def outgoing_fetch_user_profile(
    username: str,
    request: Request,
    user: UserType = Depends(get_authenticated_user),
):
    kwargs = get_user_username_email_kwargs(username)
    user = get_object_or_404(get_user_queryset(User.objects.all(), CallbackContext(request.path_params)), **kwargs)
    user_info = get_object_or_404(models.UserInfo.objects.all(), owner=user)
    return UserInfoOut.from_orm(user_info)
109	etebase_fastapi/routers/member.py	Normal file
@@ -0,0 +1,109 @@
import typing as t

from django.db import transaction
from django.db.models import QuerySet
from fastapi import APIRouter, Depends, status

from django_etebase import models
from myauth.models import UserType, get_typed_user_model
from .authentication import get_authenticated_user
from ..msgpack import MsgpackRoute
from ..utils import get_object_or_404, BaseModel, permission_responses, PERMISSIONS_READ, PERMISSIONS_READWRITE
from ..stoken_handler import filter_by_stoken_and_limit
from ..db_hack import django_db_cleanup_decorator

from .collection import get_collection, verify_collection_admin

User = get_typed_user_model()
member_router = APIRouter(route_class=MsgpackRoute, responses=permission_responses)
MemberQuerySet = QuerySet[models.CollectionMember]
default_queryset: MemberQuerySet = models.CollectionMember.objects.all()


@django_db_cleanup_decorator
def get_queryset(collection: models.Collection = Depends(get_collection)) -> MemberQuerySet:
    return default_queryset.filter(collection=collection)


@django_db_cleanup_decorator
def get_member(username: str, queryset: MemberQuerySet = Depends(get_queryset)) -> models.CollectionMember:
    return get_object_or_404(queryset, user__username__iexact=username)


class CollectionMemberModifyAccessLevelIn(BaseModel):
    accessLevel: models.AccessLevels


class CollectionMemberOut(BaseModel):
    username: str
    accessLevel: models.AccessLevels

    class Config:
        orm_mode = True

    @classmethod
    def from_orm(cls: t.Type["CollectionMemberOut"], obj: models.CollectionMember) -> "CollectionMemberOut":
        return cls(username=obj.user.username, accessLevel=obj.accessLevel)


class MemberListResponse(BaseModel):
    data: t.List[CollectionMemberOut]
    iterator: t.Optional[str]
    done: bool


@member_router.get(
    "/member/", response_model=MemberListResponse, dependencies=[Depends(verify_collection_admin), *PERMISSIONS_READ]
)
def member_list(
    iterator: t.Optional[str] = None,
    limit: int = 50,
    queryset: MemberQuerySet = Depends(get_queryset),
):
    queryset = queryset.order_by("id")
    result, new_stoken_obj, done = filter_by_stoken_and_limit(
        iterator, limit, queryset, models.CollectionMember.stoken_annotation
    )
    new_stoken = new_stoken_obj and new_stoken_obj.uid

    return MemberListResponse(
        data=[CollectionMemberOut.from_orm(item) for item in result],
        iterator=new_stoken,
        done=done,
    )


@member_router.delete(
    "/member/{username}/",
    status_code=status.HTTP_204_NO_CONTENT,
    dependencies=[Depends(verify_collection_admin), *PERMISSIONS_READWRITE],
)
def member_delete(
    obj: models.CollectionMember = Depends(get_member),
):
    obj.revoke()


@member_router.patch(
    "/member/{username}/",
    status_code=status.HTTP_204_NO_CONTENT,
    dependencies=[Depends(verify_collection_admin), *PERMISSIONS_READWRITE],
)
def member_patch(
    data: CollectionMemberModifyAccessLevelIn,
    instance: models.CollectionMember = Depends(get_member),
):
    with transaction.atomic():
        # We only allow updating accessLevel
        if instance.accessLevel != data.accessLevel:
            instance.stoken = models.Stoken.objects.create()
            instance.accessLevel = data.accessLevel
            instance.save()


@member_router.post("/member/leave/", status_code=status.HTTP_204_NO_CONTENT, dependencies=PERMISSIONS_READ)
def member_leave(
    user: UserType = Depends(get_authenticated_user), collection: models.Collection = Depends(get_collection)
):
    obj = get_object_or_404(collection.members, user=user)
    obj.revoke()
38	etebase_fastapi/routers/test_reset_view.py	Normal file
@@ -0,0 +1,38 @@
from django.conf import settings
from django.db import transaction
from django.shortcuts import get_object_or_404
from fastapi import APIRouter, Request, status

from django_etebase.utils import get_user_queryset, CallbackContext
from .authentication import SignupIn, signup_save
from ..msgpack import MsgpackRoute
from ..exceptions import HttpError
from myauth.models import get_typed_user_model

test_reset_view_router = APIRouter(route_class=MsgpackRoute, tags=["test helpers"])
User = get_typed_user_model()


@test_reset_view_router.post("/reset/", status_code=status.HTTP_204_NO_CONTENT)
def reset(data: SignupIn, request: Request):
    # Only run when in DEBUG mode! It's only used for tests
    if not settings.DEBUG:
        raise HttpError(code="generic", detail="Only allowed in debug mode.")

    with transaction.atomic():
        user_queryset = get_user_queryset(User.objects.all(), CallbackContext(request.path_params))
        user = get_object_or_404(user_queryset, username=data.user.username)
        # Only allow test users for extra safety
        if not getattr(user, User.USERNAME_FIELD).startswith("test_user"):
            raise HttpError(code="generic", detail="Endpoint not allowed for user.")

        if hasattr(user, "userinfo"):
            user.userinfo.delete()

        signup_save(data, request)
        # Delete all of the journal data for this user for a clear test env
        user.collection_set.all().delete()
        user.collectionmember_set.all().delete()
        user.incoming_invitations.all().delete()

        # FIXME: also delete chunk files!!!
145	etebase_fastapi/routers/websocket.py	Normal file
@@ -0,0 +1,145 @@
import asyncio
import typing as t

import aioredis
from asgiref.sync import sync_to_async
from django.db.models import QuerySet
from fastapi import APIRouter, Depends, WebSocket, WebSocketDisconnect, status
import nacl.encoding
import nacl.utils

from django_etebase import models
from django_etebase.utils import CallbackContext, get_user_queryset
from myauth.models import UserType, get_typed_user_model

from ..dependencies import get_collection_queryset, get_item_queryset
from ..exceptions import NotSupported
from ..msgpack import MsgpackRoute, msgpack_decode, msgpack_encode
from ..redis import redisw
from ..utils import BaseModel, permission_responses


User = get_typed_user_model()
websocket_router = APIRouter(route_class=MsgpackRoute, responses=permission_responses)
CollectionQuerySet = QuerySet[models.Collection]


TICKET_VALIDITY_SECONDS = 10


class TicketRequest(BaseModel):
    collection: str


class TicketOut(BaseModel):
    ticket: str


class TicketInner(BaseModel):
    user: int
    req: TicketRequest


async def get_ticket(
    ticket_request: TicketRequest,
    user: UserType,
):
    """Get an authentication ticket that can be used with the websocket endpoint for authentication"""
    if not redisw.is_active:
        raise NotSupported(detail="This end-point requires Redis to be configured")

    uid = nacl.encoding.URLSafeBase64Encoder.encode(nacl.utils.random(32))
    ticket_model = TicketInner(user=user.id, req=ticket_request)
    ticket_raw = msgpack_encode(ticket_model.dict())
    await redisw.redis.set(uid, ticket_raw, expire=TICKET_VALIDITY_SECONDS * 1000)
    return TicketOut(ticket=uid)
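`get_ticket` issues a random, short-lived token in Redis, and `load_websocket_ticket` below consumes it exactly once (get, then delete). The same single-use, expiring-ticket flow with an in-memory dict standing in for Redis; `TicketStore` is an illustrative sketch, not the server's storage:

```python
import secrets
import time
from typing import Dict, Optional, Tuple

TICKET_VALIDITY_SECONDS = 10

class TicketStore:
    """In-memory stand-in for the Redis SET-with-expiry / GET / DELETE flow."""

    def __init__(self) -> None:
        self._store: Dict[str, Tuple[float, str]] = {}

    def issue(self, payload: str) -> str:
        # Random, unguessable ticket uid, like the nacl.utils.random(32) above.
        uid = secrets.token_urlsafe(32)
        self._store[uid] = (time.monotonic() + TICKET_VALIDITY_SECONDS, payload)
        return uid

    def consume(self, uid: str) -> Optional[str]:
        entry = self._store.pop(uid, None)  # single use: always removed
        if entry is None or entry[0] < time.monotonic():
            return None  # unknown or expired
        return entry[1]
```

Passing a throwaway ticket in the URL instead of the real auth token keeps long-lived credentials out of websocket URLs and server logs.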
async def load_websocket_ticket(websocket: WebSocket, ticket: str) -> t.Optional[TicketInner]:
    content = await redisw.redis.get(ticket)
    if content is None:
        await websocket.close(code=status.WS_1008_POLICY_VIOLATION)
        return None
    await redisw.redis.delete(ticket)
    return TicketInner(**msgpack_decode(content))


def get_websocket_user(websocket: WebSocket, ticket_model: t.Optional[TicketInner] = Depends(load_websocket_ticket)):
    if ticket_model is None:
        return None
    user_queryset = get_user_queryset(User.objects.all(), CallbackContext(websocket.path_params))
    return user_queryset.get(id=ticket_model.user)


@websocket_router.websocket("/{ticket}/")
async def websocket_endpoint(
    websocket: WebSocket,
    stoken: t.Optional[str] = None,
    user: t.Optional[UserType] = Depends(get_websocket_user),
    ticket_model: TicketInner = Depends(load_websocket_ticket),
):
    if user is None:
        return
    await websocket.accept()
    await redis_connector(websocket, ticket_model, user, stoken)


async def send_item_updates(
    websocket: WebSocket,
    collection: models.Collection,
    user: UserType,
    stoken: t.Optional[str],
):
    from .collection import item_list_common

    done = False
    while not done:
        queryset = await sync_to_async(get_item_queryset)(collection)
        response = await sync_to_async(item_list_common)(queryset, user, stoken, limit=50, prefetch="auto")
        done = response.done
        if len(response.data) > 0:
            await websocket.send_bytes(msgpack_encode(response.dict()))


async def redis_connector(websocket: WebSocket, ticket_model: TicketInner, user: UserType, stoken: t.Optional[str]):
    async def producer_handler(r: aioredis.Redis, ws: WebSocket):
        channel_name = f"col.{ticket_model.req.collection}"
        (channel,) = await r.psubscribe(channel_name)
        assert isinstance(channel, aioredis.Channel)

        # Send missing items if we are not up to date
        queryset: QuerySet[models.Collection] = get_collection_queryset(user)
        collection: t.Optional[models.Collection] = await sync_to_async(
            queryset.filter(uid=ticket_model.req.collection).first
        )()
        if collection is None:
            await websocket.close(code=status.WS_1008_POLICY_VIOLATION)
            return
        await send_item_updates(websocket, collection, user, stoken)

        try:
            while True:
                # We also wait on the websocket, so we notice when it fails or receives data
                receive = asyncio.create_task(websocket.receive())
                done, pending = await asyncio.wait(
                    {receive, channel.wait_message()}, return_when=asyncio.FIRST_COMPLETED
                )
                for task in pending:
                    task.cancel()
                if receive in done:
                    # The websocket should never receive any data
                    await websocket.close(code=status.WS_1008_POLICY_VIOLATION)
                    return

                message_raw = t.cast(t.Optional[t.Tuple[str, bytes]], await channel.get())
                if message_raw:
                    _, message = message_raw
                    await ws.send_bytes(message)

        except aioredis.errors.ConnectionClosedError:
            await websocket.close(code=status.WS_1012_SERVICE_RESTART)
        except WebSocketDisconnect:
            pass

    redis = redisw.redis
    await producer_handler(redis, websocket)
|
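The loop above waits on two event sources at once (the websocket and the redis channel), acts on whichever completes first, and cancels the loser. That `asyncio.wait(..., return_when=FIRST_COMPLETED)` pattern can be sketched in isolation; `race` below is a hypothetical helper, not part of the codebase:

```python
import asyncio


async def race(coro_a, coro_b):
    # Run two awaitables concurrently; return the result of whichever
    # finishes first and cancel the other -- the same shape as waiting
    # on websocket.receive() and channel.wait_message() above.
    task_a = asyncio.create_task(coro_a)
    task_b = asyncio.create_task(coro_b)
    done, pending = await asyncio.wait({task_a, task_b}, return_when=asyncio.FIRST_COMPLETED)
    for task in pending:
        task.cancel()
    return next(iter(done)).result()


async def main():
    async def fast():
        await asyncio.sleep(0.01)
        return "fast"

    async def slow():
        await asyncio.sleep(1)
        return "slow"

    return await race(fast(), slow())


print(asyncio.run(main()))  # prints "fast"
```

Cancelling the pending task matters: without it, the abandoned `websocket.receive()` would keep consuming the next frame.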
28  etebase_fastapi/sendfile/LICENSE  Normal file
@@ -0,0 +1,28 @@
Copyright (c) 2011, Sensible Development.
Copyright (c) 2019, Matt Molyneaux
All rights reserved.

Redistribution and use in source and binary forms, with or without modification,
are permitted provided that the following conditions are met:

1. Redistributions of source code must retain the above copyright notice,
   this list of conditions and the following disclaimer.

2. Redistributions in binary form must reproduce the above copyright
   notice, this list of conditions and the following disclaimer in the
   documentation and/or other materials provided with the distribution.

3. Neither the name of Django Send File nor the names of its
   contributors may be used to endorse or promote products derived from
   this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR
ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON
ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
3  etebase_fastapi/sendfile/README.md  Normal file
@@ -0,0 +1,3 @@
Heavily inspired by, and code borrowed from: https://github.com/moggers87/django-sendfile2/

We simplified and inlined it because we don't want another external dependency for distribution packagers to package, and because we need a much simpler version.
1  etebase_fastapi/sendfile/__init__.py  Normal file
@@ -0,0 +1 @@
from .utils import sendfile  # noqa
9  etebase_fastapi/sendfile/backends/mod_wsgi.py  Normal file
@@ -0,0 +1,9 @@
from __future__ import absolute_import

from fastapi import Response

from ..utils import _convert_file_to_url


def sendfile(filename, **kwargs):
    return Response(headers={"Location": _convert_file_to_url(filename)})
9  etebase_fastapi/sendfile/backends/nginx.py  Normal file
@@ -0,0 +1,9 @@
from __future__ import absolute_import

from fastapi import Response

from ..utils import _convert_file_to_url


def sendfile(filename, **kwargs):
    return Response(headers={"X-Accel-Redirect": _convert_file_to_url(filename)})
12  etebase_fastapi/sendfile/backends/simple.py  Normal file
@@ -0,0 +1,12 @@
from fastapi.responses import FileResponse


def sendfile(filename, mimetype, **kwargs):
    """Use the SENDFILE_ROOT value composed with the path passed as an argument
    to build an absolute path with which to resolve and return the file contents.

    If the path points to a file outside the root directory (this should cover both
    situations with '..' and symlinks) then a 404 is raised.
    """

    return FileResponse(filename, media_type=mimetype)
6  etebase_fastapi/sendfile/backends/xsendfile.py  Normal file
@@ -0,0 +1,6 @@
from fastapi import Response


def sendfile(filename, **kwargs):
    filename = str(filename)
    return Response(headers={"X-Sendfile": filename})
88  etebase_fastapi/sendfile/utils.py  Normal file
@@ -0,0 +1,88 @@
from functools import lru_cache
from importlib import import_module
from pathlib import Path, PurePath
from urllib.parse import quote
import logging

from fastapi import status
from ..exceptions import HttpError

from django.conf import settings
from django.core.exceptions import ImproperlyConfigured

logger = logging.getLogger(__name__)


@lru_cache(maxsize=None)
def _get_sendfile():
    backend = getattr(settings, "SENDFILE_BACKEND", None)
    if not backend:
        raise ImproperlyConfigured("You must specify a value for SENDFILE_BACKEND")
    module = import_module(backend)
    return module.sendfile


def _convert_file_to_url(path):
    try:
        url_root = PurePath(getattr(settings, "SENDFILE_URL", None))
    except TypeError:
        return path

    path_root = PurePath(settings.SENDFILE_ROOT)
    path_obj = PurePath(path)

    relpath = path_obj.relative_to(path_root)
    # Python 3.5: Path.resolve() has no `strict` kwarg, so use pathmod from an
    # already instantiated Path object
    url = relpath._flavour.pathmod.normpath(str(url_root / relpath))

    return quote(str(url))


def _sanitize_path(filepath):
    try:
        path_root = Path(getattr(settings, "SENDFILE_ROOT", None))
    except TypeError:
        raise ImproperlyConfigured("You must specify a value for SENDFILE_ROOT")

    filepath_obj = Path(filepath)

    # get absolute path
    # Python 3.5: Path.resolve() has no `strict` kwarg, so use pathmod from an
    # already instantiated Path object
    filepath_abs = Path(filepath_obj._flavour.pathmod.normpath(str(path_root / filepath_obj)))

    # if filepath_abs is not relative to path_root, relative_to throws an error
    try:
        filepath_abs.relative_to(path_root)
    except ValueError:
        raise HttpError(
            "generic", "{} wrt {} is impossible".format(filepath_abs, path_root), status_code=status.HTTP_404_NOT_FOUND
        )

    return filepath_abs


def sendfile(filename, mimetype="application/octet-stream", encoding=None):
    """
    Create a response to send a file using the backend configured in ``SENDFILE_BACKEND``.

    ``filename`` is the absolute path to the file to send.
    """
    filepath_obj = _sanitize_path(filename)
    logger.debug("filename '%s' requested -> filepath '%s' obtained", filename, filepath_obj)
    _sendfile = _get_sendfile()

    if not filepath_obj.exists():
        raise HttpError("does_not_exist", '"%s" does not exist' % filepath_obj, status_code=status.HTTP_404_NOT_FOUND)

    response = _sendfile(filepath_obj, mimetype=mimetype)

    response.headers["Content-Type"] = mimetype

    return response
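The traversal check in `_sanitize_path` works by normalizing the joined path first and then requiring that the result is still under `SENDFILE_ROOT` via `relative_to`, which raises `ValueError` on escape. A standalone sketch of that check (the `sanitize` helper here is illustrative, not the function from the file):

```python
import posixpath
from pathlib import PurePosixPath


def sanitize(root: str, filepath: str) -> str:
    # Join and normalize, as _sanitize_path does with normpath().
    root_path = PurePosixPath(root)
    abs_path = PurePosixPath(posixpath.normpath(str(root_path / filepath)))
    # relative_to() raises ValueError when abs_path escapes the root
    # (this covers ".." components folded in by normpath).
    abs_path.relative_to(root_path)
    return str(abs_path)


print(sanitize("/srv/media", "chunks/abc"))  # /srv/media/chunks/abc
try:
    sanitize("/srv/media", "../secret.txt")
except ValueError:
    print("rejected")  # sendfile() turns this into a 404
```

Note that normalization must happen before the containment check; checking the raw string first would let `chunks/../../secret.txt` through.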
62  etebase_fastapi/stoken_handler.py  Normal file
@@ -0,0 +1,62 @@
import typing as t

from django.db.models import QuerySet
from fastapi import status

from django_etebase.models import Stoken

from .exceptions import HttpError

# TODO missing stoken_annotation type
StokenAnnotation = t.Any


def get_stoken_obj(stoken: t.Optional[str]) -> t.Optional[Stoken]:
    if stoken:
        try:
            return Stoken.objects.get(uid=stoken)
        except Stoken.DoesNotExist:
            raise HttpError("bad_stoken", "Invalid stoken.", status_code=status.HTTP_400_BAD_REQUEST)

    return None


def filter_by_stoken(
    stoken: t.Optional[str], queryset: QuerySet, stoken_annotation: StokenAnnotation
) -> t.Tuple[QuerySet, t.Optional[Stoken]]:
    stoken_rev = get_stoken_obj(stoken)

    queryset = queryset.annotate(max_stoken=stoken_annotation).order_by("max_stoken")

    if stoken_rev is not None:
        queryset = queryset.filter(max_stoken__gt=stoken_rev.id)

    return queryset, stoken_rev


def get_queryset_stoken(queryset: t.Iterable[t.Any]) -> t.Optional[Stoken]:
    maxid = -1
    for row in queryset:
        rowmaxid = getattr(row, "max_stoken") or -1
        maxid = max(maxid, rowmaxid)
    new_stoken = Stoken.objects.get(id=maxid) if (maxid >= 0) else None

    return new_stoken or None


def filter_by_stoken_and_limit(
    stoken: t.Optional[str], limit: int, queryset: QuerySet, stoken_annotation: StokenAnnotation
) -> t.Tuple[list, t.Optional[Stoken], bool]:
    queryset, stoken_rev = filter_by_stoken(stoken=stoken, queryset=queryset, stoken_annotation=stoken_annotation)

    result = list(queryset[: limit + 1])
    if len(result) < limit + 1:
        done = True
    else:
        done = False
        result = result[:-1]

    new_stoken_obj = get_queryset_stoken(result) or stoken_rev

    return result, new_stoken_obj, done
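`filter_by_stoken_and_limit` fetches `limit + 1` rows purely to learn whether another page exists: getting fewer than `limit + 1` back means this is the last page, otherwise the extra sentinel row is dropped. The trick can be shown on a plain list; `paginate` is a hypothetical stand-in for the queryset slicing:

```python
def paginate(rows, limit):
    # Take limit + 1 items: the extra one only tells us whether more
    # data exists, so it is never returned to the caller.
    result = rows[: limit + 1]
    if len(result) < limit + 1:
        return result, True  # done: no further page
    return result[:-1], False  # drop the sentinel row


print(paginate([1, 2, 3], 5))     # ([1, 2, 3], True)
print(paginate([1, 2, 3, 4], 3))  # ([1, 2, 3], False)
```

This avoids a separate `COUNT(*)` query per page just to compute the `done` flag.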
85  etebase_fastapi/utils.py  Normal file
@@ -0,0 +1,85 @@
import dataclasses
import typing as t
from typing_extensions import Literal
import msgpack
import base64

from fastapi import status, Query, Depends
from pydantic import BaseModel as PyBaseModel

from django.db.models import Model, QuerySet
from django.core.exceptions import ObjectDoesNotExist

from django_etebase import app_settings
from django_etebase.models import AccessLevels
from myauth.models import UserType, get_typed_user_model

from .exceptions import HttpError, HttpErrorOut

User = get_typed_user_model()

Prefetch = Literal["auto", "medium"]
PrefetchQuery = Query(default="auto")


T = t.TypeVar("T", bound=Model, covariant=True)


class BaseModel(PyBaseModel):
    class Config:
        json_encoders = {
            bytes: lambda x: x,
        }


@dataclasses.dataclass
class Context:
    user: t.Optional[UserType]
    prefetch: t.Optional[Prefetch]


def get_object_or_404(queryset: QuerySet[T], **kwargs) -> T:
    try:
        return queryset.get(**kwargs)
    except ObjectDoesNotExist as e:
        raise HttpError("does_not_exist", str(e), status_code=status.HTTP_404_NOT_FOUND)


def is_collection_admin(collection, user):
    member = collection.members.filter(user=user).first()
    return (member is not None) and (member.accessLevel == AccessLevels.ADMIN)


def msgpack_encode(content) -> bytes:
    ret = msgpack.packb(content, use_bin_type=True)
    assert ret is not None
    return ret


def msgpack_decode(content: bytes):
    return msgpack.unpackb(content, raw=False)


def b64encode(value: bytes):
    return base64.urlsafe_b64encode(value).decode("ascii").strip("=")


def b64decode(data: str):
    data += "=" * ((4 - len(data) % 4) % 4)
    return base64.urlsafe_b64decode(data)


def get_user_username_email_kwargs(username: str):
    field_name = User.EMAIL_FIELD if "@" in username else User.USERNAME_FIELD
    return {field_name + "__iexact": username.lower()}


PERMISSIONS_READ = [Depends(x) for x in app_settings.API_PERMISSIONS_READ]
PERMISSIONS_READWRITE = PERMISSIONS_READ + [Depends(x) for x in app_settings.API_PERMISSIONS_WRITE]


response_model_dict = {"model": HttpErrorOut}
permission_responses: t.Dict[t.Union[int, str], t.Dict[str, t.Any]] = {
    401: response_model_dict,
    403: response_model_dict,
}
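The url-safe base64 helpers above strip the trailing `=` padding on encode, and the round trip still works because `b64decode` restores padding up to a multiple of four before decoding. A standalone sketch of the same pair:

```python
import base64


def b64encode(value: bytes) -> str:
    # url-safe base64 with the trailing '=' padding stripped
    return base64.urlsafe_b64encode(value).decode("ascii").strip("=")


def b64decode(data: str) -> bytes:
    # re-pad to a multiple of 4 characters before decoding
    data += "=" * ((4 - len(data) % 4) % 4)
    return base64.urlsafe_b64decode(data)


token = b64encode(b"\x00\xffhello")
print(token)  # AP9oZWxsbw
assert b64decode(token) == b"\x00\xffhello"
```

Padding carries no information (the payload length is recoverable from the character count), so dropping it yields shorter, URL-clean identifiers.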
@@ -1,16 +1,19 @@
-"""
-ASGI config for etebase_server project.
-
-It exposes the ASGI callable as a module-level variable named ``application``.
-
-For more information on this file, see
-https://docs.djangoproject.com/en/3.0/howto/deployment/asgi/
-"""
-
 import os

 from django.core.asgi import get_asgi_application

 os.environ.setdefault("DJANGO_SETTINGS_MODULE", "etebase_server.settings")
+django_application = get_asgi_application()

-application = get_asgi_application()

+def create_application():
+    from etebase_fastapi.main import create_application
+
+    app = create_application()
+
+    app.mount("/", django_application)
+
+    return app
+
+
+application = create_application()
@@ -53,8 +53,6 @@ INSTALLED_APPS = [
     "django.contrib.sessions",
     "django.contrib.messages",
     "django.contrib.staticfiles",
-    "corsheaders",
-    "rest_framework",
     "myauth.apps.MyauthConfig",
     "django_etebase.apps.DjangoEtebaseConfig",
     "django_etebase.token_auth.apps.TokenAuthConfig",
@@ -63,7 +61,6 @@ INSTALLED_APPS = [
 MIDDLEWARE = [
     "django.middleware.security.SecurityMiddleware",
     "django.contrib.sessions.middleware.SessionMiddleware",
-    "corsheaders.middleware.CorsMiddleware",
     "django.middleware.common.CommonMiddleware",
     "django.middleware.csrf.CsrfViewMiddleware",
     "django.contrib.auth.middleware.AuthenticationMiddleware",
@@ -124,9 +121,6 @@ USE_L10N = True

 USE_TZ = True

-# Cors
-CORS_ORIGIN_ALLOW_ALL = True
-
 # Static files (CSS, JavaScript, Images)
 # https://docs.djangoproject.com/en/3.0/howto/static-files/

@@ -141,7 +135,6 @@ ETEBASE_API_AUTHENTICATORS = (
     "django_etebase.token_auth.authentication.TokenAuthentication",
     "rest_framework.authentication.SessionAuthentication",
 )
-ETEBASE_CREATE_USER_FUNC = "django_etebase.utils.create_user_blocked"

 # Define where to find configuration files
 config_locations = [
@@ -166,6 +159,9 @@ if any(os.path.isfile(x) for x in config_locations):
         TIME_ZONE = section.get("time_zone", TIME_ZONE)
         DEBUG = section.getboolean("debug", DEBUG)

+        if "redis_uri" in section:
+            ETEBASE_REDIS_URI = section.get("redis_uri")
+
     if "allowed_hosts" in config:
         ALLOWED_HOSTS = [y for x, y in config.items("allowed_hosts")]

@@ -184,6 +180,12 @@ if any(os.path.isfile(x) for x in config_locations):
         ETEBASE_CREATE_USER_FUNC = "myauth.ldap.create_user"
         ETEBASE_API_PERMISSIONS.append("myauth.ldap.LDAPUserExists")

+ETEBASE_CREATE_USER_FUNC = "django_etebase.utils.create_user_blocked"
+
+# Efficient file streaming (for large files)
+SENDFILE_BACKEND = "django_etebase.sendfile.backends.simple"
+SENDFILE_ROOT = MEDIA_URL
+
 # Make an `etebase_server_settings` module available to override settings.
 try:
     from etebase_server_settings import *
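The new optional `redis_uri` key is read with the same `in section` guard used for the other ini settings, so the variable is only defined when the key is present. A minimal configparser sketch with made-up config contents:

```python
import configparser

config = configparser.ConfigParser()
config.read_string("""
[global]
debug = false
redis_uri = redis://localhost:6379
""")

section = config["global"]
DEBUG = section.getboolean("debug", False)

# Only bind the setting when the key exists, mirroring the hunk above;
# code elsewhere can then use getattr(settings, "ETEBASE_REDIS_URI", None).
if "redis_uri" in section:
    ETEBASE_REDIS_URI = section.get("redis_uri")

print(ETEBASE_REDIS_URI)  # redis://localhost:6379
```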
@@ -1,16 +1,25 @@
+import os
+
 from django.conf import settings
-from django.conf.urls import include, url
+from django.conf.urls import url
 from django.contrib import admin
-from django.urls import path
+from django.urls import path, re_path
 from django.views.generic import TemplateView
+from django.views.static import serve
+from django.contrib.staticfiles import finders

 urlpatterns = [
-    url(r"^api/", include("django_etebase.urls")),
     url(r"^admin/", admin.site.urls),
     path("", TemplateView.as_view(template_name="success.html")),
 ]

 if settings.DEBUG:
-    urlpatterns += [
-        url(r"^api-auth/", include("rest_framework.urls", namespace="rest_framework")),
-    ]
+
+    def serve_static(request, path):
+        filename = finders.find(path)
+        dirname = os.path.dirname(filename)
+        basename = os.path.basename(filename)
+
+        return serve(request, basename, dirname)
+
+    urlpatterns += [re_path(r"^static/(?P<path>.*)$", serve_static)]
@@ -1,16 +0,0 @@
-"""
-WSGI config for etebase_server project.
-
-It exposes the WSGI callable as a module-level variable named ``application``.
-
-For more information on this file, see
-https://docs.djangoproject.com/en/3.0/howto/deployment/wsgi/
-"""
-
-import os
-
-from django.core.wsgi import get_wsgi_application
-
-os.environ.setdefault("DJANGO_SETTINGS_MODULE", "etebase_server.settings")
-
-application = get_wsgi_application()
@@ -1,8 +1,8 @@
 from django import forms
-from django.contrib.auth import get_user_model
 from django.contrib.auth.forms import UsernameField
+from myauth.models import get_typed_user_model

-User = get_user_model()
+User = get_typed_user_model()


 class AdminUserCreationForm(forms.ModelForm):
37  myauth/migrations/0003_auto_20201119_0810.py  Normal file
@@ -0,0 +1,37 @@
# Generated by Django 3.1.1 on 2020-11-19 08:10

from django.db import migrations, models
import myauth.models


class Migration(migrations.Migration):

    dependencies = [
        ("myauth", "0002_auto_20200515_0801"),
    ]

    operations = [
        migrations.AlterModelManagers(
            name="user",
            managers=[
                ("objects", myauth.models.UserManager()),
            ],
        ),
        migrations.AlterField(
            model_name="user",
            name="first_name",
            field=models.CharField(blank=True, max_length=150, verbose_name="first name"),
        ),
        migrations.AlterField(
            model_name="user",
            name="username",
            field=models.CharField(
                error_messages={"unique": "A user with that username already exists."},
                help_text="Required. 150 characters or fewer. Letters, digits and ./-/_ only.",
                max_length=150,
                unique=True,
                validators=[myauth.models.UnicodeUsernameValidator()],
                verbose_name="username",
            ),
        ),
    ]
@@ -1,3 +1,5 @@
+import typing as t
+
 from django.contrib.auth.models import AbstractUser, UserManager as DjangoUserManager
 from django.core import validators
 from django.db import models
@@ -13,14 +15,15 @@ class UnicodeUsernameValidator(validators.RegexValidator):


 class UserManager(DjangoUserManager):
-    def get_by_natural_key(self, username):
+    def get_by_natural_key(self, username: str):
         return self.get(**{self.model.USERNAME_FIELD + "__iexact": username})


 class User(AbstractUser):
+    id: int
     username_validator = UnicodeUsernameValidator()

-    objects = UserManager()
+    objects: UserManager = UserManager()

     username = models.CharField(
         _("username"),
@@ -28,9 +31,21 @@ class User(AbstractUser):
         unique=True,
         help_text=_("Required. 150 characters or fewer. Letters, digits and ./-/_ only."),
         validators=[username_validator],
-        error_messages={"unique": _("A user with that username already exists."),},
+        error_messages={
+            "unique": _("A user with that username already exists."),
+        },
     )

     @classmethod
-    def normalize_username(cls, username):
+    def normalize_username(cls, username: str):
         return super().normalize_username(username).lower()
+
+
+UserType = User
+
+
+def get_typed_user_model() -> UserType:
+    from django.contrib.auth import get_user_model
+
+    ret: t.Any = get_user_model()
+    return ret
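`User.normalize_username` lowercases on top of the NFKC normalization that Django's `AbstractUser.normalize_username` already applies, which keeps the unique constraint and the `__iexact` lookup in `get_by_natural_key` consistent. A sketch of the combined behaviour (the standalone function here is illustrative, not the model method):

```python
import unicodedata


def normalize_username(username: str) -> str:
    # NFKC normalization (what AbstractUser.normalize_username does)
    # plus the lowercasing the User model above adds on top.
    return unicodedata.normalize("NFKC", username).lower()


print(normalize_username("Alice"))  # alice
# Compatibility characters fold too: the "fi" ligature becomes plain "fi".
assert normalize_username("\ufb01le") == "file"
```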
5  mypy.ini  Normal file
@@ -0,0 +1,5 @@
[mypy]
plugins = mypy_django_plugin.main

[mypy.plugins.django-stubs]
django_settings_module = "etebase_server.settings"
28  requirements-dev.txt  Normal file
@@ -0,0 +1,28 @@
#
# This file is autogenerated by pip-compile
# To update, run:
#
#    pip-compile --output-file=requirements-dev.txt requirements.in/development.txt
#
appdirs==1.4.4              # via black
asgiref==3.3.1              # via django
black==20.8b1               # via -r requirements.in/development.txt
click==7.1.2                # via black, pip-tools
coverage==5.3.1             # via -r requirements.in/development.txt
django-stubs==1.7.0         # via -r requirements.in/development.txt
django==3.1.4               # via django-stubs
mypy-extensions==0.4.3      # via black, mypy
mypy==0.790                 # via django-stubs
pathspec==0.8.1             # via black
pip-tools==5.4.0            # via -r requirements.in/development.txt
pytz==2020.5                # via django
pywatchman==1.4.1           # via -r requirements.in/development.txt
regex==2020.11.13           # via black
six==1.15.0                 # via pip-tools
sqlparse==0.4.1             # via django
toml==0.10.2                # via black
typed-ast==1.4.1            # via black, mypy
typing-extensions==3.7.4.3  # via black, django-stubs, mypy

# The following packages are considered to be unsafe in a requirements file:
# pip
@@ -1,7 +1,8 @@
 django
-django-cors-headers
-djangorestframework
-drf-nested-routers
 msgpack
-psycopg2-binary
 pynacl
+fastapi
+typing_extensions
+uvicorn[standard]
+aiofiles
+aioredis
@@ -1,4 +1,5 @@
 coverage
 pip-tools
 pywatchman
 black
+django-stubs
@@ -4,16 +4,29 @@
 #
 # pip-compile --output-file=requirements.txt requirements.in/base.txt
 #
-asgiref==3.2.10             # via django
-cffi==1.14.0                # via pynacl
-django-cors-headers==3.2.1  # via -r requirements.in/base.txt
-django==3.1.1               # via -r requirements.in/base.txt, django-cors-headers, djangorestframework, drf-nested-routers
-djangorestframework==3.11.0 # via -r requirements.in/base.txt, drf-nested-routers
-drf-nested-routers==0.91    # via -r requirements.in/base.txt
-msgpack==1.0.0              # via -r requirements.in/base.txt
-psycopg2-binary==2.8.4      # via -r requirements.in/base.txt
+aiofiles==0.6.0             # via -r requirements.in/base.txt
+aioredis==1.3.1             # via -r requirements.in/base.txt
+asgiref==3.3.1              # via django
+async-timeout==3.0.1        # via aioredis
+cffi==1.14.4                # via pynacl
+click==7.1.2                # via uvicorn
+django==3.1.4               # via -r requirements.in/base.txt
+fastapi==0.63.0             # via -r requirements.in/base.txt
+h11==0.11.0                 # via uvicorn
+hiredis==1.1.0              # via aioredis
+httptools==0.1.1            # via uvicorn
+msgpack==1.0.2              # via -r requirements.in/base.txt
 pycparser==2.20             # via cffi
-pynacl==1.3.0               # via -r requirements.in/base.txt
-pytz==2019.3                # via django
-six==1.14.0                 # via pynacl
-sqlparse==0.3.0             # via django
+pydantic==1.7.3             # via fastapi
+pynacl==1.4.0               # via -r requirements.in/base.txt
+python-dotenv==0.15.0       # via uvicorn
+pytz==2020.4                # via django
+pyyaml==5.3.1               # via uvicorn
+six==1.15.0                 # via pynacl
+sqlparse==0.4.1             # via django
+starlette==0.13.6           # via fastapi
+typing-extensions==3.7.4.3  # via -r requirements.in/base.txt
+uvicorn[standard]==0.13.2   # via -r requirements.in/base.txt
+uvloop==0.14.0              # via uvicorn
+watchgod==0.6               # via uvicorn
+websockets==8.1             # via uvicorn