Merged revisions 401-446 via svnmerge from
svn+ssh://torsion.org/home/luminotes/repos/luminotes/branches/postgres

................
r402 | witten | 2007-10-04 00:48:49 -0700 (Thu, 04 Oct 2007) | 3 lines
Initialized merge tracking via "svnmerge" with revisions "1-401" from
svn+ssh://torsion.org/home/luminotes/repos/luminotes/trunk
................
r404 | witten | 2007-10-04 01:17:07 -0700 (Thu, 04 Oct 2007) | 2 lines
Beginning a conversion from bsddb to postgres.
................
r405 | witten | 2007-10-04 01:18:58 -0700 (Thu, 04 Oct 2007) | 9 lines
Merged revisions 402-404 via svnmerge from
svn+ssh://torsion.org/home/luminotes/repos/luminotes/trunk
........
r403 | witten | 2007-10-04 01:14:45 -0700 (Thu, 04 Oct 2007) | 2 lines
Yay, no more stupid deprecation warnings from simplejson about the sre module.
........
................
r406 | witten | 2007-10-04 15:34:39 -0700 (Thu, 04 Oct 2007) | 4 lines
* Switched back to Python 2.4 because many Python modules in Debian are not packaged to work with Python 2.5
* Began removal of all references to Scheduler, @async, yield, and so on.
* Converted Database.py to support PostgreSQL and updated its unit tests accordingly.
................
r407 | witten | 2007-10-04 16:34:01 -0700 (Thu, 04 Oct 2007) | 2 lines
All unit tests for the new model classes now pass.
................
r409 | witten | 2007-10-05 00:53:56 -0700 (Fri, 05 Oct 2007) | 2 lines
Reordering some columns and adding some indices.
................
r410 | witten | 2007-10-05 16:08:37 -0700 (Fri, 05 Oct 2007) | 4 lines
Now adding trash notebooks to user_notebook table.
Also switching db conversion/verification tools back to require Python 2.5, since
they still use the old Scheduler, which requires 2.5 generator features.
................
r411 | witten | 2007-10-06 16:26:56 -0700 (Sat, 06 Oct 2007) | 2 lines
Lots more unit tests passing. Most of the recent work was on controller.Users and related stuff.
................
r412 | witten | 2007-10-07 01:52:12 -0700 (Sun, 07 Oct 2007) | 2 lines
controller.Users unit tests now finally pass!
................
r413 | witten | 2007-10-07 02:14:10 -0700 (Sun, 07 Oct 2007) | 3 lines
Got controller.Root unit tests passing. Moved fake sql_* function shenanigans from
Test_users.py to Test_controller.py, for use by other controller unit tests.
................
r414 | witten | 2007-10-08 23:11:11 -0700 (Mon, 08 Oct 2007) | 2 lines
All unit tests pass! Fuck yeah!
................
r415 | witten | 2007-10-08 23:13:07 -0700 (Mon, 08 Oct 2007) | 2 lines
Removing all references to Scheduler from luminotes.py
................
r416 | witten | 2007-10-08 23:54:51 -0700 (Mon, 08 Oct 2007) | 3 lines
Converted deleted_from to deleted_from_id in a few more places.
Fixed bug in Users.contents().
................
r417 | witten | 2007-10-09 00:11:59 -0700 (Tue, 09 Oct 2007) | 3 lines
Typo fix in Note sql method.
Adding autocommit flag to Database.next_id() method.
................
r418 | witten | 2007-10-09 00:13:19 -0700 (Tue, 09 Oct 2007) | 2 lines
Updating unit test for new auto commit flag.
................
r419 | witten | 2007-10-09 00:14:09 -0700 (Tue, 09 Oct 2007) | 2 lines
Removing debugging print.
................
r420 | witten | 2007-10-09 00:20:55 -0700 (Tue, 09 Oct 2007) | 2 lines
More sql fixes. I really need some functional tests that hit the database and exercise the SQL.
................
r421 | witten | 2007-10-09 00:51:34 -0700 (Tue, 09 Oct 2007) | 3 lines
Fixed controller.Database handling of tuple as an Object_type. Made SQL for user
storage calculation better at handling null values and also more succinct.
................
r422 | witten | 2007-10-09 13:32:16 -0700 (Tue, 09 Oct 2007) | 2 lines
Converting Wiki.js to trash_id notebook member instead of trash object.
................
r423 | witten | 2007-10-09 13:42:10 -0700 (Tue, 09 Oct 2007) | 2 lines
No longer displaying "download as html" on the front page, as people see "download" and think they're downloading the software.
................
r424 | witten | 2007-10-09 14:24:40 -0700 (Tue, 09 Oct 2007) | 2 lines
Notebooks.contents() now returns notebooks with correct read-write status.
................
r425 | witten | 2007-10-09 14:32:25 -0700 (Tue, 09 Oct 2007) | 2 lines
Fixed reporting of validation errors to the user. Now says "The blah is missing." instead of just "is missing"
................
r426 | witten | 2007-10-09 17:05:22 -0700 (Tue, 09 Oct 2007) | 2 lines
No longer redirecting to trash notebook upon login.
................
r427 | witten | 2007-10-09 17:20:33 -0700 (Tue, 09 Oct 2007) | 2 lines
Made controller.Database use a connection pool.
................
r429 | witten | 2007-10-09 20:13:30 -0700 (Tue, 09 Oct 2007) | 2 lines
Converted initdb.py and updatedb.py to Postgres from bsddb.
................
r430 | witten | 2007-10-09 20:37:14 -0700 (Tue, 09 Oct 2007) | 2 lines
Changing error message to remove underscores from variable names.
................
r431 | witten | 2007-10-10 13:23:30 -0700 (Wed, 10 Oct 2007) | 2 lines
Removing unused note_title parameter from Wiki.create_editor().
................
r432 | witten | 2007-10-10 13:25:16 -0700 (Wed, 10 Oct 2007) | 2 lines
Revision regular expression now supports timezone notation.
................
r433 | witten | 2007-10-10 14:43:47 -0700 (Wed, 10 Oct 2007) | 2 lines
Finished implementing ranked ordering for startup notes. (However, there's no way to change the rank from the client yet.)
................
r434 | witten | 2007-10-10 16:25:19 -0700 (Wed, 10 Oct 2007) | 4 lines
More strict access checking. Fixed oversight in Postgres DB conversion where, in
certain controller.Notebook methods, access was only checked at the notebook level,
not at the note level as well.
................
r435 | witten | 2007-10-10 17:45:18 -0700 (Wed, 10 Oct 2007) | 3 lines
Now loading revisions on demand from client when the "changes" button is clicked.
Also caching loaded revisions so subsequent clicks don't have to reload.
................
r436 | witten | 2007-10-10 21:31:20 -0700 (Wed, 10 Oct 2007) | 2 lines
Tweaking some of the error handling in Expose and Root so that unhandled errors give a generic error message to the client.
................
r437 | witten | 2007-10-10 21:33:49 -0700 (Wed, 10 Oct 2007) | 2 lines
The release script no longer runs initdb.py, because the default database is no longer a single file included in the tarball.
................
r438 | witten | 2007-10-10 21:40:11 -0700 (Wed, 10 Oct 2007) | 2 lines
Updated install instructions to include use of initdb.py.
................
r439 | witten | 2007-10-10 21:56:42 -0700 (Wed, 10 Oct 2007) | 3 lines
Made initdb.py only nuke (drop tables/views) when given a command-line flag.
Also made install directions more correct.
................
r440 | witten | 2007-10-10 21:58:48 -0700 (Wed, 10 Oct 2007) | 2 lines
IE 6 doesn't like commas.
................
r441 | witten | 2007-10-10 22:08:50 -0700 (Wed, 10 Oct 2007) | 4 lines
load your notebook. without clicking on "changes", edit a note that has previous
revisions. click on "changes". it'll only show the most recent revision. fixed by
not appending to changes as a result of a save unless the client-side revisions
list cache has something in it
................
r442 | witten | 2007-10-10 23:30:41 -0700 (Wed, 10 Oct 2007) | 2 lines
Forgot to actually save off the new revision as editor.revision.
................
r443 | witten | 2007-10-11 01:35:54 -0700 (Thu, 11 Oct 2007) | 13 lines
More intelligent datetime handling:
* convertdb.py assumes old bsddb database timestamps are Pacific, and then
  converts them to UTC before inserting them into the new PostgreSQL database.
* No longer using naked timezoneless datetime objects in model/controller code,
  except in unit tests that need compatibility with pysqlite. Now using UTC everywhere.
* Asking PostgreSQL to give us all timestamps back in UTC.
* New dependency on python-tz (pytz) package, noted in INSTALL doc.
* Client now responsible for converting UTC timestamps to local time for display.
................
r444 | witten | 2007-10-11 01:46:09 -0700 (Thu, 11 Oct 2007) | 2 lines
Tweak to prevent potential race in IE.
................
r445 | witten | 2007-10-11 01:49:58 -0700 (Thu, 11 Oct 2007) | 2 lines
Got JavaScript "unit" tests passing again.
................
r446 | witten | 2007-10-11 01:53:58 -0700 (Thu, 11 Oct 2007) | 2 lines
Noting that js tests require the Luminotes server on localhost.
................
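The r443 datetime policy (assume old bsddb timestamps are Pacific, store UTC, let the client convert back for display) can be sketched as follows. This is a minimal illustration using only the standard library's fixed-offset `timezone`; the real convertdb.py uses pytz, which also handles DST transitions correctly, and the literal timestamp and the -7 hour PDT offset here are illustrative assumptions, not values from the repository.

```python
from datetime import datetime, timezone, timedelta

# Pacific Daylight Time as a fixed offset (illustration only; pytz resolves DST for real)
PDT = timezone( timedelta( hours = -7 ) )

# a naive, timezoneless timestamp as stored by the old bsddb database (hypothetical value)
naive = datetime( 2007, 10, 11, 1, 35, 54 )

# convertdb.py's policy: assume the naive timestamp is Pacific time...
aware = naive.replace( tzinfo = PDT )

# ...then convert to UTC before inserting into PostgreSQL
as_utc = aware.astimezone( timezone.utc )

print( as_utc.isoformat() )
```

The same one-way rule applies throughout: model and controller code only ever sees UTC, and conversion to local time happens client-side at display time.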
parent 03bffe4676
commit 43c6f54e9f

INSTALL | 61
INSTALL:

@@ -4,19 +4,17 @@ you shouldn't need if you only want to make a wiki.
 
 First, install the prerequisites:
 
-* Python 2.5
+* Python 2.4
 * CherryPy 2.2
 * PostgreSQL 8.1
 * psycopg 2.0
 * simplejson 1.3
+* pytz 2006p
 
 In Debian GNU/Linux, you can issue the following command to install these
 packages:
 
-apt-get install python2.5 python-cherrypy postgresql-8.1 python-psycopg2 python-simplejson
+apt-get install python2.4 python-cherrypy postgresql-8.1 python-psycopg2 python-simplejson python-tz
 
-If you're using Debian Etch, see the note below about "psycopg in Debian
-Etch".
-
 
 development mode
@@ -40,14 +38,15 @@ and set the password to "dev".
 
 createuser -S -d -R -P -E luminotes
 
-Initialize the database with the starting schema and basic data:
+Initialize the database with the starting schema and default data:
 
-psql -U luminotes postgres -f model/schema.sql
-psql -U luminotes postgres -f model/data.sql
+createdb -W -U luminotes luminotes
+export PYTHONPATH=.
+python2.4 tools/initdb.py
 
 To start the server in development mode, run:
 
-python2.5 luminotes.py -d
+python2.4 luminotes.py -d
 
 Connect to the following URL in a web browser running on the same machine:
 
@@ -114,18 +113,20 @@ Restart postgresql so these changes take effect:
 /etc/init.d/postgresql restart
 
 As the PostgreSQL superuser (usually "postgres"), create a new database user
-and set the password to "dev".
+and set a new password, for instance, "mypassword".
 
 createuser -S -d -R -P -E luminotes
 
-Initialize the database with the starting schema and basic data:
+Initialize the database with the starting schema and default data:
 
-psql -U luminotes postgres -f model/schema.sql
-psql -U luminotes postgres -f model/data.sql
+createdb -W -U luminotes luminotes
+export PYTHONPATH=.
+export PGPASSWORD=mypassword
+python2.4 tools/initdb.py
 
 Then to actually start the production mode server, run:
 
-python2.5 luminotes.py
+python2.4 luminotes.py
 
 You should be able to connect to the site at whatever domain you've configured
 Apache to serve.
@@ -137,44 +138,28 @@ Python unit tests
 If you're interested in running unit tests of the server, install:
 
 * nose 0.9.0
+* pysqlite 2.3
 
 In Debian GNU/Linux, you can issue the following command to install this
 package:
 
-apt-get install python-nose
+apt-get install python-nose python-pysqlite2
 
 Then you can run unit tests by running:
 
 nosetests
 
 
-JavaScript unit tests
----------------------
+JavaScript "unit" tests
+-----------------------
 
 JsUnit is included with Luminotes, so to kick off tests of the client-side
 JavaScript code, simply run:
 
-python2.5 static/js/test/run_tests.py
+python2.4 static/js/test/run_tests.py
 
-The run_tests.py script runs the tests inside browser windows and presumes you
-have both Firefox and Internet Explorer 6 installed. Edit run_tests.py if you
+The run_tests.py script runs the tests inside browser windows and presumes
+that you have both Firefox and Internet Explorer 6 installed, and also that
+the Luminotes server is running on the local machine. Edit run_tests.py if you
 need to specify different paths to the browser binaries or want to test with
 additional browsers.
-
-
-psycopg in Debian Etch
-----------------------
-
-As of this writing, Debian Etch does not contain a version of psycopg with
-support for Python 2.5. However, the version of psycopg in Debian testing does
-support Python 2.5. So you can grab the source for python-psycopg2 from Debian
-testing, install the build dependencies (including python2.5-dev), and build
-the package yourself on an Etch machine.
-
-Then, edit /usr/share/python/debian_defaults and move "python2.5" from
-"unsupported-versions" to "supported-versions". Finally, install the
-python-psycopg2 package you've just built, and it should fully support Python
-2.5.
-
-See Debian bug #404355 for more information. Note that it was fixed in
-unstable, but not in Etch.
Database.py:

@@ -1,219 +1,158 @@
 import re
-import bsddb
+import os
+import psycopg2 as psycopg
+from psycopg2.pool import PersistentConnectionPool
 import random
-import cPickle
-from cStringIO import StringIO
-from copy import copy
-from model.Persistent import Persistent
-from Async import async
 
 
 class Database( object ):
   ID_BITS = 128 # number of bits within an id
   ID_DIGITS = "0123456789abcdefghijklmnopqrstuvwxyz"
 
-  def __init__( self, scheduler, database_path = None ):
+  def __init__( self, connection = None ):
     """
     Create a new database and return it.
 
-    @type scheduler: Scheduler
-    @param scheduler: scheduler to use
-    @type database_path: unicode
-    @param database_path: path to the database file
+    @type connection: existing connection object with cursor()/close()/commit() methods, or NoneType
+    @param connection: database connection to use (optional, defaults to making a connection pool)
     @rtype: Database
-    @return: database at the given path
+    @return: newly constructed Database
     """
-    self.__scheduler = scheduler
-    self.__env = bsddb.db.DBEnv()
-    self.__env.open( None, bsddb.db.DB_CREATE | bsddb.db.DB_PRIVATE | bsddb.db.DB_INIT_MPOOL )
-    self.__db = bsddb.db.DB( self.__env )
-    self.__db.open( database_path, "database", bsddb.db.DB_HASH, bsddb.db.DB_CREATE )
-    self.__cache = {}
+    # This tells PostgreSQL to give us timestamps in UTC. I'd use "set timezone" instead, but that
+    # makes SQLite angry.
+    os.putenv( "PGTZ", "UTC" )
 
-  def __persistent_id( self, obj, skip = None ):
-    # save the object and return its persistent id
-    if obj != skip and isinstance( obj, Persistent ):
-      self.__save( obj )
-      return obj.object_id
-
-    # returning None indicates that the object should be pickled normally without using a persistent id
-    return None
+    if connection:
+      self.__connection = connection
+      self.__pool = None
+    else:
+      self.__connection = None
+      self.__pool = PersistentConnectionPool(
+        1,  # minimum connections
+        50, # maximum connections
+        "dbname=luminotes user=luminotes password=%s" % os.getenv( "PGPASSWORD", "dev" ),
+      )
 
-  @async
-  def save( self, obj, callback = None ):
+  def __get_connection( self ):
+    if self.__connection:
+      return self.__connection
+    else:
+      return self.__pool.getconn()
+
+  def save( self, obj, commit = True ):
     """
-    Save the given object to the database, including any objects that it references.
+    Save the given object to the database.
 
     @type obj: Persistent
     @param obj: object to save
-    @type callback: generator or NoneType
-    @param callback: generator to wakeup when the save is complete (optional)
+    @type commit: bool
+    @param commit: True to automatically commit after the save
     """
-    self.__save( obj )
-    yield callback
+    connection = self.__get_connection()
+    cursor = connection.cursor()
 
-  def __save( self, obj ):
-    # if this object's current revision is already saved, bail
-    revision_id = obj.revision_id()
-    if revision_id in self.__cache:
-      return
-
-    object_id = unicode( obj.object_id ).encode( "utf8" )
-    revision_id = unicode( obj.revision_id() ).encode( "utf8" )
-    secondary_id = obj.secondary_id and unicode( obj.full_secondary_id() ).encode( "utf8" ) or None
-
-    # update the cache with this saved object
-    self.__cache[ object_id ] = obj
-    self.__cache[ revision_id ] = copy( obj )
-    if secondary_id:
-      self.__cache[ secondary_id ] = obj
-
-    # set the pickler up to save persistent ids for every object except for the obj passed in, which
-    # will be pickled normally
-    buffer = StringIO()
-    pickler = cPickle.Pickler( buffer, protocol = -1 )
-    pickler.persistent_id = lambda o: self.__persistent_id( o, skip = obj )
-
-    # pickle the object and write it to the database under both its id key and its revision id key
-    pickler.dump( obj )
-    pickled = buffer.getvalue()
-    self.__db.put( object_id, pickled )
-    self.__db.put( revision_id, pickled )
-
-    # write the pickled object id (only) to the database under its secondary id
-    if secondary_id:
-      buffer = StringIO()
-      pickler = cPickle.Pickler( buffer, protocol = -1 )
-      pickler.persistent_id = lambda o: self.__persistent_id( o )
-      pickler.dump( obj )
-      self.__db.put( secondary_id, buffer.getvalue() )
-
-    self.__db.sync()
+    cursor.execute( obj.sql_exists() )
+    if cursor.fetchone():
+      cursor.execute( obj.sql_update() )
+    else:
+      cursor.execute( obj.sql_create() )
 
-  @async
-  def load( self, object_id, callback, revision = None ):
-    """
-    Load the object corresponding to the given object id from the database, and yield the provided
-    callback generator with the loaded object as its argument, or None if the object_id is unknown.
-    If a revision is provided, a specific revision of the object will be loaded.
-
-    @type object_id: unicode
-    @param object_id: id of the object to load
-    @type callback: generator
-    @param callback: generator to send the loaded object to
-    @type revision: int or NoneType
-    @param revision: revision of the object to load (optional)
-    """
-    obj = self.__load( object_id, revision )
-    yield callback, obj
-
-  def __load( self, object_id, revision = None ):
-    if revision is not None:
-      object_id = Persistent.make_revision_id( object_id, revision )
-
-    object_id = unicode( object_id ).encode( "utf8" )
-
-    # if the object corresponding to the given id has already been loaded, simply return it without
-    # loading it again
-    obj = self.__cache.get( object_id )
-    if obj is not None:
-      return obj
-
-    # grab the object for the given id from the database
-    buffer = StringIO()
-    unpickler = cPickle.Unpickler( buffer )
-    unpickler.persistent_load = self.__load
-
-    pickled = self.__db.get( object_id )
-    if pickled is None or pickled == "":
-      return None
-
-    buffer.write( pickled )
-    buffer.flush()
-    buffer.seek( 0 )
-
-    # unpickle the object and update the cache with this saved object
-    obj = unpickler.load()
-    if obj is None:
-      print "error unpickling %s: %s" % ( object_id, pickled )
-      return None
-    self.__cache[ unicode( obj.object_id ).encode( "utf8" ) ] = obj
-    self.__cache[ unicode( obj.revision_id() ).encode( "utf8" ) ] = copy( obj )
-
-    return obj
+    if commit:
+      connection.commit()
 
-  @async
-  def reload( self, object_id, callback = None ):
-    """
-    Load and immediately save the object corresponding to the given object id or database key. This
-    is useful when the object has a __setstate__() method that performs some sort of schema
-    evolution operation.
-
-    @type object_id: unicode
-    @param object_id: id or key of the object to reload
-    @type callback: generator or NoneType
-    @param callback: generator to wakeup when the save is complete (optional)
-    """
-    self.__reload( object_id )
-    yield callback
-
-  def __reload( self, object_id, revision = None ):
-    object_id = unicode( object_id ).encode( "utf8" )
-
-    # grab the object for the given id from the database
-    buffer = StringIO()
-    unpickler = cPickle.Unpickler( buffer )
-    unpickler.persistent_load = self.__load
-
-    pickled = self.__db.get( object_id )
-    if pickled is None or pickled == "":
-      return
-
-    buffer.write( pickled )
-    buffer.flush()
-    buffer.seek( 0 )
-
-    # unpickle the object. this should trigger __setstate__() if the object has such a method
-    obj = unpickler.load()
-    if obj is None:
-      print "error unpickling %s: %s" % ( object_id, pickled )
-      return
-    self.__cache[ object_id ] = obj
-
-    # set the pickler up to save persistent ids for every object except for the obj passed in, which
-    # will be pickled normally
-    buffer = StringIO()
-    pickler = cPickle.Pickler( buffer, protocol = -1 )
-    pickler.persistent_id = lambda o: self.__persistent_id( o, skip = obj )
-
-    # pickle the object and write it to the database under its id key
-    pickler.dump( obj )
-    pickled = buffer.getvalue()
-    self.__db.put( object_id, pickled )
-
-    self.__db.sync()
+  def commit( self ):
+    self.__get_connection().commit()
 
-  def size( self, object_id, revision = None ):
+  def load( self, Object_type, object_id, revision = None ):
     """
-    Load the object corresponding to the given object id from the database, and return the size of
-    its pickled data in bytes. If a revision is provided, a specific revision of the object will be
+    Load the object corresponding to the given object id from the database and return it, or None if
+    the object_id is unknown. If a revision is provided, a specific revision of the object will be
     loaded.
 
+    @type Object_type: type
+    @param Object_type: class of the object to load
     @type object_id: unicode
-    @param object_id: id of the object whose size should be returned
+    @param object_id: id of the object to load
     @type revision: int or NoneType
     @param revision: revision of the object to load (optional)
+    @rtype: Object_type or NoneType
+    @return: loaded object, or None if no match
     """
-    if revision is not None:
-      object_id = Persistent.make_revision_id( object_id, revision )
-
-    object_id = unicode( object_id ).encode( "utf8" )
-
-    pickled = self.__db.get( object_id )
-    if pickled is None or pickled == "":
-      return None
-
-    return len( pickled )
+    return self.select_one( Object_type, Object_type.sql_load( object_id, revision ) )
+
+  def select_one( self, Object_type, sql_command ):
+    """
+    Execute the given sql_command and return its results in the form of an object of Object_type,
+    or None if there was no match.
+
+    @type Object_type: type
+    @param Object_type: class of the object to load
+    @type sql_command: unicode
+    @param sql_command: SQL command to execute
+    @rtype: Object_type or NoneType
+    @return: loaded object, or None if no match
+    """
+    connection = self.__get_connection()
+    cursor = connection.cursor()
+
+    cursor.execute( sql_command )
+
+    row = cursor.fetchone()
+    if not row:
+      return None
+
+    if Object_type in ( tuple, list ):
+      return Object_type( row )
+    else:
+      return Object_type( *row )
+
+  def select_many( self, Object_type, sql_command ):
+    """
+    Execute the given sql_command and return its results in the form of a list of objects of
+    Object_type.
+
+    @type Object_type: type
+    @param Object_type: class of the object to load
+    @type sql_command: unicode
+    @param sql_command: SQL command to execute
+    @rtype: list of Object_type
+    @return: loaded objects
+    """
+    connection = self.__get_connection()
+    cursor = connection.cursor()
+
+    cursor.execute( sql_command )
+
+    objects = []
+    row = cursor.fetchone()
+
+    while row:
+      if Object_type in ( tuple, list ):
+        obj = Object_type( row )
+      else:
+        obj = Object_type( *row )
+
+      objects.append( obj )
+      row = cursor.fetchone()
+
+    return objects
+
+  def execute( self, sql_command, commit = True ):
+    """
+    Execute the given sql_command.
+
+    @type sql_command: unicode
+    @param sql_command: SQL command to execute
+    @type commit: bool
+    @param commit: True to automatically commit after the command
+    """
+    connection = self.__get_connection()
+    cursor = connection.cursor()
+
+    cursor.execute( sql_command )
+
+    if commit:
+      connection.commit()
 
   @staticmethod
   def generate_id():
@@ -231,44 +170,45 @@ class Database( object ):
 
     return "".join( digits )
 
-  @async
-  def next_id( self, callback ):
+  def next_id( self, Object_type, commit = True ):
     """
-    Generate the next available object id, and yield the provided callback generator with the
-    object id as its argument.
+    Generate the next available object id and return it.
 
-    @type callback: generator
-    @param callback: generator to send the next available object id to
+    @type Object_type: type
+    @param Object_type: class of the object that the id is for
+    @type commit: bool
+    @param commit: True to automatically commit after storing the next id
     """
+    connection = self.__get_connection()
+    cursor = connection.cursor()
+
     # generate a random id, but on the off-chance that it collides with something else already in
     # the database, try again
     next_id = Database.generate_id()
-    while self.__db.get( next_id, default = None ) is not None:
+    cursor.execute( Object_type.sql_id_exists( next_id ) )
+
+    while cursor.fetchone() is not None:
       next_id = Database.generate_id()
+      cursor.execute( Object_type.sql_id_exists( next_id ) )
 
-    # save the next_id as a key in the database so that it's not handed out again to another client
-    self.__db[ next_id ] = ""
+    # save a new object with the next_id to the database
+    obj = Object_type( next_id )
+    cursor.execute( obj.sql_create() )
 
-    yield callback, next_id
+    if commit:
+      connection.commit()
+
+    return next_id
 
-  @async
   def close( self ):
     """
     Shutdown the database.
     """
-    self.__db.close()
-    self.__env.close()
-    yield None
+    if self.__connection:
+      self.__connection.close()
 
-  @async
-  def clear_cache( self ):
-    """
-    Clear the memory object cache.
-    """
-    self.__cache.clear()
-    yield None
-
-  scheduler = property( lambda self: self.__scheduler )
+    if self.__pool:
+      self.__pool.closeall()
 
 
 class Valid_id( object ):
@@ -289,9 +229,9 @@ class Valid_id( object ):
 
 class Valid_revision( object ):
   """
-  Validator for an object id.
+  Validator for an object revision timestamp.
   """
-  REVISION_PATTERN = re.compile( "^\d\d\d\d-\d\d-\d\d \d\d:\d\d:\d\d\.\d+$" )
+  REVISION_PATTERN = re.compile( "^\d\d\d\d-\d\d-\d\d \d\d:\d\d:\d\d\.\d+[+-]\d\d(:)?\d\d$" )
 
   def __init__( self, none_okay = False ):
     self.__none_okay = none_okay
@@ -1,18 +1,10 @@
 import cherrypy

-from Validate import Validation_error
-
 # module-level variable that, when set to a view, overrides the view for all exposed methods. used
 # by unit tests
 view_override = None


-class Expose_error( Exception ):
-  def __init__( self, message ):
-    Exception.__init__( self, message )
-    self.__message = message
-
-
 def expose( view = None, rss = None ):
   """
   expose() can be used to tag a method as available for publishing to the web via CherryPy. In
@@ -57,8 +49,16 @@ def expose( view = None, rss = None ):
       # try executing the exposed function
       try:
         result = function( *args, **kwargs )
-      except Validation_error, error:
-        result = dict( name = error.name, value = error.value, error = error.message )
+      except cherrypy.NotFound:
+        raise
+      except Exception, error:
+        if hasattr( error, "to_dict" ):
+          result = error.to_dict()
+        else:
+          # TODO: it'd be nice to send an email to myself with the traceback
+          import traceback
+          traceback.print_exc()
+          result = dict( error = u"An error occurred when processing your request. Please try again or contact support." )

       redirect = result.get( u"redirect", None )

@@ -74,7 +74,7 @@ def expose( view = None, rss = None ):
         return unicode( view_override( **result ) )
       except:
         if redirect is None:
-          raise Expose_error( result.get( u"error" ) or result )
+          raise

       # if that doesn't work, and there's a redirect, then redirect
       del( result[ u"redirect" ] )
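The new error handling in expose() replaces the special-cased `Validation_error` with a general rule: any exception that can render itself via `to_dict()` becomes the JSON result, and everything else collapses to a generic message. A toy sketch of that dispatch, outside the real Expose.py (`Polite_error` and `result_for` are hypothetical stand-ins for exception classes like `Access_error` and for the `except` clause's body):

```python
# Hypothetical stand-in for an exception that knows how to render itself.
class Polite_error( Exception ):
  def __init__( self, message ):
    Exception.__init__( self, message )
    self.__message = message

  def to_dict( self ):
    return dict( error = self.__message )

def result_for( error ):
  # an exception that can describe itself as a dict does so
  if hasattr( error, "to_dict" ):
    return error.to_dict()

  # anything else becomes a generic, user-safe error message
  return dict( error = u"An error occurred when processing your request. Please try again or contact support." )

print( result_for( Polite_error( u"Sorry, you don't have access to do that." ) ) )
```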
@@ -1,15 +1,13 @@
 import cherrypy
-from Scheduler import Scheduler
+from datetime import datetime
 from Expose import expose
 from Validate import validate, Valid_string, Validation_error, Valid_bool
 from Database import Valid_id, Valid_revision
 from Users import grab_user_id
-from Updater import wait_for_update, update_client
 from Expire import strongly_expire
 from Html_nuker import Html_nuker
-from Async import async
-from model.Notebook import Notebook
-from model.Note import Note
+from new_model.Notebook import Notebook
+from new_model.Note import Note
 from view.Main_page import Main_page
 from view.Json import Json
 from view.Html_file import Html_file
@@ -18,7 +16,7 @@ from view.Html_file import Html_file
 class Access_error( Exception ):
   def __init__( self, message = None ):
     if message is None:
-      message = u"You don't have access to this notebook."
+      message = u"Sorry, you don't have access to do that."

     Exception.__init__( self, message )
     self.__message = message
@@ -33,12 +31,10 @@ class Notebooks( object ):
   """
   Controller for dealing with notebooks and their notes, corresponding to the "/notebooks" URL.
   """
-  def __init__( self, scheduler, database, users ):
+  def __init__( self, database, users ):
     """
     Create a new Notebooks object.

-    @type scheduler: controller.Scheduler
-    @param scheduler: scheduler to use for asynchronous calls
     @type database: controller.Database
     @param database: database that notebooks are stored in
     @type users: controller.Users
@@ -46,7 +42,6 @@ class Notebooks( object ):
     @rtype: Notebooks
     @return: newly constructed Notebooks
     """
-    self.__scheduler = scheduler
     self.__database = database
     self.__users = users

@@ -83,14 +78,11 @@ class Notebooks( object ):

   @expose( view = Json )
   @strongly_expire
-  @wait_for_update
   @grab_user_id
-  @async
-  @update_client
   @validate(
     notebook_id = Valid_id(),
     note_id = Valid_id( none_okay = True ),
-    revision = Valid_string( min = 0, max = 30 ),
+    revision = Valid_revision( none_okay = True ),
     user_id = Valid_id( none_okay = True ),
   )
   def contents( self, notebook_id, note_id = None, revision = None, user_id = None ):
@@ -108,39 +100,37 @@ class Notebooks( object ):
     @param user_id: id of current logged-in user (if any), determined by @grab_user_id
     @rtype: json dict
     @return: { 'notebook': notebookdict, 'note': notedict or None }
-    @raise Access_error: the current user doesn't have access to the given notebook
+    @raise Access_error: the current user doesn't have access to the given notebook or note
     @raise Validation_error: one of the arguments is invalid
     """
-    self.check_access( notebook_id, user_id, self.__scheduler.thread )
-    if not ( yield Scheduler.SLEEP ):
+    if not self.__users.check_access( user_id, notebook_id ):
       raise Access_error()

-    self.__database.load( notebook_id, self.__scheduler.thread )
-    notebook = ( yield Scheduler.SLEEP )
+    notebook = self.__database.load( Notebook, notebook_id )
+
+    if not self.__users.check_access( user_id, notebook_id, read_write = True ):
+      notebook.read_write = False

     if notebook is None:
       note = None
     elif note_id == u"blank":
-      note = Note( note_id )
+      note = Note.create( note_id )
     else:
-      note = notebook.lookup_note( note_id )
-
-    if revision:
-      self.__database.load( note_id, self.__scheduler.thread, revision )
-      note = ( yield Scheduler.SLEEP )
+      note = self.__database.load( Note, note_id, revision )
+      if note and note.notebook_id != notebook_id:
+        raise Access_error()
+
+    startup_notes = self.__database.select_many( Note, notebook.sql_load_startup_notes() )

-    yield dict(
+    return dict(
       notebook = notebook,
-      startup_notes = notebook.startup_notes,
+      startup_notes = startup_notes,
       note = note,
     )
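The converted contents() above performs the access check in two tiers: plain `check_access()` gates entry to the method, and a second call with `read_write = True` only downgrades the notebook to read-only rather than refusing. A toy sketch of that pattern, outside the real controllers (`Fake_users` and `notebook_access` are hypothetical stand-ins for controller.Users and the controller body):

```python
class Access_error( Exception ):
  pass

# Hypothetical stand-in for controller.Users, backed by simple permission sets.
class Fake_users( object ):
  def __init__( self, readers, writers ):
    self.__readers = readers
    self.__writers = writers

  def check_access( self, user_id, notebook_id, read_write = False ):
    allowed = self.__writers if read_write else self.__readers
    return ( user_id, notebook_id ) in allowed

def notebook_access( users, user_id, notebook_id ):
  # no read access at all: refuse outright
  if not users.check_access( user_id, notebook_id ):
    raise Access_error()

  # read but not write: the caller sets notebook.read_write = False
  return users.check_access( user_id, notebook_id, read_write = True )
```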
   @expose( view = Json )
   @strongly_expire
-  @wait_for_update
   @grab_user_id
-  @async
-  @update_client
   @validate(
     notebook_id = Valid_id(),
     note_id = Valid_id(),
@@ -161,35 +151,24 @@ class Notebooks( object ):
     @param user_id: id of current logged-in user (if any), determined by @grab_user_id
     @rtype: json dict
     @return: { 'note': notedict or None }
-    @raise Access_error: the current user doesn't have access to the given notebook
+    @raise Access_error: the current user doesn't have access to the given notebook or note
     @raise Validation_error: one of the arguments is invalid
     """
-    self.check_access( notebook_id, user_id, self.__scheduler.thread )
-    if not ( yield Scheduler.SLEEP ):
+    if not self.__users.check_access( user_id, notebook_id ):
       raise Access_error()

-    self.__database.load( notebook_id, self.__scheduler.thread )
-    notebook = ( yield Scheduler.SLEEP )
-
-    if notebook is None:
-      note = None
-    else:
-      note = notebook.lookup_note( note_id )
-
-    if revision:
-      self.__database.load( note_id, self.__scheduler.thread, revision )
-      note = ( yield Scheduler.SLEEP )
-
-    yield dict(
+    note = self.__database.load( Note, note_id, revision )
+
+    if note and note.notebook_id != notebook_id:
+      raise Access_error()
+
+    return dict(
       note = note,
     )
   @expose( view = Json )
   @strongly_expire
-  @wait_for_update
   @grab_user_id
-  @async
-  @update_client
   @validate(
     notebook_id = Valid_id(),
     note_title = Valid_string( min = 1, max = 500 ),
@@ -210,28 +189,23 @@ class Notebooks( object ):
     @raise Access_error: the current user doesn't have access to the given notebook
     @raise Validation_error: one of the arguments is invalid
     """
-    self.check_access( notebook_id, user_id, self.__scheduler.thread )
-    if not ( yield Scheduler.SLEEP ):
+    if not self.__users.check_access( user_id, notebook_id ):
       raise Access_error()

-    self.__database.load( notebook_id, self.__scheduler.thread )
-    notebook = ( yield Scheduler.SLEEP )
+    notebook = self.__database.load( Notebook, notebook_id )

     if notebook is None:
       note = None
     else:
-      note = notebook.lookup_note_by_title( note_title )
+      note = self.__database.select_one( Notebook, notebook.sql_load_note_by_title( note_title ) )

-    yield dict(
+    return dict(
       note = note,
     )
   @expose( view = Json )
   @strongly_expire
-  @wait_for_update
   @grab_user_id
-  @async
-  @update_client
   @validate(
     notebook_id = Valid_id(),
     note_title = Valid_string( min = 1, max = 500 ),
@@ -252,27 +226,61 @@ class Notebooks( object ):
     @raise Access_error: the current user doesn't have access to the given notebook
     @raise Validation_error: one of the arguments is invalid
     """
-    self.check_access( notebook_id, user_id, self.__scheduler.thread )
-    if not ( yield Scheduler.SLEEP ):
+    if not self.__users.check_access( user_id, notebook_id ):
       raise Access_error()

-    self.__database.load( notebook_id, self.__scheduler.thread )
-    notebook = ( yield Scheduler.SLEEP )
+    notebook = self.__database.load( Notebook, notebook_id )

     if notebook is None:
       note = None
     else:
-      note = notebook.lookup_note_by_title( note_title )
+      note = self.__database.select_one( Notebook, notebook.sql_load_note_by_title( note_title ) )

-    yield dict(
+    return dict(
       note_id = note and note.object_id or None,
     )
   @expose( view = Json )
-  @wait_for_update
+  @strongly_expire
+  @grab_user_id
+  @validate(
+    notebook_id = Valid_id(),
+    note_id = Valid_id(),
+    user_id = Valid_id( none_okay = True ),
+  )
+  def load_note_revisions( self, notebook_id, note_id, user_id = None ):
+    """
+    Return the full list of revision timestamps for this note in chronological order.
+
+    @type notebook_id: unicode
+    @param notebook_id: id of notebook the note is in
+    @type note_id: unicode
+    @param note_id: id of note in question
+    @type user_id: unicode or NoneType
+    @param user_id: id of current logged-in user (if any), determined by @grab_user_id
+    @rtype: json dict
+    @return: { 'revisions': revisionslist or None }
+    @raise Access_error: the current user doesn't have access to the given notebook or note
+    @raise Validation_error: one of the arguments is invalid
+    """
+    if not self.__users.check_access( user_id, notebook_id ):
+      raise Access_error()
+
+    note = self.__database.load( Note, note_id )
+
+    if note:
+      if note.notebook_id != notebook_id:
+        raise Access_error()
+      revisions = self.__database.select_many( unicode, note.sql_load_revisions() )
+    else:
+      revisions = None
+
+    return dict(
+      revisions = revisions,
+    )
+
+  @expose( view = Json )
   @grab_user_id
-  @async
-  @update_client
   @validate(
     notebook_id = Valid_id(),
     note_id = Valid_id(),
@@ -310,186 +318,78 @@ class Notebooks( object ):
     @raise Access_error: the current user doesn't have access to the given notebook
     @raise Validation_error: one of the arguments is invalid
     """
-    self.check_access( notebook_id, user_id, self.__scheduler.thread )
-    if not ( yield Scheduler.SLEEP ):
+    if not self.__users.check_access( user_id, notebook_id, read_write = True ):
       raise Access_error()

-    self.__database.load( notebook_id, self.__scheduler.thread )
-    notebook = ( yield Scheduler.SLEEP )
+    notebook = self.__database.load( Notebook, notebook_id )

     if not notebook:
       raise Access_error()

-    self.__database.load( note_id, self.__scheduler.thread )
-    note = ( yield Scheduler.SLEEP )
+    note = self.__database.load( Note, note_id )

     # check whether the provided note contents have been changed since the previous revision
-    def update_note( current_notebook, old_note ):
+    def update_note( current_notebook, old_note, startup ):
       # the note hasn't been changed, so bail without updating it
-      if contents == old_note.contents:
+      if contents == old_note.contents and startup == old_note.startup:
         new_revision = None
       # the note has changed, so update it
       else:
-        notebook.update_note( note, contents )
+        note.contents = contents
+        note.startup = startup
+
+        if startup:
+          if note.rank is None:
+            note.rank = self.__database.select_one( float, notebook.sql_highest_rank() ) + 1
+        else:
+          note.rank = None
+
         new_revision = note.revision

       return new_revision

     # if the note is already in the given notebook, load it and update it
-    if note and note in notebook.notes:
-      self.__database.load( note_id, self.__scheduler.thread, previous_revision )
-      old_note = ( yield Scheduler.SLEEP )
+    if note and note.notebook_id == notebook.object_id:
+      old_note = self.__database.load( Note, note_id, previous_revision )

       previous_revision = note.revision
-      new_revision = update_note( notebook, old_note )
+      new_revision = update_note( notebook, old_note, startup )

     # the note is not already in the given notebook, so look for it in the trash
-    elif note and notebook.trash and note in notebook.trash.notes:
-      self.__database.load( note_id, self.__scheduler.thread, previous_revision )
-      old_note = ( yield Scheduler.SLEEP )
+    elif note and notebook.trash_id and note.notebook_id == notebook.trash_id:
+      old_note = self.__database.load( Note, note_id, previous_revision )

       # undelete the note, putting it back in the given notebook
       previous_revision = note.revision
-      notebook.trash.remove_note( note )
-      note.deleted_from = None
-      notebook.add_note( note )
-
-      new_revision = update_note( notebook, old_note )
+      note.notebook_id = notebook.object_id
+      note.deleted_from_id = None
+
+      new_revision = update_note( notebook, old_note, startup )
     # otherwise, create a new note
     else:
+      if startup:
+        rank = self.__database.select_one( float, notebook.sql_highest_rank() ) + 1
+      else:
+        rank = None
+
       previous_revision = None
-      note = Note( note_id, contents )
-      notebook.add_note( note )
+      note = Note.create( note_id, contents, notebook_id = notebook.object_id, startup = startup, rank = rank )
       new_revision = note.revision

-    if startup:
-      startup_changed = notebook.add_startup_note( note )
-    else:
-      startup_changed = notebook.remove_startup_note( note )
-
-    if new_revision or startup_changed:
-      self.__database.save( notebook, self.__scheduler.thread )
-      yield Scheduler.SLEEP
-      self.__users.update_storage( user_id, self.__scheduler.thread )
-      user = ( yield Scheduler.SLEEP )
-      self.__database.save( user )
+    if new_revision:
+      self.__database.save( note, commit = False )
+      user = self.__users.update_storage( user_id, commit = False )
+      self.__database.commit()
     else:
       user = None

-    yield dict(
+    return dict(
       new_revision = new_revision,
       previous_revision = previous_revision,
       storage_bytes = user and user.storage_bytes or 0,
     )
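The converted save path above batches its writes: each `save( ..., commit = False )` joins the same open transaction, and a single `commit()` makes the note update and the storage recalculation land atomically. A toy sketch of that batching contract, outside the real controller.Database (`Toy_database` is a hypothetical stand-in):

```python
# Hypothetical stand-in illustrating the commit = False batching contract.
class Toy_database( object ):
  def __init__( self ):
    self.pending = []
    self.committed = []

  def save( self, obj, commit = True ):
    # queue the write; only flush if the caller asks for an immediate commit
    self.pending.append( obj )
    if commit:
      self.commit()

  def commit( self ):
    # flush every queued write as one unit
    self.committed.extend( self.pending )
    self.pending = []

db = Toy_database()
db.save( "note", commit = False )
db.save( "user storage", commit = False )
db.commit()  # both writes land together
```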
   @expose( view = Json )
-  @wait_for_update
   @grab_user_id
-  @async
-  @update_client
-  @validate(
-    notebook_id = Valid_id(),
-    note_id = Valid_id(),
-    user_id = Valid_id( none_okay = True ),
-  )
-  def add_startup_note( self, notebook_id, note_id, user_id ):
-    """
-    Designate a particular note to be shown upon startup, e.g. whenever its notebook is displayed.
-    The given note must already be within this notebook.
-
-    @type notebook_id: unicode
-    @param notebook_id: id of notebook the note is in
-    @type note_id: unicode
-    @param note_id: id of note to show on startup
-    @type user_id: unicode or NoneType
-    @param user_id: id of current logged-in user (if any), determined by @grab_user_id
-    @rtype: json dict
-    @return: { 'storage_bytes': current storage usage by user }
-    @raise Access_error: the current user doesn't have access to the given notebook
-    @raise Validation_error: one of the arguments is invalid
-    """
-    self.check_access( notebook_id, user_id, self.__scheduler.thread )
-    if not ( yield Scheduler.SLEEP ):
-      raise Access_error()
-
-    self.__database.load( notebook_id, self.__scheduler.thread )
-    notebook = ( yield Scheduler.SLEEP )
-
-    if not notebook:
-      raise Access_error()
-
-    self.__database.load( note_id, self.__scheduler.thread )
-    note = ( yield Scheduler.SLEEP )
-
-    if note:
-      notebook.add_startup_note( note )
-      self.__database.save( notebook, self.__scheduler.thread )
-      yield Scheduler.SLEEP
-      self.__users.update_storage( user_id, self.__scheduler.thread )
-      user = ( yield Scheduler.SLEEP )
-      self.__database.save( user )
-
-      yield dict( storage_bytes = user.storage_bytes )
-    else:
-      yield dict( storage_bytes = 0 )
-
-  @expose( view = Json )
-  @wait_for_update
-  @grab_user_id
-  @async
-  @update_client
-  @validate(
-    notebook_id = Valid_id(),
-    note_id = Valid_id(),
-    user_id = Valid_id( none_okay = True ),
-  )
-  def remove_startup_note( self, notebook_id, note_id, user_id ):
-    """
-    Prevent a particular note from being shown on startup, e.g. whenever its notebook is displayed.
-    The given note must already be within this notebook.
-
-    @type notebook_id: unicode
-    @param notebook_id: id of notebook the note is in
-    @type note_id: unicode
-    @param note_id: id of note to no longer show on startup
-    @type user_id: unicode or NoneType
-    @param user_id: id of current logged-in user (if any), determined by @grab_user_id
-    @rtype: json dict
-    @return: { 'storage_bytes': current storage usage by user }
-    @raise Access_error: the current user doesn't have access to the given notebook
-    @raise Validation_error: one of the arguments is invalid
-    """
-    self.check_access( notebook_id, user_id, self.__scheduler.thread )
-    if not ( yield Scheduler.SLEEP ):
-      raise Access_error()
-
-    self.__database.load( notebook_id, self.__scheduler.thread )
-    notebook = ( yield Scheduler.SLEEP )
-
-    if not notebook:
-      raise Access_error()
-
-    self.__database.load( note_id, self.__scheduler.thread )
-    note = ( yield Scheduler.SLEEP )
-
-    if note:
-      notebook.remove_startup_note( note )
-      self.__database.save( notebook, self.__scheduler.thread )
-      yield Scheduler.SLEEP
-      self.__users.update_storage( user_id, self.__scheduler.thread )
-      user = ( yield Scheduler.SLEEP )
-      self.__database.save( user )
-
-      yield dict( storage_bytes = user.storage_bytes )
-    else:
-      yield dict( storage_bytes = 0 )
-
-  @expose( view = Json )
-  @wait_for_update
-  @grab_user_id
-  @async
-  @update_client
   @validate(
     notebook_id = Valid_id(),
     note_id = Valid_id(),
@@ -512,42 +412,34 @@ class Notebooks( object ):
     @raise Access_error: the current user doesn't have access to the given notebook
     @raise Validation_error: one of the arguments is invalid
     """
-    self.check_access( notebook_id, user_id, self.__scheduler.thread )
-    if not ( yield Scheduler.SLEEP ):
+    if not self.__users.check_access( user_id, notebook_id, read_write = True ):
       raise Access_error()

-    self.__database.load( notebook_id, self.__scheduler.thread )
-    notebook = ( yield Scheduler.SLEEP )
+    notebook = self.__database.load( Notebook, notebook_id )

     if not notebook:
       raise Access_error()

-    self.__database.load( note_id, self.__scheduler.thread )
-    note = ( yield Scheduler.SLEEP )
+    note = self.__database.load( Note, note_id )

-    if note:
-      notebook.remove_note( note )
-
-      if notebook.trash:
-        note.deleted_from = notebook.object_id
-        notebook.trash.add_note( note )
-        notebook.trash.add_startup_note( note )
-
-      self.__database.save( notebook, self.__scheduler.thread )
-      yield Scheduler.SLEEP
-      self.__users.update_storage( user_id, self.__scheduler.thread )
-      user = ( yield Scheduler.SLEEP )
-      self.__database.save( user )
-
-      yield dict( storage_bytes = user.storage_bytes )
+    if note and note.notebook_id == notebook_id:
+      if notebook.trash_id:
+        note.deleted_from_id = notebook_id
+        note.notebook_id = notebook.trash_id
+        note.startup = True
+      else:
+        note.notebook_id = None
+
+      self.__database.save( note, commit = False )
+      user = self.__users.update_storage( user_id, commit = False )
+      self.__database.commit()
+
+      return dict( storage_bytes = user.storage_bytes )
     else:
-      yield dict( storage_bytes = 0 )
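With the trash now a plain notebook id, deleting and undeleting reduce to repointing the note's `notebook_id`, with `deleted_from_id` remembering where the note came from. A toy round trip mirroring the field updates in the diff (`Toy_note`, `delete`, and `undelete` are illustrative stand-ins, not the model classes):

```python
# Illustrative stand-in for the relevant new_model.Note fields.
class Toy_note( object ):
  def __init__( self, notebook_id ):
    self.notebook_id = notebook_id
    self.deleted_from_id = None
    self.startup = False

def delete( note, notebook_id, trash_id ):
  # move the note into the trash notebook, remembering its origin
  note.deleted_from_id = notebook_id
  note.notebook_id = trash_id
  note.startup = True

def undelete( note ):
  # put the note back where it was deleted from
  note.notebook_id = note.deleted_from_id
  note.deleted_from_id = None
  note.startup = True

note = Toy_note( u"nb1" )
delete( note, u"nb1", u"trash1" )
undelete( note )
```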
   @expose( view = Json )
-  @wait_for_update
   @grab_user_id
-  @async
-  @update_client
   @validate(
     notebook_id = Valid_id(),
     note_id = Valid_id(),
@@ -569,50 +461,39 @@ class Notebooks( object ):
     @raise Access_error: the current user doesn't have access to the given notebook
     @raise Validation_error: one of the arguments is invalid
     """
-    self.check_access( notebook_id, user_id, self.__scheduler.thread )
-    if not ( yield Scheduler.SLEEP ):
+    if not self.__users.check_access( user_id, notebook_id, read_write = True ):
       raise Access_error()

-    self.__database.load( notebook_id, self.__scheduler.thread )
-    notebook = ( yield Scheduler.SLEEP )
+    notebook = self.__database.load( Notebook, notebook_id )

     if not notebook:
       raise Access_error()

-    self.__database.load( note_id, self.__scheduler.thread )
-    note = ( yield Scheduler.SLEEP )
+    note = self.__database.load( Note, note_id )

-    if note and notebook.trash:
+    if note and notebook.trash_id:
       # if the note isn't deleted, and it's already in this notebook, just return
-      if note.deleted_from is None and notebook.lookup_note( note.object_id ):
-        yield dict( storage_bytes = 0 )
-        return
+      if note.deleted_from_id is None and note.notebook_id == notebook_id:
+        return dict( storage_bytes = 0 )

       # if the note was deleted from a different notebook than the notebook given, raise
-      if note.deleted_from != notebook_id:
+      if note.deleted_from_id != notebook_id:
         raise Access_error()

-      notebook.trash.remove_note( note )
-      note.deleted_from = None
-      notebook.add_note( note )
-      notebook.add_startup_note( note )
-
-      self.__database.save( notebook, self.__scheduler.thread )
-      yield Scheduler.SLEEP
-      self.__users.update_storage( user_id, self.__scheduler.thread )
-      user = ( yield Scheduler.SLEEP )
-      self.__database.save( user )
-
-      yield dict( storage_bytes = user.storage_bytes )
+      note.notebook_id = note.deleted_from_id
+      note.deleted_from_id = None
+      note.startup = True
+
+      self.__database.save( note, commit = False )
+      user = self.__users.update_storage( user_id, commit = False )
+      self.__database.commit()
+
+      return dict( storage_bytes = user.storage_bytes )
     else:
-      yield dict( storage_bytes = 0 )
+      return dict( storage_bytes = 0 )
   @expose( view = Json )
-  @wait_for_update
   @grab_user_id
-  @async
-  @update_client
   @validate(
     notebook_id = Valid_id(),
     user_id = Valid_id( none_okay = True ),
@@ -632,40 +513,35 @@ class Notebooks( object ):
     @raise Access_error: the current user doesn't have access to the given notebook
     @raise Validation_error: one of the arguments is invalid
     """
-    self.check_access( notebook_id, user_id, self.__scheduler.thread )
-    if not ( yield Scheduler.SLEEP ):
+    if not self.__users.check_access( user_id, notebook_id, read_write = True ):
       raise Access_error()

-    self.__database.load( notebook_id, self.__scheduler.thread )
-    notebook = ( yield Scheduler.SLEEP )
+    notebook = self.__database.load( Notebook, notebook_id )

     if not notebook:
       raise Access_error()

-    for note in notebook.notes:
-      notebook.remove_note( note )
-
-      if notebook.trash:
-        note.deleted_from = notebook.object_id
-        notebook.trash.add_note( note )
-        notebook.trash.add_startup_note( note )
-
-    self.__database.save( notebook, self.__scheduler.thread )
-    yield Scheduler.SLEEP
-    self.__users.update_storage( user_id, self.__scheduler.thread )
-    user = ( yield Scheduler.SLEEP )
-    self.__database.save( user )
-
-    yield dict(
+    notes = self.__database.select_many( Note, notebook.sql_load_notes() )
+
+    for note in notes:
+      if notebook.trash_id:
+        note.deleted_from_id = notebook_id
+        note.notebook_id = notebook.trash_id
+        note.startup = True
+      else:
+        note.notebook_id = None
+      self.__database.save( note, commit = False )
+
+    user = self.__users.update_storage( user_id, commit = False )
+    self.__database.commit()
+
+    return dict(
       storage_bytes = user.storage_bytes,
     )
@expose( view = Json )
|
@expose( view = Json )
|
||||||
@strongly_expire
|
@strongly_expire
|
||||||
@wait_for_update
|
|
||||||
@grab_user_id
|
@grab_user_id
|
||||||
@async
|
|
||||||
@update_client
|
|
||||||
@validate(
|
@validate(
|
||||||
notebook_id = Valid_id(),
|
notebook_id = Valid_id(),
|
||||||
search_text = Valid_string( min = 0, max = 100 ),
|
search_text = Valid_string( min = 0, max = 100 ),
|
||||||
@@ -688,41 +564,39 @@ class Notebooks( object ):
     @raise Access_error: the current user doesn't have access to the given notebook
     @raise Validation_error: one of the arguments is invalid
     """
-    self.check_access( notebook_id, user_id, self.__scheduler.thread )
-    if not ( yield Scheduler.SLEEP ):
+    if not self.__users.check_access( user_id, notebook_id ):
       raise Access_error()
 
-    self.__database.load( notebook_id, self.__scheduler.thread )
-    notebook = ( yield Scheduler.SLEEP )
+    notebook = self.__database.load( Notebook, notebook_id )
 
     if not notebook:
      raise Access_error()
 
     search_text = search_text.lower()
+    if len( search_text ) == 0:
+      return dict( notes = [] )
 
     title_matches = []
     content_matches = []
     nuker = Html_nuker()
 
-    if len( search_text ) > 0:
-      for note in notebook.notes:
-        if note is None: continue
-        if search_text in nuker.nuke( note.title ).lower():
-          title_matches.append( note )
-        elif search_text in nuker.nuke( note.contents ).lower():
-          content_matches.append( note )
+    notes = self.__database.select_many( Note, notebook.sql_search_notes( search_text ) )
 
-    notes = title_matches + content_matches
+    # further narrow the search results by making sure notes still match after all HTML tags are
+    # stripped out
+    for note in notes:
+      if search_text in nuker.nuke( note.title ).lower():
+        title_matches.append( note )
+      elif search_text in nuker.nuke( note.contents ).lower():
+        content_matches.append( note )
 
-    yield dict(
-      notes = notes,
+    return dict(
+      notes = title_matches + content_matches,
     )
 
   @expose( view = Json )
   @strongly_expire
-  @wait_for_update
   @grab_user_id
-  @async
-  @update_client
   @validate(
     notebook_id = Valid_id(),
     user_id = Valid_id( none_okay = True ),
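The narrowing step in the new search code above exists because the SQL search matches against raw note HTML, so a hit inside a tag or split across tags must be re-checked against the tag-stripped text. A sketch of that narrowing, using the stdlib `html.parser` as a stand-in for `Html_nuker` (whose implementation is not shown in this diff):

```python
from html.parser import HTMLParser

class Tag_stripper(HTMLParser):
    # a minimal stand-in for Html_nuker: collect only the text outside of tags
    def __init__(self):
        super().__init__()
        self.pieces = []

    def handle_data(self, data):
        self.pieces.append(data)

def nuke(html):
    stripper = Tag_stripper()
    stripper.feed(html)
    return "".join(stripper.pieces)

def narrow_matches(notes, search_text):
    # same shape as the new search() code: a note only counts as a match if the
    # search text still appears once all HTML tags are stripped out, with title
    # matches sorted ahead of content matches
    search_text = search_text.lower()
    title_matches = []
    content_matches = []
    for title, contents in notes:
        if search_text in nuke(title).lower():
            title_matches.append((title, contents))
        elif search_text in nuke(contents).lower():
            content_matches.append((title, contents))
    return title_matches + content_matches

notes = [
    ("<b>shopping</b>", "buy <i>milk</i>"),
    ("misc", "the word mi<b>lk</b> is split by tags"),
]
# both notes still contain "milk" after tag stripping, even the tag-split one
assert len(narrow_matches(notes, "MILK")) == 2
```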
@@ -740,29 +614,23 @@ class Notebooks( object ):
     @raise Access_error: the current user doesn't have access to the given notebook
     @raise Validation_error: one of the arguments is invalid
     """
-    self.check_access( notebook_id, user_id, self.__scheduler.thread )
-    if not ( yield Scheduler.SLEEP ):
+    if not self.__users.check_access( user_id, notebook_id ):
       raise Access_error()
 
-    self.__database.load( notebook_id, self.__scheduler.thread )
-    notebook = ( yield Scheduler.SLEEP )
+    notebook = self.__database.load( Notebook, notebook_id )
 
     if not notebook:
       raise Access_error()
 
-    notes = [ note for note in notebook.notes if note is not None and note.title is not None ]
-    notes.sort( lambda a, b: cmp( b.revision, a.revision ) )
+    notes = self.__database.select_many( Note, notebook.sql_load_notes() )
 
-    yield dict(
+    return dict(
       notes = [ ( note.object_id, note.title ) for note in notes ]
     )
 
   @expose( view = Html_file )
   @strongly_expire
-  @wait_for_update
   @grab_user_id
-  @async
-  @update_client
   @validate(
     notebook_id = Valid_id(),
     user_id = Valid_id( none_okay = True ),
@@ -780,42 +648,18 @@ class Notebooks( object ):
     @raise Access_error: the current user doesn't have access to the given notebook
     @raise Validation_error: one of the arguments is invalid
     """
-    self.check_access( notebook_id, user_id, self.__scheduler.thread )
-    if not ( yield Scheduler.SLEEP ):
+    if not self.__users.check_access( user_id, notebook_id ):
       raise Access_error()
 
-    self.__database.load( notebook_id, self.__scheduler.thread )
-    notebook = ( yield Scheduler.SLEEP )
+    notebook = self.__database.load( Notebook, notebook_id )
 
     if not notebook:
       raise Access_error()
 
-    normal_notes = list( set( notebook.notes ) - set( notebook.startup_notes ) )
-    normal_notes.sort( lambda a, b: -cmp( a.revision, b.revision ) )
+    startup_notes = self.__database.select_many( Note, notebook.sql_load_startup_notes() )
+    other_notes = self.__database.select_many( Note, notebook.sql_load_non_startup_notes() )
 
-    yield dict(
+    return dict(
       notebook_name = notebook.name,
-      notes = [ note for note in notebook.startup_notes + normal_notes if note is not None ],
+      notes = startup_notes + other_notes,
     )
 
-  @async
-  def check_access( self, notebook_id, user_id, callback ):
-    # check if the anonymous user has access to this notebook
-    self.__database.load( u"User anonymous", self.__scheduler.thread )
-    anonymous = ( yield Scheduler.SLEEP )
-
-    access = False
-    if anonymous.has_access( notebook_id ):
-      access = True
-
-    if user_id:
-      # check if the currently logged in user has access to this notebook
-      self.__database.load( user_id, self.__scheduler.thread )
-      user = ( yield Scheduler.SLEEP )
-
-      if user and user.has_access( notebook_id ):
-        access = True
-
-    yield callback, access
-
-  scheduler = property( lambda self: self.__scheduler )
@@ -0,0 +1,303 @@
+import re
+import bsddb
+import random
+import cPickle
+from cStringIO import StringIO
+from copy import copy
+from model.Persistent import Persistent
+from Async import async
+
+
+class Old_database( object ):
+  ID_BITS = 128 # number of bits within an id
+  ID_DIGITS = "0123456789abcdefghijklmnopqrstuvwxyz"
+
+  def __init__( self, scheduler, database_path = None ):
+    """
+    Create a new database and return it.
+
+    @type scheduler: Scheduler
+    @param scheduler: scheduler to use
+    @type database_path: unicode
+    @param database_path: path to the database file
+    @rtype: Old_database
+    @return: database at the given path
+    """
+    self.__scheduler = scheduler
+    self.__env = bsddb.db.DBEnv()
+    self.__env.open( None, bsddb.db.DB_CREATE | bsddb.db.DB_PRIVATE | bsddb.db.DB_INIT_MPOOL )
+    self.__db = bsddb.db.DB( self.__env )
+    self.__db.open( database_path, "database", bsddb.db.DB_HASH, bsddb.db.DB_CREATE )
+    self.__cache = {}
+
+  def __persistent_id( self, obj, skip = None ):
+    # save the object and return its persistent id
+    if obj != skip and isinstance( obj, Persistent ):
+      self.__save( obj )
+      return obj.object_id
+
+    # returning None indicates that the object should be pickled normally without using a persistent id
+    return None
+
+  @async
+  def save( self, obj, callback = None ):
+    """
+    Save the given object to the database, including any objects that it references.
+
+    @type obj: Persistent
+    @param obj: object to save
+    @type callback: generator or NoneType
+    @param callback: generator to wakeup when the save is complete (optional)
+    """
+    self.__save( obj )
+    yield callback
+
+  def __save( self, obj ):
+    # if this object's current revision is already saved, bail
+    revision_id = obj.revision_id()
+    if revision_id in self.__cache:
+      return
+
+    object_id = unicode( obj.object_id ).encode( "utf8" )
+    revision_id = unicode( obj.revision_id() ).encode( "utf8" )
+    secondary_id = obj.secondary_id and unicode( obj.full_secondary_id() ).encode( "utf8" ) or None
+
+    # update the cache with this saved object
+    self.__cache[ object_id ] = obj
+    self.__cache[ revision_id ] = copy( obj )
+    if secondary_id:
+      self.__cache[ secondary_id ] = obj
+
+    # set the pickler up to save persistent ids for every object except for the obj passed in, which
+    # will be pickled normally
+    buffer = StringIO()
+    pickler = cPickle.Pickler( buffer, protocol = -1 )
+    pickler.persistent_id = lambda o: self.__persistent_id( o, skip = obj )
+
+    # pickle the object and write it to the database under both its id key and its revision id key
+    pickler.dump( obj )
+    pickled = buffer.getvalue()
+    self.__db.put( object_id, pickled )
+    self.__db.put( revision_id, pickled )
+
+    # write the pickled object id (only) to the database under its secondary id
+    if secondary_id:
+      buffer = StringIO()
+      pickler = cPickle.Pickler( buffer, protocol = -1 )
+      pickler.persistent_id = lambda o: self.__persistent_id( o )
+      pickler.dump( obj )
+      self.__db.put( secondary_id, buffer.getvalue() )
+
+    self.__db.sync()
+
+  @async
+  def load( self, object_id, callback, revision = None ):
+    """
+    Load the object corresponding to the given object id from the database, and yield the provided
+    callback generator with the loaded object as its argument, or None if the object_id is unknown.
+    If a revision is provided, a specific revision of the object will be loaded.
+
+    @type object_id: unicode
+    @param object_id: id of the object to load
+    @type callback: generator
+    @param callback: generator to send the loaded object to
+    @type revision: int or NoneType
+    @param revision: revision of the object to load (optional)
+    """
+    obj = self.__load( object_id, revision )
+    yield callback, obj
+
+  def __load( self, object_id, revision = None ):
+    if revision is not None:
+      object_id = Persistent.make_revision_id( object_id, revision )
+
+    object_id = unicode( object_id ).encode( "utf8" )
+
+    # if the object corresponding to the given id has already been loaded, simply return it without
+    # loading it again
+    obj = self.__cache.get( object_id )
+    if obj is not None:
+      return obj
+
+    # grab the object for the given id from the database
+    buffer = StringIO()
+    unpickler = cPickle.Unpickler( buffer )
+    unpickler.persistent_load = self.__load
+
+    pickled = self.__db.get( object_id )
+    if pickled is None or pickled == "":
+      return None
+
+    buffer.write( pickled )
+    buffer.flush()
+    buffer.seek( 0 )
+
+    # unpickle the object and update the cache with this saved object
+    obj = unpickler.load()
+    if obj is None:
+      print "error unpickling %s: %s" % ( object_id, pickled )
+      return None
+    self.__cache[ unicode( obj.object_id ).encode( "utf8" ) ] = obj
+    self.__cache[ unicode( obj.revision_id() ).encode( "utf8" ) ] = copy( obj )
+
+    return obj
+
+  @async
+  def reload( self, object_id, callback = None ):
+    """
+    Load and immediately save the object corresponding to the given object id or database key. This
+    is useful when the object has a __setstate__() method that performs some sort of schema
+    evolution operation.
+
+    @type object_id: unicode
+    @param object_id: id or key of the object to reload
+    @type callback: generator or NoneType
+    @param callback: generator to wakeup when the save is complete (optional)
+    """
+    self.__reload( object_id )
+    yield callback
+
+  def __reload( self, object_id, revision = None ):
+    object_id = unicode( object_id ).encode( "utf8" )
+
+    # grab the object for the given id from the database
+    buffer = StringIO()
+    unpickler = cPickle.Unpickler( buffer )
+    unpickler.persistent_load = self.__load
+
+    pickled = self.__db.get( object_id )
+    if pickled is None or pickled == "":
+      return
+
+    buffer.write( pickled )
+    buffer.flush()
+    buffer.seek( 0 )
+
+    # unpickle the object. this should trigger __setstate__() if the object has such a method
+    obj = unpickler.load()
+    if obj is None:
+      print "error unpickling %s: %s" % ( object_id, pickled )
+      return
+    self.__cache[ object_id ] = obj
+
+    # set the pickler up to save persistent ids for every object except for the obj passed in, which
+    # will be pickled normally
+    buffer = StringIO()
+    pickler = cPickle.Pickler( buffer, protocol = -1 )
+    pickler.persistent_id = lambda o: self.__persistent_id( o, skip = obj )
+
+    # pickle the object and write it to the database under its id key
+    pickler.dump( obj )
+    pickled = buffer.getvalue()
+    self.__db.put( object_id, pickled )
+
+    self.__db.sync()
+
+  def size( self, object_id, revision = None ):
+    """
+    Load the object corresponding to the given object id from the database, and return the size of
+    its pickled data in bytes. If a revision is provided, a specific revision of the object will be
+    loaded.
+
+    @type object_id: unicode
+    @param object_id: id of the object whose size should be returned
+    @type revision: int or NoneType
+    @param revision: revision of the object to load (optional)
+    """
+    if revision is not None:
+      object_id = Persistent.make_revision_id( object_id, revision )
+
+    object_id = unicode( object_id ).encode( "utf8" )
+
+    pickled = self.__db.get( object_id )
+    if pickled is None or pickled == "":
+      return None
+
+    return len( pickled )
+
+  @staticmethod
+  def generate_id():
+    int_id = random.getrandbits( Old_database.ID_BITS )
+
+    base = len( Old_database.ID_DIGITS )
+    digits = []
+
+    while True:
+      index = int_id % base
+      digits.insert( 0, Old_database.ID_DIGITS[ index ] )
+      int_id = int_id / base
+      if int_id == 0:
+        break
+
+    return "".join( digits )
+
+  @async
+  def next_id( self, callback ):
+    """
+    Generate the next available object id, and yield the provided callback generator with the
+    object id as its argument.
+
+    @type callback: generator
+    @param callback: generator to send the next available object id to
+    """
+    # generate a random id, but on the off-chance that it collides with something else already in
+    # the database, try again
+    next_id = Old_database.generate_id()
+    while self.__db.get( next_id, default = None ) is not None:
+      next_id = Old_database.generate_id()
+
+    # save the next_id as a key in the database so that it's not handed out again to another client
+    self.__db[ next_id ] = ""
+
+    yield callback, next_id
+
+  @async
+  def close( self ):
+    """
+    Shutdown the database.
+    """
+    self.__db.close()
+    self.__env.close()
+    yield None
+
+  @async
+  def clear_cache( self ):
+    """
+    Clear the memory object cache.
+    """
+    self.__cache.clear()
+    yield None
+
+  scheduler = property( lambda self: self.__scheduler )
+
+
+class Valid_id( object ):
+  """
+  Validator for an object id.
+  """
+  ID_PATTERN = re.compile( "^[%s]+$" % Old_database.ID_DIGITS )
+
+  def __init__( self, none_okay = False ):
+    self.__none_okay = none_okay
+
+  def __call__( self, value ):
+    if self.__none_okay and value in ( None, "None", "" ): return None
+    if self.ID_PATTERN.search( value ): return str( value )
+
+    raise ValueError()
+
+
+class Valid_revision( object ):
+  """
+  Validator for an object id.
+  """
+  REVISION_PATTERN = re.compile( "^\d\d\d\d-\d\d-\d\d \d\d:\d\d:\d\d\.\d+$" )
+
+  def __init__( self, none_okay = False ):
+    self.__none_okay = none_okay
+
+  def __call__( self, value ):
+    if self.__none_okay and value in ( None, "None", "" ): return None
+    if self.REVISION_PATTERN.search( value ): return str( value )
+
+    raise ValueError()
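The core trick in the `Old_database` file added above is pickle's persistent-id machinery: when saving an object, every `Persistent` it references is written under its own database key and replaced in the pickle stream by a bare id, which `persistent_load` resolves back into an object on load. A sketch of that mechanism against Python 3's `pickle` (the original uses Python 2's `cPickle`; the in-memory `store` dict here stands in for the bsddb hash table, and `Record` for `Persistent`):

```python
import io
import pickle

store = {}  # stand-in for the bsddb key/value table

class Record:
    def __init__(self, object_id, payload, child=None):
        self.object_id = object_id
        self.payload = payload
        self.child = child

class Graph_pickler(pickle.Pickler):
    def __init__(self, buffer, root):
        super().__init__(buffer, protocol=-1)
        self.root = root

    def persistent_id(self, obj):
        # every referenced Record except the root is saved under its own key
        # and replaced in the stream by its bare object id
        if obj is not self.root and isinstance(obj, Record):
            save(obj)
            return obj.object_id
        return None  # None means: pickle this object normally

class Graph_unpickler(pickle.Unpickler):
    def persistent_load(self, object_id):
        return load(object_id)  # resolve referenced ids recursively

def save(record):
    buffer = io.BytesIO()
    Graph_pickler(buffer, record).dump(record)
    store[record.object_id] = buffer.getvalue()

def load(object_id):
    return Graph_unpickler(io.BytesIO(store[object_id])).load()

child = Record("c1", "child payload")
parent = Record("p1", "parent payload", child=child)
save(parent)

loaded = load("p1")
assert loaded.child.payload == "child payload"
assert set(store) == {"p1", "c1"}  # the child got its own database record
```

Each record therefore holds a single object rather than a whole object graph, which is what lets `size()` report per-object storage and lets objects be reloaded individually.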
@@ -1,13 +1,11 @@
 import cherrypy
 
-from Scheduler import Scheduler
 from Expose import expose
 from Validate import validate
-from Async import async
 from Notebooks import Notebooks
 from Users import Users
-from Updater import update_client, wait_for_update
 from Database import Valid_id
+from new_model.Note import Note
 from view.Main_page import Main_page
 from view.Json import Json
 from view.Error_page import Error_page
@@ -18,12 +16,10 @@ class Root( object ):
   """
   The root of the controller hierarchy, corresponding to the "/" URL.
   """
-  def __init__( self, scheduler, database, settings ):
+  def __init__( self, database, settings ):
     """
     Create a new Root object with the given settings.
 
-    @type scheduler: controller.Scheduler
-    @param scheduler: scheduler to use for asynchronous calls
     @type database: controller.Database
     @param database: database to use for all controllers
     @type settings: dict
@@ -31,18 +27,16 @@ class Root( object ):
     @rtype: Root
     @return: newly constructed Root
     """
-    self.__scheduler = scheduler
     self.__database = database
     self.__settings = settings
     self.__users = Users(
-      scheduler,
       database,
       settings[ u"global" ].get( u"luminotes.http_url", u"" ),
       settings[ u"global" ].get( u"luminotes.https_url", u"" ),
       settings[ u"global" ].get( u"luminotes.support_email", u"" ),
       settings[ u"global" ].get( u"luminotes.rate_plans", [] ),
     )
-    self.__notebooks = Notebooks( scheduler, database, self.__users )
+    self.__notebooks = Notebooks( database, self.__users )
 
   @expose()
   def default( self, password_reset_id ):
@@ -72,22 +66,19 @@ class Root( object ):
 
     return dict()
 
+  # TODO: move this method to controller.Notebooks, and maybe give it a more sensible name
   @expose( view = Json )
-  @wait_for_update
-  @async
-  @update_client
   def next_id( self ):
     """
-    Return the next available database object id. This id is guaranteed to be unique to the
-    database.
+    Return the next available database object id for a new note. This id is guaranteed to be unique
+    among all existing notes.
 
     @rtype: json dict
     @return: { 'next_id': nextid }
     """
-    self.__database.next_id( self.__scheduler.thread )
-    next_id = ( yield Scheduler.SLEEP )
+    next_id = self.__database.next_id( Note )
 
-    yield dict(
+    return dict(
       next_id = next_id,
     )
 
@@ -95,28 +86,20 @@ class Root( object ):
     """
     CherryPy HTTP error handler, used to display page not found and generic error pages.
     """
+    support_email = self.__settings[ u"global" ].get( u"luminotes.support_email" )
+
     if status == 404:
       cherrypy.response.headerMap[ u"Status" ] = u"404 Not Found"
       cherrypy.response.status = status
-      cherrypy.response.body = [ unicode( Not_found_page( self.__settings[ u"global" ].get( u"luminotes.support_email" ) ) ) ]
+      cherrypy.response.body = [ unicode( Not_found_page( support_email ) ) ]
       return
 
-    import sys
+    # TODO: it'd be nice to send an email to myself with the traceback
     import traceback
     traceback.print_exc()
 
-    exc_info = sys.exc_info()
-    if exc_info:
-      message = exc_info[ 1 ].message
-    else:
-      message = None
-
-    cherrypy.response.body = [ unicode( Error_page(
-      self.__settings[ u"global" ].get( u"luminotes.support_email" ),
-      message,
-    ) ) ]
+    cherrypy.response.body = [ unicode( Error_page( support_email ) ) ]
 
-  scheduler = property( lambda self: self.__scheduler )
   database = property( lambda self: self.__database )
   notebooks = property( lambda self: self.__notebooks )
   users = property( lambda self: self.__users )
@@ -1,72 +0,0 @@
-from Queue import Queue, Empty
-
-
-TIMEOUT_SECONDS = 10.0
-
-
-def wait_for_update( function ):
-  """
-  A decorator that passes a "queue" keyword argument to its decorated function, calls the function,
-  and then blocks until an asynchronous response comes back via the Queue. When a response is
-  received, wait_for_update() returns it.
-
-  For this decorator to be useful, you should use it to decorate a function that fires off some
-  asynchronous action and then returns immediately. A typical way to accomplish this is by using
-  the @async decorator after the @wait_for_update decorator.
-  """
-  def get_message( *args, **kwargs ):
-    queue = Queue()
-
-    kwargs[ "queue" ] = queue
-    function( *args, **kwargs )
-
-    # wait until a response is available in the queue, and then return that response
-    try:
-      return queue.get( block = True, timeout = TIMEOUT_SECONDS )
-    except Empty:
-      return { "error": u"A timeout occurred when processing your request. Please try again or contact support." }
-
-  return get_message
-
-
-def update_client( function ):
-  """
-  A decorator used to wrap a generator function so that its yielded values can be issued as
-  updates to the client. For this to work, the generator function must be invoked with a keyword
-  argument "queue" containing a Queue where the result can be put().
-
-  Also supports catching Validation_error exceptions and sending appropriate errors to the client.
-
-  Note that this decorator itself is a generator function and works by passing along next()/send()
-  calls to its decorated generator. Only yielded values that are dictionaries are sent to the
-  client via the provided queue. All other types of yielded values are in turn yielded by this
-  decorator itself.
-  """
-  def put_message( *args, **kwargs ):
-    # look in the called function's kwargs for the queue where results should be sent
-    queue = kwargs.pop( "queue" )
-
-    try:
-      generator = function( *args, **kwargs )
-      message = None
-
-      while True:
-        result = generator.send( message )
-
-        if isinstance( result, dict ):
-          queue.put( result )
-          message = ( yield None )
-        else:
-          message = ( yield result )
-    except StopIteration:
-      return
-    except Exception, error:
-      # TODO: might be better to use view.Json instead of calling to_dict() manually
-      if hasattr( error, "to_dict" ):
-        result = error.to_dict()
-        queue.put( result )
-      else:
-        queue.put( { "error": u"An error occurred when processing your request. Please try again or contact support." } )
-      raise
-
-  return put_message
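The deleted Updater.py above bridged the synchronous CherryPy request thread and the asynchronous Scheduler: `@wait_for_update` handed the controller a `Queue` and blocked on it, while `@update_client` put each yielded dict into that queue. With blocking database calls, the bridge is unnecessary. A toy Python 3 version of the queue side of that pairing (names and the threaded worker are illustrative; the real `@async` ran on the Scheduler, not a raw thread):

```python
from queue import Queue, Empty
from threading import Thread

TIMEOUT_SECONDS = 2.0

def wait_for_update(function):
    # same shape as the deleted decorator: inject a queue, call the function
    # (which returns immediately), then block until a response is put() on it
    def get_message(*args, **kwargs):
        queue = Queue()
        kwargs["queue"] = queue
        function(*args, **kwargs)
        try:
            return queue.get(block=True, timeout=TIMEOUT_SECONDS)
        except Empty:
            return {"error": "timeout"}
    return get_message

@wait_for_update
def compute(value, queue):
    # stands in for an @async controller method: do the work elsewhere and
    # deliver the result dict through the provided queue
    Thread(target=lambda: queue.put({"result": value * 2})).start()

assert compute(21) == {"result": 42}
```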
@ -1,17 +1,15 @@
|
||||||
import re
|
import re
|
||||||
import cherrypy
|
import cherrypy
|
||||||
|
from pytz import utc
|
||||||
from datetime import datetime, timedelta
|
from datetime import datetime, timedelta
|
||||||
from model.User import User
|
from new_model.User import User
|
||||||
from model.Notebook import Notebook
|
from new_model.Notebook import Notebook
|
||||||
from model.Note import Note
|
from new_model.Note import Note
|
||||||
from model.Password_reset import Password_reset
|
from new_model.Password_reset import Password_reset
|
||||||
from Scheduler import Scheduler
|
|
||||||
from Expose import expose
|
from Expose import expose
|
||||||
from Validate import validate, Valid_string, Valid_bool, Validation_error
|
from Validate import validate, Valid_string, Valid_bool, Validation_error
|
||||||
from Database import Valid_id
|
from Database import Valid_id
|
||||||
from Updater import update_client, wait_for_update
|
|
||||||
from Expire import strongly_expire
|
from Expire import strongly_expire
|
||||||
from Async import async
|
|
||||||
from view.Json import Json
|
from view.Json import Json
|
||||||
from view.Main_page import Main_page
|
from view.Main_page import Main_page
|
||||||
from view.Redeem_reset_note import Redeem_reset_note
|
from view.Redeem_reset_note import Redeem_reset_note
|
||||||
|
@ -123,12 +121,10 @@ class Users( object ):
|
||||||
"""
|
"""
|
||||||
Controller for dealing with users, corresponding to the "/users" URL.
|
Controller for dealing with users, corresponding to the "/users" URL.
|
||||||
"""
|
"""
|
||||||
def __init__( self, scheduler, database, http_url, https_url, support_email, rate_plans ):
|
def __init__( self, database, http_url, https_url, support_email, rate_plans ):
|
||||||
"""
|
"""
|
||||||
Create a new Users object.
|
Create a new Users object.
|
||||||
|
|
||||||
@type scheduler: controller.Scheduler
|
|
||||||
@param scheduler: scheduler to use for asynchronous calls
|
|
||||||
@type database: controller.Database
|
@type database: controller.Database
|
||||||
@param database: database that users are stored in
|
@param database: database that users are stored in
|
||||||
@type http_url: unicode
|
@type http_url: unicode
|
||||||
|
@ -142,7 +138,6 @@ class Users( object ):
|
||||||
@rtype: Users
|
@rtype: Users
|
||||||
@return: newly constructed Users
|
@return: newly constructed Users
|
||||||
"""
|
"""
|
||||||
self.__scheduler = scheduler
|
|
||||||
self.__database = database
|
self.__database = database
|
||||||
self.__http_url = http_url
|
self.__http_url = http_url
|
||||||
self.__https_url = https_url
|
self.__https_url = https_url
|
||||||
|
@@ -151,9 +146,6 @@ class Users( object ):
|
||||||
|
|
||||||
@expose( view = Json )
|
@expose( view = Json )
|
||||||
@update_auth
|
@update_auth
|
||||||
@wait_for_update
|
|
||||||
@async
|
|
||||||
@update_client
|
|
||||||
@validate(
|
@validate(
|
||||||
username = ( Valid_string( min = 1, max = 30 ), valid_username ),
|
username = ( Valid_string( min = 1, max = 30 ), valid_username ),
|
||||||
password = Valid_string( min = 1, max = 30 ),
|
password = Valid_string( min = 1, max = 30 ),
|
||||||
|
@@ -184,45 +176,39 @@ class Users( object ):
|
||||||
if password != password_repeat:
|
if password != password_repeat:
|
||||||
raise Signup_error( u"The passwords you entered do not match. Please try again." )
|
raise Signup_error( u"The passwords you entered do not match. Please try again." )
|
||||||
|
|
||||||
self.__database.load( "User %s" % username, self.__scheduler.thread )
|
user = self.__database.select_one( User, User.sql_load_by_username( username ) )
|
||||||
user = ( yield Scheduler.SLEEP )
|
|
||||||
|
|
||||||
if user is not None:
|
if user is not None:
|
||||||
raise Signup_error( u"Sorry, that username is not available. Please try something else." )
|
raise Signup_error( u"Sorry, that username is not available. Please try something else." )
|
||||||
|
|
||||||
# create a notebook for this user, along with a trash for that notebook
|
# create a notebook for this user, along with a trash for that notebook
|
||||||
self.__database.next_id( self.__scheduler.thread )
|
trash_id = self.__database.next_id( Notebook, commit = False )
|
||||||
trash_id = ( yield Scheduler.SLEEP )
|
trash = Notebook.create( trash_id, u"trash" )
|
||||||
trash = Notebook( trash_id, u"trash" )
|
self.__database.save( trash, commit = False )
|
||||||
|
|
||||||
self.__database.next_id( self.__scheduler.thread )
|
notebook_id = self.__database.next_id( Notebook, commit = False )
|
||||||
notebook_id = ( yield Scheduler.SLEEP )
|
notebook = Notebook.create( notebook_id, u"my notebook", trash_id )
|
||||||
notebook = Notebook( notebook_id, u"my notebook", trash )
|
self.__database.save( notebook, commit = False )
|
||||||
|
|
||||||
# create a startup note for this user's notebook
|
# create a startup note for this user's notebook
|
||||||
self.__database.next_id( self.__scheduler.thread )
|
note_id = self.__database.next_id( Note, commit = False )
|
||||||
note_id = ( yield Scheduler.SLEEP )
|
note_contents = file( u"static/html/welcome to your wiki.html" ).read()
|
||||||
note = Note( note_id, file( u"static/html/welcome to your wiki.html" ).read() )
|
note = Note.create( note_id, note_contents, notebook_id, startup = True, rank = 0 )
|
||||||
notebook.add_note( note )
|
self.__database.save( note, commit = False )
|
||||||
notebook.add_startup_note( note )
|
|
||||||
|
|
||||||
# actually create the new user
|
# actually create the new user
|
||||||
self.__database.next_id( self.__scheduler.thread )
|
user_id = self.__database.next_id( User, commit = False )
|
||||||
user_id = ( yield Scheduler.SLEEP )
|
user = User.create( user_id, username, password, email_address )
|
||||||
|
self.__database.save( user, commit = False )
|
||||||
|
|
||||||
user = User( user_id, username, password, email_address, notebooks = [ notebook ] )
|
# record the fact that the new user has access to their new notebook
|
||||||
self.__database.save( user )
|
self.__database.execute( user.sql_save_notebook( notebook_id, read_write = True ), commit = False )
|
||||||
|
self.__database.execute( user.sql_save_notebook( trash_id, read_write = True ), commit = False )
|
||||||
# add the new user to the user list
|
self.__database.commit()
|
||||||
self.__database.load( u"User_list all", self.scheduler.thread )
|
|
||||||
user_list = ( yield Scheduler.SLEEP )
|
|
||||||
if user_list:
|
|
||||||
user_list.add_user( user )
|
|
||||||
self.__database.save( user_list )
|
|
||||||
|
|
||||||
redirect = u"/notebooks/%s" % notebook.object_id
|
redirect = u"/notebooks/%s" % notebook.object_id
|
||||||
|
|
||||||
yield dict(
|
return dict(
|
||||||
redirect = redirect,
|
redirect = redirect,
|
||||||
authenticated = user,
|
authenticated = user,
|
||||||
)
|
)
|
||||||
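The converted signup path above queues every `save()` and `execute()` with `commit = False` and then issues a single `commit()`, so the user, both notebooks, and the access rows become visible atomically. A minimal sketch of that deferred-commit pattern, using a hypothetical in-memory stand-in rather than the real `controller.Database` API:

```python
class FakeDatabase( object ):
    """Hypothetical stand-in mimicking the commit = False pattern above."""
    def __init__( self ):
        self.pending = []    # objects saved but not yet committed
        self.committed = []  # objects visible after commit()

    def save( self, obj, commit = True ):
        self.pending.append( obj )
        if commit:
            self.commit()

    def commit( self ):
        # flush all pending saves in one step, like a single transaction
        self.committed.extend( self.pending )
        self.pending = []

db = FakeDatabase()
db.save( u"trash notebook", commit = False )
db.save( u"main notebook", commit = False )
db.save( u"user", commit = False )
db.commit()  # everything lands together, or not at all
```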
|
@@ -230,9 +216,6 @@ class Users( object ):
|
||||||
@expose()
|
@expose()
|
||||||
@grab_user_id
|
@grab_user_id
|
||||||
@update_auth
|
@update_auth
|
||||||
@wait_for_update
|
|
||||||
@async
|
|
||||||
@update_client
|
|
||||||
def demo( self, user_id = None ):
|
def demo( self, user_id = None ):
|
||||||
"""
|
"""
|
||||||
Create a new guest User for purposes of the demo. Start that user with their own Notebook and
|
Create a new guest User for purposes of the demo. Start that user with their own Notebook and
|
||||||
|
@@ -250,54 +233,51 @@ class Users( object ):
|
||||||
# if the user is already logged in as a guest, then just redirect to their existing demo
|
# if the user is already logged in as a guest, then just redirect to their existing demo
|
||||||
# notebook
|
# notebook
|
||||||
if user_id:
|
if user_id:
|
||||||
self.__database.load( user_id, self.__scheduler.thread )
|
user = self.__database.load( User, user_id )
|
||||||
user = ( yield Scheduler.SLEEP )
|
first_notebook = self.__database.select_one( Notebook, user.sql_load_notebooks( parents_only = True ) )
|
||||||
if user.username is None and len( user.notebooks ) > 0:
|
if user.username is None and first_notebook:
|
||||||
redirect = u"/notebooks/%s" % user.notebooks[ 0 ].object_id
|
redirect = u"/notebooks/%s" % first_notebook.object_id
|
||||||
yield dict( redirect = redirect )
|
return dict( redirect = redirect )
|
||||||
return
|
|
||||||
|
|
||||||
# create a demo notebook for this user, along with a trash for that notebook
|
# create a demo notebook for this user, along with a trash for that notebook
|
||||||
self.__database.next_id( self.__scheduler.thread )
|
trash_id = self.__database.next_id( Notebook, commit = False )
|
||||||
trash_id = ( yield Scheduler.SLEEP )
|
trash = Notebook.create( trash_id, u"trash" )
|
||||||
trash = Notebook( trash_id, u"trash" )
|
self.__database.save( trash, commit = False )
|
||||||
|
|
||||||
self.__database.next_id( self.__scheduler.thread )
|
notebook_id = self.__database.next_id( Notebook, commit = False )
|
||||||
notebook_id = ( yield Scheduler.SLEEP )
|
notebook = Notebook.create( notebook_id, u"my notebook", trash_id )
|
||||||
notebook = Notebook( notebook_id, u"my notebook", trash )
|
self.__database.save( notebook, commit = False )
|
||||||
|
|
||||||
# create startup notes for this user's notebook
|
# create startup notes for this user's notebook
|
||||||
self.__database.next_id( self.__scheduler.thread )
|
note_id = self.__database.next_id( Note, commit = False )
|
||||||
note_id = ( yield Scheduler.SLEEP )
|
note_contents = file( u"static/html/this is a demo.html" ).read()
|
||||||
note = Note( note_id, file( u"static/html/this is a demo.html" ).read() )
|
note = Note.create( note_id, note_contents, notebook_id, startup = True, rank = 0 )
|
||||||
notebook.add_note( note )
|
self.__database.save( note, commit = False )
|
||||||
notebook.add_startup_note( note )
|
|
||||||
|
|
||||||
self.__database.next_id( self.__scheduler.thread )
|
note_id = self.__database.next_id( Note, commit = False )
|
||||||
note_id = ( yield Scheduler.SLEEP )
|
note_contents = file( u"static/html/welcome to your wiki.html" ).read()
|
||||||
note = Note( note_id, file( u"static/html/welcome to your wiki.html" ).read() )
|
note = Note.create( note_id, note_contents, notebook_id, startup = True, rank = 1 )
|
||||||
notebook.add_note( note )
|
self.__database.save( note, commit = False )
|
||||||
notebook.add_startup_note( note )
|
|
||||||
|
|
||||||
# actually create the new user. because this is just a demo user, we're not adding it to the User_list
|
# actually create the new user
|
||||||
self.__database.next_id( self.__scheduler.thread )
|
user_id = self.__database.next_id( User, commit = False )
|
||||||
user_id = ( yield Scheduler.SLEEP )
|
user = User.create( user_id, username = None, password = None, email_address = None )
|
||||||
|
self.__database.save( user, commit = False )
|
||||||
|
|
||||||
user = User( user_id, username = None, password = None, email_address = None, notebooks = [ notebook ] )
|
# record the fact that the new user has access to their new notebook
|
||||||
self.__database.save( user )
|
self.__database.execute( user.sql_save_notebook( notebook_id, read_write = True ), commit = False )
|
||||||
|
self.__database.execute( user.sql_save_notebook( trash_id, read_write = True ), commit = False )
|
||||||
|
self.__database.commit()
|
||||||
|
|
||||||
redirect = u"/notebooks/%s" % notebook.object_id
|
redirect = u"/notebooks/%s" % notebook.object_id
|
||||||
|
|
||||||
yield dict(
|
return dict(
|
||||||
redirect = redirect,
|
redirect = redirect,
|
||||||
authenticated = user,
|
authenticated = user,
|
||||||
)
|
)
|
||||||
|
|
||||||
@expose( view = Json )
|
@expose( view = Json )
|
||||||
@update_auth
|
@update_auth
|
||||||
@wait_for_update
|
|
||||||
@async
|
|
||||||
@update_client
|
|
||||||
@validate(
|
@validate(
|
||||||
username = ( Valid_string( min = 1, max = 30 ), valid_username ),
|
username = ( Valid_string( min = 1, max = 30 ), valid_username ),
|
||||||
password = Valid_string( min = 1, max = 30 ),
|
password = Valid_string( min = 1, max = 30 ),
|
||||||
|
@@ -317,28 +297,26 @@ class Users( object ):
|
||||||
@raise Authentication_error: invalid username or password
|
@raise Authentication_error: invalid username or password
|
||||||
@raise Validation_error: one of the arguments is invalid
|
@raise Validation_error: one of the arguments is invalid
|
||||||
"""
|
"""
|
||||||
self.__database.load( "User %s" % username, self.__scheduler.thread )
|
user = self.__database.select_one( User, User.sql_load_by_username( username ) )
|
||||||
user = ( yield Scheduler.SLEEP )
|
|
||||||
|
|
||||||
if user is None or user.check_password( password ) is False:
|
if user is None or user.check_password( password ) is False:
|
||||||
raise Authentication_error( u"Invalid username or password." )
|
raise Authentication_error( u"Invalid username or password." )
|
||||||
|
|
||||||
|
first_notebook = self.__database.select_one( Notebook, user.sql_load_notebooks( parents_only = True ) )
|
||||||
|
|
||||||
# redirect to the user's first notebook (if any)
|
# redirect to the user's first notebook (if any)
|
||||||
if len( user.notebooks ) > 0:
|
if first_notebook:
|
||||||
redirect = u"/notebooks/%s" % user.notebooks[ 0 ].object_id
|
redirect = u"/notebooks/%s" % first_notebook.object_id
|
||||||
else:
|
else:
|
||||||
redirect = u"/"
|
redirect = u"/"
|
||||||
|
|
||||||
yield dict(
|
return dict(
|
||||||
redirect = redirect,
|
redirect = redirect,
|
||||||
authenticated = user,
|
authenticated = user,
|
||||||
)
|
)
|
||||||
|
|
||||||
@expose( view = Json )
|
@expose( view = Json )
|
||||||
@update_auth
|
@update_auth
|
||||||
@wait_for_update
|
|
||||||
@async
|
|
||||||
@update_client
|
|
||||||
def logout( self ):
|
def logout( self ):
|
||||||
"""
|
"""
|
||||||
Deauthenticate the user and log them out of their current session.
|
Deauthenticate the user and log them out of their current session.
|
||||||
|
@@ -346,7 +324,7 @@ class Users( object ):
|
||||||
@rtype: json dict
|
@rtype: json dict
|
||||||
@return: { 'redirect': url, 'deauthenticated': True }
|
@return: { 'redirect': url, 'deauthenticated': True }
|
||||||
"""
|
"""
|
||||||
yield dict(
|
return dict(
|
||||||
redirect = self.__http_url + u"/",
|
redirect = self.__http_url + u"/",
|
||||||
deauthenticated = True,
|
deauthenticated = True,
|
||||||
)
|
)
|
||||||
|
@@ -354,9 +332,6 @@ class Users( object ):
|
||||||
@expose( view = Json )
|
@expose( view = Json )
|
||||||
@strongly_expire
|
@strongly_expire
|
||||||
@grab_user_id
|
@grab_user_id
|
||||||
@wait_for_update
|
|
||||||
@async
|
|
||||||
@update_client
|
|
||||||
@validate(
|
@validate(
|
||||||
include_startup_notes = Valid_bool(),
|
include_startup_notes = Valid_bool(),
|
||||||
user_id = Valid_id( none_okay = True ),
|
user_id = Valid_id( none_okay = True ),
|
||||||
|
@@ -382,38 +357,42 @@ class Users( object ):
|
||||||
@raise Validation_error: one of the arguments is invalid
|
@raise Validation_error: one of the arguments is invalid
|
||||||
"""
|
"""
|
||||||
# if there's no logged-in user, default to the anonymous user
|
# if there's no logged-in user, default to the anonymous user
|
||||||
self.__database.load( user_id or u"User anonymous", self.__scheduler.thread )
|
anonymous = self.__database.select_one( User, User.sql_load_by_username( u"anonymous" ) )
|
||||||
user = ( yield Scheduler.SLEEP )
|
if user_id:
|
||||||
|
user = self.__database.load( User, user_id )
|
||||||
|
else:
|
||||||
|
user = anonymous
|
||||||
|
|
||||||
if not user:
|
if not user or not anonymous:
|
||||||
yield dict(
|
return dict(
|
||||||
user = None,
|
user = None,
|
||||||
notebooks = None,
|
notebooks = None,
|
||||||
http_url = u"",
|
http_url = u"",
|
||||||
)
|
)
|
||||||
return
|
|
||||||
|
|
||||||
# in addition to this user's own notebooks, add to that list the anonymous user's notebooks
|
# in addition to this user's own notebooks, add to that list the anonymous user's notebooks
|
||||||
self.__database.load( u"User anonymous", self.__scheduler.thread )
|
|
||||||
anonymous = ( yield Scheduler.SLEEP )
|
|
||||||
login_url = None
|
login_url = None
|
||||||
|
notebooks = self.__database.select_many( Notebook, anonymous.sql_load_notebooks() )
|
||||||
|
|
||||||
if user_id:
|
if user_id:
|
||||||
notebooks = anonymous.notebooks
|
notebooks += self.__database.select_many( Notebook, user.sql_load_notebooks() )
|
||||||
|
# if the user is not logged in, return a login URL
|
||||||
else:
|
else:
|
||||||
notebooks = []
|
if len( notebooks ) > 0:
|
||||||
if len( anonymous.notebooks ) > 0:
|
main_notebook = notebooks[ 0 ]
|
||||||
anon_notebook = anonymous.notebooks[ 0 ]
|
login_note = self.__database.select_one( Note, main_notebook.sql_load_note_by_title( u"login" ) )
|
||||||
login_note = anon_notebook.lookup_note_by_title( u"login" )
|
|
||||||
if login_note:
|
if login_note:
|
||||||
login_url = "%s/notebooks/%s?note_id=%s" % ( self.__https_url, anon_notebook.object_id, login_note.object_id )
|
login_url = "%s/notebooks/%s?note_id=%s" % ( self.__https_url, main_notebook.object_id, login_note.object_id )
|
||||||
|
|
||||||
notebooks += user.notebooks
|
if include_startup_notes and len( notebooks ) > 0:
|
||||||
|
startup_notes = self.__database.select_many( Note, notebooks[ 0 ].sql_load_startup_notes() )
|
||||||
|
else:
|
||||||
|
startup_notes = []
|
||||||
|
|
||||||
yield dict(
|
return dict(
|
||||||
user = user,
|
user = user,
|
||||||
notebooks = notebooks,
|
notebooks = notebooks,
|
||||||
startup_notes = include_startup_notes and len( notebooks ) > 0 and notebooks[ 0 ].startup_notes or [],
|
startup_notes = startup_notes,
|
||||||
http_url = self.__http_url,
|
http_url = self.__http_url,
|
||||||
login_url = login_url,
|
login_url = login_url,
|
||||||
rate_plan = ( user.rate_plan < len( self.__rate_plans ) ) and self.__rate_plans[ user.rate_plan ] or {},
|
rate_plan = ( user.rate_plan < len( self.__rate_plans ) ) and self.__rate_plans[ user.rate_plan ] or {},
|
||||||
|
@@ -421,54 +400,64 @@ class Users( object ):
|
||||||
|
|
||||||
def calculate_storage( self, user ):
|
def calculate_storage( self, user ):
|
||||||
"""
|
"""
|
||||||
Calculate total storage utilization for all notebooks and all notes of the given user,
|
Calculate total storage utilization for all notes of the given user, including storage for all
|
||||||
including storage for all past revisions.
|
past revisions.
|
||||||
|
|
||||||
@type user: User
|
@type user: User
|
||||||
@param user: user for which to calculate storage utilization
|
@param user: user for which to calculate storage utilization
|
||||||
@rtype: int
|
@rtype: int
|
||||||
@return: total bytes used for storage
|
@return: total bytes used for storage
|
||||||
"""
|
"""
|
||||||
total_bytes = 0
|
return sum( self.__database.select_one( tuple, user.sql_calculate_storage() ), 0 )
|
||||||
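The new `calculate_storage()` pushes the per-revision size arithmetic into SQL and just totals the single returned row. A sketch of why `sum( row, 0 )` works on a result tuple, with hypothetical byte counts standing in for what `sql_calculate_storage()` returns:

```python
# select_one( tuple, ... ) yields one row of byte counts,
# e.g. ( current_note_bytes, revision_bytes )
row = ( 4096, 1536 )

# sum() accepts any iterable, so the tuple can be totaled directly;
# the explicit 0 start mirrors the original call (sum() defaults to 0 anyway)
total_bytes = sum( row, 0 )
```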
|
|
||||||
def sum_revisions( obj ):
|
def update_storage( self, user_id, commit = True ):
|
||||||
return \
|
|
||||||
self.__database.size( obj.object_id ) + \
|
|
||||||
sum( [ self.__database.size( obj.object_id, revision ) or 0 for revision in obj.revisions_list ], 0 )
|
|
||||||
|
|
||||||
def sum_notebook( notebook ):
|
|
||||||
return \
|
|
||||||
self.__database.size( notebook.object_id ) + \
|
|
||||||
sum( [ sum_revisions( note ) for note in notebook.notes ], 0 )
|
|
||||||
|
|
||||||
for notebook in user.notebooks:
|
|
||||||
total_bytes += sum_notebook( notebook )
|
|
||||||
|
|
||||||
if notebook.trash:
|
|
||||||
total_bytes += sum_notebook( notebook.trash )
|
|
||||||
|
|
||||||
return total_bytes
|
|
||||||
|
|
||||||
@async
|
|
||||||
def update_storage( self, user_id, callback = None ):
|
|
||||||
"""
|
"""
|
||||||
Calculate and record total storage utilization for the given user.
|
Calculate and record total storage utilization for the given user.
|
||||||
@type user_id: unicode or NoneType
|
|
||||||
|
@type user_id: unicode
|
||||||
@param user_id: id of user for which to calculate storage utilization
|
@param user_id: id of user for which to calculate storage utilization
|
||||||
@type callback: generator or NoneType
|
@type commit: bool
|
||||||
@param callback: generator to wakeup when the update is complete (optional)
|
@param commit: True to automatically commit after the update
|
||||||
|
@rtype: model.User
|
||||||
|
@return: object of the user corresponding to user_id
|
||||||
"""
|
"""
|
||||||
self.__database.load( user_id, self.__scheduler.thread )
|
user = self.__database.load( User, user_id )
|
||||||
user = ( yield Scheduler.SLEEP )
|
|
||||||
|
|
||||||
if user:
|
if user:
|
||||||
user.storage_bytes = self.calculate_storage( user )
|
user.storage_bytes = self.calculate_storage( user )
|
||||||
|
self.__database.save( user, commit )
|
||||||
|
|
||||||
yield callback, user
|
return user
|
||||||
|
|
||||||
|
def check_access( self, user_id, notebook_id, read_write = False ):
|
||||||
|
"""
|
||||||
|
Determine whether the given user has access to the given notebook.
|
||||||
|
|
||||||
|
@type user_id: unicode
|
||||||
|
@param user_id: id of user whose access to check
|
||||||
|
@type notebook_id: unicode
|
||||||
|
@param notebook_id: id of notebook to check access for
|
||||||
|
@type read_write: bool
|
||||||
|
@param read_write: True if read-write access is being checked, False if read-only access (defaults to False)
|
||||||
|
@rtype: bool
|
||||||
|
@return: True if the user has access
|
||||||
|
"""
|
||||||
|
# check if the anonymous user has access to this notebook
|
||||||
|
anonymous = self.__database.select_one( User, User.sql_load_by_username( u"anonymous" ) )
|
||||||
|
|
||||||
|
if self.__database.select_one( bool, anonymous.sql_has_access( notebook_id, read_write ) ):
|
||||||
|
return True
|
||||||
|
|
||||||
|
if user_id:
|
||||||
|
# check if the given user has access to this notebook
|
||||||
|
user = self.__database.load( User, user_id )
|
||||||
|
|
||||||
|
if user and self.__database.select_one( bool, user.sql_has_access( notebook_id ) ):
|
||||||
|
return True
|
||||||
|
|
||||||
|
return False
|
||||||
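The new `check_access()` falls through from the anonymous user's grants to the logged-in user's own grants. A self-contained sketch of that two-tier check, using a plain dict in place of the `sql_has_access()` queries (the names here are illustrative, not the real schema):

```python
# hypothetical grant table: ( user_id, notebook_id ) -> read_write flag
grants = {
    ( u"anonymous", u"public_nb" ): False,  # anonymous: read-only access
    ( u"alice", u"alice_nb" ): True,        # alice: read-write access
}

def check_access( user_id, notebook_id, read_write = False ):
    # anything the anonymous user can reach is reachable by everyone
    anon = grants.get( ( u"anonymous", notebook_id ) )
    if anon is not None and ( anon or not read_write ):
        return True

    # otherwise fall back to the specific user's own grant
    if user_id:
        flag = grants.get( ( user_id, notebook_id ) )
        if flag is not None and ( flag or not read_write ):
            return True

    return False
```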
|
|
||||||
@expose( view = Json )
|
@expose( view = Json )
|
||||||
@wait_for_update
|
|
||||||
@async
|
|
||||||
@update_client
|
|
||||||
@validate(
|
@validate(
|
||||||
email_address = ( Valid_string( min = 1, max = 60 ), valid_email_address ),
|
email_address = ( Valid_string( min = 1, max = 60 ), valid_email_address ),
|
||||||
send_reset_button = unicode,
|
send_reset_button = unicode,
|
||||||
|
@@ -491,19 +480,13 @@ class Users( object ):
|
||||||
from email import Message
|
from email import Message
|
||||||
|
|
||||||
# check whether there are actually any users with the given email address
|
# check whether there are actually any users with the given email address
|
||||||
self.__database.load( u"User_list all", self.scheduler.thread )
|
users = self.__database.select_many( User, User.sql_load_by_email_address( email_address ) )
|
||||||
user_list = ( yield Scheduler.SLEEP )
|
|
||||||
|
|
||||||
if not user_list:
|
|
||||||
raise Password_reset_error( "There was an error when sending your password reset email. Please contact %s." % self.__support_email )
|
|
||||||
|
|
||||||
users = [ user for user in user_list.users if user.email_address == email_address ]
|
|
||||||
if len( users ) == 0:
|
if len( users ) == 0:
|
||||||
raise Password_reset_error( u"There are no Luminotes users with the email address %s" % email_address )
|
raise Password_reset_error( u"There are no Luminotes users with the email address %s" % email_address )
|
||||||
|
|
||||||
# record the sending of this reset email
|
# record the sending of this reset email
|
||||||
self.__database.next_id( self.__scheduler.thread )
|
password_reset_id = self.__database.next_id( Password_reset, commit = False )
|
||||||
password_reset_id = ( yield Scheduler.SLEEP )
|
|
||||||
password_reset = Password_reset( password_reset_id, email_address )
|
password_reset = Password_reset( password_reset_id, email_address )
|
||||||
self.__database.save( password_reset )
|
self.__database.save( password_reset )
|
||||||
|
|
||||||
|
@@ -527,15 +510,12 @@ class Users( object ):
|
||||||
server.sendmail( message[ u"from" ], [ email_address ], message.as_string() )
|
server.sendmail( message[ u"from" ], [ email_address ], message.as_string() )
|
||||||
server.quit()
|
server.quit()
|
||||||
|
|
||||||
yield dict(
|
return dict(
|
||||||
message = u"Please check your inbox. A password reset email has been sent to %s" % email_address,
|
message = u"Please check your inbox. A password reset email has been sent to %s" % email_address,
|
||||||
)
|
)
|
||||||
|
|
||||||
@expose( view = Main_page )
|
@expose( view = Main_page )
|
||||||
@strongly_expire
|
@strongly_expire
|
||||||
@wait_for_update
|
|
||||||
@async
|
|
||||||
@update_client
|
|
||||||
@validate(
|
@validate(
|
||||||
password_reset_id = Valid_id(),
|
password_reset_id = Valid_id(),
|
||||||
)
|
)
|
||||||
|
@@ -550,43 +530,34 @@ class Users( object ):
|
||||||
@raise Password_reset_error: an error occurred when redeeming the password reset, such as an expired link
|
@raise Password_reset_error: an error occurred when redeeming the password reset, such as an expired link
|
||||||
@raise Validation_error: one of the arguments is invalid
|
@raise Validation_error: one of the arguments is invalid
|
||||||
"""
|
"""
|
||||||
self.__database.load( u"User anonymous", self.__scheduler.thread )
|
anonymous = self.__database.select_one( User, User.sql_load_by_username( u"anonymous" ) )
|
||||||
anonymous = ( yield Scheduler.SLEEP )
|
if anonymous:
|
||||||
|
main_notebook = self.__database.select_one( Notebook, anonymous.sql_load_notebooks() )
|
||||||
|
|
||||||
if not anonymous or len( anonymous.notebooks ) == 0:
|
if not anonymous or not main_notebook:
|
||||||
raise Password_reset_error( "There was an error when completing your password reset. Please contact %s." % self.__support_email )
|
raise Password_reset_error( "There was an error when completing your password reset. Please contact %s." % self.__support_email )
|
||||||
|
|
||||||
self.__database.load( password_reset_id, self.__scheduler.thread )
|
password_reset = self.__database.load( Password_reset, password_reset_id )
|
||||||
password_reset = ( yield Scheduler.SLEEP )
|
|
||||||
|
|
||||||
if not password_reset or datetime.now() - password_reset.revision > timedelta( hours = 25 ):
|
if not password_reset or datetime.now( tz = utc ) - password_reset.revision > timedelta( hours = 25 ):
|
||||||
raise Password_reset_error( "Your password reset link has expired. Please request a new password reset email." )
|
raise Password_reset_error( "Your password reset link has expired. Please request a new password reset email." )
|
||||||
|
|
||||||
if password_reset.redeemed:
|
if password_reset.redeemed:
|
||||||
raise Password_reset_error( "Your password has already been reset. Please request a new password reset email." )
|
raise Password_reset_error( "Your password has already been reset. Please request a new password reset email." )
|
||||||
|
|
||||||
self.__database.load( u"User_list all", self.__scheduler.thread )
|
|
||||||
user_list = ( yield Scheduler.SLEEP )
|
|
||||||
|
|
||||||
if not user_list:
|
|
||||||
raise Password_reset_error( u"There are no Luminotes users with the email address %s" % password_reset.email_address )
|
|
||||||
|
|
||||||
# find the user(s) with the email address from the password reset request
|
# find the user(s) with the email address from the password reset request
|
||||||
matching_users = [ user for user in user_list.users if user.email_address == password_reset.email_address ]
|
matching_users = self.__database.select_many( User, User.sql_load_by_email_address( password_reset.email_address ) )
|
||||||
|
|
||||||
if len( matching_users ) == 0:
|
if len( matching_users ) == 0:
|
||||||
raise Password_reset_error( u"There are no Luminotes users with the email address %s" % password_reset.email_address )
|
raise Password_reset_error( u"There are no Luminotes users with the email address %s" % password_reset.email_address )
|
||||||
|
|
||||||
yield dict(
|
return dict(
|
||||||
notebook_id = anonymous.notebooks[ 0 ].object_id,
|
notebook_id = main_notebook.object_id,
|
||||||
note_id = u"blank",
|
note_id = u"blank",
|
||||||
note_contents = unicode( Redeem_reset_note( password_reset_id, matching_users ) ),
|
note_contents = unicode( Redeem_reset_note( password_reset_id, matching_users ) ),
|
||||||
)
|
)
|
||||||
|
|
||||||
@expose( view = Json )
|
@expose( view = Json )
|
||||||
@wait_for_update
|
|
||||||
@async
|
|
||||||
@update_client
|
|
||||||
def reset_password( self, password_reset_id, reset_button, **new_passwords ):
|
def reset_password( self, password_reset_id, reset_button, **new_passwords ):
|
||||||
"""
|
"""
|
||||||
Reset the passwords of the given users to the provided new passwords.
|
Reset the passwords of the given users to the provided new passwords.
|
||||||
|
@@ -606,27 +577,19 @@ class Users( object ):
|
||||||
except ValueError:
|
except ValueError:
|
||||||
raise Validation_error( "password_reset_id", password_reset_id, id_validator, "is not a valid id" )
|
raise Validation_error( "password_reset_id", password_reset_id, id_validator, "is not a valid id" )
|
||||||
|
|
||||||
self.__database.load( password_reset_id, self.__scheduler.thread )
|
password_reset = self.__database.load( Password_reset, password_reset_id )
|
||||||
password_reset = ( yield Scheduler.SLEEP )
|
|
||||||
|
|
||||||
if not password_reset or datetime.now() - password_reset.revision > timedelta( hours = 25 ):
|
if not password_reset or datetime.now( tz = utc ) - password_reset.revision > timedelta( hours = 25 ):
|
||||||
raise Password_reset_error( "Your password reset link has expired. Please request a new password reset email." )
|
raise Password_reset_error( "Your password reset link has expired. Please request a new password reset email." )
|
||||||
|
|
||||||
if password_reset.redeemed:
|
if password_reset.redeemed:
|
||||||
raise Password_reset_error( "Your password has already been reset. Please request a new password reset email." )
|
raise Password_reset_error( "Your password has already been reset. Please request a new password reset email." )
|
||||||
|
|
||||||
self.__database.load( u"User_list all", self.__scheduler.thread )
|
matching_users = self.__database.select_many( User, User.sql_load_by_email_address( password_reset.email_address ) )
|
||||||
user_list = ( yield Scheduler.SLEEP )
|
|
||||||
|
|
||||||
if not user_list:
|
|
||||||
raise Password_reset_error( "There was an error when resetting your password. Please contact %s." % self.__support_email )
|
|
||||||
|
|
||||||
# find the user(s) with the email address from the password reset request
|
|
||||||
matching_users = [ user for user in user_list.users if user.email_address == password_reset.email_address ]
|
|
||||||
allowed_user_ids = [ user.object_id for user in matching_users ]
|
allowed_user_ids = [ user.object_id for user in matching_users ]
|
||||||
|
|
||||||
# reset any passwords that are non-blank
|
# reset any passwords that are non-blank
|
||||||
users_to_reset = []
|
at_least_one_reset = False
|
||||||
for ( user_id, ( new_password, new_password_repeat ) ) in new_passwords.items():
|
for ( user_id, ( new_password, new_password_repeat ) ) in new_passwords.items():
|
||||||
if user_id not in allowed_user_ids:
|
if user_id not in allowed_user_ids:
|
||||||
raise Password_reset_error( "There was an error when resetting your password. Please contact %s." % self.__support_email )
|
raise Password_reset_error( "There was an error when resetting your password. Please contact %s." % self.__support_email )
|
||||||
|
@@ -635,8 +598,7 @@ class Users( object ):
|
||||||
if new_password == u"" and new_password_repeat == u"":
|
if new_password == u"" and new_password_repeat == u"":
|
||||||
continue
|
continue
|
||||||
|
|
||||||
self.__database.load( user_id, self.__scheduler.thread )
|
user = self.__database.load( User, user_id )
|
||||||
user = ( yield Scheduler.SLEEP )
|
|
||||||
|
|
||||||
if not user:
|
if not user:
|
||||||
raise Password_reset_error( "There was an error when resetting your password. Please contact %s." % self.__support_email )
|
raise Password_reset_error( "There was an error when resetting your password. Please contact %s." % self.__support_email )
|
||||||
|
@@ -649,19 +611,16 @@ class Users( object ):
|
||||||
if len( new_password ) > 30:
|
if len( new_password ) > 30:
|
||||||
raise Password_reset_error( u"Your password can be no longer than 30 characters." )
|
raise Password_reset_error( u"Your password can be no longer than 30 characters." )
|
||||||
|
|
||||||
users_to_reset.append( ( user, new_password ) )
|
at_least_one_reset = True
|
||||||
|
|
||||||
for ( user, new_password ) in users_to_reset:
|
|
||||||
user.password = new_password
|
user.password = new_password
|
||||||
self.__database.save( user )
|
self.__database.save( user, commit = False )
|
||||||
|
|
||||||
# if all the new passwords provided are blank, bail
|
# if all the new passwords provided are blank, bail
|
||||||
if not users_to_reset:
|
if not at_least_one_reset:
|
||||||
raise Password_reset_error( u"Please enter a new password. Or, if you already know your password, just click the login link above." )
|
raise Password_reset_error( u"Please enter a new password. Or, if you already know your password, just click the login link above." )
|
||||||
|
|
||||||
password_reset.redeemed = True
|
password_reset.redeemed = True
|
||||||
self.__database.save( password_reset )
|
self.__database.save( password_reset, commit = False )
|
||||||
|
self.__database.commit()
|
||||||
|
|
||||||
yield dict( redirect = u"/" )
|
return dict( redirect = u"/" )
|
||||||
|
|
||||||
scheduler = property( lambda self: self.__scheduler )
|
|
||||||
|
|
|
@@ -32,7 +32,7 @@ class Validation_error( Exception ):
|
||||||
|
|
||||||
def to_dict( self ):
|
def to_dict( self ):
|
||||||
return dict(
|
return dict(
|
||||||
error = u"The %s %s." % ( self.__name, self.__message ),
|
error = u"The %s %s." % ( self.__name.replace( u"_", " " ), self.__message ),
|
||||||
name = self.__name,
|
name = self.__name,
|
||||||
value = self.__value,
|
value = self.__value,
|
||||||
)
|
)
|
||||||
|
|
|
@@ -0,0 +1,72 @@
|
||||||
|
from copy import copy
|
||||||
|
|
||||||
|
|
||||||
|
class Stub_database( object ):
|
||||||
|
def __init__( self, connection = None ):
|
||||||
|
# map of object id to list of saved objects (presumably in increasing order of revisions)
|
||||||
|
self.objects = {}
|
||||||
|
self.user_notebook = {} # map of user_id to ( notebook_id, read_write )
|
||||||
|
self.__next_id = 0
|
||||||
|
|
||||||
|
def save( self, obj, commit = False ):
|
||||||
|
if obj.object_id in self.objects:
|
||||||
|
self.objects[ obj.object_id ].append( copy( obj ) )
|
||||||
|
else:
|
||||||
|
self.objects[ obj.object_id ] = [ copy( obj ) ]
|
||||||
|
|
||||||
|
def load( self, Object_type, object_id, revision = None ):
|
||||||
|
obj_list = self.objects.get( object_id )
|
||||||
|
|
||||||
|
if not obj_list:
|
||||||
|
return None
|
||||||
|
|
||||||
|
# if a particular revision wasn't requested, just return the most recently saved object
|
||||||
|
# matching the given object_id
|
||||||
|
if revision is None:
|
||||||
|
if not isinstance( obj_list[ -1 ], Object_type ):
|
||||||
|
return None
|
||||||
|
return copy( obj_list[ -1 ] )
|
||||||
|
|
||||||
|
# a particular revision was requested, so pick it out of the objects matching the given id
|
||||||
|
matching_objs = [ obj for obj in obj_list if str( obj.revision ) == str( revision ) ]
|
||||||
|
if len( matching_objs ) > 0:
|
||||||
|
if not isinstance( matching_objs[ -1 ], Object_type ):
|
||||||
|
return None
|
||||||
|
return copy( matching_objs[ -1 ] )
|
||||||
|
|
||||||
|
return None
|
||||||
|
|
||||||
|
def select_one( self, Object_type, sql_command ):
|
||||||
|
if callable( sql_command ):
|
||||||
|
result = sql_command( self )
|
||||||
|
if isinstance( result, list ):
|
||||||
|
if len( result ) == 0: return None
|
||||||
|
return result[ 0 ]
|
||||||
|
return result
|
||||||
|
|
||||||
|
raise NotImplementedError( sql_command )
|
||||||
|
|
||||||
|
def select_many( self, Object_type, sql_command ):
|
||||||
|
if callable( sql_command ):
|
||||||
|
result = sql_command( self )
|
||||||
|
if isinstance( result, list ):
|
||||||
|
return result
|
||||||
|
return [ result ]
|
||||||
|
|
||||||
|
raise NotImplementedError( sql_command )
|
||||||
|
|
||||||
|
def execute( self, sql_command, commit = False ):
|
||||||
|
if callable( sql_command ):
|
||||||
|
return sql_command( self )
|
||||||
|
|
||||||
|
raise NotImplementedError( sql_command )
|
||||||
|
|
||||||
|
def next_id( self, Object_type, commit = True ):
|
||||||
|
self.__next_id += 1
|
||||||
|
return unicode( self.__next_id )
|
||||||
|
|
||||||
|
def commit( self ):
|
||||||
|
pass
|
||||||
|
|
||||||
|
def close( self ):
|
||||||
|
pass
|
|
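The `Stub_database` above stands in for the real database by keeping every saved copy of an object in a per-id list, so `load()` can serve either the latest copy or a specific revision. A condensed, self-contained sketch of that idea (the `Note` class and its fields here are illustrative, not the Luminotes model classes):

```python
from copy import copy

# Condensed version of the multi-revision stub: each save() appends a copy,
# and load() returns either the newest copy or one matching a given revision.
class Stub_database( object ):
  def __init__( self ):
    self.objects = {}

  def save( self, obj ):
    self.objects.setdefault( obj.object_id, [] ).append( copy( obj ) )

  def load( self, Object_type, object_id, revision = None ):
    obj_list = self.objects.get( object_id )
    if not obj_list:
      return None
    if revision is None:
      return copy( obj_list[ -1 ] )  # most recently saved copy
    matching = [ obj for obj in obj_list if obj.revision == revision ]
    return copy( matching[ -1 ] ) if matching else None

class Note( object ):  # illustrative stand-in for a Persistent model object
  def __init__( self, object_id, revision, contents ):
    self.object_id = object_id
    self.revision = revision
    self.contents = contents

database = Stub_database()
database.save( Note( "n1", 1, "first" ) )
database.save( Note( "n1", 2, "second" ) )

print( database.load( Note, "n1" ).contents )                 # latest revision
print( database.load( Note, "n1", revision = 1 ).contents )   # specific revision
```

Copying on both save and load keeps the stub honest: later mutations of a caller's object can't silently alter what was "persisted".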
@@ -0,0 +1,79 @@
+from datetime import datetime
+from new_model.Persistent import Persistent, quote
+
+
+def notz_quote( value ):
+  """
+  Apparently, pysqlite2 chokes on timestamps that have a timezone when reading them out of the
+  database, so for purposes of the unit tests, strip off the timezone on all datetime objects.
+  """
+  if isinstance( value, datetime ):
+    value = value.replace( tzinfo = None )
+
+  return quote( value )
+
+
+class Stub_object( Persistent ):
+  def __init__( self, object_id, revision = None, value = None, value2 = None ):
+    Persistent.__init__( self, object_id, revision )
+    self.__value = value
+    self.__value2 = value2
+
+  @staticmethod
+  def sql_load( object_id, revision = None ):
+    if revision:
+      return "select * from stub_object where id = %s and revision = %s;" % ( quote( object_id ), notz_quote( revision ) )
+
+    return "select * from stub_object where id = %s order by revision desc limit 1;" % quote( object_id )
+
+  @staticmethod
+  def sql_id_exists( object_id, revision = None ):
+    if revision:
+      return "select id from stub_object where id = %s and revision = %s;" % ( quote( object_id ), notz_quote( revision ) )
+
+    return "select id from stub_object where id = %s order by revision desc limit 1;" % quote( object_id )
+
+  def sql_exists( self ):
+    return Stub_object.sql_id_exists( self.object_id, self.revision )
+
+  def sql_create( self ):
+    return \
+      "insert into stub_object ( id, revision, value, value2 ) " + \
+      "values ( %s, %s, %s, %s );" % \
+      ( quote( self.object_id ), notz_quote( self.revision ), quote( self.__value ),
+        quote( self.__value2 ) )
+
+  def sql_update( self ):
+    return self.sql_create()
+
+  @staticmethod
+  def sql_load_em_all():
+    return "select * from stub_object;"
+
+  @staticmethod
+  def sql_create_table():
+    return \
+      """
+      create table stub_object (
+        id text not null,
+        revision timestamp with time zone not null,
+        value integer,
+        value2 integer
+      );
+      """
+
+  @staticmethod
+  def sql_tuple():
+    return "select 1, 2;"
+
+  def __set_value( self, value ):
+    self.update_revision()
+    self.__value = value
+
+  def __set_value2( self, value2 ):
+    self.update_revision()
+    self.__value2 = value2
+
+  value = property( lambda self: self.__value, __set_value )
+  value2 = property( lambda self: self.__value2, __set_value2 )
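`notz_quote()` above exists because pysqlite2's timestamp converter can't parse a trailing UTC offset, so timezone-aware datetimes are made naive before being quoted into SQL. A small sketch of the same helper using Python 3's `datetime.timezone` instead of pytz, with a simplified stand-in for the real `quote()` from `new_model.Persistent`:

```python
from datetime import datetime, timezone

def quote( value ):
  # simplified stand-in for new_model.Persistent.quote(): SQL-quote a value
  if value is None:
    return "null"
  return "'%s'" % str( value ).replace( "'", "''" )

def notz_quote( value ):
  # strip the timezone so the driver's timestamp parser never sees an offset
  if isinstance( value, datetime ):
    value = value.replace( tzinfo = None )
  return quote( value )

aware = datetime( 2007, 10, 9, 0, 14, 9, tzinfo = timezone.utc )
print( notz_quote( aware ) )  # no "+00:00" suffix in the quoted literal
```

Without the `tzinfo` strip, `str()` of the aware datetime would end in `+00:00`, which is exactly the suffix the sqlite driver's converter trips over on read-back.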
@@ -1,18 +1,178 @@
 import cherrypy
-from controller.Scheduler import Scheduler
-from controller.Database import Database
-from controller.test.Stub_view import Stub_view
+from Stub_database import Stub_database
+from Stub_view import Stub_view
 from config import Common
 from datetime import datetime
 from StringIO import StringIO
 
 
 class Test_controller( object ):
+  def __init__( self ):
+    from new_model.User import User
+    from new_model.Notebook import Notebook
+    from new_model.Note import Note
+
+    # Since Stub_database isn't a real database and doesn't know SQL, replace some of the
+    # SQL-returning methods in User, Note, and Notebook to return functions that manipulate data in
+    # Stub_database directly instead. This is all a little fragile, but it's better than relying on
+    # the presence of a real database for unit tests.
+    def sql_save_notebook( self, notebook_id, read_write, database ):
+      if self.object_id in database.user_notebook:
+        database.user_notebook[ self.object_id ].append( ( notebook_id, read_write ) )
+      else:
+        database.user_notebook[ self.object_id ] = [ ( notebook_id, read_write ) ]
+
+    User.sql_save_notebook = lambda self, notebook_id, read_write = False: \
+      lambda database: sql_save_notebook( self, notebook_id, read_write, database )
+
+    def sql_load_notebooks( self, parents_only, database ):
+      notebooks = []
+      notebook_tuples = database.user_notebook.get( self.object_id )
+
+      if not notebook_tuples: return None
+
+      for notebook_tuple in notebook_tuples:
+        ( notebook_id, read_write ) = notebook_tuple
+        notebook = database.objects.get( notebook_id )[ -1 ]
+        notebook._Notebook__read_write = read_write
+        if parents_only and notebook.trash_id is None:
+          continue
+        notebooks.append( notebook )
+
+      return notebooks
+
+    User.sql_load_notebooks = lambda self, parents_only = False: \
+      lambda database: sql_load_notebooks( self, parents_only, database )
+
+    def sql_load_by_username( username, database ):
+      users = []
+
+      for ( object_id, obj_list ) in database.objects.items():
+        obj = obj_list[ -1 ]
+        if isinstance( obj, User ) and obj.username == username:
+          users.append( obj )
+
+      return users
+
+    User.sql_load_by_username = staticmethod( lambda username: \
+      lambda database: sql_load_by_username( username, database ) )
+
+    def sql_load_by_email_address( email_address, database ):
+      users = []
+
+      for ( object_id, obj_list ) in database.objects.items():
+        obj = obj_list[ -1 ]
+        if isinstance( obj, User ) and obj.email_address == email_address:
+          users.append( obj )
+
+      return users
+
+    User.sql_load_by_email_address = staticmethod( lambda email_address: \
+      lambda database: sql_load_by_email_address( email_address, database ) )
+
+    def sql_calculate_storage( self, database ):
+      return ( 17, 3, 4, 22 ) # rather than actually calculating anything, return arbitrary numbers
+
+    User.sql_calculate_storage = lambda self: \
+      lambda database: sql_calculate_storage( self, database )
+
+    def sql_has_access( self, notebook_id, read_write, database ):
+      for ( user_id, notebook_tuples ) in database.user_notebook.items():
+        for notebook_tuple in notebook_tuples:
+          ( db_notebook_id, db_read_write ) = notebook_tuple
+
+          if self.object_id == user_id and notebook_id == db_notebook_id:
+            if read_write is True and db_read_write is False:
+              return False
+            return True
+
+      return False
+
+    User.sql_has_access = lambda self, notebook_id, read_write = False: \
+      lambda database: sql_has_access( self, notebook_id, read_write, database )
+
+    def sql_load_revisions( self, database ):
+      note_list = database.objects.get( self.object_id )
+      if not note_list: return None
+
+      revisions = [ note.revision for note in note_list ]
+      return revisions
+
+    Note.sql_load_revisions = lambda self: \
+      lambda database: sql_load_revisions( self, database )
+
+    def sql_load_notes( self, database ):
+      notes = []
+
+      for ( object_id, obj_list ) in database.objects.items():
+        obj = obj_list[ -1 ]
+        if isinstance( obj, Note ) and obj.notebook_id == self.object_id:
+          notes.append( obj )
+
+      notes.sort( lambda a, b: -cmp( a.revision, b.revision ) )
+      return notes
+
+    Notebook.sql_load_notes = lambda self: \
+      lambda database: sql_load_notes( self, database )
+
+    def sql_load_startup_notes( self, database ):
+      notes = []
+
+      for ( object_id, obj_list ) in database.objects.items():
+        obj = obj_list[ -1 ]
+        if isinstance( obj, Note ) and obj.notebook_id == self.object_id and obj.startup:
+          notes.append( obj )
+
+      return notes
+
+    Notebook.sql_load_startup_notes = lambda self: \
+      lambda database: sql_load_startup_notes( self, database )
+
+    def sql_load_note_by_title( self, title, database ):
+      notes = []
+
+      for ( object_id, obj_list ) in database.objects.items():
+        obj = obj_list[ -1 ]
+        if isinstance( obj, Note ) and obj.notebook_id == self.object_id and obj.title == title:
+          notes.append( obj )
+
+      return notes
+
+    Notebook.sql_load_note_by_title = lambda self, title: \
+      lambda database: sql_load_note_by_title( self, title, database )
+
+    def sql_search_notes( self, search_text, database ):
+      notes = []
+      search_text = search_text.lower()
+
+      for ( object_id, obj_list ) in database.objects.items():
+        obj = obj_list[ -1 ]
+        if isinstance( obj, Note ) and obj.notebook_id == self.object_id and \
+           search_text in obj.contents.lower():
+          notes.append( obj )
+
+      return notes
+
+    Notebook.sql_search_notes = lambda self, search_text: \
+      lambda database: sql_search_notes( self, search_text, database )
+
+    def sql_highest_rank( self, database ):
+      max_rank = -1
+
+      for ( object_id, obj_list ) in database.objects.items():
+        obj = obj_list[ -1 ]
+        if isinstance( obj, Note ) and obj.notebook_id == self.object_id and obj.rank > max_rank:
+          max_rank = obj.rank
+
+      return max_rank
+
+    Notebook.sql_highest_rank = lambda self: \
+      lambda database: sql_highest_rank( self, database )
+
   def setUp( self ):
     from controller.Root import Root
     cherrypy.lowercase_api = True
-    self.scheduler = Scheduler()
-    self.database = Database( self.scheduler, database_path = None )
+    self.database = Stub_database()
     self.settings = {
       u"global": {
         u"luminotes.http_url" : u"http://luminotes.com",

@@ -33,7 +193,7 @@ class Test_controller( object ):
       },
     }
 
-    cherrypy.root = Root( self.scheduler, self.database, self.settings )
+    cherrypy.root = Root( self.database, self.settings )
     cherrypy.config.update( Common.settings )
     cherrypy.config.update( { u"server.log_to_screen": False } )
     cherrypy.server.start( init_only = True, server_class = None )

@@ -45,7 +205,6 @@ class Test_controller( object ):
 
   def tearDown( self ):
     cherrypy.server.stop()
-    self.scheduler.shutdown()
 
 
   def http_get( self, http_path, headers = None, session_id = None, pretend_https = False ):
     """

@@ -64,7 +223,7 @@ class Test_controller( object ):
     proxy_ip = self.settings[ "global" ].get( u"luminotes.http_proxy_ip" )
 
     request = cherrypy.server.request( ( proxy_ip, 1234 ), u"127.0.0.5" )
-    response = request.run( "GET %s HTTP/1.0" % http_path, headers = headers, rfile = StringIO() )
+    response = request.run( "GET %s HTTP/1.0" % str( http_path ), headers = headers, rfile = StringIO() )
     session_id = response.simple_cookie.get( u"session_id" )
     if session_id: session_id = session_id.value

@@ -103,7 +262,7 @@ class Test_controller( object ):
     headers.append( ( u"Cookie", "session_id=%s" % session_id ) ) # will break if unicode is used for the value
 
     request = cherrypy.server.request( ( u"127.0.0.1", 1234 ), u"127.0.0.5" )
-    response = request.run( "POST %s HTTP/1.0" % http_path, headers = headers, rfile = StringIO( post_data ) )
+    response = request.run( "POST %s HTTP/1.0" % str( http_path ), headers = headers, rfile = StringIO( post_data ) )
     session_id = response.simple_cookie.get( u"session_id" )
     if session_id: session_id = session_id.value
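The `Test_controller.__init__` above monkey-patches each SQL-string-returning model method to instead return a closure, which `Stub_database.execute()` "executes" by calling it with the database itself. A minimal self-contained sketch of that pattern (classes and method bodies trimmed well down from the diff):

```python
# Model class whose method normally returns a SQL string for the real database.
class User( object ):
  def __init__( self, object_id ):
    self.object_id = object_id

  def sql_save_notebook( self, notebook_id ):
    return "insert into user_notebook values ( %s, %s );" % ( self.object_id, notebook_id )

# Stub database that can't run SQL, but can call a function with itself.
class Stub_database( object ):
  def __init__( self ):
    self.user_notebook = {}  # map of user_id to list of notebook_ids

  def execute( self, sql_command ):
    if callable( sql_command ):
      return sql_command( self )
    raise NotImplementedError( sql_command )

def sql_save_notebook( self, notebook_id, database ):
  # manipulate the stub's in-memory data directly, in place of the SQL insert
  database.user_notebook.setdefault( self.object_id, [] ).append( notebook_id )

# swap the SQL-returning method for one that returns a closure over the arguments
User.sql_save_notebook = lambda self, notebook_id: \
  lambda database: sql_save_notebook( self, notebook_id, database )

database = Stub_database()
user = User( "u1" )
database.execute( user.sql_save_notebook( "n1" ) )
print( database.user_notebook )
```

Because callers only ever hand the method's return value to `execute()`, they never notice whether it was a SQL string or a closure — which is exactly what lets the controller tests run without a real database.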
@ -1,323 +1,188 @@
|
||||||
|
from pytz import utc
|
||||||
|
from pysqlite2 import dbapi2 as sqlite
|
||||||
|
from datetime import datetime
|
||||||
|
from Stub_object import Stub_object
|
||||||
from controller.Database import Database
|
from controller.Database import Database
|
||||||
from controller.Scheduler import Scheduler
|
|
||||||
from model.Persistent import Persistent
|
|
||||||
|
|
||||||
|
|
||||||
class Some_object( Persistent ):
|
|
||||||
def __init__( self, object_id, value, value2 = None, secondary_id = None ):
|
|
||||||
Persistent.__init__( self, object_id, secondary_id )
|
|
||||||
self.__value = value
|
|
||||||
self.__value2 = value2
|
|
||||||
|
|
||||||
def __set_value( self, value ):
|
|
||||||
self.update_revision()
|
|
||||||
self.__value = value
|
|
||||||
|
|
||||||
def __set_value2( self, value2 ):
|
|
||||||
self.update_revision()
|
|
||||||
self.__value2 = value2
|
|
||||||
|
|
||||||
value = property( lambda self: self.__value, __set_value )
|
|
||||||
value2 = property( lambda self: self.__value2, __set_value2 )
|
|
||||||
|
|
||||||
|
|
||||||
class Test_database( object ):
|
class Test_database( object ):
|
||||||
def __init__( self, clear_cache = True ):
|
|
||||||
self.clear_cache = clear_cache
|
|
||||||
|
|
||||||
def setUp( self ):
|
def setUp( self ):
|
||||||
self.scheduler = Scheduler()
|
# make an in-memory sqlite database to use in place of PostgreSQL during testing
|
||||||
self.database = Database( self.scheduler )
|
self.connection = sqlite.connect( ":memory:", detect_types = sqlite.PARSE_DECLTYPES | sqlite.PARSE_COLNAMES )
|
||||||
next_id = None
|
cursor = self.connection.cursor()
|
||||||
|
cursor.execute( Stub_object.sql_create_table() )
|
||||||
|
|
||||||
|
self.database = Database( self.connection )
|
||||||
|
|
||||||
def tearDown( self ):
|
def tearDown( self ):
|
||||||
self.database.close()
|
self.database.close()
|
||||||
self.scheduler.shutdown()
|
|
||||||
|
|
||||||
def test_save_and_load( self ):
|
def test_save_and_load( self ):
|
||||||
def gen():
|
basic_obj = Stub_object( object_id = "5", value = 1 )
|
||||||
basic_obj = Some_object( object_id = "5", value = 1 )
|
original_revision = basic_obj.revision
|
||||||
original_revision = basic_obj.revision
|
|
||||||
|
|
||||||
self.database.save( basic_obj, self.scheduler.thread )
|
self.database.save( basic_obj )
|
||||||
yield Scheduler.SLEEP
|
obj = self.database.load( Stub_object, basic_obj.object_id )
|
||||||
if self.clear_cache: self.database.clear_cache()
|
|
||||||
self.database.load( basic_obj.object_id, self.scheduler.thread )
|
|
||||||
obj = ( yield Scheduler.SLEEP )
|
|
||||||
|
|
||||||
assert obj.object_id == basic_obj.object_id
|
assert obj.object_id == basic_obj.object_id
|
||||||
assert obj.revision == original_revision
|
assert obj.revision.replace( tzinfo = utc ) == original_revision
|
||||||
assert obj.revisions_list == [ original_revision ]
|
assert obj.value == basic_obj.value
|
||||||
assert obj.value == basic_obj.value
|
|
||||||
|
|
||||||
g = gen()
|
def test_save_and_load_without_commit( self ):
|
||||||
self.scheduler.add( g )
|
basic_obj = Stub_object( object_id = "5", value = 1 )
|
||||||
self.scheduler.wait_for( g )
|
original_revision = basic_obj.revision
|
||||||
|
|
||||||
def test_complex_save_and_load( self ):
|
self.database.save( basic_obj, commit = False )
|
||||||
def gen():
|
self.connection.rollback() # if commit wasn't called, this should back out the save
|
||||||
basic_obj = Some_object( object_id = "7", value = 2 )
|
obj = self.database.load( Stub_object, basic_obj.object_id )
|
||||||
basic_original_revision = basic_obj.revision
|
|
||||||
complex_obj = Some_object( object_id = "6", value = basic_obj )
|
|
||||||
complex_original_revision = complex_obj.revision
|
|
||||||
|
|
||||||
self.database.save( complex_obj, self.scheduler.thread )
|
assert obj == None
|
||||||
yield Scheduler.SLEEP
|
|
||||||
if self.clear_cache: self.database.clear_cache()
|
|
||||||
self.database.load( complex_obj.object_id, self.scheduler.thread )
|
|
||||||
obj = ( yield Scheduler.SLEEP )
|
|
||||||
if self.clear_cache: self.database.clear_cache()
|
|
||||||
|
|
||||||
assert obj.object_id == complex_obj.object_id
|
def test_save_and_load_with_explicit_commit( self ):
|
||||||
assert obj.revision == complex_original_revision
|
basic_obj = Stub_object( object_id = "5", value = 1 )
|
||||||
assert obj.revisions_list == [ complex_original_revision ]
|
original_revision = basic_obj.revision
|
||||||
assert obj.value.object_id == basic_obj.object_id
|
|
||||||
assert obj.value.value == basic_obj.value
|
|
||||||
assert obj.value.revision == basic_original_revision
|
|
||||||
assert obj.value.revisions_list == [ basic_original_revision ]
|
|
||||||
|
|
||||||
self.database.load( basic_obj.object_id, self.scheduler.thread )
|
self.database.save( basic_obj, commit = False )
|
||||||
obj = ( yield Scheduler.SLEEP )
|
self.database.commit()
|
||||||
|
self.connection.rollback() # should have no effect because of the call to commit
|
||||||
|
obj = self.database.load( Stub_object, basic_obj.object_id )
|
||||||
|
|
||||||
assert obj.object_id == basic_obj.object_id
|
assert obj.object_id == basic_obj.object_id
|
||||||
assert obj.value == basic_obj.value
|
assert obj.revision.replace( tzinfo = utc ) == original_revision
|
||||||
assert obj.revision == basic_original_revision
|
assert obj.value == basic_obj.value
|
||||||
assert obj.revisions_list == [ basic_original_revision ]
|
|
||||||
|
|
||||||
g = gen()
|
def test_select_one( self ):
|
||||||
self.scheduler.add( g )
|
basic_obj = Stub_object( object_id = "5", value = 1 )
|
||||||
self.scheduler.wait_for( g )
|
original_revision = basic_obj.revision
|
||||||
|
|
||||||
def test_save_and_load_by_secondary( self ):
|
self.database.save( basic_obj )
|
||||||
def gen():
|
obj = self.database.select_one( Stub_object, Stub_object.sql_load( basic_obj.object_id ) )
|
||||||
basic_obj = Some_object( object_id = "5", value = 1, secondary_id = u"foo" )
|
|
||||||
original_revision = basic_obj.revision
|
|
||||||
|
|
||||||
self.database.save( basic_obj, self.scheduler.thread )
|
assert obj.object_id == basic_obj.object_id
|
||||||
yield Scheduler.SLEEP
|
assert obj.revision.replace( tzinfo = utc ) == original_revision
|
||||||
if self.clear_cache: self.database.clear_cache()
|
assert obj.value == basic_obj.value
|
||||||
self.database.load( u"Some_object foo", self.scheduler.thread )
|
|
||||||
obj = ( yield Scheduler.SLEEP )
|
|
||||||
|
|
||||||
assert obj.object_id == basic_obj.object_id
|
def test_select_one_tuple( self ):
|
||||||
assert obj.value == basic_obj.value
|
obj = self.database.select_one( tuple, Stub_object.sql_tuple() )
|
||||||
assert obj.revision == original_revision
|
|
||||||
assert obj.revisions_list == [ original_revision ]
|
|
||||||
|
|
||||||
g = gen()
|
assert len( obj ) == 2
|
||||||
self.scheduler.add( g )
|
assert obj[ 0 ] == 1
|
||||||
self.scheduler.wait_for( g )
|
assert obj[ 1 ] == 2
|
||||||
|
|
||||||
def test_duplicate_save_and_load( self ):
|
def test_select_many( self ):
|
||||||
def gen():
|
basic_obj = Stub_object( object_id = "5", value = 1 )
|
||||||
basic_obj = Some_object( object_id = "9", value = 3 )
|
original_revision = basic_obj.revision
|
||||||
basic_original_revision = basic_obj.revision
|
basic_obj2 = Stub_object( object_id = "6", value = 2 )
|
||||||
complex_obj = Some_object( object_id = "8", value = basic_obj, value2 = basic_obj )
|
original_revision2 = basic_obj2.revision
|
||||||
complex_original_revision = complex_obj.revision
|
|
||||||
|
|
||||||
self.database.save( complex_obj, self.scheduler.thread )
|
self.database.save( basic_obj )
|
||||||
yield Scheduler.SLEEP
|
self.database.save( basic_obj2 )
|
||||||
if self.clear_cache: self.database.clear_cache()
|
objs = self.database.select_many( Stub_object, Stub_object.sql_load_em_all() )
|
||||||
self.database.load( complex_obj.object_id, self.scheduler.thread )
|
|
||||||
obj = ( yield Scheduler.SLEEP )
|
|
||||||
if self.clear_cache: self.database.clear_cache()
|
|
||||||
|
|
||||||
assert obj.object_id == complex_obj.object_id
|
assert len( objs ) == 2
|
||||||
assert obj.revision == complex_original_revision
|
assert objs[ 0 ].object_id == basic_obj.object_id
|
||||||
assert obj.revisions_list == [ complex_original_revision ]
|
assert objs[ 0 ].revision.replace( tzinfo = utc ) == original_revision
|
||||||
|
assert objs[ 0 ].value == basic_obj.value
|
||||||
|
assert objs[ 1 ].object_id == basic_obj2.object_id
|
||||||
|
assert objs[ 1 ].revision.replace( tzinfo = utc ) == original_revision2
|
||||||
|
assert objs[ 1 ].value == basic_obj2.value
|
||||||
|
|
||||||
assert obj.value.object_id == basic_obj.object_id
|
def test_select_many_tuples( self ):
|
||||||
assert obj.value.value == basic_obj.value
|
objs = self.database.select_many( tuple, Stub_object.sql_tuple() )
|
||||||
assert obj.value.revision == basic_original_revision
|
|
||||||
assert obj.value.revisions_list == [ basic_original_revision ]
|
|
||||||
|
|
||||||
assert obj.value2.object_id == basic_obj.object_id
|
assert len( objs ) == 1
|
||||||
assert obj.value2.value == basic_obj.value
|
assert len( objs[ 0 ] ) == 2
|
||||||
assert obj.value2.revision == basic_original_revision
|
assert objs[ 0 ][ 0 ] == 1
|
||||||
assert obj.value2.revisions_list == [ basic_original_revision ]
|
assert objs[ 0 ][ 1 ] == 2
|
||||||
|
|
||||||
assert obj.value == obj.value2
|
def test_select_many_with_no_matches( self ):
|
||||||
|
objs = self.database.select_many( Stub_object, Stub_object.sql_load_em_all() )
|
||||||
|
|
||||||
self.database.load( basic_obj.object_id, self.scheduler.thread )
|
assert len( objs ) == 0
|
||||||
obj = ( yield Scheduler.SLEEP )
|
|
||||||
|
|
||||||
assert obj.object_id == basic_obj.object_id
|
|
||||||
assert obj.value == basic_obj.value
|
|
||||||
assert obj.revision == basic_original_revision
|
|
||||||
assert obj.revisions_list == [ basic_original_revision ]
|
|
||||||
|
|
||||||
g = gen()
|
|
||||||
self.scheduler.add( g )
|
|
||||||
self.scheduler.wait_for( g )
|
|
||||||
|
|
||||||
def test_save_and_load_revision( self ):
|
def test_save_and_load_revision( self ):
|
||||||
def gen():
|
basic_obj = Stub_object( object_id = "5", value = 1 )
|
||||||
basic_obj = Some_object( object_id = "5", value = 1 )
|
original_revision = basic_obj.revision
|
||||||
original_revision = basic_obj.revision
|
|
||||||
|
|
||||||
self.database.save( basic_obj, self.scheduler.thread )
|
self.database.save( basic_obj )
|
||||||
yield Scheduler.SLEEP
|
basic_obj.value = 2
|
||||||
if self.clear_cache: self.database.clear_cache()
|
|
||||||
|
|
||||||
basic_obj.value = 2
|
self.database.save( basic_obj )
|
||||||
|
obj = self.database.load( Stub_object, basic_obj.object_id )
|
||||||
|
|
||||||
self.database.save( basic_obj, self.scheduler.thread )
|
assert obj.object_id == basic_obj.object_id
|
||||||
yield Scheduler.SLEEP
|
assert obj.revision.replace( tzinfo = utc ) == basic_obj.revision
|
||||||
if self.clear_cache: self.database.clear_cache()
|
assert obj.value == basic_obj.value
|
||||||
self.database.load( basic_obj.object_id, self.scheduler.thread )
|
|
||||||
obj = ( yield Scheduler.SLEEP )
|
|
||||||
if self.clear_cache: self.database.clear_cache()
|
|
||||||
|
|
||||||
assert obj.object_id == basic_obj.object_id
|
revised = self.database.load( Stub_object, basic_obj.object_id, revision = original_revision )
|
||||||
assert obj.revision == basic_obj.revision
|
|
||||||
assert obj.revisions_list == [ original_revision, basic_obj.revision ]
|
|
||||||
assert obj.value == basic_obj.value
|
|
||||||
|
|
||||||
self.database.load( basic_obj.object_id, self.scheduler.thread, revision = original_revision )
|
assert revised.object_id == basic_obj.object_id
|
||||||
revised = ( yield Scheduler.SLEEP )
|
assert revised.value == 1
|
||||||
|
assert revised.revision.replace( tzinfo = utc ) == original_revision
|
||||||
|
|
||||||
assert revised.object_id == basic_obj.object_id
|
def test_execute( self ):
|
||||||
assert revised.value == 1
|
basic_obj = Stub_object( object_id = "5", value = 1 )
|
||||||
assert revised.revision == original_revision
|
original_revision = basic_obj.revision
|
||||||
assert id( obj.revisions_list ) != id( revised.revisions_list )
|
|
||||||
assert revised.revisions_list == [ original_revision ]
|
|
||||||
|
|
||||||
g = gen()
|
self.database.execute( basic_obj.sql_create() )
|
||||||
self.scheduler.add( g )
|
obj = self.database.load( Stub_object, basic_obj.object_id )
|
||||||
self.scheduler.wait_for( g )
|
|
||||||
|
assert obj.object_id == basic_obj.object_id
|
||||||
|
assert obj.revision.replace( tzinfo = utc ) == original_revision
|
||||||
|
assert obj.value == basic_obj.value
|
||||||
|
|
||||||
|
def test_execute_without_commit( self ):
|
||||||
|
basic_obj = Stub_object( object_id = "5", value = 1 )
|
||||||
|
original_revision = basic_obj.revision
|
||||||
|
|
||||||
|
self.database.execute( basic_obj.sql_create(), commit = False )
|
||||||
|
self.connection.rollback()
|
||||||
|
obj = self.database.load( Stub_object, basic_obj.object_id )
|
||||||
|
|
||||||
|
assert obj == None
|
||||||
|
|
||||||
|
def test_execute_with_explicit_commit( self ):
|
||||||
|
basic_obj = Stub_object( object_id = "5", value = 1 )
|
||||||
|
original_revision = basic_obj.revision
|
||||||
|
|
||||||
|
self.database.execute( basic_obj.sql_create(), commit = False )
|
||||||
|
self.database.commit()
|
||||||
|
obj = self.database.load( Stub_object, basic_obj.object_id )
|
||||||
|
|
||||||
|
assert obj.object_id == basic_obj.object_id
|
||||||
|
assert obj.revision.replace( tzinfo = utc ) == original_revision
|
||||||
|
assert obj.value == basic_obj.value
|
||||||
|
|
||||||
   def test_load_unknown( self ):
-    def gen():
-      basic_obj = Some_object( object_id = "5", value = 1 )
-      self.database.load( basic_obj.object_id, self.scheduler.thread )
-      obj = ( yield Scheduler.SLEEP )
-
-      assert obj == None
-
-    g = gen()
-    self.scheduler.add( g )
-    self.scheduler.wait_for( g )
-
-  def test_reload( self ):
-    def gen():
-      basic_obj = Some_object( object_id = "5", value = 1 )
-      original_revision = basic_obj.revision
-
-      self.database.save( basic_obj, self.scheduler.thread )
-      yield Scheduler.SLEEP
-      if self.clear_cache: self.database.clear_cache()
-
-      def setstate( self, state ):
-        state[ "_Some_object__value" ] = 55
-        self.__dict__.update( state )
-
-      Some_object.__setstate__ = setstate
-
-      self.database.reload( basic_obj.object_id, self.scheduler.thread )
-      yield Scheduler.SLEEP
-      delattr( Some_object, "__setstate__" )
-      if self.clear_cache: self.database.clear_cache()
-
-      self.database.load( basic_obj.object_id, self.scheduler.thread )
-      obj = ( yield Scheduler.SLEEP )
-
-      assert obj.object_id == basic_obj.object_id
-      assert obj.value == 55
-      assert obj.revision == original_revision
-      assert obj.revisions_list == [ original_revision ]
-
-    g = gen()
-    self.scheduler.add( g )
-    self.scheduler.wait_for( g )
-
-  def test_reload_revision( self ):
-    def gen():
-      basic_obj = Some_object( object_id = "5", value = 1 )
-      original_revision = basic_obj.revision
-      original_revision_id = basic_obj.revision_id()
-
-      self.database.save( basic_obj, self.scheduler.thread )
-      yield Scheduler.SLEEP
-      if self.clear_cache: self.database.clear_cache()
-
-      basic_obj.value = 2
-
-      self.database.save( basic_obj, self.scheduler.thread )
-      yield Scheduler.SLEEP
-      if self.clear_cache: self.database.clear_cache()
-
-      def setstate( self, state ):
-        state[ "_Some_object__value" ] = 55
-        self.__dict__.update( state )
-
-      Some_object.__setstate__ = setstate
-
-      self.database.reload( original_revision_id, self.scheduler.thread )
-      yield Scheduler.SLEEP
-      delattr( Some_object, "__setstate__" )
-      if self.clear_cache: self.database.clear_cache()
-
-      self.database.load( basic_obj.object_id, self.scheduler.thread, revision = original_revision )
-      obj = ( yield Scheduler.SLEEP )
-
-      assert obj.object_id == basic_obj.object_id
-      assert obj.revision == original_revision
-      assert obj.revisions_list == [ original_revision ]
-      assert obj.value == 55
-
-    g = gen()
-    self.scheduler.add( g )
-    self.scheduler.wait_for( g )
-
-  def test_size( self ):
-    def gen():
-      basic_obj = Some_object( object_id = "5", value = 1 )
-      original_revision = basic_obj.revision
-
-      self.database.save( basic_obj, self.scheduler.thread )
-      yield Scheduler.SLEEP
-      if self.clear_cache: self.database.clear_cache()
-
-      size = self.database.size( basic_obj.object_id )
-
-      from cPickle import Pickler
-      from StringIO import StringIO
-      buffer = StringIO()
-      pickler = Pickler( buffer, protocol = -1 )
-      pickler.dump( basic_obj )
-      expected_size = len( buffer.getvalue() )
-
-      # as long as the size is close to the expected size, that's fine
-      assert abs( size - expected_size ) < 10
-
-    g = gen()
-    self.scheduler.add( g )
-    self.scheduler.wait_for( g )
+    basic_obj = Stub_object( object_id = "5", value = 1 )
+    obj = self.database.load( Stub_object, basic_obj.object_id )
+
+    assert obj == None

   def test_next_id( self ):
-    def gen():
-      self.database.next_id( self.scheduler.thread )
-      next_id = ( yield Scheduler.SLEEP )
-      assert next_id
-      prev_ids = [ next_id ]
-
-      self.database.next_id( self.scheduler.thread )
-      next_id = ( yield Scheduler.SLEEP )
-      assert next_id
-      assert next_id not in prev_ids
-      prev_ids.append( next_id )
-
-      self.database.next_id( self.scheduler.thread )
-      next_id = ( yield Scheduler.SLEEP )
-      assert next_id
-      assert next_id not in prev_ids
-
-    g = gen()
-    self.scheduler.add( g )
-    self.scheduler.wait_for( g )
+    next_id = self.database.next_id( Stub_object )
+    assert next_id
+    assert self.database.load( Stub_object, next_id )
+    prev_ids = [ next_id ]
+
+    next_id = self.database.next_id( Stub_object )
+    assert next_id
+    assert next_id not in prev_ids
+    assert self.database.load( Stub_object, next_id )
+    prev_ids.append( next_id )
+
+    next_id = self.database.next_id( Stub_object )
+    assert next_id
+    assert next_id not in prev_ids
+    assert self.database.load( Stub_object, next_id )
+
+  def test_next_id_without_commit( self ):
+    next_id = self.database.next_id( Stub_object, commit = False )
+    self.connection.rollback()
+    assert self.database.load( Stub_object, next_id ) == None
+
+  def test_next_id_with_explicit_commit( self ):
+    next_id = self.database.next_id( Stub_object, commit = False )
+    self.database.commit()
+    assert next_id
+    assert self.database.load( Stub_object, next_id )

-class Test_database_without_clearing_cache( Test_database ):
-  def __init__( self ):
-    Test_database.__init__( self, clear_cache = False )
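The API change driving the test edits above, from generator-based calls resumed by the Scheduler to plain synchronous calls, can be sketched with a toy stand-in. `Fake_database` below is illustrative only, not the Luminotes class; it just mirrors the calling convention the converted tests rely on: `load( Object_type, object_id )` returns the object or `None`, and `next_id( Object_type )` hands back a fresh id that a later `load()` can find.

```python
class Stub_object( object ):
  def __init__( self, object_id, value = None ):
    self.object_id = object_id
    self.value = value

class Fake_database( object ):
  # Toy in-memory stand-in for the synchronous Database API (illustrative only)
  def __init__( self ):
    self.objects = {}
    self.next_id_counter = 0

  def save( self, obj, commit = True ):
    self.objects[ obj.object_id ] = obj

  def load( self, Object_type, object_id ):
    # unknown ids yield None, which is what test_load_unknown asserts
    obj = self.objects.get( object_id )
    return obj if isinstance( obj, Object_type ) else None

  def next_id( self, Object_type, commit = True ):
    # record the new id so a subsequent load() succeeds
    self.next_id_counter += 1
    object_id = u"id%d" % self.next_id_counter
    self.save( Object_type( object_id ) )
    return object_id

database = Fake_database()
obj = Stub_object( object_id = u"5", value = 1 )
database.save( obj )

# synchronous style from the new tests: no gen()/yield/scheduler plumbing
assert database.load( Stub_object, u"5" ) is obj
assert database.load( Stub_object, u"unknown" ) == None

next_id = database.next_id( Stub_object )
assert next_id
assert database.load( Stub_object, next_id )
```

The old style wrapped each of these calls in a generator, yielded `Scheduler.SLEEP`, and had the scheduler resume it with the result; the new style simply returns values.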
File diff suppressed because it is too large
@@ -1,5 +1,5 @@
 import cherrypy
-from model.User import User
+from new_model.User import User
 from controller.Scheduler import Scheduler
 from Test_controller import Test_controller

@@ -14,13 +14,7 @@ class Test_root( Test_controller ):
     self.user = None
     self.session_id = None

-    thread = self.make_user()
-    self.scheduler.add( thread )
-    self.scheduler.wait_for( thread )
-
-  def make_user( self ):
-    self.database.next_id( self.scheduler.thread )
-    self.user = User( ( yield Scheduler.SLEEP ), self.username, self.password, self.email_address, [] )
+    self.user = User.create( self.database.next_id( User ), self.username, self.password, self.email_address )
     self.database.save( self.user )

   def test_index( self ):
@@ -1,15 +1,15 @@
 import re
 import cherrypy
 import smtplib
+from pytz import utc
 from datetime import datetime, timedelta
 from nose.tools import raises
 from Test_controller import Test_controller
 from Stub_smtp import Stub_smtp
-from controller.Scheduler import Scheduler
-from model.User import User
-from model.Notebook import Notebook
-from model.Note import Note
-from model.User_list import User_list
+from new_model.User import User
+from new_model.Notebook import Notebook
+from new_model.Note import Note
+from new_model.Password_reset import Password_reset


 class Test_users( Test_controller ):
@@ -32,46 +32,42 @@ class Test_users( Test_controller ):
     self.anonymous = None
     self.notebooks = None

-    thread = self.make_users()
-    self.scheduler.add( thread )
-    self.scheduler.wait_for( thread )
+    self.make_users()

   def make_users( self ):
-    self.database.next_id( self.scheduler.thread )
-    notebook_id1 = ( yield Scheduler.SLEEP )
-    self.database.next_id( self.scheduler.thread )
-    notebook_id2 = ( yield Scheduler.SLEEP )
+    notebook_id1 = self.database.next_id( Notebook )
+    notebook_id2 = self.database.next_id( Notebook )
+    trash_id1 = self.database.next_id( Notebook )
+    trash_id2 = self.database.next_id( Notebook )

     self.notebooks = [
-      Notebook( notebook_id1, u"my notebook" ),
-      Notebook( notebook_id2, u"my other notebook" ),
+      Notebook.create( notebook_id1, u"my notebook", trash_id = trash_id1 ),
+      Notebook.create( notebook_id2, u"my other notebook", trash_id = trash_id2 ),
     ]
+    self.database.save( self.notebooks[ 0 ] )
+    self.database.save( self.notebooks[ 1 ] )

-    self.database.next_id( self.scheduler.thread )
-    self.anon_notebook = Notebook( ( yield Scheduler.SLEEP ), u"anon notebook" )
-    self.database.next_id( self.scheduler.thread )
-    self.startup_note = Note( ( yield Scheduler.SLEEP ), u"<h3>login</h3>" )
-    self.anon_notebook.add_note( self.startup_note )
-    self.anon_notebook.add_startup_note( self.startup_note )
+    self.anon_notebook = Notebook.create( self.database.next_id( Notebook ), u"anon notebook" )
+    self.database.save( self.anon_notebook )
+    self.startup_note = Note.create(
+      self.database.next_id( Note ), u"<h3>login</h3>",
+      notebook_id = self.anon_notebook.object_id, startup = True,
+    )
+    self.database.save( self.startup_note )

-    self.database.next_id( self.scheduler.thread )
-    self.user = User( ( yield Scheduler.SLEEP ), self.username, self.password, self.email_address, self.notebooks )
-    self.database.next_id( self.scheduler.thread )
-    self.user2 = User( ( yield Scheduler.SLEEP ), self.username2, self.password2, self.email_address2 )
-    self.database.next_id( self.scheduler.thread )
-    self.anonymous = User( ( yield Scheduler.SLEEP ), u"anonymous", None, None, [ self.anon_notebook ] )
+    self.user = User.create( self.database.next_id( User ), self.username, self.password, self.email_address )
+    self.database.save( self.user, commit = False )
+    self.database.execute( self.user.sql_save_notebook( notebook_id1, read_write = True ), commit = False )
+    self.database.execute( self.user.sql_save_notebook( notebook_id2, read_write = True ), commit = False )

-    self.database.next_id( self.scheduler.thread )
-    user_list_id = ( yield Scheduler.SLEEP )
-    user_list = User_list( user_list_id, u"all" )
-    user_list.add_user( self.user )
-    user_list.add_user( self.user2 )
-    user_list.add_user( self.anonymous )
+    self.user2 = User.create( self.database.next_id( User ), self.username2, self.password2, self.email_address2 )
+    self.database.save( self.user2, commit = False )

-    self.database.save( self.user )
-    self.database.save( self.user2 )
-    self.database.save( self.anonymous )
-    self.database.save( user_list )
+    self.anonymous = User.create( self.database.next_id( User ), u"anonymous" )
+    self.database.save( self.anonymous, commit = False )
+    self.database.execute( self.anonymous.sql_save_notebook( self.anon_notebook.object_id ), commit = False )
+
+    self.database.commit()

   def test_signup( self ):
     result = self.http_post( "/users/signup", dict(
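The rewritten `make_users()` above batches several `save( ..., commit = False )` and `execute( ..., commit = False )` calls into one transaction and finishes with a single `commit()`. The pattern can be sketched with toy stand-ins (`Fake_connection` and `Fake_database` here are illustrative, not the Luminotes classes):

```python
class Fake_connection( object ):
  # toy transaction: pending writes become durable only on commit()
  def __init__( self ):
    self.pending = []
    self.committed = []

  def commit( self ):
    self.committed.extend( self.pending )
    self.pending = []

  def rollback( self ):
    self.pending = []

class Fake_database( object ):
  def __init__( self, connection ):
    self.connection = connection

  def save( self, obj, commit = True ):
    self.connection.pending.append( obj )
    if commit:
      self.connection.commit()

  def commit( self ):
    self.connection.commit()

connection = Fake_connection()
database = Fake_database( connection )

# batch several saves into one transaction, as make_users() does
database.save( u"user", commit = False )
database.save( u"user2", commit = False )
assert connection.committed == []

database.commit()
assert connection.committed == [ u"user", u"user2" ]
```

The same deferred-commit behavior is what `test_next_id_without_commit` exercises from the other direction: an uncommitted write disappears after `rollback()`.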
@@ -103,20 +99,33 @@ class Test_users( Test_controller ):
     assert result[ u"user" ].username == self.new_username
     notebooks = result[ u"notebooks" ]
-    assert len( notebooks ) == 2
-    assert notebooks[ 0 ] == self.anon_notebook
-    assert notebooks[ 0 ].trash == None
+    notebook = notebooks[ 0 ]
+    assert notebook.object_id == self.anon_notebook.object_id
+    assert notebook.revision == self.anon_notebook.revision
+    assert notebook.name == self.anon_notebook.name
+    assert notebook.trash_id == None
+    assert notebook.read_write == False

     notebook = notebooks[ 1 ]
     assert notebook.object_id == new_notebook_id
-    assert notebook.trash
-    assert len( notebook.notes ) == 1
-    assert len( notebook.startup_notes ) == 1
+    assert notebook.revision
+    assert notebook.name == u"my notebook"
+    assert notebook.trash_id
+    assert notebook.read_write == True
+
+    notebook = notebooks[ 2 ]
+    assert notebook.object_id == notebooks[ 1 ].trash_id
+    assert notebook.revision
+    assert notebook.name == u"trash"
+    assert notebook.trash_id == None
+    assert notebook.read_write == True

     startup_notes = result[ "startup_notes" ]
     if include_startup_notes:
       assert len( startup_notes ) == 1
-      assert startup_notes[ 0 ] == self.startup_note
+      assert startup_notes[ 0 ].object_id == self.startup_note.object_id
+      assert startup_notes[ 0 ].title == self.startup_note.title
+      assert startup_notes[ 0 ].contents == self.startup_note.contents
     else:
       assert startup_notes == []
@@ -156,20 +165,34 @@ class Test_users( Test_controller ):
     assert result[ u"user" ].username == None
     notebooks = result[ u"notebooks" ]
-    assert len( notebooks ) == 2
-    assert notebooks[ 0 ] == self.anon_notebook
-    assert notebooks[ 0 ].trash == None
+    assert len( notebooks ) == 3
+    notebook = notebooks[ 0 ]
+    assert notebook.object_id == self.anon_notebook.object_id
+    assert notebook.revision == self.anon_notebook.revision
+    assert notebook.name == self.anon_notebook.name
+    assert notebook.trash_id == None
+    assert notebook.read_write == False

     notebook = notebooks[ 1 ]
     assert notebook.object_id == new_notebook_id
-    assert notebook.trash
-    assert len( notebook.notes ) == 2
-    assert len( notebook.startup_notes ) == 2
+    assert notebook.revision
+    assert notebook.name == u"my notebook"
+    assert notebook.trash_id
+    assert notebook.read_write == True
+
+    notebook = notebooks[ 2 ]
+    assert notebook.object_id == notebooks[ 1 ].trash_id
+    assert notebook.revision
+    assert notebook.name == u"trash"
+    assert notebook.trash_id == None
+    assert notebook.read_write == True

     startup_notes = result[ "startup_notes" ]
     if include_startup_notes:
       assert len( startup_notes ) == 1
-      assert startup_notes[ 0 ] == self.startup_note
+      assert startup_notes[ 0 ].object_id == self.startup_note.object_id
+      assert startup_notes[ 0 ].title == self.startup_note.title
+      assert startup_notes[ 0 ].contents == self.startup_note.contents
     else:
       assert startup_notes == []
@@ -260,15 +283,25 @@ class Test_users( Test_controller ):
       session_id = session_id,
     )

-    assert result[ u"user" ] == self.user
-    assert result[ u"notebooks" ] == [ self.anon_notebook ] + self.notebooks
+    assert result[ u"user" ]
+    assert result[ u"user" ].object_id == self.user.object_id
+    assert result[ u"user" ].username == self.user.username
+    assert len( result[ u"notebooks" ] ) == 3
+    assert result[ u"notebooks" ][ 0 ].object_id == self.anon_notebook.object_id
+    assert result[ u"notebooks" ][ 0 ].read_write == False
+    assert result[ u"notebooks" ][ 1 ].object_id == self.notebooks[ 0 ].object_id
+    assert result[ u"notebooks" ][ 1 ].read_write == True
+    assert result[ u"notebooks" ][ 2 ].object_id == self.notebooks[ 1 ].object_id
+    assert result[ u"notebooks" ][ 2 ].read_write == True
     assert result[ u"http_url" ] == self.settings[ u"global" ].get( u"luminotes.http_url" )
     assert result[ u"login_url" ] == None

     startup_notes = result[ "startup_notes" ]
     if include_startup_notes:
       assert len( startup_notes ) == 1
-      assert startup_notes[ 0 ] == self.startup_note
+      assert startup_notes[ 0 ].object_id == self.startup_note.object_id
+      assert startup_notes[ 0 ].title == self.startup_note.title
+      assert startup_notes[ 0 ].contents == self.startup_note.contents
     else:
       assert startup_notes == []
@@ -281,10 +314,13 @@ class Test_users( Test_controller ):
     )

     assert result[ u"user" ].username == "anonymous"
-    assert result[ u"notebooks" ] == [ self.anon_notebook ]
+    assert len( result[ u"notebooks" ] ) == 1
+    assert result[ u"notebooks" ][ 0 ].object_id == self.anon_notebook.object_id
+    assert result[ u"notebooks" ][ 0 ].name == self.anon_notebook.name
+    assert result[ u"notebooks" ][ 0 ].read_write == False
     assert result[ u"http_url" ] == self.settings[ u"global" ].get( u"luminotes.http_url" )

-    login_note = self.anon_notebook.lookup_note_by_title( u"login" )
+    login_note = self.database.select_one( Note, self.anon_notebook.sql_load_note_by_title( u"login" ) )
     assert result[ u"login_url" ] == u"%s/notebooks/%s?note_id=%s" % (
       self.settings[ u"global" ][ u"luminotes.https_url" ],
       self.anon_notebook.object_id,
@@ -294,72 +330,37 @@ class Test_users( Test_controller ):
     startup_notes = result[ "startup_notes" ]
     if include_startup_notes:
       assert len( startup_notes ) == 1
-      assert startup_notes[ 0 ] == self.startup_note
+      assert startup_notes[ 0 ].object_id == self.startup_note.object_id
+      assert startup_notes[ 0 ].title == self.startup_note.title
+      assert startup_notes[ 0 ].contents == self.startup_note.contents
     else:
       assert startup_notes == []

   def test_current_with_startup_notes_without_login( self ):
     self.test_current_without_login( include_startup_notes = True )

-  def test_calculate_user_storage( self ):
-    size = cherrypy.root.users.calculate_storage( self.user )
-    notebooks = self.user.notebooks
-
-    # expected a sum of the sizes of all of this user's notebooks, notes, and revisions
-    expected_size = \
-      self.database.size( notebooks[ 0 ].object_id ) + \
-      self.database.size( notebooks[ 1 ].object_id )
-
-    assert size == expected_size
-
-  def test_calculate_anon_storage( self ):
-    size = cherrypy.root.users.calculate_storage( self.anonymous )
-
-    expected_size = \
-      self.database.size( self.anon_notebook.object_id ) + \
-      self.database.size( self.anon_notebook.notes[ 0 ].object_id ) + \
-      self.database.size( self.anon_notebook.notes[ 0 ].object_id, self.anon_notebook.notes[ 0 ].revision )
-
-    assert size == expected_size
-
   def test_update_storage( self ):
     previous_revision = self.user.revision

     cherrypy.root.users.update_storage( self.user.object_id )
-    self.scheduler.wait_until_idle()

     expected_size = cherrypy.root.users.calculate_storage( self.user )

-    assert self.user.storage_bytes == expected_size
-    assert self.user.revision > previous_revision
+    user = self.database.load( User, self.user.object_id )
+    assert user.storage_bytes == expected_size
+    assert user.revision > previous_revision

   def test_update_storage_with_unknown_user_id( self ):
     original_revision = self.user.revision

     cherrypy.root.users.update_storage( 77 )
-    self.scheduler.wait_until_idle()

     expected_size = cherrypy.root.users.calculate_storage( self.user )

+    user = self.database.load( User, self.user.object_id )
     assert self.user.storage_bytes == 0
     assert self.user.revision == original_revision

-  def test_update_storage_with_callback( self ):
-    def gen():
-      previous_revision = self.user.revision
-
-      cherrypy.root.users.update_storage( self.user.object_id, self.scheduler.thread )
-      user = ( yield Scheduler.SLEEP )
-
-      expected_size = cherrypy.root.users.calculate_storage( self.user )
-      assert user == self.user
-      assert self.user.storage_bytes == expected_size
-      assert self.user.revision > previous_revision
-
-    g = gen()
-    self.scheduler.add( g )
-    self.scheduler.wait_for( g )
-
   def test_send_reset( self ):
     # trick send_reset() into using a fake SMTP server
     Stub_smtp.reset()
@@ -408,7 +409,7 @@ class Test_users( Test_controller ):

     result = self.http_get( "/users/redeem_reset/%s" % password_reset_id )

-    assert result[ u"notebook_id" ] == self.anonymous.notebooks[ 0 ].object_id
+    assert result[ u"notebook_id" ] == self.anon_notebook.object_id
     assert result[ u"note_id" ]
     assert u"password reset" in result[ u"note_contents" ]
     assert self.user.username in result[ u"note_contents" ]
@@ -434,15 +435,9 @@ class Test_users( Test_controller ):
     assert password_reset_id

     # to trigger expiration, pretend that the password reset was made 25 hours ago
-    def gen():
-      self.database.load( password_reset_id, self.scheduler.thread )
-      password_reset = ( yield Scheduler.SLEEP )
-      password_reset._Persistent__revision = datetime.now() - timedelta( hours = 25 )
-      self.database.save( password_reset )
-
-    g = gen()
-    self.scheduler.add( g )
-    self.scheduler.wait_for( g )
+    password_reset = self.database.load( Password_reset, password_reset_id )
+    password_reset._Persistent__revision = datetime.now( tz = utc ) - timedelta( hours = 25 )
+    self.database.save( password_reset )

     result = self.http_get( "/users/redeem_reset/%s" % password_reset_id )
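The backdating trick above only works cleanly once both timestamps carry a timezone, hence the switch from naive `datetime.now()` to `datetime.now( tz = utc )`. A minimal sketch of the expiry check, using the stdlib `timezone.utc` in place of pytz's `utc`, and with the 24-hour window inferred from the test's 25-hour backdate rather than taken from the Luminotes source:

```python
from datetime import datetime, timedelta, timezone

# stdlib stand-in for pytz.utc; both produce aware datetimes here
utc = timezone.utc

def reset_expired( revision, now = None ):
  # a reset older than 24 hours is treated as expired
  # (window inferred from the 25-hour backdate in the test, not confirmed)
  if now is None:
    now = datetime.now( tz = utc )
  return now - revision > timedelta( hours = 24 )

fresh = datetime.now( tz = utc )
backdated = datetime.now( tz = utc ) - timedelta( hours = 25 )

assert reset_expired( fresh ) == False
assert reset_expired( backdated ) == True
```

Mixing a naive `datetime.now()` with an aware stored revision would instead raise a `TypeError` on subtraction, which is one practical reason for the `tz = utc` change in the diff.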
@@ -461,15 +456,9 @@ class Test_users( Test_controller ):
     password_reset_id = matches.group( 2 )
     assert password_reset_id

-    def gen():
-      self.database.load( password_reset_id, self.scheduler.thread )
-      password_reset = ( yield Scheduler.SLEEP )
-      password_reset.redeemed = True
-      self.database.save( password_reset )
-
-    g = gen()
-    self.scheduler.add( g )
-    self.scheduler.wait_for( g )
+    password_reset = self.database.load( Password_reset, password_reset_id )
+    password_reset.redeemed = True
+    self.database.save( password_reset )

     result = self.http_get( "/users/redeem_reset/%s" % password_reset_id )
@@ -488,15 +477,9 @@ class Test_users( Test_controller ):
     password_reset_id = matches.group( 2 )
     assert password_reset_id

-    def gen():
-      self.database.load( password_reset_id, self.scheduler.thread )
-      password_reset = ( yield Scheduler.SLEEP )
-      password_reset._Password_reset__email_address = u"unknown@example.com"
-      self.database.save( password_reset )
-
-    g = gen()
-    self.scheduler.add( g )
-    self.scheduler.wait_for( g )
+    password_reset = self.database.load( Password_reset, password_reset_id )
+    password_reset._Password_reset__email_address = u"unknown@example.com"
+    self.database.save( password_reset )

     result = self.http_get( "/users/redeem_reset/%s" % password_reset_id )
@@ -525,20 +508,17 @@ class Test_users( Test_controller ):
       ( self.user2.object_id, u"" ),
     ) )

+    assert result[ u"redirect" ]
+
     # check that the password reset is now marked as redeemed
-    def gen():
-      self.database.load( password_reset_id, self.scheduler.thread )
-      password_reset = ( yield Scheduler.SLEEP )
-      assert password_reset.redeemed
-
-    g = gen()
-    self.scheduler.add( g )
-    self.scheduler.wait_for( g )
+    password_reset = self.database.load( Password_reset, password_reset_id )
+    assert password_reset.redeemed

     # check that the password was actually reset for one of the users, but not the other
-    assert self.user.check_password( new_password )
-    assert self.user2.check_password( self.password2 )
-    assert result[ u"redirect" ]
+    user = self.database.load( User, self.user.object_id )
+    assert user.check_password( new_password )
+    user2 = self.database.load( User, self.user2.object_id )
+    assert user2.check_password( self.password2 )

   def test_reset_password_unknown_reset_id( self ):
     new_password = u"newpass"
@@ -552,11 +532,14 @@ class Test_users( Test_controller ):
       ( self.user2.object_id, u"" ),
     ) )

-    # check that neither user's password has changed
-    assert self.user.check_password( self.password )
-    assert self.user2.check_password( self.password2 )
     assert u"expired" in result[ "error" ]
+
+    # check that neither user's password has changed
+    user = self.database.load( User, self.user.object_id )
+    assert user.check_password( self.password )
+    user2 = self.database.load( User, self.user2.object_id )
+    assert user2.check_password( self.password2 )

   def test_reset_password_invalid_reset_id( self ):
     new_password = u"newpass"
     password_reset_id = u"invalid reset id"
@@ -569,11 +552,14 @@ class Test_users( Test_controller ):
       ( self.user2.object_id, u"" ),
     ) )

-    # check that neither user's password has changed
-    assert self.user.check_password( self.password )
-    assert self.user2.check_password( self.password2 )
     assert u"valid" in result[ "error" ]
+
+    # check that neither user's password has changed
+    user = self.database.load( User, self.user.object_id )
+    assert user.check_password( self.password )
+    user2 = self.database.load( User, self.user2.object_id )
+    assert user2.check_password( self.password2 )

   def test_reset_password_expired( self ):
     Stub_smtp.reset()
     smtplib.SMTP = Stub_smtp
@@ -588,15 +574,9 @@ class Test_users( Test_controller ):
     assert password_reset_id

     # to trigger expiration, pretend that the password reset was made 25 hours ago
-    def gen():
-      self.database.load( password_reset_id, self.scheduler.thread )
-      password_reset = ( yield Scheduler.SLEEP )
-      password_reset._Persistent__revision = datetime.now() - timedelta( hours = 25 )
-      self.database.save( password_reset )
-
-    g = gen()
-    self.scheduler.add( g )
-    self.scheduler.wait_for( g )
+    password_reset = self.database.load( Password_reset, password_reset_id )
+    password_reset._Persistent__revision = datetime.now( tz = utc ) - timedelta( hours = 25 )
+    self.database.save( password_reset )

     new_password = u"newpass"
     result = self.http_post( "/users/reset_password", (
@@ -609,86 +589,16 @@ class Test_users( Test_controller ):
     ) )

     # check that the password reset is not marked as redeemed
-    def gen():
-      self.database.load( password_reset_id, self.scheduler.thread )
-      password_reset = ( yield Scheduler.SLEEP )
-      assert password_reset.redeemed == False
-
-    g = gen()
-    self.scheduler.add( g )
-    self.scheduler.wait_for( g )
-
-    # check that neither user's password has changed
-    assert self.user.check_password( self.password )
-    assert self.user2.check_password( self.password2 )
+    password_reset = self.database.load( Password_reset, password_reset_id )
+    assert password_reset.redeemed == False
+
     assert u"expired" in result[ "error" ]
+
+    # check that neither user's password has changed
+    user = self.database.load( User, self.user.object_id )
+    assert user.check_password( self.password )
+    user2 = self.database.load( User, self.user2.object_id )
+    assert user2.check_password( self.password2 )

-  def test_reset_password_expired( self ):
-    Stub_smtp.reset()
-    smtplib.SMTP = Stub_smtp
-
-    self.http_post( "/users/send_reset", dict(
-      email_address = self.user.email_address,
-      send_reset_button = u"email me",
-    ) )
-
-    matches = self.RESET_LINK_PATTERN.search( smtplib.SMTP.message )
-    password_reset_id = matches.group( 2 )
-    assert password_reset_id
-
-    def gen():
-      self.database.load( password_reset_id, self.scheduler.thread )
-      password_reset = ( yield Scheduler.SLEEP )
-      password_reset.redeemed = True
-
-    g = gen()
-    self.scheduler.add( g )
-    self.scheduler.wait_for( g )
-
-    new_password = u"newpass"
-    result = self.http_post( "/users/reset_password", (
-      ( u"password_reset_id", password_reset_id ),
-      ( u"reset_button", u"reset passwords" ),
-      ( self.user.object_id, new_password ),
-      ( self.user.object_id, new_password ),
-      ( self.user2.object_id, u"" ),
-      ( self.user2.object_id, u"" ),
-    ) )
-
-    # check that neither user's password has changed
-    assert self.user.check_password( self.password )
-    assert self.user2.check_password( self.password2 )
-    assert u"already" in result[ "error" ]
-
-  def test_reset_password_unknown_user_id( self ):
-    Stub_smtp.reset()
-    smtplib.SMTP = Stub_smtp
-
-    self.http_post( "/users/send_reset", dict(
-      email_address = self.user.email_address,
-      send_reset_button = u"email me",
-    ) )
-
-    matches = self.RESET_LINK_PATTERN.search( smtplib.SMTP.message )
-    password_reset_id = matches.group( 2 )
-    assert password_reset_id
-
-    new_password = u"newpass"
-    result = self.http_post( "/users/reset_password", (
-      ( u"password_reset_id", password_reset_id ),
-      ( u"reset_button", u"reset passwords" ),
-      ( self.user.object_id, new_password ),
-      ( self.user.object_id, new_password ),
-      ( u"unknown", u"foo" ),
-      ( u"unknown", u"foo" ),
-      ( self.user2.object_id, u"" ),
-      ( self.user2.object_id, u"" ),
|
|
||||||
) )
|
|
||||||
|
|
||||||
# check that neither user's password has changed
|
|
||||||
assert self.user.check_password( self.password )
|
|
||||||
assert self.user2.check_password( self.password2 )
|
|
||||||
assert result[ "error" ]
|
|
||||||
|
|
||||||
def test_reset_password_non_matching( self ):
|
def test_reset_password_non_matching( self ):
|
||||||
Stub_smtp.reset()
|
Stub_smtp.reset()
|
||||||
|
@ -713,10 +623,13 @@ class Test_users( Test_controller ):
|
||||||
( self.user2.object_id, u"" ),
|
( self.user2.object_id, u"" ),
|
||||||
) )
|
) )
|
||||||
|
|
||||||
|
assert u"password" in result[ "error" ]
|
||||||
|
|
||||||
# check that neither user's password has changed
|
# check that neither user's password has changed
|
||||||
assert self.user.check_password( self.password )
|
user = self.database.load( User, self.user.object_id )
|
||||||
assert self.user2.check_password( self.password2 )
|
assert user.check_password( self.password )
|
||||||
assert result[ "error" ]
|
user2 = self.database.load( User, self.user2.object_id )
|
||||||
|
assert user2.check_password( self.password2 )
|
||||||
|
|
||||||
def test_reset_password_blank( self ):
|
def test_reset_password_blank( self ):
|
||||||
Stub_smtp.reset()
|
Stub_smtp.reset()
|
||||||
|
@ -740,10 +653,11 @@ class Test_users( Test_controller ):
|
||||||
( self.user2.object_id, u"" ),
|
( self.user2.object_id, u"" ),
|
||||||
) )
|
) )
|
||||||
|
|
||||||
|
assert result[ "error" ]
|
||||||
|
|
||||||
# check that neither user's password has changed
|
# check that neither user's password has changed
|
||||||
assert self.user.check_password( self.password )
|
assert self.user.check_password( self.password )
|
||||||
assert self.user2.check_password( self.password2 )
|
assert self.user2.check_password( self.password2 )
|
||||||
assert result[ "error" ]
|
|
||||||
|
|
||||||
def test_reset_password_multiple_users( self ):
|
def test_reset_password_multiple_users( self ):
|
||||||
Stub_smtp.reset()
|
Stub_smtp.reset()
|
||||||
|
@ -769,17 +683,14 @@ class Test_users( Test_controller ):
|
||||||
( self.user2.object_id, new_password2 ),
|
( self.user2.object_id, new_password2 ),
|
||||||
) )
|
) )
|
||||||
|
|
||||||
# check that the password reset is now marked as redeemed
|
assert result[ u"redirect" ]
|
||||||
def gen():
|
|
||||||
self.database.load( password_reset_id, self.scheduler.thread )
|
|
||||||
password_reset = ( yield Scheduler.SLEEP )
|
|
||||||
assert password_reset.redeemed
|
|
||||||
|
|
||||||
g = gen()
|
# check that the password reset is now marked as redeemed
|
||||||
self.scheduler.add( g )
|
password_reset = self.database.load( Password_reset, password_reset_id )
|
||||||
self.scheduler.wait_for( g )
|
assert password_reset.redeemed
|
||||||
|
|
||||||
# check that the password was actually reset for both users
|
# check that the password was actually reset for both users
|
||||||
assert self.user.check_password( new_password )
|
user = self.database.load( User, self.user.object_id )
|
||||||
assert self.user2.check_password( new_password2 )
|
assert user.check_password( new_password )
|
||||||
assert result[ u"redirect" ]
|
user2 = self.database.load( User, self.user2.object_id )
|
||||||
|
assert user2.check_password( new_password2 )
|
||||||
|
|
|
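The shape of this Scheduler-to-synchronous conversion can be sketched standalone. The `run()` loop and dict-backed "database" below are toy stand-ins, not the real Scheduler or Database API, and the sketch targets modern Python rather than the 2.4 the project ran on:

```python
# Old style: a test built a generator, handed it to a Scheduler, and the
# scheduler resumed it with each load result -- the "( yield Scheduler.SLEEP )"
# pattern seen above.  New style: a plain synchronous database.load() call.

def run( gen, db ):
  # Toy scheduler loop: feed each requested key's value back into the
  # generator until it returns its final value.
  try:
    request = next( gen )
    while True:
      request = gen.send( db[ request ] )
  except StopIteration as done:
    return done.value

def old_style_check( password_reset_id ):
  password_reset = yield password_reset_id   # resumed by the "scheduler"
  return password_reset[ "redeemed" ]

db = { u"abc": { "redeemed": False } }
assert run( old_style_check( u"abc" ), db ) == False   # old: via scheduler
assert db[ u"abc" ][ "redeemed" ] == False             # new: direct access
```

The tests above collapse each `def gen(): ... yield ...` block plus the three scheduler calls into a single `self.database.load( ... )` line, which is why the hunks shrink so dramatically.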
@@ -1,13 +1,11 @@
 import cherrypy
 from controller.Database import Database
 from controller.Root import Root
-from controller.Scheduler import Scheduler
 from config import Common
 
 
 def main( args ):
-  scheduler = Scheduler()
-  database = Database( scheduler, "data.db" )
+  database = Database()
 
   cherrypy.config.update( Common.settings )
 
@@ -21,12 +19,9 @@ def main( args ):
   cherrypy.config.update( settings )
 
   cherrypy.lowercase_api = True
-  root = Root( scheduler, database, cherrypy.config.configMap )
+  root = Root( database, cherrypy.config.configMap )
   cherrypy.root = root
 
-  if scheduler.shutdown not in cherrypy.server.on_stop_server_list:
-    cherrypy.server.on_stop_server_list.append( scheduler.shutdown )
-
   cherrypy.server.start()
 
@@ -0,0 +1,150 @@
+import re
+from Persistent import Persistent, quote
+from controller.Html_nuker import Html_nuker
+
+
+class Note( Persistent ):
+  """
+  A single textual wiki note.
+  """
+  TITLE_PATTERN = re.compile( u"<h3>(.*?)</h3>", flags = re.IGNORECASE )
+
+  def __init__( self, object_id, revision = None, title = None, contents = None, notebook_id = None,
+                startup = None, deleted_from_id = None, rank = None ):
+    """
+    Create a new note with the given id and contents.
+
+    @type object_id: unicode
+    @param object_id: id of the note
+    @type revision: datetime or NoneType
+    @param revision: revision timestamp of the object (optional, defaults to now)
+    @type title: unicode or NoneType
+    @param title: textual title of the note (optional, defaults to derived from contents)
+    @type contents: unicode or NoneType
+    @param contents: HTML contents of the note (optional)
+    @type notebook_id: unicode or NoneType
+    @param notebook_id: id of notebook containing this note (optional)
+    @type startup: bool or NoneType
+    @param startup: whether this note should be displayed upon startup (optional, defaults to False)
+    @type deleted_from_id: unicode or NoneType
+    @param deleted_from_id: id of the notebook that this note was deleted from (optional)
+    @type rank: float or NoneType
+    @param rank: indicates numeric ordering of this note in relation to other startup notes
+    @rtype: Note
+    @return: newly constructed note
+    """
+    Persistent.__init__( self, object_id, revision )
+    self.__title = title
+    self.__contents = contents
+    self.__notebook_id = notebook_id
+    self.__startup = startup or False
+    self.__deleted_from_id = deleted_from_id
+    self.__rank = rank
+
+  @staticmethod
+  def create( object_id, contents = None, notebook_id = None, startup = None, rank = None ):
+    """
+    Convenience constructor for creating a new note.
+
+    @type object_id: unicode
+    @param object_id: id of the note
+    @type contents: unicode or NoneType
+    @param contents: HTML contents of the note (optional)
+    @type notebook_id: unicode or NoneType
+    @param notebook_id: id of notebook containing this note (optional)
+    @type startup: bool or NoneType
+    @param startup: whether this note should be displayed upon startup (optional, defaults to False)
+    @type rank: float or NoneType
+    @param rank: indicates numeric ordering of this note in relation to other startup notes
+    @rtype: Note
+    @return: newly constructed note
+    """
+    note = Note( object_id, notebook_id = notebook_id, startup = startup, rank = rank )
+    note.contents = contents
+
+    return note
+
+  def __set_contents( self, contents ):
+    self.update_revision()
+    self.__contents = contents
+
+    if contents is None:
+      self.__title = None
+      return
+
+    # parse title out of the beginning of the contents
+    result = Note.TITLE_PATTERN.search( contents )
+
+    if result:
+      self.__title = result.groups()[ 0 ]
+      self.__title = Html_nuker( allow_refs = True ).nuke( self.__title )
+    else:
+      self.__title = None
+
+  def __set_notebook_id( self, notebook_id ):
+    self.__notebook_id = notebook_id
+    self.update_revision()
+
+  def __set_startup( self, startup ):
+    self.__startup = startup
+    self.update_revision()
+
+  def __set_deleted_from_id( self, deleted_from_id ):
+    self.__deleted_from_id = deleted_from_id
+    self.update_revision()
+
+  def __set_rank( self, rank ):
+    self.__rank = rank
+    self.update_revision()
+
+  @staticmethod
+  def sql_load( object_id, revision = None ):
+    if revision:
+      return "select * from note where id = %s and revision = %s;" % ( quote( object_id ), quote( revision ) )
+
+    return "select * from note_current where id = %s;" % quote( object_id )
+
+  @staticmethod
+  def sql_id_exists( object_id, revision = None ):
+    if revision:
+      return "select id from note where id = %s and revision = %s;" % ( quote( object_id ), quote( revision ) )
+
+    return "select id from note_current where id = %s;" % quote( object_id )
+
+  def sql_exists( self ):
+    return Note.sql_id_exists( self.object_id, self.revision )
+
+  def sql_create( self ):
+    rank = self.__rank
+    if rank is None:
+      rank = quote( None )
+
+    return \
+      "insert into note ( id, revision, title, contents, notebook_id, startup, deleted_from_id, rank ) " + \
+      "values ( %s, %s, %s, %s, %s, %s, %s, %s );" % \
+      ( quote( self.object_id ), quote( self.revision ), quote( self.__title ),
+        quote( self.__contents ), quote( self.__notebook_id ), quote( self.__startup and 't' or 'f' ),
+        quote( self.__deleted_from_id ), rank )
+
+  def sql_update( self ):
+    return self.sql_create()
+
+  def sql_load_revisions( self ):
+    return "select revision from note where id = %s order by revision;" % quote( self.object_id )
+
+  def to_dict( self ):
+    d = Persistent.to_dict( self )
+    d.update( dict(
+      contents = self.__contents,
+      title = self.__title,
+      deleted_from_id = self.__deleted_from_id,
+    ) )
+
+    return d
+
+  title = property( lambda self: self.__title )
+  contents = property( lambda self: self.__contents, __set_contents )
+  notebook_id = property( lambda self: self.__notebook_id, __set_notebook_id )
+  startup = property( lambda self: self.__startup, __set_startup )
+  deleted_from_id = property( lambda self: self.__deleted_from_id, __set_deleted_from_id )
+  rank = property( lambda self: self.__rank, __set_rank )
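The title parsing in `Note.__set_contents()` above can be exercised standalone. This sketch keeps the same `TITLE_PATTERN` but omits the `Html_nuker` pass that strips residual markup from the captured title:

```python
import re

# Same pattern as Note.TITLE_PATTERN: the first <h3> heading in the note's
# HTML contents becomes the note's title.
TITLE_PATTERN = re.compile( u"<h3>(.*?)</h3>", re.IGNORECASE )

def parse_title( contents ):
  # Return the text of the first <h3> element, or None if there is none.
  match = TITLE_PATTERN.search( contents )
  return match.group( 1 ) if match else None

assert parse_title( u"<h3>shopping list</h3><p>eggs</p>" ) == u"shopping list"
assert parse_title( u"<p>no heading</p>" ) is None
```

Because the pattern is non-greedy and case-insensitive, `<H3>Title</H3>` matches too, and only the first heading is used.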
@@ -0,0 +1,148 @@
+from copy import copy
+from Note import Note
+from Persistent import Persistent, quote
+
+
+class Notebook( Persistent ):
+  """
+  A collection of wiki notes.
+  """
+  def __init__( self, object_id, revision = None, name = None, trash_id = None, read_write = True ):
+    """
+    Create a new notebook with the given id and name.
+
+    @type object_id: unicode
+    @param object_id: id of the notebook
+    @type revision: datetime or NoneType
+    @param revision: revision timestamp of the object (optional, defaults to now)
+    @type name: unicode or NoneType
+    @param name: name of this notebook (optional)
+    @type trash_id: unicode or NoneType
+    @param trash_id: id of the notebook where deleted notes from this notebook go to die (optional)
+    @type read_write: bool or NoneType
+    @param read_write: whether this view of the notebook is currently read-write (optional, defaults to True)
+    @rtype: Notebook
+    @return: newly constructed notebook
+    """
+    Persistent.__init__( self, object_id, revision )
+    self.__name = name
+    self.__trash_id = trash_id
+    self.__read_write = read_write
+
+  @staticmethod
+  def create( object_id, name = None, trash_id = None, read_write = True ):
+    """
+    Convenience constructor for creating a new notebook.
+
+    @type object_id: unicode
+    @param object_id: id of the notebook
+    @type name: unicode or NoneType
+    @param name: name of this notebook (optional)
+    @type trash_id: unicode or NoneType
+    @param trash_id: id of the notebook where deleted notes from this notebook go to die (optional)
+    @type read_write: bool or NoneType
+    @param read_write: whether this view of the notebook is currently read-write (optional, defaults to True)
+    @rtype: Notebook
+    @return: newly constructed notebook
+    """
+    return Notebook( object_id, name = name, trash_id = trash_id, read_write = read_write )
+
+  @staticmethod
+  def sql_load( object_id, revision = None ):
+    if revision:
+      return "select * from notebook where id = %s and revision = %s;" % ( quote( object_id ), quote( revision ) )
+
+    return "select * from notebook_current where id = %s;" % quote( object_id )
+
+  @staticmethod
+  def sql_id_exists( object_id, revision = None ):
+    if revision:
+      return "select id from notebook where id = %s and revision = %s;" % ( quote( object_id ), quote( revision ) )
+
+    return "select id from notebook_current where id = %s;" % quote( object_id )
+
+  def sql_exists( self ):
+    return Notebook.sql_id_exists( self.object_id, self.revision )
+
+  def sql_create( self ):
+    return \
+      "insert into notebook ( id, revision, name, trash_id ) " + \
+      "values ( %s, %s, %s, %s );" % \
+      ( quote( self.object_id ), quote( self.revision ), quote( self.__name ),
+        quote( self.__trash_id ) )
+
+  def sql_update( self ):
+    return self.sql_create()
+
+  def sql_load_notes( self ):
+    """
+    Return a SQL string to load a list of all the notes within this notebook.
+    """
+    return "select * from note_current where notebook_id = %s order by revision desc;" % quote( self.object_id )
+
+  def sql_load_non_startup_notes( self ):
+    """
+    Return a SQL string to load a list of the non-startup notes within this notebook.
+    """
+    return "select * from note_current where notebook_id = %s and startup = 'f' order by revision desc;" % quote( self.object_id )
+
+  def sql_load_startup_notes( self ):
+    """
+    Return a SQL string to load a list of the startup notes within this notebook.
+    """
+    return "select * from note_current where notebook_id = %s and startup = 't' order by rank;" % quote( self.object_id )
+
+  def sql_load_note_by_id( self, note_id ):
+    """
+    Return a SQL string to load a particular note within this notebook by the note's id.
+
+    @type note_id: unicode
+    @param note_id: id of note to load
+    """
+    return "select * from note_current where notebook_id = %s and id = %s;" % ( quote( self.object_id ), quote( note_id ) )
+
+  def sql_load_note_by_title( self, title ):
+    """
+    Return a SQL string to load a particular note within this notebook by the note's title.
+
+    @type title: unicode
+    @param title: title of note to load
+    """
+    return "select * from note_current where notebook_id = %s and title = %s;" % ( quote( self.object_id ), quote( title ) )
+
+  def sql_search_notes( self, search_text ):
+    """
+    Return a SQL string to search for notes whose contents contain the given search_text. This
+    is a case-insensitive search.
+
+    @type search_text: unicode
+    @param search_text: text to search for within the notes
+    """
+    return \
+      "select * from note_current where notebook_id = %s and contents ilike %s;" % \
+      ( quote( self.object_id ), quote( "%" + search_text + "%" ) )
+
+  def sql_highest_rank( self ):
+    return "select coalesce( max( rank ), -1 ) from note_current where notebook_id = %s;" % quote( self.object_id )
+
+  def to_dict( self ):
+    d = Persistent.to_dict( self )
+
+    d.update( dict(
+      name = self.__name,
+      trash_id = self.__trash_id,
+      read_write = self.__read_write,
+    ) )
+
+    return d
+
+  def __set_name( self, name ):
+    self.__name = name
+    self.update_revision()
+
+  def __set_read_write( self, read_write ):
+    self.__read_write = read_write
+
+  name = property( lambda self: self.__name, __set_name )
+  trash_id = property( lambda self: self.__trash_id )
+  read_write = property( lambda self: self.__read_write, __set_read_write )
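`sql_search_notes()` above leans on PostgreSQL's `ilike` for case-insensitive substring matching. For intuition, the Python equivalent of that predicate (a sketch, not part of the real Notebook class):

```python
# Python equivalent of "contents ilike '%search_text%'": case-insensitive
# substring containment.  Real ilike also treats % and _ in the pattern as
# wildcards, which this simplification ignores.
def matches( contents, search_text ):
  return search_text.lower() in contents.lower()

assert matches( u"<h3>Groceries</h3>buy MILK", u"milk" )
assert not matches( u"<h3>Groceries</h3>", u"milk" )
```

One consequence of wrapping the user's text in `%...%` is that any `%` or `_` characters inside `search_text` act as wildcards in the real SQL version, a subtlety the simplified predicate above does not reproduce.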
@@ -0,0 +1,57 @@
+from Persistent import Persistent, quote
+
+
+class Password_reset( Persistent ):
+  """
+  A request for a password reset.
+  """
+  def __init__( self, object_id, email_address, redeemed = False ):
+    """
+    Create a password reset request with the given id.
+
+    @type object_id: unicode
+    @param object_id: id of the password reset
+    @type email_address: unicode
+    @param email_address: where the reset confirmation was emailed
+    @type redeemed: bool or NoneType
+    @param redeemed: whether this password reset has been redeemed yet (optional, defaults to False)
+    @rtype: Password_reset
+    @return: newly constructed password reset
+    """
+    Persistent.__init__( self, object_id )
+    self.__email_address = email_address
+    self.__redeemed = redeemed
+
+  @staticmethod
+  def sql_load( object_id, revision = None ):
+    # password resets don't track revisions
+    if revision:
+      raise NotImplementedError()
+
+    return "select * from password_reset where id = %s;" % quote( object_id )
+
+  @staticmethod
+  def sql_id_exists( object_id, revision = None ):
+    if revision:
+      raise NotImplementedError()
+
+    return "select id from password_reset where id = %s;" % quote( object_id )
+
+  def sql_exists( self ):
+    return Password_reset.sql_id_exists( self.object_id )
+
+  def sql_create( self ):
+    return "insert into password_reset ( id, email_address, redeemed ) values ( %s, %s, %s );" % \
+      ( quote( self.object_id ), quote( self.__email_address ), quote( self.__redeemed and "t" or "f" ) )
+
+  def sql_update( self ):
+    return "update password_reset set redeemed = %s where id = %s;" % \
+      ( quote( self.__redeemed and "t" or "f" ), quote( self.object_id ) )
+
+  def __set_redeemed( self, redeemed ):
+    if redeemed != self.__redeemed:
+      self.update_revision()
+      self.__redeemed = redeemed
+
+  email_address = property( lambda self: self.__email_address )
+  redeemed = property( lambda self: self.__redeemed, __set_redeemed )
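The `__set_redeemed()` setter above only touches the revision timestamp when the flag actually flips. A minimal standalone sketch of that guard (class name and attribute access simplified from the real `Password_reset`):

```python
from datetime import datetime

class PasswordReset( object ):
  # Simplified stand-in: tracks a redeemed flag and a revision timestamp.
  def __init__( self ):
    self.redeemed = False
    self.revision = datetime.now()

  def set_redeemed( self, redeemed ):
    # Only bump the revision when the value actually changes, so a no-op
    # assignment doesn't look like a fresh row to the versioned schema.
    if redeemed != self.redeemed:
      self.revision = datetime.now()
      self.redeemed = redeemed

r = PasswordReset()
first = r.revision
r.set_redeemed( False )        # no change: revision untouched
assert r.revision == first
r.set_redeemed( True )
assert r.redeemed
```

This matters because `sql_update()` writes the `redeemed` column back, so spurious revision bumps would churn rows for nothing.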
@@ -0,0 +1,82 @@
+from datetime import datetime
+from pytz import utc
+
+
+class Persistent( object ):
+  """
+  A persistent database object with a unique id.
+  """
+  def __init__( self, object_id, revision = None ):
+    self.__object_id = object_id
+    self.__revision = revision
+
+    if not revision:
+      self.update_revision()
+
+  @staticmethod
+  def sql_load( object_id, revision = None ):
+    """
+    Return a SQL string to load an object with the given information from the database. If a
+    revision is not provided, then the most current version of the given object will be loaded.
+
+    @type object_id: unicode
+    @param object_id: id of object to load
+    @type revision: unicode or NoneType
+    @param revision: revision of the object to load (optional)
+    """
+    raise NotImplementedError()
+
+  @staticmethod
+  def sql_id_exists( object_id, revision = None ):
+    """
+    Return a SQL string to determine whether the given object is present in the database. If a
+    revision is not provided, then the most current version of the given object will be used.
+
+    @type object_id: unicode
+    @param object_id: id of object to check for existence
+    @type revision: unicode or NoneType
+    @param revision: revision of the object to check (optional)
+    """
+    raise NotImplementedError()
+
+  def sql_exists( self ):
+    """
+    Return a SQL string to determine whether the current revision of this object is present in the
+    database.
+    """
+    raise NotImplementedError()
+
+  def sql_create( self ):
+    """
+    Return a SQL string to save this object to the database for the first time. This should be in
+    the form of a SQL insert.
+    """
+    raise NotImplementedError()
+
+  def sql_update( self ):
+    """
+    Return a SQL string to save an updated revision of this object to the database. Note that,
+    because of the retention of old row revisions in the database, this SQL string will usually
+    be in the form of an insert rather than an update to an existing row.
+    """
+    raise NotImplementedError()
+
+  def to_dict( self ):
+    return dict(
+      object_id = self.__object_id,
+      revision = self.__revision,
+    )
+
+  def update_revision( self ):
+    self.__revision = datetime.now( tz = utc )
+
+  object_id = property( lambda self: self.__object_id )
+  revision = property( lambda self: self.__revision )
+
+
+def quote( value ):
+  if value is None:
+    return "null"
+
+  value = unicode( value )
+  return "'%s'" % value.replace( "'", "''" ).replace( "\\", "\\\\" )
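The module-level `quote()` helper above is what every `sql_*` method relies on for escaping literals. Reproduced standalone (with Python 2's `unicode()` swapped for `str()` so it runs today) to show its behavior:

```python
# quote() maps None to SQL null and wraps everything else in single quotes,
# doubling embedded quotes and backslashes.
def quote( value ):
  if value is None:
    return "null"

  value = str( value )
  return "'%s'" % value.replace( "'", "''" ).replace( "\\", "\\\\" )

assert quote( None ) == "null"
assert quote( u"it's" ) == "'it''s'"
assert quote( 5 ) == "'5'"
```

Building SQL by string interpolation like this puts the whole injection burden on `quote()`; the parameterized-query support in the database driver is the safer long-term path, which is worth keeping in mind as the r420 log entry about "funtional tests that hit the database" suggests.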
@ -0,0 +1,212 @@
|
||||||
|
import sha
|
||||||
|
import random
|
||||||
|
from copy import copy
|
||||||
|
from Persistent import Persistent, quote
|
||||||
|
|
||||||
|
|
||||||
|
class User( Persistent ):
|
||||||
|
"""
|
||||||
|
A Luminotes user.
|
||||||
|
"""
|
||||||
|
SALT_CHARS = [ chr( c ) for c in range( ord( "!" ), ord( "~" ) + 1 ) ]
|
||||||
|
SALT_SIZE = 12
|
||||||
|
|
||||||
|
def __init__( self, object_id, revision = None, username = None, salt = None, password_hash = None,
|
||||||
|
email_address = None, storage_bytes = None, rate_plan = None ):
|
||||||
|
"""
|
||||||
|
Create a new user with the given credentials and information.
|
||||||
|
|
||||||
|
@type object_id: unicode
|
||||||
|
@param object_id: id of the user
|
||||||
|
@type revision: datetime or NoneType
|
||||||
|
@param revision: revision timestamp of the object (optional, defaults to now)
|
||||||
|
@type username: unicode or NoneType
|
||||||
|
@param username: unique user identifier for login purposes (optional)
|
||||||
|
@type salt: unicode or NoneType
|
||||||
|
@param salt: salt to use when hashing the password (optional, defaults to random)
|
||||||
|
@type password_hash: unicode or NoneType
|
||||||
|
@param password_hash: cryptographic hash of secret password for login purposes (optional)
|
||||||
|
@type email_address: unicode or NoneType
|
||||||
|
@param email_address: a hopefully valid email address (optional)
|
||||||
|
@type storage_bytes: int or NoneType
|
||||||
|
@param storage_bytes: count of bytes that the user is currently using for storage (optional)
|
||||||
|
@type rate_plan: int or NoneType
|
||||||
|
@param rate_plan: index into the rate plan array in config/Common.py (optional, defaults to 0)
|
||||||
|
@rtype: User
|
||||||
|
@return: newly created user
|
||||||
|
"""
|
||||||
|
Persistent.__init__( self, object_id, revision )
|
||||||
|
self.__username = username
|
||||||
|
self.__salt = salt
|
||||||
|
self.__password_hash = password_hash
|
||||||
|
self.__email_address = email_address
|
||||||
|
self.__storage_bytes = storage_bytes or 0
|
||||||
|
self.__rate_plan = rate_plan or 0
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def create( object_id, username = None, password = None, email_address = None ):
|
||||||
|
"""
|
||||||
|
Convenience constructor for creating a new user.
|
||||||
|
|
||||||
|
@type object_id: unicode
|
||||||
|
@param object_id: id of the user
|
||||||
|
@type username: unicode or NoneType
|
||||||
|
@param username: unique user identifier for login purposes (optional)
|
||||||
|
@type password: unicode or NoneType
|
||||||
|
@param password: secret password for login purposes (optional)
|
||||||
|
@type email_address: unicode or NoneType
|
||||||
|
@param email_address: a hopefully valid email address (optional)
|
||||||
|
@rtype: User
|
||||||
|
@return: newly created user
|
||||||
|
"""
|
||||||
|
salt = User.__create_salt()
|
||||||
|
password_hash = User.__hash_password( salt, password )
|
||||||
|
|
||||||
|
return User( object_id, None, username, salt, password_hash, email_address )
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def __create_salt():
|
||||||
|
return "".join( [ random.choice( User.SALT_CHARS ) for i in range( User.SALT_SIZE ) ] )
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def __hash_password( salt, password ):
|
||||||
|
if password is None or len( password ) == 0:
|
||||||
|
return None
|
||||||
|
|
||||||
|
return sha.new( salt + password ).hexdigest()
|
||||||
|
|
||||||
|
def check_password( self, password ):
|
||||||
|
"""
|
||||||
|
Check that the given password matches this user's password.
|
||||||
|
|
||||||
|
@type password: unicode
|
||||||
|
@param password: password to check
|
||||||
|
@rtype: bool
|
||||||
|
@return: True if the password matches
|
||||||
|
"""
|
||||||
|
if self.__password_hash == None:
|
||||||
|
return False
|
||||||
|
|
||||||
|
hash = User.__hash_password( self.__salt, password )
|
||||||
|
if hash == self.__password_hash:
|
||||||
|
return True
|
||||||
|
|
||||||
|
return False
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def sql_load( object_id, revision = None ):
|
||||||
|
if revision:
|
||||||
|
return "select * from luminotes_user where id = %s and revision = %s;" % ( quote( object_id ), quote( revision ) )
|
||||||
|
|
||||||
|
return "select * from luminotes_user_current where id = %s;" % quote( object_id )
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def sql_id_exists( object_id, revision = None ):
|
||||||
|
if revision:
|
||||||
|
return "select id from luminotes_user where id = %s and revision = %s;" % ( quote( object_id ), quote( revision ) )
|
||||||
|
|
||||||
|
return "select id from luminotes_user_current where id = %s;" % quote( object_id )
|
||||||
|
|
||||||
|
def sql_exists( self ):
|
||||||
|
return User.sql_id_exists( self.object_id, self.revision )
|
||||||
|
|
||||||
|
def sql_create( self ):
|
||||||
|
return \
|
||||||
|
"insert into luminotes_user ( id, revision, username, salt, password_hash, email_address, storage_bytes, rate_plan ) " + \
|
||||||
|
"values ( %s, %s, %s, %s, %s, %s, %s, %s );" % \
|
||||||
|
( quote( self.object_id ), quote( self.revision ), quote( self.__username ),
|
||||||
|
quote( self.__salt ), quote( self.__password_hash ), quote( self.__email_address ),
|
||||||
|
self.__storage_bytes, self.__rate_plan )
|
||||||
|
|
||||||
|
def sql_update( self ):
|
||||||
|
return self.sql_create()
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def sql_load_by_username( username ):
|
||||||
|
return "select * from luminotes_user_current where username = %s;" % quote( username )
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def sql_load_by_email_address( email_address ):
|
||||||
|
return "select * from luminotes_user_current where username = %s;" % quote( email_address )
|
||||||
|
|
||||||
|
def sql_load_notebooks( self, parents_only = False ):
|
||||||
|
"""
|
||||||
|
Return a SQL string to load a list of the notebooks to which this user has access.
|
||||||
|
"""
|
||||||
|
if parents_only:
|
||||||
|
parents_only_clause = " and trash_id is not null";
|
||||||
|
else:
|
||||||
|
parents_only_clause = ""
|
||||||
|
|
||||||
|
return \
|
||||||
|
"select notebook_current.*, user_notebook.read_write from user_notebook, notebook_current " + \
|
||||||
|
"where user_id = %s%s and user_notebook.notebook_id = notebook_current.id order by revision;" % \
|
||||||
|
( quote( self.object_id ), parents_only_clause )

  def sql_save_notebook( self, notebook_id, read_write = True ):
    """
    Return a SQL string to save the id of a notebook to which this user has access.
    """
    return \
      "insert into user_notebook ( user_id, notebook_id, read_write ) values " + \
      "( %s, %s, %s );" % ( quote( self.object_id ), quote( notebook_id ), quote( read_write and 't' or 'f' ) )

  def sql_has_access( self, notebook_id, read_write = False ):
    """
    Return a SQL string to determine whether this user has access to the given notebook.
    """
    if read_write is True:
      return \
        "select user_id from user_notebook where user_id = %s and notebook_id = %s and read_write = 't';" % \
        ( quote( self.object_id ), quote( notebook_id ) )
    else:
      return \
        "select user_id from user_notebook where user_id = %s and notebook_id = %s;" % \
        ( quote( self.object_id ), quote( notebook_id ) )

  def sql_calculate_storage( self ):
    """
    Return a SQL string to calculate the total bytes of storage usage by this user. Note that this
    only includes storage for all the user's notes and past revisions. It doesn't include storage
    for the notebooks themselves.
    """
    return \
      """
      select
        coalesce( sum( pg_column_size( note.* ) ), 0 )
      from
        luminotes_user_current, user_notebook, note
      where
        luminotes_user_current.id = %s and
        user_notebook.user_id = luminotes_user_current.id and
        note.notebook_id = user_notebook.notebook_id;
      """ % quote( self.object_id )

  def to_dict( self ):
    d = Persistent.to_dict( self )

    d.update( dict(
      username = self.username,
      storage_bytes = self.__storage_bytes,
      rate_plan = self.__rate_plan,
    ) )

    return d

  def __set_password( self, password ):
    self.update_revision()
    self.__salt = User.__create_salt()
    self.__password_hash = User.__hash_password( self.__salt, password )

  def __set_storage_bytes( self, storage_bytes ):
    self.update_revision()
    self.__storage_bytes = storage_bytes

  def __set_rate_plan( self, rate_plan ):
    self.update_revision()
    self.__rate_plan = rate_plan

  username = property( lambda self: self.__username )
  email_address = property( lambda self: self.__email_address )
  password = property( None, __set_password )
  storage_bytes = property( lambda self: self.__storage_bytes, __set_storage_bytes )
  rate_plan = property( lambda self: self.__rate_plan, __set_rate_plan )
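The generated statements above are plain SQL strings with values quoted inline rather than passed as bound parameters. As a rough illustration (using a hypothetical stand-in for the quote() helper from new_model.Persistent, and a module-level function rather than the real method), sql_save_notebook() produces output like this:

```python
def quote( value ):
  # Hypothetical stand-in for new_model.Persistent.quote(): None becomes a SQL
  # null, and backslashes and apostrophes are doubled inside the literal.
  if value is None:
    return "null"
  value = str( value ).replace( "\\", "\\\\" ).replace( "'", "''" )
  return "'%s'" % value

def sql_save_notebook( user_id, notebook_id, read_write = True ):
  # Module-level mirror of User.sql_save_notebook(): booleans map to 't'/'f'.
  return \
    "insert into user_notebook ( user_id, notebook_id, read_write ) values " + \
    "( %s, %s, %s );" % ( quote( user_id ), quote( notebook_id ), quote( read_write and "t" or "f" ) )

print( sql_save_notebook( "17", "18" ) )
# insert into user_notebook ( user_id, notebook_id, read_write ) values ( '17', '18', 't' );
```

This also makes clear why quote() must double apostrophes and backslashes: the values are embedded directly in the statement text.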

@@ -0,0 +1,8 @@
DROP VIEW luminotes_user_current;
DROP TABLE luminotes_user;
DROP VIEW note_current;
DROP TABLE note;
DROP VIEW notebook_current;
DROP TABLE notebook;
DROP TABLE password_reset;
DROP TABLE user_notebook;

@@ -6,13 +6,6 @@ SET client_encoding = 'UTF8';
 SET check_function_bodies = false;
 SET client_min_messages = warning;
 
-
---
--- Name: SCHEMA public; Type: COMMENT; Schema: -; Owner: postgres
---
-
-COMMENT ON SCHEMA public IS 'Standard public schema';
-
 SET search_path = public, pg_catalog;
 
 SET default_tablespace = '';

@@ -0,0 +1,133 @@
from pytz import utc
from datetime import datetime, timedelta
from new_model.Note import Note


class Test_note( object ):
  def setUp( self ):
    self.object_id = u"17"
    self.title = u"title goes here"
    self.contents = u"<h3>%s</h3>blah" % self.title
    self.notebook_id = u"18"
    self.startup = False
    self.rank = 17.5
    self.delta = timedelta( seconds = 1 )

    self.note = Note.create( self.object_id, self.contents, self.notebook_id, self.startup, self.rank )

  def test_create( self ):
    assert self.note.object_id == self.object_id
    assert datetime.now( tz = utc ) - self.note.revision < self.delta
    assert self.note.contents == self.contents
    assert self.note.title == self.title
    assert self.note.notebook_id == self.notebook_id
    assert self.note.startup == self.startup
    assert self.note.deleted_from_id == None
    assert self.note.rank == self.rank

  def test_set_contents( self ):
    new_title = u"new title"
    new_contents = u"<h3>%s</h3>new blah" % new_title
    previous_revision = self.note.revision

    self.note.contents = new_contents

    assert self.note.revision > previous_revision
    assert self.note.contents == new_contents
    assert self.note.title == new_title
    assert self.note.notebook_id == self.notebook_id
    assert self.note.startup == self.startup
    assert self.note.deleted_from_id == None
    assert self.note.rank == self.rank

  def test_set_contents_with_html_title( self ):
    new_title = u"new title"
    new_contents = u"<h3>new<br /> title</h3>new blah"
    previous_revision = self.note.revision

    self.note.contents = new_contents

    # html should be stripped out of the title
    assert self.note.revision > previous_revision
    assert self.note.contents == new_contents
    assert self.note.title == new_title
    assert self.note.notebook_id == self.notebook_id
    assert self.note.startup == self.startup
    assert self.note.deleted_from_id == None
    assert self.note.rank == self.rank

  def test_set_contents_with_multiple_titles( self ):
    new_title = u"new title"
    new_contents = u"<h3>new<br /> title</h3>new blah<h3>other title</h3>hmm"
    previous_revision = self.note.revision

    self.note.contents = new_contents

    # should only use the first title
    assert self.note.revision > previous_revision
    assert self.note.contents == new_contents
    assert self.note.title == new_title
    assert self.note.notebook_id == self.notebook_id
    assert self.note.startup == self.startup
    assert self.note.deleted_from_id == None
    assert self.note.rank == self.rank

  def test_set_notebook_id( self ):
    previous_revision = self.note.revision
    self.note.notebook_id = u"54"

    assert self.note.revision > previous_revision
    assert self.note.notebook_id == u"54"

  def test_set_startup( self ):
    previous_revision = self.note.revision
    self.note.startup = True

    assert self.note.revision > previous_revision
    assert self.note.startup == True

  def test_set_deleted_from_id( self ):
    previous_revision = self.note.revision
    self.note.deleted_from_id = u"55"

    assert self.note.revision > previous_revision
    assert self.note.deleted_from_id == u"55"

  def test_set_rank( self ):
    previous_revision = self.note.revision
    self.note.rank = 5

    assert self.note.revision > previous_revision
    assert self.note.rank == 5

  def test_to_dict( self ):
    d = self.note.to_dict()

    assert d.get( "object_id" ) == self.note.object_id
    assert datetime.now( tz = utc ) - d.get( "revision" ) < self.delta
    assert d.get( "contents" ) == self.contents
    assert d.get( "title" ) == self.title
    assert d.get( "deleted_from_id" ) == None


class Test_note_blank( Test_note ):
  def setUp( self ):
    self.object_id = u"17"
    self.title = None
    self.contents = None
    self.notebook_id = None
    self.startup = False
    self.rank = None
    self.delta = timedelta( seconds = 1 )

    self.note = Note.create( self.object_id )

  def test_create( self ):
    assert self.note.object_id == self.object_id
    assert datetime.now( tz = utc ) - self.note.revision < self.delta
    assert self.note.contents == None
    assert self.note.title == None
    assert self.note.notebook_id == None
    assert self.note.startup == False
    assert self.note.deleted_from_id == None
    assert self.note.rank == None
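The tests above pin down how a note's title is derived from its contents: the first &lt;h3&gt; element supplies the title, inner HTML is stripped out of it, and only the first heading counts. A minimal sketch of that behavior (hypothetical; the actual new_model.Note implementation may differ):

```python
import re

def extract_title( contents ):
  # Hypothetical sketch: the title is the first <h3>...</h3> block in the
  # note's contents, with any HTML inside it stripped and whitespace collapsed;
  # notes without an <h3> heading have no title.
  if contents is None:
    return None
  match = re.search( r"<h3>(.*?)</h3>", contents, re.DOTALL )
  if match is None:
    return None
  title = re.sub( r"<[^>]+>", " ", match.group( 1 ) )
  return " ".join( title.split() )

print( extract_title( u"<h3>new<br /> title</h3>new blah<h3>other title</h3>hmm" ) )
# new title
```

The non-greedy `.*?` is what makes only the first heading count, matching test_set_contents_with_multiple_titles.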

@@ -0,0 +1,54 @@
from pytz import utc
from datetime import datetime, timedelta
from new_model.Notebook import Notebook
from new_model.Note import Note


class Test_notebook( object ):
  def setUp( self ):
    self.object_id = "17"
    self.trash_id = "18"
    self.name = u"my notebook"
    self.trash_name = u"trash"
    self.delta = timedelta( seconds = 1 )

    self.trash = Notebook.create( self.trash_id, self.trash_name, read_write = False )
    self.notebook = Notebook.create( self.object_id, self.name, trash_id = self.trash.object_id )
    self.note = Note.create( "19", u"<h3>title</h3>blah" )

  def test_create( self ):
    assert self.notebook.object_id == self.object_id
    assert datetime.now( tz = utc ) - self.notebook.revision < self.delta
    assert self.notebook.name == self.name
    assert self.notebook.read_write == True
    assert self.notebook.trash_id == self.trash_id

    assert self.trash.object_id == self.trash_id
    assert datetime.now( tz = utc ) - self.trash.revision < self.delta
    assert self.trash.name == self.trash_name
    assert self.trash.read_write == False
    assert self.trash.trash_id == None

  def test_set_name( self ):
    new_name = u"my new notebook"
    previous_revision = self.notebook.revision
    self.notebook.name = new_name

    assert self.notebook.name == new_name
    assert self.notebook.revision > previous_revision

  def test_set_read_write( self ):
    original_revision = self.notebook.revision
    self.notebook.read_write = True

    assert self.notebook.read_write == True
    assert self.notebook.revision == original_revision

  def test_to_dict( self ):
    d = self.notebook.to_dict()

    assert d.get( "name" ) == self.name
    assert d.get( "trash_id" ) == self.trash.object_id
    assert d.get( "read_write" ) == True
    assert d.get( "object_id" ) == self.notebook.object_id
    assert datetime.now( tz = utc ) - d.get( "revision" ) < self.delta

@@ -0,0 +1,38 @@
from model.User import User
from new_model.Password_reset import Password_reset


class Test_password_reset( object ):
  def setUp( self ):
    self.object_id = u"17"
    self.email_address = u"bob@example.com"

    self.password_reset = Password_reset( self.object_id, self.email_address )

  def test_create( self ):
    assert self.password_reset.object_id == self.object_id
    assert self.password_reset.email_address == self.email_address
    assert self.password_reset.redeemed == False

  def test_redeem( self ):
    previous_revision = self.password_reset.revision
    self.password_reset.redeemed = True

    assert self.password_reset.redeemed == True
    assert self.password_reset.revision > previous_revision

  def test_redeem_twice( self ):
    self.password_reset.redeemed = True
    current_revision = self.password_reset.revision
    self.password_reset.redeemed = True

    assert self.password_reset.redeemed == True
    assert self.password_reset.revision == current_revision

  def test_unredeem( self ):
    self.password_reset.redeemed = True
    previous_revision = self.password_reset.revision
    self.password_reset.redeemed = False

    assert self.password_reset.redeemed == False
    assert self.password_reset.revision > previous_revision

@@ -0,0 +1,73 @@
from pytz import utc
from datetime import datetime, timedelta
from new_model.Persistent import Persistent, quote


class Test_persistent( object ):
  def setUp( self ):
    self.object_id = "17"
    self.obj = Persistent( self.object_id )
    self.delta = timedelta( seconds = 1 )

  def test_create( self ):
    assert self.obj.object_id == self.object_id
    assert datetime.now( tz = utc ) - self.obj.revision < self.delta

  def test_update_revision( self ):
    previous_revision = self.obj.revision
    self.obj.update_revision()
    assert self.obj.revision > previous_revision
    assert datetime.now( tz = utc ) - self.obj.revision < self.delta

    previous_revision = self.obj.revision
    self.obj.update_revision()
    assert self.obj.revision > previous_revision
    assert datetime.now( tz = utc ) - self.obj.revision < self.delta

  def test_to_dict( self ):
    d = self.obj.to_dict()

    assert d.get( "object_id" ) == self.object_id
    assert d.get( "revision" ) == self.obj.revision


class Test_persistent_with_revision( object ):
  def setUp( self ):
    self.object_id = "17"
    self.revision = datetime.now( tz = utc ) - timedelta( hours = 24 )
    self.obj = Persistent( self.object_id, self.revision )
    self.delta = timedelta( seconds = 1 )

  def test_create( self ):
    assert self.obj.object_id == self.object_id
    assert self.revision - self.obj.revision < self.delta

  def test_update_revision( self ):
    previous_revision = self.obj.revision
    self.obj.update_revision()
    assert self.obj.revision > previous_revision
    assert datetime.now( tz = utc ) - self.obj.revision < self.delta

    previous_revision = self.obj.revision
    self.obj.update_revision()
    assert self.obj.revision > previous_revision
    assert datetime.now( tz = utc ) - self.obj.revision < self.delta

  def test_to_dict( self ):
    d = self.obj.to_dict()

    assert d.get( "object_id" ) == self.object_id
    assert d.get( "revision" ) == self.obj.revision


def test_quote():
  assert "'foo'" == quote( "foo" )

def test_quote_apostrophe():
  assert "'it''s'" == quote( "it's" )

def test_quote_backslash():
  assert r"'c:\\\\whee'" == quote( r"c:\\whee" )

def test_quote_none():
  assert "null" == quote( None )

@@ -0,0 +1,69 @@
from pytz import utc
from datetime import datetime, timedelta
from new_model.User import User


class Test_user( object ):
  def setUp( self ):
    self.object_id = u"17"
    self.username = u"bob"
    self.password = u"foobar"
    self.email_address = u"bob@example.com"
    self.delta = timedelta( seconds = 1 )

    self.user = User.create( self.object_id, self.username, self.password, self.email_address )

  def test_create( self ):
    assert self.user.object_id == self.object_id
    assert datetime.now( tz = utc ) - self.user.revision < self.delta
    assert self.user.username == self.username
    assert self.user.email_address == self.email_address
    assert self.user.storage_bytes == 0
    assert self.user.rate_plan == 0

  def test_check_correct_password( self ):
    assert self.user.check_password( self.password ) == True

  def test_check_incorrect_password( self ):
    assert self.user.check_password( u"wrong" ) == False

  def test_set_password( self ):
    previous_revision = self.user.revision
    new_password = u"newpass"
    self.user.password = new_password

    assert self.user.check_password( self.password ) == False
    assert self.user.check_password( new_password ) == True
    assert self.user.revision > previous_revision

  def test_set_none_password( self ):
    previous_revision = self.user.revision
    new_password = None
    self.user.password = new_password

    assert self.user.check_password( self.password ) == False
    assert self.user.check_password( new_password ) == False
    assert self.user.revision > previous_revision

  def test_set_storage_bytes( self ):
    previous_revision = self.user.revision
    storage_bytes = 44
    self.user.storage_bytes = storage_bytes

    assert self.user.storage_bytes == storage_bytes
    assert self.user.revision > previous_revision

  def test_set_rate_plan( self ):
    previous_revision = self.user.revision
    rate_plan = 2
    self.user.rate_plan = rate_plan

    assert self.user.rate_plan == rate_plan
    assert self.user.revision > previous_revision

  def test_to_dict( self ):
    d = self.user.to_dict()

    assert d.get( "username" ) == self.username
    assert d.get( "storage_bytes" ) == self.user.storage_bytes
    assert d.get( "rate_plan" ) == self.user.rate_plan
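The password tests above imply a salted-hash scheme: setting a password generates a fresh salt, the stored hash covers salt plus password, and a None password can never be checked successfully. A minimal sketch under those assumptions (the hash function, salt handling, and class shape here are illustrative, not the actual new_model.User code):

```python
import hashlib

class User_sketch( object ):
  # Hypothetical sketch of the salted-hash scheme the tests imply; the real
  # new_model.User code may use a different hash function and salt format.
  def __init__( self, password ):
    self.__salt = u"somesalt"  # the real code would generate a fresh random salt
    self.__password_hash = self.__hash( password )

  def __hash( self, password ):
    # hash the salt concatenated with the password; None means "no usable password"
    if password is None:
      return None
    return hashlib.sha1( ( self.__salt + password ).encode( "utf8" ) ).hexdigest()

  def check_password( self, password ):
    # a None candidate or a None stored hash can never match
    if password is None or self.__password_hash is None:
      return False
    return self.__hash( password ) == self.__password_hash
```

This matches test_set_none_password: once the password is set to None, no candidate password (including None) checks out.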

@@ -1,9 +1,10 @@
-function Editor( id, notebook_id, note_text, deleted_from, revisions_list, read_write, startup, highlight, focus ) {
+function Editor( id, notebook_id, note_text, deleted_from_id, revision, read_write, startup, highlight, focus ) {
   this.id = id;
   this.notebook_id = notebook_id;
   this.initial_text = note_text;
-  this.deleted_from = deleted_from || null;
-  this.revisions_list = revisions_list || new Array();
+  this.deleted_from_id = deleted_from_id || null;
+  this.revision = revision;
+  this.revisions_list = new Array(); // cache for this note's list of revisions, loaded from the server on-demand
   this.read_write = read_write;
   this.startup = startup || false; // whether this Editor is for a startup note
   this.init_highlight = highlight || false;

@@ -31,7 +32,7 @@ function Editor( id, notebook_id, note_text, deleted_from, revisions_list, read_
     "type": "button",
     "class": "note_button",
     "id": "delete_" + iframe_id,
-    "value": "delete" + ( this.deleted_from ? " forever" : "" ),
+    "value": "delete" + ( this.deleted_from_id ? " forever" : "" ),
     "title": "delete note [ctrl-d]"
   } );
   connect( this.delete_button, "onclick", function ( event ) { signal( self, "delete_clicked", event ); } );

@@ -45,7 +46,7 @@ function Editor( id, notebook_id, note_text, deleted_from, revisions_list, read_
   } );
   connect( this.changes_button, "onclick", function ( event ) { signal( self, "changes_clicked", event ); } );
 
-  if ( this.deleted_from ) {
+  if ( this.deleted_from_id ) {
     this.undelete_button = createDOM( "input", {
       "type": "button",
       "class": "note_button",

@@ -66,7 +67,7 @@ function Editor( id, notebook_id, note_text, deleted_from, revisions_list, read_
     }
   }
 
-  if ( !this.deleted_from && ( read_write || !startup ) ) {
+  if ( !this.deleted_from_id && ( read_write || !startup ) ) {
     this.hide_button = createDOM( "input", {
       "type": "button",
       "class": "note_button",

@@ -72,7 +72,7 @@ Wiki.prototype.display_user = function ( result ) {
   for ( var i in result.notebooks ) {
     var notebook = result.notebooks[ i ];
 
-    if ( notebook.name == "Luminotes" )
+    if ( notebook.name == "Luminotes" || notebook.name == "trash" )
       continue;
 
     var div_class = "link_area_item";

@@ -162,18 +162,20 @@ Wiki.prototype.populate = function ( result ) {
       createDOM( "a", { "href": location.href, "id": "all_notes_link", "title": "View a list of all notes in this notebook." }, "all notes" )
     ) );
   }
-  appendChildNodes( span, createDOM( "div", { "class": "link_area_item" },
-    createDOM( "a", { "href": "/notebooks/download_html/" + this.notebook.object_id, "id": "download_html_link", "title": "Download a stand-alone copy of the entire wiki notebook." }, "download as html" )
-  ) );
+  if ( this.notebook.name != "Luminotes" ) {
+    appendChildNodes( span, createDOM( "div", { "class": "link_area_item" },
+      createDOM( "a", { "href": "/notebooks/download_html/" + this.notebook.object_id, "id": "download_html_link", "title": "Download a stand-alone copy of the entire wiki notebook." }, "download as html" )
+    ) );
+  }
 
   if ( this.notebook.read_write ) {
     this.read_write = true;
     removeElementClass( "toolbar", "undisplayed" );
 
-    if ( this.notebook.trash ) {
+    if ( this.notebook.trash_id ) {
       appendChildNodes( span, createDOM( "div", { "class": "link_area_item" },
         createDOM( "a", {
-          "href": "/notebooks/" + this.notebook.trash.object_id + "?parent_id=" + this.notebook.object_id,
+          "href": "/notebooks/" + this.notebook.trash_id + "?parent_id=" + this.notebook.object_id,
           "id": "trash_link",
           "title": "Look here for notes you've deleted."
         }, "trash" )

@@ -221,9 +223,11 @@ Wiki.prototype.populate = function ( result ) {
     } );
   }
 
-  connect( "download_html_link", "onclick", function ( event ) {
-    self.save_editor( null, true );
-  } );
+  if ( this.notebook.name != "Luminotes" ) {
+    connect( "download_html_link", "onclick", function ( event ) {
+      self.save_editor( null, true );
+    } );
+  }
 
   // create an editor for each startup note in the received notebook, focusing the first one
   var focus = true;

@@ -234,7 +238,7 @@ Wiki.prototype.populate = function ( result ) {
 
     // don't actually create an editor if a particular note was provided in the result
     if ( !result.note ) {
-      var editor = this.create_editor( note.object_id, note.contents, note.deleted_from, note.revisions_list, undefined, this.read_write, false, focus );
+      var editor = this.create_editor( note.object_id, note.contents, note.deleted_from_id, note.revision, this.read_write, false, focus );
       this.open_editors[ note.title ] = editor;
       focus = false;
     }

@@ -242,14 +246,16 @@ Wiki.prototype.populate = function ( result ) {
 
   // if one particular note was provided, then just display an editor for that note
   var read_write = this.read_write;
-  if ( getElement( "revision" ).value ) read_write = false;
+  var revision_element = getElement( "revision" );
+  if ( revision_element && revision_element.value ) read_write = false;
 
   if ( result.note )
     this.create_editor(
       result.note.object_id,
       result.note.contents || getElement( "note_contents" ).value,
-      result.note.deleted_from,
-      result.note.revisions_list,
-      undefined, read_write, false, true
+      result.note.deleted_from_id,
+      result.note.revision,
+      read_write, false, true
     );
 
   if ( result.startup_notes.length == 0 && !result.note )

@@ -282,7 +288,7 @@ Wiki.prototype.create_blank_editor = function ( event ) {
     }
   }
 
-  var editor = this.create_editor( undefined, undefined, undefined, undefined, undefined, this.read_write, true, true );
+  var editor = this.create_editor( undefined, undefined, undefined, undefined, this.read_write, true, true );
   this.blank_editor_id = editor.id;
 }
 

@@ -448,26 +454,27 @@ Wiki.prototype.resolve_link = function ( note_title, link, callback ) {
   );
 }
 
-Wiki.prototype.parse_loaded_editor = function ( result, note_title, revision, link ) {
+Wiki.prototype.parse_loaded_editor = function ( result, note_title, requested_revision, link ) {
   if ( result.note ) {
     var id = result.note.object_id;
-    if ( revision ) id += " " + revision;
+    if ( requested_revision )
+      id += " " + requested_revision;
+    var actual_revision = result.note.revision;
     var note_text = result.note.contents;
-    var deleted_from = result.note.deleted;
-    var revisions_list = result.note.revisions_list;
+    var deleted_from_id = result.note.deleted;
   } else {
     var id = null;
     var note_text = "<h3>" + note_title;
-    var deleted_from = null;
-    var revisions_list = new Array();
+    var deleted_from_id = null;
+    var actual_revision = null;
   }
 
-  if ( revision )
+  if ( requested_revision )
     var read_write = false; // show previous revisions as read-only
   else
     var read_write = this.read_write;
 
-  var editor = this.create_editor( id, note_text, deleted_from, revisions_list, note_title, read_write, true, false );
+  var editor = this.create_editor( id, note_text, deleted_from_id, actual_revision, read_write, true, false );
   id = editor.id;
 
   // if a link that launched this editor was provided, update it with the created note's id

@@ -475,7 +482,7 @@ Wiki.prototype.parse_loaded_editor = function ( result, note_title, revision, li
     link.href = "/notebooks/" + this.notebook_id + "?note_id=" + id;
   }
 
-Wiki.prototype.create_editor = function ( id, note_text, deleted_from, revisions_list, note_title, read_write, highlight, focus ) {
+Wiki.prototype.create_editor = function ( id, note_text, deleted_from_id, revision, read_write, highlight, focus ) {
   var self = this;
   if ( isUndefinedOrNull( id ) ) {
     if ( this.read_write ) {

@@ -489,13 +496,13 @@ Wiki.prototype.create_editor = function ( id, note_text, deleted_from, revisions
   }
 
   // for read-only notes within read-write notebooks, tack the revision timestamp onto the start of the note text
-  if ( !read_write && this.read_write && revisions_list && revisions_list.length ) {
-    var short_revision = this.brief_revision( revisions_list[ revisions_list.length - 1 ] );
+  if ( !read_write && this.read_write && revision ) {
+    var short_revision = this.brief_revision( revision );
     note_text = "<p>Previous revision from " + short_revision + "</p>" + note_text;
   }
 
   var startup = this.startup_notes[ id ];
-  var editor = new Editor( id, this.notebook_id, note_text, deleted_from, revisions_list, read_write, startup, highlight, focus );
+  var editor = new Editor( id, this.notebook_id, note_text, deleted_from_id, revision, read_write, startup, highlight, focus );
 
   if ( this.read_write ) {
     connect( editor, "state_changed", this, "editor_state_changed" );
@ -613,7 +620,7 @@ Wiki.prototype.editor_key_pressed = function ( editor, event ) {
|
||||||
this.create_blank_editor( event );
|
this.create_blank_editor( event );
|
||||||
// ctrl-h: hide note
|
// ctrl-h: hide note
|
||||||
} else if ( code == 72 ) {
|
} else if ( code == 72 ) {
|
||||||
if ( !editor.deleted_from )
|
if ( !editor.deleted_from_id )
|
||||||
this.hide_editor( event );
|
this.hide_editor( event );
|
||||||
// ctrl-d: delete note
|
// ctrl-d: delete note
|
||||||
} else if ( code == 68 ) {
|
} else if ( code == 68 ) {
|
||||||
@@ -727,7 +734,7 @@ Wiki.prototype.delete_editor = function ( event, editor ) {
   if ( editor == this.focused_editor )
     this.focused_editor = null;

-  if ( this.notebook.trash && !editor.empty() ) {
+  if ( this.notebook.trash_id && !editor.empty() ) {
     var undo_button = createDOM( "input", {
       "type": "button",
       "class": "message_button",
@@ -735,7 +742,7 @@ Wiki.prototype.delete_editor = function ( event, editor ) {
       "title": "undo deletion"
     } );
     var trash_link = createDOM( "a", {
-      "href": "/notebooks/" + this.notebook.trash.object_id + "?parent_id=" + this.notebook.object_id
+      "href": "/notebooks/" + this.notebook.trash_id + "?parent_id=" + this.notebook.object_id
     }, "trash" );
     this.display_message( 'The note has been moved to the', [ trash_link, ". ", undo_button ] )
     var self = this;
@@ -767,7 +774,7 @@ Wiki.prototype.undelete_editor_via_trash = function ( event, editor ) {
   if ( this.read_write && editor.read_write ) {
     var self = this;
     this.invoker.invoke( "/notebooks/undelete_note", "POST", {
-      "notebook_id": editor.deleted_from,
+      "notebook_id": editor.deleted_from_id,
       "note_id": editor.id
     }, function ( result ) { self.display_storage_usage( result.storage_bytes ); } );
   }
@@ -816,13 +823,12 @@ Wiki.prototype.save_editor = function ( editor, fire_and_forget ) {

   var self = this;
   if ( editor && editor.read_write && !editor.empty() ) {
-    var revisions = editor.revisions_list;
     this.invoker.invoke( "/notebooks/save_note", "POST", {
       "notebook_id": this.notebook_id,
       "note_id": editor.id,
       "contents": editor.contents(),
       "startup": editor.startup,
-      "previous_revision": revisions.length ? revisions[ revisions.length - 1 ] : "None"
+      "previous_revision": editor.revision ? editor.revision : "None"
     }, function ( result ) {
       self.update_editor_revisions( result, editor );
       self.display_storage_usage( result.storage_bytes );
@@ -835,8 +841,8 @@ Wiki.prototype.update_editor_revisions = function ( result, editor ) {
   if ( !result.new_revision )
     return;

-  var revisions = editor.revisions_list;
-  var client_previous_revision = revisions.length ? revisions[ revisions.length - 1 ] : null;
+  var client_previous_revision = editor.revision;
+  editor.revision = result.new_revision;

   // if the server's idea of the previous revision doesn't match the client's, then someone has
   // gone behind our back and saved the editor's note from another window
@@ -854,11 +860,15 @@ Wiki.prototype.update_editor_revisions = function ( result, editor ) {
       self.compare_versions( event, editor, result.previous_revision );
     } );

-    revisions.push( result.previous_revision );
+    if ( !editor.revisions_list || editor.revisions_list.length == 0 )
+      return;
+    editor.revisions_list.push( result.previous_revision );
   }

   // add the new revision to the editor's revisions list
-  revisions.push( result.new_revision );
+  if ( !editor.revisions_list || editor.revisions_list.length == 0 )
+    return;
+  editor.revisions_list.push( result.new_revision );
 }

 Wiki.prototype.search = function ( event ) {
@@ -898,7 +908,7 @@ Wiki.prototype.display_search_results = function ( result ) {
     }

     // otherwise, create an editor for the one note
-    this.create_editor( note.object_id, note.contents, note.deleted_from, note.revisions_list, undefined, this.read_write, true, true );
+    this.create_editor( note.object_id, note.contents, note.deleted_from_id, note.revision, this.read_write, true, true );
     return;
   }

@@ -936,7 +946,7 @@ Wiki.prototype.display_search_results = function ( result ) {
     );
   }

-  this.search_results_editor = this.create_editor( "search_results", "<h3>search results</h3>" + list.innerHTML, undefined, undefined, undefined, false, true, true );
+  this.search_results_editor = this.create_editor( "search_results", "<h3>search results</h3>" + list.innerHTML, undefined, undefined, false, true, true );
 }

 Wiki.prototype.display_all_notes_list = function ( result ) {
@@ -954,6 +964,8 @@ Wiki.prototype.display_all_notes_list = function ( result ) {
     var note_tuple = result.notes[ i ]
     var note_id = note_tuple[ 0 ];
     var note_title = note_tuple[ 1 ];
+    if ( !note_title )
+      note_title = "untitled note";

     appendChildNodes( list,
       createDOM( "li", {},
@@ -962,7 +974,7 @@ Wiki.prototype.display_all_notes_list = function ( result ) {
     );
   }

-  this.all_notes_editor = this.create_editor( "all_notes", "<h3>all notes</h3>" + list.innerHTML, undefined, undefined, undefined, false, true, true );
+  this.all_notes_editor = this.create_editor( "all_notes", "<h3>all notes</h3>" + list.innerHTML, undefined, undefined, false, true, true );
 }

 Wiki.prototype.display_message = function ( text, nodes ) {
@@ -1099,8 +1111,22 @@ Wiki.prototype.display_empty_message = function () {
   this.display_message( "The trash is empty." )
 }

+DATE_PATTERN = /(\d\d\d\d)-(\d\d)-(\d\d) (\d\d):(\d\d):(\d\d).(\d+)[+-](\d\d:?\d\d)/;
+
 Wiki.prototype.brief_revision = function ( revision ) {
-  return revision.split( /\.\d/ )[ 0 ]; // strip off seconds from the timestamp
+  var matches = DATE_PATTERN.exec( revision );
+
+  return new Date( Date.UTC(
+    matches[ 1 ],          // year
+    matches[ 2 ] - 1,      // month (zero-based)
+    matches[ 3 ],          // day
+    matches[ 4 ],          // hour
+    matches[ 5 ],          // minute
+    matches[ 6 ],          // second
+    matches[ 7 ] * 0.001   // milliseconds
+  ) ).toLocaleString();
+
+  // return revision.split( /\.\d/ )[ 0 ]; // strip off seconds from the timestamp
 }

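The rewritten brief_revision() above relies on DATE_PATTERN to pull apart a PostgreSQL-style timestamp before handing the fields to Date.UTC(). As a cross-check of what that regex captures, here is a small Python sketch of the same pattern (the sample timestamp is invented; note the JavaScript version leaves the `.` before the fractional seconds unescaped, which this sketch tightens to a literal dot):

```python
import re

# Same capture groups as the JavaScript DATE_PATTERN above: date, time,
# fractional seconds, and a numeric UTC offset like -07:00 or +0000.
DATE_PATTERN = re.compile(
    r"(\d\d\d\d)-(\d\d)-(\d\d) (\d\d):(\d\d):(\d\d)\.(\d+)[+-](\d\d:?\d\d)"
)

def parse_revision( revision ):
    """Extract ( year, month, day, hour, minute, second ) from a revision timestamp."""
    matches = DATE_PATTERN.match( revision )
    if matches is None:
        return None
    return tuple( int( matches.group( i ) ) for i in range( 1, 7 ) )

print( parse_revision( "2007-10-09 00:14:09.123456-07:00" ) )
# (2007, 10, 9, 0, 14, 9)
```

The JavaScript code then feeds these six fields (month shifted to zero-based) plus the scaled fraction into Date.UTC() and formats the result with toLocaleString().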
 Wiki.prototype.toggle_editor_changes = function ( event, editor ) {
@@ -1112,8 +1138,26 @@ Wiki.prototype.toggle_editor_changes = function ( event, editor ) {
     return;
   }

-  new Changes_pulldown( this, this.notebook_id, this.invoker, editor );
   event.stop();
+
+  // if there's already a cached revision list, display the changes pulldown with it
+  if ( editor.revisions_list.length > 0 ) {
+    new Changes_pulldown( this, this.notebook_id, this.invoker, editor );
+    return;
+  }
+
+  // otherwise, load the revision list for this note from the server
+  var self = this;
+  this.invoker.invoke(
+    "/notebooks/load_note_revisions", "GET", {
+      "notebook_id": this.notebook_id,
+      "note_id": editor.id
+    },
+    function ( result ) {
+      editor.revisions_list = result.revisions;
+      new Changes_pulldown( self, self.notebook_id, self.invoker, editor );
+    }
+  );
 }

 Wiki.prototype.toggle_editor_options = function ( event, editor ) {
@@ -1236,13 +1280,13 @@ function Changes_pulldown( wiki, notebook_id, invoker, editor ) {
   this.editor = editor;
   this.links = new Array();

-  // display list of revision timestamps in reverse chronological order
-  if ( isUndefinedOrNull( this.editor.revisions_list ) || this.editor.revisions_list.length == 0 ) {
+  if ( !editor.revisions_list || editor.revisions_list.length == 0 ) {
     appendChildNodes( this.div, createDOM( "span", "This note has no previous changes." ) );
     return;
   }

-  var revisions_list = clone( this.editor.revisions_list );
+  // display list of revision timestamps in reverse chronological order
+  var revisions_list = clone( editor.revisions_list );
   revisions_list.reverse();

   var self = this;
@@ -3,14 +3,14 @@ function setUpPage() {
   notebook_id = "fake_notebook_id";
   title = "the title"
   note_text = "<h3>" + title + "</h3>blah";
-  deleted_from = undefined;
+  deleted_from_id = undefined;
   revisions_list = undefined;
   read_write = true;
   startup = false;
   highlight = false;
   editor_focus = false;

-  editor = new Editor( id, notebook_id, note_text, deleted_from, revisions_list, read_write, startup, highlight, editor_focus );
+  editor = new Editor( id, notebook_id, note_text, deleted_from_id, revisions_list, read_write, startup, highlight, editor_focus );

   init_complete = false;
   connect( editor, "init_complete", function () { init_complete = true; } );
@@ -1,8 +1,8 @@
-function Editor( id, notebook_id, note_text, deleted_from, revisions_list, read_write, startup, highlight, focus ) {
+function Editor( id, notebook_id, note_text, deleted_from_id, revisions_list, read_write, startup, highlight, focus ) {
   this.id = id;
   this.notebook_id = notebook_id;
   this.initial_text = note_text;
-  this.deleted_from = deleted_from || null;
+  this.deleted_from_id = deleted_from_id || null;
   this.revisions_list = revisions_list || new Array();
   this.read_write = read_write;
   this.startup = startup || false; // whether this Editor is for a startup note
@@ -15,11 +15,10 @@ function test_Editor() {
   assertNotUndefined( "editor should have changes_button member", editor.changes_button );
   assertNotUndefined( "editor should have options_button member", editor.options_button );
   assertFalse( "editor should not have closed flag set", editor.closed );
-  assertEquals( "editor should have correct deleted_from flag", editor.deleted_from, deleted_from || null );
+  assertEquals( "editor should have correct deleted_from_id flag", editor.deleted_from_id, deleted_from_id || null );
   assertNotUndefined( "editor should have document member", editor.document );
   assertEquals( "editor id should have correct id", editor.id, id );
   assertNotUndefined( "editor should have iframe member", editor.iframe );
-  assertEquals( "editor should have empty revisions list", editor.revisions_list.length, 0 );
   assertEquals( "editor should have correct startup flag", editor.startup, startup );
   assertEquals( "editor should have correct title", editor.title, title );
   assertEquals( "editor should have correct read_write flag", editor.read_write, read_write );
@@ -101,6 +101,7 @@ function test_Wiki() {
   <input type="hidden" name="note_id" id="note_id" value="" />
   <input type="hidden" name="parent_id" id="parent_id" value="" />
   <input type="hidden" name="revision" id="revision" value="" />
+  <input type="hidden" name="note_contents" id="note_contents" value="" />

 </body>
 </html>
@@ -3,21 +3,31 @@
 import os
 import os.path
 import psycopg2 as psycopg
-from controller.Database import Database
+from pytz import timezone, utc
+from datetime import datetime
+from controller.Old_database import Old_database
 from controller.Scheduler import Scheduler


+pacific = timezone( "US/Pacific" )
+
+
 def quote( value ):
   if value is None:
     return "null"

+  # if this is a datetime, assume it's in the Pacific timezone, and then convert it to UTC
+  if isinstance( value, datetime ):
+    value = value.replace( tzinfo = pacific ).astimezone( utc )
+
   value = unicode( value )

   return "'%s'" % value.replace( "'", "''" ).replace( "\\", "\\\\" )


 class Converter( object ):
   """
-  Converts a Luminotes database from bsddb to PostgreSQL, using the old bsddb controller.Database.
+  Converts a Luminotes database from bsddb to PostgreSQL, using the old bsddb controller.Old_database.
   This assumes that the PostgreSQL schema from model/schema.sql is already in the database.
   """
   def __init__( self, scheduler, database ):
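The quote() helper above can be exercised outside the converter. This standalone sketch uses only the standard library: a fixed UTC-8 offset stands in for pytz's "US/Pacific" zone (so DST is ignored), and str() replaces Python 2's unicode():

```python
from datetime import datetime, timedelta, timezone

# Fixed -8:00 offset standing in for pytz's "US/Pacific" (ignores DST).
pacific = timezone( timedelta( hours = -8 ) )

def quote( value ):
    """SQL-quote a value the way the converter's quote() does."""
    if value is None:
        return "null"

    # if this is a datetime, assume it's Pacific time and convert it to UTC
    if isinstance( value, datetime ):
        value = value.replace( tzinfo = pacific ).astimezone( timezone.utc )

    value = str( value )
    # double single quotes, then double backslashes, as in the converter
    return "'%s'" % value.replace( "'", "''" ).replace( "\\", "\\\\" )

print( quote( None ) )       # null
print( quote( "O'Brien" ) )  # 'O''Brien'
print( quote( datetime( 2007, 10, 9, 0, 48, 49 ) ) )  # '2007-10-09 08:48:49+00:00'
```

Note that the naive timestamps stored by the old bsddb database only become unambiguous once a zone is pinned on them, which is exactly what the replace()/astimezone() pair does.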
@@ -36,8 +46,8 @@ class Converter( object ):
     notes = {} # map of note object id to its notebook
     startup_notes = {} # map of startup note object id to its notebook

-    for key in self.database._Database__db.keys():
-      if not self.database._Database__db.get( key ):
+    for key in self.database._Old_database__db.keys():
+      if not self.database._Old_database__db.get( key ):
         continue

       self.database.load( key, self.scheduler.thread )
@@ -102,6 +112,14 @@ class Converter( object ):
           ( quote( value.object_id ), quote( notebook_id ),
             quote( read_only and "f" or "t" ) )
         )
+        if notebook.trash:
+          self.cursor.execute(
+            "insert into user_notebook " +
+            "( user_id, notebook_id, read_write ) " +
+            "values ( %s, %s, %s );" %
+            ( quote( value.object_id ), quote( notebook.trash.object_id ),
+              quote( read_only and "f" or "t" ) )
+          )
       elif class_name == "Read_only_notebook":
         pass
       elif class_name == "Password_reset":
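The user_notebook insert above builds its SQL by interpolating quote()d values into a string. Here is a minimal sketch of that statement construction with made-up ids (for new code, psycopg2's own parameter binding via cursor.execute( sql, params ) is generally safer than string interpolation, but a one-shot conversion script can get away with quoting by hand):

```python
def quote( value ):
    # minimal stand-in for the converter's quote() helper
    if value is None:
        return "null"
    return "'%s'" % str( value ).replace( "'", "''" ).replace( "\\", "\\\\" )

def user_notebook_insert( user_id, notebook_id, read_write ):
    """Build the same user_notebook insert statement the converter issues."""
    return (
        "insert into user_notebook " +
        "( user_id, notebook_id, read_write ) " +
        "values ( %s, %s, %s );" % ( quote( user_id ), quote( notebook_id ),
                                     quote( read_write and "t" or "f" ) )
    )

print( user_notebook_insert( "u1", "trash1", False ) )
# insert into user_notebook ( user_id, notebook_id, read_write ) values ( 'u1', 'trash1', 'f' );
```

The `%` operator binds tighter than `+`, so only the final "values ..." piece is formatted, which is why the placeholders all live in that last string.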
@@ -140,7 +158,7 @@ class Converter( object ):

 def main():
   scheduler = Scheduler()
-  database = Database( scheduler, "data.db" )
+  database = Old_database( scheduler, "data.db" )
   initializer = Converter( scheduler, database )
   scheduler.wait_until_idle()
@@ -1,53 +0,0 @@
-#!/usr/bin/python2.5
-
-import os
-import os.path
-from config.Common import settings
-from controller.Database import Database
-from controller.Scheduler import Scheduler
-from model.Note import Note
-from tools.initdb import fix_note_contents
-
-
-class Deleter( object ):
-  HTML_PATH = u"static/html"
-
-  def __init__( self, scheduler, database ):
-    self.scheduler = scheduler
-    self.database = database
-
-    threads = (
-      self.delete_note(),
-    )
-
-    for thread in threads:
-      self.scheduler.add( thread )
-      self.scheduler.wait_for( thread )
-
-  def delete_note( self ):
-    self.database.load( u"User anonymous", self.scheduler.thread )
-    anonymous = ( yield Scheduler.SLEEP )
-    read_only_main_notebook = anonymous.notebooks[ 0 ]
-    main_notebook = anonymous.notebooks[ 0 ]._Read_only_notebook__wrapped
-    startup_notes = []
-
-    for note in main_notebook.notes:
-      if note and note.title == "try it out": # FIXME: make the note title to delete not hard-coded
-        print "deleting note %s: %s" % ( note.object_id, note.title )
-        main_notebook.remove_note( note )
-
-    self.database.save( main_notebook )
-
-
-def main( args ):
-  print "IMPORTANT: Stop the Luminotes server before running this program."
-
-  scheduler = Scheduler()
-  database = Database( scheduler, "data.db" )
-  initializer = Deleter( scheduler, database )
-  scheduler.wait_until_idle()
-
-
-if __name__ == "__main__":
-  import sys
-  main( sys.argv[ 1: ] )
@@ -1,35 +0,0 @@
-#!/usr/bin/python2.5
-
-import os
-import os.path
-from controller.Database import Database
-from controller.Scheduler import Scheduler
-
-
-class Dumper( object ):
-  def __init__( self, scheduler, database ):
-    self.scheduler = scheduler
-    self.database = database
-
-    thread = self.dump_database()
-    self.scheduler.add( thread )
-    self.scheduler.wait_for( thread )
-
-  def dump_database( self ):
-    for key in self.database._Database__db.keys():
-      self.database.load( key, self.scheduler.thread )
-      value = ( yield Scheduler.SLEEP )
-      print "%s: %s" % ( key, value )
-
-    yield None
-
-
-def main():
-  scheduler = Scheduler()
-  database = Database( scheduler, "data.db" )
-  initializer = Dumper( scheduler, database )
-  scheduler.wait_until_idle()
-
-
-if __name__ == "__main__":
-  main()
@@ -1,14 +1,12 @@
-#!/usr/bin/python2.5
+#!/usr/bin/python2.4

 import os
 import os.path
+import sys
 from controller.Database import Database
-from controller.Scheduler import Scheduler
-from model.Notebook import Notebook
-from model.Read_only_notebook import Read_only_notebook
-from model.Note import Note
-from model.User import User
-from model.User_list import User_list
+from new_model.Notebook import Notebook
+from new_model.Note import Note
+from new_model.User import User


 class Initializer( object ):
@@ -27,78 +25,68 @@ class Initializer( object ):
     ( u"advanced browser features.html", False ),
   ]

-  def __init__( self, scheduler, database ):
-    self.scheduler = scheduler
+  def __init__( self, database, nuke = False ):
     self.database = database
     self.main_notebook = None
-    self.read_only_main_notebook = None
     self.anonymous = None

-    threads = (
-      self.create_main_notebook(),
-      self.create_anonymous_user(),
-    )
+    if nuke is True:
+      self.database.execute( file( "new_model/drop.sql" ).read(), commit = False )

-    for thread in threads:
-      self.scheduler.add( thread )
-      self.scheduler.wait_for( thread )
+    self.database.execute( file( "new_model/schema.sql" ).read(), commit = False )
+    self.create_main_notebook()
+    self.create_anonymous_user()
+    self.database.commit()

   def create_main_notebook( self ):
-    # create the main notebook and all of its notes
-    self.database.next_id( self.scheduler.thread )
-    main_notebook_id = ( yield Scheduler.SLEEP )
-    self.main_notebook = Notebook( main_notebook_id, u"Luminotes" )
+    # create the main notebook
+    main_notebook_id = self.database.next_id( Notebook )
+    self.main_notebook = Notebook.create( main_notebook_id, u"Luminotes" )
+    self.database.save( self.main_notebook, commit = False )

-    # create the read-only view of the main notebook
-    self.database.next_id( self.scheduler.thread )
-    read_only_main_notebook_id = ( yield Scheduler.SLEEP )
-    self.read_only_main_notebook = Read_only_notebook( read_only_main_notebook_id, self.main_notebook )
-
     # create an id for each note
     note_ids = {}
     for ( filename, startup ) in self.NOTE_FILES:
-      self.database.next_id( self.scheduler.thread )
-      note_ids[ filename ] = ( yield Scheduler.SLEEP )
+      note_ids[ filename ] = self.database.next_id( Note )

+    rank = 0
     for ( filename, startup ) in self.NOTE_FILES:
       full_filename = os.path.join( self.HTML_PATH, filename )
-      contents = fix_note_contents( file( full_filename ).read(), read_only_main_notebook_id, note_ids )
+      contents = fix_note_contents( file( full_filename ).read(), main_notebook_id, note_ids )

-      note = Note( note_ids[ filename ], contents )
-      self.main_notebook.add_note( note )
-
       if startup:
-        self.main_notebook.add_startup_note( note )
+        rank += 1

-      self.database.save( self.main_notebook )
-      self.database.save( self.read_only_main_notebook )
+      note = Note.create( note_ids[ filename ], contents, notebook_id = self.main_notebook.object_id, startup = startup, rank = startup and rank or None )
+      self.database.save( note, commit = False )

   def create_anonymous_user( self ):
     # create the anonymous user
-    self.database.next_id( self.scheduler.thread )
-    anonymous_user_id = ( yield Scheduler.SLEEP )
-    notebooks = [ self.read_only_main_notebook ]
-    self.anonymous = User( anonymous_user_id, u"anonymous", None, None, notebooks )
-    self.database.save( self.anonymous )
+    anonymous_user_id = self.database.next_id( User )
+    self.anonymous = User.create( anonymous_user_id, u"anonymous", None, None )
+    self.database.save( self.anonymous, commit = False )

-    # create a user list
-    self.database.next_id( self.scheduler.thread )
-    user_list_id = ( yield Scheduler.SLEEP )
-    user_list = User_list( user_list_id, u"all" )
-    user_list.add_user( self.anonymous )
-    self.database.save( user_list )
+    # give the anonymous user read-only access to the main notebook
+    self.database.execute( self.anonymous.sql_save_notebook( self.main_notebook.object_id, read_write = False ), commit = False )


-def main():
-  print "IMPORTANT: Stop the Luminotes server before running this program."
+def main( args = None ):
+  nuke = False

-  if os.path.exists( "data.db" ):
-    os.remove( "data.db" )
+  if args and ( "-n" in args or "--nuke" in args ):
+    nuke = True
+    print "This will nuke the contents of the database before initializing it with default data. Continue (y/n)? ",
+    confirmation = sys.stdin.readline().strip()
+    print

-  scheduler = Scheduler()
-  database = Database( scheduler, "data.db" )
-  initializer = Initializer( scheduler, database )
-  scheduler.wait_until_idle()
+    if confirmation.lower()[ 0 ] != 'y':
+      print "Exiting without touching the database."
+      return
+
+  print "Initializing the database with default data."
+  database = Database()
+  initializer = Initializer( database, nuke )


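main() above gates the destructive drop.sql step behind a -n/--nuke flag plus a y/n confirmation prompt. A hedged Python 3 sketch of that decision logic, factored into a testable helper (the function name, the injected read_line callable, and the ( nuke, proceed ) return shape are inventions for illustration, not part of initdb.py):

```python
def should_nuke( args, read_line ):
    """Mirror main()'s flag handling: decide whether to drop the schema first.

    args: command-line arguments; read_line: callable returning one line of input.
    Returns ( nuke, proceed ).
    """
    if not args or ( "-n" not in args and "--nuke" not in args ):
        return ( False, True )  # no flag: initialize without dropping anything

    print( "This will nuke the contents of the database before initializing it "
           "with default data. Continue (y/n)? ", end = "" )
    confirmation = read_line().strip()

    if not confirmation or confirmation.lower()[ 0 ] != "y":
        return ( True, False )  # flag given but user declined: exit untouched

    return ( True, True )

print( should_nuke( [], None ) )                       # (False, True)
print( should_nuke( [ "--nuke" ], lambda: "yes\n" ) )  # (True, True)
print( should_nuke( [ "-n" ], lambda: "no\n" ) )       # (True, False)
```

Injecting the input reader keeps the prompt logic unit-testable without touching sys.stdin, which is the main thing the inline version in main() gives up.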
 def fix_note_contents( contents, notebook_id, note_ids ):
@@ -134,4 +122,5 @@ def fix_note_contents( contents, notebook_id, note_ids ):


 if __name__ == "__main__":
-  main()
+  import sys
+  main( sys.argv[ 1: ] )
@@ -1,47 +0,0 @@
-#!/usr/bin/python2.5
-
-import os
-import os.path
-import sys
-from config.Common import settings
-from controller.Database import Database
-from controller.Scheduler import Scheduler
-
-
-class Lister( object ):
-  def __init__( self, scheduler, database, username ):
-    self.scheduler = scheduler
-    self.database = database
-    self.username = username
-
-    threads = (
-      self.list_user(),
-    )
-
-    for thread in threads:
-      self.scheduler.add( thread )
-      self.scheduler.wait_for( thread )
-
-  def list_user( self ):
-    self.database.load( u"User %s" % self.username, self.scheduler.thread )
-    user = ( yield Scheduler.SLEEP )
-    if user is None:
-      print "user %s is unknown" % self.username
-    else:
-      print "user %s: %s" % ( self.username, user )
-
-
-def main( program_name, args ):
-  if len( args ) == 0:
-    print "usage: %s username" % program_name
-    sys.exit( 1 )
-
-  scheduler = Scheduler()
-  database = Database( scheduler, "data.db" )
-  initializer = Lister( scheduler, database, args[ 0 ] )
-  scheduler.wait_until_idle()
-
-
-if __name__ == "__main__":
-  import sys
-  main( sys.argv[ 0 ], sys.argv[ 1: ] )
@@ -1,11 +1,7 @@
 #!/bin/sh

-# Run this from the root directory of Luminotes's source. Note: This will nuke your database!
+# Run this from the root directory of Luminotes's source.

-ORIG_PYTHONPATH="$PYTHONPATH"
-export PYTHONPATH=.
-python2.5 tools/initdb.py
-export PYTHONPATH="$ORIG_PYTHONPATH"
 cd ..
 rm -f luminotes.tar.gz
 tar cvfz luminotes.tar.gz --exclude=session --exclude="*.log" --exclude="*.pyc" --exclude=".*" luminotes
@ -1,36 +0,0 @@
|
||||||
#!/usr/bin/python2.5
|
|
||||||
|
|
||||||
import os
|
|
||||||
import os.path
|
|
||||||
from controller.Database import Database
|
|
||||||
from controller.Scheduler import Scheduler
|
|
||||||
|
|
||||||
|
|
||||||
class Reloader( object ):
|
|
||||||
def __init__( self, scheduler, database ):
|
|
||||||
self.scheduler = scheduler
|
|
||||||
self.database = database
|
|
||||||
|
|
||||||
thread = self.reload_database()
|
|
||||||
self.scheduler.add( thread )
|
|
||||||
self.scheduler.wait_for( thread )
|
|
||||||
|
|
||||||
def reload_database( self ):
|
|
||||||
for key in self.database._Database__db.keys():
|
|
||||||
self.database.reload( key, self.scheduler.thread )
|
|
||||||
yield Scheduler.SLEEP
|
|
||||||
|
|
||||||
yield None
|
|
||||||
|
|
||||||
|
|
||||||
def main():
|
|
||||||
print "IMPORTANT: Stop the Luminotes server before running this program."
|
|
||||||
|
|
||||||
scheduler = Scheduler()
|
|
||||||
database = Database( scheduler, "data.db" )
|
|
||||||
initializer = Reloader( scheduler, database )
|
|
||||||
scheduler.wait_until_idle()
|
|
||||||
|
|
||||||
|
|
||||||
if __name__ == "__main__":
|
|
||||||
main()
|
|
|
@@ -1,61 +0,0 @@
-#!/usr/bin/python2.5
-
-import os
-import os.path
-import sys
-from config.Common import settings
-from controller.Database import Database
-from controller.Scheduler import Scheduler
-
-
-class Resetter( object ):
-  def __init__( self, scheduler, database, username ):
-    self.scheduler = scheduler
-    self.database = database
-    self.username = username
-    self.password = None
-
-    self.prompt_for_password()
-
-    threads = (
-      self.reset_password(),
-    )
-
-    for thread in threads:
-      self.scheduler.add( thread )
-      self.scheduler.wait_for( thread )
-
-  def prompt_for_password( self ):
-    print "enter new password for user %s: " % self.username,
-    sys.stdout.flush()
-    self.password = sys.stdin.readline().strip()
-    print
-
-  def reset_password( self ):
-    self.database.load( u"User %s" % self.username, self.scheduler.thread )
-    user = ( yield Scheduler.SLEEP )
-    if user is None:
-      raise Exception( "user %s is unknown" % self.username )
-
-
-    user.password = self.password
-    self.database.save( user )
-    print "password reset"
-
-
-def main( program_name, args ):
-  print "IMPORTANT: Stop the Luminotes server before running this program."
-
-  if len( args ) == 0:
-    print "usage: %s username" % program_name
-    sys.exit( 1 )
-
-  scheduler = Scheduler()
-  database = Database( scheduler, "data.db" )
-  initializer = Resetter( scheduler, database, args[ 0 ] )
-  scheduler.wait_until_idle()
-
-
-if __name__ == "__main__":
-  import sys
-  main( sys.argv[ 0 ], sys.argv[ 1: ] )
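The tools removed in this commit all follow the same Scheduler-driven coroutine pattern: each worker is a generator that requests a database load, then yields `Scheduler.SLEEP` until the scheduler wakes it with the result. A minimal self-contained sketch of that control flow, written in modern Python — the `Scheduler` and database stub here are illustrative stand-ins, not the actual Luminotes classes:

```python
# Sketch of the generator-based scheduler pattern these tools used.
# Scheduler and FakeDatabase are hypothetical stand-ins for illustration.

class Scheduler:
    SLEEP = object()  # sentinel yielded while waiting for a result

    def __init__(self):
        self.results = {}

    def run(self, thread):
        # drive the generator, sending in each pending result as it arrives
        result = None
        try:
            while True:
                yielded = thread.send(result)
                result = None
                if yielded is Scheduler.SLEEP:
                    # the real scheduler would run other threads here until a
                    # database callback supplied a value for this one
                    result = self.results.pop("pending", None)
        except StopIteration:
            pass


def reset_password(scheduler, database, username, new_password):
    # mirrors Resetter.reset_password(): request a load, then sleep until
    # the scheduler wakes this generator with the loaded user
    database.load("User %s" % username)
    user = (yield Scheduler.SLEEP)
    if user is None:
        raise Exception("user %s is unknown" % username)
    user["password"] = new_password
    database.save(user)


class FakeDatabase:
    def __init__(self, scheduler, records):
        self.scheduler = scheduler
        self.records = records
        self.saved = []

    def load(self, key):
        # hand the result to the scheduler, which delivers it on the next SLEEP
        self.scheduler.results["pending"] = self.records.get(key)

    def save(self, record):
        self.saved.append(record)


scheduler = Scheduler()
database = FakeDatabase(scheduler, {"User alice": {"password": "old"}})
scheduler.run(reset_password(scheduler, database, "alice", "s3cret"))
print(database.saved)
```

This indirection is exactly what the postgres branch removes: with blocking database calls, `reset_password` collapses to three sequential statements.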
@@ -1,53 +0,0 @@
-#!/usr/bin/python2.5
-
-import os
-import os.path
-import sys
-from config.Common import settings
-from controller.Database import Database
-from controller.Scheduler import Scheduler
-
-
-class Setter( object ):
-  def __init__( self, scheduler, database, username, email_address ):
-    self.scheduler = scheduler
-    self.database = database
-    self.username = username
-    self.email_address = email_address
-    self.password = None
-
-    threads = (
-      self.set_email_address(),
-    )
-
-    for thread in threads:
-      self.scheduler.add( thread )
-      self.scheduler.wait_for( thread )
-
-  def set_email_address( self ):
-    self.database.load( u"User %s" % self.username, self.scheduler.thread )
-    user = ( yield Scheduler.SLEEP )
-    if user is None:
-      raise Exception( "user %s is unknown" % self.username )
-
-    user.email_address = self.email_address
-    self.database.save( user )
-    print "email set"
-
-
-def main( program_name, args ):
-  print "IMPORTANT: Stop the Luminotes server before running this program."
-
-  if len( args ) < 2:
-    print "usage: %s username emailaddress" % program_name
-    sys.exit( 1 )
-
-  scheduler = Scheduler()
-  database = Database( scheduler, "data.db" )
-  initializer = Setter( scheduler, database, *args )
-  scheduler.wait_until_idle()
-
-
-if __name__ == "__main__":
-  import sys
-  main( sys.argv[ 0 ], sys.argv[ 1: ] )
@@ -1,53 +0,0 @@
-#!/usr/bin/python2.5
-
-import os
-import os.path
-import sys
-from config.Common import settings
-from controller.Database import Database
-from controller.Scheduler import Scheduler
-
-
-class Setter( object ):
-  def __init__( self, scheduler, database, username, rate_plan ):
-    self.scheduler = scheduler
-    self.database = database
-    self.username = username
-    self.rate_plan = rate_plan
-    self.password = None
-
-    threads = (
-      self.set_rate_plan(),
-    )
-
-    for thread in threads:
-      self.scheduler.add( thread )
-      self.scheduler.wait_for( thread )
-
-  def set_rate_plan( self ):
-    self.database.load( u"User %s" % self.username, self.scheduler.thread )
-    user = ( yield Scheduler.SLEEP )
-    if user is None:
-      raise Exception( "user %s is unknown" % self.username )
-
-    user.rate_plan = int( self.rate_plan )
-    self.database.save( user )
-    print "rate plan set"
-
-
-def main( program_name, args ):
-  print "IMPORTANT: Stop the Luminotes server before running this program."
-
-  if len( args ) < 2:
-    print "usage: %s username rateplan" % program_name
-    sys.exit( 1 )
-
-  scheduler = Scheduler()
-  database = Database( scheduler, "data.db" )
-  initializer = Setter( scheduler, database, *args )
-  scheduler.wait_until_idle()
-
-
-if __name__ == "__main__":
-  import sys
-  main( sys.argv[ 0 ], sys.argv[ 1: ] )
@@ -1,16 +1,16 @@
-#!/usr/bin/python2.5
+#!/usr/bin/python2.4
 
 import os
 import os.path
 from config.Common import settings
 from controller.Database import Database
-from controller.Scheduler import Scheduler
-from model.Note import Note
-from model.User_list import User_list
+from new_model.Notebook import Notebook
+from new_model.Note import Note
+from new_model.User import User
 from tools.initdb import fix_note_contents
 
 
-class Initializer( object ):
+class Updater( object ):
   HTML_PATH = u"static/html"
   NOTE_FILES = [ # the second element of the tuple is whether to show the note on startup
     ( u"about.html", True ),
@@ -25,101 +25,58 @@ class Initializer( object ):
     ( u"advanced browser features.html", False ),
   ]
 
-  def __init__( self, scheduler, database, navigation_note_id = None ):
-    self.scheduler = scheduler
+  def __init__( self, database, navigation_note_id = None ):
     self.database = database
     self.navigation_note_id = navigation_note_id
 
-    threads = (
-      self.create_user_list(),
-      self.update_main_notebook(),
-    )
-
-    for thread in threads:
-      self.scheduler.add( thread )
-      self.scheduler.wait_for( thread )
-
-  def create_user_list( self ):
-    # if there's no user list, create one and populate it with all users in the database
-    self.database.load( u"User_list all", self.scheduler.thread )
-    user_list = ( yield Scheduler.SLEEP )
-    if user_list is not None:
-      return
-
-    self.database.next_id( self.scheduler.thread )
-    user_list_id = ( yield Scheduler.SLEEP )
-    user_list = User_list( user_list_id, u"all" )
-
-    for key in self.database._Database__db.keys():
-      if not key.startswith( "User " ): continue
-
-      self.database.load( key, self.scheduler.thread )
-      user = ( yield Scheduler.SLEEP )
-      if user:
-        user_list.add_user( user )
-
-    self.database.save( user_list )
+    self.update_main_notebook()
+    self.database.commit()
 
   def update_main_notebook( self ):
-    self.database.load( u"User anonymous", self.scheduler.thread )
-    anonymous = ( yield Scheduler.SLEEP )
-    read_only_main_notebook = anonymous.notebooks[ 0 ]
-    main_notebook = anonymous.notebooks[ 0 ]._Read_only_notebook__wrapped
-    startup_notes = []
+    anonymous = self.database.select_one( User, User.sql_load_by_username( u"anonymous" ) )
+    main_notebook = self.database.select_one( Notebook, anonymous.sql_load_notebooks() )
 
     # get the id for each note
     note_ids = {}
     for ( filename, startup ) in self.NOTE_FILES:
       title = filename.replace( u".html", u"" )
-      note = main_notebook.lookup_note_by_title( title )
+      note = self.database.select_one( Note, main_notebook.sql_load_note_by_title( title ) )
 
       if note is not None:
         note_ids[ filename ] = note.object_id
 
     # update the navigation note if its id was given
     if self.navigation_note_id:
-      self.database.next_id( self.scheduler.thread )
-      next_id = ( yield Scheduler.SLEEP )
-      note = main_notebook.lookup_note( self.navigation_note_id )
-      self.update_note( "navigation.html", True, main_notebook, read_only_main_notebook, startup_notes, next_id, note_ids, note )
+      note = self.database.load( Note, self.navigation_note_id )
+      self.update_note( "navigation.html", True, main_notebook, note_ids, note )
+      self.database.save( note, commit = False )
 
     # update all of the notes in the main notebook
     for ( filename, startup ) in self.NOTE_FILES:
-      self.database.next_id( self.scheduler.thread )
-      next_id = ( yield Scheduler.SLEEP )
       title = filename.replace( u".html", u"" )
-      note = main_notebook.lookup_note_by_title( title )
-      self.update_note( filename, startup, main_notebook, read_only_main_notebook, startup_notes, next_id, note_ids, note )
+      note = self.database.select_one( Note, main_notebook.sql_load_note_by_title( title ) )
+      self.update_note( filename, startup, main_notebook, note_ids, note )
 
-    for note in startup_notes:
-      main_notebook.add_startup_note( note )
-
-    main_notebook.name = u"Luminotes"
-    self.database.save( main_notebook )
+    if main_notebook.name != u"Luminotes":
+      main_notebook.name = u"Luminotes"
+      self.database.save( main_notebook, commit = False )
 
-  def update_note( self, filename, startup, main_notebook, read_only_main_notebook, startup_notes, next_id, note_ids, note = None ):
+  def update_note( self, filename, startup, main_notebook, note_ids, note = None ):
     full_filename = os.path.join( self.HTML_PATH, filename )
-    contents = fix_note_contents( file( full_filename ).read(), read_only_main_notebook.object_id, note_ids )
+    contents = fix_note_contents( file( full_filename ).read(), main_notebook.object_id, note_ids )
 
     if note:
-      main_notebook.update_note( note, contents )
+      note.contents = contents
     # if for some reason the note isn't present, create it
     else:
-      note = Note( next_id, contents )
-      main_notebook.add_note( note )
-
-      main_notebook.remove_startup_note( note )
-      if startup:
-        startup_notes.append( note )
+      next_id = self.database.next_id( Note )
+      note = Note.create( next_id, contents, notebook_id = main_notebook.object_id, startup = startup )
 
+    self.database.save( note, commit = False )
 
 def main( args ):
-  print "IMPORTANT: Stop the Luminotes server before running this program."
-
-  scheduler = Scheduler()
-  database = Database( scheduler, "data.db" )
-  initializer = Initializer( scheduler, database, args and args[ 0 ] or None )
-  scheduler.wait_until_idle()
+  database = Database()
+  initializer = Updater( database, args and args[ 0 ] or None )
 
 
 if __name__ == "__main__":
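The recurring shape on the new side of this diff is `select_one( Object_type, sql )`: run a per-model `sql_*` query and map the first row onto a model instance. Only that call shape comes from the diff; the implementation below is an assumed sketch, using sqlite3 as a stand-in for the psycopg2-backed Database, with a made-up schema and row mapping:

```python
import sqlite3

# Hypothetical sketch of a select_one()-style helper. The select_one( Object_type, sql )
# shape matches the diff; everything else here (schema, Note fields, the
# parameter-tuple convention for sql_* methods) is an illustrative assumption.

class Note:
    def __init__(self, object_id, title, contents):
        self.object_id = object_id
        self.title = title
        self.contents = contents

    @staticmethod
    def sql_load_by_title(title):
        # the real code builds SQL via per-model sql_* methods
        return ("select id, title, contents from note where title = ?", (title,))


class Database:
    def __init__(self, connection):
        self.connection = connection

    def select_one(self, Object_type, sql):
        # run the query and map the first row onto the model class
        statement, params = sql
        cursor = self.connection.cursor()
        cursor.execute(statement, params)
        row = cursor.fetchone()
        if row is None:
            return None
        return Object_type(*row)


connection = sqlite3.connect(":memory:")
connection.execute("create table note ( id text, title text, contents text )")
connection.execute("insert into note values ( 'n1', 'about', '<h3>about</h3>' )")

database = Database(connection)
note = database.select_one(Note, Note.sql_load_by_title("about"))
```

Returning `None` for an empty result set keeps the calling convention of the old `load()` API, which is why the updater's `if note is not None:` checks carry over unchanged.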
@@ -3,7 +3,7 @@
 import os
 import os.path
 import psycopg2 as psycopg
-from controller.Database import Database
+from controller.Old_database import Old_database
 from controller.Scheduler import Scheduler
 
 
@@ -34,8 +34,8 @@ class Verifier( object ):
   def verify_database( self ):
     inserts = set()
 
-    for key in self.database._Database__db.keys():
-      if not self.database._Database__db.get( key ):
+    for key in self.database._Old_database__db.keys():
+      if not self.database._Old_database__db.get( key ):
         continue
 
       self.database.load( key, self.scheduler.thread )
@@ -92,6 +92,16 @@ class Verifier( object ):
         assert row[ 1 ] == notebook.object_id
         assert row[ 2 ] == read_write
 
+      if notebook.trash:
+        self.cursor.execute(
+          "select * from user_notebook where user_id = %s and notebook_id = %s;" % ( quote( value.object_id ), quote( notebook.trash.object_id ) )
+        )
+
+        for row in self.cursor.fetchmany():
+          assert row[ 0 ] == value.object_id
+          assert row[ 1 ] == notebook.trash.object_id
+          assert row[ 2 ] == read_write
+
       self.verify_notebook( notebook )
 
     elif class_name == "Read_only_notebook":
@@ -105,9 +115,9 @@ class Verifier( object ):
       )
 
      for row in self.cursor.fetchmany():
-        assert row[ 0 ] == value.email_address
-        assert row[ 1 ] == False
-        assert row[ 2 ] == value.object_id
+        assert row[ 0 ] == value.object_id
+        assert row[ 1 ] == value.email_address
+        assert row[ 2 ] == False
     elif class_name == "User_list":
       pass
     else:
@@ -167,7 +177,7 @@ class Verifier( object ):
 
 def main():
   scheduler = Scheduler()
-  database = Database( scheduler, "data.db" )
+  database = Old_database( scheduler, "data.db" )
   initializer = Verifier( scheduler, database )
   scheduler.wait_until_idle()
 
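The verifier builds its `user_notebook` query by `%`-interpolating values through a `quote()` helper. The DB-API alternative is to pass query parameters and let the driver do the quoting. A small illustration of the same lookup, using sqlite3 so it runs standalone (table and column names follow the diff; the data is made up — psycopg2 uses `%s` placeholders where sqlite3 uses `?`, but the principle is identical):

```python
import sqlite3

# Driver-side parameter binding instead of string interpolation + quote().
# Schema mirrors the user_notebook table from the diff; rows are invented.
connection = sqlite3.connect(":memory:")
connection.execute("create table user_notebook ( user_id text, notebook_id text, read_write integer )")
connection.execute("insert into user_notebook values ( 'u1', 'trash1', 1 )")

cursor = connection.cursor()
user_id, trash_id = "u1", "trash1"

# with psycopg2 the same call would be:
#   cursor.execute( "select * from user_notebook where user_id = %s and notebook_id = %s;", ( user_id, trash_id ) )
cursor.execute(
    "select * from user_notebook where user_id = ? and notebook_id = ?;",
    (user_id, trash_id),
)

for row in cursor.fetchmany():
    assert row[0] == user_id
    assert row[1] == trash_id
```

Parameter binding also sidesteps the escaping bugs that manual quoting invites once object ids or note contents contain quote characters.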