Merge branch 'master' into DAL-modular

* master: (58 commits)
  changed version number
  better types by default, given that we're on 2005 at least
  fix for StorageList and tests added
  improved coverage, fix bug with IS_LIST_OF and items not being strings
  fix cache.increment, added tests
  R-2.9.11
  reverted simplejson
  R-2.9.10
  upgraded memcache and markdown2
  upgraded pypyodbc.py
  upgraded simplejson
  no more split in contains, thanks Niphlod
  fixed wording and bug on contains(), made smart_query use ilike instead of like
  ilike, thanks Niphlod
  CROSS JOIN, thanks jotbe
  added custom represent to GoogleDatastoreAdapter, thanks Alan
  postgresql: identifies what adapter auto-loads json values
  added more tests for json Field
  fixed typo in driver_auto_json
  Improve the graphing to show the name of the application.
  ...

Conflicts:
	gluon/dal.py
	gluon/globals.py
	gluon/tests/test_dal.py
gi0baro
2014-09-25 12:49:16 +02:00
55 changed files with 2932 additions and 1954 deletions
+6 -1
View File
@@ -1,10 +1,13 @@
## 2.9.6
## 2.9.6 - 2.9.10
- fixed support of GAE + SQL
- fixed a typo in the license of some login_methods code. It is now LGPL consistently with the rest of the web2py code. This change applied to all previous web2py versions.
- support for SAML2 (with pysaml2)
- Sphinx documentation (thanks Niphlod)
- improved scheduler (thanks Niphlod)
- increased security
- better cache.disk (thanks Leonel)
- sessions are stored in subfolders for speed
- postgres support for "INSERT ... RETURNING ..."
- ldap support for Certificate Authority (thanks Maggs and Shane)
- improved support for S/Mime X.509 (thanks Gyuris)
@@ -12,6 +15,8 @@
- support for Collection+JSON Hypermedia API (RESTful self documenting API)
- jQuery 1.11
- codemirror 4.0.3
- markdown2 2.2.3
- memcache 1.53
- support for the new janrain API
- new "web2py.py -G config" to make GAE configuration easier
- many small bug fixes
+10 -10
View File
@@ -30,20 +30,20 @@ update:
echo "remember that pymysql was tweaked"
src:
### Use semantic versioning
echo 'Version 2.9.6-stable+timestamp.'`date +%Y.%m.%d.%H.%M.%S` > VERSION
echo 'Version 2.10.0-beta+timestamp.'`date +%Y.%m.%d.%H.%M.%S` > VERSION
### rm -f all junk files
make clean
### clean up basic apps
rm -f routes.py
rm -f applications/*/sessions/*
rm -f applications/*/errors/* | echo 'too many files'
rm -f applications/*/cache/*
rm -f applications/admin/databases/*
rm -f applications/welcome/databases/*
rm -f applications/examples/databases/*
rm -f applications/admin/uploads/*
rm -f applications/welcome/uploads/*
rm -f applications/examples/uploads/*
rm -rf applications/*/sessions/*
rm -rf applications/*/errors/* | echo 'too many files'
rm -rf applications/*/cache/*
rm -rf applications/admin/databases/*
rm -rf applications/welcome/databases/*
rm -rf applications/examples/databases/*
rm -rf applications/admin/uploads/*
rm -rf applications/welcome/uploads/*
rm -rf applications/examples/uploads/*
### NO MORE make epydoc
# make epydoc
### make welcome layout and appadmin the default
+8 -3
View File
@@ -6,7 +6,6 @@ It is written and programmable in Python. LGPLv3 License
Learn more at http://web2py.com
## Google App Engine deployment
cp examples/app.yaml ./
@@ -14,6 +13,10 @@ Learn more at http://web2py.com
Then edit ./app.yaml and replace "yourappname" with yourappname.
## Documentation (readthedocs.org)
[![Docs Status](https://readthedocs.org/projects/web2py/badge/?version=latest)](http://web2py.rtfd.org/)
## Tests
[![Build Status](https://travis-ci.org/web2py/web2py.png)](https://travis-ci.org/web2py/web2py)
@@ -36,11 +39,10 @@ That's it!!!
VERSION > this web2py version
web2py.py > the startup script
anyserver.py > to run with third party servers
wsgihandler.py > handler to connect to WSGI
... > other handlers and example files
gluon/ > the core libraries
contrib/ > third party libraries
tests/ > unittests
tests/ > unittests
applications/ > are the apps
admin/ > web based IDE
...
@@ -64,6 +66,9 @@ That's it!!!
examples/ > example config files, mv .. and customize
extras/ > other files which are required for building web2py
scripts/ > utility and installation scripts
handlers/
wsgihandler.py > handler to connect to WSGI
... > handlers for Fast-CGI, SCGI, Gevent, etc
site-packages/ > additional optional modules
logs/ > log files will go in there
deposit/ > a place where web2py stores apps temporarily
+1 -1
View File
@@ -1 +1 @@
Version 2.9.6-stable+timestamp.2014.09.02.10.10.07
Version 2.10.0-beta+timestamp.2014.09.24.13.35.58
+1 -1
View File
@@ -582,7 +582,7 @@ def bg_graph_model():
if hasattr(db[tablename],'_meta_graphmodel'):
meta_graphmodel = db[tablename]._meta_graphmodel
else:
meta_graphmodel = dict(group='Undefined', color='#ECECEC')
meta_graphmodel = dict(group=request.application, color='#ECECEC')
group = meta_graphmodel['group'].replace(' ', '')
if not subgraphs.has_key(group):
File diff suppressed because one or more lines are too long
+5 -2
View File
@@ -545,8 +545,11 @@
};
$('[data-show-trigger]', target).each(function () {
var name = $(this).attr('data-show-trigger');
if(!triggers[name]) triggers[name] = [];
triggers[name].push($(this).attr('id'));
// The field exists only when creating/editing a row
if ($('#' + name).length) {
if(!triggers[name]) triggers[name] = [];
triggers[name].push($(this).attr('id'));
}
});
for(var name in triggers) {
$('#' + name, target).change(show_if).keyup(show_if);
@@ -582,7 +582,7 @@ def bg_graph_model():
if hasattr(db[tablename],'_meta_graphmodel'):
meta_graphmodel = db[tablename]._meta_graphmodel
else:
meta_graphmodel = dict(group='Undefined', color='#ECECEC')
meta_graphmodel = dict(group=request.application, color='#ECECEC')
group = meta_graphmodel['group'].replace(' ', '')
if not subgraphs.has_key(group):
File diff suppressed because one or more lines are too long
+5 -2
View File
@@ -545,8 +545,11 @@
};
$('[data-show-trigger]', target).each(function () {
var name = $(this).attr('data-show-trigger');
if(!triggers[name]) triggers[name] = [];
triggers[name].push($(this).attr('id'));
// The field exists only when creating/editing a row
if ($('#' + name).length) {
if(!triggers[name]) triggers[name] = [];
triggers[name].push($(this).attr('id'));
}
});
for(var name in triggers) {
$('#' + name, target).change(show_if).keyup(show_if);
+1 -1
View File
@@ -582,7 +582,7 @@ def bg_graph_model():
if hasattr(db[tablename],'_meta_graphmodel'):
meta_graphmodel = db[tablename]._meta_graphmodel
else:
meta_graphmodel = dict(group='Undefined', color='#ECECEC')
meta_graphmodel = dict(group=request.application, color='#ECECEC')
group = meta_graphmodel['group'].replace(' ', '')
if not subgraphs.has_key(group):
File diff suppressed because one or more lines are too long
+5 -2
View File
@@ -545,8 +545,11 @@
};
$('[data-show-trigger]', target).each(function () {
var name = $(this).attr('data-show-trigger');
if(!triggers[name]) triggers[name] = [];
triggers[name].push($(this).attr('id'));
// The field exists only when creating/editing a row
if ($('#' + name).length) {
if(!triggers[name]) triggers[name] = [];
triggers[name].push($(this).attr('id'));
}
});
for(var name in triggers) {
$('#' + name, target).change(show_if).keyup(show_if);
+660 -628
View File
File diff suppressed because it is too large
+2 -1
View File
@@ -39,6 +39,7 @@ import marshal
import shutil
import imp
import logging
import types
logger = logging.getLogger("web2py")
from gluon import rewrite
from custom_import import custom_import_install
@@ -212,7 +213,7 @@ def LOAD(c=None, f='index', args=None, vars=None,
request.env.path_info
other_request.cid = target
other_request.env.http_web2py_component_element = target
other_request.restful = request.restful # Needed when you call LOAD() on a controller who has some actions decorates with @request.restful()
other_request.restful = types.MethodType(request.restful.im_func, other_request) # A bit nasty, but needed to use LOAD on actions decorated with @request.restful()
other_response.view = '%s/%s.%s' % (c, f, other_request.extension)
other_environment = copy.copy(current.globalenv) # NASTY
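Re-binding request.restful with types.MethodType (instead of copying the bound method) makes the decorator operate on the component's own request rather than the parent's. A minimal sketch of the Python 2 re-binding idiom used above; the class and names are illustrative, not web2py's:

    import types

    class Request(object):
        def restful(self):
            # in web2py this is a decorator that inspects self.env, self.args, ...
            return 'restful bound to %r' % self

    parent = Request()
    component = Request()          # stand-in for the request built inside LOAD()

    # take the underlying function (im_func) and bind it to the component request,
    # so @request.restful() inside the loaded action sees the component's request
    component.restful = types.MethodType(parent.restful.im_func, component)
    assert component.restful() == 'restful bound to %r' % component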
-4
View File
@@ -1,12 +1,8 @@
# fix response
import re
import os
import cPickle
import gluon.serializers
from gluon import current, HTTP
from gluon.html import markmin_serializer, TAG, HTML, BODY, UL, XML, H1
from gluon.contenttype import contenttype
from gluon.contrib.fpdf import FPDF, HTMLMixin
from gluon.sanitizer import sanitize
from gluon.contrib.markmin.markmin2latex import markmin2latex
File diff suppressed because it is too large
File diff suppressed because it is too large
+205 -96
View File
@@ -4,7 +4,7 @@
# The MIT License (MIT)
#
# Copyright (c) 2013 Henry Zhou <jiangwen365@gmail.com> and PyPyODBC contributors
# Copyright (c) 2014 Henry Zhou <jiangwen365@gmail.com> and PyPyODBC contributors
# Copyright (c) 2004 Michele Petrazzo
# Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated
@@ -16,7 +16,7 @@
# of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO
# THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO #EVENT SHALL THE
# THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF
# CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
# DEALINGS IN THE SOFTWARE.
@@ -25,7 +25,7 @@ pooling = True
apilevel = '2.0'
paramstyle = 'qmark'
threadsafety = 1
version = '1.2.0'
version = '1.3.0'
lowercase=True
DEBUG = 0
@@ -592,38 +592,38 @@ if sys.platform == 'cli':
# http://infocenter.sybase.com/help/index.jsp?topic=/com.sybase.help.sdk_12.5.1.aseodbc/html/aseodbc/CACFDIGH.htm
SQL_data_type_dict = { \
#SQL Data TYPE 0.Python Data Type 1.Default Output Converter 2.Buffer Type 3.Buffer Allocator 4.Default Buffer Size
SQL_TYPE_NULL : (None, lambda x: None, SQL_C_CHAR, create_buffer, 2 ),
SQL_CHAR : (str, lambda x: x, SQL_C_CHAR, create_buffer, 2048 ),
SQL_NUMERIC : (Decimal, Decimal_cvt, SQL_C_CHAR, create_buffer, 150 ),
SQL_DECIMAL : (Decimal, Decimal_cvt, SQL_C_CHAR, create_buffer, 150 ),
SQL_INTEGER : (int, int, SQL_C_CHAR, create_buffer, 150 ),
SQL_SMALLINT : (int, int, SQL_C_CHAR, create_buffer, 150 ),
SQL_FLOAT : (float, float, SQL_C_CHAR, create_buffer, 150 ),
SQL_REAL : (float, float, SQL_C_CHAR, create_buffer, 150 ),
SQL_DOUBLE : (float, float, SQL_C_CHAR, create_buffer, 200 ),
SQL_DATE : (datetime.date, dt_cvt, SQL_C_CHAR , create_buffer, 30 ),
SQL_TIME : (datetime.time, tm_cvt, SQL_C_CHAR, create_buffer, 20 ),
SQL_SS_TIME2 : (datetime.time, tm_cvt, SQL_C_CHAR, create_buffer, 20 ),
SQL_TIMESTAMP : (datetime.datetime, dttm_cvt, SQL_C_CHAR, create_buffer, 30 ),
SQL_VARCHAR : (str, lambda x: x, SQL_C_CHAR, create_buffer, 2048 ),
SQL_LONGVARCHAR : (str, lambda x: x, SQL_C_CHAR, create_buffer, 20500 ),
SQL_BINARY : (bytearray, bytearray_cvt, SQL_C_BINARY, create_buffer, 5120 ),
SQL_VARBINARY : (bytearray, bytearray_cvt, SQL_C_BINARY, create_buffer, 5120 ),
SQL_LONGVARBINARY : (bytearray, bytearray_cvt, SQL_C_BINARY, create_buffer, 20500 ),
SQL_BIGINT : (long, long, SQL_C_CHAR, create_buffer, 150 ),
SQL_TINYINT : (int, int, SQL_C_CHAR, create_buffer, 150 ),
SQL_BIT : (bool, lambda x:x == BYTE_1, SQL_C_CHAR, create_buffer, 2 ),
SQL_WCHAR : (unicode, lambda x: x, SQL_C_WCHAR, create_buffer_u, 2048 ),
SQL_WVARCHAR : (unicode, lambda x: x, SQL_C_WCHAR, create_buffer_u, 2048 ),
SQL_GUID : (str, str, SQL_C_CHAR, create_buffer, 50 ),
SQL_WLONGVARCHAR : (unicode, lambda x: x, SQL_C_WCHAR, create_buffer_u, 20500 ),
SQL_TYPE_DATE : (datetime.date, dt_cvt, SQL_C_CHAR, create_buffer, 30 ),
SQL_TYPE_TIME : (datetime.time, tm_cvt, SQL_C_CHAR, create_buffer, 20 ),
SQL_TYPE_TIMESTAMP : (datetime.datetime, dttm_cvt, SQL_C_CHAR, create_buffer, 30 ),
SQL_SS_VARIANT : (str, lambda x: x, SQL_C_CHAR, create_buffer, 2048 ),
SQL_SS_XML : (unicode, lambda x: x, SQL_C_WCHAR, create_buffer_u, 20500 ),
SQL_SS_UDT : (bytearray, bytearray_cvt, SQL_C_BINARY, create_buffer, 5120 ),
#SQL Data TYPE 0.Python Data Type 1.Default Output Converter 2.Buffer Type 3.Buffer Allocator 4.Default Size 5.Variable Length
SQL_TYPE_NULL : (None, lambda x: None, SQL_C_CHAR, create_buffer, 2 , False ),
SQL_CHAR : (str, lambda x: x, SQL_C_CHAR, create_buffer, 2048 , False ),
SQL_NUMERIC : (Decimal, Decimal_cvt, SQL_C_CHAR, create_buffer, 150 , False ),
SQL_DECIMAL : (Decimal, Decimal_cvt, SQL_C_CHAR, create_buffer, 150 , False ),
SQL_INTEGER : (int, int, SQL_C_CHAR, create_buffer, 150 , False ),
SQL_SMALLINT : (int, int, SQL_C_CHAR, create_buffer, 150 , False ),
SQL_FLOAT : (float, float, SQL_C_CHAR, create_buffer, 150 , False ),
SQL_REAL : (float, float, SQL_C_CHAR, create_buffer, 150 , False ),
SQL_DOUBLE : (float, float, SQL_C_CHAR, create_buffer, 200 , False ),
SQL_DATE : (datetime.date, dt_cvt, SQL_C_CHAR, create_buffer, 30 , False ),
SQL_TIME : (datetime.time, tm_cvt, SQL_C_CHAR, create_buffer, 20 , False ),
SQL_SS_TIME2 : (datetime.time, tm_cvt, SQL_C_CHAR, create_buffer, 20 , False ),
SQL_TIMESTAMP : (datetime.datetime, dttm_cvt, SQL_C_CHAR, create_buffer, 30 , False ),
SQL_VARCHAR : (str, lambda x: x, SQL_C_CHAR, create_buffer, 2048 , False ),
SQL_LONGVARCHAR : (str, lambda x: x, SQL_C_CHAR, create_buffer, 20500 , True ),
SQL_BINARY : (bytearray, bytearray_cvt, SQL_C_BINARY, create_buffer, 5120 , True ),
SQL_VARBINARY : (bytearray, bytearray_cvt, SQL_C_BINARY, create_buffer, 5120 , True ),
SQL_LONGVARBINARY : (bytearray, bytearray_cvt, SQL_C_BINARY, create_buffer, 20500 , True ),
SQL_BIGINT : (long, long, SQL_C_CHAR, create_buffer, 150 , False ),
SQL_TINYINT : (int, int, SQL_C_CHAR, create_buffer, 150 , False ),
SQL_BIT : (bool, lambda x:x == BYTE_1, SQL_C_CHAR, create_buffer, 2 , False ),
SQL_WCHAR : (unicode, lambda x: x, SQL_C_WCHAR, create_buffer_u, 2048 , False ),
SQL_WVARCHAR : (unicode, lambda x: x, SQL_C_WCHAR, create_buffer_u, 2048 , False ),
SQL_GUID : (str, str, SQL_C_CHAR, create_buffer, 2048 , False ),
SQL_WLONGVARCHAR : (unicode, lambda x: x, SQL_C_WCHAR, create_buffer_u, 20500 , True ),
SQL_TYPE_DATE : (datetime.date, dt_cvt, SQL_C_CHAR, create_buffer, 30 , False ),
SQL_TYPE_TIME : (datetime.time, tm_cvt, SQL_C_CHAR, create_buffer, 20 , False ),
SQL_TYPE_TIMESTAMP : (datetime.datetime, dttm_cvt, SQL_C_CHAR, create_buffer, 30 , False ),
SQL_SS_VARIANT : (str, lambda x: x, SQL_C_CHAR, create_buffer, 2048 , True ),
SQL_SS_XML : (unicode, lambda x: x, SQL_C_WCHAR, create_buffer_u, 20500 , True ),
SQL_SS_UDT : (bytearray, bytearray_cvt, SQL_C_BINARY, create_buffer, 5120 , True ),
}
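Each entry in SQL_data_type_dict now carries a sixth element, a variable-length flag, next to the Python type, output converter, C buffer type, buffer allocator and default buffer size. A small sketch of how an entry could be consumed, assuming the constants defined above are in scope:

    # unpack the 6-tuple for a SQL type code; dynamic_length marks columns that
    # are too large to bind up front and are fetched with SQLGetData instead
    py_type, converter, buf_type, allocator, buf_size, dynamic_length = \
        SQL_data_type_dict[SQL_VARCHAR]
    buf = allocator(buf_size)      # e.g. create_buffer(2048)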
@@ -645,6 +645,7 @@ SQLRETURN -> ctypes.c_short
funcs_with_ret = [
"SQLAllocHandle",
"SQLBindParameter",
"SQLBindCol",
"SQLCloseCursor",
"SQLColAttribute",
"SQLColumns",
@@ -1175,7 +1176,9 @@ class Cursor:
"""prepare a query"""
#self._free_results(FREE_STATEMENT)
if not self.connection:
self.close()
if type(query_string) == unicode:
c_query_string = wchar_pointer(UCS_buf(query_string))
ret = ODBC_API.SQLPrepareW(self.stmt_h, c_query_string, len(query_string))
@@ -1231,6 +1234,8 @@ class Cursor:
def _BindParams(self, param_types, pram_io_list = []):
"""Create parameter buffers based on param types, and bind them to the statement"""
# Clear the old Parameters
if not self.connection:
self.close()
#self._free_results(NO_FREE_STATEMENT)
# Get the number of query parameters judged by database.
@@ -1414,7 +1419,7 @@ class Cursor:
dec_num, ADDR(ParameterBuffer), BufferLen,ADDR(LenOrIndBuf))
if ret != SQL_SUCCESS:
check_success(self, ret)
# Append the value buffer and the lenth buffer to the array
# Append the value buffer and the length buffer to the array
ParamBufferList.append((ParameterBuffer,LenOrIndBuf,sql_type))
self._last_param_types = param_types
@@ -1426,6 +1431,9 @@ class Cursor:
If parameters are provided, the query would first be prepared, then executed with parameters;
If parameters are not provided (only the query string), it would be executed directly
"""
if not self.connection:
self.close()
self._free_stmt(SQL_CLOSE)
if params:
# If parameters exist, first prepare the query then executed with parameters
@@ -1549,7 +1557,7 @@ class Cursor:
c_buf_len = len(c_char_buf)
elif param_types[col_num][0] == 'bi':
c_char_buf = str(param_val)
c_char_buf = str_8b(param_val)
c_buf_len = len(c_char_buf)
else:
@@ -1587,6 +1595,8 @@ class Cursor:
def _SQLExecute(self):
if not self.connection:
self.close()
ret = SQLExecute(self.stmt_h)
if ret != SQL_SUCCESS:
check_success(self, ret)
@@ -1594,6 +1604,9 @@ class Cursor:
def execdirect(self, query_string):
"""Execute a query directly"""
if not self.connection:
self.close()
self._free_stmt()
self._last_param_types = None
self.statement = None
@@ -1611,6 +1624,8 @@ class Cursor:
def callproc(self, procname, args):
if not self.connection:
self.close()
raise Warning('', 'Still not fully implemented')
self._pram_io_list = [row[4] for row in self.procedurecolumns(procedure = procname).fetchall() if row[4] not in (SQL_RESULT_COL, SQL_RETURN_VALUE)]
@@ -1637,6 +1652,9 @@ class Cursor:
def executemany(self, query_string, params_list = [None]):
if not self.connection:
self.close()
for params in params_list:
self.execute(query_string, params, many_mode = True)
self._NumOfRows()
@@ -1647,28 +1665,38 @@ class Cursor:
def _CreateColBuf(self):
if not self.connection:
self.close()
self._free_stmt(SQL_UNBIND)
NOC = self._NumOfCols()
self._ColBufferList = []
bind_data = True
for col_num in range(NOC):
col_name = self.description[col_num][0]
col_sql_data_type = self._ColTypeCodeList[col_num]
col_name = self.description[col_num][0]
col_size = self.description[col_num][2]
col_sql_data_type = self._ColTypeCodeList[col_num]
target_type = SQL_data_type_dict[col_sql_data_type][2]
dynamic_length = SQL_data_type_dict[col_sql_data_type][5]
# set default size based on the column's sql data type
total_buf_len = SQL_data_type_dict[col_sql_data_type][4]
# over-write if there's preset size value for "large columns"
if total_buf_len >= 20500:
# over-write if there's pre-set size value for "large columns"
if total_buf_len > 20500:
total_buf_len = self._outputsize.get(None,total_buf_len)
# over-write if there's preset size value for the "col_num" column
# over-write if there's pre-set size value for the "col_num" column
total_buf_len = self._outputsize.get(col_num, total_buf_len)
# if the size of the buffer is very long, do not bind
# because a large buffer decreases performance, and sometimes you only get a NULL value.
# in that case use sqlgetdata instead.
if col_size >= 1024:
dynamic_length = True
alloc_buffer = SQL_data_type_dict[col_sql_data_type][3](total_buf_len)
used_buf_len = c_ssize_t()
target_type = SQL_data_type_dict[col_sql_data_type][2]
force_unicode = self.connection.unicode_results
if force_unicode and col_sql_data_type in (SQL_CHAR,SQL_VARCHAR,SQL_LONGVARCHAR):
@@ -1676,14 +1704,22 @@ class Cursor:
alloc_buffer = create_buffer_u(total_buf_len)
buf_cvt_func = self.connection.output_converter[self._ColTypeCodeList[col_num]]
ADDR(alloc_buffer)
ADDR(used_buf_len)
self._ColBufferList.append([col_name, target_type, used_buf_len, ADDR(used_buf_len), alloc_buffer, ADDR(alloc_buffer), total_buf_len, buf_cvt_func])
if bind_data:
if dynamic_length:
bind_data = False
self._ColBufferList.append([col_name, target_type, used_buf_len, ADDR(used_buf_len), alloc_buffer, ADDR(alloc_buffer), total_buf_len, buf_cvt_func, bind_data])
if bind_data:
ret = ODBC_API.SQLBindCol(self.stmt_h, col_num + 1, target_type, ADDR(alloc_buffer), total_buf_len, ADDR(used_buf_len))
if ret != SQL_SUCCESS:
check_success(self, ret)
def _UpdateDesc(self):
"Get the information of (name, type_code, display_size, internal_size, col_precision, scale, null_ok)"
if not self.connection:
self.close()
force_unicode = self.connection.unicode_results
if force_unicode:
Cname = create_buffer_u(1024)
@@ -1739,6 +1775,9 @@ class Cursor:
def _NumOfRows(self):
"""Get the number of rows"""
if not self.connection:
self.close()
NOR = c_ssize_t()
ret = SQLRowCount(self.stmt_h, ADDR(NOR))
if ret != SQL_SUCCESS:
@@ -1749,6 +1788,9 @@ class Cursor:
def _NumOfCols(self):
"""Get the number of cols"""
if not self.connection:
self.close()
NOC = c_short()
ret = SQLNumResultCols(self.stmt_h, ADDR(NOC))
if ret != SQL_SUCCESS:
@@ -1757,6 +1799,9 @@ class Cursor:
def fetchall(self):
if not self.connection:
self.close()
rows = []
while True:
row = self.fetchone()
@@ -1767,6 +1812,9 @@ class Cursor:
def fetchmany(self, num = None):
if not self.connection:
self.close()
if num is None:
num = self.arraysize
rows = []
@@ -1780,74 +1828,83 @@ class Cursor:
def fetchone(self):
if not self.connection:
self.close()
ret = SQLFetch(self.stmt_h)
if ret == SQL_SUCCESS:
if ret in (SQL_SUCCESS,SQL_SUCCESS_WITH_INFO):
'''Bind buffers for the record set columns'''
value_list = []
col_num = 1
for col_name, target_type, used_buf_len, ADDR_used_buf_len, alloc_buffer, ADDR_alloc_buffer, total_buf_len, buf_cvt_func in self._ColBufferList:
blocks = []
for col_name, target_type, used_buf_len, ADDR_used_buf_len, alloc_buffer, ADDR_alloc_buffer, total_buf_len, buf_cvt_func, bind_data in self._ColBufferList:
raw_data_parts = []
while 1:
ret = SQLGetData(self.stmt_h, col_num, target_type, ADDR_alloc_buffer, total_buf_len, ADDR_used_buf_len)
if bind_data:
ret = SQL_SUCCESS
else:
ret = SQLGetData(self.stmt_h, col_num, target_type, ADDR_alloc_buffer, total_buf_len, ADDR_used_buf_len)
if ret == SQL_SUCCESS:
if used_buf_len.value == SQL_NULL_DATA:
value_list.append(None)
else:
if blocks == []:
if raw_data_parts == []:
# Means no previous data, no need to combine
if target_type == SQL_C_BINARY:
value_list.append(buf_cvt_func(alloc_buffer.raw[:used_buf_len.value]))
elif target_type == SQL_C_WCHAR:
value_list.append(buf_cvt_func(from_buffer_u(alloc_buffer)))
else:
#print col_name, target_type, alloc_buffer.value
value_list.append(buf_cvt_func(alloc_buffer.value))
else:
# There are previous fetched raw data to combine
if target_type == SQL_C_BINARY:
blocks.append(alloc_buffer.raw[:used_buf_len.value])
raw_data_parts.append(alloc_buffer.raw[:used_buf_len.value])
elif target_type == SQL_C_WCHAR:
blocks.append(from_buffer_u(alloc_buffer))
raw_data_parts.append(from_buffer_u(alloc_buffer))
else:
#print col_name, target_type, alloc_buffer.value
blocks.append(alloc_buffer.value)
raw_data_parts.append(alloc_buffer.value)
break
elif ret == SQL_SUCCESS_WITH_INFO:
# Means the data is only partial
if target_type == SQL_C_BINARY:
blocks.append(alloc_buffer.raw)
raw_data_parts.append(alloc_buffer.raw)
else:
blocks.append(alloc_buffer.value)
raw_data_parts.append(alloc_buffer.value)
elif ret == SQL_NO_DATA:
# Means all data has been transmitted
break
else:
check_success(self, ret)
if blocks != []:
if raw_data_parts != []:
if py_v3:
if target_type != SQL_C_BINARY:
raw_value = ''.join(blocks)
raw_value = ''.join(raw_data_parts)
else:
raw_value = BLANK_BYTE.join(blocks)
raw_value = BLANK_BYTE.join(raw_data_parts)
else:
raw_value = ''.join(blocks)
raw_value = ''.join(raw_data_parts)
value_list.append(buf_cvt_func(raw_value))
col_num += 1
return self._row_type(value_list)
else:
if ret == SQL_NO_DATA_FOUND:
return None
else:
check_success(self, ret)
def __next__(self):
self.next()
return self.next()
def next(self):
def next(self):
row = self.fetchone()
if row is None:
raise(StopIteration)
@@ -1858,6 +1915,9 @@ class Cursor:
def skip(self, count = 0):
if not self.connection:
self.close()
for i in range(count):
ret = ODBC_API.SQLFetchScroll(self.stmt_h, SQL_FETCH_NEXT, 0)
if ret != SQL_SUCCESS:
@@ -1867,6 +1927,9 @@ class Cursor:
def nextset(self):
if not self.connection:
self.close()
ret = ODBC_API.SQLMoreResults(self.stmt_h)
if ret not in (SQL_SUCCESS, SQL_NO_DATA):
check_success(self, ret)
@@ -1882,6 +1945,9 @@ class Cursor:
def _free_stmt(self, free_type = None):
if not self.connection:
self.close()
if not self.connection.connected:
raise ProgrammingError('HY000','Attempt to use a closed connection.')
@@ -1903,6 +1969,9 @@ class Cursor:
def getTypeInfo(self, sqlType = None):
if not self.connection:
self.close()
if sqlType is None:
type = SQL_ALL_TYPES
else:
@@ -1917,6 +1986,9 @@ class Cursor:
def tables(self, table=None, catalog=None, schema=None, tableType=None):
"""Return a list with all tables"""
if not self.connection:
self.close()
l_catalog = l_schema = l_table = l_tableType = 0
if unicode in [type(x) for x in (table, catalog, schema,tableType)]:
@@ -1961,7 +2033,10 @@ class Cursor:
def columns(self, table=None, catalog=None, schema=None, column=None):
"""Return a list with all columns"""
"""Return a list with all columns"""
if not self.connection:
self.close()
l_catalog = l_schema = l_table = l_column = 0
if unicode in [type(x) for x in (table, catalog, schema,column)]:
@@ -2004,6 +2079,9 @@ class Cursor:
def primaryKeys(self, table=None, catalog=None, schema=None):
if not self.connection:
self.close()
l_catalog = l_schema = l_table = 0
if unicode in [type(x) for x in (table, catalog, schema)]:
@@ -2044,6 +2122,9 @@ class Cursor:
def foreignKeys(self, table=None, catalog=None, schema=None, foreignTable=None, foreignCatalog=None, foreignSchema=None):
if not self.connection:
self.close()
l_catalog = l_schema = l_table = l_foreignTable = l_foreignCatalog = l_foreignSchema = 0
if unicode in [type(x) for x in (table, catalog, schema,foreignTable,foreignCatalog,foreignSchema)]:
@@ -2092,6 +2173,9 @@ class Cursor:
def procedurecolumns(self, procedure=None, catalog=None, schema=None, column=None):
if not self.connection:
self.close()
l_catalog = l_schema = l_procedure = l_column = 0
if unicode in [type(x) for x in (procedure, catalog, schema,column)]:
string_p = lambda x:wchar_pointer(UCS_buf(x))
@@ -2132,6 +2216,9 @@ class Cursor:
def procedures(self, procedure=None, catalog=None, schema=None):
if not self.connection:
self.close()
l_catalog = l_schema = l_procedure = 0
if unicode in [type(x) for x in (procedure, catalog, schema)]:
@@ -2170,6 +2257,9 @@ class Cursor:
def statistics(self, table, catalog=None, schema=None, unique=False, quick=True):
if not self.connection:
self.close()
l_table = l_catalog = l_schema = 0
if unicode in [type(x) for x in (table, catalog, schema)]:
@@ -2217,15 +2307,23 @@ class Cursor:
def commit(self):
if not self.connection:
self.close()
self.connection.commit()
def rollback(self):
if not self.connection:
self.close()
self.connection.rollback()
def setoutputsize(self, size, column = None):
if not self.connection:
self.close()
self._outputsize[column] = size
def setinputsizes(self, sizes):
if not self.connection:
self.close()
self._inputsizers = [size for size in sizes]
@@ -2234,35 +2332,31 @@ class Cursor:
# ret = ODBC_API.SQLCloseCursor(self.stmt_h)
# check_success(self, ret)
#
ret = ODBC_API.SQLFreeStmt(self.stmt_h, SQL_CLOSE)
check_success(self, ret)
if self.connection.connected:
ret = ODBC_API.SQLFreeStmt(self.stmt_h, SQL_CLOSE)
check_success(self, ret)
ret = ODBC_API.SQLFreeStmt(self.stmt_h, SQL_UNBIND)
check_success(self, ret)
ret = ODBC_API.SQLFreeStmt(self.stmt_h, SQL_UNBIND)
check_success(self, ret)
ret = ODBC_API.SQLFreeStmt(self.stmt_h, SQL_RESET_PARAMS)
check_success(self, ret)
ret = ODBC_API.SQLFreeStmt(self.stmt_h, SQL_RESET_PARAMS)
check_success(self, ret)
ret = ODBC_API.SQLFreeHandle(SQL_HANDLE_STMT, self.stmt_h)
check_success(self, ret)
ret = ODBC_API.SQLFreeHandle(SQL_HANDLE_STMT, self.stmt_h)
check_success(self, ret)
self.closed = True
def __del__(self):
if not self.closed:
#if DEBUG:print 'auto closing cursor: ',
try:
self.close()
except:
#if DEBUG:print 'failed'
pass
else:
#if DEBUG:print 'succeed'
pass
self.close()
def __exit__(self, type, value, traceback):
if not self.connection:
self.close()
if value:
self.rollback()
else:
@@ -2291,7 +2385,7 @@ class Connection:
self.autocommit = autocommit
self.readonly = False
self.timeout = 0
self._cursors = []
# self._cursors = []
for key, value in list(kargs.items()):
connectString = connectString + key + '=' + value + ';'
self.connectString = connectString
@@ -2425,7 +2519,7 @@ class Connection:
if not self.connected:
raise ProgrammingError('HY000','Attempt to use a closed connection.')
cur = Cursor(self, row_type_callable=row_type_callable)
self._cursors.append(cur)
# self._cursors.append(cur)
return cur
def update_db_special_info(self):
@@ -2436,6 +2530,7 @@ class Connection:
SQL_SS_TIME2,
):
cur = Cursor(self)
try:
info_tuple = cur.getTypeInfo(sql_type)
if info_tuple is not None:
@@ -2534,10 +2629,10 @@ class Connection:
def close(self):
if not self.connected:
raise ProgrammingError('HY000','Attempt to close a closed connection.')
for cur in self._cursors:
if not cur is None:
if not cur.closed:
cur.close()
# for cur in self._cursors:
# if not cur is None:
# if not cur.closed:
# cur.close()
if self.connected:
#if DEBUG:print 'disconnect'
@@ -2680,4 +2775,18 @@ def dataSources():
ctrl_err(SQL_HANDLE_ENV, shared_env_h, ret)
else:
dsn_list[dsn.value] = desc.value
return dsn_list
return dsn_list
def monkey_patch_for_gevent():
import functools, gevent
apply_e = gevent.get_hub().threadpool.apply_e
def monkey_patch(func):
@functools.wraps(func)
def wrap(*args, **kwargs):
#if DEBUG:print('%s called with %s %s' % (func, args, kwargs))
return apply_e(Exception, func, args, kwargs)
return wrap
for attr in dir(ODBC_API):
if attr.startswith('SQL') and hasattr(getattr(ODBC_API, attr), 'argtypes'):
setattr(ODBC_API, attr, monkey_patch(getattr(ODBC_API, attr)))
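The new monkey_patch_for_gevent helper wraps every SQL* entry point of the loaded ODBC library so the blocking calls are pushed onto gevent's threadpool. A hedged usage sketch; the DSN string is hypothetical and patching is typically done once at startup:

    import pypyodbc

    pypyodbc.monkey_patch_for_gevent()   # route blocking ODBC calls through gevent's threadpool
    conn = pypyodbc.connect('DSN=mydsn;UID=user;PWD=secret')   # hypothetical DSN
    cur = conn.cursor()
    cur.execute('SELECT 1')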
+5 -2
View File
@@ -6,7 +6,10 @@ import redis
from redis.exceptions import ConnectionError
from gluon import current
from gluon.cache import CacheAbstract
import cPickle as pickle
try:
import cPickle as pickle
except:
import pickle
import time
import re
import logging
@@ -165,7 +168,7 @@ class RedisClient(object):
expireat = int(time.time() + time_expire) + 120
bucket_key = "%s:%s" % (cache_set_key, expireat / 60)
value = f()
value_ = pickle.dumps(value)
value_ = pickle.dumps(value, pickle.HIGHEST_PROTOCOL)
if time_expire == 0:
time_expire = 1
self.r_server.setex(key, value_, time_expire)
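Besides the cPickle/pickle import fallback, cached values are now serialized with pickle.HIGHEST_PROTOCOL (protocol 2 on Python 2), which produces a compact binary payload instead of the default ASCII protocol 0, so the values stored in Redis are smaller and faster to dump and load. A quick stdlib-only illustration; the exact sizes are indicative:

    try:
        import cPickle as pickle
    except ImportError:
        import pickle

    data = {'rows': range(1000)}
    print len(pickle.dumps(data))                             # protocol 0, ASCII
    print len(pickle.dumps(data, pickle.HIGHEST_PROTOCOL))    # protocol 2, much smaller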
-3
View File
@@ -3,12 +3,9 @@ Developed by niphlod@gmail.com
"""
import redis
from redis.exceptions import ConnectionError
from gluon import current
from gluon.storage import Storage
import cPickle as pickle
import time
import re
import logging
import thread
+7 -6
View File
@@ -31,12 +31,13 @@ An interactive, stateful AJAX shell that runs Python code on the server.
import logging
import new
import os
import cPickle
try:
import cPickle as pickle
except:
import pickle
import sys
import traceback
import types
import wsgiref.handlers
import StringIO
import threading
locker = threading.RLock()
@@ -100,7 +101,7 @@ class History:
name: the name of the global to remove
value: any picklable value
"""
blob = cPickle.dumps(value)
blob = pickle.dumps(value, pickle.HIGHEST_PROTOCOL)
if name in self.global_names:
index = self.global_names.index(name)
@@ -159,7 +160,7 @@ def represent(obj):
code below to determine whether the object changes over time.
"""
try:
return cPickle.dumps(obj)
return pickle.dumps(obj, pickle.HIGHEST_PROTOCOL)
except:
return repr(obj)
@@ -258,7 +259,7 @@ def run(history, statement, env={}):
if not name.startswith('__'):
try:
history.set_global(name, val)
except (TypeError, cPickle.PicklingError), ex:
except (TypeError, pickle.PicklingError), ex:
UNPICKLABLE_TYPES.append(type(val))
history.add_unpicklable(statement, new_globals.keys())
+3 -1
View File
@@ -20,7 +20,7 @@ import datetime
import logging
from http import HTTP
from gzip import open as gzopen
from recfile import generate
__all__ = [
'parse_version',
@@ -400,6 +400,8 @@ def get_session(request, other_application='admin'):
session_id = request.cookies['session_id_' + other_application].value
session_filename = os.path.join(
up(request.folder), other_application, 'sessions', session_id)
if not os.path.exists(session_filename):
session_filename = generate(session_filename)
osession = storage.load_storage(session_filename)
except Exception, e:
osession = storage.Storage()
+20 -12
View File
@@ -25,15 +25,19 @@ from gluon.serializers import json, custom_json
import gluon.settings as settings
from gluon.utils import web2py_uuid, secure_dumps, secure_loads
from gluon.settings import global_settings
from gluon.dal import Field
from gluon import recfile
import hashlib
import portalocker
import cPickle
try:
import cPickle as pickle
except:
import pickle
from pickle import Pickler, MARK, DICT, EMPTY_DICT
from types import DictionaryType
import cStringIO
import datetime
import re
import copy_reg
import Cookie
import os
import sys
@@ -166,7 +170,6 @@ class Request(Storage):
- is_local
- is_https
- restful()
- settings
"""
def __init__(self, env):
@@ -826,11 +829,11 @@ class Session(Storage):
'sessions', response.session_id)
try:
response.session_file = \
open(response.session_filename, 'rb+')
recfile.open(response.session_filename, 'rb+')
portalocker.lock(response.session_file,
portalocker.LOCK_EX)
response.session_locked = True
self.update(cPickle.load(response.session_file))
self.update(pickle.load(response.session_file))
response.session_file.seek(0)
oc = response.session_filename.split('/')[-1].split('-')[0]
if check_client and response.session_client != oc:
@@ -895,7 +898,7 @@ class Session(Storage):
if row:
# rows[0].update_record(locked=True)
# Unpickle the data
session_data = cPickle.loads(row.session_data)
session_data = pickle.loads(row.session_data)
self.update(session_data)
response.session_new = False
else:
@@ -907,7 +910,7 @@ class Session(Storage):
else:
response.session_id = None
response.session_new = True
# if there is no session id yet, we'll need to create a
# if there is no session id yet, we'll need to create a
# new session
else:
response.session_new = True
@@ -925,7 +928,7 @@ class Session(Storage):
response.cookies[response.session_id_name]['expires'] = \
cookie_expires.strftime(FMT)
session_pickled = cPickle.dumps(self)
session_pickled = pickle.dumps(self, pickle.HIGHEST_PROTOCOL)
response.session_hash = hashlib.md5(session_pickled).hexdigest()
if self.flash:
@@ -1084,7 +1087,7 @@ class Session(Storage):
return True
def _unchanged(self, response):
session_pickled = cPickle.dumps(self)
session_pickled = pickle.dumps(self, pickle.HIGHEST_PROTOCOL)
response.session_pickled = session_pickled
session_hash = hashlib.md5(session_pickled).hexdigest()
return response.session_hash == session_hash
@@ -1111,7 +1114,7 @@ class Session(Storage):
else:
unique_key = response.session_db_unique_key
session_pickled = response.session_pickled or cPickle.dumps(self)
session_pickled = response.session_pickled or pickle.dumps(self, pickle.HIGHEST_PROTOCOL)
dd = dict(locked=False,
client_ip=response.session_client,
@@ -1148,11 +1151,11 @@ class Session(Storage):
session_folder = os.path.dirname(response.session_filename)
if not os.path.exists(session_folder):
os.mkdir(session_folder)
response.session_file = open(response.session_filename, 'wb')
response.session_file = recfile.open(response.session_filename, 'wb')
portalocker.lock(response.session_file, portalocker.LOCK_EX)
response.session_locked = True
if response.session_file:
session_pickled = response.session_pickled or cPickle.dumps(self)
session_pickled = response.session_pickled or pickle.dumps(self, pickle.HIGHEST_PROTOCOL)
response.session_file.write(session_pickled)
response.session_file.truncate()
finally:
@@ -1177,3 +1180,8 @@ class Session(Storage):
del response.session_file
except:
pass
def pickle_session(s):
return Session, (dict(s),)
copy_reg.pickle(Session, pickle_session)
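The pickle_session reducer registered with copy_reg makes Session objects pickle as a constructor call on their plain dict contents, so stored sessions carry no class internals. A simplified stand-in (not the real gluon.globals.Session, which subclasses Storage) showing the mechanism:

    import copy_reg
    import pickle

    class Session(dict):            # stand-in for the real Session
        pass

    def pickle_session(s):
        return Session, (dict(s),)

    copy_reg.pickle(Session, pickle_session)

    s = Session(user_id=1, flash='saved')
    blob = pickle.dumps(s, pickle.HIGHEST_PROTOCOL)
    assert pickle.loads(blob) == {'user_id': 1, 'flash': 'saved'}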
+16 -3
View File
@@ -21,7 +21,10 @@ import sanitizer
import itertools
import decoder
import copy_reg
import cPickle
try:
import cPickle as pickle
except:
import pickle
import marshal
from HTMLParser import HTMLParser
@@ -31,6 +34,7 @@ from gluon.storage import Storage
from gluon.utils import web2py_uuid, simple_hash, compare
from gluon.highlight import highlight
regex_crlf = re.compile('\r|\n')
join = ''.join
@@ -43,6 +47,7 @@ entitydefs.setdefault('apos', u"'".encode('utf-8'))
__all__ = [
'A',
'ASSIGNJS',
'B',
'BEAUTIFY',
'BODY',
@@ -1240,13 +1245,13 @@ class CAT(DIV):
def TAG_unpickler(data):
return cPickle.loads(data)
return pickle.loads(data)
def TAG_pickler(data):
d = DIV()
d.__dict__ = data.__dict__
marshal_dump = cPickle.dumps(d)
marshal_dump = pickle.dumps(d, pickle.HIGHEST_PROTOCOL)
return (TAG_unpickler, (marshal_dump,))
@@ -2825,6 +2830,14 @@ class MARKMIN(XmlComponent):
def __str__(self):
return self.xml()
def ASSIGNJS(**kargs):
from gluon.serializers import json
s = ""
for key, value in kargs.items():
s+='var %s = %s;\n' % (key, json(value))
return XML(s)
if __name__ == '__main__':
import doctest
doctest.testmod()
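The new ASSIGNJS helper turns keyword arguments into JavaScript var assignments, running every value through the JSON serializer, which is a convenient way to hand server-side values to inline scripts. A usage sketch, assuming the helper is importable as listed in __all__ above; in a view it would typically sit inside a <script> block as {{=ASSIGNJS(...)}}:

    from gluon.html import ASSIGNJS

    snippet = ASSIGNJS(user_id=123, api_root='/myapp/api')
    print snippet.xml()
    # emits something like (keyword order is not guaranteed):
    #   var user_id = 123;
    #   var api_root = "/myapp/api";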
+8 -5
View File
@@ -21,7 +21,10 @@ import datetime
import platform
import portalocker
import fileutils
import cPickle
try:
import cPickle as pickle
except:
import pickle
from gluon.settings import global_settings
logger = logging.getLogger("web2py.cron")
@@ -139,7 +142,7 @@ class Token(object):
ret = None
portalocker.lock(self.master, portalocker.LOCK_EX)
try:
(start, stop) = cPickle.load(self.master)
(start, stop) = pickle.load(self.master)
except:
(start, stop) = (0, 1)
if startup or self.now - start > locktime:
@@ -149,7 +152,7 @@ class Token(object):
logger.warning('WEB2PY CRON: Stale cron.master detected')
logger.debug('WEB2PY CRON: Acquiring lock')
self.master.seek(0)
cPickle.dump((self.now, 0), self.master)
pickle.dump((self.now, 0), self.master)
self.master.flush()
finally:
portalocker.unlock(self.master)
@@ -166,10 +169,10 @@ class Token(object):
portalocker.lock(self.master, portalocker.LOCK_EX)
logger.debug('WEB2PY CRON: Releasing cron lock')
self.master.seek(0)
(start, stop) = cPickle.load(self.master)
(start, stop) = pickle.load(self.master)
if start == self.now: # if this is my lock
self.master.seek(0)
cPickle.dump((self.now, time.time()), self.master)
pickle.dump((self.now, time.time()), self.master)
portalocker.unlock(self.master)
self.master.close()
+63
View File
@@ -0,0 +1,63 @@
import os, uuid
def generate(filename, depth=2, base=512):
if os.path.sep in filename:
path, filename = os.path.split(filename)
else:
path = None
dummyhash = sum(ord(c)*256**(i % 4) for i,c in enumerate(filename)) % base**depth
folders = []
for level in range(depth-1,-1,-1):
code, dummyhash = divmod(dummyhash, base**level)
folders.append("%03x" % code)
folders.append(filename)
if path:
folders.insert(0,path)
return os.path.join(*folders)
def exists(filename, path=None):
if os.path.exists(filename):
return True
if path is None:
path, filename = os.path.split(filename)
fullfilename = os.path.join(path, generate(filename))
if os.path.exists(fullfilename):
return True
return False
def remove(filename, path=None):
if os.path.exists(filename):
return os.unlink(filename)
if path is None:
path, filename = os.path.split(filename)
fullfilename = os.path.join(path, generate(filename))
if os.path.exists(fullfilename):
return os.unlink(fullfilename)
raise IOError
def open(filename, mode="r", path=None):
if not path:
path, filename = os.path.split(filename)
fullfilename = None
if not mode.startswith('w'):
fullfilename = os.path.join(path, filename)
if not os.path.exists(fullfilename):
fullfilename = None
if not fullfilename:
fullfilename = os.path.join(path, generate(filename))
if mode.startswith('w') and not os.path.exists(os.path.dirname(fullfilename)):
os.makedirs(os.path.dirname(fullfilename))
return file(fullfilename, mode)
def test():
if not os.path.exists('tests'):
os.mkdir('tests')
for k in range(20):
filename = os.path.join('tests',str(uuid.uuid4())+'.test')
open(filename, "w").write('test')
assert open(filename, "r").read()=='test'
if exists(filename):
remove(filename)
if __name__ == '__main__':
test()
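The new gluon/recfile.py spreads files over hash-derived subfolders (two levels by default, up to base entries per level), so directories such as sessions/ no longer accumulate huge flat listings; open, exists and remove transparently check both the flat and the nested location. A hedged usage sketch with illustrative paths:

    from gluon import recfile

    name = 'applications/welcome/sessions/127-0-0-1-abc123'
    print recfile.generate(name)
    # -> applications/welcome/sessions/<xxx>/<yyy>/127-0-0-1-abc123
    #    where <xxx>/<yyy> are hex codes derived from a hash of the file name

    f = recfile.open(name, 'wb')    # creates the nested folders on first write
    f.write('data')
    f.close()
    assert recfile.exists(name)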
+8 -5
View File
@@ -11,7 +11,10 @@ Restricted environment to execute application's code
"""
import sys
import cPickle
try:
import cPickle as pickle
except:
import pickle
import traceback
import types
import os
@@ -55,7 +58,7 @@ class TicketStorage(Storage):
try:
table = self._get_table(self.db, self.tablename, request.application)
table.insert(ticket_id=ticket_id,
ticket_data=cPickle.dumps(ticket_data),
ticket_data=pickle.dumps(ticket_data, pickle.HIGHEST_PROTOCOL),
created_datetime=request.now)
self.db.commit()
message = 'In FILE: %(layer)s\n\n%(traceback)s\n'
@@ -68,7 +71,7 @@ class TicketStorage(Storage):
def _store_on_disk(self, request, ticket_id, ticket_data):
ef = self._error_file(request, ticket_id, 'wb')
try:
cPickle.dump(ticket_data, ef)
pickle.dump(ticket_data, ef)
finally:
ef.close()
@@ -103,13 +106,13 @@ class TicketStorage(Storage):
except IOError:
return {}
try:
return cPickle.load(ef)
return pickle.load(ef)
finally:
ef.close()
else:
table = self._get_table(self.db, self.tablename, app)
rows = self.db(table.ticket_id == ticket_id).select()
return cPickle.loads(rows[0].ticket_data) if rows else {}
return pickle.loads(rows[0].ticket_data) if rows else {}
class RestrictedError(Exception):
+7 -4
View File
@@ -96,7 +96,7 @@ IDENTIFIER = "%s#%s" % (socket.gethostname(),os.getpid())
logger = logging.getLogger('web2py.scheduler.%s' % IDENTIFIER)
from gluon import DAL, Field, IS_NOT_EMPTY, IS_IN_SET, IS_NOT_IN_DB
from gluon import IS_INT_IN_RANGE, IS_DATETIME
from gluon import IS_INT_IN_RANGE, IS_DATETIME, IS_IN_DB
from gluon.utils import web2py_uuid
from gluon.storage import Storage
@@ -671,7 +671,10 @@ class Scheduler(MetaScheduler):
db.define_table(
'scheduler_task_deps',
Field('job_name', default='job_0'),
Field('task_parent', 'reference scheduler_task'),
Field('task_parent', 'integer',
requires=IS_IN_DB(db, 'scheduler_task.id',
'%(task_name)s')
),
Field('task_child', 'reference scheduler_task'),
Field('can_visit', 'boolean', default=False),
migrate=self.__get_migrate('scheduler_task_deps', migrate)
@@ -1311,7 +1314,7 @@ class Scheduler(MetaScheduler):
"""
from gluon.dal import Query
sr, st = self.db.scheduler_run, self.db.scheduler_task
if isinstance(ref, int):
if isinstance(ref, (int, long)):
q = st.id == ref
elif isinstance(ref, str):
q = st.uuid == ref
@@ -1362,7 +1365,7 @@ class Scheduler(MetaScheduler):
Experimental
"""
st, sw = self.db.scheduler_task, self.db.scheduler_worker
if isinstance(ref, int):
if isinstance(ref, (int, long)):
q = st.id == ref
elif isinstance(ref, str):
q = st.uuid == ref
+9 -6
View File
@@ -163,15 +163,18 @@ def ics(events, title=None, link=None, timeshift=0, calname=True,
def rss(feed):
if not 'entries' in feed and 'items' in feed:
feed['entries'] = feed['items']
def safestr(obj, key, default=''):
return str(obj[key]).encode('utf-8', 'replace') if key in obj else default
now = datetime.datetime.now()
rss = rss2.RSS2(title=str(feed.get('title', '(notitle)').encode('utf-8', 'replace')),
link=str(feed.get('link', None).encode('utf-8', 'replace')),
description=str(feed.get('description', '').encode('utf-8', 'replace')),
rss = rss2.RSS2(title=safestr(feed,'title'),
link=safestr(feed,'link'),
description=safestr(feed,'description'),
lastBuildDate=feed.get('created_on', now),
items=[rss2.RSSItem(
title=str(entry.get('title', '(notitle)').encode('utf-8', 'replace')),
link=str(entry.get('link', None).encode('utf-8', 'replace')),
description=str(entry.get('description', '').encode('utf-8', 'replace')),
title=safestr(entry,'title','(notitle)'),
link=safestr(entry,'link'),
description=safestr(entry,'description'),
pubDate=entry.get('created_on', now)
) for entry in feed.get('entries', [])])
return rss.to_xml(encoding='utf-8')
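The new safestr helper centralizes the repeated str(...).encode('utf-8', 'replace') calls and, more importantly, falls back to a default when a key is missing, where the old code called .encode on None for feeds without a link. A small sketch of the difference, using a hypothetical feed dict:

    feed = {'title': 'My feed', 'entries': []}        # note: no 'link' key
    # before: str(feed.get('link', None).encode(...)) raised AttributeError on None
    # now:    safestr(feed, 'link') returns '' (the default), so rss(feed) no longer crashes
    assert safestr(feed, 'link') == ''
    assert safestr(feed, 'title') == 'My feed'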
+4 -2
View File
@@ -655,7 +655,7 @@ class AutocompleteWidget(object):
if settings and settings.global_settings.web2py_runtime_gae:
rows = self.db(field.__ge__(self.request.vars[self.keyword]) & field.__lt__(self.request.vars[self.keyword] + u'\ufffd')).select(orderby=self.orderby, limitby=self.limitby, *(self.fields+self.help_fields))
else:
rows = self.db(field.like(self.request.vars[self.keyword] + '%')).select(orderby=self.orderby, limitby=self.limitby, distinct=self.distinct, *(self.fields+self.help_fields))
rows = self.db(field.like(self.request.vars[self.keyword] + '%', case_sensitive=False)).select(orderby=self.orderby, limitby=self.limitby, distinct=self.distinct, *(self.fields+self.help_fields))
if rows:
if self.is_reference:
id_field = self.fields[1]
@@ -1292,7 +1292,7 @@ class SQLFORM(FORM):
xfields.append(
(self.FIELDKEY_DELETE_RECORD + SQLFORM.ID_ROW_SUFFIX,
LABEL(
T(delete_label), separator,
T(delete_label), sep,
_for=self.FIELDKEY_DELETE_RECORD,
_id=self.FIELDKEY_DELETE_RECORD + \
SQLFORM.ID_LABEL_SUFFIX),
@@ -2114,6 +2114,8 @@ class SQLFORM(FORM):
field_id = groupby #take the field passed as groupby
elif groupby and isinstance(groupby, Expression):
field_id = groupby.first #take the first groupby field
while not(isinstance(field_id, Field)): # Navigate to the first Field of the expression
field_id = field_id.first
table = field_id.table
tablename = table._tablename
if not any(str(f) == str(field_id) for f in fields):
+15 -5
View File
@@ -12,7 +12,11 @@ Provides:
- Storage; like dictionary allowing also for `obj.foo` for `obj['foo']`
"""
import cPickle
try:
import cPickle as pickle
except:
import pickle
import copy_reg
import gluon.portalocker as portalocker
__all__ = ['List', 'Storage', 'Settings', 'Messages',
@@ -129,6 +133,12 @@ class Storage(dict):
values = self.getlist(key)
return values[-1] if values else default
def pickle_storage(s):
return Storage, (dict(s),)
copy_reg.pickle(Storage, pickle_storage)
PICKABLE = (str, int, long, float, bool, list, dict, tuple, set)
@@ -141,10 +151,10 @@ class StorageList(Storage):
def __getattr__(self, key):
if key in self:
return getattr(self, key)
return self.get(key)
else:
r = []
setattr(self, key, r)
self[key] = r
return r
@@ -152,7 +162,7 @@ def load_storage(filename):
fp = None
try:
fp = portalocker.LockedFile(filename, 'rb')
storage = cPickle.load(fp)
storage = pickle.load(fp)
finally:
if fp:
fp.close()
@@ -163,7 +173,7 @@ def save_storage(storage, filename):
fp = None
try:
fp = portalocker.LockedFile(filename, 'wb')
cPickle.dump(dict(storage), fp)
pickle.dump(dict(storage), fp)
finally:
if fp:
fp.close()
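The StorageList.__getattr__ fix replaces the self-referencing getattr/setattr pair, which could recurse, with plain dict access, so a missing attribute simply materializes as an empty list stored under that key; Storage itself is now registered with copy_reg (pickle_storage above) just like Session. A short behaviour sketch, assuming gluon.storage is importable:

    from gluon.storage import StorageList

    s = StorageList()
    s.tags.append('web2py')     # missing key auto-creates and stores an empty list
    print s.tags                # ['web2py']
    print s['tags']             # the same list, kept in the underlying dict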
+10 -6
View File
@@ -279,15 +279,19 @@ class TemplateParser(object):
self.context = context
# allow optional alternative delimiters
if delimiters is None:
delimiters = context.get('response', {})\
.get('app_settings',{}).get('template_delimiters')
if delimiters != self.default_delimiters:
escaped_delimiters = (escape(elimiters[0]),
escaped_delimiters = (escape(delimiters[0]),
escape(delimiters[1]))
self.r_tag = compile(r'(%s.*?%s)' % escaped_delimiters, DOTALL)
else:
delimiters = self.default_delimiters
elif hasattr(context.get('response', None), 'delimiters'):
if context['response'].delimiters != self.default_delimiters:
delimiters = context['response'].delimiters
escaped_delimiters = (
escape(delimiters[0]),
escape(delimiters[1]))
self.r_tag = compile(r'(%s.*?%s)' % escaped_delimiters,
DOTALL)
self.delimiters = delimiters
# Create a root level Content that everything will go into.
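The parser now also honours per-application delimiters taken from response.app_settings['template_delimiters'], in addition to response.delimiters, and the elimiters typo is gone. A hedged sketch of switching delimiters for an application, for example to avoid clashing with a client-side template engine; placing it in a model file is an assumption:

    # e.g. in models/db.py
    response.delimiters = ('[[', ']]')
    # views are then written as [[=message]] instead of {{=message}}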
+11 -8
View File
@@ -2,14 +2,8 @@ import os, sys
from test_http import *
from test_cache import *
NOSQL = any([name in (os.getenv("DB") or "")
for name in ("datastore", "mongodb", "imap")])
if NOSQL:
from test_dal_nosql import *
else:
from test_dal import *
from test_contenttype import *
from test_fileutils import *
from test_html import *
from test_is_url import *
from test_languages import *
@@ -25,3 +19,12 @@ from test_web import *
if sys.version[:3] == '2.7':
from test_old_doctests import *
NOSQL = any([name in (os.getenv("DB") or "")
for name in ("datastore", "mongodb", "imap")])
if NOSQL:
from test_dal_nosql import *
else:
from test_dal import *
+30
View File
@@ -0,0 +1,30 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import os
import sys
def fix_sys_path(current_path):
"""
logic to have always the correct sys.path
'', web2py/gluon, web2py/site-packages, web2py/ ...
"""
def add_path_first(path):
sys.path = [path] + [p for p in sys.path if (
not p == path and not p == (path + '/'))]
path = os.path.dirname(os.path.abspath(current_path))
if not os.path.isfile(os.path.join(path,'web2py.py')):
i = 0
while i<10:
i += 1
if os.path.exists(os.path.join(path,'web2py.py')):
break
path = os.path.abspath(os.path.join(path, '..'))
paths = [path,
os.path.abspath(os.path.join(path, 'site-packages')),
os.path.abspath(os.path.join(path, 'gluon')),
'']
[add_path_first(path) for path in paths]
+41 -31
View File
@@ -4,43 +4,15 @@
"""
Unit tests for gluon.cache
"""
import sys
import os
import unittest
from fix_path import fix_sys_path
def fix_sys_path():
"""
logic to have always the correct sys.path
'', web2py/gluon, web2py/site-packages, web2py/ ...
"""
def add_path_first(path):
sys.path = [path] + [p for p in sys.path if (
not p == path and not p == (path + '/'))]
path = os.path.dirname(os.path.abspath(__file__))
if not os.path.isfile(os.path.join(path,'web2py.py')):
i = 0
while i<10:
i += 1
if os.path.exists(os.path.join(path,'web2py.py')):
break
path = os.path.abspath(os.path.join(path, '..'))
paths = [path,
os.path.abspath(os.path.join(path, 'site-packages')),
os.path.abspath(os.path.join(path, 'gluon')),
'']
[add_path_first(path) for path in paths]
fix_sys_path()
fix_sys_path(__file__)
from storage import Storage
from cache import CacheInRam, CacheOnDisk
from cache import CacheInRam, CacheOnDisk, Cache
oldcwd = None
@@ -76,6 +48,20 @@ class TestCache(unittest.TestCase):
cache.clear()
self.assertEqual(cache('a', lambda: 3, 100), 3)
self.assertEqual(cache('a', lambda: 4, 0), 4)
#test singleton behaviour
cache = CacheInRam()
cache.clear()
self.assertEqual(cache('a', lambda: 3, 100), 3)
self.assertEqual(cache('a', lambda: 4, 0), 4)
#test key deletion
cache('a', None)
self.assertEqual(cache('a', lambda: 5, 100), 5)
#test increment
self.assertEqual(cache.increment('a'), 6)
self.assertEqual(cache('a', lambda: 1, 100), 6)
cache.increment('b')
self.assertEqual(cache('b', lambda: 'x', 100), 1)
def testCacheOnDisk(self):
@@ -93,6 +79,30 @@ class TestCache(unittest.TestCase):
cache.clear()
self.assertEqual(cache('a', lambda: 3, 100), 3)
self.assertEqual(cache('a', lambda: 4, 0), 4)
#test singleton behaviour
cache = CacheOnDisk(s)
cache.clear()
self.assertEqual(cache('a', lambda: 3, 100), 3)
self.assertEqual(cache('a', lambda: 4, 0), 4)
#test key deletion
cache('a', None)
self.assertEqual(cache('a', lambda: 5, 100), 5)
#test increment
self.assertEqual(cache.increment('a'), 6)
self.assertEqual(cache('a', lambda: 1, 100), 6)
cache.increment('b')
self.assertEqual(cache('b', lambda: 'x', 100), 1)
def testCacheWithPrefix(self):
s = Storage({'application': 'admin',
'folder': 'applications/admin'})
cache = Cache(s)
prefix = cache.with_prefix(cache.ram,'prefix')
self.assertEqual(prefix('a', lambda: 1, 0), 1)
self.assertEqual(prefix('a', lambda: 2, 100), 1)
self.assertEqual(cache.ram('prefixa', lambda: 2, 100), 1)
if __name__ == '__main__':
+43
View File
@@ -0,0 +1,43 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
Unit tests for gluon.contenttype
"""
import unittest
from fix_path import fix_sys_path
fix_sys_path(__file__)
from contenttype import contenttype
class TestContentType(unittest.TestCase):
def testTypeRecognition(self):
rtn = contenttype('.png')
self.assertEqual(rtn, 'image/png')
rtn = contenttype('.gif')
self.assertEqual(rtn, 'image/gif')
rtn = contenttype('.tar.bz2')
self.assertEqual(rtn, 'application/x-bzip-compressed-tar')
# test overrides and additions
mapping = {
'.load': 'text/html; charset=utf-8',
'.json': 'application/json',
'.jsonp': 'application/jsonp',
'.pickle': 'application/python-pickle',
'.w2p': 'application/w2p',
'.md': 'text/x-markdown; charset=utf-8'
}
for k, v in mapping.iteritems():
self.assertEqual(contenttype(k), v)
# test without dot extension
rtn = contenttype('png')
self.assertEqual(rtn, 'text/plain; charset=utf-8')
if __name__ == '__main__':
unittest.main()
+2 -29
View File
@@ -3,37 +3,10 @@
""" Unit tests for contribs """
import sys
import os
import unittest
from fix_path import fix_sys_path
def fix_sys_path():
"""
logic to have always the correct sys.path
'', web2py/gluon, web2py/site-packages, web2py/ ...
"""
def add_path_first(path):
sys.path = [path] + [p for p in sys.path if (
not p == path and not p == (path + '/'))]
path = os.path.dirname(os.path.abspath(__file__))
if not os.path.isfile(os.path.join(path,'web2py.py')):
i = 0
while i<10:
i += 1
if os.path.exists(os.path.join(path,'web2py.py')):
break
path = os.path.abspath(os.path.join(path, '..'))
paths = [path,
os.path.abspath(os.path.join(path, 'site-packages')),
os.path.abspath(os.path.join(path, 'gluon')),
'']
[add_path_first(path) for path in paths]
fix_sys_path()
fix_sys_path(__file__)
from utils import md5_hash
+23 -32
View File
@@ -15,42 +15,20 @@ try:
except:
from io import StringIO
def fix_sys_path():
"""
logic to have always the correct sys.path
'', web2py/gluon, web2py/site-packages, web2py/ ...
"""
from fix_path import fix_sys_path
def add_path_first(path):
sys.path = [path] + [p for p in sys.path if (
not p == path and not p == (path + '/'))]
path = os.path.dirname(os.path.abspath(__file__))
if not os.path.isfile(os.path.join(path,'web2py.py')):
i = 0
while i<10:
i += 1
if os.path.exists(os.path.join(path,'web2py.py')):
break
path = os.path.abspath(os.path.join(path, '..'))
paths = [path,
os.path.abspath(os.path.join(path, 'site-packages')),
os.path.abspath(os.path.join(path, 'gluon')),
'']
[add_path_first(path) for path in paths]
fix_sys_path()
fix_sys_path(__file__)
#for travis-ci
DEFAULT_URI = os.environ.get('DB', 'sqlite:memory')
DEFAULT_URI = os.getenv('DB', 'sqlite:memory')
print 'Testing against %s engine (%s)' % (DEFAULT_URI.partition(':')[0], DEFAULT_URI)
from dal import DAL, Field
from dal.objects import Table
from dal.helpers.classes import SQLALL
from dal import DAL, Field, Table, SQLALL
from gluon.cache import CacheInRam
ALLOWED_DATATYPES = [
'string',
@@ -131,6 +109,7 @@ class TestFields(unittest.TestCase):
isinstance(f.formatter(datetime.datetime.now()), str)
def testRun(self):
"""Test all field types and their return values"""
db = DAL(DEFAULT_URI, check_reserved=['all'])
for ft in ['string', 'text', 'password', 'upload', 'blob']:
db.define_table('tt', Field('aa', ft, default=''))
@@ -150,8 +129,22 @@ class TestFields(unittest.TestCase):
self.assertEqual(db().select(db.tt.aa)[0].aa, True)
db.tt.drop()
db.define_table('tt', Field('aa', 'json', default={}))
self.assertEqual(db.tt.insert(aa={}), 1)
self.assertEqual(db().select(db.tt.aa)[0].aa, {})
# test different python objects for correct serialization in json
objs = [
{'a' : 1, 'b' : 2},
[1, 2, 3],
'abc',
True,
False,
None,
11,
14.3,
long(11)
]
for obj in objs:
rtn_id = db.tt.insert(aa=obj)
rtn = db(db.tt.id == rtn_id).select().first().aa
self.assertEqual(obj, rtn)
db.tt.drop()
db.define_table('tt', Field('aa', 'date',
default=datetime.date.today()))
@@ -551,9 +544,8 @@ class TestMinMaxSumAvg(unittest.TestCase):
db.tt.drop()
class TestCache(unittest.TestCase):
class TestCacheSelect(unittest.TestCase):
def testRun(self):
from cache import CacheInRam
cache = CacheInRam()
db = DAL(DEFAULT_URI, check_reserved=['all'])
db.define_table('tt', Field('aa'))
@@ -1446,7 +1438,6 @@ class TestQuoting(unittest.TestCase):
db._adapter.types[key]=db._adapter.types[key].replace(
'%(on_delete_action)s','NO ACTION')
t0 = db.define_table('t0',
Field('f', 'string'))
t1 = db.define_table('b',
+2 -27
View File
@@ -19,34 +19,9 @@ try:
except:
from io import StringIO
from fix_path import fix_sys_path
def fix_sys_path():
"""
logic to have always the correct sys.path
'', web2py/gluon, web2py/site-packages, web2py/ ...
"""
def add_path_first(path):
sys.path = [path] + [p for p in sys.path if (
not p == path and not p == (path + '/'))]
path = os.path.dirname(os.path.abspath(__file__))
if not os.path.isfile(os.path.join(path,'web2py.py')):
i = 0
while i<10:
i += 1
if os.path.exists(os.path.join(path,'web2py.py')):
break
path = os.path.abspath(os.path.join(path, '..'))
paths = [path,
os.path.abspath(os.path.join(path, 'site-packages')),
os.path.abspath(os.path.join(path, 'gluon')),
'']
[add_path_first(path) for path in paths]
fix_sys_path()
fix_sys_path(__file__)
#for travis-ci
DEFAULT_URI = os.environ.get('DB', 'sqlite:memory')
+25
View File
@@ -0,0 +1,25 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import unittest
import datetime
from fix_path import fix_sys_path
fix_sys_path(__file__)
from fileutils import parse_version
class TestFileUtils(unittest.TestCase):
def testParseVersion(self):
rtn = parse_version('Version 1.99.0-rc.1+timestamp.2011.09.19.08.23.26')
self.assertEqual(rtn, (1, 99, 0, 'rc.1', datetime.datetime(2011, 9, 19, 8, 23, 26)))
rtn = parse_version('Version 2.9.11-stable+timestamp.2014.09.15.18.31.17')
self.assertEqual(rtn, (2, 9, 11, 'stable', datetime.datetime(2014, 9, 15, 18, 31, 17)))
rtn = parse_version('Version 1.99.0 (2011-09-19 08:23:26)')
self.assertEqual(rtn, (1, 99, 0, 'dev', datetime.datetime(2011, 9, 19, 8, 23, 26)))
if __name__ == '__main__':
unittest.main()
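The three assertions above pin down the two version-string formats parse_version must accept: the semantic form 'Version X.Y.Z-tag+timestamp.Y.m.d.H.M.S' and the legacy 'Version X.Y.Z (date time)' form, which is reported as a 'dev' release. For orientation only, a rough parser that satisfies exactly these cases (the real gluon.fileutils implementation may differ in detail):

import re
import datetime

_SEMANTIC = re.compile(r'^Version (\d+)\.(\d+)\.(\d+)-(.+?)\+timestamp\.(.+)$')
_LEGACY = re.compile(r'^Version (\d+)\.(\d+)\.(\d+) \((.+)\)$')

def parse_version(text):
    # semantic versioning with a timestamp suffix
    m = _SEMANTIC.match(text)
    if m:
        a, b, c, tag, stamp = m.groups()
        when = datetime.datetime.strptime(stamp, '%Y.%m.%d.%H.%M.%S')
        return (int(a), int(b), int(c), tag, when)
    # legacy "(YYYY-MM-DD HH:MM:SS)" form, treated as a 'dev' release
    m = _LEGACY.match(text)
    if m:
        a, b, c, stamp = m.groups()
        when = datetime.datetime.strptime(stamp, '%Y-%m-%d %H:%M:%S')
        return (int(a), int(b), int(c), 'dev', when)
    return None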
+2 -29
@@ -5,37 +5,10 @@
Unit tests for gluon.html
"""
import sys
import os
import unittest
from fix_path import fix_sys_path
def fix_sys_path():
"""
logic to have always the correct sys.path
'', web2py/gluon, web2py/site-packages, web2py/ ...
"""
def add_path_first(path):
sys.path = [path] + [p for p in sys.path if (
not p == path and not p == (path + '/'))]
path = os.path.dirname(os.path.abspath(__file__))
if not os.path.isfile(os.path.join(path,'web2py.py')):
i = 0
while i<10:
i += 1
if os.path.exists(os.path.join(path,'web2py.py')):
break
path = os.path.abspath(os.path.join(path, '..'))
paths = [path,
os.path.abspath(os.path.join(path, 'site-packages')),
os.path.abspath(os.path.join(path, 'gluon')),
'']
[add_path_first(path) for path in paths]
fix_sys_path()
fix_sys_path(__file__)
from html import *
from storage import Storage
+2 -33
@@ -3,38 +3,10 @@
"""Unit tests for http.py """
import sys
import os
import unittest
from fix_path import fix_sys_path
def fix_sys_path():
"""
logic to have always the correct sys.path
'', web2py/gluon, web2py/site-packages, web2py/ ...
"""
def add_path_first(path):
sys.path = [path] + [p for p in sys.path if (
not p == path and not p == (path + '/'))]
path = os.path.dirname(os.path.abspath(__file__))
if not os.path.isfile(os.path.join(path,'web2py.py')):
i = 0
while i<10:
i += 1
if os.path.exists(os.path.join(path,'web2py.py')):
break
path = os.path.abspath(os.path.join(path, '..'))
paths = [path,
os.path.abspath(os.path.join(path, 'site-packages')),
os.path.abspath(os.path.join(path, 'gluon')),
'']
[add_path_first(path) for path in paths]
fix_sys_path()
fix_sys_path(__file__)
from http import HTTP, defined_status
@@ -68,8 +40,5 @@ class TestHTTP(unittest.TestCase):
# test wrong call detection
if __name__ == '__main__':
unittest.main()
+5 -33
@@ -4,42 +4,14 @@
Unit tests for IS_URL()
"""
import sys
import os
import unittest
from fix_path import fix_sys_path
fix_sys_path(__file__)
def fix_sys_path():
"""
logic to have always the correct sys.path
'', web2py/gluon, web2py/site-packages, web2py/ ...
"""
def add_path_first(path):
sys.path = [path] + [p for p in sys.path if (
not p == path and not p == (path + '/'))]
path = os.path.dirname(os.path.abspath(__file__))
if not os.path.isfile(os.path.join(path,'web2py.py')):
i = 0
while i<10:
i += 1
if os.path.exists(os.path.join(path,'web2py.py')):
break
path = os.path.abspath(os.path.join(path, '..'))
paths = [path,
os.path.abspath(os.path.join(path, 'site-packages')),
os.path.abspath(os.path.join(path, 'gluon')),
'']
[add_path_first(path) for path in paths]
fix_sys_path()
from validators import IS_URL, IS_HTTP_URL, IS_GENERIC_URL, \
unicode_to_ascii_authority
from validators import IS_URL, IS_HTTP_URL, IS_GENERIC_URL
from validators import unicode_to_ascii_authority
class TestIsUrl(unittest.TestCase):
+8 -35
@@ -7,38 +7,11 @@
import sys
import os
import unittest
import tempfile
import threading
import logging
import unittest
from fix_path import fix_sys_path
def fix_sys_path():
"""
logic to have always the correct sys.path
'', web2py/gluon, web2py/site-packages, web2py/ ...
"""
def add_path_first(path):
sys.path = [path] + [p for p in sys.path if (
not p == path and not p == (path + '/'))]
path = os.path.dirname(os.path.abspath(__file__))
if not os.path.isfile(os.path.join(path,'web2py.py')):
i = 0
while i<10:
i += 1
if os.path.exists(os.path.join(path,'web2py.py')):
break
path = os.path.abspath(os.path.join(path, '..'))
paths = [path,
os.path.abspath(os.path.join(path, 'site-packages')),
os.path.abspath(os.path.join(path, 'gluon')),
'']
[add_path_first(path) for path in paths]
fix_sys_path()
fix_sys_path(__file__)
#support skipif also in python 2.6
def _skipIf(cond, message=''):
@@ -55,7 +28,6 @@ else:
skipIf = _skipIf
import languages
from storage import Storage
MP_WORKING = 0
try:
import multiprocessing
@@ -76,6 +48,7 @@ def read_write(args):
languages.write_dict(filename, content)
return True
class TestLanguagesParallel(unittest.TestCase):
def setUp(self):
@@ -91,7 +64,7 @@ class TestLanguagesParallel(unittest.TestCase):
os.remove(self.filename)
except:
pass
@skipIf(MP_WORKING == 0, 'multiprocessing tests unavailable')
def test_reads_and_writes(self):
readwriters = 10
@@ -99,7 +72,7 @@ class TestLanguagesParallel(unittest.TestCase):
results = pool.map(read_write, [[self.filename, 10]] * readwriters)
for result in results:
self.assertTrue(result)
@skipIf(MP_WORKING == 1, 'multiprocessing tests available')
def test_reads_and_writes_no_mp(self):
results = []
@@ -108,6 +81,7 @@ class TestLanguagesParallel(unittest.TestCase):
for result in results:
self.assertTrue(result)
class TestTranslations(unittest.TestCase):
def setUp(self):
@@ -144,7 +118,6 @@ class TestTranslations(unittest.TestCase):
T.force('it')
self.assertEqual(str(T('Hello World')),
'Salve Mondo')
if __name__ == '__main__':
unittest.main()
+3 -29
@@ -7,41 +7,15 @@
"""
import sys
import os
import unittest
import doctest
import unittest
from fix_path import fix_sys_path
fix_sys_path(__file__)
def fix_sys_path():
"""
logic to have always the correct sys.path
'', web2py/gluon, web2py/site-packages, web2py/ ...
"""
def add_path_first(path):
sys.path = [path] + [p for p in sys.path if (
not p == path and not p == (path + '/'))]
path = os.path.dirname(os.path.abspath(__file__))
if not os.path.isfile(os.path.join(path,'web2py.py')):
i = 0
while i<10:
i += 1
if os.path.exists(os.path.join(path,'web2py.py')):
break
path = os.path.abspath(os.path.join(path, '..'))
paths = [path,
os.path.abspath(os.path.join(path, 'site-packages')),
os.path.abspath(os.path.join(path, 'gluon')),
'']
[add_path_first(path) for path in paths]
fix_sys_path()
def load_tests(loader, tests, ignore):
tests.addTests(
doctest.DocTestSuite('html')
)
+70 -31
@@ -3,40 +3,14 @@
""" Unit tests for storage.py """
import sys
import os
import unittest
from fix_path import fix_sys_path
fix_sys_path(__file__)
def fix_sys_path():
"""
logic to have always the correct sys.path
'', web2py/gluon, web2py/site-packages, web2py/ ...
"""
def add_path_first(path):
sys.path = [path] + [p for p in sys.path if (
not p == path and not p == (path + '/'))]
path = os.path.dirname(os.path.abspath(__file__))
if not os.path.isfile(os.path.join(path,'web2py.py')):
i = 0
while i<10:
i += 1
if os.path.exists(os.path.join(path,'web2py.py')):
break
path = os.path.abspath(os.path.join(path, '..'))
paths = [path,
os.path.abspath(os.path.join(path, 'site-packages')),
os.path.abspath(os.path.join(path, 'gluon')),
'']
[add_path_first(path) for path in paths]
fix_sys_path()
from storage import Storage
from storage import Storage, StorageList, List
from http import HTTP
import pickle
class TestStorage(unittest.TestCase):
@@ -97,5 +71,70 @@ class TestStorage(unittest.TestCase):
self.assertEquals(s['a'], None)
self.assertTrue('a' in s)
def test_pickling(self):
""" Test storage pickling """
s = Storage(a=1)
sd = pickle.dumps(s, pickle.HIGHEST_PROTOCOL)
news = pickle.loads(sd)
self.assertEqual(news.a, 1)
def test_getlist(self):
# usually used with request.vars
a = Storage()
a.x = 'abc'
a.y = ['abc', 'def']
self.assertEqual(a.getlist('x'), ['abc'])
self.assertEqual(a.getlist('y'), ['abc', 'def'])
self.assertEqual(a.getlist('z'), [])
def test_getfirst(self):
# usually with request.vars
a = Storage()
a.x = 'abc'
a.y = ['abc', 'def']
self.assertEqual(a.getfirst('x'), 'abc')
self.assertEqual(a.getfirst('y'), 'abc')
self.assertEqual(a.getfirst('z'), None)
def test_getlast(self):
# usually with request.vars
a = Storage()
a.x = 'abc'
a.y = ['abc', 'def']
self.assertEqual(a.getlast('x'), 'abc')
self.assertEqual(a.getlast('y'), 'def')
self.assertEqual(a.getlast('z'), None)
class TestStorageList(unittest.TestCase):
""" Tests storage.StorageList """
def test_attribute(self):
s = StorageList(a=1)
self.assertEqual(s.a, 1)
self.assertEqual(s['a'], 1)
self.assertEqual(s.b, [])
s.b.append(1)
self.assertEqual(s.b, [1])
class TestList(unittest.TestCase):
""" Tests Storage.List (fast-check for request.args()) """
def test_listcall(self):
a = List((1, 2, 3))
self.assertEqual(a(1), 2)
self.assertEqual(a(-1), 3)
self.assertEqual(a(-5), None)
self.assertEqual(a(-5, default='x'), 'x')
self.assertEqual(a(-3, cast=str), '1')
a.append('1234')
self.assertEqual(a(3), '1234')
self.assertEqual(a(3, cast=int), 1234)
a.append('x')
self.assertRaises(HTTP, a, 4, cast=int)
if __name__ == '__main__':
unittest.main()
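The List call semantics exercised above are the same mechanism web2py controllers rely on when reading request.args(i, ...). A quick illustration using the class under test (not part of this diff):

from gluon.storage import List

args = List(['15', 'edit'])
args(0, cast=int)        # -> 15; raises an HTTP exception if the cast fails (see assertRaises above)
args(1)                  # -> 'edit'
args(2, default='all')   # -> 'all' instead of an IndexError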
+2 -30
@@ -4,38 +4,10 @@
Unit tests for gluon.template
"""
import sys
import os
import unittest
from fix_path import fix_sys_path
def fix_sys_path():
"""
logic to have always the correct sys.path
'', web2py/gluon, web2py/site-packages, web2py/ ...
"""
def add_path_first(path):
sys.path = [path] + [p for p in sys.path if (
not p == path and not p == (path + '/'))]
path = os.path.dirname(os.path.abspath(__file__))
if not os.path.isfile(os.path.join(path,'web2py.py')):
i = 0
while i<10:
i += 1
if os.path.exists(os.path.join(path,'web2py.py')):
break
path = os.path.abspath(os.path.join(path, '..'))
paths = [path,
os.path.abspath(os.path.join(path, 'site-packages')),
os.path.abspath(os.path.join(path, 'gluon')),
'']
[add_path_first(path) for path in paths]
fix_sys_path()
fix_sys_path(__file__)
from template import render
+2 -30
@@ -3,38 +3,10 @@
""" Unit tests for utils.py """
import sys
import os
import unittest
from fix_path import fix_sys_path
def fix_sys_path():
"""
logic to have always the correct sys.path
'', web2py/gluon, web2py/site-packages, web2py/ ...
"""
def add_path_first(path):
sys.path = [path] + [p for p in sys.path if (
not p == path and not p == (path + '/'))]
path = os.path.dirname(os.path.abspath(__file__))
if not os.path.isfile(os.path.join(path,'web2py.py')):
i = 0
while i<10:
i += 1
if os.path.exists(os.path.join(path,'web2py.py')):
break
path = os.path.abspath(os.path.join(path, '..'))
paths = [path,
os.path.abspath(os.path.join(path, 'site-packages')),
os.path.abspath(os.path.join(path, 'gluon')),
'']
[add_path_first(path) for path in paths]
fix_sys_path()
fix_sys_path(__file__)
from utils import md5_hash
+151 -33
@@ -3,43 +3,17 @@
"""Unit tests for http.py """
import sys
import os
import unittest
def fix_sys_path():
"""
logic to have always the correct sys.path
'', web2py/gluon, web2py/site-packages, web2py/ ...
"""
def add_path_first(path):
sys.path = [path] + [p for p in sys.path if (
not p == path and not p == (path + '/'))]
path = os.path.dirname(os.path.abspath(__file__))
if not os.path.isfile(os.path.join(path,'web2py.py')):
i = 0
while i<10:
i += 1
if os.path.exists(os.path.join(path,'web2py.py')):
break
path = os.path.abspath(os.path.join(path, '..'))
paths = [path,
os.path.abspath(os.path.join(path, 'site-packages')),
os.path.abspath(os.path.join(path, 'gluon')),
'']
[add_path_first(path) for path in paths]
fix_sys_path()
import datetime
import decimal
from gluon.validators import *
import re
from fix_path import fix_sys_path
fix_sys_path(__file__)
from gluon.validators import *
class TestValidators(unittest.TestCase):
@@ -114,6 +88,19 @@ class TestValidators(unittest.TestCase):
self.assertEqual(rtn, (datetime.date(2008, 3, 3), None))
rtn = v(datetime.date(2010,3,3))
self.assertEqual(rtn, (datetime.date(2010, 3, 3), 'oops'))
v = IS_DATE_IN_RANGE(maximum=datetime.date(2009,12,31),
format="%m/%d/%Y")
rtn = v('03/03/2010')
self.assertEqual(rtn, ('03/03/2010', 'Enter date on or before 12/31/2009'))
v = IS_DATE_IN_RANGE(minimum=datetime.date(2008,1,1),
format="%m/%d/%Y")
rtn = v('03/03/2007')
self.assertEqual(rtn, ('03/03/2007', 'Enter date on or after 01/01/2008'))
v = IS_DATE_IN_RANGE(minimum=datetime.date(2008,1,1),
maximum=datetime.date(2009,12,31),
format="%m/%d/%Y")
rtn = v('03/03/2007')
self.assertEqual(rtn, ('03/03/2007', 'Enter date in range 01/01/2008 12/31/2009'))
def test_IS_DATE(self):
v = IS_DATE(format="%m/%d/%Y",error_message="oops")
@@ -135,6 +122,19 @@ class TestValidators(unittest.TestCase):
self.assertEquals(rtn, (datetime.datetime(2008, 3, 3, 0, 0), None))
rtn = v(datetime.datetime(2010,3,3,0,0))
self.assertEquals(rtn, (datetime.datetime(2010, 3, 3, 0, 0), 'oops'))
v = IS_DATETIME_IN_RANGE(maximum=datetime.datetime(2009,12,31,12,20),
format='%m/%d/%Y %H:%M:%S')
rtn = v('03/03/2010 12:20:00')
self.assertEqual(rtn, ('03/03/2010 12:20:00', 'Enter date and time on or before 12/31/2009 12:20:00'))
v = IS_DATETIME_IN_RANGE(minimum=datetime.datetime(2008,1,1,12,20),
format='%m/%d/%Y %H:%M:%S')
rtn = v('03/03/2007 12:20:00')
self.assertEqual(rtn, ('03/03/2007 12:20:00', 'Enter date and time on or after 01/01/2008 12:20:00'))
v = IS_DATETIME_IN_RANGE(minimum=datetime.datetime(2008,1,1,12,20),
maximum=datetime.datetime(2009,12,31,12,20),
format='%m/%d/%Y %H:%M:%S')
rtn = v('03/03/2007 12:20:00')
self.assertEqual(rtn, ('03/03/2007 12:20:00', 'Enter date and time in range 01/01/2008 12:20:00 12/31/2009 12:20:00'))
def test_IS_DATETIME(self):
v = IS_DATETIME(format="%m/%d/%Y %H:%M",error_message="oops")
@@ -186,6 +186,8 @@ class TestValidators(unittest.TestCase):
self.assertEqual(rtn, ('6,5', 'Enter a number'))
rtn = IS_DECIMAL_IN_RANGE(dot=',')('6.5')
self.assertEqual(rtn, (decimal.Decimal('6.5'), None))
rtn = IS_DECIMAL_IN_RANGE(1,5)(decimal.Decimal('4'))
self.assertEqual(rtn, (decimal.Decimal('4'), None))
def test_IS_EMAIL(self):
rtn = IS_EMAIL()('a@b.com')
@@ -242,6 +244,16 @@ class TestValidators(unittest.TestCase):
self.assertEqual(rtn, ('Ima Fool@example.com', 'Enter a valid email address'))
rtn = IS_EMAIL()('localguy@localhost') # localhost as domain
self.assertEqual(rtn, ('localguy@localhost', None))
# test for banned
rtn = IS_EMAIL(banned='^.*\.com(|\..*)$')('localguy@localhost') # localhost as domain
self.assertEqual(rtn, ('localguy@localhost', None))
rtn = IS_EMAIL(banned='^.*\.com(|\..*)$')('abc@example.com')
self.assertEqual(rtn, ('abc@example.com', 'Enter a valid email address'))
# test for forced
rtn = IS_EMAIL(forced='^.*\.edu(|\..*)$')('localguy@localhost')
self.assertEqual(rtn, ('localguy@localhost', 'Enter a valid email address'))
rtn = IS_EMAIL(forced='^.*\.edu(|\..*)$')('localguy@example.edu')
self.assertEqual(rtn, ('localguy@example.edu', None))
def test_IS_LIST_OF_EMAILS(self):
emails = ['localguy@localhost', '_Yosemite.Sam@example.com']
@@ -255,6 +267,19 @@ class TestValidators(unittest.TestCase):
rtn = IS_LIST_OF_EMAILS()(';'.join(emails))
self.assertEqual(rtn, ('localguy@localhost;_Yosemite.Sam@example.com;a', 'Invalid emails: a'))
def test_IS_LIST_OF(self):
values = [0,1,2,3,4]
rtn = IS_LIST_OF(IS_INT_IN_RANGE(0, 10))(values)
self.assertEqual(rtn, (values, None))
values.append(11)
rtn = IS_LIST_OF(IS_INT_IN_RANGE(0, 10))(values)
self.assertEqual(rtn, (values, 'Enter an integer between 0 and 9'))
rtn = IS_LIST_OF(IS_INT_IN_RANGE(0, 10))(1)
self.assertEqual(rtn, ([1], None))
rtn = IS_LIST_OF(IS_INT_IN_RANGE(0, 10), minimum=10)([1,2])
self.assertEqual(rtn, ([1, 2], 'Enter between 10 and 100 values'))
rtn = IS_LIST_OF(IS_INT_IN_RANGE(0, 10), maximum=2)([1,2,3])
self.assertEqual(rtn, ([1, 2, 3], 'Enter between 0 and 2 values'))
def test_IS_EMPTY_OR(self):
rtn = IS_EMPTY_OR(IS_EMAIL())('abc@def.com')
@@ -515,8 +540,46 @@ class TestValidators(unittest.TestCase):
self.assertEqual(rtn, (None, 'Enter from 1 to 255 characters'))
rtn = IS_LENGTH(minsize=1)([])
self.assertEqual(rtn, ([], 'Enter from 1 to 255 characters'))
rtn = IS_LENGTH(minsize=1)([1, 2])
self.assertEqual(rtn, ([1, 2], None))
rtn = IS_LENGTH(minsize=1)([1])
self.assertEqual(rtn, ([1], None))
# test unicode
rtn = IS_LENGTH(2)(u'°2')
self.assertEqual(rtn, ('\xc2\xb02', None))
rtn = IS_LENGTH(2)(u'°12')
self.assertEqual(rtn, (u'\xb012', 'Enter from 0 to 2 characters'))
# test automatic str()
rtn = IS_LENGTH(minsize=1)(1)
self.assertEqual(rtn, ('1', None))
rtn = IS_LENGTH(minsize=2)(1)
self.assertEqual(rtn, (1, 'Enter from 2 to 255 characters'))
# test FieldStorage
import cgi
from StringIO import StringIO
a = cgi.FieldStorage()
a.file = StringIO('abc')
rtn = IS_LENGTH(minsize=4)(a)
self.assertEqual(rtn, (a, 'Enter from 4 to 255 characters'))
urlencode_data = "key2=value2x&key3=value3&key4=value4"
urlencode_environ = {
'CONTENT_LENGTH': str(len(urlencode_data)),
'CONTENT_TYPE': 'application/x-www-form-urlencoded',
'QUERY_STRING': 'key1=value1&key2=value2y',
'REQUEST_METHOD': 'POST',
}
fake_stdin = StringIO(urlencode_data)
fake_stdin.seek(0)
a = cgi.FieldStorage(fp=fake_stdin, environ=urlencode_environ)
rtn = IS_LENGTH(minsize=6)(a)
self.assertEqual(rtn, (a, 'Enter from 6 to 255 characters'))
a = cgi.FieldStorage()
rtn = IS_LENGTH(minsize=6)(a)
self.assertEqual(rtn, (a, 'Enter from 6 to 255 characters'))
rtn = IS_LENGTH(6)(a)
self.assertEqual(rtn, (a, None))
def test_IS_LOWER(self):
rtn = IS_LOWER()('ABC')
@@ -545,10 +608,14 @@ class TestValidators(unittest.TestCase):
self.assertEqual(rtn, ('hellas', 'Invalid expression'))
rtn = IS_MATCH('hell$', strict=True)('hellas')
self.assertEqual(rtn, ('hellas', 'Invalid expression'))
rtn = IS_MATCH('^.hell$', strict=True)('shell')
self.assertEqual(rtn, ('shell', None))
rtn = IS_MATCH(u'hell', is_unicode=True)('àòè')
self.assertEqual(rtn, ('\xc3\xa0\xc3\xb2\xc3\xa8', 'Invalid expression'))
rtn = IS_MATCH(u'hell', is_unicode=True)(u'hell')
self.assertEqual(rtn, (u'hell', None))
rtn = IS_MATCH('hell', is_unicode=True)(u'hell')
self.assertEqual(rtn, (u'hell', None))
def test_IS_EQUAL_TO(self):
@@ -718,6 +785,57 @@ class TestValidators(unittest.TestCase):
rtn = IS_JSON()('spam1234')
self.assertEqual(rtn, ('spam1234', 'Invalid json'))
def test_IS_UPLOAD_FILENAME(self):
import cgi
from StringIO import StringIO
def gen_fake(filename):
formdata_file_data = """
---123
Content-Disposition: form-data; name="key2"
value2y
---123
Content-Disposition: form-data; name="file_attach"; filename="%s"
Content-Type: text/plain
this is the content of the fake file
---123--
""" % filename
formdata_file_environ = {
'CONTENT_LENGTH': str(len(formdata_file_data)),
'CONTENT_TYPE': 'multipart/form-data; boundary=-123',
'QUERY_STRING': 'key1=value1&key2=value2x',
'REQUEST_METHOD': 'POST',
}
return cgi.FieldStorage(fp=StringIO(formdata_file_data), environ=formdata_file_environ)['file_attach']
fake = gen_fake('example.pdf')
rtn = IS_UPLOAD_FILENAME(extension='pdf')(fake)
self.assertEqual(rtn, (fake, None))
fake = gen_fake('example.gif')
rtn = IS_UPLOAD_FILENAME(extension='pdf')(fake)
self.assertEqual(rtn, (fake, 'Enter valid filename'))
fake = gen_fake('backup2014.tar.gz')
rtn = IS_UPLOAD_FILENAME(filename='backup.*', extension='tar.gz', lastdot=False)(fake)
self.assertEqual(rtn, (fake, None))
fake = gen_fake('README')
rtn = IS_UPLOAD_FILENAME(filename='^README$', extension='^$', case=0)(fake)
self.assertEqual(rtn, (fake, None))
fake = gen_fake('readme')
rtn = IS_UPLOAD_FILENAME(filename='^README$', extension='^$', case=0)(fake)
self.assertEqual(rtn, (fake, 'Enter valid filename'))
fake = gen_fake('readme')
rtn = IS_UPLOAD_FILENAME(filename='README', case=2)(fake)
self.assertEqual(rtn, (fake, None))
fake = gen_fake('README')
rtn = IS_UPLOAD_FILENAME(filename='README', case=2)(fake)
self.assertEqual(rtn, (fake, None))
rtn = IS_UPLOAD_FILENAME(extension='pdf')('example.pdf')
self.assertEqual(rtn, ('example.pdf', 'Enter valid filename'))
if __name__ == '__main__':
unittest.main()
+2 -27
@@ -13,34 +13,9 @@ import subprocess
import time
import signal
from fix_path import fix_sys_path
def fix_sys_path():
"""
logic to have always the correct sys.path
'', web2py/gluon, web2py/site-packages, web2py/ ...
"""
def add_path_first(path):
sys.path = [path] + [p for p in sys.path if (
not p == path and not p == (path + '/'))]
path = os.path.dirname(os.path.abspath(__file__))
if not os.path.isfile(os.path.join(path,'web2py.py')):
i = 0
while i<10:
i += 1
if os.path.exists(os.path.join(path,'web2py.py')):
break
path = os.path.abspath(os.path.join(path, '..'))
paths = [path,
os.path.abspath(os.path.join(path, 'site-packages')),
os.path.abspath(os.path.join(path, 'gluon')),
'']
[add_path_first(path) for path in paths]
fix_sys_path()
fix_sys_path(__file__)
from contrib.webclient import WebClient
from urllib2 import HTTPError
+15 -8
@@ -11,7 +11,10 @@ Auth, Mail, PluginManager and various utilities
"""
import base64
import cPickle
try:
import cPickle as pickle
except:
import pickle
import datetime
import thread
import logging
@@ -2710,7 +2713,8 @@ class Auth(object):
extra_fields = [
Field("password_two", "password", requires=IS_EQUAL_TO(
request.post_vars.get(passfield,None),
error_message=self.messages.mismatched_password))]
error_message=self.messages.mismatched_password),
label=current.T("Confirm Password"))]
else:
extra_fields = []
form = SQLFORM(table_user,
@@ -3187,11 +3191,14 @@ class Auth(object):
if log is DEFAULT:
log = self.messages['change_password_log']
passfield = self.settings.password_field
is_crypt = copy.copy([t for t in table_user[passfield].requires
if isinstance(t,CRYPT)][0])
is_crypt.min_length = 0
requires = table_user[passfield].requires
if not isinstance(requires,(list, tuple)):
requires = [requires]
requires = filter(lambda t:isinstance(t,CRYPT), requires)
if requires:
requires[0].min_length = 0
form = SQLFORM.factory(
Field('old_password', 'password', requires=[is_crypt],
Field('old_password', 'password', requires=requires,
label=self.messages.old_password),
Field('new_password', 'password',
label=self.messages.new_password,
@@ -3326,7 +3333,7 @@ class Auth(object):
user = table_user(user_id)
if not user:
raise HTTP(401, "Not Authorized")
auth.impersonator = cPickle.dumps(session)
auth.impersonator = pickle.dumps(session, pickle.HIGHEST_PROTOCOL)
auth.user.update(
table_user._filter_fields(user, True))
self.user = auth.user
@@ -3337,7 +3344,7 @@ class Auth(object):
elif user_id in (0, '0'):
if self.is_impersonating():
session.clear()
session.update(cPickle.loads(auth.impersonator))
session.update(pickle.loads(auth.impersonator))
self.user = session.auth.user
self.update_groups()
self.run_login_onaccept()
+1 -2
@@ -23,7 +23,6 @@ import logging
import socket
import base64
import zlib
import types
_struct_2_long_long = struct.Struct('=QQ')
@@ -160,7 +159,7 @@ def pad(s, n=32, padchar=' '):
def secure_dumps(data, encryption_key, hash_key=None, compression_level=None):
if not hash_key:
hash_key = sha1(encryption_key).hexdigest()
dump = pickle.dumps(data)
dump = pickle.dumps(data, pickle.HIGHEST_PROTOCOL)
if compression_level:
dump = zlib.compress(dump, compression_level)
key = pad(encryption_key[:32])
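For context on the secure_dumps change: passing pickle.HIGHEST_PROTOCOL switches from the default ASCII protocol 0 to the binary pickle protocol (protocol 2 on Python 2), which yields smaller and faster dumps before the optional zlib compression and encryption are applied. Illustrative comparison only; exact sizes vary with the data:

import pickle
data = {'ids': list(range(100)), 'name': 'example'}
print len(pickle.dumps(data))                           # protocol 0, ASCII text
print len(pickle.dumps(data, pickle.HIGHEST_PROTOCOL))  # binary protocol, smaller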
+7 -6
@@ -372,14 +372,18 @@ class IS_JSON(Validator):
if self.native_json:
simplejson.loads(value) # raises error in case of malformed json
return (value, None) # the serialized value is not passed
return (simplejson.loads(value), None)
else:
return (simplejson.loads(value), None)
except JSONErrors:
return (value, translate(self.error_message))
def formatter(self,value):
if value is None:
return None
return simplejson.dumps(value)
if self.native_json:
return value
else:
return simplejson.dumps(value)
class IS_IN_SET(Validator):
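The IS_JSON hunk above only changes behaviour when native_json is enabled (the constructor flag is assumed from self.native_json): validation still goes through simplejson.loads, but the raw string is passed on instead of the decoded object, and formatter stops re-serializing. Implied contract, for illustration only:

from gluon.validators import IS_JSON

IS_JSON()('{"a": 1}')                   # -> ({'a': 1}, None): decoded value is passed on
IS_JSON(native_json=True)('{"a": 1}')   # -> ('{"a": 1}', None): validated, raw string kept
IS_JSON()('spam1234')                   # -> ('spam1234', 'Invalid json')
IS_JSON(native_json=True).formatter('{"a": 1}')  # -> '{"a": 1}' unchanged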
@@ -1164,11 +1168,8 @@ class IS_LIST_OF_EMAILS(object):
def __call__(self, value):
bad_emails = []
emails = []
f = IS_EMAIL()
for email in self.split_emails.findall(value):
if not email in emails:
emails.append(email)
error = f(email)[1]
if error and not email in bad_emails:
bad_emails.append(email)
@@ -2516,7 +2517,7 @@ class IS_LIST_OF(Validator):
if not isinstance(other, (list,tuple)):
other = [other]
for item in ivalue:
if item.strip():
if str(item).strip():
v = item
for validator in other:
(v, e) = validator(v)
+13 -2
@@ -29,6 +29,12 @@ Typical usage:
"""
from __future__ import with_statement
import sys
import os
print os.path.join(*__file__.split(os.sep)[:-2] or ['.'])
sys.path.append(os.path.join(*__file__.split(os.sep)[:-2] or ['.']))
from gluon import current
from gluon.storage import Storage
from optparse import OptionParser
@@ -37,6 +43,7 @@ import datetime
import os
import stat
import time
import glob
EXPIRATION_MINUTES = 60
SLEEP_MINUTES = 5
@@ -157,6 +164,9 @@ class SessionFile(object):
def delete(self):
try:
os.unlink(self.filename)
path = os.path.dirname(self.filename)
if not path.endswith('sessions') and len(os.listdir(path))==0:
os.rmdir(path)
except:
pass
@@ -191,10 +201,11 @@ def single_loop(expiration=None, force=False, verbose=False):
except:
expiration = EXPIRATION_MINUTES * 60
set_db = SessionSetDb(expiration, force, verbose)
set_files = SessionSetFiles(expiration, force, verbose)
set_db.trash()
set_files.trash()
set_db = SessionSetDb(expiration, force, verbose)
set_db.trash()
def main():
"""Main processing."""