Compare commits


364 Commits

Author SHA1 Message Date
Oliver Gierke
8d18729898 DATAMONGO-340 - Prepare 1.0.0.RC1. 2011-12-07 01:09:58 +01:00
Oliver Gierke
b5e0b2bec2 DATAMONGO-340 - Polished reference documentation.
Added section ids to generate stable URLs for HTML documentation.
2011-12-07 01:09:58 +01:00
Oliver Gierke
9eb47827c1 DATAMONGO-337 - Added Criteria.nin(…) and ….all(…) taking a Collection. 2011-12-07 01:09:57 +01:00
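
A minimal sketch of the new Collection-taking overloads (the field names and Person type are made up for illustration):

    List<String> names = Arrays.asList("Dave", "Carter");
    Query notIn = new Query(Criteria.where("firstname").nin(names));  // $nin with a Collection
    Query hasAll = new Query(Criteria.where("nicknames").all(names)); // $all with a Collection
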
Oliver Gierke
f97ab25411 DATAMONGO-338 - Updated reference documentation regarding new keywords. 2011-12-07 01:09:57 +01:00
Oliver Gierke
6616761f50 DATAMONGO-322 - MongoTemplate refuses to save entities with unset id if not auto-generatable.
If an entity is handed into the template to be saved or inserted we now check that the auto-generated ObjectId can actually be applied to the id property after saving the object.
2011-12-07 01:09:57 +01:00
Mark Pollack
89de566893 Add findAndModify to docs, update test to include findAndModify with upsert 2011-12-06 13:04:24 -05:00
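
A hedged sketch of findAndModify with upsert as mentioned above (Person and its fields are hypothetical):

    // atomically increment a counter, creating the document if it does not exist yet
    Person result = mongoTemplate.findAndModify(
        new Query(Criteria.where("firstname").is("Dave")),
        new Update().inc("visits", 1),
        new FindAndModifyOptions().upsert(true).returnNew(true),
        Person.class);
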
Mark Pollack
ea1f090b40 Add docs for index ops and clean up 'bare' references to Mongo, change to MongoDB 2011-12-06 11:09:23 -05:00
Oliver Gierke
b5958fb5cc DATAMONGO-338 - Query parser implementation for Regex, Exists, True and False keywords. 2011-12-06 17:01:56 +01:00
Oliver Gierke
75b7aff80a DATAMONGO-318 - Don't throw exceptions for updates not affecting any documents.
Throwing an exception if an update does not affect any documents doesn't make sense in all cases. Removed throwing an exception by default but made the relevant method (handleAnyWriteResultErrors(…)) protected so that subclasses might override this behavior.
2011-12-06 15:15:13 +01:00
Oliver Gierke
7da0fcdd0c DATAMONGO-199 - Fixed bug in CachingMongoPersistentProperty. 2011-12-06 14:48:25 +01:00
Oliver Gierke
c88b6d89db DATAMONGO-251 - Polishing.
JavaDoc, Formatting. Made dependencies in DefaultIndexOperations final. Reduced dependency to MongoOperations instead of depending on MongoTemplate directly. Added not-null assertion to constructor of DIO.
2011-12-06 14:33:45 +01:00
Oliver Gierke
de1540aadc DATAMONGO-234 - Polishing.
Removed unused imports, corrected whitespace, formatting.
2011-12-06 14:24:51 +01:00
Oliver Gierke
d1b24d6cfb DATAMONGO-332 - Updated reference documentation to list correct dependencies.
Fixed formatting of log output along the way.
2011-12-06 14:06:59 +01:00
Mark Pollack
e85f3d3299 DATAMONGO-251 - Support getting index information on a collection or mapped class. 2011-12-06 02:25:13 -05:00
Mark Pollack
ef6e08d3f4 DATAMONGO-234 - MongoTemplate should support the findAndModify operation to update version fields 2011-12-06 00:26:18 -05:00
Oliver Gierke
21010fbd49 DATACMNS-91 - Reject null parameters in SimpleMongoRepository.
According to the specification in CrudRepository we now reject null values for ids and entities in CRUD methods.
2011-12-02 13:34:44 +01:00
Oliver Gierke
4325d6c9fa Reactivated accidentally disabled unit tests. 2011-12-02 13:02:57 +01:00
Oliver Gierke
bc16ccfded DATACMNS-77 - Using constants from ClassTypeInformation inside MappingMongoConverter. 2011-12-02 11:33:37 +01:00
Oliver Gierke
04f5f9f662 DATACMNS-103 - Adapt changes in BeanWrapper.
Removed obsolete exception handling code.
2011-12-02 10:08:48 +01:00
Oliver Gierke
b1f1b8efaa DATAMONGO-321 - Overhaul of id handling.
Cleaned up the id handling on query mapping and mapping in general. We now only try to convert id values into an ObjectId and store it as is using potentially registered custom converters. Register BigInteger<->String converters by default now.
2011-12-01 17:50:36 +01:00
Oliver Gierke
de300e2643 DATAMONGO-328 - Set required MongoDB version to 0.
The MANIFEST.MF in current MongoDB driver version is broken in terms of not stating package versions. Thus we unfortunately cannot refer to a particular version range but have to use the generic 0 as required version.
2011-12-01 17:05:28 +01:00
Oliver Gierke
20088b83d9 Removed compiler warnings. 2011-12-01 16:47:40 +01:00
Oliver Gierke
58f200f15e DATAMONGO-335 - Set up hybrid Spring 3.1/3.0.6 build.
Also see DATACMNS-103.
2011-12-01 16:06:38 +01:00
Oliver Gierke
8718700249 DATAMONGO-334 - Switched to use http://repo.springsource.org as repository.
Fixed versions of build plugins along the way.
2011-12-01 16:05:08 +01:00
Oliver Gierke
f4063d1679 DATAMONGO-333 - Default to Object for AbstractMongoEventListener domain type.
In case an extension of AbstractMongoEventListener does not define a parameter type, we now default to Object as the handled domain type, as we would otherwise cause a NullPointerException.
2011-12-01 12:16:27 +01:00
Oliver Gierke
ef063613c7 DATAMONGO-325 - MongoTemplate now correctly refuses not found map reduce JavaScript files.
We now check whether a URL was passed in as map and/or reduce function and throw an exception in case the file either does not exist or cannot be read.
2011-11-30 22:53:56 +01:00
Oliver Gierke
2eda0f1701 DATAMONGO-185 - Expose hints on Query.
Query now exposes a withHint(…) method which will be applied to the DBCursor on query execution. Reduced CursorPreparer's visibility to the package and removed methods exposing it from MongoOperations.
2011-11-30 22:29:59 +01:00
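
A small, hedged sketch of the new withHint(…) method (the index name and Person type are assumptions):

    // hint the server to use a specific index when executing the query
    Query query = new Query(Criteria.where("lastname").is("Matthews")).withHint("lastname_idx");
    List<Person> persons = mongoTemplate.find(query, Person.class);
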
Oliver Gierke
ec7b65e21d DATAMONGO-331 - Fixed typo in WriteConcern enumeration for db-factory element. 2011-11-30 18:27:33 +01:00
Oliver Gierke
c7f7571f3f DATAMONGO-326 - QueryMapper now delegates type conversion to MongoConverter.
QueryMapper now delegates to a MongoConverter instead of a plain ConversionService and invokes optional conversion on it. This optional conversion now removes type information from the created DBObject.
2011-11-30 17:56:44 +01:00
Oliver Gierke
9f71af42e8 DATAMONGO-329 - Fixed Collection and Map value handling for more open properties.
The decision whether a property value was handled as Collection or Map was based on inspecting the property's type which failed for classes using very open property declarations such as:

class MyClass {

  Object something;
}

We now rather inspect the value type instead of the property.
2011-11-30 16:20:25 +01:00
Oliver Gierke
92775170e1 DATAMONGO-301 - Allow classpath-scanning for Converters.
<mongo:custom-conversions /> now has a base-package attribute that scans for Converter and GenericConverter beans. Added <tool:exports /> metadata for MappingMongoConverter.
2011-11-30 15:26:18 +01:00
Oliver Gierke
4c7e338770 Adapt refactorings in SD Commons. 2011-11-28 18:07:33 +01:00
Oliver Gierke
e3fff52d17 DATAMONGO-298 - CustomConversions now also considers sub-types of Number as simple.
CustomConversions now delegates to MongoSimpleTypes.HOLDER.isSimpleType(…) instead of maintaining an additional list of Mongo-primitive types. Added DBObject to the list of Mongo-primitive types.
2011-11-24 15:20:49 +01:00
Oliver Gierke
5477ab20b2 DATAMONGO-324 - Added shortcut in MappingMongoConverter to allow reading DBObjects without conversion.
Added check in MappingMongoConverter.read(…) to shortcut object conversion if the requested type is DBObject.
2011-11-24 12:59:47 +01:00
Oliver Gierke
4cf3567f42 DATAMONGO-310 - MappingMongoConverter now creates native Mongo types for Maps and Collections in convertToMongoType(…).
MappingMongoConverter.convertToMongoType(…) not only converts elements of collections and maps but also converts the wrapper into the appropriate MongoDB type (BasicDBList, BasicDBObject).
2011-11-23 13:15:50 +01:00
Oliver Gierke
b26bb62a63 DATAMONGO-305 - Removed synchronization from Query class.
As Query is not intended to be thread-safe at all, we can safely remove the synchronized blocks from sort() and fields().
2011-11-23 12:48:39 +01:00
Oliver Gierke
f156d7b5af DATAMONGO-312 - MappingMongoConverter handles complex enum types correctly.
If an Enum implements abstract methods, the Class object derived from ${ENUM}.getClass() does not return true for ….isEnum(). Thus we have to rather check Enum.class.isAssignableFrom(…) as this catches this scenario as well. Also see DATACMNS-99 for a related fix in simple type handling in the core infrastructure.
2011-11-23 11:58:09 +01:00
Oliver Gierke
7bf3643902 DATAMONGO-304 - Removed document subpackage from Log4jAppender module. 2011-11-23 11:07:47 +01:00
Oliver Gierke
201ae3e92d Polished unit test. 2011-11-23 10:58:35 +01:00
Oliver Gierke
07556ec58c DATAMONGO-323 - Annotated repository queries consider dynamic sort now.
A Sort parameter handed into a repository query method is now also applied for string-based (i.e. @Query-annotated) queries.
2011-11-23 10:58:22 +01:00
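
A sketch of a string-based repository query combined with a dynamic Sort parameter (PersonRepository, Person and the field names are made up):

    public interface PersonRepository extends MongoRepository<Person, String> {

      @Query("{ 'lastname' : ?0 }")
      List<Person> findByLastname(String lastname, Sort sort);
    }

    // usage: repository.findByLastname("Matthews", new Sort("firstname"));
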
Oliver Gierke
39807b17e1 DATAMONGO-309 - MappingMongoConverter now correctly maps Arrays as Map values.
We now not only convert collection values of Maps into BasicDBLists but arrays as well.
2011-11-23 10:28:40 +01:00
Oliver Gierke
4913fe26ac DATAMONGO-296 - Hook into Querydsl serialization to get predicate parameters converted.
Overrode MongoDbSerializer.asDBObject(…) and delegate to our MongoConverter to potentially convert predicate parameters. Upgraded to Querydsl 2.2.5.
2011-11-23 10:15:46 +01:00
Oliver Gierke
7642a719ff Polishing.
Removed unused imports, removed compiler warnings, polished JavaDoc.
2011-11-23 09:26:19 +01:00
Oliver Gierke
6c1ce576a4 DATACMNS-98 - Reflect refactoring.
MongoQueryMethod now uses RepositoryMetadata.getReturnedDomainType(…) instead of the static method of ClassUtils.
2011-11-21 19:21:22 +01:00
Mark Pollack
c99882201d DATAMONGO-234 - MongoTemplate should support the findAndModify operation to update version fields 2011-11-17 17:38:54 -05:00
Mark Pollack
ce6a64e4a9 DATAMONGO-308 - Add support for upsert methods 2011-11-16 16:25:58 -05:00
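
A hedged sketch of the upsert support, assuming an upsert(Query, Update, Class) style method on MongoOperations (Person and its fields are invented):

    // update the first matching document, inserting a new one if nothing matches
    mongoTemplate.upsert(
        new Query(Criteria.where("ssn").is(1111)),
        new Update().set("firstname", "Dave"),
        Person.class);
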
Mark Pollack
2fcc323bcd DATAMONGO-213 - Add WriteConcern to arguments of MongoOperations.update*() methods 2011-11-16 14:55:35 -05:00
Mark Pollack
17c7b1d2b5 DATAMONGO-213 - Add WriteConcern to arguments of MongoOperations.update*() methods 2011-11-16 14:30:56 -05:00
Mark Pollack
cfefe46cd4 DATAMONGO-213 - Add WriteConcern to arguments of MongoOperations.update*() methods
DATAMONGO-320 - Remove use of slaveOk boolean option in MongoTemplate as it is deprecated. Replace with ReadPreference
2011-11-16 13:28:13 -05:00
Mark Pollack
64921ddad1 DATAMONGO-319 - WriteConcern not parsed correctly in namespace handlers
DATAMONGO-311 - Update MongoDB driver to v 2.7.x
2011-11-15 16:54:02 -05:00
Mark Pollack
edda1764fe DATAMONGO-311 - Update MongoDB driver to v 2.7.x
Still investigating write_concern compatibility as mentioned in the ticket
2011-11-15 12:44:25 -05:00
Mark Pollack
8113b79109 DATAMONGO-315 - MongoTemplate.findOne(query) methods ignore SortOrder on query 2011-11-14 23:26:16 -05:00
Mark Pollack
9fde4dff3e DATAMONGO-195 - Add description of @Field mapping annotation to reference docs 2011-11-14 22:53:08 -05:00
Mark Pollack
d4b3e2b99d DATAMONGO-306 - NullPointerException if mongo factory created via URI with out credentials 2011-11-14 22:48:07 -05:00
Mark Pollack
68a31d75f3 DATAMONGO-313 [Refactoring] - Use MongoOperations interface instead of MongoTemplate class 2011-11-14 22:20:50 -05:00
Mark Pollack
3e15c21419 DATAMONGO-208 - Add support for group() operation on collection in MongoOperations 2011-11-14 22:08:29 -05:00
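
A rough sketch of the group(…) support added above (the collection name, key and XObject result type are assumptions):

    GroupBy groupBy = GroupBy.key("x")
        .initialDocument("{ count: 0 }")
        .reduceFunction("function(doc, prev) { prev.count += 1; }");

    // count documents per distinct value of "x" in a hypothetical collection
    GroupByResults<XObject> results = mongoTemplate.group("group_test_collection", groupBy, XObject.class);
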
Mark Pollack
e9f253d34f DATAMONGO-316 - Replica Set configuration via properties file throws ArrayIndexOutOfBoundsException 2011-11-14 16:38:18 -05:00
Thomas Risberg
80aa057acb Preparing for snapshot builds 2011-11-04 09:24:45 -04:00
Thomas Risberg
613b183b70 Preparing for 1.0.0.M5 MongoDB release
* Fixing spring-data-commons dependency
2011-11-04 09:20:18 -04:00
Thomas Risberg
5d1320a82a preparing for snapshot builds 2011-10-24 17:46:12 -04:00
Thomas Risberg
8ccaf61d12 preparing for 1.0.0.M5 MongoDB release 2011-10-24 17:39:28 -04:00
Thomas Risberg
2c46cfd8fe Updating documentation with project name changes 2011-10-24 17:32:41 -04:00
Thomas Risberg
45f2900d15 Updating documentation with package name changes and Criteria changes 2011-10-24 17:04:20 -04:00
Oliver Gierke
79934538b6 DATAMONGO-303 - Updated to Querydsl 2.2.4. 2011-10-24 14:52:01 -05:00
Oliver Gierke
0e4e0094a5 DATAMONGO-302, DATACMNS-91 - Added null-checks for CRUD methods where necessary.
CRUD methods in SimpleMongoRepository now consistently throw IllegalArgumentExceptions for null parameters handed to them.
2011-10-24 14:21:12 -05:00
Thomas Risberg
5df61563f4 DATADOC-300 Changing to use InvalidMongoDbApiUsageException 2011-10-20 15:44:23 -04:00
Thomas Risberg
04ebec993d DATADOC-300 Removing old 'or' method, adding JavaDoc 2011-10-20 15:34:51 -04:00
Thomas Risberg
caa245dd08 DATADOC-283 DATADOC-300 Refactoring the QueryCriteria implementations to better support $and, $or and $nor queries 2011-10-20 15:22:37 -04:00
Oliver Gierke
717ff38319 DATADOC-230 - Added method to remove objects from specific collection.
Added MongoOperations.remove(Object, String) and according MongoTemplate implementation to be able to explicitly define the collection an object should be removed from. This aligns to the method signatures we provide for all other methods as well.

Consolidated MongoTemplate.getIdPropertName(…) and ….getIdValue(…) into ….getIdQueryFor(…) as they were only used inside a method building a by-id-query for an object.
2011-10-20 10:02:36 +02:00
Oliver Gierke
33155ba0f4 DATADOC-295 - Added ability to setup a SimpleMongoDbFactory using a MongoURI.
Extended SimpleMongoDbFactory with a constructor to take a MongoUri instance. Expose uri attribute at the db-factory namespace element.
2011-10-14 13:32:46 +02:00
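
A small sketch of the URI-based setup (the URI value is an example; the constructor may declare checked exceptions that need handling):

    // connect via a MongoDB URI instead of separate host/port/database settings
    MongoDbFactory factory = new SimpleMongoDbFactory(new MongoURI("mongodb://localhost/database"));
    MongoTemplate template = new MongoTemplate(factory);
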
Oliver Gierke
9f62efa47c Fixed typo in reference documentation. 2011-10-14 13:32:03 +02:00
Oliver Gierke
620fc876f4 Removed unused imports. 2011-10-14 13:32:03 +02:00
Oliver Gierke
cfcf839232 DATADOC-297 - Pruned irrelevant sub modules.
Removed CouchDB module as well as the generic document one. Renamed document-parent into mongodb-parent. Adapted poms accordingly.
2011-10-13 20:27:03 +02:00
Oliver Gierke
7ce1e5fbd3 DATADOC-294 - Overhaul of collection and type information handling.
Streamlined handling of when and how to write type information into DBObjects being created. Added handling for converting Collections on the top level.
2011-10-13 18:01:17 +02:00
Christoph Leiter
2d12ba38f8 DATADOC-289 - Filter AfterLoadEvent for specific domain type.
AfterLoadEvent can now be typed to a domain type again and will only be invoked if documents are loaded that shall be mapped onto the declared type.
2011-10-12 15:25:09 +02:00
Oliver Gierke
1554d489ca DATACMNS-73 - Fixed package imports in DocumentBacking aspect. 2011-10-12 15:05:02 +02:00
Oliver Gierke
f030d304f4 DATADOC-271, DATACMNS-73 - Adapted changes of SD Commons. 2011-10-12 14:40:42 +02:00
Oliver Gierke
105e1da82b DATADOC-271 - Added license headers to cross-store files. 2011-10-12 14:30:21 +02:00
Oliver Gierke
dbc7601d7b DATADOC-293 - Activated Polygon related tests.
CI system has been updated to Mongo 2.0 so we can activate the test invoking its features.
2011-10-12 14:27:22 +02:00
Thomas Risberg
ccf981b8fb DATADOC-271 Re-packaging Mongo cross-store support, removing 'document' in package name 2011-10-12 08:16:43 -04:00
Oliver Gierke
8a43b4bbc0 DATADOC-65 - Allow usage of SpEL in @Document.
@Document can now use SpEL expressions to calculate the collection an entity shall be stored in on the fly.
2011-10-12 13:38:14 +02:00
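
A hedged sketch of what a SpEL-driven collection name might look like (the concrete expression is invented; actual expression capabilities may differ):

    // collection name calculated on the fly from a SpEL expression
    @Document(collection = "#{'invoices_' + 'archive'}")
    public class Invoice {

      @Id
      private String id;
    }
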
Thomas Risberg
1c2c592b96 DATADOC-271 Re-packaging Mongo cross-store support, removing 'document' in package name 2011-10-12 07:08:34 -04:00
Thomas Risberg
cd4b409bf2 DATADOC-283 - Adding query support for $and operator. 2011-10-12 11:42:36 +02:00
Oliver Gierke
f71477f17d DATADOC-289 - Fixed invocation of AfterLoadEvent in AbstractMongoEventListener.
Invoke events that are not bound to the domain type before the domain type check. Removed unnecessary generification of AfterLoadEvent.
2011-10-11 17:37:32 +02:00
Oliver Gierke
454df1e7f1 DATADOC-273 - Persisting type information for raw types as well.
We now trigger writing type information for subtypes of raw collections and maps as well.
2011-10-11 13:21:24 +02:00
Oliver Gierke
80641a0943 DATADOC-293 - Added Polygon abstraction to Criteria.
Introduced Polygon value object to capture a list of Points. Polished implementation of Circle (equals(…) and hashCode()) and API of Criteria. Added some additional unit tests. Introduced Shape interface to allow streamlining the implementation of building within-Criterias.

Ignoring the tests for polygons right now until we have updated the Mongo instance on the CI server to 2.0.
2011-10-11 12:31:26 +02:00
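
A sketch of a within-Polygon criteria using the new Shape and Polygon types (the Venue type, field name and coordinates are arbitrary):

    Polygon polygon = new Polygon(new Point(0, 0), new Point(0, 5), new Point(5, 5), new Point(5, 0));
    Query query = new Query(Criteria.where("location").within(polygon));
    List<Venue> venues = mongoTemplate.find(query, Venue.class);
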
Oliver Gierke
e405bf574c DATADOC-291 - MongoQueryCreator now considers mapping information for query building.
Instead of using the pure PropertyPath of the PartTree we ask the MappingContext for a PersistentPropertyPath and create a field name based path expression from it.
2011-10-07 12:00:31 +02:00
Oliver Gierke
9f8e406aff DATADOC-275 - Fixed id handling for DBRefs.
Trying to convert target ids into ObjectId or String before using the actual type.
2011-09-30 11:50:44 +02:00
Oliver Gierke
7d26366352 Added test case to show usage of custom converters and querying. 2011-09-30 09:35:10 +02:00
Oliver Gierke
864ae831dd DATADOC-285 - Added test case for issue, doesn't fail right now. 2011-09-27 14:36:50 +02:00
Oliver Gierke
bfb13d99e3 DATADOC-280 - Added maxAutoConnectRetryTime config option.
Extended MongoOperationsFactoryBean to carry maxAutoConnectRetryTime property and exposed that through the XML namespace.
2011-09-27 13:59:50 +02:00
Oliver Gierke
f39de4c28e DATADOC-286 - Added support for GreaterThanEqual and LessThanEqual.
Using those keywords provided by Spring Data Commons will cause $gte and $lte criterias to be created.
2011-09-27 11:48:20 +02:00
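
A sketch of repository query methods using the new keywords (PersonRepository and the age property are hypothetical):

    public interface PersonRepository extends MongoRepository<Person, String> {

      List<Person> findByAgeGreaterThanEqual(int age); // creates a $gte criteria
      List<Person> findByAgeLessThanEqual(int age);    // creates a $lte criteria
    }
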
Oliver Gierke
3bdeb68617 DATADOC-278 - QueryMapper now converts ids for $ne correctly. 2011-09-27 10:43:02 +02:00
Oliver Gierke
6b40a27c92 DATACMNS-76 - Adapt changes of Spring Data Commons. 2011-09-26 20:21:58 +02:00
Oliver Gierke
237cbec945 DATADOC-284 - Added custom SpringDataMongodbSerializer.
The custom MongodbSerializer considers mapping information when building keys for the Querydsl queries to be executed.
2011-09-26 20:18:47 +02:00
Oliver Gierke
39bc6771b7 DATADOC-183 - Added count(…) methods to MongoOperations.
MongoTemplate is now able to count documents using either an entity class or collection name.
2011-09-15 12:00:03 +02:00
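
A small sketch of the new count(…) methods (domain type, field and collection name are examples):

    long adults = mongoTemplate.count(new Query(Criteria.where("age").gte(18)), Person.class);
    long allPersonDocs = mongoTemplate.count(new Query(), "person");
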
Oliver Gierke
73b2d5a99c DATADOC-270 - Removed critical Sonar warnings from codebase. 2011-09-14 15:34:29 +02:00
Oliver Gierke
5dab0c721e DATADOC-277 - Upgraded to Querydsl 2.2.2. 2011-09-14 11:49:36 +02:00
Oliver Gierke
bf7c9663cf DATADOC-274, DATADOC-276 - Split up repository package to be consistent with Spring Data JPA.
Introduced dedicated config, query and support packages. Updated Sonargraph architecture description introducing subsystems for repository layer.
2011-09-14 11:47:55 +02:00
Oliver Gierke
135742b7e4 Fixed potential NullPointerException in debug level in MongoQueryCreator. 2011-09-14 11:38:58 +02:00
Mark Pollack
dee0307055 DATADOC-269 - XML configuration for replica sets is not working 2011-09-13 12:17:40 -04:00
Oliver Gierke
06fb4144e0 DATADOC-261 - Added id to geoNear section of reference Documentation. 2011-09-07 13:46:11 +02:00
Oliver Gierke
5fdc600570 DATADOC-272 - Moved XSD file into correct package. 2011-09-07 13:29:06 +02:00
Oliver Gierke
f54c69b6ef DATADOC-258 - Updated dependencies to the latest versions. 2011-09-07 11:53:54 +02:00
Oliver Gierke
decdcff79f Added fallback references to general repository documentation. 2011-09-07 11:51:28 +02:00
Oliver Gierke
be8daa5268 Removed obsolete MongoBeanWrapper and MongoPropertyDescriptors. 2011-09-07 11:50:04 +02:00
Oliver Gierke
c4cd074d4d DATADOC-259 - Fixed potential NullPointerException in MappingMongoConverter.writeMapInternal(…).
MappingMongoConverter.writeInternal(…) invoked ….writeMapInternal(…) handing in null for the TypeInformation which violated the implicit contract for the method. Made contract explicit in Javadoc and hand in plain Map TypeInformation.
2011-09-07 11:24:50 +02:00
Oliver Gierke
8b7521a93b DATADOC-268 - CustomConversions considers types simple only for registered write converters.
In cases where only a reading converter is registered (e.g. to manually instantiate the object instance) the type the reading converter is registered for must not be regarded as simple as it will be written to the DBObject as is.
2011-09-07 10:25:44 +02:00
Oliver Gierke
ba508f497c DATACMNS-72 - Adapt changes introduced in Spring Data Commons.
Removed references to MappingContextAware(BeanPostProcessor).
2011-09-07 08:14:48 +02:00
Oliver Gierke
091246a9aa Polished map-reduce tests and formatted documentation with XMLEditor. 2011-09-04 14:00:09 +02:00
Oliver Gierke
745e1f313d DATADOC-259 - MappingMongoConverter now converts Maps nested in Collections as well.
Fixed nested type handling in MappingMongoConverter.writeMapInternally(…). Force usage of ConversionService for simple values if read value doesn't match target type.
2011-09-04 13:57:43 +02:00
Mark Pollack
234e04f4a3 Minor doc fixes 2011-09-02 19:27:36 -04:00
Thomas Risberg
15ab529596 preparing for snapshot builds 2011-09-01 20:37:40 -04:00
Thomas Risberg
00bea23eed preparing for 1.0.0.M4 MongoDB release 2011-09-01 20:32:10 -04:00
Thomas Risberg
c646c1f3b2 Updated changelog for 1.0.0.M4 MongoDB 2011-09-01 20:23:48 -04:00
Thomas Risberg
069f491603 DATADOC-232 added unit tests for multi set and inc updates 2011-09-01 18:56:16 -04:00
Thomas Risberg
e426ba4d43 DATADOC-210 removed Java 6 only method to provide Java 5 compatibility 2011-09-01 18:56:16 -04:00
Mark Pollack
756f886cef DATADOC-7 - Support for map-reduce operations in MongoTemplate 2011-09-01 18:03:32 -04:00
Thomas Risberg
9bb42245bb changed the test for index names since they don't seem to be consistent 2011-09-01 17:52:21 -04:00
Thomas Risberg
c15263a259 updated .gitignore 2011-09-01 17:51:46 -04:00
Mark Pollack
758ee97a8d DATADOC-7 - Support for map-reduce operations in MongoTemplate 2011-09-01 16:34:52 -04:00
Mark Pollack
4f92690f58 DATADOC-255 - Add to MongoOperations and executeCommand with an additional integer options argument
DATADOC-256 - Update to use MongoDB driver version 2.6.5
DATADOC-7 - Support for map-reduce operations in MongoTemplate
2011-09-01 15:33:55 -04:00
Mark Pollack
595ed69820 DATADOC-255 - Add to MongoOperations and executeCommand with an additional integer options argument
DATADOC-256 - Update to use MongoDB driver version 2.6.5
DATADOC-7 - Support for map-reduce operations in MongoTemplate
2011-09-01 15:33:13 -04:00
Oliver Gierke
48bf08afa3 DATADOC-68 - Updated documentation regarding geoNear usage with MongoOperations. 2011-09-01 17:32:27 +02:00
Mark Pollack
98bdae4a00 DATADOC-7 (Add initial options class for Map reduce) 2011-09-01 11:19:54 -04:00
Mark Pollack
f08b87d8d6 DATADOC-7 (Add initial options class for Map reduce) 2011-09-01 10:35:12 -04:00
Oliver Gierke
6c3ffeac5f Merge branch 'geo-repo'
* geo-repo:
  DATADOC-68 - Support for geo-near queries at repositories.
2011-09-01 00:33:38 +02:00
Oliver Gierke
490ddc4a0e DATADOC-254 - Reject invalid MongoDB database names.
Only letters, numbers, underscores and dashes are now allowed.
2011-09-01 00:29:55 +02:00
Oliver Gierke
ae26f4fea1 DATADOC-68 - Support for geo-near queries at repositories. 2011-09-01 00:28:48 +02:00
Oliver Gierke
e9ea756a3a DATADOC-253 - Upgraded to Spring 3.0.6. 2011-08-31 20:32:30 +02:00
Oliver Gierke
c4c95813b7 DATADOC-199 - Added caching for MongoPersistentProperty.isIdProperty and ….getFieldName.
Continuous field and annotation lookups in those methods have turned out to be hotspots in performance tests. Added a CachingMongoPersistentProperty that delegates to the actual implementation once and caches the result.
2011-08-31 19:41:04 +02:00
Mark Pollack
bfeb1b34c1 DATADOC-170 - Streamlined AbstractMongoEventListener (fix test) 2011-08-30 10:30:23 -04:00
Oliver Gierke
44def7dddb DATADOC-170 - Streamlined AbstractMongoEventListener.
Renamed Abstract{Mapping => Mongo}EventListener. Removed generic typing for the MongoMappingEvent and stick to domain type generification only.
2011-08-29 23:04:43 +02:00
Mark Pollack
df10bb2168 DATADOC-202 - Add a 'DocumentCallbackHandler' so that a callback can process each DBObject returned from a query (cherry-picking from commit 2ddc77c25e09a81ee61b3931337b35ae9a67b6e5) 2011-08-29 14:48:35 -04:00
Mark Pollack
f98607f5dc Add integration tests for mapping events 2011-08-29 11:46:05 -04:00
Oliver Gierke
da23133327 Yet another round of formatting.
Added Eclipse formatter settings.
2011-08-26 20:26:06 +02:00
Oliver Gierke
ce5046c35f DATADOC-63 - Added TypeMapper abstraction to customize how type information is written to a DBObject and retrieved from it.
Added TypeMapper abstraction to allow plugging in a custom implementation. Current type handling logic is now in DefaultTypeMapper, which additionally allows customizing the key the type information is stored under or disabling writing type information by setting the key to null.

Added ConfigurableTypeMapper implementation that gets a Map configured to define Strings to be used to store type information.
2011-08-26 20:20:38 +02:00
Oliver Gierke
95245015bc DATADOC-245 - Default to custom target type if raw type is null.
Really only use the custom target type in case it is a real subtype of the basic one. Before that we used the plain custom one, which was lacking generics information potentially available in the basic one.
2011-08-26 13:44:47 +02:00
Oliver Gierke
fc40e6b08c DATADOC-243 - Allow db-factory-ref attribute for converter-element. 2011-08-24 16:51:56 +02:00
Oliver Gierke
eac5cb8c46 DATADOC-247 - BigInteger ids handled properly in QueryMapper.
QueryMapper now tries to convert given ids to ObjectId and String.
2011-08-24 16:31:18 +02:00
Oliver Gierke
54377031bb DATADOC-216 - Added ability to configure a WriteConcern on DB level.
Added WriteConcern property to MongoDbFactory and expose it through the db-factory namespace element.
2011-08-24 09:16:06 +02:00
Oliver Gierke
213963f2ff DATADOC-215 - Allow configuring a WriteConcern per MongoFactoryBean.
Added ability to configure a WriteConcern for an entire MongoFactoryBean and exposed the attribute via the namespace.
2011-08-23 20:47:19 +02:00
Oliver Gierke
0fb21aa2a1 DATADOC-248 - Customized MappingMongoEntityInformation to allow taking a custom collection.
In case a repository query method returns a domain type not assignable to the repository domain type, we have to use the repository's domain type to determine the collection but still use the returned domain type to hand to the unmarshalling. Thus, we need to set up a custom MongoEntityInformation to reflect this scenario.

Extended MappingMongoEntityInformation to allow manually defining a custom collection name. EntityInformationCreator and MongoQueryMethod were extended accordingly to set up the EntityInformation for this scenario.
2011-08-23 13:42:32 +02:00
Oliver Gierke
e0da98ec51 DATADOC-217 - Consolidated collection handling when reading.
Extracted readCollectionOrArray(…) method and made sure it's used everywhere a Collection has to be read. Test cases now check that empty collections are correctly read when using @PersistenceConstructor as well.
2011-08-22 18:15:22 +02:00
Oliver Gierke
7610246a1d Fixed GeoResultsUnitTests. 2011-08-22 17:37:37 +02:00
Oliver Gierke
ce59d893ba DATACMNS-61 - Adapted changes in Spring Data Commons.
Use QueryMethod accessor methods instead of dropped Type enum to determine query execution. Added custom MongoParameters and MongoParameter to allow discovering a Distance parameter for repository queries which will transparently add a 'maxDistance' clause for *Near criterias in query methods. If the given Distance is equipped with a Metric we will rather use $nearSphere over $near.

Added asList() to Point class to circumvent bug in BasicBSONObject.equals(…) which breaks equals(…) comparisons of DBObjects in case they use arrays as values. See [0] for details. Adapted usage of Point objects to use asList() over asArray().

[0] https://jira.mongodb.org/browse/JAVA-416
2011-08-21 16:46:11 +02:00
Oliver Gierke
e130fb5a2d DATACMNS-61 - Adapted changes in Spring Data Commons.
Use QueryMethod accessor methods instead of dropped Type enum to determine query execution. Added custom MongoParameters and MongoParameter to allow discovering a Distance parameter for repository queries which will transparently add a 'maxDistance' clause for *Near criterias in query methods. If the given Distance is equipped with a Metric we will rather use $nearSphere over $near.

Added asList() to Point class to circumvent bug in BasicBSONObject.equals(…) which breaks equals(…) comparisons of DBObjects in case they use arrays as values. See [0] for details. Adapted usage of Point objects to use asList() over asArray().

[0] https://jira.mongodb.org/browse/JAVA-416
2011-08-21 15:18:36 +02:00
Oliver Gierke
fedcbdae4f DATADOC-68 - Added support for geoNear command.
Introduced GeoResult value object as well as NearQuery. NearQuery allows definition of an origin and distances. Introduced a Metric interface and Metrics enum to carry commonly used metrics like kilometers and miles to ease the handling in NearQueries. Introduced Distance value object to capture distances in Metrics.
2011-08-20 17:56:26 +02:00
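
A hedged sketch of the geoNear support described above (the Restaurant type and coordinates are made up):

    NearQuery nearQuery = NearQuery.near(new Point(-73.99, 40.73))
        .maxDistance(new Distance(5, Metrics.KILOMETERS));

    GeoResults<Restaurant> nearby = mongoTemplate.geoNear(nearQuery, Restaurant.class);
    for (GeoResult<Restaurant> result : nearby) {
      // result.getDistance() carries the distance from the origin in the query's metric
    }
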
Oliver Gierke
7cd020ffa7 Removed some compiler warnings. 2011-08-20 13:27:08 +02:00
Oliver Gierke
b01e1a994b DATADOC-245 - MappingMongoConverter now reads nested, untyped Maps correctly.
Added some defaulting code in MappingMongoConverter that converts DBObject instances into Maps in case the raw property type is Object (see getMoreConcreteTargetType()).
2011-08-19 22:43:26 +02:00
Oliver Gierke
dd02338b5e DATADOC-246 - Added DBRef to Mongo simple types.
General overhaul of default setup of custom Mongo simple types. MongoTemplate.doUpdate(…) transparently handles null queries now as well.
2011-08-19 21:28:25 +02:00
Oliver Gierke
7084839df1 DATADOC-194 - Use OSGi valid version loading a class. 2011-08-19 09:33:39 +02:00
Oliver Gierke
09aad4343f DATADOC-241 - Made readMap(…) protected to allow overriding.
Aligned method signatures for reading and writing for consistent parameter order.
2011-08-16 14:25:48 +02:00
Oliver Gierke
c6a97ef407 DATADOC-240, DATADOC-212 - Overhaul of MongoTemplate.doUpdate(…).
Replaced manual ID conversion with delegating to QueryMapper. Added ObjectId as Mongo native type to CustomConversions and added unit tests around its handling.
2011-08-16 13:03:52 +02:00
Oliver Gierke
ceac760d19 DATADOC-236 - MongoQueryCreator now regards the Sort object from the PartTree on query completion. 2011-08-16 10:40:44 +02:00
Oliver Gierke
244e9bc6d8 DATADOC-238 - Reverted change in test case that introduced side effects. 2011-08-16 10:30:50 +02:00
Oliver Gierke
2016aab969 DATADOC-235 - Arbitrary Map values are now converted correctly.
Map values are now handled correctly regardless of the actual Map value type declaration (can be Object in the most open case). We now handle collections as Map value types correctly. BasicDBList instances are now hinted to become Lists by default (if not typed to another collection type by the property). 

Reduced visibility of MappingMongoConverter.addCustomTypeKeyIfNecessary(…) and made createCollectionDBObject(…) safe against null values for the TypeInformation.
2011-08-15 20:54:16 +02:00
Oliver Gierke
35f180f999 DATADOC-238 - Fixed applying pagination for manually defined queries.
BasicQuery accidentally shadowed the limit and skip fields of Query and introduced setters instead of builder-style mutators. This caused the getters not to return the values set through the mutators, which essentially turned off pagination for manually defined queries.

Removed the shadowing and created test case. Refactored constructors.
2011-08-15 19:59:08 +02:00
Oliver Gierke
ad8b6fccb4 DATADOC-237 - Index name considers field name if not configured explicitly. 2011-08-14 17:22:50 +02:00
Oliver Gierke
d237ee80c8 DATADOC-239 - Clarify documentation of passed in values for MongoReader. 2011-08-12 20:06:04 +02:00
Oliver Gierke
eb276841dd DATADOC-214 - Cleaned up MongoConverter interface and implementations.
Removed quite some obsolete methods from MongoConverter interface. Renamed maybeConvertObject(…) to convertToMongoType(…). Moved implementation of that method into MappingMongoConverter. Let the implementation transparently use custom Converters as well. Removed SimpleMongoConverter. Switched QueryMapper implementation from using a MongoConverter to use a ConversionService. Removed custom "maybe convert" logic from ConvertingParameterAccessor in favor of MongoWriter.convertToMongo(…).
2011-08-12 15:39:57 +02:00
Mark Pollack
1d51052ab1 Merge pull request #4 from georgecalm/master
DATADOC-231: setting optional deps in the manifest to make the bundle work in an OSGi server
2011-08-02 07:09:46 -07:00
Yuriy Nemtsov
58898e97c2 DATADOC-231 - Making mysema.query & collections15 deps optional in the manifest 2011-07-31 20:46:12 -04:00
Oliver Gierke
a5328da460 Added handling of BigInteger.
Added converter to handle BigInteger values. Adapted id handling to try converting an object to String before taking the id as is. Made custom converter implementations safe against invocations with null.
2011-07-28 13:29:32 +02:00
Oliver Gierke
fd7e41b753 DATADOC-228 - MappingMongoConverter now writes null for map values.
Fixes a NPE that one got when trying to persist a Map containing null as value for entries.
2011-07-28 11:17:47 +02:00
Oliver Gierke
f349f5ea10 DATADOC-226 - Added QuerydslRepositorySupport class to ease implementing repositories using Querydsl predicates. 2011-07-28 11:16:33 +02:00
Oliver Gierke
aa9d69d584 DATADOC-225 - BasicMongoPersistentEntity doesn't reject root entities without id anymore. 2011-07-27 09:55:56 +02:00
Oliver Gierke
764317635c Fixed annotation package pattern for APT processor. 2011-07-27 09:52:13 +02:00
Oliver Gierke
9ed5e6886c DATADOC-224 - Inspect value entity metadata in case it's a subtype of the declared property. 2011-07-26 23:35:05 +02:00
Oliver Gierke
94f36d27fc Upgraded Querydsl APT plugin to 1.0.2. 2011-07-26 21:22:11 +02:00
Oliver Gierke
a5fb4872b7 DATADOC-221 - Potentially convert values of maps on reading. 2011-07-26 21:13:51 +02:00
Oliver Gierke
2f577c9678 Updated Spring Data Commons dependency to 1.2.0.BUILD-SNAPSHOT. 2011-07-25 23:44:38 +02:00
Oliver Gierke
9bcd19866f DATADOC-211 - Guard potential NullPointerException in AbstractMongoConverter.maybeConvertObject(…).
Quickfix, will probably undergo a deeper cleanup as part of DATADOC-214.
2011-07-25 14:37:30 +02:00
Oliver Gierke
2af45518bd Formatting in MappingMongoConverter. 2011-07-22 17:53:04 +02:00
Oliver Gierke
e1daf36ed8 DATADOC-209 - MappingMongoConverter handles collections of enums correctly now. 2011-07-22 17:52:48 +02:00
Oliver Gierke
101064769c DATADOC-207 - Empty custom maps get read correctly.
Eagerly detect the map target type by using the TypeInformation instead of falling back to the raw Map interface.
2011-07-22 10:16:13 +02:00
Oliver Gierke
ac9c804aae DATADOC-206 - Upgraded to Querydsl 2.2.0. 2011-07-21 09:11:22 +02:00
Oliver Gierke
992f09d731 Removed unused imports, polished license headers and suppressed some compiler warnings. 2011-07-21 09:07:49 +02:00
Oliver Gierke
4324ed8231 Javadoc cleanups and a few code cleanups. 2011-07-20 19:03:08 +02:00
Oliver Gierke
eebe973209 Added *.sonar4clipseExternals, .springBeans and *.orig to .gitignore.
Removed accidentally checked in .classpath.orig file.
2011-07-20 18:59:14 +02:00
Oliver Gierke
bb01cccac5 Updated architecture description to allow core package to have access to geo. 2011-07-20 18:51:35 +02:00
Oliver Gierke
2bdd49e3b7 Beautify upper version ranges in manifest. 2011-07-14 21:34:32 +02:00
Oliver Gierke
f424b7c760 DATADOC-166 - Check for null on removing objects. 2011-07-14 21:34:11 +02:00
Oliver Gierke
58b9db28a8 Polished Sonargraph architecture description. 2011-07-13 12:42:56 +02:00
Oliver Gierke
3e56210c78 Polished test cases a little. 2011-07-13 12:01:32 +02:00
Oliver Gierke
523612a3a9 DATADOC-175 - Broke up cyclic dependencies and added architecture management file.
Added initial architecture description for Sonargraph. Moved some types around and introduced core package to break up cyclic dependencies.
2011-07-13 11:57:38 +02:00
Jon Brisbin
1604c80d32 DATADOC-176 - Added test case to make sure non-ObjectIds can be used in DBRefs 2011-07-12 10:18:10 -05:00
Jon Brisbin
d27a1ad310 DATADOC-176 - Changed from using ObjectId to using Object for IDs in DBRefs. 2011-07-12 09:33:31 -05:00
Oliver Gierke
d2cca2c52a DATADOC-192 - Tweak Array handling in processing collection-like structures in MappingMongoConverter.
Fixes broken tests.
2011-07-08 13:22:48 +02:00
Oliver Gierke
ce0539a3dc DATADOC-192 - MappingMongoConverter handles collections correctly now.
Replaced hard coded List creation with delegate to Spring's CollectionFactory. Although the List should get converted before setting the value we can prevent that additional step by looking up the correct collection type upfront.
2011-07-08 13:04:25 +02:00
Oliver Gierke
4b4c35b904 DATADOC-191 - Removed 'document' from package names.
Polished template.mf as well by using Maven version placeholders.
2011-07-08 11:25:43 +02:00
Oliver Gierke
f0df16a340 Added more unit tests for Box. 2011-07-08 11:00:06 +02:00
Oliver Gierke
a9670de959 DATADOC-134 - Added test case to show indexes with unique flag work. 2011-07-08 10:23:45 +02:00
Oliver Gierke
2d5f41f65c DATADOC-171 - Convert BigDecimals to String by default.
Unfortunately MongoDB can't handle BigDecimal instances by default, thus we have to provide a default serialization. We do so by simply serializing them into a String and reading them back. This can be customized by registering a custom Converter with the MongoConverter.
2011-07-07 21:35:54 +02:00
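
Since the default can be overridden with a custom Converter, here is a minimal sketch of such a converter (storing BigDecimal as a double instead of a String; purely illustrative, not the default behaviour):

    import java.math.BigDecimal;
    import org.springframework.core.convert.converter.Converter;

    public class BigDecimalToDoubleConverter implements Converter<BigDecimal, Double> {

      public Double convert(BigDecimal source) {
        // deliberately trades precision for a native numeric representation
        return source.doubleValue();
      }
    }
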
Oliver Gierke
2e3f2c602c Made integration test less aggressive in cleaning up to avoid inter-test collisions. 2011-07-07 19:52:16 +02:00
Oliver Gierke
a2687b6688 DATADOC-188 - Allow controlling repository index creation.
Disabled index creation for repository query methods by default. Added property on MongoRepositoryFactoryBean to enable index creation and expose that via 'create-query-indexes' attribute on the namespace.
2011-07-07 13:22:25 +02:00
Oliver Gierke
e89ea320bc Added @NoRepositoryBean to MongoRepository to prevent it being accidentally picked up during class path scanning. 2011-07-07 13:14:48 +02:00
Oliver Gierke
df24218a4f DATADOC-189 - Improved extensibility of MongoRepositoryFactoryBean.
Creation of MongoRepositoryFactory is now delegated into a template method that gets a MongoTemplate handed over.
2011-07-07 10:46:54 +02:00
Oliver Gierke
2d09b8b7d1 DATADOC-186 - Ensure index property order.
Use LinkedHashMap inside Index to make sure properties added are kept in the order of the addition.
2011-07-07 07:45:15 +02:00
Oliver Gierke
dd06973f50 DATADOC-190 - SimpleMongoRepository.exists(…) now works for entities with non-ObjectId id type.
Changed the implementation of the exists(…) method to make sure the query is handed to the QueryMapper to convert the id appropriately before executing the query.
2011-07-06 19:49:11 +02:00
Oliver Gierke
2256ebac1b DATADOC-172 - Allow defining property order for document mapping.
Entities can now use the @Field annotation to define the order using its order attribute. We simply start with the lower values and proceed to the higher ones. Unannotated properties are ordered behind all annotated (and order-declared) ones. One must not assume any particular order for fields whose order is not manually defined.

Removed @FieldName as it is superseded by @Field.
2011-07-06 19:09:24 +02:00
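
A sketch of the order attribute described above (the entity and its fields are invented):

    @Document
    public class Person {

      @Field(order = 1) String firstname; // written first
      @Field(order = 2) String lastname;  // written second
      String nickname;                    // unannotated, ordered behind the annotated properties
    }
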
Oliver Gierke
f4373957b3 DATADOC-177 - Sort now preserves order of individual sort properties.
Using a LinkedHashMap now to preserve the order of properties to be sorted upon.
2011-07-06 15:39:22 +02:00
Thomas Risberg
af0cd9049a DATADOC-178 removed System.out.println statement 2011-06-24 12:45:17 -04:00
Thomas Risberg
99edcb82cf reformatted code to use tabs 2011-06-24 11:50:28 -04:00
Thomas Risberg
7a98877e75 DATADOC-181 added destroy() method and DisposableBean interface to call close on Mongo 2011-06-24 11:45:42 -04:00
Oliver Gierke
d856a7cad9 DATACMNS-49, DATADOC-100 - Allow externalizing Mongo repository queries.
Adapted changes in Spring Data Commons for DATACMNS-49. We now automatically pick up classpath*:META-INF/mongo-named-queries.properties to find named queries when using the namespace. See AbstractMongoRepositoryIntegrationTests.findsPeopleByNamedQuery() for sample.
2011-06-23 19:12:22 +02:00
Oliver Gierke
c1b396cca5 Replaced custom implementation with usage of TypeInformation.getActualType(). 2011-06-17 20:28:13 +02:00
Oliver Gierke
2e86012b3f Lowered log level to speed up test execution. 2011-06-13 20:25:36 -07:00
Oliver Gierke
538a62efce Adapted package refactorings in Spring Data Commons. 2011-06-13 20:25:15 -07:00
Oliver Gierke
3573851bfc Polishing.
Removed obsolete methods from MappingMongoConverter exposing internals. Added JavaDoc here and there. Polished GeospatialIndex value object and added some assertions to it. Fixed generics warnings in GeoIndexedAppConfig.
2011-06-13 14:53:07 -07:00
Oliver Gierke
e47ddf4bf2 Polished MongoPersistentEntityIndexCreator.
Fixed generics warnings; the index key is now set to the field name in any case. Setting the index name now only affects the name.
2011-06-13 13:25:06 -07:00
Oliver Gierke
c59711a409 Use findOne(…) for single entity queries inside repository abstraction. 2011-06-13 13:24:43 -07:00
Oliver Gierke
b1ab3f6ade DATADOC-162 - Added more test cases around Point class. 2011-06-13 09:53:11 -07:00
Thomas Risberg
fa094eef25 DATADOC-162 fixed formatting in toString method 2011-06-10 09:01:56 -04:00
Oliver Gierke
8ac2ff6b81 DATACMNS-169 - Revised custom converter handling.
Registering custom converters now automatically declares the custom handled types simple by using the SimpleTypeHolder built up by the CustomConversions instance. All the custom converter lookup logic is now in CustomConversions and AbstractMongoConverter uses that over handling it itself.
2011-06-09 16:35:04 -07:00
Oliver Gierke
46d6dffdd4 DATADOC-167 - @Document annotation is now inherited into subclasses 2011-06-07 11:42:53 -07:00
Oliver Gierke
06632cc3ae DATACMNS-42 - Fixed some bugs in nested collection handling. 2011-06-06 14:46:06 -07:00
Oliver Gierke
df0427442a DATACMNS-42 - Removed BigInteger as Mongo specific custom type as MappingBeanHelper covers Number implementations already. 2011-06-06 11:10:53 -07:00
Thomas Risberg
eb20aea7fe preparing for snapshot builds 2011-06-02 13:45:25 -04:00
Thomas Risberg
1e52f8641c preparing for 1.0.0.M3 MongoDB release 2011-06-02 13:39:28 -04:00
Thomas Risberg
37f2fcee31 DATADOC-147 - Updated reference documentation to cover changes from M2 to M3 2011-06-02 12:03:02 -04:00
Thomas Risberg
9a0e45de83 DATADOC-147 - Updated reference documentation to cover changes from M2 to M3 2011-06-02 11:54:54 -04:00
Oliver Gierke
b140fc1698 Updated changelog. 2011-06-02 11:50:48 +02:00
Oliver Gierke
ac4b27159b DATADOC-161 - MappingMongoConverter now supports nested Maps as well.
Extracted method to handle reading Maps which is now also able to recursively resolve nested maps by using the generics information.
2011-06-02 11:47:00 +02:00
Thomas Risberg
e8b130691c updated changelog for M3 2011-06-01 22:11:44 -04:00
Thomas Risberg
836b5ad258 DATADOC-160 refactoring to make parameter ordering more consistent, especially for string parameter specifying collection name override; renamed insertList methods; removed unnecessary methods 2011-06-01 22:08:09 -04:00
Thomas Risberg
dc9f96908f DATADOC-157 reformatted; changing spaces back to tabs for indentation 2011-06-01 19:20:05 -04:00
Thomas Risberg
f9fdbb469f updated changelog for M3 2011-06-01 13:03:57 -04:00
Thomas Risberg
134996d079 DATADOC-146 added overloaded regex method taking options parameter 2011-06-01 10:14:22 -04:00
Thomas Risberg
25b9a56030 DATADOC-156 added some more tests around different id types and in queries 2011-06-01 08:48:44 -04:00
Thomas Risberg
2c585d49d6 removed duplicate dependency from pom; removed reference to removed module from assembly 2011-05-31 23:03:59 -04:00
Thomas Risberg
7bed1264b1 DATADOC-157 - changed order parameters are declared for JavaDoc 2011-05-31 22:54:21 -04:00
Thomas Risberg
523d48759f DATADOC-155 removed print to sysout; polished 2011-05-31 22:50:00 -04:00
Thomas Risberg
1a5bffb52a DATADOC-155 added more tests for id properties; fixed mapping issue where String id wasn't converted to ObjectId properly 2011-05-31 22:45:09 -04:00
Thomas Risberg
467f7c61af DATADOC-158 changed to use Spring's StringUtils 2011-05-31 22:35:26 -04:00
Thomas Risberg
89e83d1f69 fixed NPE during document processing by changing to correct language for programlisting element 2011-05-31 17:45:11 -04:00
Mark Pollack
b4b61a96e9 DATADOC-156 - MongoOperations.find(query(where("id").in(ids)) fails where ids aren't ObjectIds 2011-05-31 16:09:37 -04:00
Mark Pollack
131a2912e9 DATADOC-157 - MongoTemplate updateFirst/updateMulti methods to take java.lang.Class parameter as last in method param list to be consistent with other usage 2011-05-31 15:55:13 -04:00
Mark Pollack
b4236bdd78 More tests for DATADOC-155 - Need to support plain POJOs with non-ObjectId-compatible ID properties 2011-05-31 15:44:17 -04:00
J. Brisbin
43d0f74a3e Initial stab at changing the way IDs are handled in the mapping converter. 2011-05-31 13:21:03 -05:00
Oliver Gierke
aeea1bc5d5 Added @PersistenceConstructor to Circle. 2011-05-31 13:54:10 +01:00
Oliver Gierke
187f087270 Minor generics polishing in test cases. 2011-05-30 12:05:08 +02:00
Mark Pollack
115d419d0a Comment geo tests back in. 2011-05-27 16:43:18 -04:00
Mark Pollack
e0bd465649 DATADOC-147 - Update reference documentation to cover changes from M2 to M3 (partial work) 2011-05-27 16:40:38 -04:00
J. Brisbin
6a73e94c57 DATADOC-155 - Add support for any type for id properties, not just the ones that can be converted into an ObjectId 2011-05-27 14:22:58 -05:00
Mark Pollack
7ea14bb4d5 Add failing test PersonPojoWithPrimitiveIdTests
DATADOC-147 - Update reference documentation to cover changes from M2 to M3 (partial work)
2011-05-27 13:13:23 -04:00
J. Brisbin
bfc4bc2100 Added clean-up step for new primitive ID tests, cleaned up unused imports in MappingTests.java 2011-05-25 13:03:38 -05:00
Mark Pollack
842c87389d remove extraneous import 2011-05-25 13:50:16 -04:00
J. Brisbin
e751666f90 Added test and a fix around using primitive ints as IDs 2011-05-25 12:39:36 -05:00
Mark Pollack
f2305681d3 DATADOC-147 - Update reference documentation to cover changes from M2 to M3 (partial work) 2011-05-25 00:30:10 -04:00
Mark Pollack
fb39f01f25 DATADOC-88 - Create MongoDbFactory to consolidate DB, Server location, and user credentials into one location
DATADOC-147 - Update reference documentation to cover changes from M2 to M3 (partial work)
2011-05-24 22:09:12 -04:00
Mark Pollack
e1f8eee2d1 DATADOC-88 - Create MongoDbFactory to consolidate DB, Server location, and user credentials into one location
DATADOC-147 - Update reference documentation to cover changes from M2 to M3 (partial work)
2011-05-24 18:07:18 -04:00
Mark Pollack
c7c2a66c3b DATADOC-88 - Create MongoDbFactory to consolidate DB, Server location, and user credentials into one location 2011-05-24 17:24:08 -04:00
Mark Pollack
69b1b9b96b DATADOC-88 - Create MongoDbFactory to consolidate DB, Server location, and user credentials into one location 2011-05-24 17:23:02 -04:00
Mark Pollack
2284a5137e DATADOC-88 - Create MongoDbFactory to consolidate DB, Server location, and user credentials into one location 2011-05-24 17:17:54 -04:00
Mark Pollack
46e2cf698e DATADOC-147 - Update reference documentation to cover changes from M2 to M3 (partial work) 2011-05-24 16:15:14 -04:00
Mark Pollack
1087b07086 DATADOC-147 - Update reference documentation to cover changes from M2 to M3 (partial work) 2011-05-24 13:44:43 -04:00
Oliver Gierke
d55505f1e5 DATADOC-130 - Added custom converters for Locale and Character.
Register a custom converter for Locale and Character classes by default as we have to consider them simple (as they must not be inspected during mapping) but they have to be converted to String values before being handed over to Mongo.
2011-05-24 18:39:24 +02:00
Oliver Gierke
de06029ea2 Updated reference documentation for Mongo repositories. 2011-05-24 17:36:00 +02:00
Oliver Gierke
4d33c9c360 DATADOC-149 - Removed constructor not taking a MongoDbFactory as well as setter for it from MappingMongoConverter. 2011-05-24 17:32:23 +02:00
Oliver Gierke
8474a28538 Adapted introduction of BeanWrapper in Spring Data Commons.
Moved code dealing with the ConversionService into AbstractMongoConverter. Added getConversionService() to MongoConverter interface. Replaced usage of MappingBeanHelper with BeanWrapper usage.
2011-05-24 16:03:49 +02:00
Thomas Risberg
edba941dd0 DATADOC-83 - removed the spring-data-document-core module; move @RelatedDocument to the spring-data-mongodb-cross-store module; renamed "org.springframework.data.persistence.document.mongo" package 2011-05-24 10:00:06 -04:00
Mark Pollack
c75218387b DATADOC-42 - Provide option for configuring replica sets using the Mongo namespace 2011-05-24 02:08:17 -04:00
Mark Pollack
2e5906fa56 DATADOC-147 - Update reference documentation to cover changes from M2 to M3 (partial work) 2011-05-24 00:57:39 -04:00
Mark Pollack
c6c3dfef15 DATADOC-42 - Provide option for configuring replica sets using the Mongo namespace (partial work) 2011-05-24 00:57:03 -04:00
Mark Pollack
43925c9cf6 add logging to some collection operations 2011-05-24 00:55:49 -04:00
Mark Pollack
3893eb126d DATADOC-83 - Review dependencies 2011-05-24 00:55:30 -04:00
Mark Pollack
c8e16318e8 DATADOC-83 - Review dependencies 2011-05-24 00:49:17 -04:00
Mark Pollack
c48f892124 DATADOC-83 - Review dependencies 2011-05-23 23:16:43 -04:00
Mark Pollack
4c01993666 DATADOC-138 - Expose all properties of the MongoOptions class in the Mongo namespace
DATADOC-135 - <mongo:options /> should use - instead of camel case to be consistent with other attribute names
2011-05-23 23:08:07 -04:00
Oliver Gierke
bf8b85ef98 Fixed invalid exception in looking up a type hint from DBObject.
So far the implementation of MappingMongoConverter.findTypeToBeUsed(…) threw an exception if it found a type hint but couldn't load the class. This caused issues when class names change while the document still contains 'old' type information. Not being able to load a class should simply be treated as no type information found, thus we're returning null now. Updated Javadoc accordingly.
2011-05-24 00:42:09 +02:00
Thomas Risberg
974faec6dd DATADOC-88 fixed missing propagation of host/port when no nested <mongo> element provided 2011-05-23 13:33:09 -04:00
Thomas Risberg
b5e83d4350 DATADOC-105 added list of jars needed for builds not using Maven 2011-05-23 12:46:28 -04:00
Thomas Risberg
c79989ce99 DATADOC-56 added a section documenting _id conversion rules 2011-05-23 12:30:04 -04:00
Oliver Gierke
208c977e0a Moved QueryDslPredicateExecutor into Spring Data Commons. 2011-05-23 12:23:06 +02:00
Oliver Gierke
df1e900c55 Fixed template.mf to only export mongodb package. 2011-05-22 15:28:13 +02:00
Oliver Gierke
22ab2007da DATADOC-145 - Fixed mapping collections with abstract component types.
Let recursive mapping calls of collection elements use the value type instead of the collection's component type. Refactored MappingMongoConverter to make collection handling more maintainable. Added code to always add custom type information if the actual value being stored differs from the declared one.

Moved some of the DBRef discovering code into implementations of MongoPersistentProperty. Renamed MongoPersistentProperty.getKey() to ….getFieldName().
2011-05-22 15:20:04 +02:00
Thomas Risberg
fdc81440bd DATADOC-15 introduced a protected WriteConcern prepareWriteConcern(WriteConcern writeConcern) method to MongoTemplate to facilitate subclass customization 2011-05-20 17:01:31 -04:00
Thomas Risberg
4052506df1 DATADOC-88 changed db attribute to dbname on <mongo:db-factory> 2011-05-20 16:28:36 -04:00
Thomas Risberg
4356297421 Changed tests to use 'database' as the database name except for the repositories tests 2011-05-20 16:11:33 -04:00
Oliver Gierke
d0da787f70 Removed mapping-context-ref from repositories namespace.
We now take the mapping context from the wired MongoTemplate.
2011-05-20 20:47:16 +02:00
Oliver Gierke
e89d09cc86 DATADOC-144 - Added @FieldName annotation to allow defining the name of the field a property shall be stored to. 2011-05-20 20:47:16 +02:00
Thomas Risberg
3e38595fe8 DATADOC-142 changed constructor taking Mongo and database name to accept user credentials 2011-05-20 12:58:07 -04:00
J. Brisbin
ad287efb4e DATADOC-143 - Made MongoMappingContext the default converter for the template, which also meant:
Several changes to how objects are initialized inside the template:

1. If one is not specified, a MappingMongoConverter is created and set as the default.
2. A special ApplicationEventPublisher implementation is installed by default to handle creating indexes when the template isn't used inside a Spring application context.
3. If a Spring application context is available, it will be set as the template's application context and eventPublisher, with the index creator being registered as an event listener if one isn't already present.

The tests had to be changed in a couple places to accurately reflect how mapping contexts and converters are now handled.
2011-05-20 11:38:20 -05:00
Thomas Risberg
98da8beb67 DATADOC-80 changed setCustomConverters to take a Set<Object> instead of a List similar to ConversionServiceFactory.registerConverters(…) 2011-05-20 08:04:13 -04:00
Oliver Gierke
9324ae2593 DATADOC-137 - Repositories now correctly replace multiple placeholders in JSON based queries.
When annotating a repository method with e.g. @Query("{ 'firstname' : ?0, 'lastname' : ?1 }"), all placeholders now get replaced correctly. Added unit tests and fixed broken logging in StringBasedQueryCreator.
2011-05-20 08:27:08 +02:00
Mark Pollack
adc56ce79f DATADOC-83 - Review dependencies
Removed the dependency on spring-mvc in data-document-core (moved to spring-data-document-examples\mongodb-myrestaurants-analytics)
2011-05-19 14:04:34 -04:00
Thomas Risberg
2832b524d3 DATADOC-121 switched to use MappingMongoConverter as default; removed some renamed methods reintroduced from recent merge 2011-05-19 13:25:29 -04:00
Mark Pollack
1640db5d7c removed unused abstractions in data-document-core 2011-05-19 13:02:53 -04:00
Thomas Risberg
ac762b2289 changed Xlint options to avoid 'bad path element' error during build 2011-05-19 12:54:23 -04:00
Thomas Risberg
9429326ec2 removed collection from annotation since it is not used 2011-05-19 12:37:34 -04:00
Thomas Risberg
3e840e2380 DATADOC-121 deprecated the SimpleMongoConverter 2011-05-19 10:11:41 -04:00
J. Brisbin
5b57b40274 Added namespace parser for MongoDbFactory, made changes to getting mapping stuff to use MongoDbFactory 2011-05-19 08:47:50 -05:00
Thomas Risberg
63d9d35cba DATADOC-80 renamed setConverters to setCustomConverters and made signature consistent between Simple and mapping converters 2011-05-18 17:59:42 -04:00
Thomas Risberg
bf5fc0ff1f DATADOC-122 added a MongoCollectionsUtils to provide a method for preferred collection name 2011-05-18 17:24:50 -04:00
Oliver Gierke
6287fa425d DATADOC-108 - Added findById(…) methods to MongoTemplate.
Removed Criteria.whereId(…). Updated SimpleMongoRepository to use the new method and use more core template methods to prevent objects from being marshalled to find out whether a particular object exists.
2011-05-18 23:16:45 +02:00
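
A small sketch of the new findById(…) methods (domain type, id value and collection name are examples):

    Person person = mongoTemplate.findById(personId, Person.class);
    Person fromNamedCollection = mongoTemplate.findById(personId, Person.class, "people");
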
Oliver Gierke
345e6ebf8b DATADOC-136 - Fixed conversion of enums.
Introduced writeSimpleInternal(…) method that automatically stores the name of an enum instead of the enum itself. Changed quite a few places to rather use MongoPersistentProperty.getKey() over getName().
2011-05-18 23:16:44 +02:00
Thomas Risberg
4aaa32fe1d DATADOC-141 DATADOC-89 introduced a protected prepareCollection method and a boolean flag setSlaveOk on the MongoTemplate to have template instance level control over slave behavior 2011-05-18 15:46:22 -04:00
Thomas Risberg
60774dca26 DATADOC-88 switched test config files to use MongoDbFactory 2011-05-18 10:20:33 -04:00
Oliver Gierke
c9d5565aaa DATACMNS-38 - Added unit test for adding a self-referencing entity to the mapping context. 2011-05-17 20:19:08 +01:00
Thomas Risberg
30e96f9c97 DATADOC-88 added accessors for Mongo and database name to MongoDbFactory 2011-05-17 12:13:42 -04:00
Thomas Risberg
9641434090 DATADOC-88 removed unnecessary package-protected constructor 2011-05-17 12:06:07 -04:00
Thomas Risberg
86960006cb DATADOC-88 introducing a MongoDbFactory interface and MongoTemplate constructors that take this as an argument 2011-05-16 18:07:56 -04:00
Thomas Risberg
21cd013cf1 DATADOC-88 re-introducing a MongoDbFactoryBean for configuring Mongo DB objects 2011-05-16 14:46:58 -04:00
Thomas Risberg
95750ffde1 cleaned up unused imports 2011-05-16 12:19:34 -04:00
Thomas Risberg
6ed1d2b226 DATADOC-110 removed 'substituteMappedIdIfNecessary' since this is now handled by QueryMapper 2011-05-16 12:11:55 -04:00
Thomas Risberg
4e04d0acd1 DATADOC-119 added test using 'afterMappingMongoConverterCreation' to register custom converters 2011-05-15 11:26:48 -04:00
Thomas Risberg
2212ca4b00 DATADOC-133 changed non-string attributes to xsd:string to support configuration using property placeholder; upgraded to Mongo Java driver 2.5.3 2011-05-13 17:26:47 -04:00
Oliver Gierke
025691a97a Change indentation from spaces to tabs. 2011-05-13 18:04:19 +02:00
Oliver Gierke
94e4d2b095 DATADOC-130 - Convert map key type to discovered generic key type.
Reading values into a map assumed map keys to be String values. We now leverage the conversion service to convert the key to the type we discover from the generics property information.
2011-05-13 12:45:38 +02:00
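The DATADOC-130 change above can be pictured with a hypothetical document class carrying a map with non-String keys:

import java.util.HashMap;
import java.util.Map;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
public class Survey {

    @Id
    private String id;

    // MongoDB stores map keys as strings; on reading, the conversion service
    // now converts them back to Integer based on the generics information.
    private Map<Integer, String> answersByQuestion = new HashMap<Integer, String>();
}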
J. Brisbin
41e49ad3e2 DATADOC-96 - Tweaked the way the QueryMapper works so that it more reliably finds id properties to convert to _id and tries to turn their values into ObjectIds where it can; if it can't, it puts the value in as-is. 2011-05-12 16:52:03 -05:00
Thomas Risberg
35629f5370 DATADOC-119 improved the converter parsing 2011-05-12 09:18:23 -04:00
Oliver Gierke
0b50e58020 DATACMNS-29 - Adapted changes in Spring Data Commons.
User repository interfaces are not required to implement MongoRepository anymore. Added missing override for save(Iterable<? extends T> entities) to return a List<T>.
2011-05-12 13:43:09 +02:00
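A sketch of what the DATACMNS-29 adaptation allows; both interfaces and the Person type are hypothetical:

import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.repository.CrudRepository;

// Hypothetical document class
class Person { String id; }

// A plain CrudRepository is sufficient now; implementing MongoRepository is optional.
interface PersonRepository extends CrudRepository<Person, String> {
}

// Extending MongoRepository additionally gives the List-returning override
// of save(Iterable<? extends T>).
interface PersonMongoRepository extends MongoRepository<Person, String> {
}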
Thomas Risberg
9b86637031 DATADOC-119 added support for registering custom converters via the mongo namespace 2011-05-11 15:00:59 -04:00
Oliver Gierke
d7f33774e0 DATACMNS-35 - Added implementation and test for newly introduced Repository.delete(ID id) method. 2011-05-11 18:37:59 +02:00
Oliver Gierke
12ddfcc9f9 DATADOC-128 - Enable storing inheritance trees.
The document being persisted now gets a _class attribute to carry the actual type. That field key will be made configurable by a subsequent commit; its value should be interpreted as a type hint by clients and might be interpreted using a type mapper at a later stage as well (see DATADOC-63). For now it carries the fully-qualified Java class name.

On reads, MappingMongoConverter will consider this field when choosing a type to bind the data to if, and only if, the type stored there is a subtype of the actually requested one. So if we have a document carrying Person type information and you query for Contact, you get back a Person object. If you query for a type the stored type is not a subtype of, you get an instance of the requested type instead.

Added unit tests and an integration test covering the Contact/Person scenario.
2011-05-10 18:21:47 +02:00
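The Contact/Person scenario from the DATADOC-128 commit, as a rough sketch; method and package names follow later releases of the API:

import java.util.List;

import org.springframework.data.mongodb.core.MongoOperations;

public class TypeHintExample {

    public static class Contact { }

    public static class Person extends Contact { }

    public void roundTrip(MongoOperations operations) {
        // Stored with an additional "_class" attribute carrying Person's
        // fully-qualified class name.
        operations.save(new Person());

        // Person is a subtype of the requested Contact, so the converter uses
        // the stored type hint and returns Person instances.
        List<Contact> contacts = operations.findAll(Contact.class);
    }
}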
Oliver Gierke
47f184dbf0 Fixed generic warning by using more concrete meta-model type. 2011-05-10 18:16:05 +02:00
Oliver Gierke
2ca10c13c5 Extracted custom mongo repository namespace attributes to make them usable in <repositories /> and <repository />. 2011-05-10 18:14:06 +02:00
Oliver Gierke
965f9fd260 DATACMNS-34 - Adapted refactorings in Spring Data Commons regarding PersistenceConstructor discovery. 2011-05-10 10:20:06 +02:00
Thomas Risberg
d5c625dc2a DATADOC-107 added an overloaded method taking a Collection; added test to only allow a single parameter of type Collection 2011-05-09 08:54:39 -04:00
Oliver Gierke
55ce0b8272 DATADOC-130 - Conversion of Maps with simple key types works again.
Don't use the ConversionService for simple key types, as we unregistered ObjectToStringConverter. This in turn causes Number, Boolean, Locale and the like to no longer be convertible to String, as for those types only the Type -> String converter is registered but no corresponding converter back. Opened a ticket for this against core Spring (SPR-8306).
2011-05-07 19:27:48 +02:00
Oliver Gierke
1c8a55a081 DATADOC-113 - NotNull and IsNull are now working for repositories.
Fixed Criteria so that it creates a correct query for ….is(null). MongoQueryCreator now uses ….ne(…) instead of ….not().is(…), as the latter doesn't generate a correct query (see DATADOC-129).
2011-05-07 15:53:45 +02:00
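Repository methods affected by the DATADOC-113 fix, sketched against a hypothetical Person document with an email field:

import java.util.List;

import org.springframework.data.repository.Repository;

// Hypothetical document class
class Person { String id; String email; }

interface PersonRepository extends Repository<Person, String> {

    // Renders as { "email" : null }
    List<Person> findByEmailIsNull();

    // Renders as { "email" : { "$ne" : null } }; combining $not with $is would
    // not produce a correct query (see DATADOC-129).
    List<Person> findByEmailIsNotNull();
}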
Oliver Gierke
35259f4e12 Reapplied old formatting to avoid merge conflicts (yet again). 2011-05-07 15:50:15 +02:00
Thomas Risberg
c9facf5338 DATADOC-129 changed Criteria to disallow 'not' before 'is' - should use 'ne' instead 2011-05-06 12:00:35 -04:00
Thomas Risberg
283e5cb76b DATADOC-118 refactoring some of the protected methods; some other minor refactorings 2011-05-05 18:12:56 -04:00
J. Brisbin
f05d200cd3 Fix the build since a new change for DATADOC-114 caused some tests to not compile cleanly. 2011-05-05 15:14:43 -05:00
J. Brisbin
62ecfc8416 DATADOC-114 - Fixes for updateFirst/updateMulti not converting POJOs correctly 2011-05-05 15:09:28 -05:00
Thomas Risberg
dfbc89c3b6 DATADOC-118 removed methods that take a MongoWriter 2011-05-05 13:59:29 -04:00
Thomas Risberg
b6c6016b1e DATADOC-120 removed MongoReaderWriter interface 2011-05-05 13:46:07 -04:00
Thomas Risberg
c9fe785c32 DATADOC-118 removed methods that take a MongoReader 2011-05-05 13:45:43 -04:00
Oliver Gierke
7b3d030794 DATADOC-43 - Implemented Near and Within for repository queries.
Refactored spatial domain classes to use each other a bit more. Added assertions to fail fast on invalid usage. Improved Geospatial index creation in MongoPersistentEntityIndexCreator by using the index abstraction instead of manually building the DBObject.

Fixed implementation of SimpleMongoRepository.deleteAll() to not drop the collection as this causes indexes to be dropped as well.

Skip index creation from query methods for now if we encounter a Near or Within part as we can't build combined queries right now.
2011-05-05 14:30:07 +02:00
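The Near and Within keywords from DATADOC-43, sketched on a hypothetical Venue document with a location property; the geo types are shown with the packages they use in later releases:

import java.util.List;

import org.springframework.data.geo.Circle;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Point;
import org.springframework.data.repository.Repository;

// Hypothetical document class
class Venue { String id; Point location; }

interface VenueRepository extends Repository<Venue, String> {

    // Builds a $near query, limited by the given maximum Distance.
    List<Venue> findByLocationNear(Point location, Distance distance);

    // Builds a $within query using the given Circle.
    List<Venue> findByLocationWithin(Circle circle);
}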
Oliver Gierke
02448bc0ee Reverted formatting change in Criteria for now. 2011-05-05 14:21:08 +02:00
Thomas Risberg
e4fdabba1d DATADOC-117 removed defaultCollection on MongoTemplate/Operations; changed getDefaultCollectionName to getCollectionName(Class clazz) to determine the collection name used for a specific class; added a class parameter where necessary 2011-05-04 18:34:15 -04:00
Thomas Risberg
0c9ee0eacd DATADOC-106 added support for $ne and $nor 2011-05-03 07:54:59 -04:00
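The new operators from DATADOC-106 expressed through the Criteria API; Person and its fields are hypothetical, and class names follow later package layouts:

import static org.springframework.data.mongodb.core.query.Criteria.where;

import java.util.List;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

// Hypothetical document class
class Person { String id; String status; int age; String city; }

class NeNorQueries {

    // { "status" : { "$ne" : "CLOSED" } }
    List<Person> notClosed(MongoOperations operations) {
        return operations.find(new Query(where("status").ne("CLOSED")), Person.class);
    }

    // { "$nor" : [ { "age" : 18 }, { "city" : "Reston" } ] }
    List<Person> neitherMinorNorLocal(MongoOperations operations) {
        Query query = new Query(new Criteria().norOperator(
                where("age").is(18), where("city").is("Reston")));
        return operations.find(query, Person.class);
    }
}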
Oliver Gierke
7f6a6094e6 DATADOC-115 - Upgraded to Querydsl 2.2.0-beta4.
Enabled treatment of non-annotated classes as embedded documents to let Querydsl create query classes for them. Annotated BasePerson with @QuerySupertype, as Querydsl seems to need this annotation to create the query class for supertypes in the 2.2 branch.

That issue is tracked in https://bugs.launchpad.net/querydsl/+bug/776219, so we should be able to remove the annotation as soon as it is fixed.
2011-05-03 11:51:28 +02:00
Jon Brisbin
68f8bd62d1 DATADOC-114 - Fix bug in Update class that wasn't converting POJOs properly when being used from updateFirst/Multi 2011-05-02 13:04:10 -05:00
Thomas Risberg
caa8faf769 DATADOC-102 modified Update to allow multiple field updates for most operations 2011-04-30 17:39:55 -04:00
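A sketch of the multi-field updates enabled by DATADOC-102, using a hypothetical Account document and class names as in later package layouts:

import static org.springframework.data.mongodb.core.query.Criteria.where;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;

// Hypothetical document class
class Account { String id; String status; String lockedBy; boolean active; int lockCount; }

class AccountUpdates {

    void lockInactiveAccounts(MongoOperations operations) {
        // Several $set and $inc operations combined in a single Update.
        Update update = new Update()
                .set("status", "LOCKED")
                .set("lockedBy", "system")
                .inc("lockCount", 1);

        operations.updateMulti(new Query(where("active").is(false)), update, Account.class);
    }
}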
Thomas Risberg
6e4e487eb4 added collection name to logging messages 2011-04-30 10:48:28 -04:00
Thomas Risberg
d065bd74b3 DATADOC-96 added additional or test 2011-04-28 16:28:01 -04:00
Oliver Gierke
d648c95b62 Removed accidentally committed files. 2011-04-28 21:44:43 +02:00
Oliver Gierke
ef2b0235c7 DATADOC-109 - Introduced MappingContext implementation for SimpleMongoConverter.
Adapted changes in Spring Data Commons. Adapted test cases accordingly. Introduced SimpleMongoMappingContext that reflects the meta-model assumptions in SimpleMongoConverter. Adapted repository factories accordingly, as we can now assume that a MappingContext is always available.

Refactored QueryMapper to be stateless so that we don't need to recreate instances of it. Added unit tests to verify id property to key mapping and type conversion to ObjectId.

Polished MappingTests to simply drop the database after all tests have finished to make sure it starts with a clean state on a potential next run.
2011-04-28 21:35:19 +02:00
Jon Brisbin
321948d4a2 DATADOC-97 - Fix a couple bugs in the QueryMapper class. 2011-04-27 11:26:20 -05:00
Jon Brisbin
21f859e222 DATADOC-97 - Fixed an NPE in the QueryMapper class. 2011-04-27 11:17:06 -05:00
Jon Brisbin
224934a28c DATADOC-97 - Took out extraneous methods from MongoTemplate as the result of incorporating a new QueryMapper class. 2011-04-27 11:13:09 -05:00
J. Brisbin
a74d9ca7cd Merge branch 'query-mapper' 2011-04-27 11:07:35 -05:00
Jon Brisbin
76159a2216 DATADOC-97 - Added a QueryMapper helper to map properties referenced in query criteria to their proper name (_id in the case of @Id properties) as well as support conversion to ObjectId for those values that support the conversion. 2011-04-27 11:04:39 -05:00
Oliver Gierke
9c9138c4e9 DATADOC-101 - Improved custom converter handling.
Converters are now considered for both reading and writing. I also added a shortcut to invoke custom converters for top-level types. So far the registered ones had only been used for properties of the given root object, not the root object itself.
2011-04-27 14:54:53 +02:00
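A sketch of a converter pair that now applies to both reading and writing, including top-level values; the Email type is hypothetical and registration happens via the Set-based setter introduced in DATADOC-80:

import java.util.HashSet;
import java.util.Set;

import org.springframework.core.convert.converter.Converter;

class Email {

    final String value;

    Email(String value) {
        this.value = value;
    }
}

// Used when writing an Email, now also when it is the top-level object.
class EmailToStringConverter implements Converter<Email, String> {
    public String convert(Email source) {
        return source.value;
    }
}

// Used when reading the stored string back into an Email.
class StringToEmailConverter implements Converter<String, Email> {
    public Email convert(String source) {
        return new Email(source);
    }
}

class ConverterSetup {

    Set<Object> customConverters() {
        Set<Object> converters = new HashSet<Object>();
        converters.add(new EmailToStringConverter());
        converters.add(new StringToEmailConverter());
        // Hand this set to the converter, e.g. via setCustomConverters(…).
        return converters;
    }
}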
Jon Brisbin
7567ba0355 DATADOC-95 - Fix issue where an object with all null properties wasn't being saved. 2011-04-27 07:52:01 -05:00
Thomas Risberg
dc36d91d8e DATADOC-94 added plugin repository info for maven.springframework.org/release to resolve the AWS wagon extension 2011-04-25 16:42:20 -04:00
Mark Pollack
f0b30ec39e DATADOC-99 - Reference documentation shows invalid field spec for @Query usage with repositories 2011-04-25 14:01:32 -04:00
Mark Pollack
178b220d2d Expanded the test to cover retrieval of subelements; added and removed the Spring .xsd in STS to clear a tooling-related error 2011-04-25 14:01:31 -04:00
Oliver Gierke
fe74557c95 DATACMNS-33 - Adapted API changes for Repository.count().
Also changed return types for count() methods on QueryDslPredicateExecutor.
2011-04-21 10:02:59 +02:00
Oliver Gierke
399beff795 DATACMNS-32 - Removed Querydsl support classes and use the ones from Spring Data Commons Core. 2011-04-20 20:37:06 +02:00
J. Brisbin
9380a88f26 Fix for problem loading domain classes with DBRef on them. 2011-04-19 14:01:57 -05:00
J. Brisbin
5e4583110e Took out unused variable 2011-04-19 13:13:09 -05:00
J. Brisbin
8a3296758d DATADOC-98 - Fixes for multi-dimensional arrays and lists as properties. 2011-04-19 12:50:50 -05:00
Thomas Risberg
60ba9bfcc2 updated to 1.1.0.BUILD-SNAPSHOT for data-commons 2011-04-19 11:26:52 -04:00
Thomas Risberg
db75aca336 fixed highlight xls 2011-04-19 11:24:55 -04:00
Oliver Gierke
36d62e12fc Added Mongo annotation package to the supported annotation types. 2011-04-19 16:28:27 +02:00
Oliver Gierke
f9b1fb57cd Added Spring Release repository as plugin repository to find AspectJ 1.6.11.RELEASE. 2011-04-19 16:28:27 +02:00
Mark Pollack
3ab2aeb5c1 add mvn repo information 2011-04-11 14:10:04 -04:00
Thomas Risberg
99e96958ed preparing for snapshot builds 2011-04-09 19:11:26 -04:00
Thomas Risberg
20c9af6550 updated changelog with latest release info 2011-04-09 19:02:44 -04:00
Thomas Risberg
df50623fc7 preparing for snapshot builds 2011-04-09 18:51:57 -04:00
506 changed files with 31133 additions and 22690 deletions

5
.gitignore vendored
View File

@@ -2,11 +2,16 @@
*.iml
*.ipr
*.iws
*.orig
target
.springBeans
.sonar4clipse
*.sonar4clipseExternals
.ant-targets-build.xml
.settings/
.project
.classpath
src/ant/.ant-targets-upload-dist.xml
atlassian-ide-plugin.xml
/.gradle/
/.idea/

36
pom.xml
View File

@@ -3,17 +3,15 @@
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-document-dist</artifactId>
<name>Spring Data Document Distribution</name>
<version>1.0.0.M2</version>
<artifactId>spring-data-mongo-dist</artifactId>
<name>Spring Data MongoDB Distribution</name>
<version>1.0.0.RC1</version>
<packaging>pom</packaging>
<modules>
<module>spring-data-document-parent</module>
<module>spring-data-document-core</module>
<module>spring-data-mongodb</module>
<module>spring-data-mongodb-cross-store</module>
<module>spring-data-mongodb-log4j</module>
<!-- <module>spring-data-couchdb</module> -->
<module>spring-data-mongodb-parent</module>
</modules>
<developers>
@@ -91,9 +89,9 @@
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<!-- dist.* properties are used by the antrun tasks below -->
<dist.id>spring-data-document</dist.id>
<dist.name>Spring Data</dist.name>
<dist.key>DATADOC</dist.key>
<dist.id>spring-data-mongo</dist.id>
<dist.name>Spring Data Mongo</dist.name>
<dist.key>SDMONGO</dist.key>
<dist.version>${project.version}</dist.version>
<dist.releaseType>snapshot</dist.releaseType>
<dist.finalName>${dist.id}-${dist.version}</dist.finalName>
@@ -176,7 +174,7 @@
</fileset>
</copy>
<move file="${project.basedir}/target/site/reference/pdf/index.pdf"
tofile="${project.basedir}/target/site/reference/pdf/spring-data-document-reference.pdf"
tofile="${project.basedir}/target/site/reference/pdf/spring-data-mongo-reference.pdf"
failonerror="false"/>
</postProcess>
</configuration>
@@ -259,11 +257,25 @@
</dependencies>
</plugin>
</plugins>
<!-- the name of this project is 'spring-data-document-dist';
make sure the zip file is just 'spring-data-document'. -->
<!-- the name of this project is 'spring-data-mongo-dist';
make sure the zip file is just 'spring-data-mongo'. -->
<finalName>${dist.finalName}</finalName>
</build>
<pluginRepositories>
<pluginRepository>
<!-- necessary for bundlor and utils -->
<id>repository.plugin.springsource.release</id>
<name>SpringSource Maven Repository</name>
<url>http://repository.springsource.com/maven/bundles/release</url>
</pluginRepository>
<pluginRepository>
<id>repository.springframework.maven.release</id>
<name>Spring Framework Maven Release Repository</name>
<url>http://maven.springframework.org/release</url>
</pluginRepository>
</pluginRepositories>
<distributionManagement>
<!-- see 'staging' profile for dry-run deployment settings -->
<downloadUrl>http://www.springsource.com/spring-data</downloadUrl>

View File

@@ -1,133 +0,0 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-document-parent</artifactId>
<version>1.0.0.BUILD-SNAPSHOT</version>
<relativePath>../spring-data-document-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-couchdb</artifactId>
<packaging>jar</packaging>
<name>Spring Data CouchDB Support</name>
<licenses>
<license>
<name>The Apache Software License, Version 2.0</name>
<url>http://www.apache.org/licenses/LICENSE-2.0.txt</url>
<distribution>repo</distribution>
</license>
</licenses>
<developers>
<developer>
<id>tareq.abedrabbo</id>
<name>Tareq Abedrabbo</name>
<email>tareq.abedrabbo@opencredo.com</email>
<organization>OpenCredo</organization>
<organizationUrl>http://www.opencredo.org</organizationUrl>
<roles>
<role>Project Admin</role>
<role>Developer</role>
</roles>
<timezone>+0</timezone>
</developer>
<developer>
<id>tomas.lukosius</id>
<name>Tomas Lukosius</name>
<email>tomas.lukosius@opencredo.com</email>
<organization>OpenCredo</organization>
<organizationUrl>http://www.opencredo.org</organizationUrl>
<roles>
<role>Project Admin</role>
<role>Developer</role>
</roles>
<timezone>+0</timezone>
</developer>
</developers>
<dependencies>
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
<scope>test</scope>
</dependency>
<!-- Spring Data -->
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-document-core</artifactId>
</dependency>
<!-- Jackson JSON -->
<dependency>
<groupId>org.codehaus.jackson</groupId>
<artifactId>jackson-core-asl</artifactId>
<version>1.6.1</version>
</dependency>
<dependency>
<groupId>org.codehaus.jackson</groupId>
<artifactId>jackson-mapper-asl</artifactId>
<version>1.6.1</version>
</dependency>
<dependency>
<groupId>javax.annotation</groupId>
<artifactId>jsr250-api</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.hamcrest</groupId>
<artifactId>hamcrest-all</artifactId>
<version>1.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-all</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
</dependency>
<!-- Couch DB -->
<dependency>
<groupId>com.google.code.jcouchdb</groupId>
<artifactId>jcouchdb</artifactId>
<version>0.11.0-1</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>com.springsource.bundlor</groupId>
<artifactId>com.springsource.bundlor.maven</artifactId>
</plugin>
</plugins>
</build>
</project>

View File

@@ -1,34 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb;
import org.springframework.dao.InvalidDataAccessResourceUsageException;
import org.springframework.web.client.HttpServerErrorException;
public class CouchServerResourceUsageException extends InvalidDataAccessResourceUsageException {
/**
* Create a new CouchServerResourceUsageException,
* wrapping an arbitrary HttpServerErrorException.
*
* @param cause the HttpServerErrorException thrown
*/
public CouchServerResourceUsageException(HttpServerErrorException cause) {
super(cause != null ? cause.getMessage() : null, cause);
}
}

View File

@@ -1,34 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.web.client.HttpClientErrorException;
public class CouchUsageException extends InvalidDataAccessApiUsageException {
/**
* Create a new CouchUsageException,
* wrapping an arbitrary HttpServerErrorException.
*
* @param cause the HttpServerErrorException thrown
*/
public CouchUsageException(HttpClientErrorException cause) {
super(cause != null ? cause.getMessage() : null, cause);
}
}

View File

@@ -1,34 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.web.client.HttpStatusCodeException;
public class DocumentExistsException extends DataIntegrityViolationException {
/**
* Create a new DocumentExistsException,
* wrapping an arbitrary HttpServerErrorException.
*
* @param cause the HttpServerErrorException thrown
*/
public DocumentExistsException(String documentId, HttpStatusCodeException cause) {
super(cause != null ? cause.getMessage() : null, cause);
}
}

View File

@@ -1,34 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb;
import org.springframework.dao.UncategorizedDataAccessException;
import org.springframework.web.client.RestClientException;
public class UncategorizedCouchDataAccessException extends UncategorizedDataAccessException {
/**
* Create a new HibernateSystemException,
* wrapping an arbitrary HibernateException.
*
* @param cause the HibernateException thrown
*/
public UncategorizedCouchDataAccessException(RestClientException cause) {
super(cause != null ? cause.getMessage() : null, cause);
}
}

View File

@@ -1,64 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.admin;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import org.springframework.data.document.couchdb.support.CouchUtils;
import org.springframework.util.StringUtils;
import org.springframework.web.client.RestOperations;
import org.springframework.web.client.RestTemplate;
public class CouchAdmin implements CouchAdminOperations {
private String databaseUrl;
private RestOperations restOperations = new RestTemplate();
public CouchAdmin(String databaseUrl) {
if (!databaseUrl.trim().endsWith("/")) {
this.databaseUrl = databaseUrl.trim() + "/";
} else {
this.databaseUrl = databaseUrl.trim();
}
}
public List<String> listDatabases() {
String dbs = restOperations.getForObject(databaseUrl + "_all_dbs", String.class);
return Arrays.asList(StringUtils.commaDelimitedListToStringArray(dbs));
}
public void createDatabase(String dbName) {
org.springframework.util.Assert.hasText(dbName);
restOperations.put(databaseUrl + dbName, null);
}
public void deleteDatabase(String dbName) {
org.springframework.util.Assert.hasText(dbName);
restOperations.delete(CouchUtils.ensureTrailingSlash(databaseUrl + dbName));
}
public DbInfo getDatabaseInfo(String dbName) {
String url = CouchUtils.ensureTrailingSlash(databaseUrl + dbName);
Map dbInfoMap = (Map) restOperations.getForObject(url, Map.class);
return new DbInfo(dbInfoMap);
}
}

View File

@@ -1,34 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.admin;
import java.util.List;
public interface CouchAdminOperations {
// functionality for /_special - replication, logs, UUIDs
List<String> listDatabases();
void createDatabase(String name);
void deleteDatabase(String name);
DbInfo getDatabaseInfo(String name);
}

View File

@@ -1,72 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.admin;
import java.util.Collections;
import java.util.Map;
public class DbInfo {
private Map dbInfoMap;
public DbInfo(Map dbInfoMap) {
super();
this.dbInfoMap = dbInfoMap;
}
public boolean isCompactRunning() {
return (Boolean) this.dbInfoMap.get("compact_running");
}
public String getDbName() {
return (String) this.dbInfoMap.get("db_name");
}
public long getDiskFormatVersion() {
return (Long) this.dbInfoMap.get("disk_format_version");
}
public long getDiskSize() {
return (Long) this.dbInfoMap.get("disk_size");
}
public long getDocCount() {
return (Long) this.dbInfoMap.get("doc_count");
}
public long getDocDeleteCount() {
return (Long) this.dbInfoMap.get("doc_del_count");
}
public long getInstanceStartTime() {
return (Long) this.dbInfoMap.get("instance_start_time");
}
public long getPurgeSequence() {
return (Long) this.dbInfoMap.get("purge_seq");
}
public long getUpdateSequence() {
return (Long) this.dbInfoMap.get("update_seq");
}
public Map getDbInfoMap() {
return Collections.unmodifiableMap(dbInfoMap);
}
}

View File

@@ -1,70 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.config;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.document.couchdb.monitor.ServerInfo;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
public class CouchJmxParser implements BeanDefinitionParser {
public BeanDefinition parse(Element element, ParserContext parserContext) {
String databaseUrl = element.getAttribute("database-url");
if (!StringUtils.hasText(databaseUrl)) {
databaseUrl = "http://localhost:5984";
}
registerJmxComponents(databaseUrl, element, parserContext);
return null;
}
protected void registerJmxComponents(String databaseUrl, Element element, ParserContext parserContext) {
Object eleSource = parserContext.extractSource(element);
CompositeComponentDefinition compositeDef = new CompositeComponentDefinition(element.getTagName(), eleSource);
/*
createBeanDefEntry(AssertMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(BackgroundFlushingMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(BtreeIndexCounters.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(ConnectionMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(GlobalLockMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(MemoryMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(OperationCounters.class, compositeDef, mongoRefName, eleSource, parserContext);
*/
createBeanDefEntry(ServerInfo.class, compositeDef, databaseUrl, eleSource, parserContext);
//createBeanDefEntry(MongoAdmin.class, compositeDef, mongoRefName, eleSource, parserContext);
parserContext.registerComponent(compositeDef);
}
protected void createBeanDefEntry(Class clazz, CompositeComponentDefinition compositeDef, String databaseUrl, Object eleSource, ParserContext parserContext) {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(clazz);
builder.getRawBeanDefinition().setSource(eleSource);
builder.addConstructorArg(databaseUrl);
BeanDefinition assertDef = builder.getBeanDefinition();
String assertName = parserContext.getReaderContext().registerWithGeneratedName(assertDef);
compositeDef.addNestedComponent(new BeanComponentDefinition(assertDef, assertName));
}
}

View File

@@ -1,63 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.core;
import java.net.URI;
public interface CouchOperations {
/**
* Reads a document from the database and maps it a Java object.
* </p>
* This method is intended to work when a default database
* is set on the CouchDbDocumentOperations instance.
*
* @param id the id of the CouchDB document to read
* @param targetClass the target type to map to
* @return the mapped object
*/
<T> T findOne(String id, Class<T> targetClass);
/**
* Reads a document from the database and maps it a Java object.
*
* @param uri the full URI of the document to read
* @param targetClass the target type to map to
* @return the mapped object
*/
<T> T findOne(URI uri, Class<T> targetClass);
/**
* Maps a Java object to JSON and writes it to the database
* </p>
* This method is intended to work when a default database
* is set on the CouchDbDocumentOperations instance.
*
* @param id the id of the document to write
* @param document the object to write
*/
void save(String id, Object document);
/**
* Maps a Java object to JSON and writes it to the database
*
* @param uri the full URI of the document to write
* @param document the object to write
*/
void save(URI uri, Object document);
}

View File

@@ -1,143 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.core;
import java.net.URI;
import java.util.Map;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.data.document.couchdb.CouchServerResourceUsageException;
import org.springframework.data.document.couchdb.CouchUsageException;
import org.springframework.data.document.couchdb.DocumentRetrievalFailureException;
import org.springframework.data.document.couchdb.UncategorizedCouchDataAccessException;
import org.springframework.data.document.couchdb.support.CouchUtils;
import org.springframework.http.*;
import org.springframework.util.Assert;
import org.springframework.web.client.*;
public class CouchTemplate implements CouchOperations {
protected final Log logger = LogFactory.getLog(this.getClass());
private String defaultDocumentUrl;
private RestOperations restOperations = new RestTemplate();
/**
* Constructs an instance of CouchDbDocumentTemplate with a default database
*
* @param defaultDatabaseUrl the default database to connect to
*/
public CouchTemplate(String defaultDatabaseUrl) {
Assert.hasText(defaultDatabaseUrl, "defaultDatabaseUrl must not be empty");
defaultDocumentUrl = CouchUtils.addId(defaultDatabaseUrl);
}
/**
* Constructs an instance of CouchDbDocumentTemplate with a default database
*
* @param defaultDatabaseUrl the default database to connect to
*/
public CouchTemplate(String defaultDatabaseUrl, RestOperations restOperations) {
this(defaultDatabaseUrl);
Assert.notNull(restOperations, "restOperations must not be null");
this.restOperations = restOperations;
}
public <T> T findOne(String id, Class<T> targetClass) {
Assert.state(defaultDocumentUrl != null, "defaultDatabaseUrl must be set to use this method");
try {
return restOperations.getForObject(defaultDocumentUrl, targetClass, id);
//TODO check this exception translation and centralize.
} catch (HttpClientErrorException clientError) {
if (clientError.getStatusCode() == HttpStatus.NOT_FOUND) {
throw new DocumentRetrievalFailureException(defaultDocumentUrl + "/" + id);
}
throw new CouchUsageException(clientError);
} catch (HttpServerErrorException serverError) {
throw new CouchServerResourceUsageException(serverError);
} catch (RestClientException otherError) {
throw new UncategorizedCouchDataAccessException(otherError);
}
}
public <T> T findOne(URI uri, Class<T> targetClass) {
Assert.state(uri != null, "uri must be set to use this method");
try {
return restOperations.getForObject(uri, targetClass);
//TODO check this exception translation and centralize.
} catch (HttpClientErrorException clientError) {
if (clientError.getStatusCode() == HttpStatus.NOT_FOUND) {
throw new DocumentRetrievalFailureException(uri.getPath());
}
throw new CouchUsageException(clientError);
} catch (HttpServerErrorException serverError) {
throw new CouchServerResourceUsageException(serverError);
} catch (RestClientException otherError) {
throw new UncategorizedCouchDataAccessException(otherError);
}
}
public void save(String id, Object document) {
Assert.notNull(document, "document must not be null for save");
HttpEntity<?> httpEntity = createHttpEntity(document);
try {
ResponseEntity<Map> response = restOperations.exchange(defaultDocumentUrl, HttpMethod.PUT, httpEntity, Map.class, id);
//TODO update the document revision id on the object from the returned value
//TODO better exception translation
} catch (RestClientException e) {
throw new UncategorizedCouchDataAccessException(e);
}
}
public void save(URI uri, Object document) {
Assert.notNull(document, "document must not be null for save");
Assert.notNull(uri, "URI must not be null for save");
HttpEntity<?> httpEntity = createHttpEntity(document);
try {
ResponseEntity<Map> response = restOperations.exchange(uri, HttpMethod.PUT, httpEntity, Map.class);
//TODO update the document revision id on the object from the returned value
//TODO better exception translation
} catch (RestClientException e) {
throw new UncategorizedCouchDataAccessException(e);
}
}
private HttpEntity<?> createHttpEntity(Object document) {
if (document instanceof HttpEntity) {
HttpEntity httpEntity = (HttpEntity) document;
Assert.isTrue(httpEntity.getHeaders().getContentType().equals(MediaType.APPLICATION_JSON),
"HttpEntity payload with non application/json content type found.");
return httpEntity;
}
HttpHeaders httpHeaders = new HttpHeaders();
httpHeaders.setContentType(MediaType.APPLICATION_JSON);
HttpEntity<Object> httpEntity = new HttpEntity<Object>(document, httpHeaders);
return httpEntity;
}
}

View File

@@ -1,315 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.core.support;
import java.io.IOException;
import java.nio.charset.Charset;
import java.util.*;
import org.codehaus.jackson.*;
import org.codehaus.jackson.map.JsonMappingException;
import org.codehaus.jackson.map.ObjectMapper;
import org.codehaus.jackson.map.type.TypeFactory;
import org.codehaus.jackson.type.JavaType;
import org.springframework.http.HttpInputMessage;
import org.springframework.http.HttpOutputMessage;
import org.springframework.http.MediaType;
import org.springframework.http.converter.AbstractHttpMessageConverter;
import org.springframework.http.converter.HttpMessageNotReadableException;
import org.springframework.http.converter.HttpMessageNotWritableException;
import org.springframework.util.Assert;
public class CouchDbMappingJacksonHttpMessageConverter extends
AbstractHttpMessageConverter<Object> {
public static final Charset DEFAULT_CHARSET = Charset.forName("UTF-8");
private static final String ROWS_FIELD_NAME = "rows";
private static final String VALUE_FIELD_NAME = "value";
private static final String INCLUDED_DOC_FIELD_NAME = "doc";
private static final String TOTAL_ROWS_FIELD_NAME = "total_rows";
private ObjectMapper objectMapper = new ObjectMapper();
private boolean prefixJson = false;
/**
* Construct a new {@code BindingJacksonHttpMessageConverter}.
*/
public CouchDbMappingJacksonHttpMessageConverter() {
super(new MediaType("application", "json", DEFAULT_CHARSET));
}
/**
* Sets the {@code ObjectMapper} for this view. If not set, a default
* {@link ObjectMapper#ObjectMapper() ObjectMapper} is used.
* <p/>
* Setting a custom-configured {@code ObjectMapper} is one way to take
* further control of the JSON serialization process. For example, an
* extended {@link org.codehaus.jackson.map.SerializerFactory} can be
* configured that provides custom serializers for specific types. The other
* option for refining the serialization process is to use Jackson's
* provided annotations on the types to be serialized, in which case a
* custom-configured ObjectMapper is unnecessary.
*/
public void setObjectMapper(ObjectMapper objectMapper) {
Assert.notNull(objectMapper, "'objectMapper' must not be null");
this.objectMapper = objectMapper;
}
/**
* Indicates whether the JSON output by this view should be prefixed with
* "{} &&". Default is false.
* <p/>
* Prefixing the JSON string in this manner is used to help prevent JSON
* Hijacking. The prefix renders the string syntactically invalid as a
* script so that it cannot be hijacked. This prefix does not affect the
* evaluation of JSON, but if JSON validation is performed on the string,
* the prefix would need to be ignored.
*/
public void setPrefixJson(boolean prefixJson) {
this.prefixJson = prefixJson;
}
@Override
public boolean canRead(Class<?> clazz, MediaType mediaType) {
JavaType javaType = getJavaType(clazz);
return this.objectMapper.canDeserialize(javaType) && canRead(mediaType);
}
/**
* Returns the Jackson {@link JavaType} for the specific class.
* <p/>
* <p/>
* Default implementation returns
* {@link TypeFactory#type(java.lang.reflect.Type)}, but this can be
* overridden in subclasses, to allow for custom generic collection
* handling. For instance:
* <p/>
* <pre class="code">
* protected JavaType getJavaType(Class&lt;?&gt; clazz) {
* if (List.class.isAssignableFrom(clazz)) {
* return TypeFactory.collectionType(ArrayList.class, MyBean.class);
* } else {
* return super.getJavaType(clazz);
* }
* }
* </pre>
*
* @param clazz the class to return the java type for
* @return the java type
*/
protected JavaType getJavaType(Class<?> clazz) {
return TypeFactory.type(clazz);
}
@Override
public boolean canWrite(Class<?> clazz, MediaType mediaType) {
return this.objectMapper.canSerialize(clazz) && canWrite(mediaType);
}
@Override
protected boolean supports(Class<?> clazz) {
// should not be called, since we override canRead/Write instead
throw new UnsupportedOperationException();
}
@Override
protected Object readInternal(Class<?> clazz, HttpInputMessage inputMessage)
throws IOException, HttpMessageNotReadableException {
JavaType javaType = getJavaType(clazz);
try {
return success(clazz, inputMessage);
// return this.objectMapper.readValue(inputMessage.getBody(),
// javaType);
} catch (Exception ex) {
throw new HttpMessageNotReadableException("Could not read JSON: "
+ ex.getMessage(), ex);
}
}
private Object success(Class<?> clazz, HttpInputMessage inputMessage)
throws JsonParseException, IOException {
//Note, parsing code used from ektorp project
JsonParser jp = objectMapper.getJsonFactory().createJsonParser(
inputMessage.getBody());
if (jp.nextToken() != JsonToken.START_OBJECT) {
throw new RuntimeException("Expected data to start with an Object");
}
Map<String, Integer> fields = readHeaderFields(jp);
List result;
if (fields.containsKey(TOTAL_ROWS_FIELD_NAME)) {
int totalRows = fields.get(TOTAL_ROWS_FIELD_NAME);
if (totalRows == 0) {
return Collections.emptyList();
}
result = new ArrayList(totalRows);
} else {
result = new ArrayList();
}
ParseState state = new ParseState();
Object first = parseFirstRow(jp, state, clazz);
if (first == null) {
return Collections.emptyList();
} else {
result.add(first);
}
while (jp.getCurrentToken() != null) {
skipToField(jp, state.docFieldName, state);
if (atEndOfRows(jp)) {
return result;
}
result.add(jp.readValueAs(clazz));
endRow(jp, state);
}
return result;
}
private Object parseFirstRow(JsonParser jp, ParseState state, Class clazz)
throws JsonParseException, IOException, JsonProcessingException,
JsonMappingException {
skipToField(jp, VALUE_FIELD_NAME, state);
JsonNode value = null;
if (atObjectStart(jp)) {
value = jp.readValueAsTree();
jp.nextToken();
if (isEndOfRow(jp)) {
state.docFieldName = VALUE_FIELD_NAME;
Object doc = objectMapper.readValue(value, clazz);
endRow(jp, state);
return doc;
}
}
skipToField(jp, INCLUDED_DOC_FIELD_NAME, state);
if (atObjectStart(jp)) {
state.docFieldName = INCLUDED_DOC_FIELD_NAME;
Object doc = jp.readValueAs(clazz);
endRow(jp, state);
return doc;
}
return null;
}
private boolean isEndOfRow(JsonParser jp) {
return jp.getCurrentToken() == JsonToken.END_OBJECT;
}
private void endRow(JsonParser jp, ParseState state) throws IOException, JsonParseException {
state.inRow = false;
jp.nextToken();
}
private boolean atObjectStart(JsonParser jp) {
return jp.getCurrentToken() == JsonToken.START_OBJECT;
}
private boolean atEndOfRows(JsonParser jp) {
return jp.getCurrentToken() != JsonToken.START_OBJECT;
}
private void skipToField(JsonParser jp, String fieldName, ParseState state) throws JsonParseException, IOException {
String lastFieldName = null;
while (jp.getCurrentToken() != null) {
switch (jp.getCurrentToken()) {
case FIELD_NAME:
lastFieldName = jp.getCurrentName();
jp.nextToken();
break;
case START_OBJECT:
if (!state.inRow) {
state.inRow = true;
jp.nextToken();
} else {
if (isInField(fieldName, lastFieldName)) {
return;
} else {
jp.skipChildren();
}
}
break;
default:
if (isInField(fieldName, lastFieldName)) {
jp.nextToken();
return;
}
jp.nextToken();
break;
}
}
}
private boolean isInField(String fieldName, String lastFieldName) {
return lastFieldName != null && lastFieldName.equals(fieldName);
}
private Map<String, Integer> readHeaderFields(JsonParser jp)
throws JsonParseException, IOException {
Map<String, Integer> map = new HashMap<String, Integer>();
jp.nextToken();
String nextFieldName = jp.getCurrentName();
while (!nextFieldName.equals(ROWS_FIELD_NAME)) {
jp.nextToken();
map.put(nextFieldName, Integer.valueOf(jp.getIntValue()));
jp.nextToken();
nextFieldName = jp.getCurrentName();
}
return map;
}
@Override
protected void writeInternal(Object o, HttpOutputMessage outputMessage)
throws IOException, HttpMessageNotWritableException {
JsonEncoding encoding = getEncoding(outputMessage.getHeaders()
.getContentType());
JsonGenerator jsonGenerator = this.objectMapper.getJsonFactory()
.createJsonGenerator(outputMessage.getBody(), encoding);
try {
if (this.prefixJson) {
jsonGenerator.writeRaw("{} && ");
}
this.objectMapper.writeValue(jsonGenerator, o);
} catch (JsonGenerationException ex) {
throw new HttpMessageNotWritableException("Could not write JSON: "
+ ex.getMessage(), ex);
}
}
private JsonEncoding getEncoding(MediaType contentType) {
if (contentType != null && contentType.getCharSet() != null) {
Charset charset = contentType.getCharSet();
for (JsonEncoding encoding : JsonEncoding.values()) {
if (charset.name().equals(encoding.getJavaName())) {
return encoding;
}
}
}
return JsonEncoding.UTF8;
}
private static class ParseState {
boolean inRow;
String docFieldName = "";
}
}

View File

@@ -1,41 +0,0 @@
/*
* Copyright 2002-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.monitor;
import org.springframework.web.client.RestTemplate;
/**
* Base class to encapsulate common configuration settings when connecting to a CouchDB database
*
* @author Mark Pollack
*/
public abstract class AbstractMonitor {
protected RestTemplate restTemplate;
protected String databaseUrl;
/**
* Gets the databaseUrl used to connect to CouchDB
*
* @return
*/
public String getDatabaseUrl() {
return this.databaseUrl;
}
}

View File

@@ -1,62 +0,0 @@
/*
* Copyright 2002-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.monitor;
import java.net.InetAddress;
import java.net.UnknownHostException;
import java.util.Map;
import org.springframework.jmx.export.annotation.ManagedOperation;
import org.springframework.jmx.export.annotation.ManagedResource;
import org.springframework.web.client.RestTemplate;
/**
* Expose basic server information via JMX
*
* @author Mark Pollack
*/
@ManagedResource(description = "Server Information")
public class ServerInfo extends AbstractMonitor {
public ServerInfo(String databaseUrl) {
this.databaseUrl = databaseUrl;
this.restTemplate = new RestTemplate();
}
@ManagedOperation(description = "Server host name")
public String getHostName() throws UnknownHostException {
return InetAddress.getLocalHost().getHostName();
}
@ManagedOperation(description = "CouchDB Server Version")
public String getVersion() {
return (String) getRoot().get("version");
}
@ManagedOperation(description = "Message of the day")
public String getMotd() {
return (String) getRoot().get("greeting");
}
public Map getRoot() {
Map map = restTemplate.getForObject(getDatabaseUrl(), Map.class);
return map;
}
}

View File

@@ -1,4 +0,0 @@
/**
* CouchDB specific JMX monitoring support.
*/
package org.springframework.data.document.couchdb.monitor;

View File

@@ -1,81 +0,0 @@
/*
* Copyright 2010 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.support;
import org.springframework.dao.DataAccessException;
/**
* Helper class featuring helper methods for internal CouchDB classes.
* <p/>
* <p>Mainly intended for internal use within the framework.
*
* @author Thomas Risberg
* @author Tareq Abedrabbo
* @since 1.0
*/
public abstract class CouchUtils {
/**
* Convert the given runtime exception to an appropriate exception from the
* <code>org.springframework.dao</code> hierarchy.
* Return null if no translation is appropriate: any other exception may
* have resulted from user code, and should not be translated.
*
* @param ex runtime exception that occurred
* @return the corresponding DataAccessException instance,
* or <code>null</code> if the exception should not be translated
*/
public static DataAccessException translateCouchExceptionIfPossible(RuntimeException ex) {
return null;
}
/**
* Adds an id variable to a URL
*
* @param url the URL to modify
* @return the modified URL
*/
public static String addId(String url) {
return ensureTrailingSlash(url) + "{id}";
}
/**
* Adds a 'changes since' variable to a URL
*
* @param url
* @return
*/
public static String addChangesSince(String url) {
return ensureTrailingSlash(url) + "_changes?since={seq}";
}
/**
* Ensures that a URL ends with a slash.
*
* @param url the URL to modify
* @return the modified URL
*/
public static String ensureTrailingSlash(String url) {
if (!url.endsWith("/")) {
url += "/";
}
return url;
}
}

View File

@@ -1 +0,0 @@
http\://www.springframework.org/schema/data/couch=org.springframework.data.document.couchdb.config.CouchNamespaceHandler

View File

@@ -1,2 +0,0 @@
http\://www.springframework.org/schema/data/couch/spring-couch-1.0.xsd=org/springframework/data/document/couchdb/config/spring-couch-1.0.xsd
http\://www.springframework.org/schema/data/couch/spring-couch.xsd=org/springframework/data/document/couchdb/config/spring-couch-1.0.xsd

View File

@@ -1,4 +0,0 @@
# Tooling related information for the Couch DB namespace
http\://www.springframework.org/schema/data/couch@name=Couch Namespace
http\://www.springframework.org/schema/data/couch@prefix=couch
http\://www.springframework.org/schema/data/couch@icon=org/springframework/jdbc/config/spring-jdbc.gif

View File

@@ -1,33 +0,0 @@
<?xml version="1.0" encoding="UTF-8" ?>
<xsd:schema xmlns="http://www.springframework.org/schema/data/couch"
xmlns:xsd="http://www.w3.org/2001/XMLSchema"
xmlns:tool="http://www.springframework.org/schema/tool"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:repository="http://www.springframework.org/schema/data/repository"
targetNamespace="http://www.springframework.org/schema/data/couch"
elementFormDefault="qualified" attributeFormDefault="unqualified">
<xsd:import namespace="http://www.springframework.org/schema/tool"/>
<xsd:import namespace="http://www.springframework.org/schema/context"
schemaLocation="http://www.springframework.org/schema/context/spring-context.xsd"/>
<xsd:import namespace="http://www.springframework.org/schema/data/repository"
schemaLocation="http://www.springframework.org/schema/data/repository/spring-repository.xsd"/>
<xsd:element name="jmx">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines a JMX Model MBeans for monitoring a CouchDB server'.
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="database-url" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The database URL of the CouchDB]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
</xsd:schema>

View File

@@ -1,77 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb;
import java.util.Date;
import org.codehaus.jackson.annotate.JsonIgnoreProperties;
/**
* @author Tareq Abedrabbo (tareq.abedrabbo@opencredo.com)
* @since 13/01/2011
*/
@JsonIgnoreProperties(ignoreUnknown = true)
public class DummyDocument {
private String message;
private String timestamp = new Date().toString();
public DummyDocument() {
}
public DummyDocument(String message) {
this.message = message;
}
public String getMessage() {
return message;
}
public void setMessage(String message) {
this.message = message;
}
public String getTimestamp() {
return timestamp;
}
@Override
public boolean equals(Object o) {
if (this == o) return true;
if (o == null || getClass() != o.getClass()) return false;
DummyDocument document = (DummyDocument) o;
if (message != null ? !message.equals(document.message) : document.message != null) return false;
return true;
}
@Override
public int hashCode() {
return message != null ? message.hashCode() : 0;
}
@Override
public String toString() {
return "DummyDocument{" +
"message='" + message + '\'' +
", timestamp=" + timestamp +
'}';
}
}

View File

@@ -1,52 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb;
import org.hamcrest.Description;
import org.hamcrest.Factory;
import org.hamcrest.Matcher;
import org.hamcrest.TypeSafeMatcher;
import org.springframework.http.HttpEntity;
/**
* Matches the content of the body of an HttpEntity.
*
* @author Tareq Abedrabbo
* @since 31/01/2011
*/
public class IsBodyEqual extends TypeSafeMatcher<HttpEntity> {
private Object object;
public IsBodyEqual(Object object) {
this.object = object;
}
@Override
public boolean matchesSafely(HttpEntity httpEntity) {
return httpEntity.getBody().equals(object);
}
public void describeTo(Description description) {
description.appendText("body equals ").appendValue(object);
}
@Factory
public static Matcher<HttpEntity> bodyEqual(Object object) {
return new IsBodyEqual(object);
}
}
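For reference, a minimal usage sketch of the bodyEqual(…) factory above in a JUnit/Hamcrest assertion. The test class name and the "hello" payload are hypothetical; only IsBodyEqual itself comes from the changeset.

import static org.junit.Assert.assertThat;
import static org.springframework.data.document.couchdb.IsBodyEqual.bodyEqual;

import org.junit.Test;
import org.springframework.http.HttpEntity;

public class IsBodyEqualUsageSketch {

    @Test
    public void matchesEntityBody() {
        // Wrap a payload in an HttpEntity and match on its body content.
        HttpEntity<String> entity = new HttpEntity<String>("hello");
        assertThat(entity, bodyEqual("hello"));
    }
}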

View File

@@ -1,38 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.admin;
import java.util.List;
import junit.framework.Assert;
import org.junit.Ignore;
import org.junit.Test;
import org.springframework.data.document.couchdb.core.CouchConstants;
public class CouchAdminIntegrationTests {
@Test
@Ignore("until CI has couch server running")
public void dbLifecycle() {
CouchAdmin admin = new CouchAdmin(CouchConstants.COUCHDB_URL);
admin.deleteDatabase("foo");
List<String> dbs = admin.listDatabases();
admin.createDatabase("foo");
List<String> newDbs = admin.listDatabases();
Assert.assertEquals(dbs.size() + 1, newDbs.size());
}
}

View File

@@ -1,118 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.core;
import static org.junit.Assume.assumeNoException;
import static org.junit.Assume.assumeTrue;
import static org.springframework.http.HttpStatus.OK;
import java.io.IOException;
import java.util.UUID;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.junit.Before;
import org.junit.BeforeClass;
import org.springframework.http.*;
import org.springframework.http.client.ClientHttpResponse;
import org.springframework.web.client.DefaultResponseErrorHandler;
import org.springframework.web.client.RestClientException;
import org.springframework.web.client.RestTemplate;
/**
* Base class for CouchDB integration tests. Checks whether CouchDB is available before the tests run;
* if CouchDB is not available, the tests are ignored.
*
* @author Tareq Abedrabbo (tareq.abedrabbo@opencredo.com)
* @since 13/01/2011
*/
public abstract class AbstractCouchTemplateIntegrationTests {
protected static final Log log = LogFactory.getLog(AbstractCouchTemplateIntegrationTests.class);
protected static final RestTemplate restTemplate = new RestTemplate();
/**
* This method ensures that the database is running; otherwise, the tests are ignored.
*/
@BeforeClass
public static void assumeDatabaseIsUpAndRunning() {
try {
ResponseEntity<String> responseEntity = restTemplate.getForEntity(CouchConstants.COUCHDB_URL, String.class);
assumeTrue(responseEntity.getStatusCode().equals(OK));
log.debug("CouchDB is running on " + CouchConstants.COUCHDB_URL +
" with status " + responseEntity.getStatusCode());
} catch (RestClientException e) {
log.debug("CouchDB is not running on " + CouchConstants.COUCHDB_URL);
assumeNoException(e);
}
}
@Before
public void setUpTestDatabase() throws Exception {
RestTemplate template = new RestTemplate();
template.setErrorHandler(new DefaultResponseErrorHandler() {
@Override
public void handleError(ClientHttpResponse response) throws IOException {
// do nothing, error status will be handled in the switch statement
}
});
ResponseEntity<String> response = template.getForEntity(CouchConstants.TEST_DATABASE_URL, String.class);
HttpStatus statusCode = response.getStatusCode();
switch (statusCode) {
case NOT_FOUND:
createNewTestDatabase();
break;
case OK:
deleteExistingTestDatabase();
createNewTestDatabase();
break;
default:
throw new IllegalStateException("Unsupported http status [" + statusCode + "]");
}
}
private void deleteExistingTestDatabase() {
restTemplate.delete(CouchConstants.TEST_DATABASE_URL);
}
private void createNewTestDatabase() {
restTemplate.put(CouchConstants.TEST_DATABASE_URL, null);
}
/**
* Reads a CouchDB document and converts it to the expected type.
*/
protected <T> T getDocument(String id, Class<T> expectedType) {
String url = CouchConstants.TEST_DATABASE_URL + "{id}";
return restTemplate.getForObject(url, expectedType, id);
}
/**
* Writes a CouchDB document and returns its generated id.
*/
protected String putDocument(Object document) {
HttpHeaders headers = new HttpHeaders();
headers.setContentType(MediaType.APPLICATION_JSON);
HttpEntity request = new HttpEntity(document, headers);
String id = UUID.randomUUID().toString();
restTemplate.put(CouchConstants.TEST_DATABASE_URL + "{id}", request, id);
return id;
}
}
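A hypothetical round-trip test sketch using the protected putDocument(…)/getDocument(…) helpers above; it reuses the DummyDocument fixture shown earlier in this comparison and is not part of the changeset itself.

package org.springframework.data.document.couchdb.core;

import org.junit.Assert;
import org.junit.Test;
import org.springframework.data.document.couchdb.DummyDocument;

public class RawRestRoundTripSketch extends AbstractCouchTemplateIntegrationTests {

    @Test
    public void putAndGetDocument() {
        // Store a document over plain REST and read it back by its generated id.
        String id = putDocument(new DummyDocument("hello"));
        DummyDocument stored = getDocument(id, DummyDocument.class);
        Assert.assertEquals("hello", stored.getMessage());
    }
}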

View File

@@ -1,27 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.core;
public abstract class CouchConstants {
public static final String COUCHDB_URL = "http://127.0.0.1:5984/";
public static final String TEST_DATABASE_URL = COUCHDB_URL + "si_couchdb_test/";
}

View File

@@ -1,39 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.core;
import java.util.UUID;
import junit.framework.Assert;
import org.junit.Ignore;
import org.junit.Test;
import org.springframework.data.document.couchdb.DummyDocument;
public class CouchTemplateIntegrationTests extends AbstractCouchTemplateIntegrationTests {
@Test
@Ignore("until CI has couch server running")
public void saveAndFindTest() {
CouchTemplate template = new CouchTemplate(CouchConstants.TEST_DATABASE_URL);
DummyDocument document = new DummyDocument("hello");
String id = UUID.randomUUID().toString();
template.save(id, document);
DummyDocument foundDocument = template.findOne(id, DummyDocument.class);
Assert.assertEquals(document.getMessage(), foundDocument.getMessage());
}
}

View File

@@ -1,13 +0,0 @@
log4j.rootCategory=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - <%m>%n
log4j.category.org.apache.activemq=ERROR
log4j.category.org.springframework.batch=DEBUG
log4j.category.org.springframework.transaction=INFO
log4j.category.org.hibernate.SQL=DEBUG
# for debugging datasource initialization
# log4j.category.test.jdbc=DEBUG

View File

@@ -1,24 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:p="http://www.springframework.org/schema/p"
xmlns:couch="http://www.springframework.org/schema/data/couch"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/data/couch http://www.springframework.org/schema/data/couch/spring-couch-1.0.xsd
http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.0.xsd">
<couch:jmx/>
<context:mbean-export/>
<bean id="registry" class="org.springframework.remoting.rmi.RmiRegistryFactoryBean"
p:port="1099"/>
<!-- Expose JMX over RMI -->
<bean id="serverConnector" class="org.springframework.jmx.support.ConnectorServerFactoryBean" depends-on="registry"
p:objectName="connector:name=rmi"
p:serviceUrl="service:jmx:rmi://localhost/jndi/rmi://localhost:1099/myconnector"/>
</beans>

View File

@@ -1,24 +0,0 @@
Bundle-SymbolicName: org.springframework.data.couchdb
Bundle-Name: Spring Data CouchDB Support
Bundle-Vendor: SpringSource
Bundle-ManifestVersion: 2
Import-Package:
sun.reflect;version="0";resolution:=optional
Import-Template:
org.springframework.beans.*;version="[3.0.0, 4.0.0)",
org.springframework.core.*;version="[3.0.0, 4.0.0)",
org.springframework.dao.*;version="[3.0.0, 4.0.0)",
org.springframework.http.*;version="[3.0.0, 4.0.0)",
org.springframework.web.*;version="[3.0.0, 4.0.0)",
org.springframework.util.*;version="[3.0.0, 4.0.0)",
org.springframework.context.*;version="[3.0.0, 4.0.0)",
org.springframework.jmx.*;version="[3.0.0, 4.0.0)",
org.springframework.remoting.*;version="[3.0.0, 4.0.0)",
org.springframework.data.core.*;version="[1.0.0, 2.0.0)",
org.springframework.data.document.*;version="[1.0.0, 2.0.0)",
org.codehaus.jackson.*;version="[1.0.0, 2.0.0)";resolution:=optional,
org.aopalliance.*;version="[1.0.0, 2.0.0)";resolution:=optional,
org.apache.commons.logging.*;version="[1.1.1, 2.0.0)",
org.w3c.dom.*;version="0"

View File

@@ -1,79 +0,0 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-document-parent</artifactId>
<version>1.0.0.M2</version>
<relativePath>../spring-data-document-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-document-core</artifactId>
<packaging>jar</packaging>
<name>Spring Data Document Support</name>
<dependencies>
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-web</artifactId>
</dependency>
<dependency>
<groupId>javax.annotation</groupId>
<artifactId>jsr250-api</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-all</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
</dependency>
<!-- Dependencies for web analytics functionality - to be moved into Spring Framework -->
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
<version>2.5</version>
<scope>provided</scope>
</dependency>
<!-- Spring dependencies -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
<version>${org.springframework.version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-webmvc</artifactId>
<version>${org.springframework.version}</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>com.springsource.bundlor</groupId>
<artifactId>com.springsource.bundlor.maven</artifactId>
</plugin>
</plugins>
</build>
</project>

View File

@@ -1,3 +0,0 @@
Manifest-Version: 1.0
Class-Path:

View File

@@ -1,32 +0,0 @@
/*
* Copyright 2010 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document;
public abstract class AbstractDocumentStoreTemplate<C> {
public abstract C getConnection();
public <T> T execute(DocumentStoreConnectionCallback<C, T> action) {
try {
return action.doInConnection(getConnection());
} catch (Exception e) {
throw new UncategorizedDocumentStoreException("Failure executing callback using datastore connection", e);
}
}
}
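An illustrative sketch of the execute(…)/callback contract above. The DocumentStoreConnectionCallback signature is inferred from the doInConnection(…) call in execute(…), and the String-based connection type is purely hypothetical.

package org.springframework.data.document;

public class DocumentStoreTemplateUsageSketch {

    // Hypothetical template whose "connection" is just a String handle.
    static class StringConnectionTemplate extends AbstractDocumentStoreTemplate<String> {
        @Override
        public String getConnection() {
            return "connection-handle";
        }
    }

    public static void main(String[] args) {
        StringConnectionTemplate template = new StringConnectionTemplate();
        // The callback receives the connection and returns a result; execute(...) wraps
        // any exception in UncategorizedDocumentStoreException.
        Integer length = template.execute(new DocumentStoreConnectionCallback<String, Integer>() {
            public Integer doInConnection(String connection) {
                return connection.length();
            }
        });
        System.out.println("Result: " + length);
    }
}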

View File

@@ -1,43 +0,0 @@
package org.springframework.data.document.analytics;
import java.util.Map;
public class ControllerCounter {
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public double getCount() {
return count;
}
public void setCount(double count) {
this.count = count;
}
public Map<String, Double> getMethods() {
return methods;
}
public void setMethods(Map<String, Double> methods) {
this.methods = methods;
}
private String name;
private double count;
private Map<String, Double> methods;
@Override
public String toString() {
return "ControllerCounter [name=" + name + ", count=" + count
+ ", methods=" + methods + "]";
}
}

View File

@@ -1,83 +0,0 @@
package org.springframework.data.document.analytics;
import java.util.Date;
public class MvcEvent {
private String controller;
private String action;
private Parameters parameters;
private Date date;
private String requestUri;
private String requestAddress;
private String remoteUser;
private String view;
public String getController() {
return controller;
}
public void setController(String controller) {
this.controller = controller;
}
public String getAction() {
return action;
}
public void setAction(String action) {
this.action = action;
}
public Parameters getParameters() {
return parameters;
}
public void setParameters(Parameters parameters) {
this.parameters = parameters;
}
public Date getDate() {
return date;
}
public void setDate(Date date) {
this.date = date;
}
public String getRequestUri() {
return requestUri;
}
public void setRequestUri(String requestUri) {
this.requestUri = requestUri;
}
public String getRequestAddress() {
return requestAddress;
}
public void setRequestAddress(String requestAddress) {
this.requestAddress = requestAddress;
}
public String getRemoteUser() {
return remoteUser;
}
public void setRemoteUser(String remoteUser) {
this.remoteUser = remoteUser;
}
//TODO
//Map sessionAttributes
}

View File

@@ -1,35 +0,0 @@
package org.springframework.data.document.analytics;
public class Parameters {
private String p1;
private String p2;
private String p3;
public String getP1() {
return p1;
}
public void setP1(String p1) {
this.p1 = p1;
}
public String getP2() {
return p2;
}
public void setP2(String p2) {
this.p2 = p2;
}
public String getP3() {
return p3;
}
public void setP3(String p3) {
this.p3 = p3;
}
}

View File

@@ -1,821 +0,0 @@
/*
* Copyright 2002-2010 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.web.bind.annotation.support;
import java.lang.annotation.Annotation;
import java.lang.reflect.*;
import java.util.*;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.beans.BeanUtils;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.core.*;
import org.springframework.core.annotation.AnnotationUtils;
import org.springframework.http.*;
import org.springframework.http.converter.HttpMessageConverter;
import org.springframework.ui.ExtendedModelMap;
import org.springframework.ui.Model;
import org.springframework.util.*;
import org.springframework.validation.BindException;
import org.springframework.validation.BindingResult;
import org.springframework.validation.Errors;
import org.springframework.web.HttpMediaTypeNotSupportedException;
import org.springframework.web.bind.WebDataBinder;
import org.springframework.web.bind.annotation.*;
import org.springframework.web.bind.annotation.support.HandlerMethodInvocationException;
import org.springframework.web.bind.annotation.support.HandlerMethodResolver;
import org.springframework.web.bind.support.*;
import org.springframework.web.context.request.NativeWebRequest;
import org.springframework.web.context.request.WebRequest;
import org.springframework.web.multipart.MultipartFile;
import org.springframework.web.multipart.MultipartRequest;
/**
* Support class for invoking an annotated handler method. Operates on the introspection results of a {@link
* HandlerMethodResolver} for a specific handler type.
* <p/>
* <p>Used by {@link org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter} and {@link
* org.springframework.web.portlet.mvc.annotation.AnnotationMethodHandlerAdapter}.
*
* @author Juergen Hoeller
* @author Arjen Poutsma
* @see #invokeHandlerMethod
* @since 2.5.2
*/
public class HandlerMethodInvoker {
private static final String MODEL_KEY_PREFIX_STALE = SessionAttributeStore.class.getName() + ".STALE.";
/**
* We'll create a lot of these objects, so we don't want a new logger every time.
*/
private static final Log logger = LogFactory.getLog(HandlerMethodInvoker.class);
private final HandlerMethodResolver methodResolver;
private final WebBindingInitializer bindingInitializer;
private final SessionAttributeStore sessionAttributeStore;
private final ParameterNameDiscoverer parameterNameDiscoverer;
private final WebArgumentResolver[] customArgumentResolvers;
private final HttpMessageConverter[] messageConverters;
private final SimpleSessionStatus sessionStatus = new SimpleSessionStatus();
public HandlerMethodInvoker(HandlerMethodResolver methodResolver) {
this(methodResolver, null);
}
public HandlerMethodInvoker(HandlerMethodResolver methodResolver, WebBindingInitializer bindingInitializer) {
this(methodResolver, bindingInitializer, new DefaultSessionAttributeStore(), null, null, null);
}
public HandlerMethodInvoker(HandlerMethodResolver methodResolver, WebBindingInitializer bindingInitializer,
SessionAttributeStore sessionAttributeStore, ParameterNameDiscoverer parameterNameDiscoverer,
WebArgumentResolver[] customArgumentResolvers, HttpMessageConverter[] messageConverters) {
this.methodResolver = methodResolver;
this.bindingInitializer = bindingInitializer;
this.sessionAttributeStore = sessionAttributeStore;
this.parameterNameDiscoverer = parameterNameDiscoverer;
this.customArgumentResolvers = customArgumentResolvers;
this.messageConverters = messageConverters;
}
public final Object invokeHandlerMethod(Method handlerMethod, Object handler,
NativeWebRequest webRequest, ExtendedModelMap implicitModel) throws Exception {
Method handlerMethodToInvoke = BridgeMethodResolver.findBridgedMethod(handlerMethod);
try {
boolean debug = logger.isDebugEnabled();
for (String attrName : this.methodResolver.getActualSessionAttributeNames()) {
Object attrValue = this.sessionAttributeStore.retrieveAttribute(webRequest, attrName);
if (attrValue != null) {
implicitModel.addAttribute(attrName, attrValue);
}
}
for (Method attributeMethod : this.methodResolver.getModelAttributeMethods()) {
Method attributeMethodToInvoke = BridgeMethodResolver.findBridgedMethod(attributeMethod);
Object[] args = resolveHandlerArguments(attributeMethodToInvoke, handler, webRequest, implicitModel);
if (debug) {
logger.debug("Invoking model attribute method: " + attributeMethodToInvoke);
}
String attrName = AnnotationUtils.findAnnotation(attributeMethod, ModelAttribute.class).value();
if (!"".equals(attrName) && implicitModel.containsAttribute(attrName)) {
continue;
}
ReflectionUtils.makeAccessible(attributeMethodToInvoke);
Object attrValue = attributeMethodToInvoke.invoke(handler, args);
if ("".equals(attrName)) {
Class resolvedType = GenericTypeResolver.resolveReturnType(attributeMethodToInvoke, handler.getClass());
attrName = Conventions.getVariableNameForReturnType(attributeMethodToInvoke, resolvedType, attrValue);
}
if (!implicitModel.containsAttribute(attrName)) {
implicitModel.addAttribute(attrName, attrValue);
}
}
Object[] args = resolveHandlerArguments(handlerMethodToInvoke, handler, webRequest, implicitModel);
if (debug) {
logger.debug("Invoking request handler method: " + handlerMethodToInvoke);
}
ReflectionUtils.makeAccessible(handlerMethodToInvoke);
return handlerMethodToInvoke.invoke(handler, args);
} catch (IllegalStateException ex) {
// Internal assertion failed (e.g. invalid signature):
// throw exception with full handler method context...
throw new HandlerMethodInvocationException(handlerMethodToInvoke, ex);
} catch (InvocationTargetException ex) {
// User-defined @ModelAttribute/@InitBinder/@RequestMapping method threw an exception...
ReflectionUtils.rethrowException(ex.getTargetException());
return null;
}
}
public final void updateModelAttributes(Object handler, Map<String, Object> mavModel,
ExtendedModelMap implicitModel, NativeWebRequest webRequest) throws Exception {
if (this.methodResolver.hasSessionAttributes() && this.sessionStatus.isComplete()) {
for (String attrName : this.methodResolver.getActualSessionAttributeNames()) {
this.sessionAttributeStore.cleanupAttribute(webRequest, attrName);
}
}
// Expose model attributes as session attributes, if required.
// Expose BindingResults for all attributes, making custom editors available.
Map<String, Object> model = (mavModel != null ? mavModel : implicitModel);
if (model != null) {
try {
String[] originalAttrNames = model.keySet().toArray(new String[model.size()]);
for (String attrName : originalAttrNames) {
Object attrValue = model.get(attrName);
boolean isSessionAttr = this.methodResolver.isSessionAttribute(
attrName, (attrValue != null ? attrValue.getClass() : null));
if (isSessionAttr) {
if (this.sessionStatus.isComplete()) {
implicitModel.put(MODEL_KEY_PREFIX_STALE + attrName, Boolean.TRUE);
} else if (!implicitModel.containsKey(MODEL_KEY_PREFIX_STALE + attrName)) {
this.sessionAttributeStore.storeAttribute(webRequest, attrName, attrValue);
}
}
if (!attrName.startsWith(BindingResult.MODEL_KEY_PREFIX) &&
(isSessionAttr || isBindingCandidate(attrValue))) {
String bindingResultKey = BindingResult.MODEL_KEY_PREFIX + attrName;
if (mavModel != null && !model.containsKey(bindingResultKey)) {
WebDataBinder binder = createBinder(webRequest, attrValue, attrName);
initBinder(handler, attrName, binder, webRequest);
mavModel.put(bindingResultKey, binder.getBindingResult());
}
}
}
} catch (InvocationTargetException ex) {
// User-defined @InitBinder method threw an exception...
ReflectionUtils.rethrowException(ex.getTargetException());
}
}
}
private Object[] resolveHandlerArguments(Method handlerMethod, Object handler,
NativeWebRequest webRequest, ExtendedModelMap implicitModel) throws Exception {
Class[] paramTypes = handlerMethod.getParameterTypes();
Object[] args = new Object[paramTypes.length];
for (int i = 0; i < args.length; i++) {
MethodParameter methodParam = new MethodParameter(handlerMethod, i);
methodParam.initParameterNameDiscovery(this.parameterNameDiscoverer);
GenericTypeResolver.resolveParameterType(methodParam, handler.getClass());
String paramName = null;
String headerName = null;
boolean requestBodyFound = false;
String cookieName = null;
String pathVarName = null;
String attrName = null;
boolean required = false;
String defaultValue = null;
boolean validate = false;
int annotationsFound = 0;
Annotation[] paramAnns = methodParam.getParameterAnnotations();
for (Annotation paramAnn : paramAnns) {
if (RequestParam.class.isInstance(paramAnn)) {
RequestParam requestParam = (RequestParam) paramAnn;
paramName = requestParam.value();
required = requestParam.required();
defaultValue = parseDefaultValueAttribute(requestParam.defaultValue());
annotationsFound++;
} else if (RequestHeader.class.isInstance(paramAnn)) {
RequestHeader requestHeader = (RequestHeader) paramAnn;
headerName = requestHeader.value();
required = requestHeader.required();
defaultValue = parseDefaultValueAttribute(requestHeader.defaultValue());
annotationsFound++;
} else if (RequestBody.class.isInstance(paramAnn)) {
requestBodyFound = true;
annotationsFound++;
} else if (CookieValue.class.isInstance(paramAnn)) {
CookieValue cookieValue = (CookieValue) paramAnn;
cookieName = cookieValue.value();
required = cookieValue.required();
defaultValue = parseDefaultValueAttribute(cookieValue.defaultValue());
annotationsFound++;
} else if (PathVariable.class.isInstance(paramAnn)) {
PathVariable pathVar = (PathVariable) paramAnn;
pathVarName = pathVar.value();
annotationsFound++;
} else if (ModelAttribute.class.isInstance(paramAnn)) {
ModelAttribute attr = (ModelAttribute) paramAnn;
attrName = attr.value();
annotationsFound++;
} else if (Value.class.isInstance(paramAnn)) {
defaultValue = ((Value) paramAnn).value();
} else if ("Valid".equals(paramAnn.annotationType().getSimpleName())) {
validate = true;
}
}
if (annotationsFound > 1) {
throw new IllegalStateException("Handler parameter annotations are exclusive choices - " +
"do not specify more than one such annotation on the same parameter: " + handlerMethod);
}
if (annotationsFound == 0) {
Object argValue = resolveCommonArgument(methodParam, webRequest);
if (argValue != WebArgumentResolver.UNRESOLVED) {
args[i] = argValue;
} else if (defaultValue != null) {
args[i] = resolveDefaultValue(defaultValue);
} else {
Class paramType = methodParam.getParameterType();
if (Model.class.isAssignableFrom(paramType) || Map.class.isAssignableFrom(paramType)) {
args[i] = implicitModel;
} else if (SessionStatus.class.isAssignableFrom(paramType)) {
args[i] = this.sessionStatus;
} else if (HttpEntity.class.isAssignableFrom(paramType)) {
args[i] = resolveHttpEntityRequest(methodParam, webRequest);
} else if (Errors.class.isAssignableFrom(paramType)) {
throw new IllegalStateException("Errors/BindingResult argument declared " +
"without preceding model attribute. Check your handler method signature!");
} else if (BeanUtils.isSimpleProperty(paramType)) {
paramName = "";
} else {
attrName = "";
}
}
}
if (paramName != null) {
args[i] = resolveRequestParam(paramName, required, defaultValue, methodParam, webRequest, handler);
} else if (headerName != null) {
args[i] = resolveRequestHeader(headerName, required, defaultValue, methodParam, webRequest, handler);
} else if (requestBodyFound) {
args[i] = resolveRequestBody(methodParam, webRequest, handler);
} else if (cookieName != null) {
args[i] = resolveCookieValue(cookieName, required, defaultValue, methodParam, webRequest, handler);
} else if (pathVarName != null) {
args[i] = resolvePathVariable(pathVarName, methodParam, webRequest, handler);
} else if (attrName != null) {
WebDataBinder binder =
resolveModelAttribute(attrName, methodParam, implicitModel, webRequest, handler);
boolean assignBindingResult = (args.length > i + 1 && Errors.class.isAssignableFrom(paramTypes[i + 1]));
if (binder.getTarget() != null) {
doBind(binder, webRequest, validate, !assignBindingResult);
}
args[i] = binder.getTarget();
if (assignBindingResult) {
args[i + 1] = binder.getBindingResult();
i++;
}
implicitModel.putAll(binder.getBindingResult().getModel());
}
}
return args;
}
protected void initBinder(Object handler, String attrName, WebDataBinder binder, NativeWebRequest webRequest)
throws Exception {
if (this.bindingInitializer != null) {
this.bindingInitializer.initBinder(binder, webRequest);
}
if (handler != null) {
Set<Method> initBinderMethods = this.methodResolver.getInitBinderMethods();
if (!initBinderMethods.isEmpty()) {
boolean debug = logger.isDebugEnabled();
for (Method initBinderMethod : initBinderMethods) {
Method methodToInvoke = BridgeMethodResolver.findBridgedMethod(initBinderMethod);
String[] targetNames = AnnotationUtils.findAnnotation(initBinderMethod, InitBinder.class).value();
if (targetNames.length == 0 || Arrays.asList(targetNames).contains(attrName)) {
Object[] initBinderArgs =
resolveInitBinderArguments(handler, methodToInvoke, binder, webRequest);
if (debug) {
logger.debug("Invoking init-binder method: " + methodToInvoke);
}
ReflectionUtils.makeAccessible(methodToInvoke);
Object returnValue = methodToInvoke.invoke(handler, initBinderArgs);
if (returnValue != null) {
throw new IllegalStateException(
"InitBinder methods must not have a return value: " + methodToInvoke);
}
}
}
}
}
}
private Object[] resolveInitBinderArguments(Object handler, Method initBinderMethod,
WebDataBinder binder, NativeWebRequest webRequest) throws Exception {
Class[] initBinderParams = initBinderMethod.getParameterTypes();
Object[] initBinderArgs = new Object[initBinderParams.length];
for (int i = 0; i < initBinderArgs.length; i++) {
MethodParameter methodParam = new MethodParameter(initBinderMethod, i);
methodParam.initParameterNameDiscovery(this.parameterNameDiscoverer);
GenericTypeResolver.resolveParameterType(methodParam, handler.getClass());
String paramName = null;
boolean paramRequired = false;
String paramDefaultValue = null;
String pathVarName = null;
Annotation[] paramAnns = methodParam.getParameterAnnotations();
for (Annotation paramAnn : paramAnns) {
if (RequestParam.class.isInstance(paramAnn)) {
RequestParam requestParam = (RequestParam) paramAnn;
paramName = requestParam.value();
paramRequired = requestParam.required();
paramDefaultValue = parseDefaultValueAttribute(requestParam.defaultValue());
break;
} else if (ModelAttribute.class.isInstance(paramAnn)) {
throw new IllegalStateException(
"@ModelAttribute is not supported on @InitBinder methods: " + initBinderMethod);
} else if (PathVariable.class.isInstance(paramAnn)) {
PathVariable pathVar = (PathVariable) paramAnn;
pathVarName = pathVar.value();
}
}
if (paramName == null && pathVarName == null) {
Object argValue = resolveCommonArgument(methodParam, webRequest);
if (argValue != WebArgumentResolver.UNRESOLVED) {
initBinderArgs[i] = argValue;
} else {
Class paramType = initBinderParams[i];
if (paramType.isInstance(binder)) {
initBinderArgs[i] = binder;
} else if (BeanUtils.isSimpleProperty(paramType)) {
paramName = "";
} else {
throw new IllegalStateException("Unsupported argument [" + paramType.getName() +
"] for @InitBinder method: " + initBinderMethod);
}
}
}
if (paramName != null) {
initBinderArgs[i] =
resolveRequestParam(paramName, paramRequired, paramDefaultValue, methodParam, webRequest, null);
} else if (pathVarName != null) {
initBinderArgs[i] = resolvePathVariable(pathVarName, methodParam, webRequest, null);
}
}
return initBinderArgs;
}
@SuppressWarnings("unchecked")
private Object resolveRequestParam(String paramName, boolean required, String defaultValue,
MethodParameter methodParam, NativeWebRequest webRequest, Object handlerForInitBinderCall)
throws Exception {
Class<?> paramType = methodParam.getParameterType();
if (Map.class.isAssignableFrom(paramType) && paramName.length() == 0) {
return resolveRequestParamMap((Class<? extends Map>) paramType, webRequest);
}
if (paramName.length() == 0) {
paramName = getRequiredParameterName(methodParam);
}
Object paramValue = null;
MultipartRequest multipartRequest = webRequest.getNativeRequest(MultipartRequest.class);
if (multipartRequest != null) {
List<MultipartFile> files = multipartRequest.getFiles(paramName);
if (!files.isEmpty()) {
if (files.size() == 1 && !paramType.isArray() && !Collection.class.isAssignableFrom(paramType)) {
paramValue = files.get(0);
} else {
paramValue = files;
}
}
}
if (paramValue == null) {
String[] paramValues = webRequest.getParameterValues(paramName);
if (paramValues != null) {
if (paramValues.length == 1 && !paramType.isArray() && !Collection.class.isAssignableFrom(paramType)) {
paramValue = paramValues[0];
} else {
paramValue = paramValues;
}
}
}
if (paramValue == null) {
if (defaultValue != null) {
paramValue = resolveDefaultValue(defaultValue);
} else if (required) {
raiseMissingParameterException(paramName, paramType);
}
paramValue = checkValue(paramName, paramValue, paramType);
}
WebDataBinder binder = createBinder(webRequest, null, paramName);
initBinder(handlerForInitBinderCall, paramName, binder, webRequest);
return binder.convertIfNecessary(paramValue, paramType, methodParam);
}
private Map resolveRequestParamMap(Class<? extends Map> mapType, NativeWebRequest webRequest) {
Map<String, String[]> parameterMap = webRequest.getParameterMap();
if (MultiValueMap.class.isAssignableFrom(mapType)) {
MultiValueMap<String, String> result = new LinkedMultiValueMap<String, String>(parameterMap.size());
for (Map.Entry<String, String[]> entry : parameterMap.entrySet()) {
for (String value : entry.getValue()) {
result.add(entry.getKey(), value);
}
}
return result;
} else {
Map<String, String> result = new LinkedHashMap<String, String>(parameterMap.size());
for (Map.Entry<String, String[]> entry : parameterMap.entrySet()) {
if (entry.getValue().length > 0) {
result.put(entry.getKey(), entry.getValue()[0]);
}
}
return result;
}
}
@SuppressWarnings("unchecked")
private Object resolveRequestHeader(String headerName, boolean required, String defaultValue,
MethodParameter methodParam, NativeWebRequest webRequest, Object handlerForInitBinderCall)
throws Exception {
Class<?> paramType = methodParam.getParameterType();
if (Map.class.isAssignableFrom(paramType)) {
return resolveRequestHeaderMap((Class<? extends Map>) paramType, webRequest);
}
if (headerName.length() == 0) {
headerName = getRequiredParameterName(methodParam);
}
Object headerValue = null;
String[] headerValues = webRequest.getHeaderValues(headerName);
if (headerValues != null) {
headerValue = (headerValues.length == 1 ? headerValues[0] : headerValues);
}
if (headerValue == null) {
if (defaultValue != null) {
headerValue = resolveDefaultValue(defaultValue);
} else if (required) {
raiseMissingHeaderException(headerName, paramType);
}
headerValue = checkValue(headerName, headerValue, paramType);
}
WebDataBinder binder = createBinder(webRequest, null, headerName);
initBinder(handlerForInitBinderCall, headerName, binder, webRequest);
return binder.convertIfNecessary(headerValue, paramType, methodParam);
}
private Map resolveRequestHeaderMap(Class<? extends Map> mapType, NativeWebRequest webRequest) {
if (MultiValueMap.class.isAssignableFrom(mapType)) {
MultiValueMap<String, String> result;
if (HttpHeaders.class.isAssignableFrom(mapType)) {
result = new HttpHeaders();
} else {
result = new LinkedMultiValueMap<String, String>();
}
for (Iterator<String> iterator = webRequest.getHeaderNames(); iterator.hasNext();) {
String headerName = iterator.next();
for (String headerValue : webRequest.getHeaderValues(headerName)) {
result.add(headerName, headerValue);
}
}
return result;
} else {
Map<String, String> result = new LinkedHashMap<String, String>();
for (Iterator<String> iterator = webRequest.getHeaderNames(); iterator.hasNext();) {
String headerName = iterator.next();
String headerValue = webRequest.getHeader(headerName);
result.put(headerName, headerValue);
}
return result;
}
}
/**
* Resolves the given {@link RequestBody @RequestBody} annotation.
*/
protected Object resolveRequestBody(MethodParameter methodParam, NativeWebRequest webRequest, Object handler)
throws Exception {
return readWithMessageConverters(methodParam, createHttpInputMessage(webRequest), methodParam.getParameterType());
}
private HttpEntity resolveHttpEntityRequest(MethodParameter methodParam, NativeWebRequest webRequest)
throws Exception {
HttpInputMessage inputMessage = createHttpInputMessage(webRequest);
Class<?> paramType = getHttpEntityType(methodParam);
Object body = readWithMessageConverters(methodParam, inputMessage, paramType);
return new HttpEntity<Object>(body, inputMessage.getHeaders());
}
private Object readWithMessageConverters(MethodParameter methodParam, HttpInputMessage inputMessage, Class paramType)
throws Exception {
MediaType contentType = inputMessage.getHeaders().getContentType();
if (contentType == null) {
StringBuilder builder = new StringBuilder(ClassUtils.getShortName(methodParam.getParameterType()));
String paramName = methodParam.getParameterName();
if (paramName != null) {
builder.append(' ');
builder.append(paramName);
}
throw new HttpMediaTypeNotSupportedException(
"Cannot extract parameter (" + builder.toString() + "): no Content-Type found");
}
List<MediaType> allSupportedMediaTypes = new ArrayList<MediaType>();
if (this.messageConverters != null) {
for (HttpMessageConverter<?> messageConverter : this.messageConverters) {
allSupportedMediaTypes.addAll(messageConverter.getSupportedMediaTypes());
if (messageConverter.canRead(paramType, contentType)) {
if (logger.isDebugEnabled()) {
logger.debug("Reading [" + paramType.getName() + "] as \"" + contentType
+ "\" using [" + messageConverter + "]");
}
return messageConverter.read(paramType, inputMessage);
}
}
}
throw new HttpMediaTypeNotSupportedException(contentType, allSupportedMediaTypes);
}
private Class<?> getHttpEntityType(MethodParameter methodParam) {
Assert.isAssignable(HttpEntity.class, methodParam.getParameterType());
ParameterizedType type = (ParameterizedType) methodParam.getGenericParameterType();
if (type.getActualTypeArguments().length == 1) {
Type typeArgument = type.getActualTypeArguments()[0];
if (typeArgument instanceof Class) {
return (Class<?>) typeArgument;
} else if (typeArgument instanceof GenericArrayType) {
Type componentType = ((GenericArrayType) typeArgument).getGenericComponentType();
if (componentType instanceof Class) {
// Surely, there should be a nicer way to do this
Object array = Array.newInstance((Class<?>) componentType, 0);
return array.getClass();
}
}
}
throw new IllegalArgumentException(
"HttpEntity parameter (" + methodParam.getParameterName() + ") is not parameterized");
}
private Object resolveCookieValue(String cookieName, boolean required, String defaultValue,
MethodParameter methodParam, NativeWebRequest webRequest, Object handlerForInitBinderCall)
throws Exception {
Class<?> paramType = methodParam.getParameterType();
if (cookieName.length() == 0) {
cookieName = getRequiredParameterName(methodParam);
}
Object cookieValue = resolveCookieValue(cookieName, paramType, webRequest);
if (cookieValue == null) {
if (defaultValue != null) {
cookieValue = resolveDefaultValue(defaultValue);
} else if (required) {
raiseMissingCookieException(cookieName, paramType);
}
cookieValue = checkValue(cookieName, cookieValue, paramType);
}
WebDataBinder binder = createBinder(webRequest, null, cookieName);
initBinder(handlerForInitBinderCall, cookieName, binder, webRequest);
return binder.convertIfNecessary(cookieValue, paramType, methodParam);
}
/**
* Resolves the given {@link CookieValue @CookieValue} annotation.
* <p>Throws an UnsupportedOperationException by default.
*/
protected Object resolveCookieValue(String cookieName, Class paramType, NativeWebRequest webRequest)
throws Exception {
throw new UnsupportedOperationException("@CookieValue not supported");
}
private Object resolvePathVariable(String pathVarName, MethodParameter methodParam,
NativeWebRequest webRequest, Object handlerForInitBinderCall) throws Exception {
Class<?> paramType = methodParam.getParameterType();
if (pathVarName.length() == 0) {
pathVarName = getRequiredParameterName(methodParam);
}
String pathVarValue = resolvePathVariable(pathVarName, paramType, webRequest);
WebDataBinder binder = createBinder(webRequest, null, pathVarName);
initBinder(handlerForInitBinderCall, pathVarName, binder, webRequest);
return binder.convertIfNecessary(pathVarValue, paramType, methodParam);
}
/**
* Resolves the given {@link PathVariable @PathVariable} annotation.
* <p>Throws an UnsupportedOperationException by default.
*/
protected String resolvePathVariable(String pathVarName, Class paramType, NativeWebRequest webRequest)
throws Exception {
throw new UnsupportedOperationException("@PathVariable not supported");
}
private String getRequiredParameterName(MethodParameter methodParam) {
String name = methodParam.getParameterName();
if (name == null) {
throw new IllegalStateException(
"No parameter name specified for argument of type [" + methodParam.getParameterType().getName() +
"], and no parameter name information found in class file either.");
}
return name;
}
private Object checkValue(String name, Object value, Class paramType) {
if (value == null) {
if (boolean.class.equals(paramType)) {
return Boolean.FALSE;
} else if (paramType.isPrimitive()) {
throw new IllegalStateException("Optional " + paramType + " parameter '" + name +
"' is not present but cannot be translated into a null value due to being declared as a " +
"primitive type. Consider declaring it as object wrapper for the corresponding primitive type.");
}
}
return value;
}
private WebDataBinder resolveModelAttribute(String attrName, MethodParameter methodParam,
ExtendedModelMap implicitModel, NativeWebRequest webRequest, Object handler) throws Exception {
// Bind request parameter onto object...
String name = attrName;
if ("".equals(name)) {
name = Conventions.getVariableNameForParameter(methodParam);
}
Class<?> paramType = methodParam.getParameterType();
Object bindObject;
if (implicitModel.containsKey(name)) {
bindObject = implicitModel.get(name);
} else if (this.methodResolver.isSessionAttribute(name, paramType)) {
bindObject = this.sessionAttributeStore.retrieveAttribute(webRequest, name);
if (bindObject == null) {
raiseSessionRequiredException("Session attribute '" + name + "' required - not found in session");
}
} else {
bindObject = BeanUtils.instantiateClass(paramType);
}
WebDataBinder binder = createBinder(webRequest, bindObject, name);
initBinder(handler, name, binder, webRequest);
return binder;
}
/**
* Determine whether the given value qualifies as a "binding candidate", i.e. might potentially be subject to
* bean-style data binding later on.
*/
protected boolean isBindingCandidate(Object value) {
return (value != null && !value.getClass().isArray() && !(value instanceof Collection) &&
!(value instanceof Map) && !BeanUtils.isSimpleValueType(value.getClass()));
}
protected void raiseMissingParameterException(String paramName, Class paramType) throws Exception {
throw new IllegalStateException("Missing parameter '" + paramName + "' of type [" + paramType.getName() + "]");
}
protected void raiseMissingHeaderException(String headerName, Class paramType) throws Exception {
throw new IllegalStateException("Missing header '" + headerName + "' of type [" + paramType.getName() + "]");
}
protected void raiseMissingCookieException(String cookieName, Class paramType) throws Exception {
throw new IllegalStateException(
"Missing cookie value '" + cookieName + "' of type [" + paramType.getName() + "]");
}
protected void raiseSessionRequiredException(String message) throws Exception {
throw new IllegalStateException(message);
}
protected WebDataBinder createBinder(NativeWebRequest webRequest, Object target, String objectName)
throws Exception {
return new WebRequestDataBinder(target, objectName);
}
private void doBind(WebDataBinder binder, NativeWebRequest webRequest, boolean validate, boolean failOnErrors)
throws Exception {
doBind(binder, webRequest);
if (validate) {
binder.validate();
}
if (failOnErrors && binder.getBindingResult().hasErrors()) {
throw new BindException(binder.getBindingResult());
}
}
protected void doBind(WebDataBinder binder, NativeWebRequest webRequest) throws Exception {
((WebRequestDataBinder) binder).bind(webRequest);
}
/**
* Return a {@link HttpInputMessage} for the given {@link NativeWebRequest}.
* <p>Throws an UnsupportedOperationException by default.
*/
protected HttpInputMessage createHttpInputMessage(NativeWebRequest webRequest) throws Exception {
throw new UnsupportedOperationException("@RequestBody not supported");
}
/**
* Return a {@link HttpOutputMessage} for the given {@link NativeWebRequest}.
* <p>Throws an UnsupportedOperationException by default.
*/
protected HttpOutputMessage createHttpOutputMessage(NativeWebRequest webRequest) throws Exception {
throw new UnsupportedOperationException("@ResponseBody not supported");
}
protected String parseDefaultValueAttribute(String value) {
return (ValueConstants.DEFAULT_NONE.equals(value) ? null : value);
}
protected Object resolveDefaultValue(String value) {
return value;
}
protected Object resolveCommonArgument(MethodParameter methodParameter, NativeWebRequest webRequest)
throws Exception {
// Invoke custom argument resolvers if present...
if (this.customArgumentResolvers != null) {
for (WebArgumentResolver argumentResolver : this.customArgumentResolvers) {
Object value = argumentResolver.resolveArgument(methodParameter, webRequest);
if (value != WebArgumentResolver.UNRESOLVED) {
return value;
}
}
}
// Resolution of standard parameter types...
Class paramType = methodParameter.getParameterType();
Object value = resolveStandardArgument(paramType, webRequest);
if (value != WebArgumentResolver.UNRESOLVED && !ClassUtils.isAssignableValue(paramType, value)) {
throw new IllegalStateException("Standard argument type [" + paramType.getName() +
"] resolved to incompatible value of type [" + (value != null ? value.getClass() : null) +
"]. Consider declaring the argument type in a less specific fashion.");
}
return value;
}
protected Object resolveStandardArgument(Class<?> parameterType, NativeWebRequest webRequest) throws Exception {
if (WebRequest.class.isAssignableFrom(parameterType)) {
return webRequest;
}
return WebArgumentResolver.UNRESOLVED;
}
protected final void addReturnValueAsModelAttribute(Method handlerMethod, Class handlerType,
Object returnValue, ExtendedModelMap implicitModel) {
ModelAttribute attr = AnnotationUtils.findAnnotation(handlerMethod, ModelAttribute.class);
String attrName = (attr != null ? attr.value() : "");
if ("".equals(attrName)) {
Class resolvedType = GenericTypeResolver.resolveReturnType(handlerMethod, handlerType);
attrName = Conventions.getVariableNameForReturnType(handlerMethod, resolvedType, returnValue);
}
implicitModel.addAttribute(attrName, returnValue);
}
}
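For orientation, a hypothetical controller method of the shape this invoker handles: resolveHandlerArguments(…) above binds the @RequestParam value, instantiates and binds the @ModelAttribute target, assigns the trailing BindingResult, and passes in the implicit model. None of the names below appear in the changeset.

import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.validation.BindingResult;
import org.springframework.web.bind.annotation.ModelAttribute;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;

@Controller
public class OrderControllerSketch {

    // Simple form-backing object; purely illustrative.
    public static class OrderForm {
        private String item;
        public String getItem() { return item; }
        public void setItem(String item) { this.item = item; }
    }

    @RequestMapping("/orders/new")
    public String create(@RequestParam(value = "customer", required = false) String customer,
            @ModelAttribute("order") OrderForm order, BindingResult result, Model model) {
        // Request parameters are bound onto 'order'; binding errors land in 'result'.
        model.addAttribute("customer", customer);
        return result.hasErrors() ? "orders/form" : "orders/created";
    }
}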

View File

@@ -1,58 +0,0 @@
/*
* Copyright 2002-2010 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.web.servlet;
import java.util.Arrays;
import org.springframework.web.servlet.ModelAndView;
public class ActionExecutedContext extends ActionExecutingContext {
private ModelAndView modelAndView;
private Exception exception;
public ActionExecutedContext(ActionExecutingContext actionExecutingContext, ModelAndView modelAndView, Exception exception) {
super(actionExecutingContext.getServletWebRequest(), actionExecutingContext.getHandler(),
actionExecutingContext.getHandlerMethod(), actionExecutingContext.getHandlerParameters(),
actionExecutingContext.getImplicitModel());
this.modelAndView = modelAndView;
this.exception = exception;
}
@Override
public String toString() {
return "ActionExecutedContext [handler=" + getHandler()
+ ", servletWebRequest=" + getServletWebRequest()
+ ", implicitModel=" + getImplicitModel() + ", handlerMethod="
+ getHandlerMethod() + ", handlerParameters="
+ Arrays.toString(getHandlerParameters()) + ",modelAndView=" + modelAndView
+ ", exception=" + exception + "]";
}
public ModelAndView getModelAndView() {
return modelAndView;
}
public Exception getException() {
return exception;
}
}

View File

@@ -1,89 +0,0 @@
/*
* Copyright 2002-2010 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.web.servlet;
import java.lang.reflect.Method;
import java.util.Arrays;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.springframework.ui.ExtendedModelMap;
import org.springframework.web.context.request.ServletWebRequest;
public class ActionExecutingContext {
private Object handler;
private ServletWebRequest servletWebRequest;
private ExtendedModelMap implicitModel;
private Method handlerMethod;
private Object[] handlerParameters;
public ActionExecutingContext(ServletWebRequest servletWebRequest,
Object handler, Method handlerMethod, Object[] handlerParameters,
ExtendedModelMap implicitModel) {
super();
this.servletWebRequest = servletWebRequest;
this.handler = handler;
this.handlerMethod = handlerMethod;
this.handlerParameters = handlerParameters;
this.implicitModel = implicitModel;
}
public HttpServletRequest getHttpServletRequest() {
return servletWebRequest.getRequest();
}
public HttpServletResponse getHttpServletResponse() {
return servletWebRequest.getResponse();
}
public Object getHandler() {
return handler;
}
public ServletWebRequest getServletWebRequest() {
return servletWebRequest;
}
public ExtendedModelMap getImplicitModel() {
return implicitModel;
}
public Method getHandlerMethod() {
return handlerMethod;
}
public Object[] getHandlerParameters() {
return handlerParameters;
}
@Override
public String toString() {
return "ActionExecutingContext [handler=" + handler
+ ", servletWebRequest=" + servletWebRequest
+ ", implicitModel=" + implicitModel + ", handlerMethod="
+ handlerMethod + ", handlerParameters="
+ Arrays.toString(handlerParameters) + "]";
}
}

View File

@@ -1,25 +0,0 @@
/*
* Copyright 2002-2010 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.web.servlet;
public interface ActionInterceptor {
boolean preHandle(ActionExecutingContext actionExecutingContext);
void postHandle(ActionExecutedContext actionExecutedContext);
void afterCompletion(ActionExecutedContext actionExecutedContext);
}
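A minimal, hypothetical implementation of the ActionInterceptor contract above, only to show the intended lifecycle: preHandle runs before the handler method (returning true lets it proceed), postHandle after it returns, and afterCompletion once the request is done.

package org.springframework.data.document.web.servlet;

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;

public class LoggingActionInterceptor implements ActionInterceptor {

    private static final Log log = LogFactory.getLog(LoggingActionInterceptor.class);

    public boolean preHandle(ActionExecutingContext actionExecutingContext) {
        log.debug("About to invoke " + actionExecutingContext.getHandlerMethod());
        return true;
    }

    public void postHandle(ActionExecutedContext actionExecutedContext) {
        log.debug("Handler produced " + actionExecutedContext.getModelAndView());
    }

    public void afterCompletion(ActionExecutedContext actionExecutedContext) {
        if (actionExecutedContext.getException() != null) {
            log.debug("Handler failed", actionExecutedContext.getException());
        }
    }
}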

View File

@@ -1,141 +0,0 @@
/*
* Copyright 2002-2010 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.web.servlet.mvc.annotation;
import java.util.Iterator;
import java.util.List;
import javax.servlet.http.HttpServletRequest;
import org.springframework.http.MediaType;
import org.springframework.util.ObjectUtils;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.util.WebUtils;
/**
* Helper class for annotation-based request mapping.
*
* @author Juergen Hoeller
* @author Arjen Poutsma
* @since 2.5.2
*/
abstract class ServletAnnotationMappingUtils {
/**
* Check whether the given request matches the specified request methods.
*
* @param methods the HTTP request methods to check against
* @param request the current HTTP request to check
*/
public static boolean checkRequestMethod(RequestMethod[] methods, HttpServletRequest request) {
if (ObjectUtils.isEmpty(methods)) {
return true;
}
for (RequestMethod method : methods) {
if (method.name().equals(request.getMethod())) {
return true;
}
}
return false;
}
/**
* Check whether the given request matches the specified parameter conditions.
*
* @param params the parameter conditions, following
* {@link org.springframework.web.bind.annotation.RequestMapping#params() RequestMapping.params()}
* @param request the current HTTP request to check
*/
public static boolean checkParameters(String[] params, HttpServletRequest request) {
if (!ObjectUtils.isEmpty(params)) {
for (String param : params) {
int separator = param.indexOf('=');
if (separator == -1) {
if (param.startsWith("!")) {
if (WebUtils.hasSubmitParameter(request, param.substring(1))) {
return false;
}
} else if (!WebUtils.hasSubmitParameter(request, param)) {
return false;
}
} else {
boolean negated = separator > 0 && param.charAt(separator - 1) == '!';
String key = !negated ? param.substring(0, separator) : param.substring(0, separator - 1);
String value = param.substring(separator + 1);
if (!value.equals(request.getParameter(key))) {
return negated;
}
}
}
}
return true;
}
/**
* Check whether the given request matches the specified header conditions.
*
* @param headers the header conditions, following
* {@link org.springframework.web.bind.annotation.RequestMapping#headers() RequestMapping.headers()}
* @param request the current HTTP request to check
*/
public static boolean checkHeaders(String[] headers, HttpServletRequest request) {
if (!ObjectUtils.isEmpty(headers)) {
for (String header : headers) {
int separator = header.indexOf('=');
if (separator == -1) {
if (header.startsWith("!")) {
if (request.getHeader(header.substring(1)) != null) {
return false;
}
} else if (request.getHeader(header) == null) {
return false;
}
} else {
boolean negated = separator > 0 && header.charAt(separator - 1) == '!';
String key = !negated ? header.substring(0, separator) : header.substring(0, separator - 1);
String value = header.substring(separator + 1);
if (isMediaTypeHeader(key)) {
List<MediaType> requestMediaTypes = MediaType.parseMediaTypes(request.getHeader(key));
List<MediaType> valueMediaTypes = MediaType.parseMediaTypes(value);
boolean found = false;
for (Iterator<MediaType> valIter = valueMediaTypes.iterator(); valIter.hasNext() && !found;) {
MediaType valueMediaType = valIter.next();
for (Iterator<MediaType> reqIter = requestMediaTypes.iterator();
reqIter.hasNext() && !found;) {
MediaType requestMediaType = reqIter.next();
if (valueMediaType.includes(requestMediaType)) {
found = true;
}
}
}
if (!found) {
return negated;
}
} else if (!value.equals(request.getHeader(key))) {
return negated;
}
}
}
}
return true;
}
private static boolean isMediaTypeHeader(String headerName) {
return "Accept".equalsIgnoreCase(headerName) || "Content-Type".equalsIgnoreCase(headerName);
}
}
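A small sketch of how the parameter and header conditions above evaluate against a mock request. Because the utility class is package-private, the sketch assumes it sits in the same package; MockHttpServletRequest comes from spring-test, and all values here are hypothetical.

package org.springframework.data.document.web.servlet.mvc.annotation;

import org.springframework.mock.web.MockHttpServletRequest;

public class MappingConditionsSketch {

    public static void main(String[] args) {
        MockHttpServletRequest request = new MockHttpServletRequest("GET", "/reports");
        request.setParameter("type", "pdf");
        request.addHeader("Accept", "application/json");

        // "type=pdf" must match the parameter value; "!draft" requires the parameter to be absent.
        boolean paramsOk = ServletAnnotationMappingUtils.checkParameters(
                new String[] { "type=pdf", "!draft" }, request);

        // Accept/Content-Type conditions are compared via MediaType.includes(...),
        // so a condition like "Accept=application/*" would also match this request.
        boolean headersOk = ServletAnnotationMappingUtils.checkHeaders(
                new String[] { "Accept=application/json" }, request);

        System.out.println(paramsOk + " " + headersOk);
    }
}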

View File

@@ -1,828 +0,0 @@
/*
* Copyright 2002-2010 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.web.servlet.mvc.annotation.support;
import java.lang.annotation.Annotation;
import java.lang.reflect.*;
import java.util.*;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.beans.BeanUtils;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.core.*;
import org.springframework.core.annotation.AnnotationUtils;
import org.springframework.data.document.web.servlet.ActionInterceptor;
import org.springframework.http.*;
import org.springframework.http.converter.HttpMessageConverter;
import org.springframework.ui.ExtendedModelMap;
import org.springframework.ui.Model;
import org.springframework.util.*;
import org.springframework.validation.BindException;
import org.springframework.validation.BindingResult;
import org.springframework.validation.Errors;
import org.springframework.web.HttpMediaTypeNotSupportedException;
import org.springframework.web.bind.WebDataBinder;
import org.springframework.web.bind.annotation.*;
import org.springframework.web.bind.annotation.support.HandlerMethodInvocationException;
import org.springframework.web.bind.annotation.support.HandlerMethodResolver;
import org.springframework.web.bind.support.*;
import org.springframework.web.context.request.NativeWebRequest;
import org.springframework.web.context.request.WebRequest;
import org.springframework.web.multipart.MultipartFile;
import org.springframework.web.multipart.MultipartRequest;
/**
* Support class for invoking an annotated handler method. Operates on the introspection results of a {@link
* HandlerMethodResolver} for a specific handler type.
* <p/>
* <p>Used by {@link org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter} and {@link
* org.springframework.web.portlet.mvc.annotation.AnnotationMethodHandlerAdapter}.
*
* @author Juergen Hoeller
* @author Arjen Poutsma
* @see #invokeHandlerMethod
* @since 2.5.2
*/
public class InterceptingHandlerMethodInvoker {
private static final String MODEL_KEY_PREFIX_STALE = SessionAttributeStore.class.getName() + ".STALE.";
/**
* We'll create a lot of these objects, so we don't want a new logger every time.
*/
private static final Log logger = LogFactory.getLog(InterceptingHandlerMethodInvoker.class);
protected final HandlerMethodResolver methodResolver;
private final WebBindingInitializer bindingInitializer;
protected final SessionAttributeStore sessionAttributeStore;
private final ParameterNameDiscoverer parameterNameDiscoverer;
private final WebArgumentResolver[] customArgumentResolvers;
private final HttpMessageConverter[] messageConverters;
private final ActionInterceptor[] actionInterceptors;
private final SimpleSessionStatus sessionStatus = new SimpleSessionStatus();
public InterceptingHandlerMethodInvoker(HandlerMethodResolver methodResolver) {
this(methodResolver, null);
}
public InterceptingHandlerMethodInvoker(HandlerMethodResolver methodResolver, WebBindingInitializer bindingInitializer) {
this(methodResolver, bindingInitializer, new DefaultSessionAttributeStore(), null, null, null, null);
}
public InterceptingHandlerMethodInvoker(HandlerMethodResolver methodResolver, WebBindingInitializer bindingInitializer,
SessionAttributeStore sessionAttributeStore, ParameterNameDiscoverer parameterNameDiscoverer,
WebArgumentResolver[] customArgumentResolvers, HttpMessageConverter[] messageConverters, ActionInterceptor[] actionInterceptors) {
this.methodResolver = methodResolver;
this.bindingInitializer = bindingInitializer;
this.sessionAttributeStore = sessionAttributeStore;
this.parameterNameDiscoverer = parameterNameDiscoverer;
this.customArgumentResolvers = customArgumentResolvers;
this.messageConverters = messageConverters;
this.actionInterceptors = actionInterceptors;
}
protected ActionInterceptor[] getActionInterceptors() {
return actionInterceptors;
}
public final Object invokeHandlerMethod(Method handlerMethod, Object handler,
NativeWebRequest webRequest, ExtendedModelMap implicitModel) throws Exception {
Method handlerMethodToInvoke = BridgeMethodResolver.findBridgedMethod(handlerMethod);
try {
boolean debug = logger.isDebugEnabled();
for (String attrName : this.methodResolver.getActualSessionAttributeNames()) {
Object attrValue = this.sessionAttributeStore.retrieveAttribute(webRequest, attrName);
if (attrValue != null) {
implicitModel.addAttribute(attrName, attrValue);
}
}
for (Method attributeMethod : this.methodResolver.getModelAttributeMethods()) {
Method attributeMethodToInvoke = BridgeMethodResolver.findBridgedMethod(attributeMethod);
Object[] args = resolveHandlerArguments(attributeMethodToInvoke, handler, webRequest, implicitModel);
if (debug) {
logger.debug("Invoking model attribute method: " + attributeMethodToInvoke);
}
String attrName = AnnotationUtils.findAnnotation(attributeMethod, ModelAttribute.class).value();
if (!"".equals(attrName) && implicitModel.containsAttribute(attrName)) {
continue;
}
ReflectionUtils.makeAccessible(attributeMethodToInvoke);
Object attrValue = attributeMethodToInvoke.invoke(handler, args);
if ("".equals(attrName)) {
Class resolvedType = GenericTypeResolver.resolveReturnType(attributeMethodToInvoke, handler.getClass());
attrName = Conventions.getVariableNameForReturnType(attributeMethodToInvoke, resolvedType, attrValue);
}
if (!implicitModel.containsAttribute(attrName)) {
implicitModel.addAttribute(attrName, attrValue);
}
}
Object[] args = resolveHandlerArguments(handlerMethodToInvoke, handler, webRequest, implicitModel);
if (debug) {
logger.debug("Invoking request handler method: " + handlerMethodToInvoke);
}
ReflectionUtils.makeAccessible(handlerMethodToInvoke);
return handlerMethodToInvoke.invoke(handler, args);
} catch (IllegalStateException ex) {
// Internal assertion failed (e.g. invalid signature):
// throw exception with full handler method context...
throw new HandlerMethodInvocationException(handlerMethodToInvoke, ex);
} catch (InvocationTargetException ex) {
// User-defined @ModelAttribute/@InitBinder/@RequestMapping method threw an exception...
ReflectionUtils.rethrowException(ex.getTargetException());
return null;
}
}
public final void updateModelAttributes(Object handler, Map<String, Object> mavModel,
ExtendedModelMap implicitModel, NativeWebRequest webRequest) throws Exception {
if (this.methodResolver.hasSessionAttributes() && this.sessionStatus.isComplete()) {
for (String attrName : this.methodResolver.getActualSessionAttributeNames()) {
this.sessionAttributeStore.cleanupAttribute(webRequest, attrName);
}
}
// Expose model attributes as session attributes, if required.
// Expose BindingResults for all attributes, making custom editors available.
Map<String, Object> model = (mavModel != null ? mavModel : implicitModel);
if (model != null) {
try {
String[] originalAttrNames = model.keySet().toArray(new String[model.size()]);
for (String attrName : originalAttrNames) {
Object attrValue = model.get(attrName);
boolean isSessionAttr = this.methodResolver.isSessionAttribute(
attrName, (attrValue != null ? attrValue.getClass() : null));
if (isSessionAttr) {
if (this.sessionStatus.isComplete()) {
implicitModel.put(MODEL_KEY_PREFIX_STALE + attrName, Boolean.TRUE);
} else if (!implicitModel.containsKey(MODEL_KEY_PREFIX_STALE + attrName)) {
this.sessionAttributeStore.storeAttribute(webRequest, attrName, attrValue);
}
}
if (!attrName.startsWith(BindingResult.MODEL_KEY_PREFIX) &&
(isSessionAttr || isBindingCandidate(attrValue))) {
String bindingResultKey = BindingResult.MODEL_KEY_PREFIX + attrName;
if (mavModel != null && !model.containsKey(bindingResultKey)) {
WebDataBinder binder = createBinder(webRequest, attrValue, attrName);
initBinder(handler, attrName, binder, webRequest);
mavModel.put(bindingResultKey, binder.getBindingResult());
}
}
}
} catch (InvocationTargetException ex) {
// User-defined @InitBinder method threw an exception...
ReflectionUtils.rethrowException(ex.getTargetException());
}
}
}
protected Object[] resolveHandlerArguments(Method handlerMethod, Object handler,
NativeWebRequest webRequest, ExtendedModelMap implicitModel) throws Exception {
Class[] paramTypes = handlerMethod.getParameterTypes();
Object[] args = new Object[paramTypes.length];
for (int i = 0; i < args.length; i++) {
MethodParameter methodParam = new MethodParameter(handlerMethod, i);
methodParam.initParameterNameDiscovery(this.parameterNameDiscoverer);
GenericTypeResolver.resolveParameterType(methodParam, handler.getClass());
String paramName = null;
String headerName = null;
boolean requestBodyFound = false;
String cookieName = null;
String pathVarName = null;
String attrName = null;
boolean required = false;
String defaultValue = null;
boolean validate = false;
int annotationsFound = 0;
Annotation[] paramAnns = methodParam.getParameterAnnotations();
for (Annotation paramAnn : paramAnns) {
if (RequestParam.class.isInstance(paramAnn)) {
RequestParam requestParam = (RequestParam) paramAnn;
paramName = requestParam.value();
required = requestParam.required();
defaultValue = parseDefaultValueAttribute(requestParam.defaultValue());
annotationsFound++;
} else if (RequestHeader.class.isInstance(paramAnn)) {
RequestHeader requestHeader = (RequestHeader) paramAnn;
headerName = requestHeader.value();
required = requestHeader.required();
defaultValue = parseDefaultValueAttribute(requestHeader.defaultValue());
annotationsFound++;
} else if (RequestBody.class.isInstance(paramAnn)) {
requestBodyFound = true;
annotationsFound++;
} else if (CookieValue.class.isInstance(paramAnn)) {
CookieValue cookieValue = (CookieValue) paramAnn;
cookieName = cookieValue.value();
required = cookieValue.required();
defaultValue = parseDefaultValueAttribute(cookieValue.defaultValue());
annotationsFound++;
} else if (PathVariable.class.isInstance(paramAnn)) {
PathVariable pathVar = (PathVariable) paramAnn;
pathVarName = pathVar.value();
annotationsFound++;
} else if (ModelAttribute.class.isInstance(paramAnn)) {
ModelAttribute attr = (ModelAttribute) paramAnn;
attrName = attr.value();
annotationsFound++;
} else if (Value.class.isInstance(paramAnn)) {
defaultValue = ((Value) paramAnn).value();
} else if ("Valid".equals(paramAnn.annotationType().getSimpleName())) {
validate = true;
}
}
if (annotationsFound > 1) {
throw new IllegalStateException("Handler parameter annotations are exclusive choices - " +
"do not specify more than one such annotation on the same parameter: " + handlerMethod);
}
if (annotationsFound == 0) {
Object argValue = resolveCommonArgument(methodParam, webRequest);
if (argValue != WebArgumentResolver.UNRESOLVED) {
args[i] = argValue;
} else if (defaultValue != null) {
args[i] = resolveDefaultValue(defaultValue);
} else {
Class paramType = methodParam.getParameterType();
if (Model.class.isAssignableFrom(paramType) || Map.class.isAssignableFrom(paramType)) {
args[i] = implicitModel;
} else if (SessionStatus.class.isAssignableFrom(paramType)) {
args[i] = this.sessionStatus;
} else if (HttpEntity.class.isAssignableFrom(paramType)) {
args[i] = resolveHttpEntityRequest(methodParam, webRequest);
} else if (Errors.class.isAssignableFrom(paramType)) {
throw new IllegalStateException("Errors/BindingResult argument declared " +
"without preceding model attribute. Check your handler method signature!");
} else if (BeanUtils.isSimpleProperty(paramType)) {
paramName = "";
} else {
attrName = "";
}
}
}
if (paramName != null) {
args[i] = resolveRequestParam(paramName, required, defaultValue, methodParam, webRequest, handler);
} else if (headerName != null) {
args[i] = resolveRequestHeader(headerName, required, defaultValue, methodParam, webRequest, handler);
} else if (requestBodyFound) {
args[i] = resolveRequestBody(methodParam, webRequest, handler);
} else if (cookieName != null) {
args[i] = resolveCookieValue(cookieName, required, defaultValue, methodParam, webRequest, handler);
} else if (pathVarName != null) {
args[i] = resolvePathVariable(pathVarName, methodParam, webRequest, handler);
} else if (attrName != null) {
WebDataBinder binder =
resolveModelAttribute(attrName, methodParam, implicitModel, webRequest, handler);
boolean assignBindingResult = (args.length > i + 1 && Errors.class.isAssignableFrom(paramTypes[i + 1]));
if (binder.getTarget() != null) {
doBind(binder, webRequest, validate, !assignBindingResult);
}
args[i] = binder.getTarget();
if (assignBindingResult) {
args[i + 1] = binder.getBindingResult();
i++;
}
implicitModel.putAll(binder.getBindingResult().getModel());
}
}
return args;
}
protected void initBinder(Object handler, String attrName, WebDataBinder binder, NativeWebRequest webRequest)
throws Exception {
if (this.bindingInitializer != null) {
this.bindingInitializer.initBinder(binder, webRequest);
}
if (handler != null) {
Set<Method> initBinderMethods = this.methodResolver.getInitBinderMethods();
if (!initBinderMethods.isEmpty()) {
boolean debug = logger.isDebugEnabled();
for (Method initBinderMethod : initBinderMethods) {
Method methodToInvoke = BridgeMethodResolver.findBridgedMethod(initBinderMethod);
String[] targetNames = AnnotationUtils.findAnnotation(initBinderMethod, InitBinder.class).value();
if (targetNames.length == 0 || Arrays.asList(targetNames).contains(attrName)) {
Object[] initBinderArgs =
resolveInitBinderArguments(handler, methodToInvoke, binder, webRequest);
if (debug) {
logger.debug("Invoking init-binder method: " + methodToInvoke);
}
ReflectionUtils.makeAccessible(methodToInvoke);
Object returnValue = methodToInvoke.invoke(handler, initBinderArgs);
if (returnValue != null) {
throw new IllegalStateException(
"InitBinder methods must not have a return value: " + methodToInvoke);
}
}
}
}
}
}
private Object[] resolveInitBinderArguments(Object handler, Method initBinderMethod,
WebDataBinder binder, NativeWebRequest webRequest) throws Exception {
Class[] initBinderParams = initBinderMethod.getParameterTypes();
Object[] initBinderArgs = new Object[initBinderParams.length];
for (int i = 0; i < initBinderArgs.length; i++) {
MethodParameter methodParam = new MethodParameter(initBinderMethod, i);
methodParam.initParameterNameDiscovery(this.parameterNameDiscoverer);
GenericTypeResolver.resolveParameterType(methodParam, handler.getClass());
String paramName = null;
boolean paramRequired = false;
String paramDefaultValue = null;
String pathVarName = null;
Annotation[] paramAnns = methodParam.getParameterAnnotations();
for (Annotation paramAnn : paramAnns) {
if (RequestParam.class.isInstance(paramAnn)) {
RequestParam requestParam = (RequestParam) paramAnn;
paramName = requestParam.value();
paramRequired = requestParam.required();
paramDefaultValue = parseDefaultValueAttribute(requestParam.defaultValue());
break;
} else if (ModelAttribute.class.isInstance(paramAnn)) {
throw new IllegalStateException(
"@ModelAttribute is not supported on @InitBinder methods: " + initBinderMethod);
} else if (PathVariable.class.isInstance(paramAnn)) {
PathVariable pathVar = (PathVariable) paramAnn;
pathVarName = pathVar.value();
}
}
if (paramName == null && pathVarName == null) {
Object argValue = resolveCommonArgument(methodParam, webRequest);
if (argValue != WebArgumentResolver.UNRESOLVED) {
initBinderArgs[i] = argValue;
} else {
Class paramType = initBinderParams[i];
if (paramType.isInstance(binder)) {
initBinderArgs[i] = binder;
} else if (BeanUtils.isSimpleProperty(paramType)) {
paramName = "";
} else {
throw new IllegalStateException("Unsupported argument [" + paramType.getName() +
"] for @InitBinder method: " + initBinderMethod);
}
}
}
if (paramName != null) {
initBinderArgs[i] =
resolveRequestParam(paramName, paramRequired, paramDefaultValue, methodParam, webRequest, null);
} else if (pathVarName != null) {
initBinderArgs[i] = resolvePathVariable(pathVarName, methodParam, webRequest, null);
}
}
return initBinderArgs;
}
@SuppressWarnings("unchecked")
private Object resolveRequestParam(String paramName, boolean required, String defaultValue,
MethodParameter methodParam, NativeWebRequest webRequest, Object handlerForInitBinderCall)
throws Exception {
Class<?> paramType = methodParam.getParameterType();
if (Map.class.isAssignableFrom(paramType) && paramName.length() == 0) {
return resolveRequestParamMap((Class<? extends Map>) paramType, webRequest);
}
if (paramName.length() == 0) {
paramName = getRequiredParameterName(methodParam);
}
Object paramValue = null;
MultipartRequest multipartRequest = webRequest.getNativeRequest(MultipartRequest.class);
if (multipartRequest != null) {
List<MultipartFile> files = multipartRequest.getFiles(paramName);
if (!files.isEmpty()) {
if (files.size() == 1 && !paramType.isArray() && !Collection.class.isAssignableFrom(paramType)) {
paramValue = files.get(0);
} else {
paramValue = files;
}
}
}
if (paramValue == null) {
String[] paramValues = webRequest.getParameterValues(paramName);
if (paramValues != null) {
if (paramValues.length == 1 && !paramType.isArray() && !Collection.class.isAssignableFrom(paramType)) {
paramValue = paramValues[0];
} else {
paramValue = paramValues;
}
}
}
if (paramValue == null) {
if (defaultValue != null) {
paramValue = resolveDefaultValue(defaultValue);
} else if (required) {
raiseMissingParameterException(paramName, paramType);
}
paramValue = checkValue(paramName, paramValue, paramType);
}
WebDataBinder binder = createBinder(webRequest, null, paramName);
initBinder(handlerForInitBinderCall, paramName, binder, webRequest);
return binder.convertIfNecessary(paramValue, paramType, methodParam);
}
private Map resolveRequestParamMap(Class<? extends Map> mapType, NativeWebRequest webRequest) {
Map<String, String[]> parameterMap = webRequest.getParameterMap();
if (MultiValueMap.class.isAssignableFrom(mapType)) {
MultiValueMap<String, String> result = new LinkedMultiValueMap<String, String>(parameterMap.size());
for (Map.Entry<String, String[]> entry : parameterMap.entrySet()) {
for (String value : entry.getValue()) {
result.add(entry.getKey(), value);
}
}
return result;
} else {
Map<String, String> result = new LinkedHashMap<String, String>(parameterMap.size());
for (Map.Entry<String, String[]> entry : parameterMap.entrySet()) {
if (entry.getValue().length > 0) {
result.put(entry.getKey(), entry.getValue()[0]);
}
}
return result;
}
}
@SuppressWarnings("unchecked")
private Object resolveRequestHeader(String headerName, boolean required, String defaultValue,
MethodParameter methodParam, NativeWebRequest webRequest, Object handlerForInitBinderCall)
throws Exception {
Class<?> paramType = methodParam.getParameterType();
if (Map.class.isAssignableFrom(paramType)) {
return resolveRequestHeaderMap((Class<? extends Map>) paramType, webRequest);
}
if (headerName.length() == 0) {
headerName = getRequiredParameterName(methodParam);
}
Object headerValue = null;
String[] headerValues = webRequest.getHeaderValues(headerName);
if (headerValues != null) {
headerValue = (headerValues.length == 1 ? headerValues[0] : headerValues);
}
if (headerValue == null) {
if (defaultValue != null) {
headerValue = resolveDefaultValue(defaultValue);
} else if (required) {
raiseMissingHeaderException(headerName, paramType);
}
headerValue = checkValue(headerName, headerValue, paramType);
}
WebDataBinder binder = createBinder(webRequest, null, headerName);
initBinder(handlerForInitBinderCall, headerName, binder, webRequest);
return binder.convertIfNecessary(headerValue, paramType, methodParam);
}
private Map resolveRequestHeaderMap(Class<? extends Map> mapType, NativeWebRequest webRequest) {
if (MultiValueMap.class.isAssignableFrom(mapType)) {
MultiValueMap<String, String> result;
if (HttpHeaders.class.isAssignableFrom(mapType)) {
result = new HttpHeaders();
} else {
result = new LinkedMultiValueMap<String, String>();
}
for (Iterator<String> iterator = webRequest.getHeaderNames(); iterator.hasNext();) {
String headerName = iterator.next();
for (String headerValue : webRequest.getHeaderValues(headerName)) {
result.add(headerName, headerValue);
}
}
return result;
} else {
Map<String, String> result = new LinkedHashMap<String, String>();
for (Iterator<String> iterator = webRequest.getHeaderNames(); iterator.hasNext();) {
String headerName = iterator.next();
String headerValue = webRequest.getHeader(headerName);
result.put(headerName, headerValue);
}
return result;
}
}
/**
* Resolves the given {@link RequestBody @RequestBody} annotation.
*/
protected Object resolveRequestBody(MethodParameter methodParam, NativeWebRequest webRequest, Object handler)
throws Exception {
return readWithMessageConverters(methodParam, createHttpInputMessage(webRequest), methodParam.getParameterType());
}
private HttpEntity resolveHttpEntityRequest(MethodParameter methodParam, NativeWebRequest webRequest)
throws Exception {
HttpInputMessage inputMessage = createHttpInputMessage(webRequest);
Class<?> paramType = getHttpEntityType(methodParam);
Object body = readWithMessageConverters(methodParam, inputMessage, paramType);
return new HttpEntity<Object>(body, inputMessage.getHeaders());
}
private Object readWithMessageConverters(MethodParameter methodParam, HttpInputMessage inputMessage, Class paramType)
throws Exception {
MediaType contentType = inputMessage.getHeaders().getContentType();
if (contentType == null) {
StringBuilder builder = new StringBuilder(ClassUtils.getShortName(methodParam.getParameterType()));
String paramName = methodParam.getParameterName();
if (paramName != null) {
builder.append(' ');
builder.append(paramName);
}
throw new HttpMediaTypeNotSupportedException(
"Cannot extract parameter (" + builder.toString() + "): no Content-Type found");
}
List<MediaType> allSupportedMediaTypes = new ArrayList<MediaType>();
if (this.messageConverters != null) {
for (HttpMessageConverter<?> messageConverter : this.messageConverters) {
allSupportedMediaTypes.addAll(messageConverter.getSupportedMediaTypes());
if (messageConverter.canRead(paramType, contentType)) {
if (logger.isDebugEnabled()) {
logger.debug("Reading [" + paramType.getName() + "] as \"" + contentType
+ "\" using [" + messageConverter + "]");
}
return messageConverter.read(paramType, inputMessage);
}
}
}
throw new HttpMediaTypeNotSupportedException(contentType, allSupportedMediaTypes);
}
private Class<?> getHttpEntityType(MethodParameter methodParam) {
Assert.isAssignable(HttpEntity.class, methodParam.getParameterType());
ParameterizedType type = (ParameterizedType) methodParam.getGenericParameterType();
if (type.getActualTypeArguments().length == 1) {
Type typeArgument = type.getActualTypeArguments()[0];
if (typeArgument instanceof Class) {
return (Class<?>) typeArgument;
} else if (typeArgument instanceof GenericArrayType) {
Type componentType = ((GenericArrayType) typeArgument).getGenericComponentType();
if (componentType instanceof Class) {
// Surely, there should be a nicer way to do this
Object array = Array.newInstance((Class<?>) componentType, 0);
return array.getClass();
}
}
}
throw new IllegalArgumentException(
"HttpEntity parameter (" + methodParam.getParameterName() + ") is not parameterized");
}
private Object resolveCookieValue(String cookieName, boolean required, String defaultValue,
MethodParameter methodParam, NativeWebRequest webRequest, Object handlerForInitBinderCall)
throws Exception {
Class<?> paramType = methodParam.getParameterType();
if (cookieName.length() == 0) {
cookieName = getRequiredParameterName(methodParam);
}
Object cookieValue = resolveCookieValue(cookieName, paramType, webRequest);
if (cookieValue == null) {
if (defaultValue != null) {
cookieValue = resolveDefaultValue(defaultValue);
} else if (required) {
raiseMissingCookieException(cookieName, paramType);
}
cookieValue = checkValue(cookieName, cookieValue, paramType);
}
WebDataBinder binder = createBinder(webRequest, null, cookieName);
initBinder(handlerForInitBinderCall, cookieName, binder, webRequest);
return binder.convertIfNecessary(cookieValue, paramType, methodParam);
}
/**
* Resolves the given {@link CookieValue @CookieValue} annotation.
* <p>Throws an UnsupportedOperationException by default.
*/
protected Object resolveCookieValue(String cookieName, Class paramType, NativeWebRequest webRequest)
throws Exception {
throw new UnsupportedOperationException("@CookieValue not supported");
}
private Object resolvePathVariable(String pathVarName, MethodParameter methodParam,
NativeWebRequest webRequest, Object handlerForInitBinderCall) throws Exception {
Class<?> paramType = methodParam.getParameterType();
if (pathVarName.length() == 0) {
pathVarName = getRequiredParameterName(methodParam);
}
String pathVarValue = resolvePathVariable(pathVarName, paramType, webRequest);
WebDataBinder binder = createBinder(webRequest, null, pathVarName);
initBinder(handlerForInitBinderCall, pathVarName, binder, webRequest);
return binder.convertIfNecessary(pathVarValue, paramType, methodParam);
}
/**
* Resolves the given {@link PathVariable @PathVariable} annotation.
* <p>Throws an UnsupportedOperationException by default.
*/
protected String resolvePathVariable(String pathVarName, Class paramType, NativeWebRequest webRequest)
throws Exception {
throw new UnsupportedOperationException("@PathVariable not supported");
}
private String getRequiredParameterName(MethodParameter methodParam) {
String name = methodParam.getParameterName();
if (name == null) {
throw new IllegalStateException(
"No parameter name specified for argument of type [" + methodParam.getParameterType().getName() +
"], and no parameter name information found in class file either.");
}
return name;
}
private Object checkValue(String name, Object value, Class paramType) {
if (value == null) {
if (boolean.class.equals(paramType)) {
return Boolean.FALSE;
} else if (paramType.isPrimitive()) {
throw new IllegalStateException("Optional " + paramType + " parameter '" + name +
"' is not present but cannot be translated into a null value due to being declared as a " +
"primitive type. Consider declaring it as object wrapper for the corresponding primitive type.");
}
}
return value;
}
private WebDataBinder resolveModelAttribute(String attrName, MethodParameter methodParam,
ExtendedModelMap implicitModel, NativeWebRequest webRequest, Object handler) throws Exception {
// Bind request parameter onto object...
String name = attrName;
if ("".equals(name)) {
name = Conventions.getVariableNameForParameter(methodParam);
}
Class<?> paramType = methodParam.getParameterType();
Object bindObject;
if (implicitModel.containsKey(name)) {
bindObject = implicitModel.get(name);
} else if (this.methodResolver.isSessionAttribute(name, paramType)) {
bindObject = this.sessionAttributeStore.retrieveAttribute(webRequest, name);
if (bindObject == null) {
raiseSessionRequiredException("Session attribute '" + name + "' required - not found in session");
}
} else {
bindObject = BeanUtils.instantiateClass(paramType);
}
WebDataBinder binder = createBinder(webRequest, bindObject, name);
initBinder(handler, name, binder, webRequest);
return binder;
}
/**
* Determine whether the given value qualifies as a "binding candidate", i.e. might potentially be subject to
* bean-style data binding later on.
*/
protected boolean isBindingCandidate(Object value) {
return (value != null && !value.getClass().isArray() && !(value instanceof Collection) &&
!(value instanceof Map) && !BeanUtils.isSimpleValueType(value.getClass()));
}
protected void raiseMissingParameterException(String paramName, Class paramType) throws Exception {
throw new IllegalStateException("Missing parameter '" + paramName + "' of type [" + paramType.getName() + "]");
}
protected void raiseMissingHeaderException(String headerName, Class paramType) throws Exception {
throw new IllegalStateException("Missing header '" + headerName + "' of type [" + paramType.getName() + "]");
}
protected void raiseMissingCookieException(String cookieName, Class paramType) throws Exception {
throw new IllegalStateException(
"Missing cookie value '" + cookieName + "' of type [" + paramType.getName() + "]");
}
protected void raiseSessionRequiredException(String message) throws Exception {
throw new IllegalStateException(message);
}
protected WebDataBinder createBinder(NativeWebRequest webRequest, Object target, String objectName)
throws Exception {
return new WebRequestDataBinder(target, objectName);
}
private void doBind(WebDataBinder binder, NativeWebRequest webRequest, boolean validate, boolean failOnErrors)
throws Exception {
doBind(binder, webRequest);
if (validate) {
binder.validate();
}
if (failOnErrors && binder.getBindingResult().hasErrors()) {
throw new BindException(binder.getBindingResult());
}
}
protected void doBind(WebDataBinder binder, NativeWebRequest webRequest) throws Exception {
((WebRequestDataBinder) binder).bind(webRequest);
}
/**
* Return a {@link HttpInputMessage} for the given {@link NativeWebRequest}.
* <p>Throws an UnsupportedOperationException by default.
*/
protected HttpInputMessage createHttpInputMessage(NativeWebRequest webRequest) throws Exception {
throw new UnsupportedOperationException("@RequestBody not supported");
}
/**
* Return a {@link HttpOutputMessage} for the given {@link NativeWebRequest}.
* <p>Throws an UnsupportedOperationException by default.
*/
protected HttpOutputMessage createHttpOutputMessage(NativeWebRequest webRequest) throws Exception {
throw new UnsupportedOperationException("@ResponseBody not supported");
}
protected String parseDefaultValueAttribute(String value) {
return (ValueConstants.DEFAULT_NONE.equals(value) ? null : value);
}
protected Object resolveDefaultValue(String value) {
return value;
}
protected Object resolveCommonArgument(MethodParameter methodParameter, NativeWebRequest webRequest)
throws Exception {
// Invoke custom argument resolvers if present...
if (this.customArgumentResolvers != null) {
for (WebArgumentResolver argumentResolver : this.customArgumentResolvers) {
Object value = argumentResolver.resolveArgument(methodParameter, webRequest);
if (value != WebArgumentResolver.UNRESOLVED) {
return value;
}
}
}
// Resolution of standard parameter types...
Class paramType = methodParameter.getParameterType();
Object value = resolveStandardArgument(paramType, webRequest);
if (value != WebArgumentResolver.UNRESOLVED && !ClassUtils.isAssignableValue(paramType, value)) {
throw new IllegalStateException("Standard argument type [" + paramType.getName() +
"] resolved to incompatible value of type [" + (value != null ? value.getClass() : null) +
"]. Consider declaring the argument type in a less specific fashion.");
}
return value;
}
protected Object resolveStandardArgument(Class<?> parameterType, NativeWebRequest webRequest) throws Exception {
if (WebRequest.class.isAssignableFrom(parameterType)) {
return webRequest;
}
return WebArgumentResolver.UNRESOLVED;
}
protected final void addReturnValueAsModelAttribute(Method handlerMethod, Class handlerType,
Object returnValue, ExtendedModelMap implicitModel) {
ModelAttribute attr = AnnotationUtils.findAnnotation(handlerMethod, ModelAttribute.class);
String attrName = (attr != null ? attr.value() : "");
if ("".equals(attrName)) {
Class resolvedType = GenericTypeResolver.resolveReturnType(handlerMethod, handlerType);
attrName = Conventions.getVariableNameForReturnType(handlerMethod, resolvedType, returnValue);
}
implicitModel.addAttribute(attrName, returnValue);
}
}
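
A construction sketch for the invoker above. The seven-argument constructor shown earlier takes the method resolver plus optional collaborators: binding initializer, session attribute store, parameter name discoverer, custom argument resolvers, message converters and action interceptors. Note that @PathVariable, @CookieValue and @RequestBody support only becomes available when a subclass overrides the corresponding protected hooks, which throw UnsupportedOperationException in this base class. The controller class, the custom resolver and the interceptor array below are hypothetical, not code from this repository.

import org.springframework.core.LocalVariableTableParameterNameDiscoverer;
import org.springframework.core.MethodParameter;
import org.springframework.data.document.web.servlet.ActionInterceptor;
import org.springframework.data.document.web.servlet.mvc.annotation.support.InterceptingHandlerMethodInvoker;
import org.springframework.http.converter.HttpMessageConverter;
import org.springframework.http.converter.StringHttpMessageConverter;
import org.springframework.web.bind.annotation.support.HandlerMethodResolver;
import org.springframework.web.bind.support.ConfigurableWebBindingInitializer;
import org.springframework.web.bind.support.DefaultSessionAttributeStore;
import org.springframework.web.bind.support.WebArgumentResolver;
import org.springframework.web.context.request.NativeWebRequest;

// Hypothetical wiring code; MyController stands in for an annotated handler class.
class InvokerWiringExample {

    static InterceptingHandlerMethodInvoker createInvoker(ActionInterceptor[] interceptors) {
        // Introspect the handler type for @RequestMapping/@ModelAttribute/@InitBinder methods.
        HandlerMethodResolver methodResolver = new HandlerMethodResolver();
        methodResolver.init(MyController.class);

        // Custom resolver: inject the remote user name into String parameters named "principalName".
        WebArgumentResolver principalNameResolver = new WebArgumentResolver() {
            public Object resolveArgument(MethodParameter param, NativeWebRequest webRequest) throws Exception {
                if (String.class.equals(param.getParameterType())
                        && "principalName".equals(param.getParameterName())) {
                    return webRequest.getRemoteUser();
                }
                return UNRESOLVED;
            }
        };

        return new InterceptingHandlerMethodInvoker(methodResolver,
                new ConfigurableWebBindingInitializer(),
                new DefaultSessionAttributeStore(),
                new LocalVariableTableParameterNameDiscoverer(),
                new WebArgumentResolver[] {principalNameResolver},
                new HttpMessageConverter[] {new StringHttpMessageConverter()},
                interceptors);
    }

    static class MyController {
        // @RequestMapping handler methods would be declared here.
    }
}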

View File

@@ -1,38 +0,0 @@
Bundle-SymbolicName: org.springframework.data.document
Bundle-Name: Spring Data Document
Bundle-Vendor: SpringSource
Bundle-ManifestVersion: 2
Import-Package:
sun.reflect;version="0";resolution:=optional
Import-Template:
javax.servlet;version="[2.4.0, 4.0.0)",
javax.servlet.http;version="[2.4.0, 4.0.0)",
org.springframework.http.*;version="[3.0.0, 4.0.0)",
org.springframework.http.converter.*;version="[3.0.0, 4.0.0)",
org.springframework.http.server*;version="[3.0.0, 4.0.0)",
org.springframework.ui.*;version="[3.0.0, 4.0.0)",
org.springframework.web.*;version="[3.0.0, 4.0.0)",
org.springframework.web.bind.*;version="[3.0.0, 4.0.0)",
org.springframework.web.bind.annotation.*;version="[3.0.0, 4.0.0)",
org.springframework.web.bind.annotation.support.*;version="[3.0.0, 4.0.0)",
org.springframework.web.bind.support.*;version="[3.0.0, 4.0.0)",
org.springframework.web.context.request.*;version="[3.0.0, 4.0.0)",
org.springframework.web.util.*;version="[3.0.0, 4.0.0)",
org.springframework.validation.*;version="[3.0.0, 4.0.0)",
org.springframework.validation.support.*;version="[3.0.0, 4.0.0)",
org.springframework.beans.*;version="[3.0.0, 4.0.0)",
org.springframework.core.*;version="[3.0.0, 4.0.0)",
org.springframework.dao.*;version="[3.0.0, 4.0.0)",
org.springframework.util.*;version="[3.0.0, 4.0.0)",
org.springframework.expression.*;version="[3.0.0, 4.0.0)",
org.springframework.expression.common.*;version="[3.0.0, 4.0.0)",
org.springframework.expression.spel.standard.*;version="[3.0.0, 4.0.0)",
org.springframework.expression.spel.support.*;version="[3.0.0, 4.0.0)",
org.springframework.validation.*;version="[3.0.0, 4.0.0)",
org.springframework.data.core.*;version="[1.0.0, 2.0.0)",
org.aopalliance.*;version="[1.0.0, 2.0.0)";resolution:=optional,
org.apache.commons.logging.*;version="[1.1.1, 2.0.0)",
org.w3c.dom.*;version="0",
javax.persistence.*;version="[1.0, 2.0)"

View File

@@ -3,12 +3,11 @@
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-document-parent</artifactId>
<version>1.0.0.M2</version>
<relativePath>../spring-data-document-parent/pom.xml</relativePath>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.0.0.RC1</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-cross-store</artifactId>
<packaging>jar</packaging>
<name>Spring Data MongoDB Cross-store Persistence Support</name>
<dependencies>
@@ -24,7 +23,7 @@
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
<version>${org.springframework.version}</version>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
@@ -158,14 +157,6 @@
<name>JBoss Public Repository</name>
<url>http://repository.jboss.org/nexus/content/groups/public-jboss</url>
</repository>
<repository>
<id>spring-maven-snapshot</id>
<snapshots>
<enabled>true</enabled>
</snapshots>
<name>Springframework Maven SNAPSHOT Repository</name>
<url>http://maven.springframework.org/snapshot</url>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
@@ -173,6 +164,11 @@
<name>Springframework Maven Milestone Repository</name>
<url>http://maven.springframework.org/milestone</url>
</pluginRepository>
<pluginRepository>
<id>spring-maven-release</id>
<name>Springframework Maven Release Repository</name>
<url>http://maven.springframework.org/release</url>
</pluginRepository>
</pluginRepositories>
<build>
<plugins>

View File

@@ -1,30 +1,22 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.core;
import org.junit.Test;
/**
* Unit tests for CouchTemplate with mocks
*/
public class CouchTemplateTests {
@Test
public void foo() {
}
}
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore;
import org.springframework.data.crossstore.ChangeSetBacked;
public interface DocumentBacked extends ChangeSetBacked {
}

View File

@@ -0,0 +1,185 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManagerFactory;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.crossstore.ChangeSet;
import org.springframework.data.crossstore.ChangeSetBacked;
import org.springframework.data.crossstore.ChangeSetPersister;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.util.ClassUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoException;
public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_CLASS = "_entity_class";
private static final String ENTITY_ID = "_entity_id";
private static final String ENTITY_FIELD_NAME = "_entity_field_name";
private static final String ENTITY_FIELD_CLASS = "_entity_field_class";
protected final Log log = LogFactory.getLog(getClass());
private MongoTemplate mongoTemplate;
private EntityManagerFactory entityManagerFactory;
public void setMongoTemplate(MongoTemplate mongoTemplate) {
this.mongoTemplate = mongoTemplate;
}
public void setEntityManagerFactory(EntityManagerFactory entityManagerFactory) {
this.entityManagerFactory = entityManagerFactory;
}
public void getPersistentState(Class<? extends ChangeSetBacked> entityClass,
Object id, final ChangeSet changeSet)
throws DataAccessException, NotFoundException {
if (id == null) {
log.debug("Unable to load MongoDB data for null id");
return;
}
String collName = getCollectionNameForEntity(entityClass);
final DBObject dbk = new BasicDBObject();
dbk.put(ENTITY_ID, id);
dbk.put(ENTITY_CLASS, entityClass.getName());
if (log.isDebugEnabled()) {
log.debug("Loading MongoDB data for " + dbk);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
for (DBObject dbo : collection.find(dbk)) {
String key = (String) dbo.get(ENTITY_FIELD_NAME);
if (log.isDebugEnabled()) {
log.debug("Processing key: " + key);
}
if (!changeSet.getValues().containsKey(key)) {
String className = (String) dbo.get(ENTITY_FIELD_CLASS);
if (className == null) {
throw new DataIntegrityViolationException(
"Unble to convert property " + key
+ ": Invalid metadata, " + ENTITY_FIELD_CLASS + " not available");
}
Class<?> clazz = ClassUtils.resolveClassName(className, ClassUtils.getDefaultClassLoader());
Object value = mongoTemplate.getConverter().read(clazz, dbo);
if (log.isDebugEnabled()) {
log.debug("Adding to ChangeSet: " + key);
}
changeSet.set(key, value);
}
}
return null;
}
});
}
public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
log.debug("getPersistentId called on " + entity);
if (entityManagerFactory == null) {
throw new DataAccessResourceFailureException("EntityManagerFactory cannot be null");
}
Object o = entityManagerFactory.getPersistenceUnitUtil().getIdentifier(entity);
return o;
}
public Object persistState(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (cs == null) {
log.debug("Flush: changeset was null, nothing to flush.");
return 0L;
}
if (log.isDebugEnabled()) {
log.debug("Flush: changeset: " + cs.getValues());
}
String collName = getCollectionNameForEntity(entity.getClass());
if (mongoTemplate.getCollection(collName) == null) {
mongoTemplate.createCollection(collName);
}
for (String key : cs.getValues().keySet()) {
if (key != null && !key.startsWith("_") && !key.equals(ChangeSetPersister.ID_KEY)) {
Object value = cs.getValues().get(key);
final DBObject dbQuery = new BasicDBObject();
dbQuery.put(ENTITY_ID, getPersistentId(entity, cs));
dbQuery.put(ENTITY_CLASS, entity.getClass().getName());
dbQuery.put(ENTITY_FIELD_NAME, key);
DBObject dbId = mongoTemplate.execute(collName,
new CollectionCallback<DBObject>() {
public DBObject doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
return collection.findOne(dbQuery);
}
});
if (value == null) {
if (log.isDebugEnabled()) {
log.debug("Flush: removing: " + dbQuery);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
collection.remove(dbQuery);
return null;
}
});
}
else {
final DBObject dbDoc = new BasicDBObject();
dbDoc.putAll(dbQuery);
if (log.isDebugEnabled()) {
log.debug("Flush: saving: " + dbQuery);
}
mongoTemplate.getConverter().write(value, dbDoc);
dbDoc.put(ENTITY_FIELD_CLASS, value.getClass().getName());
if (dbId != null) {
dbDoc.put("_id", dbId.get("_id"));
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
collection.save(dbDoc);
return null;
}
});
}
}
}
return 0L;
}
private String getCollectionNameForEntity(Class<? extends ChangeSetBacked> entityClass) {
return ClassUtils.getQualifiedName(entityClass);
}
}
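
A minimal wiring sketch for the persister above: it only needs a MongoTemplate and the JPA EntityManagerFactory, both supplied through the setters shown. The @Configuration class, the database name and the assumption that an EntityManagerFactory bean is defined elsewhere are illustrative, not part of this change; at runtime the persister is driven by the MongoDocumentBacking aspect further down rather than called directly.

import javax.persistence.EntityManagerFactory;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.crossstore.MongoChangeSetPersister;

import com.mongodb.Mongo;

// Hypothetical application configuration; "crossstore" is an assumed database name.
@Configuration
class CrossStoreConfig {

    @Bean
    public MongoTemplate mongoTemplate() throws Exception {
        return new MongoTemplate(new Mongo(), "crossstore");
    }

    @Bean
    public MongoChangeSetPersister mongoChangeSetPersister(EntityManagerFactory entityManagerFactory) throws Exception {
        MongoChangeSetPersister persister = new MongoChangeSetPersister();
        persister.setMongoTemplate(mongoTemplate());
        // Assumes a JPA EntityManagerFactory bean is configured elsewhere in the context.
        persister.setEntityManagerFactory(entityManagerFactory);
        return persister;
    }
}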

View File

@@ -1,7 +1,21 @@
package org.springframework.data.persistence.document.mongo;
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore;
import java.lang.reflect.Field;
import java.util.Map;
import javax.persistence.EntityManager;
import javax.persistence.Transient;
@@ -13,15 +27,14 @@ import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.reflect.FieldSignature;
import org.springframework.dao.DataAccessException;
import org.springframework.data.document.annotation.RelatedDocument;
import org.springframework.data.persistence.document.DocumentBacked;
import org.springframework.data.persistence.document.DocumentBackedTransactionSynchronization;
import org.springframework.data.persistence.ChangeSet;
import org.springframework.data.persistence.ChangeSetBacked;
import org.springframework.data.persistence.ChangeSetPersister;
import org.springframework.data.persistence.ChangeSetPersister.NotFoundException;
import org.springframework.data.persistence.HashMapChangeSet;
import org.springframework.data.mongodb.crossstore.RelatedDocument;
import org.springframework.data.mongodb.crossstore.DocumentBacked;
import org.springframework.data.crossstore.ChangeSetBackedTransactionSynchronization;
import org.springframework.data.crossstore.ChangeSet;
import org.springframework.data.crossstore.ChangeSetPersister;
import org.springframework.data.crossstore.ChangeSetPersister.NotFoundException;
import org.springframework.data.crossstore.HashMapChangeSet;
import org.springframework.transaction.support.TransactionSynchronizationManager;
/**
@@ -132,7 +145,7 @@ public aspect MongoDocumentBacking {
entity.setChangeSet(new HashMapChangeSet());
entity.itdChangeSetPersister = changeSetPersister;
entity.itdTransactionSynchronization =
new DocumentBackedTransactionSynchronization(changeSetPersister, entity);
new ChangeSetBackedTransactionSynchronization(changeSetPersister, entity);
//registerTransactionSynchronization(entity);
}
@@ -165,7 +178,7 @@ public aspect MongoDocumentBacking {
@Transient private ChangeSetPersister<?> DocumentBacked.itdChangeSetPersister;
@Transient private DocumentBackedTransactionSynchronization DocumentBacked.itdTransactionSynchronization;
@Transient private ChangeSetBackedTransactionSynchronization DocumentBacked.itdTransactionSynchronization;
public void DocumentBacked.setChangeSet(ChangeSet cs) {
this.changeSet = cs;

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,8 +13,7 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.annotation;
package org.springframework.data.mongodb.crossstore;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
@@ -29,7 +28,4 @@ import java.lang.annotation.Target;
ElementType.FIELD
})
public @interface RelatedDocument {
String collection() default "";
}

View File

@@ -1,7 +0,0 @@
package org.springframework.data.persistence.document;
import org.springframework.data.persistence.ChangeSetBacked;
public interface DocumentBacked extends ChangeSetBacked {
}

View File

@@ -1,63 +0,0 @@
package org.springframework.data.persistence.document;
//public class DocumentBackedTransactionSynchronization {
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.data.persistence.ChangeSetBacked;
import org.springframework.data.persistence.ChangeSetPersister;
import org.springframework.transaction.support.TransactionSynchronization;
public class DocumentBackedTransactionSynchronization implements TransactionSynchronization {
protected final Log log = LogFactory.getLog(getClass());
private ChangeSetPersister<Object> changeSetPersister;
private ChangeSetBacked entity;
private int changeSetTxStatus = -1;
public DocumentBackedTransactionSynchronization(ChangeSetPersister<Object> changeSetPersister, ChangeSetBacked entity) {
this.changeSetPersister = changeSetPersister;
this.entity = entity;
}
public void afterCommit() {
log.debug("After Commit called for " + entity);
changeSetPersister.persistState(entity, entity.getChangeSet());
changeSetTxStatus = 0;
}
public void afterCompletion(int status) {
log.debug("After Completion called with status = " + status);
if (changeSetTxStatus == 0) {
if (status == STATUS_COMMITTED) {
// this is good
log.debug("ChangedSetBackedTransactionSynchronization completed successfully for " + this.entity);
}
else {
// this could be bad - TODO: compensate
log.error("ChangedSetBackedTransactionSynchronization failed for " + this.entity);
}
}
}
public void beforeCommit(boolean readOnly) {
}
public void beforeCompletion() {
}
public void flush() {
}
public void resume() {
throw new IllegalStateException("ChangedSetBackedTransactionSynchronization does not support transaction suspension currently.");
}
public void suspend() {
throw new IllegalStateException("ChangedSetBackedTransactionSynchronization does not support transaction suspension currently.");
}
}

View File

@@ -1,169 +0,0 @@
package org.springframework.data.persistence.document.mongo;
import javax.persistence.EntityManagerFactory;
import com.mongodb.BasicDBObject;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoException;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.document.mongodb.CollectionCallback;
import org.springframework.data.document.mongodb.MongoTemplate;
import org.springframework.data.persistence.ChangeSet;
import org.springframework.data.persistence.ChangeSetBacked;
import org.springframework.data.persistence.ChangeSetPersister;
import org.springframework.util.ClassUtils;
public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_CLASS = "_entity_class";
private static final String ENTITY_ID = "_entity_id";
private static final String ENTITY_FIELD_NAME = "_entity_field_name";
private static final String ENTITY_FIELD_CLASS = "_entity_field_class";
protected final Log log = LogFactory.getLog(getClass());
private MongoTemplate mongoTemplate;
private EntityManagerFactory entityManagerFactory;
public void setMongoTemplate(MongoTemplate mongoTemplate) {
this.mongoTemplate = mongoTemplate;
}
public void setEntityManagerFactory(EntityManagerFactory entityManagerFactory) {
this.entityManagerFactory = entityManagerFactory;
}
public void getPersistentState(Class<? extends ChangeSetBacked> entityClass,
Object id, final ChangeSet changeSet)
throws DataAccessException, NotFoundException {
if (id == null) {
log.debug("Unable to load MongoDB data for null id");
return;
}
String collName = getCollectionNameForEntity(entityClass);
final DBObject dbk = new BasicDBObject();
dbk.put(ENTITY_ID, id);
dbk.put(ENTITY_CLASS, entityClass.getName());
if (log.isDebugEnabled()) {
log.debug("Loading MongoDB data for " + dbk);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
for (DBObject dbo : collection.find(dbk)) {
String key = (String) dbo.get(ENTITY_FIELD_NAME);
if (log.isDebugEnabled()) {
log.debug("Processing key: " + key);
}
if (!changeSet.getValues().containsKey(key)) {
String className = (String) dbo.get(ENTITY_FIELD_CLASS);
if (className == null) {
throw new DataIntegrityViolationException(
"Unble to convert property " + key
+ ": Invalid metadata, " + ENTITY_FIELD_CLASS + " not available");
}
Class<?> clazz = ClassUtils.resolveClassName(className, ClassUtils.getDefaultClassLoader());
Object value = mongoTemplate.getConverter().read(clazz, dbo);
if (log.isDebugEnabled()) {
log.debug("Adding to ChangeSet: " + key);
}
changeSet.set(key, value);
}
}
return null;
}
});
}
public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
log.debug("getPersistentId called on " + entity);
if (entityManagerFactory == null) {
throw new DataAccessResourceFailureException("EntityManagerFactory cannot be null");
}
Object o = entityManagerFactory.getPersistenceUnitUtil().getIdentifier(entity);
return o;
}
public Object persistState(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (cs == null) {
log.debug("Flush: changeset was null, nothing to flush.");
return 0L;
}
if (log.isDebugEnabled()) {
log.debug("Flush: changeset: " + cs.getValues());
}
String collName = getCollectionNameForEntity(entity.getClass());
DBCollection dbc = mongoTemplate.getCollection(collName);
if (dbc == null) {
dbc = mongoTemplate.createCollection(collName);
}
for (String key : cs.getValues().keySet()) {
if (key != null && !key.startsWith("_") && !key.equals(ChangeSetPersister.ID_KEY)) {
Object value = cs.getValues().get(key);
final DBObject dbQuery = new BasicDBObject();
dbQuery.put(ENTITY_ID, getPersistentId(entity, cs));
dbQuery.put(ENTITY_CLASS, entity.getClass().getName());
dbQuery.put(ENTITY_FIELD_NAME, key);
DBObject dbId = mongoTemplate.execute(collName,
new CollectionCallback<DBObject>() {
public DBObject doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
return collection.findOne(dbQuery);
}
});
if (value == null) {
if (log.isDebugEnabled()) {
log.debug("Flush: removing: " + dbQuery);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
collection.remove(dbQuery);
return null;
}
});
}
else {
final DBObject dbDoc = new BasicDBObject();
dbDoc.putAll(dbQuery);
if (log.isDebugEnabled()) {
log.debug("Flush: saving: " + dbQuery);
}
mongoTemplate.getConverter().write(value, dbDoc);
dbDoc.put(ENTITY_FIELD_CLASS, value.getClass().getName());
if (dbId != null) {
dbDoc.put("_id", dbId.get("_id"));
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
collection.save(dbDoc);
return null;
}
});
}
}
}
return 0L;
}
private String getCollectionNameForEntity(Class<? extends ChangeSetBacked> entityClass) {
return ClassUtils.getQualifiedName(entityClass);
}
}

View File

@@ -1,154 +0,0 @@
package org.springframework.data.document.persistence;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import org.junit.Assert;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.document.mongodb.MongoTemplate;
import org.springframework.data.document.persistence.test.Address;
import org.springframework.data.document.persistence.test.Person;
import org.springframework.data.document.persistence.test.Resume;
import org.springframework.test.annotation.Rollback;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.TransactionStatus;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.transaction.support.TransactionCallback;
import org.springframework.transaction.support.TransactionTemplate;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = "classpath:/META-INF/spring/applicationContext.xml")
public class CrossStoreMongoTests {
@Autowired
private MongoTemplate mongoTemplate;
private EntityManager entityManager;
@Autowired
private PlatformTransactionManager transactionManager;
@PersistenceContext
public void setEntityManager(EntityManager entityManager) {
this.entityManager = entityManager;
}
private void clearData(String collectionName) {
DBCollection col = this.mongoTemplate.getCollection(collectionName);
if (col != null) {
this.mongoTemplate.dropCollection(collectionName);
}
}
@Test
@Transactional
@Rollback(false)
public void testCreateJpaToMongoEntityRelationship() {
clearData(Person.class.getName());
Person p = new Person("Thomas", 20);
Address a = new Address(12, "Main St.", "Boston", "MA", "02101");
p.setAddress(a);
Resume r = new Resume();
r.addEducation("Skanstulls High School, 1975");
r.addEducation("Univ. of Stockholm, 1980");
r.addJob("DiMark, DBA, 1990-2000");
r.addJob("VMware, Developer, 2007-");
p.setResume(r);
p.setId(1L);
entityManager.persist(p);
}
@Test
@Transactional
@Rollback(false)
public void testReadJpaToMongoEntityRelationship() {
Person found = entityManager.find(Person.class, 1L);
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found.getResume());
Assert.assertEquals("DiMark, DBA, 1990-2000" + "; "
+ "VMware, Developer, 2007-", found.getResume().getJobs());
found.getResume().addJob("SpringDeveloper.com, Consultant, 2005-2006");
found.setAge(44);
}
@Test
@Transactional
@Rollback(false)
public void testUpdatedJpaToMongoEntityRelationship() {
Person found = entityManager.find(Person.class, 1L);
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found.getResume());
Assert.assertEquals("DiMark, DBA, 1990-2000" + "; "
+ "VMware, Developer, 2007-" + "; "
+ "SpringDeveloper.com, Consultant, 2005-2006", found.getResume().getJobs());
}
@Test
public void testMergeJpaEntityWithMongoDocument() {
TransactionTemplate txTemplate = new TransactionTemplate(transactionManager);
final Person detached = entityManager.find(Person.class, 1L);
detached.getResume().addJob("TargetRx, Developer, 2000-2005");
Person merged = txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
return entityManager.merge(detached);
}
});
Assert.assertTrue(detached.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
Assert.assertTrue(merged.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
final Person updated = entityManager.find(Person.class, 1L);
Assert.assertTrue(updated.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
}
@Test
public void testRemoveJpaEntityWithMongoDocument() {
TransactionTemplate txTemplate = new TransactionTemplate(transactionManager);
txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
Person p2 = new Person("Thomas", 20);
Resume r2 = new Resume();
r2.addEducation("Skanstulls High School, 1975");
r2.addJob("DiMark, DBA, 1990-2000");
p2.setResume(r2);
p2.setId(2L);
entityManager.persist(p2);
Person p3 = new Person("Thomas", 20);
Resume r3 = new Resume();
r3.addEducation("Univ. of Stockholm, 1980");
r3.addJob("VMware, Developer, 2007-");
p3.setResume(r3);
p3.setId(3L);
entityManager.persist(p3);
return null;
}
});
txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
final Person found2 = entityManager.find(Person.class, 2L);
entityManager.remove(found2);
return null;
}
});
boolean weFound3 = false;
for (DBObject dbo : this.mongoTemplate.getCollection(Person.class.getName()).find()) {
Assert.assertTrue(!dbo.get("_entity_id").equals(2L));
if (dbo.get("_entity_id").equals(3L)) {
weFound3 = true;
}
}
Assert.assertTrue(weFound3);
}
}

View File

@@ -1,87 +0,0 @@
package org.springframework.data.document.persistence.test;
import javax.persistence.Entity;
import javax.persistence.Id;
import org.springframework.data.document.annotation.RelatedDocument;
@Entity
public class Person {
@Id
Long id;
private String name;
private int age;
private java.util.Date birthDate;
@RelatedDocument
private Address address;
@RelatedDocument
private Resume resume;
public Person() {
}
public Person(String name, int age) {
this.name = name;
this.age = age;
this.birthDate = new java.util.Date();
}
public void birthday() {
++age;
}
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public int getAge() {
return age;
}
public void setAge(int age) {
this.age = age;
}
public java.util.Date getBirthDate() {
return birthDate;
}
public void setBirthDate(java.util.Date birthDate) {
this.birthDate = birthDate;
}
public Resume getResume() {
return resume;
}
public void setResume(Resume resume) {
this.resume = resume;
}
public Address getAddress() {
return address;
}
public void setAddress(Address address) {
this.address = address;
}
}

View File

@@ -1,48 +0,0 @@
package org.springframework.data.document.persistence.test;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.bson.types.ObjectId;
import org.springframework.data.annotation.Id;
import org.springframework.data.document.mongodb.mapping.Document;
@Document
public class Resume {
private static final Log LOGGER = LogFactory.getLog(Resume.class);
@Id
private ObjectId id;
private String education = "";
private String jobs = "";
public String getId() {
return id.toString();
}
public String getEducation() {
return education;
}
public void addEducation(String education) {
LOGGER.debug("Adding education " + education);
this.education = this.education + (this.education.length() > 0 ? "; " : "") + education;
}
public String getJobs() {
return jobs;
}
public void addJob(String job) {
LOGGER.debug("Adding job " + job);
this.jobs = this.jobs + (this.jobs.length() > 0 ? "; " : "") + job;
}
@Override
public String toString() {
return "Resume [education=" + education + ", jobs=" + jobs + "]";
}
}

View File

@@ -0,0 +1,169 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import org.junit.Assert;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.crossstore.test.Address;
import org.springframework.data.mongodb.crossstore.test.Person;
import org.springframework.data.mongodb.crossstore.test.Resume;
import org.springframework.test.annotation.Rollback;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.TransactionStatus;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.transaction.support.TransactionCallback;
import org.springframework.transaction.support.TransactionTemplate;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = "classpath:/META-INF/spring/applicationContext.xml")
public class CrossStoreMongoTests {
@Autowired
private MongoTemplate mongoTemplate;
private EntityManager entityManager;
@Autowired
private PlatformTransactionManager transactionManager;
@PersistenceContext
public void setEntityManager(EntityManager entityManager) {
this.entityManager = entityManager;
}
private void clearData(String collectionName) {
DBCollection col = this.mongoTemplate.getCollection(collectionName);
if (col != null) {
this.mongoTemplate.dropCollection(collectionName);
}
}
@Test
@Transactional
@Rollback(false)
public void testCreateJpaToMongoEntityRelationship() {
clearData(Person.class.getName());
Person p = new Person("Thomas", 20);
Address a = new Address(12, "MAin St.", "Boston", "MA", "02101");
p.setAddress(a);
Resume r = new Resume();
r.addEducation("Skanstulls High School, 1975");
r.addEducation("Univ. of Stockholm, 1980");
r.addJob("DiMark, DBA, 1990-2000");
r.addJob("VMware, Developer, 2007-");
p.setResume(r);
p.setId(1L);
entityManager.persist(p);
}
@Test
@Transactional
@Rollback(false)
public void testReadJpaToMongoEntityRelationship() {
Person found = entityManager.find(Person.class, 1L);
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found.getResume());
Assert.assertEquals("DiMark, DBA, 1990-2000" + "; "
+ "VMware, Developer, 2007-", found.getResume().getJobs());
found.getResume().addJob("SpringDeveloper.com, Consultant, 2005-2006");
found.setAge(44);
}
@Test
@Transactional
@Rollback(false)
public void testUpdatedJpaToMongoEntityRelationship() {
Person found = entityManager.find(Person.class, 1L);
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found.getResume());
Assert.assertEquals("DiMark, DBA, 1990-2000" + "; "
+ "VMware, Developer, 2007-" + "; "
+ "SpringDeveloper.com, Consultant, 2005-2006", found.getResume().getJobs());
}
@Test
public void testMergeJpaEntityWithMongoDocument() {
TransactionTemplate txTemplate = new TransactionTemplate(transactionManager);
final Person detached = entityManager.find(Person.class, 1L);
detached.getResume().addJob("TargetRx, Developer, 2000-2005");
Person merged = txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
return entityManager.merge(detached);
}
});
Assert.assertTrue(detached.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
Assert.assertTrue(merged.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
final Person updated = entityManager.find(Person.class, 1L);
Assert.assertTrue(updated.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
}
@Test
public void testRemoveJpaEntityWithMongoDocument() {
TransactionTemplate txTemplate = new TransactionTemplate(transactionManager);
txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
Person p2 = new Person("Thomas", 20);
Resume r2 = new Resume();
r2.addEducation("Skanstulls High School, 1975");
r2.addJob("DiMark, DBA, 1990-2000");
p2.setResume(r2);
p2.setId(2L);
entityManager.persist(p2);
Person p3 = new Person("Thomas", 20);
Resume r3 = new Resume();
r3.addEducation("Univ. of Stockholm, 1980");
r3.addJob("VMware, Developer, 2007-");
p3.setResume(r3);
p3.setId(3L);
entityManager.persist(p3);
return null;
}
});
txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
final Person found2 = entityManager.find(Person.class, 2L);
entityManager.remove(found2);
return null;
}
});
boolean weFound3 = false;
for (DBObject dbo : this.mongoTemplate.getCollection(Person.class.getName()).find()) {
Assert.assertTrue(!dbo.get("_entity_id").equals(2L));
if (dbo.get("_entity_id").equals(3L)) {
weFound3 = true;
}
}
Assert.assertTrue(weFound3);
}
}

View File

@@ -1,13 +1,28 @@
-package org.springframework.data.document.persistence.test;
+/*
+ * Copyright 2011 the original author or authors.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.springframework.data.mongodb.crossstore.test;
public class Address {
private Integer streetNumber;
private String streetName;
private String city;
private String state;
private String zip;
public Address(Integer streetNumber, String streetName, String city,
String state, String zip) {
super();
@@ -17,7 +32,7 @@ public class Address {
this.state = state;
this.zip = zip;
}
public Integer getStreetNumber() {
return streetNumber;
}
@@ -48,7 +63,7 @@ public class Address {
public void setZip(String zip) {
this.zip = zip;
}
}

View File

@@ -0,0 +1,102 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore.test;
import javax.persistence.Entity;
import javax.persistence.Id;
import org.springframework.data.mongodb.crossstore.RelatedDocument;
@Entity
public class Person {
@Id
Long id;
private String name;
private int age;
private java.util.Date birthDate;
@RelatedDocument
private Address address;
@RelatedDocument
private Resume resume;
public Person() {
}
public Person(String name, int age) {
this.name = name;
this.age = age;
this.birthDate = new java.util.Date();
}
public void birthday() {
++age;
}
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public int getAge() {
return age;
}
public void setAge(int age) {
this.age = age;
}
public java.util.Date getBirthDate() {
return birthDate;
}
public void setBirthDate(java.util.Date birthDate) {
this.birthDate = birthDate;
}
public Resume getResume() {
return resume;
}
public void setResume(Resume resume) {
this.resume = resume;
}
public Address getAddress() {
return address;
}
public void setAddress(Address address) {
this.address = address;
}
}

View File

@@ -0,0 +1,63 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore.test;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.bson.types.ObjectId;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
@Document
public class Resume {
private static final Log LOGGER = LogFactory.getLog(Resume.class);
@Id
private ObjectId id;
private String education = "";
private String jobs = "";
public String getId() {
return id.toString();
}
public String getEducation() {
return education;
}
public void addEducation(String education) {
LOGGER.debug("Adding education " + education);
this.education = this.education + (this.education.length() > 0 ? "; " : "") + education;
}
public String getJobs() {
return jobs;
}
public void addJob(String job) {
LOGGER.debug("Adding job " + job);
this.jobs = this.jobs + (this.jobs.length() > 0 ? "; " : "") + job;
}
@Override
public String toString() {
return "Resume [education=" + education + ", jobs=" + jobs + "]";
}
}

View File

@@ -4,7 +4,7 @@
xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">
<persistence-unit name="test" transaction-type="RESOURCE_LOCAL">
<provider>org.hibernate.ejb.HibernatePersistence</provider>
-<class>org.springframework.data.document.persistence.test.Person</class>
+<class>org.springframework.data.mongodb.crossstore.test.Person</class>
<properties>
<property name="hibernate.dialect" value="org.hibernate.dialect.HSQLDialect"/>
<!--value='create' to build a new database on each run; value='update' to modify an existing database; value='create-drop' means the same as 'create' but also drops tables when Hibernate closes; value='validate' makes no changes to the database-->

View File

@@ -13,34 +13,37 @@
<context:spring-configured/>
<context:component-scan base-package="org.springframework.persistence.test">
<context:component-scan base-package="org.springframework.persistence.mongodb.test">
<context:exclude-filter expression="org.springframework.stereotype.Controller" type="annotation"/>
</context:component-scan>
<mongo:mapping-converter/>
<!-- Mongo config -->
<bean id="mongo" class="org.springframework.data.document.mongodb.MongoFactoryBean">
<bean id="mongo" class="org.springframework.data.mongodb.core.MongoFactoryBean">
<property name="host" value="localhost"/>
<property name="port" value="27017"/>
</bean>
<bean id="mongoTemplate" class="org.springframework.data.document.mongodb.MongoTemplate">
<constructor-arg name="mongo" ref="mongo"/>
<constructor-arg name="databaseName" value="test"/>
<constructor-arg name="defaultCollectionName" value="cross-store"/>
<bean id="mongoDbFactory" class="org.springframework.data.mongodb.core.SimpleMongoDbFactory">
<constructor-arg name="mongo" ref="mongo"/>
<constructor-arg name="databaseName" value="database"/>
</bean>
<bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg name="mongoDbFactory" ref="mongoDbFactory"/>
<constructor-arg name="mongoConverter" ref="mappingConverter"/>
</bean>
<bean class="org.springframework.data.document.mongodb.MongoExceptionTranslator"/>
<bean class="org.springframework.data.mongodb.core.MongoExceptionTranslator"/>
<!-- Mongo aspect config -->
<bean class="org.springframework.data.persistence.document.mongo.MongoDocumentBacking"
<bean class="org.springframework.data.mongodb.crossstore.MongoDocumentBacking"
factory-method="aspectOf">
<property name="changeSetPersister" ref="mongoChangeSetPersister"/>
</bean>
<bean id="mongoChangeSetPersister"
class="org.springframework.data.persistence.document.mongo.MongoChangeSetPersister">
class="org.springframework.data.mongodb.crossstore.MongoChangeSetPersister">
<property name="mongoTemplate" ref="mongoTemplate"/>
<property name="entityManagerFactory" ref="entityManagerFactory"/>
</bean>
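
Editor's note (not part of the diff): the new wiring above, expressed programmatically for orientation. The class names and constructor arguments mirror the bean definitions in the XML; the single-argument MongoTemplate constructor, which falls back to a default mapping converter instead of the mappingConverter bean, is an assumption of this sketch.

// assumed imports: com.mongodb.Mongo,
// org.springframework.data.mongodb.core.SimpleMongoDbFactory,
// org.springframework.data.mongodb.core.MongoTemplate
public static MongoTemplate mongoTemplate() throws Exception {
  Mongo mongo = new Mongo("localhost", 27017);                                        // <bean id="mongo" .../>
  SimpleMongoDbFactory mongoDbFactory = new SimpleMongoDbFactory(mongo, "database");  // <bean id="mongoDbFactory" .../>
  return new MongoTemplate(mongoDbFactory);                                           // <bean id="mongoTemplate" .../>
}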

View File

@@ -2,7 +2,7 @@ log4j.rootCategory=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
-log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - <%m>%n
+log4j.appender.stdout.layout.ConversionPattern=%d{ABSOLUTE} %5p %40.40c:%4L - %m%n
log4j.category.org.springframework=INFO
log4j.category.org.springframework.data=TRACE

View File

@@ -5,7 +5,7 @@ and connects directly to the MongoDB server using the driver. It has no dependen
To use it, configure a host, port, (optionally) applicationId, and database property in your Log4J configuration:
-log4j.appender.stdout=org.springframework.data.document.mongodb.log4j.MongoLog4jAppender
+log4j.appender.stdout=org.springframework.data.mongodb.log4j.MongoLog4jAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - <%m>%n
log4j.appender.stdout.host = localhost
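
Editor's note (not part of the diff): the hunk above shows only the appender class and the host setting. A fuller configuration sketch using the port, database and applicationId settings named in the prose; the key spellings beyond host and the example values are assumptions.

log4j.appender.stdout.port = 27017
log4j.appender.stdout.database = logs
log4j.appender.stdout.applicationId = my.application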
@@ -32,7 +32,7 @@ An example log entry might look like:
{
"_id" : ObjectId("4d89341a8ef397e06940d5cd"),
"applicationId" : "my.application",
"name" : "org.springframework.data.document.mongodb.log4j.AppenderTest",
"name" : "org.springframework.data.mongodb.log4j.AppenderTest",
"level" : "DEBUG",
"timestamp" : ISODate("2011-03-23T16:53:46.778Z"),
"properties" : {

View File

@@ -4,12 +4,11 @@
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
-<artifactId>spring-data-document-parent</artifactId>
-<version>1.0.0.M2</version>
-<relativePath>../spring-data-document-parent/pom.xml</relativePath>
+<artifactId>spring-data-mongodb-parent</artifactId>
+<version>1.0.0.RC1</version>
+<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-log4j</artifactId>
<packaging>jar</packaging>
<name>Spring Data MongoDB Log4J Appender</name>
<properties>

View File

@@ -14,7 +14,7 @@
* limitations under the License.
*/
-package org.springframework.data.document.mongodb.log4j;
+package org.springframework.data.mongodb.log4j;
import java.net.UnknownHostException;
import java.util.Arrays;

View File

@@ -14,7 +14,7 @@
* limitations under the License.
*/
-package org.springframework.data.document.mongodb.log4j;
+package org.springframework.data.mongodb.log4j;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;

View File

@@ -1,6 +1,6 @@
log4j.rootCategory=INFO, stdout
-log4j.appender.stdout=org.springframework.data.document.mongodb.log4j.MongoLog4jAppender
+log4j.appender.stdout=org.springframework.data.mongodb.log4j.MongoLog4jAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - <%m>%n
log4j.appender.stdout.host = localhost

View File

@@ -3,10 +3,10 @@
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
-<artifactId>spring-data-document-parent</artifactId>
-<name>Spring Data Document Parent</name>
-<url>http://www.springsource.org/spring-data/data-document</url>
-<version>1.0.0.M2</version>
+<artifactId>spring-data-mongodb-parent</artifactId>
+<name>Spring Data MongoDB Parent</name>
+<url>http://www.springsource.org/spring-data/mongodb</url>
+<version>1.0.0.RC1</version>
<packaging>pom</packaging>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
@@ -16,8 +16,10 @@
<org.mockito.version>1.8.4</org.mockito.version>
<org.slf4j.version>1.5.10</org.slf4j.version>
<org.codehaus.jackson.version>1.6.1</org.codehaus.jackson.version>
-<org.springframework.version>3.0.5.RELEASE</org.springframework.version>
-<data.commons.version>1.0.0.RC1</data.commons.version>
+<org.springframework.version.30>3.0.6.RELEASE</org.springframework.version.30>
+<org.springframework.version.40>4.0.0.RELEASE</org.springframework.version.40>
+<org.springframework.version.range>[${org.springframework.version.30}, ${org.springframework.version.40})</org.springframework.version.range>
+<data.commons.version>1.2.0.BUILD-SNAPSHOT</data.commons.version>
<aspectj.version>1.6.11.RELEASE</aspectj.version>
</properties>
<profiles>
@@ -39,15 +41,15 @@
<distributionManagement>
<site>
<id>spring-site-staging</id>
-<url>file:///${java.io.tmpdir}/spring-data/data-document/docs</url>
+<url>file:///${java.io.tmpdir}/spring-data/mongodb/docs</url>
</site>
<repository>
<id>spring-milestone-staging</id>
-<url>file:///${java.io.tmpdir}/spring-data/data-document/milestone</url>
+<url>file:///${java.io.tmpdir}/spring-data/mongodb/milestone</url>
</repository>
<snapshotRepository>
<id>spring-snapshot-staging</id>
-<url>file:///${java.io.tmpdir}/spring-data/data-document/snapshot</url>
+<url>file:///${java.io.tmpdir}/spring-data/mongodb/snapshot</url>
</snapshotRepository>
</distributionManagement>
</profile>
@@ -63,7 +65,7 @@
<site>
<id>static.springframework.org</id>
<url>
-scp://static.springframework.org/var/www/domains/springframework.org/static/htdocs/spring-data/data-document/docs/${project.version}
+scp://static.springframework.org/var/www/domains/springframework.org/static/htdocs/spring-data/mongodb/docs/${project.version}
</url>
</site>
<repository>
@@ -92,42 +94,42 @@
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-aop</artifactId>
-<version>${org.springframework.version}</version>
+<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
-<version>${org.springframework.version}</version>
+<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
-<version>${org.springframework.version}</version>
+<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
-<version>${org.springframework.version}</version>
+<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-orm</artifactId>
-<version>${org.springframework.version}</version>
+<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-web</artifactId>
-<version>${org.springframework.version}</version>
+<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-expression</artifactId>
-<version>${org.springframework.version}</version>
+<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
-<version>${org.springframework.version}</version>
+<version>${org.springframework.version.range}</version>
<scope>test</scope>
</dependency>
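
Editor's note (not part of the diff): with the properties introduced above, ${org.springframework.version.range} resolves to a Maven version range, so each Spring Framework dependency in this section effectively becomes:

<dependency>
  <groupId>org.springframework</groupId>
  <artifactId>spring-core</artifactId>
  <version>[3.0.6.RELEASE, 4.0.0.RELEASE)</version>
</dependency>

i.e. any Spring release from 3.0.6 up to, but excluding, 4.0.0 satisfies the build.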
@@ -142,11 +144,6 @@
<artifactId>spring-data-commons-aspects</artifactId>
<version>${data.commons.version}</version>
</dependency>
-<dependency>
-<groupId>org.springframework.data</groupId>
-<artifactId>spring-data-document-core</artifactId>
-<version>${project.version}</version>
-</dependency>
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-couchdb</artifactId>
@@ -294,10 +291,11 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.3.2</version>
<configuration>
-<source>1.5</source>
-<target>1.5</target>
+<compilerArgument>-Xlint:all</compilerArgument>
+<compilerArgument>-Xlint:-path</compilerArgument>
<showWarnings>true</showWarnings>
<showDeprecation>false</showDeprecation>
</configuration>
@@ -313,19 +311,18 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.8</version>
<configuration>
<useFile>false</useFile>
<includes>
<include>**/*Tests.java</include>
</includes>
<excludes>
<exclude>**/Abstract*.java</exclude>
</excludes>
<junitArtifactName>junit:junit</junitArtifactName>
</configuration>
</plugin>
<plugin>
<artifactId>maven-source-plugin</artifactId>
<version>2.1.2</version>
<executions>
<execution>
<id>attach-sources</id>
@@ -377,22 +374,27 @@
<name>SpringSource Maven Repository</name>
<url>http://repository.springsource.com/maven/bundles/release</url>
</pluginRepository>
+<pluginRepository>
+<id>repository.springframework.maven.release</id>
+<name>Spring Framework Maven Release Repository</name>
+<url>http://repo.springsource.org/release</url>
+</pluginRepository>
</pluginRepositories>
<repositories>
<repository>
<id>repository.springframework.maven.release</id>
<name>Spring Framework Maven Release Repository</name>
-<url>http://maven.springframework.org/release</url>
+<url>http://repo.springsource.org/release</url>
</repository>
<repository>
<id>repository.springframework.maven.milestone</id>
<name>Spring Framework Maven Milestone Repository</name>
-<url>http://maven.springframework.org/milestone</url>
+<url>http://repo.springsource.org/milestone</url>
</repository>
<repository>
<id>repository.springframework.maven.snapshot</id>
<name>Spring Framework Maven Snapshot Repository</name>
-<url>http://maven.springframework.org/snapshot</url>
+<url>http://repo.springsource.org/snapshot</url>
</repository>
</repositories>
<reporting>

View File

@@ -0,0 +1,140 @@
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<context version="7.0.3.1152">
<scope name="spring-data-mongodb" type="Project">
<element name="Filter" type="TypeFilterReferenceOverridden">
<element name="org.springframework.data.mongodb.**" type="IncludeTypePattern"/>
</element>
<architecture>
<element name="Config" type="Layer">
<element name="Assignment" type="TypeFilter">
<element name="**.config.**" type="WeakTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Monitoring"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories"/>
</element>
<element name="Repositories" type="Layer">
<element name="Assignment" type="TypeFilter">
<element name="**.repository.**" type="IncludeTypePattern"/>
</element>
<element name="API" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.repository.*" type="IncludeTypePattern"/>
</element>
</element>
<element name="Query" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.query.**" type="IncludeTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API"/>
</element>
<element name="Implementation" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.support.**" type="IncludeTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Query"/>
</element>
<element name="Config" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.config.**" type="IncludeTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Implementation"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core"/>
</element>
<element name="Monitoring" type="Layer">
<element name="Assignment" type="TypeFilter">
<element name="**.monitor.**" type="IncludeTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core"/>
</element>
<element name="Core" type="Layer">
<element name="Assignment" type="TypeFilter">
<element name="**.core.**" type="IncludeTypePattern"/>
</element>
<element name="Mapping" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.mapping.**" type="IncludeTypePattern"/>
</element>
</element>
<element name="Geospatial" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.geo.**" type="IncludeTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping"/>
</element>
<element name="Query" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.query.**" type="IncludeTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial"/>
</element>
<element name="Index" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.index.**" type="IncludeTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query"/>
</element>
<element name="Core" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.core.**" type="WeakTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Index"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query"/>
</element>
</element>
</architecture>
<workspace>
<element name="src/main/java" type="JavaRootDirectory">
<reference name="Project|spring-data-mongodb::BuildUnit|spring-data-mongodb"/>
</element>
<element name="target/classes" type="JavaRootDirectory">
<reference name="Project|spring-data-mongodb::BuildUnit|spring-data-mongodb"/>
</element>
</workspace>
<physical>
<element name="spring-data-mongodb" type="BuildUnit"/>
</physical>
</scope>
<scope name="External" type="External">
<element name="Filter" type="TypeFilter">
<element name="**" type="IncludeTypePattern"/>
<element name="java.**" type="ExcludeTypePattern"/>
<element name="javax.**" type="ExcludeTypePattern"/>
</element>
<architecture>
<element name="Spring" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="org.springframework.**" type="IncludeTypePattern"/>
<element name="org.springframework.data.**" type="ExcludeTypePattern"/>
</element>
</element>
<element name="Spring Data Core" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="org.springframework.data.**" type="IncludeTypePattern"/>
</element>
</element>
<element name="Mongo Java Driver" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="com.mongodb.**" type="IncludeTypePattern"/>
<element name="org.bson.**" type="IncludeTypePattern"/>
</element>
</element>
<element name="Querydsl" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="com.mysema.query.**" type="IncludeTypePattern"/>
</element>
</element>
</architecture>
</scope>
<scope name="Global" type="Global">
<element name="Configuration" type="Configuration"/>
<element name="Filter" type="TypeFilter">
<element name="**" type="IncludeTypePattern"/>
</element>
</scope>
</context>

View File

@@ -0,0 +1,291 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<profiles version="12">
<profile kind="CodeFormatterProfile" name="Spring Data" version="12">
<setting id="org.eclipse.jdt.core.formatter.comment.insert_new_line_before_root_tags" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.disabling_tag" value="@formatter:off"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_annotation" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_type_arguments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_anonymous_type_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_case" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_brace_in_array_initializer" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.new_lines_at_block_boundaries" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_annotation_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_before_closing_brace_in_array_initializer" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_annotation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_field" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_while" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.use_on_off_tags" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_annotation_type_member_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_before_else_in_if_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_prefix_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.keep_else_statement_on_same_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_ellipsis" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.insert_new_line_for_parameter" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_annotation_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_breaks_compare_to_cases" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_at_in_annotation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_multiple_fields" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_expressions_in_array_initializer" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_conditional_expression" value="80"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_for" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_binary_operator" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_question_in_wildcard" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_array_initializer" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_before_finally_in_try_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_local_variable" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_before_catch_in_try_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_while" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_after_package" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_type_parameters" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.continuation_indentation" value="2"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_postfix_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_method_invocation" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_angle_bracket_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_superinterfaces" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_new_chunk" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_binary_operator" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_package" value="0"/>
<setting id="org.eclipse.jdt.core.compiler.source" value="1.7"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_enum_constant_arguments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_constructor_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_line_comments" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_angle_bracket_in_type_arguments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_enum_declarations" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.join_wrapped_lines" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_block" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_explicit_constructor_call" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_method_invocation_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_member_type" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.align_type_members_on_columns" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_for" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_method_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_selector_in_method_invocation" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_switch" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_unary_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_case" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.indent_parameter_description" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_method_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_switch" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_enum_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_angle_bracket_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.clear_blank_lines_in_block_comment" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.lineSplit" value="120"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_if" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_brackets_in_array_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_parenthesized_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_explicitconstructorcall_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_constructor_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_first_class_body_declaration" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_method" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indentation.size" value="2"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_method_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.enabling_tag" value="@formatter:on"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_superclass_in_type_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_assignment" value="0"/>
<setting id="org.eclipse.jdt.core.compiler.problem.assertIdentifier" value="error"/>
<setting id="org.eclipse.jdt.core.formatter.tabulation.char" value="tab"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_constructor_declaration_parameters" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_semicolon_in_try_resources" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_prefix_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_statements_compare_to_body" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_method" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.wrap_outer_expressions_when_nested" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.format_guardian_clause_on_one_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_for" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_cast" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_parameters_in_constructor_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_labeled_statement" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_annotation_type_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_method_body" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_method_declaration" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_method_invocation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_try" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_bracket_in_array_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_enum_constant" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_annotation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_at_in_annotation_type_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_method_declaration_throws" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_if" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_switch" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_method_declaration_throws" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_parenthesized_expression_in_return" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_annotation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_question_in_conditional" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_question_in_wildcard" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_try" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_bracket_in_array_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.preserve_white_space_between_code_and_line_comments" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_parenthesized_expression_in_throw" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.compiler.problem.enumIdentifier" value="error"/>
<setting id="org.eclipse.jdt.core.formatter.indent_switchstatements_compare_to_switch" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_ellipsis" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_block" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_for_inits" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_method_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.compact_else_if" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.wrap_before_or_operator_multicatch" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_array_initializer" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_for_increments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.format_line_comment_starting_on_first_column" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_bracket_in_array_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_field" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_enum_constant" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.comment.indent_root_tags" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_enum_declarations" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_union_type_in_multicatch" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_explicitconstructorcall_arguments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_switch" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_method_declaration_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_superinterfaces" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.tabulation.size" value="2"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_bracket_in_array_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_opening_brace_in_array_initializer" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_brace_in_block" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_bracket_in_array_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_constant" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_angle_bracket_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_constructor_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_constructor_declaration_throws" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_if" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.clear_blank_lines_in_javadoc_comment" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_throws_clause_in_constructor_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_assignment_operator" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_assignment_operator" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_empty_lines" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_synchronized" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_paren_in_cast" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_method_declaration_parameters" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_block_in_case" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.number_of_empty_lines_to_preserve" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_method_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_catch" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_constructor_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_method_invocation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_bracket_in_array_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_qualified_allocation_expression" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_and_in_type_parameter" value="insert"/>
<setting id="org.eclipse.jdt.core.compiler.compliance" value="1.7"/>
<setting id="org.eclipse.jdt.core.formatter.continuation_indentation_for_array_initializer" value="2"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_brackets_in_array_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_at_in_annotation_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_allocation_expression" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_cast" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_unary_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_angle_bracket_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_anonymous_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.keep_empty_array_initializer_on_one_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.keep_imple_if_on_one_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_constructor_declaration_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_angle_bracket_in_type_parameters" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_at_end_of_file_if_missing" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_for" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_labeled_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_superinterfaces_in_type_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_binary_expression" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_enum_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_type" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_while" value="do not insert"/>
<setting id="org.eclipse.jdt.core.compiler.codegen.inlineJsrBytecode" value="enabled"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_try" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.put_empty_statement_on_new_line" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_label" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_parameter" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_angle_bracket_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_method_invocation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_before_while_in_do_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_enum_constant" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_javadoc_comments" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.comment.line_length" value="120"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_package" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_between_import_groups" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_enum_constant_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_semicolon" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_constructor_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.number_of_blank_lines_at_beginning_of_method_body" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_conditional" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_type_header" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_annotation_type_member_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.wrap_before_binary_operator" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_enum_declaration_header" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_between_type_declarations" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_synchronized" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_statements_compare_to_block" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_superinterfaces_in_enum_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.join_lines_in_comments" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_question_in_conditional" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_multiple_field_declarations" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_compact_if" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_for_inits" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_switchstatements_compare_to_cases" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_array_initializer" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_default" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_and_in_type_parameter" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_constructor_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_imports" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_assert" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_html" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_throws_clause_in_method_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_angle_bracket_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_bracket_in_array_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_anonymous_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_conditional" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_angle_bracket_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_for" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_postfix_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_source_code" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_synchronized" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_allocation_expression" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_constructor_declaration_throws" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_parameters_in_method_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_brace_in_array_initializer" value="insert"/>
<setting id="org.eclipse.jdt.core.compiler.codegen.targetPlatform" value="1.7"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_resources_in_try" value="80"/>
<setting id="org.eclipse.jdt.core.formatter.use_tabs_only_for_leading_indentations" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_annotation" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_header" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_block_comments" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_enum_constants" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_block" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_annotation_declaration_header" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_parenthesized_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_parenthesized_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_catch" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_multiple_local_declarations" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_switch" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_for_increments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_method_invocation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_assert" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_type_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_array_initializer" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_braces_in_array_initializer" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_method_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_semicolon_in_for" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_catch" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_angle_bracket_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_multiple_field_declarations" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_annotation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_parameterized_type_reference" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_method_invocation_arguments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.new_lines_at_javadoc_boundaries" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_after_imports" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_multiple_local_declarations" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_enum_constant_header" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_semicolon_in_for" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.never_indent_line_comments_on_first_column" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_semicolon_in_try_resources" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_angle_bracket_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.never_indent_block_comments_on_first_column" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.keep_then_statement_on_same_line" value="false"/>
</profile>
</profiles>

View File

@@ -4,16 +4,16 @@
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-document-parent</artifactId>
<version>1.0.0.M2</version>
<relativePath>../spring-data-document-parent/pom.xml</relativePath>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.0.0.RC1</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb</artifactId>
<packaging>jar</packaging>
<name>Spring Data MongoDB Support</name>
<name>Spring Data MongoDB</name>
<properties>
<mongo.version>2.4</mongo.version>
<mongo.version>2.7.1</mongo.version>
<querydsl.version>2.2.5</querydsl.version>
</properties>
<dependencies>
@@ -23,10 +23,6 @@
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-expression</artifactId>
@@ -38,11 +34,6 @@
</dependency>
<!-- Spring Data -->
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-document-core</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-commons-core</artifactId>
@@ -65,7 +56,7 @@
<dependency>
<groupId>com.mysema.querydsl</groupId>
<artifactId>querydsl-mongodb</artifactId>
<version>2.1.1</version>
<version>${querydsl.version}</version>
<optional>true</optional>
<exclusions>
<exclusion>
@@ -78,7 +69,7 @@
<dependency>
<groupId>com.mysema.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>2.1.1</version>
<version>${querydsl.version}</version>
<scope>provided</scope>
</dependency>
@@ -88,20 +79,6 @@
<optional>true</optional>
</dependency>
<dependency>
<groupId>javax.annotation</groupId>
<artifactId>jsr250-api</artifactId>
<optional>true</optional>
</dependency>
<!--
<dependency>
<groupId>javax.persistence</groupId>
<artifactId>persistence-api</artifactId>
<version>1.0</version>
</dependency>
-->
<!-- Test dependencies -->
<dependency>
<groupId>org.mockito</groupId>
@@ -161,7 +138,7 @@
<plugin>
<groupId>com.mysema.maven</groupId>
<artifactId>maven-apt-plugin</artifactId>
<version>1.0</version>
<version>1.0.2</version>
<executions>
<execution>
<phase>generate-test-sources</phase>
@@ -169,8 +146,8 @@
<goal>test-process</goal>
</goals>
<configuration>
<outputDirectory>target/generated-sources/test-annotations</outputDirectory>
<processor>org.springframework.data.document.mongodb.repository.MongoAnnotationProcessor</processor>
<outputDirectory>target/generated-test-sources</outputDirectory>
<processor>org.springframework.data.mongodb.repository.support.MongoAnnotationProcessor</processor>
</configuration>
</execution>
</executions>

View File

@@ -1,71 +0,0 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb;
/**
* Provides a simple wrapper to encapsulate the variety of settings you can use when creating a collection.
*
* @author Thomas Risberg
*/
public class CollectionOptions {
private Integer maxDocuments;
private Integer size;
private Boolean capped;
/**
* Constructs a new <code>CollectionOptions</code> instance.
*
* @param size the collection size in bytes, this data space is preallocated
* @param maxDocuments the maximum number of documents in the collection.
* @param capped true to create a "capped" collection (fixed size with auto-FIFO behavior
* based on insertion order), false otherwise.
*/
public CollectionOptions(Integer size, Integer maxDocuments, Boolean capped) {
super();
this.maxDocuments = maxDocuments;
this.size = size;
this.capped = capped;
}
public Integer getMaxDocuments() {
return maxDocuments;
}
public void setMaxDocuments(Integer maxDocuments) {
this.maxDocuments = maxDocuments;
}
public Integer getSize() {
return size;
}
public void setSize(Integer size) {
this.size = size;
}
public Boolean getCapped() {
return capped;
}
public void setCapped(Boolean capped) {
this.capped = capped;
}
}
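
A brief usage sketch for the CollectionOptions holder removed above, assuming a configured MongoOperations instance (the interface appears later in this diff); the helper class and the "logs" collection name are illustrative, not part of the diff:

import org.springframework.data.document.mongodb.CollectionOptions;
import org.springframework.data.document.mongodb.MongoOperations;

import com.mongodb.DBCollection;

public class CappedCollectionSetup {

    // Creates a 1 MB capped collection limited to 5000 documents, or returns it if it already exists.
    public DBCollection logCollection(MongoOperations mongoOps) {
        CollectionOptions options = new CollectionOptions(1024 * 1024, 5000, true); // size (bytes), maxDocuments, capped
        if (!mongoOps.collectionExists("logs")) {
            return mongoOps.createCollection("logs", options);
        }
        return mongoOps.getCollection("logs");
    }
}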

View File

@@ -1,70 +0,0 @@
package org.springframework.data.document.mongodb;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import com.mongodb.DB;
import org.springframework.transaction.support.ResourceHolderSupport;
import org.springframework.util.Assert;
class DbHolder extends ResourceHolderSupport {
private static final Object DEFAULT_KEY = new Object();
private final Map<Object, DB> dbMap = new ConcurrentHashMap<Object, DB>();
public DbHolder(DB db) {
addDB(db);
}
public DbHolder(Object key, DB db) {
addDB(key, db);
}
public DB getDB() {
return getDB(DEFAULT_KEY);
}
public DB getDB(Object key) {
return this.dbMap.get(key);
}
public DB getAnyDB() {
if (!this.dbMap.isEmpty()) {
return this.dbMap.values().iterator().next();
}
return null;
}
public void addDB(DB session) {
addDB(DEFAULT_KEY, session);
}
public void addDB(Object key, DB session) {
Assert.notNull(key, "Key must not be null");
Assert.notNull(session, "DB must not be null");
this.dbMap.put(key, session);
}
public DB removeDB(Object key) {
return this.dbMap.remove(key);
}
public boolean containsDB(DB session) {
return this.dbMap.containsValue(session);
}
public boolean isEmpty() {
return this.dbMap.isEmpty();
}
public boolean doesNotHoldNonDefaultDB() {
synchronized (this.dbMap) {
return this.dbMap.isEmpty() ||
(this.dbMap.size() == 1 && this.dbMap.containsKey(DEFAULT_KEY));
}
}
}
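
DbHolder is package-private and only used internally; the rough sketch below illustrates the thread-binding pattern it supports, mirroring what MongoDbUtils.doGetDB(…) further down does, with the Mongo instance acting as the resource key. The sketch class is an assumption, placed in the same package only so DbHolder is visible:

package org.springframework.data.document.mongodb;

import org.springframework.transaction.support.TransactionSynchronizationManager;

import com.mongodb.DB;
import com.mongodb.Mongo;

class DbHolderBindingSketch {

    // Bind a DB to the current thread, keyed by its Mongo instance, so later
    // calls within the same transaction can reuse the very same DB object.
    void bind(Mongo mongo, DB db) {
        DbHolder holder = new DbHolder(db);
        holder.setSynchronizedWithTransaction(true);
        TransactionSynchronizationManager.bindResource(mongo, holder);
    }

    // Look the bound DB up again on the same thread; null if nothing is bound.
    DB lookup(Mongo mongo) {
        DbHolder holder = (DbHolder) TransactionSynchronizationManager.getResource(mongo);
        return holder != null ? holder.getDB() : null;
    }
}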

View File

@@ -1,94 +0,0 @@
/*
* Copyright 2002-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb;
import com.mongodb.DB;
import com.mongodb.Mongo;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.jmx.export.annotation.ManagedOperation;
import org.springframework.jmx.export.annotation.ManagedResource;
/**
* Mongo server administration exposed via JMX annotations
*
* @author Mark Pollack
*/
@ManagedResource(description = "Mongo Admin Operations")
public class MongoAdmin implements MongoAdminOperations {
/**
* Logger available to subclasses
*/
protected final Log logger = LogFactory.getLog(getClass());
private Mongo mongo;
private String username;
private String password;
public MongoAdmin(Mongo mongo) {
this.mongo = mongo;
}
/* (non-Javadoc)
* @see org.springframework.data.document.mongodb.MongoAdminOperations#dropDatabase(java.lang.String)
*/
@ManagedOperation
public void dropDatabase(String databaseName) {
getDB(databaseName).dropDatabase();
}
/* (non-Javadoc)
* @see org.springframework.data.document.mongodb.MongoAdminOperations#createDatabase(java.lang.String)
*/
@ManagedOperation
public void createDatabase(String databaseName) {
getDB(databaseName);
}
/* (non-Javadoc)
* @see org.springframework.data.document.mongodb.MongoAdminOperations#getDatabaseStats(java.lang.String)
*/
@ManagedOperation
public String getDatabaseStats(String databaseName) {
return getDB(databaseName).getStats().toString();
}
/**
* Sets the username to use to connect to the Mongo database
*
* @param username The username to use
*/
public void setUsername(String username) {
this.username = username;
}
/**
* Sets the password to use to authenticate with the Mongo database.
*
* @param password The password to use
*/
public void setPassword(String password) {
this.password = password;
}
DB getDB(String databaseName) {
return MongoDbUtils.getDB(mongo, databaseName, username, password == null ? null : password.toCharArray());
}
}
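
A minimal sketch of using MongoAdmin directly, assuming a MongoDB server on localhost; the database name and credentials are placeholders, and credentials are only needed when authentication is enabled:

import org.springframework.data.document.mongodb.MongoAdmin;

import com.mongodb.Mongo;

public class MongoAdminExample {

    public static void main(String[] args) throws Exception {
        Mongo mongo = new Mongo("localhost");

        MongoAdmin admin = new MongoAdmin(mongo);
        admin.setUsername("admin");   // placeholder credentials
        admin.setPassword("secret");

        admin.createDatabase("analytics");                       // obtains the DB so it exists
        System.out.println(admin.getDatabaseStats("analytics")); // prints db.stats() output
        admin.dropDatabase("analytics");
    }
}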

View File

@@ -1,16 +0,0 @@
package org.springframework.data.document.mongodb;
import org.springframework.jmx.export.annotation.ManagedOperation;
public interface MongoAdminOperations {
@ManagedOperation
public abstract void dropDatabase(String databaseName);
@ManagedOperation
public abstract void createDatabase(String databaseName);
@ManagedOperation
public abstract String getDatabaseStats(String databaseName);
}
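
Since MongoAdmin implements this interface and carries @ManagedResource/@ManagedOperation annotations, it can be picked up by Spring's annotation-aware MBean exporter. A hedged Java-config sketch; the configuration class and bean names are assumptions, not part of this diff:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.document.mongodb.MongoAdmin;
import org.springframework.jmx.export.annotation.AnnotationMBeanExporter;

import com.mongodb.Mongo;

@Configuration
public class MongoJmxConfig {

    @Bean
    public Mongo mongo() throws Exception {
        return new Mongo("localhost");
    }

    @Bean
    public MongoAdmin mongoAdmin() throws Exception {
        return new MongoAdmin(mongo());
    }

    // Exports beans annotated with @ManagedResource, such as MongoAdmin above, to the MBean server.
    @Bean
    public AnnotationMBeanExporter mbeanExporter() {
        return new AnnotationMBeanExporter();
    }
}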

View File

@@ -1,168 +0,0 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb;
import com.mongodb.CommandResult;
import com.mongodb.DB;
import com.mongodb.Mongo;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import org.springframework.util.Assert;
/**
* Helper class featuring helper methods for internal MongoDb classes.
* <p/>
* <p>Mainly intended for internal use within the framework.
*
* @author Thomas Risberg
* @author Graeme Rocher
* @author Oliver Gierke
* @since 1.0
*/
public abstract class MongoDbUtils {
private static final Log LOGGER = LogFactory.getLog(MongoDbUtils.class);
/**
* Private constructor to prevent instantiation.
*/
private MongoDbUtils() {
}
/**
* Obtains a {@link DB} connection for the given {@link Mongo} instance and database name
*
* @param mongo The {@link Mongo} instance
* @param databaseName The database name
* @return The {@link DB} connection
*/
public static DB getDB(Mongo mongo, String databaseName) {
return doGetDB(mongo, databaseName, null, null, true);
}
/**
* Obtains a {@link DB} connection for the given {@link Mongo} instance and database name
*
* @param mongo The {@link Mongo} instance
* @param databaseName The database name
* @param username The username to authenticate with
* @param password The password to authenticate with
* @return The {@link DB} connection
*/
public static DB getDB(Mongo mongo, String databaseName, String username, char[] password) {
return doGetDB(mongo, databaseName, username, password, true);
}
public static DB doGetDB(Mongo mongo, String databaseName, String username, char[] password, boolean allowCreate) {
Assert.notNull(mongo, "No Mongo instance specified");
DbHolder dbHolder = (DbHolder) TransactionSynchronizationManager.getResource(mongo);
if (dbHolder != null && !dbHolder.isEmpty()) {
// pre-bound Mongo DB
DB db = null;
if (TransactionSynchronizationManager.isSynchronizationActive() &&
dbHolder.doesNotHoldNonDefaultDB()) {
// Spring transaction management is active ->
db = dbHolder.getDB();
if (db != null && !dbHolder.isSynchronizedWithTransaction()) {
LOGGER.debug("Registering Spring transaction synchronization for existing Mongo DB");
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(dbHolder, mongo));
dbHolder.setSynchronizedWithTransaction(true);
}
}
if (db != null) {
return db;
}
}
LOGGER.trace("Getting Mongo Database name=["+databaseName+"]");
DB db = mongo.getDB(databaseName);
boolean credentialsGiven = username != null && password != null;
if (credentialsGiven && !db.isAuthenticated()) {
//Note, can only authenticate once against the same com.mongodb.DB object.
if (!db.authenticate(username, password)) {
throw new CannotGetMongoDbConnectionException("Failed to authenticate to database [" + databaseName +
"], username = [" + username + "], password = [" + new String(password) + "]", databaseName, username, password );
}
}
// Use same Session for further Mongo actions within the transaction.
// Thread object will get removed by synchronization at transaction completion.
if (TransactionSynchronizationManager.isSynchronizationActive()) {
// We're within a Spring-managed transaction, possibly from JtaTransactionManager.
LOGGER.debug("Registering Spring transaction synchronization for new Hibernate Session");
DbHolder holderToUse = dbHolder;
if (holderToUse == null) {
holderToUse = new DbHolder(db);
} else {
holderToUse.addDB(db);
}
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(holderToUse, mongo));
holderToUse.setSynchronizedWithTransaction(true);
if (holderToUse != dbHolder) {
TransactionSynchronizationManager.bindResource(mongo, holderToUse);
}
}
// Check whether we are allowed to return the DB.
if (!allowCreate && !isDBTransactional(db, mongo)) {
throw new IllegalStateException("No Mongo DB bound to thread, " +
"and configuration does not allow creation of non-transactional one here");
}
return db;
}
/**
* Return whether the given DB instance is transactional, that is,
* bound to the current thread by Spring's transaction facilities.
*
* @param db the DB to check
* @param mongo the Mongo instance that the DB was created with
* (may be <code>null</code>)
* @return whether the DB is transactional
*/
public static boolean isDBTransactional(DB db, Mongo mongo) {
if (mongo == null) {
return false;
}
DbHolder dbHolder =
(DbHolder) TransactionSynchronizationManager.getResource(mongo);
return (dbHolder != null && dbHolder.containsDB(db));
}
/**
* Perform actual closing of the Mongo DB object,
* catching and logging any cleanup exceptions thrown.
*
* @param db the DB to close (may be <code>null</code>)
*/
public static void closeDB(DB db) {
if (db != null) {
LOGGER.debug("Closing Mongo DB object");
try {
db.requestDone();
} catch (Throwable ex) {
LOGGER.debug("Unexpected exception on closing Mongo DB object", ex);
}
}
}
}
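
A small sketch of calling MongoDbUtils directly, assuming a reachable server; the database name and credentials are placeholders. Inside a Spring-managed transaction the returned DB is additionally bound to the current thread, as the code above shows:

import org.springframework.data.document.mongodb.MongoDbUtils;

import com.mongodb.DB;
import com.mongodb.Mongo;

public class MongoDbUtilsExample {

    public static void main(String[] args) throws Exception {
        Mongo mongo = new Mongo("localhost");
        DB db = null;
        try {
            // Authenticates on first access when credentials are given.
            db = MongoDbUtils.getDB(mongo, "orders", "appUser", "secret".toCharArray());
            System.out.println(db.getCollectionNames());
        } finally {
            // Delegates to DB.requestDone() and logs (rather than rethrows) cleanup failures.
            MongoDbUtils.closeDB(db);
        }
    }
}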

View File

@@ -1,84 +0,0 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb;
import com.mongodb.MongoException;
import com.mongodb.MongoException.CursorNotFound;
import com.mongodb.MongoException.DuplicateKey;
import com.mongodb.MongoException.Network;
import com.mongodb.MongoInternalException;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.dao.DuplicateKeyException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.InvalidDataAccessResourceUsageException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.document.UncategorizedDocumentStoreException;
/**
* Simple {@link PersistenceExceptionTranslator} for Mongo. Convert the given runtime exception to an appropriate
* exception from the {@code org.springframework.dao} hierarchy. Return {@literal null} if no translation is
* appropriate: any other exception may have resulted from user code, and should not be translated.
*
* @param ex runtime exception that occurred
* @author Oliver Gierke
* @return the corresponding DataAccessException instance, or {@literal null} if the exception should not be translated
*/
public class MongoExceptionTranslator implements PersistenceExceptionTranslator {
/*
* (non-Javadoc)
*
* @see org.springframework.dao.support.PersistenceExceptionTranslator#
* translateExceptionIfPossible(java.lang.RuntimeException)
*/
public DataAccessException translateExceptionIfPossible(RuntimeException ex) {
// Check for well-known MongoException subclasses.
// All other MongoExceptions
if (ex instanceof DuplicateKey) {
return new DuplicateKeyException(ex.getMessage(), ex);
}
if (ex instanceof Network) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
if (ex instanceof CursorNotFound) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
if (ex instanceof MongoException) {
int code = ((MongoException)ex).getCode();
if (code == 11000 || code == 11001) {
throw new DuplicateKeyException(ex.getMessage(), ex);
} else if (code == 12000 || code == 13440) {
throw new DataAccessResourceFailureException(ex.getMessage(), ex);
} else if (code == 10003 || code == 12001 || code == 12010 || code == 12011 || code == 12012 ) {
throw new InvalidDataAccessApiUsageException(ex.getMessage(), ex);
}
return new UncategorizedDocumentStoreException(ex.getMessage(), ex);
}
if (ex instanceof MongoInternalException) {
return new InvalidDataAccessResourceUsageException(ex.getMessage(), ex);
}
// If we get here, we have an exception that resulted from user code,
// rather than the persistence provider, so we return null to indicate
// that translation should not occur.
return null;
}
}
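
The translator is normally driven by Spring's persistence exception translation post-processing, but it can be invoked directly as well. A minimal sketch; the wrapper class below is illustrative only:

import org.springframework.dao.DataAccessException;
import org.springframework.data.document.mongodb.MongoExceptionTranslator;

import com.mongodb.DBCollection;
import com.mongodb.DBObject;

public class TranslatingInsert {

    private final MongoExceptionTranslator translator = new MongoExceptionTranslator();

    // Wraps a raw driver call so callers only see Spring's org.springframework.dao hierarchy.
    public void insert(DBCollection collection, DBObject document) {
        try {
            collection.insert(document);
        } catch (RuntimeException ex) {
            DataAccessException translated = translator.translateExceptionIfPossible(ex);
            throw translated != null ? translated : ex;
        }
    }
}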

View File

@@ -1,131 +0,0 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb;
import java.util.List;
import com.mongodb.Mongo;
import com.mongodb.MongoOptions;
import com.mongodb.ServerAddress;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.util.Assert;
/**
* Convenient factory for configuring MongoDB.
*
* @author Thomas Risberg
* @author Graeme Rocher
* @since 1.0
*/
public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, PersistenceExceptionTranslator {
/**
* Logger, available to subclasses.
*/
protected final Log logger = LogFactory.getLog(getClass());
private Mongo mongo;
private MongoOptions mongoOptions;
private String host;
private Integer port;
private List<ServerAddress> replicaSetSeeds;
private List<ServerAddress> replicaPair;
private PersistenceExceptionTranslator exceptionTranslator = new MongoExceptionTranslator();
public void setMongoOptions(MongoOptions mongoOptions) {
this.mongoOptions = mongoOptions;
}
public void setReplicaSetSeeds(List<ServerAddress> replicaSetSeeds) {
this.replicaSetSeeds = replicaSetSeeds;
}
public void setReplicaPair(List<ServerAddress> replicaPair) {
this.replicaPair = replicaPair;
}
public void setHost(String host) {
this.host = host;
}
public void setPort(int port) {
this.port = port;
}
public PersistenceExceptionTranslator getExceptionTranslator() {
return exceptionTranslator;
}
public void setExceptionTranslator(
PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator;
}
public Mongo getObject() throws Exception {
Assert.notNull(mongo, "Mongo must not be null");
return mongo;
}
public Class<? extends Mongo> getObjectType() {
return Mongo.class;
}
public boolean isSingleton() {
return false;
}
public void afterPropertiesSet() throws Exception {
// apply defaults - convenient when used to configure for tests
// in an application context
if (mongo == null) {
if (host == null) {
logger.warn("Property host not specified. Using default configuration");
mongo = new Mongo();
} else {
ServerAddress defaultOptions = new ServerAddress();
if (mongoOptions == null) mongoOptions = new MongoOptions();
if (replicaPair != null) {
if (replicaPair.size() < 2) {
throw new CannotGetMongoDbConnectionException("A replica pair must have two server entries");
}
mongo = new Mongo(replicaPair.get(0), replicaPair.get(1), mongoOptions);
} else if (replicaSetSeeds != null) {
mongo = new Mongo(replicaSetSeeds, mongoOptions);
} else {
String mongoHost = host != null ? host : defaultOptions.getHost();
if (port != null) {
mongo = new Mongo(new ServerAddress(mongoHost, port), mongoOptions);
} else {
mongo = new Mongo(mongoHost, mongoOptions);
}
}
}
}
}
public DataAccessException translateExceptionIfPossible(RuntimeException ex) {
return exceptionTranslator.translateExceptionIfPossible(ex);
}
}
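
MongoFactoryBean is usually declared as a Spring bean, but driving its lifecycle by hand makes the property handling in afterPropertiesSet() visible. A hedged sketch with placeholder host and port:

import org.springframework.data.document.mongodb.MongoFactoryBean;

import com.mongodb.Mongo;

public class MongoFactoryBeanExample {

    public static void main(String[] args) throws Exception {
        MongoFactoryBean factory = new MongoFactoryBean();
        factory.setHost("localhost");  // placeholder; omitting it falls back to new Mongo() with a warning
        factory.setPort(27017);
        // Replica configuration would go through setReplicaSetSeeds(...) or setReplicaPair(...) instead.

        factory.afterPropertiesSet();  // builds the underlying Mongo instance
        Mongo mongo = factory.getObject();
        System.out.println(mongo.getAddress());
    }
}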

View File

@@ -1,735 +0,0 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb;
import java.util.List;
import java.util.Set;
import com.mongodb.CommandResult;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.WriteResult;
import org.bson.types.ObjectId;
import org.springframework.data.document.mongodb.index.IndexDefinition;
import org.springframework.data.document.mongodb.query.Query;
import org.springframework.data.document.mongodb.query.Update;
/**
* Interface that specifies a basic set of MongoDB operations. Implemented by {@link MongoTemplate}.
* Not often used but a useful option for extensibility and testability (as it can be easily mocked, stubbed, or be
* the target of a JDK proxy).
*
* @author Thomas Risberg
* @author Mark Pollack
* @author Oliver Gierke
*/
public interface MongoOperations {
/**
* The default collection name used by this template.
*
* @return
*/
String getDefaultCollectionName();
/**
* The default collection used by this template.
*
* @return The default collection used by this template
*/
DBCollection getDefaultCollection();
/**
* Execute a MongoDB command expressed as a JSON string. This will call the method
* JSON.parse that is part of the MongoDB driver to convert the JSON string to a DBObject.
* Any errors that result from executing this command will be converted into Spring's DAO
* exception hierarchy.
*
* @param jsonCommand a MongoDB command expressed as a JSON string.
*/
CommandResult executeCommand(String jsonCommand);
/**
* Execute a MongoDB command. Any errors that result from executing this command will be converted
* into Spring's DAO exception hierarchy.
*
* @param command a MongoDB command
*/
CommandResult executeCommand(DBObject command);
/**
* Executes a {@link DbCallback} translating any exceptions as necessary.
* <p/>
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
* @param <T> return type
* @param action callback object that specifies the MongoDB actions to perform on the passed in DB instance.
* @return a result object returned by the action or <tt>null</tt>
*/
<T> T execute(DbCallback<T> action);
/**
* Executes the given {@link CollectionCallback} on the default collection.
* <p/>
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
* @param <T> return type
* @param action callback object that specifies the MongoDB action
* @return a result object returned by the action or <tt>null</tt>
*/
<T> T execute(CollectionCallback<T> action);
/**
* Executes the given {@link CollectionCallback} on the collection of the given name.
* <p/>
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
* @param <T> return type
* @param collectionName the name of the collection that specifies which DBCollection instance will be passed into the callback action
* @param action callback object that specifies the MongoDB action
* @return a result object returned by the action or <tt>null</tt>
*/
<T> T execute(String collectionName, CollectionCallback<T> action);
/**
* Executes the given {@link DbCallback} within the same connection to the database so as to ensure
* consistency in a write heavy environment where you may read the data that you wrote. See the
* comments on {@see <a href=http://www.mongodb.org/display/DOCS/Java+Driver+Concurrency>Java Driver Concurrency</a>}
* <p/>
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
* @param <T> return type
* @param action callback that specified the MongoDB actions to perform on the DB instance
* @return a result object returned by the action or <tt>null</tt>
*/
<T> T executeInSession(DbCallback<T> action);
/**
* Create an uncapped collection with the provided name.
*
* @param collectionName name of the collection
* @return the created collection
*/
DBCollection createCollection(String collectionName);
/**
* Create a collection with the provided name and options.
*
* @param collectionName name of the collection
* @param collectionOptions options to use when creating the collection.
* @return the created collection
*/
DBCollection createCollection(String collectionName, CollectionOptions collectionOptions);
/**
* A set of collection names.
*
* @return list of collection names
*/
Set<String> getCollectionNames();
/**
* Get a collection by name, creating it if it doesn't exist.
* <p/>
* Translate any exceptions as necessary.
*
* @param collectionName name of the collection
* @return an existing collection or a newly created one.
*/
DBCollection getCollection(String collectionName);
/**
* Check to see if a collection with a given name exists.
* <p/>
* Translate any exceptions as necessary.
*
* @param collectionName name of the collection
* @return true if a collection with the given name is found, false otherwise.
*/
boolean collectionExists(String collectionName);
/**
* Drop the collection with the given name.
* <p/>
* Translate any exceptions as necessary.
*
* @param collectionName name of the collection to drop/delete.
*/
void dropCollection(String collectionName);
/**
* Query for a list of objects of type T from the default collection.
* <p/>
* The object is converted from the MongoDB native representation using an instance of
* {@see MongoConverter}. Unless configured otherwise, an
* instance of SimpleMongoConverter will be used.
* <p/>
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient
* way to map objects since the test for class type is done in the client and not on the server.
*
* @param targetClass the parameterized type of the returned list
* @return the converted collection
*/
<T> List<T> getCollection(Class<T> targetClass);
/**
* Query for a list of objects of type T from the specified collection.
* <p/>
* The object is converted from the MongoDB native representation using an instance of
* {@see MongoConverter}. Unless configured otherwise, an
* instance of SimpleMongoConverter will be used.
* <p/>
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient
* way to map objects since the test for class type is done in the client and not on the server.
*
* @param collectionName name of the collection to retrieve the objects from
* @param targetClass the parameterized type of the returned list.
* @return the converted collection
*/
<T> List<T> getCollection(String collectionName, Class<T> targetClass);
/**
* Query for a list of objects of type T from the specified collection, mapping the DBObject using
* the provided MongoReader.
*
* @param collectionName name of the collection to retrieve the objects from
* @param targetClass the parameterized type of the returned list.
* @param reader the MongoReader to convert from DBObject to an object.
* @return the converted collection
*/
<T> List<T> getCollection(String collectionName, Class<T> targetClass,
MongoReader<T> reader);
/**
* Ensure that an index for the provided {@link IndexDefinition} exists for the default collection.
* If not it will be created.
*
* @param index
*/
void ensureIndex(IndexDefinition indexDefinition);
/**
* Ensure that an index for the provided {@link IndexDefinition} exists. If not it will be
* created.
*
* @param collectionName
* @param index
*/
void ensureIndex(String collectionName, IndexDefinition indexDefinition);
/**
* Map the results of an ad-hoc query on the default MongoDB collection to a single instance of an object
* of the specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of
* {@see MongoConverter}. Unless configured otherwise, an
* instance of SimpleMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields specification
* @param targetClass the parameterized type of the returned list.
* @return the converted object
*/
<T> T findOne(Query query, Class<T> targetClass);
/**
* Map the results of an ad-hoc query on the default MongoDB collection to a single instance of an object
* of the specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of
* {@see MongoConverter}. Unless configured otherwise, an
* instance of SimpleMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields specification
* @param targetClass the parameterized type of the returned list.
* @param reader the MongoReader to convert from DBObject to an object.
* @return the converted object
*/
<T> T findOne(Query query, Class<T> targetClass,
MongoReader<T> reader);
/**
* Map the results of an ad-hoc query on the specified collection to a single instance of an object
* of the specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of
* {@see MongoConverter}. Unless configured otherwise, an
* instance of SimpleMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query class that specifies the criteria used to find a record and also an optional fields specification
* @param targetClass the parameterized type of the returned list.
* @return the converted object
*/
<T> T findOne(String collectionName, Query query,
Class<T> targetClass);
/**
* Map the results of an ad-hoc query on the specified collection to a single instance of an object
* of the specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of
* {@see MongoConverter}. Unless configured otherwise, an
* instance of SimpleMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query class that specifies the criteria used to find a record and also an optional fields specification
* @param targetClass the parameterized type of the returned list.
* @param reader the MongoReader to convert from DBObject to an object.
* @return the converted object
*/
<T> T findOne(String collectionName, Query query,
Class<T> targetClass, MongoReader<T> reader);
/**
* Map the results of an ad-hoc query on the default MongoDB collection to a List of the specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of
* {@see MongoConverter}. Unless configured otherwise, an
* instance of SimpleMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields specification
* @param targetClass the parameterized type of the returned list.
* @return the List of converted objects
*/
<T> List<T> find(Query query, Class<T> targetClass);
/**
* Map the results of an ad-hoc query on the default MongoDB collection to a List of the specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of
* {@see MongoConverter}. Unless configured otherwise, an
* instance of SimpleMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields specification
* @param targetClass the parameterized type of the returned list.
* @param reader the MongoReader to convert from DBObject to an object.
* @return the List of converted objects
*/
<T> List<T> find(Query query, Class<T> targetClass,
MongoReader<T> reader);
/**
* Map the results of an ad-hoc query on the specified collection to a List of the specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of
* {@see MongoConverter}. Unless configured otherwise, an
* instance of SimpleMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query class that specifies the criteria used to find a record and also an optional fields specification
* @param targetClass the parameterized type of the returned list.
* @return the List of converted objects
*/
<T> List<T> find(String collectionName, Query query,
Class<T> targetClass);
/**
* Map the results of an ad-hoc query on the specified collection to a List of the specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of
* {@see MongoConverter}. Unless configured otherwise, an
* instance of SimpleMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query class that specifies the criteria used to find a record and also an optional fields specification
* @param targetClass the parameterized type of the returned list.
* @param reader the MongoReader to convert from DBObject to an object.
* @return the List of converted objects
*/
<T> List<T> find(String collectionName, Query query,
Class<T> targetClass, MongoReader<T> reader);
/**
* Map the results of an ad-hoc query on the specified collection to a List of the specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of
* {@see MongoConverter}. Unless configured otherwise, an
* instance of SimpleMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query class that specifies the criteria used to find a record and also an optional fields specification
* @param targetClass the parameterized type of the returned list.
* @param preparer allows for customization of the DBCursor used when iterating over the result set,
* (apply limits, skips and so on).
* @return the List of converted objects.
*/
<T> List<T> find(String collectionName, Query query, Class<T> targetClass, CursorPreparer preparer);
/**
* Map the results of an ad-hoc query on the default MongoDB collection to a single instance of an object
* of the specified type. The first document that matches the query is returned and also removed from the
* collection in the database.
* <p/>
* The object is converted from the MongoDB native representation using an instance of
* {@see MongoConverter}. Unless configured otherwise, an
* instance of SimpleMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields specification
* @param targetClass the parameterized type of the returned list.
* @return the converted object
*/
<T> T findAndRemove(Query query, Class<T> targetClass);
/**
* Map the results of an ad-hoc query on the default MongoDB collection to a single instance of an object
* of the specified type. The first document that matches the query is returned and also removed from the
* collection in the database.
* <p/>
* The object is converted from the MongoDB native representation using an instance of
* {@see MongoConverter}. Unless configured otherwise, an
* instance of SimpleMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields specification
* @param targetClass the parameterized type of the returned list.
* @param reader the MongoReader to convert from DBObject to an object.
* @return the converted object
*/
<T> T findAndRemove(Query query, Class<T> targetClass,
MongoReader<T> reader);
/**
* Map the results of an ad-hoc query on the specified collection to a single instance of an object
* of the specified type. The first document that matches the query is returned and also removed from the
* collection in the database.
* <p/>
* The object is converted from the MongoDB native representation using an instance of
* {@see MongoConverter}. Unless configured otherwise, an
* instance of SimpleMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query class that specifies the criteria used to find a record and also an optional fields specification
* @param targetClass the parameterized type of the returned list.
* @return the converted object
*/
<T> T findAndRemove(String collectionName, Query query,
Class<T> targetClass);
/**
* Map the results of an ad-hoc query on the specified collection to a single instance of an object
* of the specified type. The first document that matches the query is returned and also removed from the
* collection in the database.
* <p/>
* The object is converted from the MongoDB native representation using an instance of
* {@see MongoConverter}. Unless configured otherwise, an
* instance of SimpleMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query class that specifies the criteria used to find a record and also an optional fields specification
* @param targetClass the parameterized type of the returned list.
* @param reader the MongoReader to convert from DBObject to an object.
* @return the converted object
*/
<T> T findAndRemove(String collectionName, Query query,
Class<T> targetClass, MongoReader<T> reader);
/**
* Insert the object into the default collection.
* <p/>
* The object is converted to the MongoDB native representation using an instance of
* {@see MongoConverter}. Unless configured otherwise, an
* instance of SimpleMongoConverter will be used.
* <p/>
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property
* is a String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from
* ObjectId to your property type will be handled by Spring's BeanWrapper class that leverages Spring 3.0's
* new Type Conversion API.
* See <a href="http://static.springsource.org/spring/docs/3.0.x/reference/validation.html#core-convert">Spring 3 Type Conversion"</a>
* for more details.
* <p/>
* <p/>
* Insert is used to initially store the object into the database.
* To update an existing object use the save method.
*
* @param objectToSave the object to store in the collection.
*/
void insert(Object objectToSave);
/**
* Insert the object into the specified collection.
* <p/>
* The object is converted to the MongoDB native representation using an instance of
* {@see MongoConverter}. Unless configured otherwise, an
* instance of SimpleMongoConverter will be used.
* <p/>
* Insert is used to initially store the object into the
* database. To update an existing object use the save method.
*
* @param collectionName name of the collection to store the object in
* @param objectToSave the object to store in the collection
*/
void insert(String collectionName, Object objectToSave);
/**
* Insert the object into the default collection.
* <p/>
* The object is converted to the MongoDB native representation using an instance of
* {@see MongoWriter}
* <p/>
* Insert is used to initially store the object into the
* database. To update an existing object use the save method.
*
* @param <T> the type of the object to insert
* @param objectToSave the object to store in the collection
* @param writer the writer to convert the object to save into a DBObject
*/
<T> void insert(T objectToSave, MongoWriter<T> writer);
/**
* Insert the object into the specified collection.
* <p/>
* The object is converted to the MongoDB native representation using an instance of
* {@see MongoWriter}
* <p/>
* Insert is used to initially store the object into the
* database. To update an existing object use the save method.
*
* @param <T> the type of the object to insert
* @param collectionName name of the collection to store the object in
* @param objectToSave the object to store in the collection
* @param writer the writer to convert the object to save into a DBObject
*/
<T> void insert(String collectionName, T objectToSave, MongoWriter<T> writer);
/**
* Insert a list of objects into the default collection in a single batch write to the database.
*
* @param listToSave the list of objects to save.
*/
void insertList(List<? extends Object> listToSave);
/**
* Insert a list of objects into the specified collection in a single batch write to the database.
*
* @param collectionName name of the collection to store the object in
* @param listToSave the list of objects to save.
*/
void insertList(String collectionName, List<? extends Object> listToSave);
/**
* Insert a list of objects into the default collection using the provided MongoWriter instance
*
* @param <T> the type of object being saved
* @param listToSave the list of objects to save.
* @param writer the writer to convert the object to save into a DBObject
*/
<T> void insertList(List<? extends T> listToSave, MongoWriter<T> writer);
/**
* Insert a list of objects into the specified collection using the provided MongoWriter instance
*
* @param <T> the type of object being saved
* @param collectionName name of the collection to store the object in
* @param listToSave the list of objects to save.
* @param writer the writer to convert the object to save into a DBObject
*/
<T> void insertList(String collectionName, List<? extends T> listToSave, MongoWriter<T> writer);
/**
* Save the object to the default collection. This will perform an insert if the object is not already
* present, that is an 'upsert'.
* <p/>
* The object is converted to the MongoDB native representation using an instance of
* {@see MongoConverter}. Unless configured otherwise, an
* instance of SimpleMongoConverter will be used.
* <p/>
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property
* is a String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from
* ObjectId to your property type will be handled by Spring's BeanWrapper class that leverages Spring 3.0's
* new Type Conversion API.
* See <a href="http://static.springsource.org/spring/docs/3.0.x/reference/validation.html#core-convert">Spring 3 Type Conversion"</a>
* for more details.
*
* @param objectToSave the object to store in the collection
*/
void save(Object objectToSave);
/**
* Save the object to the specified collection. This will perform an insert if the object is not already
* present, that is an 'upsert'.
* <p/>
* The object is converted to the MongoDB native representation using an instance of
* {@see MongoConverter}. Unless configured otherwise, an
* instance of SimpleMongoConverter will be used.
* <p/>
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property
* is a String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from
* ObjectId to your property type will be handled by Spring's BeanWrapper class that leverages Spring 3.0's
* new Type Conversion API.
* See <a href="http://static.springsource.org/spring/docs/3.0.x/reference/validation.html#core-convert">Spring 3 Type Conversion"</a>
* for more details.
*
* @param collectionName name of the collection to store the object in
* @param objectToSave the object to store in the collection
*/
void save(String collectionName, Object objectToSave);
/**
* Save the object into the default collection using the provided writer.
* This will perform an insert if the object is not already
* present, that is an 'upsert'.
* <p/>
* The object is converted to the MongoDB native representation using an instance of
* {@see MongoWriter}
*
* @param <T> the type of the object to insert
* @param objectToSave the object to store in the collection
* @param writer the writer to convert the object to save into a DBObject
*/
<T> void save(T objectToSave, MongoWriter<T> writer);
/**
* Save the object into the specified collection using the provided writer.
* This will perform an insert if the object is not already
* present, that is an 'upsert'.
* <p/>
* The object is converted to the MongoDB native representation using an instance of
* {@see MongoWriter}
*
* @param <T> the type of the object to insert
* @param collectionName name of the collection to store the object in
* @param objectToSave the object to store in the collection
* @param writer the writer to convert the object to save into a DBObject
*/
<T> void save(String collectionName, T objectToSave, MongoWriter<T> writer);
/**
* Updates the first object that is found in the default collection that matches the query document
* with the provided updated document.
*
* @param queryDoc the query document that specifies the criteria used to select a record to be updated
* @param updateDoc the update document that contains the updated object or $ operators to manipulate the
* existing object.
*/
WriteResult updateFirst(Query query, Update update);
/**
* Updates the first object that is found in the specified collection that matches the query document criteria
* with the provided updated document.
*
* @param collectionName name of the collection to update the object in
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the
* existing object.
*/
WriteResult updateFirst(String collectionName, Query query,
Update update);
/**
* Updates all objects that are found in the default collection that match the query document criteria
* with the provided updated document.
*
* @param query the query document that specifies the criteria used to select records to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the
* existing object.
*/
WriteResult updateMulti(Query query, Update update);
/**
* Updates all objects that are found in the specified collection that match the query document criteria
* with the provided updated document.
*
* @param collectionName name of the collection to update the object in
* @param query the query document that specifies the criteria used to select records to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the
* existing object.
*/
WriteResult updateMulti(String collectionName, Query query,
Update update);
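/*
* Illustrative usage only (not part of the original source): a hedged sketch of the update methods
* above, assuming the Query, Criteria and Update builder classes referenced in the signatures and
* the template variable introduced earlier.
*
*   WriteResult result = template.updateFirst(
*       new Query(Criteria.where("name").is("Joe")),
*       new Update().inc("age", 1));
*   template.updateMulti("people",
*       new Query(Criteria.where("name").is("Joe")),
*       new Update().set("active", false));
*/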
/**
* Remove the given object from the collection by Id.
*
* @param object the object to remove, identified by its Id
*/
void remove(Object object);
/**
* Remove all documents from the default collection that match the provided query document criteria.
*
* @param query the query document that specifies the criteria used to remove a record
*/
void remove(Query query);
/**
* Remove all documents from the default collection that match the provided query document criteria. The
* Class parameter is used to help convert the Id of the object if it is present in the query.
* @param <T> the type the Id conversion is based on
* @param query the query document that specifies the criteria used to remove records
* @param targetClass the class used to convert the Id of the object if it is present in the query
*/
<T> void remove(Query query, Class<T> targetClass);
/**
* Remove all documents from the specified collection that match the provided query document criteria.
*
* @param collectionName name of the collection where the objects will be removed
* @param query the query document that specifies the criteria used to remove a record
*/
void remove(String collectionName, Query query);
/**
* Remove all documents from the specified collection that match the provided query document criteria.
* The Class parameter is used to help convert the Id of the object if it is present in the query.
* @param collectionName name of the collection where the objects will be removed
* @param query the query document that specifies the criteria used to remove records
* @param targetClass the class used to convert the Id of the object if it is present in the query
*/
<T> void remove(String collectionName, Query query, Class<T> targetClass);
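/*
* Illustrative usage only (not part of the original source), continuing the hedged sketch above:
*
*   template.remove(person);                                            // remove by Id
*   template.remove(new Query(Criteria.where("age").gte(65)), Person.class);
*   template.remove("people", new Query(Criteria.where("name").is("Joe")));
*/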
}

@@ -1,134 +0,0 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb;
import com.mongodb.MongoOptions;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.beans.factory.InitializingBean;
/**
* A factory bean for constructing a MongoOptions instance
*
* @author Graeme Rocher
*/
public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, InitializingBean {
private static final MongoOptions MONGO_OPTIONS = new MongoOptions();
/**
* number of connections allowed per host
* will block if run out
*/
private int connectionsPerHost = MONGO_OPTIONS.connectionsPerHost;
/**
* multiplier for connectionsPerHost for # of threads that can block
* if connectionsPerHost is 10, and threadsAllowedToBlockForConnectionMultiplier is 5,
* then 50 threads can block
* more than that and an exception will be thrown
*/
private int threadsAllowedToBlockForConnectionMultiplier = MONGO_OPTIONS.threadsAllowedToBlockForConnectionMultiplier;
/**
* max wait time of a blocking thread for a connection
*/
private int maxWaitTime = MONGO_OPTIONS.maxWaitTime;
/**
* connect timeout in milliseconds. 0 is default and infinite
*/
private int connectTimeout = MONGO_OPTIONS.connectTimeout;
/**
* socket timeout. 0 is default and infinite
*/
private int socketTimeout = MONGO_OPTIONS.socketTimeout;
/**
* this controls whether or not on a connect, the system retries automatically
*/
private boolean autoConnectRetry = MONGO_OPTIONS.autoConnectRetry;
/**
* number of connections allowed per host
* will block if run out
*/
public void setConnectionsPerHost(int connectionsPerHost) {
this.connectionsPerHost = connectionsPerHost;
}
/**
* multiplier for connectionsPerHost for # of threads that can block
* if connectionsPerHost is 10, and threadsAllowedToBlockForConnectionMultiplier is 5,
* then 50 threads can block
* more than that and an exception will be thrown
*/
public void setThreadsAllowedToBlockForConnectionMultiplier(
int threadsAllowedToBlockForConnectionMultiplier) {
this.threadsAllowedToBlockForConnectionMultiplier = threadsAllowedToBlockForConnectionMultiplier;
}
/**
* max wait time of a blocking thread for a connection
*/
public void setMaxWaitTime(int maxWaitTime) {
this.maxWaitTime = maxWaitTime;
}
/**
* connect timeout in milliseconds. 0 is default and infinite
*/
public void setConnectTimeout(int connectTimeout) {
this.connectTimeout = connectTimeout;
}
/**
* socket timeout. 0 is default and infinite
*/
public void setSocketTimeout(int socketTimeout) {
this.socketTimeout = socketTimeout;
}
/**
* this controls whether or not on a connect, the system retries automatically
*/
public void setAutoConnectRetry(boolean autoConnectRetry) {
this.autoConnectRetry = autoConnectRetry;
}
public void afterPropertiesSet() {
MONGO_OPTIONS.connectionsPerHost = connectionsPerHost;
MONGO_OPTIONS.threadsAllowedToBlockForConnectionMultiplier = threadsAllowedToBlockForConnectionMultiplier;
MONGO_OPTIONS.maxWaitTime = maxWaitTime;
MONGO_OPTIONS.connectTimeout = connectTimeout;
MONGO_OPTIONS.socketTimeout = socketTimeout;
MONGO_OPTIONS.autoConnectRetry = autoConnectRetry;
}
public MongoOptions getObject() {
return MONGO_OPTIONS;
}
public Class<?> getObjectType() {
return MongoOptions.class;
}
public boolean isSingleton() {
return true;
}
}
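/*
* Illustrative usage only (not part of the original source): a minimal sketch of populating the
* factory bean programmatically; in XML configuration the same properties are typically set via the
* options sub-element parsed later in this changeset.
*
*   MongoOptionsFactoryBean factoryBean = new MongoOptionsFactoryBean();
*   factoryBean.setConnectionsPerHost(20);
*   factoryBean.setMaxWaitTime(1500);
*   factoryBean.setAutoConnectRetry(true);
*   factoryBean.afterPropertiesSet();                  // copies the values onto the shared MongoOptions
*   MongoOptions options = factoryBean.getObject();
*/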

@@ -1,245 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb;
import java.beans.PropertyDescriptor;
import java.lang.reflect.Method;
import java.lang.reflect.Type;
import java.math.BigInteger;
import java.util.*;
import org.bson.types.ObjectId;
import org.springframework.beans.BeanUtils;
import org.springframework.util.Assert;
import org.springframework.util.ReflectionUtils;
/**
* An iterable of {@link MongoPropertyDescriptor}s that allows dedicated access to the {@link MongoPropertyDescriptor}
* that captures the id-property.
*
* @author Oliver Gierke
*/
public class MongoPropertyDescriptors implements Iterable<MongoPropertyDescriptors.MongoPropertyDescriptor> {
private final Collection<MongoPropertyDescriptors.MongoPropertyDescriptor> descriptors;
private final MongoPropertyDescriptors.MongoPropertyDescriptor idDescriptor;
/**
* Creates the {@link MongoPropertyDescriptors} for the given type.
*
* @param type
*/
public MongoPropertyDescriptors(Class<?> type) {
Assert.notNull(type);
Set<MongoPropertyDescriptors.MongoPropertyDescriptor> descriptors = new HashSet<MongoPropertyDescriptors.MongoPropertyDescriptor>();
MongoPropertyDescriptors.MongoPropertyDescriptor idDescriptor = null;
for (PropertyDescriptor candidate : BeanUtils.getPropertyDescriptors(type)) {
MongoPropertyDescriptor descriptor = new MongoPropertyDescriptors.MongoPropertyDescriptor(candidate, type);
descriptors.add(descriptor);
if (descriptor.isIdProperty()) {
idDescriptor = descriptor;
}
}
this.descriptors = Collections.unmodifiableSet(descriptors);
this.idDescriptor = idDescriptor;
}
/**
* Returns the {@link MongoPropertyDescriptor} for the id property.
*
* @return the idDescriptor
*/
public MongoPropertyDescriptors.MongoPropertyDescriptor getIdDescriptor() {
return idDescriptor;
}
/*
* (non-Javadoc)
*
* @see java.lang.Iterable#iterator()
*/
public Iterator<MongoPropertyDescriptors.MongoPropertyDescriptor> iterator() {
return descriptors.iterator();
}
/**
* Simple value object to have a more suitable abstraction for MongoDB specific property handling.
*
* @author Oliver Gierke
*/
public static class MongoPropertyDescriptor {
public static Collection<Class<?>> SUPPORTED_ID_CLASSES;
static {
Set<Class<?>> classes = new HashSet<Class<?>>();
classes.add(ObjectId.class);
classes.add(String.class);
classes.add(BigInteger.class);
SUPPORTED_ID_CLASSES = Collections.unmodifiableCollection(classes);
}
private static final String ID_PROPERTY = "id";
static final String ID_KEY = "_id";
private final PropertyDescriptor delegate;
private final Class<?> owningType;
/**
* Creates a new {@link MongoPropertyDescriptor} for the given {@link PropertyDescriptor}.
*
* @param descriptor
* @param owningType
*/
public MongoPropertyDescriptor(PropertyDescriptor descriptor, Class<?> owningType) {
Assert.notNull(descriptor);
this.delegate = descriptor;
this.owningType = owningType;
}
/**
* Returns whether the property is the id-property. Will be identified by name for now ({@value #ID_PROPERTY}).
*
* @return
*/
public boolean isIdProperty() {
return ID_PROPERTY.equals(delegate.getName()) || ID_KEY.equals(delegate.getName());
}
/**
* Returns whether the property is of one of the supported id types. Currently we support {@link String},
* {@link ObjectId} and {@link BigInteger}.
*
* @return
*/
public boolean isOfIdType() {
return SUPPORTED_ID_CLASSES.contains(delegate.getPropertyType());
}
/**
* Returns the key that shall be used for mapping. Will return {@value #ID_KEY} for the id property and the
* plain name for all other ones.
*
* @return
*/
public String getKeyToMap() {
return isIdProperty() ? ID_KEY : delegate.getName();
}
/**
* Returns the name of the property.
*
* @return
*/
public String getName() {
return delegate.getName();
}
/**
* Returns whether the underlying property is actually mappable. By default this will exclude the
* {@literal class} property and only include properties that have both a getter and a backing field.
*
* @return
*/
public boolean isMappable() {
boolean isNotClassAttribute = !delegate.getName().equals("class");
boolean hasGetter = delegate.getReadMethod() != null;
boolean hasField = ReflectionUtils.findField(owningType, delegate.getName()) != null;
return isNotClassAttribute && hasGetter && hasField;
}
/**
* Returns the plain property type.
*
* @return
*/
public Class<?> getPropertyType() {
return delegate.getPropertyType();
}
/**
* Returns the type to be set. Will return the setter method's type and fall back to the getter method's
* return type in case no setter is available. Useful for further (generics) inspection.
*
* @return
*/
public Type getTypeToSet() {
Method method = delegate.getWriteMethod();
return method == null ? delegate.getReadMethod().getGenericReturnType()
: method.getGenericParameterTypes()[0];
}
/**
* Returns whether we describe a {@link Map}.
*
* @return
*/
public boolean isMap() {
return Map.class.isAssignableFrom(getPropertyType());
}
/**
* Returns whether the descriptor is for a collection.
*
* @return
*/
public boolean isCollection() {
return Collection.class.isAssignableFrom(getPropertyType());
}
/**
* Returns whether the descriptor is for an {@link Enum}.
*
* @return
*/
public boolean isEnum() {
return Enum.class.isAssignableFrom(getPropertyType());
}
/*
* (non-Javadoc)
*
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (obj == this) {
return true;
}
if (obj == null || !getClass().equals(obj.getClass())) {
return false;
}
MongoPropertyDescriptor that = (MongoPropertyDescriptor) obj;
return that.delegate.equals(this.delegate);
}
/*
* (non-Javadoc)
*
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return delegate.hashCode();
}
}
}
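/*
* Illustrative usage only (not part of the original source): a hedged sketch using a hypothetical
* Person class with a String 'id' property.
*
*   MongoPropertyDescriptors descriptors = new MongoPropertyDescriptors(Person.class);
*   MongoPropertyDescriptors.MongoPropertyDescriptor idDescriptor = descriptors.getIdDescriptor();
*   boolean supported = idDescriptor.isOfIdType();     // true for String, ObjectId and BigInteger ids
*   for (MongoPropertyDescriptors.MongoPropertyDescriptor descriptor : descriptors) {
*     if (descriptor.isMappable()) {
*       String key = descriptor.getKeyToMap();         // "_id" for the id property, the plain name otherwise
*     }
*   }
*/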

@@ -1,40 +0,0 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb;
import com.mongodb.DBObject;
/**
* A MongoReader is responsible for converting a native MongoDB DBObject to an object of type T.
*
* @param <T> the type of the object to convert from a DBObject
* @author Mark Pollack
* @author Thomas Risberg
* @author Oliver Gierke
*/
public interface MongoReader<T> {
/**
* Reads the native MongoDB DBObject representation into an instance of the class T. The given type has to be the
* starting point for marshalling the {@link DBObject} into it. So in case there's no real valid data inside
* {@link DBObject} for the given type, just return an empty instance of the given type.
*
* @param clazz the type of the return value
* @param dbo the DBObject to read from
* @return the converted object
*/
<S extends T> S read(Class<S> clazz, DBObject dbo);
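/*
* Illustrative usage only (not part of the original source): given a MongoReader<Person> (for example
* the MappingMongoConverter shown later in this changeset), reading is a single call:
*
*   Person person = reader.read(Person.class, dbObject);
*/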
}

@@ -1,5 +0,0 @@
package org.springframework.data.document.mongodb;
public enum WriteResultChecking {
NONE, LOG, EXCEPTION
}

@@ -1,100 +0,0 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.config;
import java.util.HashSet;
import java.util.Set;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.document.mongodb.MongoTemplate;
import org.springframework.data.document.mongodb.convert.MappingMongoConverter;
import org.springframework.data.document.mongodb.mapping.Document;
import org.springframework.data.document.mongodb.mapping.MongoMappingContext;
import org.springframework.data.document.mongodb.mapping.MongoPersistentEntityIndexCreator;
import org.springframework.data.document.mongodb.mapping.event.LoggingEventListener;
import org.springframework.data.document.mongodb.mapping.event.MongoMappingEvent;
import org.springframework.data.mapping.context.MappingContextAwareBeanPostProcessor;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
@Configuration
public abstract class AbstractMongoConfiguration {
@Bean
public abstract Mongo mongo() throws Exception;
@Bean
public abstract MongoTemplate mongoTemplate() throws Exception;
public String getMappingBasePackage() {
return "";
}
@Bean
public MongoMappingContext mongoMappingContext() throws ClassNotFoundException, LinkageError {
MongoMappingContext mappingContext = new MongoMappingContext();
String basePackage = getMappingBasePackage();
if (StringUtils.hasText(basePackage)) {
ClassPathScanningCandidateComponentProvider componentProvider = new ClassPathScanningCandidateComponentProvider(false);
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Document.class));
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Persistent.class));
Set<Class<?>> initialEntitySet = new HashSet<Class<?>>();
for (BeanDefinition candidate : componentProvider.findCandidateComponents(basePackage)) {
initialEntitySet.add(ClassUtils.forName(candidate.getBeanClassName(), mappingContext.getClass().getClassLoader()));
}
mappingContext.setInitialEntitySet(initialEntitySet);
}
return mappingContext;
}
@Bean
public MappingMongoConverter mappingMongoConverter() throws Exception {
MappingMongoConverter converter = new MappingMongoConverter(mongoMappingContext());
converter.setMongo(mongo());
afterMappingMongoConverterCreation(converter);
return converter;
}
/**
* Hook that allows post-processing after the MappingMongoConverter has been
* successfully created.
* @param converter
*/
protected void afterMappingMongoConverterCreation(MappingMongoConverter converter) {
}
@Bean
public MappingContextAwareBeanPostProcessor mappingContextAwareBeanPostProcessor() {
MappingContextAwareBeanPostProcessor bpp = new MappingContextAwareBeanPostProcessor();
bpp.setMappingContextBeanName("mongoMappingContext");
return bpp;
}
@Bean MongoPersistentEntityIndexCreator mongoPersistentEntityIndexCreator() throws Exception {
MongoPersistentEntityIndexCreator indexCreator = new MongoPersistentEntityIndexCreator(mongoMappingContext(), mongoTemplate() );
return indexCreator;
}
}
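/*
* Illustrative usage only (not part of the original source): a minimal sketch of a concrete
* configuration class. The database name, base package and the MongoTemplate constructor used here
* are assumptions made for the sake of the example.
*
*   @Configuration
*   public class ExampleMongoConfiguration extends AbstractMongoConfiguration {
*
*     @Override
*     @Bean
*     public Mongo mongo() throws Exception {
*       return new Mongo("localhost");
*     }
*
*     @Override
*     @Bean
*     public MongoTemplate mongoTemplate() throws Exception {
*       return new MongoTemplate(mongo(), "database", mappingMongoConverter());
*     }
*
*     @Override
*     public String getMappingBasePackage() {
*       return "com.example.domain";      // scanned for @Document and @Persistent classes
*     }
*   }
*/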

@@ -1,127 +0,0 @@
/*
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.config;
import java.util.Set;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.NoSuchBeanDefinitionException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.RuntimeBeanReference;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.ManagedSet;
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider;
import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.document.mongodb.convert.MappingMongoConverter;
import org.springframework.data.document.mongodb.mapping.Document;
import org.springframework.data.document.mongodb.mapping.MongoMappingContext;
import org.springframework.data.document.mongodb.mapping.MongoPersistentEntityIndexCreator;
import org.springframework.data.mapping.context.MappingContextAwareBeanPostProcessor;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Oliver Gierke
*/
public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
static final String MAPPING_CONTEXT = "mappingContext";
private static final String INDEX_HELPER = "indexCreationHelper";
private static final String TEMPLATE = "mongoTemplate";
private static final String POST_PROCESSOR = "mappingContextAwareBeanPostProcessor";
private static final String BASE_PACKAGE = "base-package";
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext) throws BeanDefinitionStoreException {
String id = super.resolveId(element, definition, parserContext);
return StringUtils.hasText(id) ? id : "mappingConverter";
}
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
BeanDefinitionRegistry registry = parserContext.getRegistry();
String ctxRef = element.getAttribute("mapping-context-ref");
if (!StringUtils.hasText(ctxRef)) {
BeanDefinitionBuilder mappingContextBuilder = BeanDefinitionBuilder.genericBeanDefinition(MongoMappingContext.class);
Set<String> classesToAdd = getInititalEntityClasses(element, mappingContextBuilder);
if (classesToAdd != null) {
mappingContextBuilder.addPropertyValue("initialEntitySet", classesToAdd);
}
registry.registerBeanDefinition(MAPPING_CONTEXT, mappingContextBuilder.getBeanDefinition());
ctxRef = MAPPING_CONTEXT;
}
try {
registry.getBeanDefinition(POST_PROCESSOR);
} catch (NoSuchBeanDefinitionException ignored) {
BeanDefinitionBuilder postProcBuilder = BeanDefinitionBuilder.genericBeanDefinition(MappingContextAwareBeanPostProcessor.class);
postProcBuilder.addPropertyValue("mappingContextBeanName", ctxRef);
registry.registerBeanDefinition(POST_PROCESSOR, postProcBuilder.getBeanDefinition());
}
BeanDefinitionBuilder converterBuilder = BeanDefinitionBuilder.genericBeanDefinition(MappingMongoConverter.class);
converterBuilder.addConstructorArgReference(ctxRef);
// Need a reference to a Mongo instance
String mongoRef = element.getAttribute("mongo-ref");
converterBuilder.addPropertyReference("mongo", StringUtils.hasText(mongoRef) ? mongoRef : "mongo");
try {
registry.getBeanDefinition(INDEX_HELPER);
} catch (NoSuchBeanDefinitionException ignored) {
String templateRef = element.getAttribute("mongo-template-ref");
BeanDefinitionBuilder indexHelperBuilder = BeanDefinitionBuilder.genericBeanDefinition(MongoPersistentEntityIndexCreator.class);
indexHelperBuilder.addConstructorArgValue(new RuntimeBeanReference(ctxRef));
indexHelperBuilder.addConstructorArgValue(new RuntimeBeanReference(StringUtils.hasText(templateRef) ? templateRef : TEMPLATE));
registry.registerBeanDefinition(INDEX_HELPER, indexHelperBuilder.getBeanDefinition());
}
return converterBuilder.getBeanDefinition();
}
public Set<String> getInititalEntityClasses(Element element, BeanDefinitionBuilder builder) {
String basePackage = element.getAttribute(BASE_PACKAGE);
if (!StringUtils.hasText(basePackage)) {
return null;
}
ClassPathScanningCandidateComponentProvider componentProvider = new ClassPathScanningCandidateComponentProvider(false);
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Document.class));
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Persistent.class));
Set<String> classes = new ManagedSet<String>();
for (BeanDefinition candidate : componentProvider.findCandidateComponents(basePackage)) {
classes.add(candidate.getBeanClassName());
}
return classes;
}
}

@@ -1,69 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.config;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.document.mongodb.MongoAdmin;
import org.springframework.data.document.mongodb.monitor.*;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
public class MongoJmxParser implements BeanDefinitionParser {
public BeanDefinition parse(Element element, ParserContext parserContext) {
String name = element.getAttribute("mongo-ref");
if (!StringUtils.hasText(name)) {
name = "mongo";
}
registerJmxComponents(name, element, parserContext);
return null;
}
protected void registerJmxComponents(String mongoRefName, Element element, ParserContext parserContext) {
Object eleSource = parserContext.extractSource(element);
CompositeComponentDefinition compositeDef = new CompositeComponentDefinition(element.getTagName(), eleSource);
createBeanDefEntry(AssertMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(BackgroundFlushingMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(BtreeIndexCounters.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(ConnectionMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(GlobalLockMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(MemoryMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(OperationCounters.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(ServerInfo.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(MongoAdmin.class, compositeDef, mongoRefName, eleSource, parserContext);
parserContext.registerComponent(compositeDef);
}
protected void createBeanDefEntry(Class<?> clazz, CompositeComponentDefinition compositeDef, String mongoRefName, Object eleSource, ParserContext parserContext) {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(clazz);
builder.getRawBeanDefinition().setSource(eleSource);
builder.addConstructorArgReference(mongoRefName);
BeanDefinition assertDef = builder.getBeanDefinition();
String assertName = parserContext.getReaderContext().registerWithGeneratedName(assertDef);
compositeDef.addNestedComponent(new BeanComponentDefinition(assertDef, assertName));
}
}

@@ -1,94 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.config;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.document.mongodb.MongoFactoryBean;
import org.springframework.data.document.mongodb.MongoOptionsFactoryBean;
import org.springframework.util.StringUtils;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
/**
* Parser for &lt;mongo&gt; definitions. If no id is given, the bean will be registered under the default name "mongo".
*
* @author Mark Pollack
*/
public class MongoParser extends AbstractSingleBeanDefinitionParser {
protected Class<?> getBeanClass(Element element) {
return MongoFactoryBean.class;
}
@Override
protected void doParse(Element element, ParserContext parserContext, BeanDefinitionBuilder builder) {
super.doParse(element, builder);
setPropertyValue(element, builder, "port", "port");
setPropertyValue(element, builder, "host", "host");
parseOptions(parserContext, element, builder);
}
/**
* Parses the options sub-element and populates the given bean definition builder with the corresponding properties.
*
* @param parserContext the current parser context
* @param element the element potentially containing an options sub-element
* @param mongoBuilder the builder for the Mongo factory bean definition
* @return true if parsing actually occurred, false otherwise
*/
private boolean parseOptions(ParserContext parserContext, Element element,
BeanDefinitionBuilder mongoBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "options");
if (optionsElement == null)
return false;
BeanDefinitionBuilder optionsDefBuilder = BeanDefinitionBuilder.genericBeanDefinition(MongoOptionsFactoryBean.class);
setPropertyValue(optionsElement, optionsDefBuilder, "connectionsPerHost", "connectionsPerHost");
setPropertyValue(optionsElement, optionsDefBuilder, "threadsAllowedToBlockForConnectionMultiplier", "threadsAllowedToBlockForConnectionMultiplier");
setPropertyValue(optionsElement, optionsDefBuilder, "maxWaitTime", "maxWaitTime");
setPropertyValue(optionsElement, optionsDefBuilder, "connectTimeout", "connectTimeout");
setPropertyValue(optionsElement, optionsDefBuilder, "socketTimeout", "socketTimeout");
setPropertyValue(optionsElement, optionsDefBuilder, "autoConnectRetry", "autoConnectRetry");
mongoBuilder.addPropertyValue("mongoOptions", optionsDefBuilder.getBeanDefinition());
return true;
}
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
String name = super.resolveId(element, definition, parserContext);
if (!StringUtils.hasText(name)) {
name = "mongo";
}
return name;
}
private void setPropertyValue(Element element, BeanDefinitionBuilder builder, String attrName, String propertyName) {
String attr = element.getAttribute(attrName);
if (StringUtils.hasText(attr)) {
builder.addPropertyValue(propertyName, attr);
}
}
}

@@ -1,95 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.config;
import org.springframework.beans.factory.NoSuchBeanDefinitionException;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.data.document.mongodb.config.SimpleMongoRepositoryConfiguration.MongoRepositoryConfiguration;
import org.springframework.data.mapping.model.MappingContext;
import org.springframework.data.repository.config.AbstractRepositoryConfigDefinitionParser;
import org.w3c.dom.Element;
/**
* {@link org.springframework.beans.factory.xml.BeanDefinitionParser} to create
* Mongo DB repositories from classpath scanning or manual definition.
*
* @author Oliver Gierke
*/
public class MongoRepositoryConfigParser
extends
AbstractRepositoryConfigDefinitionParser<SimpleMongoRepositoryConfiguration, MongoRepositoryConfiguration> {
private static final String MAPPING_CONTEXT_DEFAULT = MappingMongoConverterParser.MAPPING_CONTEXT;
/*
* (non-Javadoc)
*
* @see org.springframework.data.repository.config.
* AbstractRepositoryConfigDefinitionParser
* #getGlobalRepositoryConfigInformation(org.w3c.dom.Element)
*/
@Override
protected SimpleMongoRepositoryConfiguration getGlobalRepositoryConfigInformation(
Element element) {
return new SimpleMongoRepositoryConfiguration(element);
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.AbstractRepositoryConfigDefinitionParser#postProcessBeanDefinition(org.springframework.data.repository.config.SingleRepositoryConfigInformation, org.springframework.beans.factory.support.BeanDefinitionBuilder, org.springframework.beans.factory.support.BeanDefinitionRegistry, java.lang.Object)
*/
@Override
protected void postProcessBeanDefinition(
MongoRepositoryConfiguration context,
BeanDefinitionBuilder builder, BeanDefinitionRegistry registry, Object beanSource) {
builder.addPropertyReference("template", context.getMongoTemplateRef());
String mappingContextRef = getMappingContextReference(context, registry);
if (mappingContextRef != null) {
builder.addPropertyReference("mappingContext", mappingContextRef);
}
}
/**
* Returns the bean name of a {@link MappingContext} to be wired. Will inspect the namespace attribute first and if no
* config is found in that place it will try to look up the default one. Will return {@literal null} if neither one is
* available.
*
* @param config
* @param registry
* @return
*/
private String getMappingContextReference(MongoRepositoryConfiguration config, BeanDefinitionRegistry registry) {
String contextRef = config.getMappingContextRef();
if (contextRef != null) {
return contextRef;
}
try {
registry.getBeanDefinition(MAPPING_CONTEXT_DEFAULT);
return MAPPING_CONTEXT_DEFAULT;
} catch(NoSuchBeanDefinitionException e) {
return null;
}
}
}

@@ -1,41 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.config;
import org.springframework.beans.factory.xml.NamespaceHandlerSupport;
/**
* {@link org.springframework.beans.factory.xml.NamespaceHandler} for Mongo DB
* based repositories.
*
* @author Oliver Gierke
*/
public class MongoRepositoryNamespaceHandler extends NamespaceHandlerSupport {
/*
* (non-Javadoc)
*
* @see org.springframework.beans.factory.xml.NamespaceHandler#init()
*/
public void init() {
registerBeanDefinitionParser("repositories", new MongoRepositoryConfigParser());
registerBeanDefinitionParser("mapping-converter", new MappingMongoConverterParser());
registerBeanDefinitionParser("mongo", new MongoParser());
registerBeanDefinitionParser("jmx", new MongoJmxParser());
}
}
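/*
* Illustrative usage only (not part of the original source): a hedged sketch of the XML namespace
* wired up by this handler, assuming it is bound to the "mongo" prefix. The element and attribute
* names are taken from the parsers registered above and from the configuration classes in this
* changeset.
*
*   <mongo:mongo id="mongo" host="localhost" port="27017">
*     <mongo:options connectionsPerHost="20" autoConnectRetry="true"/>
*   </mongo:mongo>
*
*   <mongo:mapping-converter id="mappingConverter" base-package="com.example.domain" mongo-ref="mongo"/>
*   <mongo:repositories base-package="com.example.repositories" mongo-template-ref="mongoTemplate"/>
*   <mongo:jmx mongo-ref="mongo"/>
*/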

@@ -1,216 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.config;
import org.springframework.data.document.mongodb.repository.MongoRepository;
import org.springframework.data.document.mongodb.repository.MongoRepositoryFactoryBean;
import org.springframework.data.repository.config.AutomaticRepositoryConfigInformation;
import org.springframework.data.repository.config.ManualRepositoryConfigInformation;
import org.springframework.data.repository.config.RepositoryConfig;
import org.springframework.data.repository.config.SingleRepositoryConfigInformation;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* {@link RepositoryConfig} implementation to create
* {@link MongoRepositoryConfiguration} instances for both automatic and manual
* configuration.
*
* @author Oliver Gierke
*/
public class SimpleMongoRepositoryConfiguration extends RepositoryConfig<SimpleMongoRepositoryConfiguration.MongoRepositoryConfiguration, SimpleMongoRepositoryConfiguration> {
private static final String MONGO_TEMPLATE_REF = "mongo-template-ref";
private static final String DEFAULT_MONGO_TEMPLATE_REF = "mongoTemplate";
private static final String MAPPING_CONTEXT_REF = "mongo-mapping-context-ref";
/**
* Creates a new {@link SimpleMongoRepositoryConfiguration} for the given
* {@link Element}.
*
* @param repositoriesElement
*/
protected SimpleMongoRepositoryConfiguration(Element repositoriesElement) {
super(repositoriesElement, MongoRepositoryFactoryBean.class.getName());
}
/**
* Returns the bean name of the {@link org.springframework.data.document.mongodb.MongoTemplate} to be referenced.
*
* @return
*/
public String getMongoTemplateRef() {
String templateRef = getSource().getAttribute(MONGO_TEMPLATE_REF);
return StringUtils.hasText(templateRef) ? templateRef
: DEFAULT_MONGO_TEMPLATE_REF;
}
public String getMappingContextRef() {
String attribute = getSource().getAttribute(MAPPING_CONTEXT_REF);
return StringUtils.hasText(attribute) ? attribute : null;
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.config.GlobalRepositoryConfigInformation
* #getAutoconfigRepositoryInformation(java.lang.String)
*/
public MongoRepositoryConfiguration getAutoconfigRepositoryInformation(
String interfaceName) {
return new AutomaticMongoRepositoryConfiguration(interfaceName, this);
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.config.GlobalRepositoryConfigInformation
* #getRepositoryBaseInterface()
*/
public Class<?> getRepositoryBaseInterface() {
return MongoRepository.class;
}
/*
* (non-Javadoc)
*
* @see org.springframework.data.repository.config.RepositoryConfig#
* createSingleRepositoryConfigInformationFor(org.w3c.dom.Element)
*/
@Override
protected MongoRepositoryConfiguration createSingleRepositoryConfigInformationFor(
Element element) {
return new ManualMongoRepositoryConfiguration(element, this);
}
/**
* Simple interface for configuration values specific to Mongo repositories.
*
* @author Oliver Gierke
*/
public interface MongoRepositoryConfiguration
extends
SingleRepositoryConfigInformation<SimpleMongoRepositoryConfiguration> {
String getMongoTemplateRef();
String getMappingContextRef();
}
/**
* Implements manual lookup of the additional attributes.
*
* @author Oliver Gierke
*/
private static class ManualMongoRepositoryConfiguration
extends
ManualRepositoryConfigInformation<SimpleMongoRepositoryConfiguration>
implements MongoRepositoryConfiguration {
/**
* Creates a new {@link ManualMongoRepositoryConfiguration} for the
* given {@link Element} and parent.
*
* @param element
* @param parent
*/
public ManualMongoRepositoryConfiguration(Element element,
SimpleMongoRepositoryConfiguration parent) {
super(element, parent);
}
/*
* (non-Javadoc)
*
* @see org.springframework.data.document.mongodb.repository.config.
* SimpleMongoRepositoryConfiguration
* .MongoRepositoryConfiguration#getMongoTemplateRef()
*/
public String getMongoTemplateRef() {
return getAttribute(MONGO_TEMPLATE_REF);
}
/* (non-Javadoc)
* @see org.springframework.data.document.mongodb.config.SimpleMongoRepositoryConfiguration.MongoRepositoryConfiguration#getMappingContextRef()
*/
public String getMappingContextRef() {
return getAttribute(MAPPING_CONTEXT_REF);
}
}
/**
* Implements the lookup of the additional attributes during automatic
* configuration.
*
* @author Oliver Gierke
*/
private static class AutomaticMongoRepositoryConfiguration
extends
AutomaticRepositoryConfigInformation<SimpleMongoRepositoryConfiguration>
implements MongoRepositoryConfiguration {
/**
* Creates a new {@link AutomaticMongoRepositoryConfiguration} for the
* given interface and parent.
*
* @param interfaceName
* @param parent
*/
public AutomaticMongoRepositoryConfiguration(String interfaceName,
SimpleMongoRepositoryConfiguration parent) {
super(interfaceName, parent);
}
/*
* (non-Javadoc)
*
* @see org.springframework.data.document.mongodb.repository.config.
* SimpleMongoRepositoryConfiguration
* .MongoRepositoryConfiguration#getMongoTemplateRef()
*/
public String getMongoTemplateRef() {
return getParent().getMongoTemplateRef();
}
/* (non-Javadoc)
* @see org.springframework.data.document.mongodb.config.SimpleMongoRepositoryConfiguration.MongoRepositoryConfiguration#getMappingContextRef()
*/
public String getMappingContextRef() {
return getParent().getMappingContextRef();
}
}
}

@@ -1,636 +0,0 @@
/*
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.convert;
import static org.springframework.data.document.mongodb.convert.ObjectIdConverters.*;
import java.lang.reflect.Array;
import java.lang.reflect.InvocationTargetException;
import java.math.BigInteger;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Date;
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.LinkedList;
import java.util.List;
import java.util.Map;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
import com.mongodb.Mongo;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.bson.types.ObjectId;
import org.springframework.beans.BeansException;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.context.expression.BeanFactoryResolver;
import org.springframework.core.GenericTypeResolver;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.convert.converter.ConverterFactory;
import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.mapping.AssociationHandler;
import org.springframework.data.mapping.MappingBeanHelper;
import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mapping.model.Association;
import org.springframework.data.mapping.model.MappingContext;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mapping.model.PersistentEntity;
import org.springframework.data.mapping.model.PersistentProperty;
import org.springframework.data.mapping.model.PreferredConstructor;
import org.springframework.expression.Expression;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.expression.spel.support.StandardEvaluationContext;
/**
* {@link MongoConverter} that uses a {@link MappingContext} to do sophisticated mapping of domain objects to
* {@link DBObject}.
*
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Oliver Gierke
*/
public class MappingMongoConverter implements MongoConverter, ApplicationContextAware, InitializingBean {
private static final String CUSTOM_TYPE_KEY = "_class";
@SuppressWarnings({"unchecked"})
private static final List<Class<?>> MONGO_TYPES = Arrays.asList(Number.class, Date.class, String.class, DBObject.class);
private static final List<Class<?>> VALID_ID_TYPES = Arrays.asList(new Class<?>[]{ObjectId.class, String.class, BigInteger.class, byte[].class});
protected static final Log log = LogFactory.getLog(MappingMongoConverter.class);
protected final GenericConversionService conversionService = ConversionServiceFactory.createDefaultConversionService();
protected final Map<Class<?>, Class<?>> customTypeMapping = new HashMap<Class<?>, Class<?>>();
protected final MappingContext mappingContext;
protected SpelExpressionParser spelExpressionParser = new SpelExpressionParser();
protected ApplicationContext applicationContext;
protected boolean useFieldAccessOnly = true;
protected Mongo mongo;
protected String defaultDatabase;
/**
* Creates a new {@link MappingMongoConverter} with the given {@link MappingContext}.
*
* @param mappingContext
*/
public MappingMongoConverter(MappingContext mappingContext) {
this.mappingContext = mappingContext;
this.conversionService.removeConvertible(Object.class, String.class);
}
/**
* Add custom {@link Converter} or {@link ConverterFactory} instances to be used that will take precedence over
* metadata-driven conversion of objects to/from DBObject
*
* @param converters
*/
public void setConverters(List<Converter<?, ?>> converters) {
if (null != converters) {
for (Converter<?, ?> c : converters) {
registerConverter(c);
conversionService.addConverter(c);
}
}
}
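/*
* Illustrative usage only (not part of the original source): a hedged sketch of registering a custom
* Converter. The Email type is hypothetical; since its target type (String) is one of the Mongo basic
* types listed in MONGO_TYPES, registerConverter(...) will also record the pair in customTypeMapping.
*
*   converter.setConverters(Arrays.<Converter<?, ?>>asList(new Converter<Email, String>() {
*     public String convert(Email source) {
*       return source.toString();
*     }
*   }));
*/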
/**
* Inspects the given {@link Converter} for the types it can convert and registers the pair for custom type conversion
* in case the target type is a Mongo basic type.
*
* @param converter
*/
private void registerConverter(Converter<?, ?> converter) {
Class<?>[] arguments = GenericTypeResolver.resolveTypeArguments(converter.getClass(), Converter.class);
if (MONGO_TYPES.contains(arguments[1])) {
customTypeMapping.put(arguments[0], arguments[1]);
}
}
public MappingContext getMappingContext() {
return mappingContext;
}
public Mongo getMongo() {
return mongo;
}
public void setMongo(Mongo mongo) {
this.mongo = mongo;
}
public String getDefaultDatabase() {
return defaultDatabase;
}
public void setDefaultDatabase(String defaultDatabase) {
this.defaultDatabase = defaultDatabase;
}
public boolean isUseFieldAccessOnly() {
return useFieldAccessOnly;
}
public void setUseFieldAccessOnly(boolean useFieldAccessOnly) {
this.useFieldAccessOnly = useFieldAccessOnly;
}
public <T> T convertObjectId(ObjectId id, Class<T> targetType) {
return conversionService.convert(id, targetType);
}
public ObjectId convertObjectId(Object id) {
return conversionService.convert(id, ObjectId.class);
}
@SuppressWarnings({"unchecked", "rawtypes"})
public <S extends Object> S read(Class<S> clazz, final DBObject dbo) {
if (null == dbo) {
return null;
}
if ((clazz.isArray()
|| (clazz.isAssignableFrom(Collection.class)
|| clazz.isAssignableFrom(List.class)))
&& dbo instanceof BasicDBList) {
List l = new ArrayList<S>();
BasicDBList dbList = (BasicDBList) dbo;
for (Object o : dbList) {
if (o instanceof DBObject) {
Object newObj = read(clazz.getComponentType(), (DBObject) o);
if (newObj.getClass().isAssignableFrom(clazz.getComponentType())) {
l.add(newObj);
} else {
l.add(conversionService.convert(newObj, clazz.getComponentType()));
}
} else {
l.add(o);
}
}
return conversionService.convert(l, clazz);
}
// Retrieve persistent entity info
PersistentEntity<S> persistentEntity = mappingContext.getPersistentEntity(clazz);
if (persistentEntity == null) {
persistentEntity = mappingContext.addPersistentEntity(clazz);
}
return read(persistentEntity, dbo);
}
private <S extends Object> S read(PersistentEntity<S> entity, final DBObject dbo) {
final StandardEvaluationContext spelCtx = new StandardEvaluationContext();
if (null != applicationContext) {
spelCtx.setBeanResolver(new BeanFactoryResolver(applicationContext));
}
String[] keySet = dbo.keySet().toArray(new String[]{});
for (String key : keySet) {
spelCtx.setVariable(key, dbo.get(key));
}
final List<String> ctorParamNames = new ArrayList<String>();
final S instance = MappingBeanHelper.constructInstance(entity, new PreferredConstructor.ParameterValueProvider() {
public Object getParameterValue(PreferredConstructor.Parameter parameter) {
String name = parameter.getName();
Class<?> type = parameter.getType();
Object obj = dbo.get(name);
if (obj instanceof DBRef) {
ctorParamNames.add(name);
return read(type, ((DBRef) obj).fetch());
} else if (obj instanceof DBObject) {
ctorParamNames.add(name);
return read(type, ((DBObject) obj));
} else if (null != obj && obj.getClass().isAssignableFrom(type)) {
ctorParamNames.add(name);
return obj;
} else if (null != obj) {
ctorParamNames.add(name);
return conversionService.convert(obj, type);
}
return null;
}
}, spelCtx);
// Set the ID
PersistentProperty idProperty = entity.getIdProperty();
if (dbo.containsField("_id") && null != idProperty) {
Object idObj = dbo.get("_id");
try {
MappingBeanHelper.setProperty(instance, idProperty, idObj, useFieldAccessOnly);
} catch (IllegalAccessException e) {
throw new MappingException(e.getMessage(), e);
} catch (InvocationTargetException e) {
throw new MappingException(e.getMessage(), e);
}
}
// Set properties not already set in the constructor
entity.doWithProperties(new PropertyHandler() {
public void doWithPersistentProperty(PersistentProperty prop) {
if (ctorParamNames.contains(prop.getName())) {
return;
}
Object obj = getValueInternal(prop, dbo, spelCtx, prop.getValueAnnotation());
try {
MappingBeanHelper.setProperty(instance, prop, obj, useFieldAccessOnly);
} catch (IllegalAccessException e) {
throw new MappingException(e.getMessage(), e);
} catch (InvocationTargetException e) {
throw new MappingException(e.getMessage(), e);
}
}
});
// Handle associations
entity.doWithAssociations(new AssociationHandler() {
public void doWithAssociation(Association association) {
PersistentProperty inverseProp = association.getInverse();
Object obj = getValueInternal(inverseProp, dbo, spelCtx, inverseProp.getValueAnnotation());
try {
MappingBeanHelper.setProperty(instance, inverseProp, obj);
} catch (IllegalAccessException e) {
throw new MappingException(e.getMessage(), e);
} catch (InvocationTargetException e) {
throw new MappingException(e.getMessage(), e);
}
}
});
return instance;
}
public void write(final Object obj, final DBObject dbo) {
if (null == obj) {
return;
}
PersistentEntity<?> entity = mappingContext.getPersistentEntity(obj.getClass());
write(obj, dbo, entity);
}
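/*
* Illustrative usage only (not part of the original source): a minimal round trip through the two
* public methods above, using a hypothetical Person entity known to the MappingContext.
*
*   DBObject dbo = new BasicDBObject();
*   converter.write(person, dbo);                      // maps the id, properties and associations
*   Person copy = converter.read(Person.class, dbo);   // reconstructs the instance from the DBObject
*/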
protected void write(final Object obj, final DBObject dbo, PersistentEntity<?> entity) {
if (obj == null) {
return;
}
if (null == entity) {
// Must not have explicitly added this entity yet
entity = mappingContext.addPersistentEntity(obj.getClass());
if (null == entity) {
// We can't map this entity for some reason
throw new MappingException("Unable to map entity " + obj);
}
}
// Write the ID
final PersistentProperty idProperty = entity.getIdProperty();
if (!dbo.containsField("_id") && null != idProperty) {
Object idObj = null;
try {
idObj = MappingBeanHelper.getProperty(obj, idProperty, Object.class, useFieldAccessOnly);
} catch (IllegalAccessException e) {
throw new MappingException(e.getMessage(), e);
} catch (InvocationTargetException e) {
throw new MappingException(e.getMessage(), e);
}
if (null != idObj) {
dbo.put("_id", idObj);
} else {
if (!VALID_ID_TYPES.contains(idProperty.getType())) {
throw new MappingException("Invalid data type " + idProperty.getType().getName() + " for Id property. Should be one of " + VALID_ID_TYPES);
}
}
}
// Write the properties
entity.doWithProperties(new PropertyHandler() {
public void doWithPersistentProperty(PersistentProperty prop) {
String name = prop.getName();
Class<?> type = prop.getType();
Object propertyObj = null;
try {
propertyObj = MappingBeanHelper.getProperty(obj, prop, type, useFieldAccessOnly);
} catch (IllegalAccessException e) {
throw new MappingException(e.getMessage(), e);
} catch (InvocationTargetException e) {
throw new MappingException(e.getMessage(), e);
}
if (null != propertyObj) {
if (!MappingBeanHelper.isSimpleType(propertyObj.getClass())) {
writePropertyInternal(prop, propertyObj, dbo);
} else {
dbo.put(name, propertyObj);
}
}
}
});
entity.doWithAssociations(new AssociationHandler() {
public void doWithAssociation(Association association) {
PersistentProperty inverseProp = association.getInverse();
Class<?> type = inverseProp.getType();
Object propertyObj = null;
try {
propertyObj = MappingBeanHelper.getProperty(obj, inverseProp, type, useFieldAccessOnly);
} catch (IllegalAccessException e) {
throw new MappingException(e.getMessage(), e);
} catch (InvocationTargetException e) {
throw new MappingException(e.getMessage(), e);
}
if (null != propertyObj) {
writePropertyInternal(inverseProp, propertyObj, dbo);
}
}
});
}
public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
this.applicationContext = applicationContext;
}
/**
* Registers converters for {@link ObjectId} handling, removes plain {@link #toString()} converter and promotes the
* configured {@link ConversionService} to {@link MappingBeanHelper}.
*/
private void initializeConverters() {
if (!conversionService.canConvert(ObjectId.class, String.class)) {
conversionService.addConverter(ObjectIdToStringConverter.INSTANCE);
}
if (!conversionService.canConvert(String.class, ObjectId.class)) {
conversionService.addConverter(StringToObjectIdConverter.INSTANCE);
}
if (!conversionService.canConvert(ObjectId.class, BigInteger.class)) {
conversionService.addConverter(ObjectIdToBigIntegerConverter.INSTANCE);
}
if (!conversionService.canConvert(BigInteger.class, ObjectId.class)) {
conversionService.addConverter(BigIntegerToObjectIdConverter.INSTANCE);
}
MappingBeanHelper.setConversionService(conversionService);
}
@SuppressWarnings({"unchecked"})
protected void writePropertyInternal(PersistentProperty prop, Object obj, DBObject dbo) {
org.springframework.data.document.mongodb.mapping.DBRef dbref = prop.getField()
.getAnnotation(org.springframework.data.document.mongodb.mapping.DBRef.class);
String name = prop.getName();
Class<?> type = prop.getType();
if (prop.isCollection()) {
BasicDBList dbList = new BasicDBList();
Collection<?> coll;
if (type.isArray()) {
coll = new ArrayList<Object>();
for (Object o : (Object[]) obj) {
((List) coll).add(o);
}
} else {
coll = (Collection<?>) obj;
}
for (Object propObjItem : coll) {
if (null != dbref) {
DBRef dbRef = createDBRef(propObjItem, dbref);
dbList.add(dbRef);
} else if (type.isArray() && MappingBeanHelper.isSimpleType(type.getComponentType())) {
dbList.add(propObjItem);
} else if (MappingBeanHelper.isSimpleType(propObjItem.getClass())) {
dbList.add(propObjItem);
} else {
BasicDBObject propDbObj = new BasicDBObject();
write(propObjItem, propDbObj, mappingContext.getPersistentEntity(prop.getTypeInformation()));
dbList.add(propDbObj);
}
}
dbo.put(name, dbList);
return;
}
if (null != obj && obj instanceof Map) {
BasicDBObject mapDbObj = new BasicDBObject();
writeMapInternal((Map<Object, Object>) obj, mapDbObj);
dbo.put(name, mapDbObj);
return;
}
if (null != dbref) {
DBRef dbRefObj = createDBRef(obj, dbref);
if (null != dbRefObj) {
dbo.put(name, dbRefObj);
return;
}
}
// Lookup potential custom target type
Class<?> basicTargetType = customTypeMapping.get(obj.getClass());
if (basicTargetType != null) {
dbo.put(name, conversionService.convert(obj, basicTargetType));
return;
}
BasicDBObject propDbObj = new BasicDBObject();
write(obj, propDbObj, mappingContext.getPersistentEntity(prop.getTypeInformation()));
dbo.put(name, propDbObj);
}
protected void writeMapInternal(Map<Object, Object> obj, DBObject dbo) {
for (Map.Entry<Object, Object> entry : obj.entrySet()) {
Object key = entry.getKey();
Object val = entry.getValue();
if (MappingBeanHelper.isSimpleType(key.getClass())) {
String simpleKey = conversionService.convert(key, String.class);
if (MappingBeanHelper.isSimpleType(val.getClass())) {
dbo.put(simpleKey, val);
} else {
DBObject newDbo = new BasicDBObject();
Class<?> componentType = val.getClass();
if (componentType.isArray()
|| componentType.isAssignableFrom(Collection.class)
|| componentType.isAssignableFrom(List.class)) {
Class<?> ctype = val.getClass().getComponentType();
dbo.put("_class", (null != ctype ? ctype.getName() : componentType.getName()));
} else {
dbo.put("_class", componentType.getName());
}
write(val, newDbo);
dbo.put(simpleKey, newDbo);
}
} else {
throw new MappingException("Cannot use a complex object as a key value.");
}
}
}
protected DBRef createDBRef(Object target, org.springframework.data.document.mongodb.mapping.DBRef dbref) {
PersistentEntity<?> targetEntity = mappingContext.getPersistentEntity(target.getClass());
if (null == targetEntity || null == targetEntity.getIdProperty()) {
return null;
}
PersistentProperty idProperty = targetEntity.getIdProperty();
ObjectId id = null;
try {
id = MappingBeanHelper.getProperty(target, idProperty, ObjectId.class, useFieldAccessOnly);
if (null == id) {
throw new MappingException("Cannot create a reference to an object with a NULL id.");
}
} catch (IllegalAccessException e) {
throw new MappingException(e.getMessage(), e);
} catch (InvocationTargetException e) {
throw new MappingException(e.getMessage(), e);
}
String collection = dbref.collection();
if ("".equals(collection)) {
collection = targetEntity.getType().getSimpleName().toLowerCase();
}
String dbname = dbref.db();
if ("".equals(dbname)) {
dbname = defaultDatabase;
}
DB db = mongo.getDB(dbname);
return new DBRef(db, collection, id);
}
@SuppressWarnings({"unchecked"})
protected Object getValueInternal(PersistentProperty prop, DBObject dbo, StandardEvaluationContext ctx, Value spelExpr) {
String name = prop.getName();
Object o;
if (null != spelExpr) {
Expression x = spelExpressionParser.parseExpression(spelExpr.value());
o = x.getValue(ctx);
} else {
DBObject from = dbo;
if (dbo instanceof DBRef) {
from = ((DBRef) dbo).fetch();
}
Object dbObj = from.get(name);
if (dbObj instanceof DBObject) {
if (prop.isMap() && dbObj instanceof DBObject) {
// We have to find a potentially stored class to be used first.
Class<?> toType = findTypeToBeUsed((DBObject) dbObj);
Map<String, Object> m = new LinkedHashMap<String, Object>();
for (Map.Entry<String, Object> entry : ((Map<String, Object>) ((DBObject) dbObj).toMap()).entrySet()) {
if (entry.getKey().equals(CUSTOM_TYPE_KEY)) {
continue;
}
if (null != entry.getValue() && entry.getValue() instanceof DBObject) {
m.put(entry.getKey(), read((null != toType ? toType : prop.getMapValueType()), (DBObject) entry.getValue()));
} else {
m.put(entry.getKey(), entry.getValue());
}
}
return m;
} else if (prop.isArray() && dbObj instanceof BasicDBObject && ((DBObject) dbObj).keySet().size() == 0) {
// It's empty
return Array.newInstance(prop.getComponentType(), 0);
} else if (prop.isCollection() && dbObj instanceof BasicDBList) {
BasicDBList dbObjList = (BasicDBList) dbObj;
Object[] items = (Object[]) Array.newInstance(prop.getComponentType(), dbObjList.size());
for (int i = 0; i < dbObjList.size(); i++) {
Object dbObjItem = dbObjList.get(i);
if (dbObjItem instanceof DBRef) {
items[i] = read(prop.getComponentType(), ((DBRef) dbObjItem).fetch());
} else if (dbObjItem instanceof DBObject) {
items[i] = read(prop.getComponentType(), (DBObject) dbObjItem);
} else {
items[i] = dbObjItem;
}
}
List<Object> itemsToReturn = new LinkedList<Object>();
for (Object obj : items) {
itemsToReturn.add(obj);
}
return itemsToReturn;
}
Class<?> toType = findTypeToBeUsed((DBObject) dbObj);
// It's a complex object, have to read it in
if (toType != null) {
// Strip the type hint from the nested object before mapping it.
((DBObject) dbObj).removeField(CUSTOM_TYPE_KEY);
o = read(toType, (DBObject) dbObj);
} else {
o = read(mappingContext.getPersistentEntity(prop.getTypeInformation()), (DBObject) dbObj);
}
} else {
o = dbObj;
}
}
return o;
}
/**
* Returns the type the given {@link DBObject} shall be converted into, derived from the custom type key stored in
* it, or {@literal null} if no type information is present.
*
* @param dbObject the source {@link DBObject}
* @return the type to be used or {@literal null} if none could be determined
*/
protected Class<?> findTypeToBeUsed(DBObject dbObject) {
Object classToBeUsed = dbObject.get(CUSTOM_TYPE_KEY);
if (classToBeUsed == null) {
return null;
}
try {
return Class.forName(classToBeUsed.toString());
} catch (ClassNotFoundException e) {
throw new MappingException(e.getMessage(), e);
}
}
public void afterPropertiesSet() {
initializeConverters();
}
protected class PersistentPropertyWrapper {
private final PersistentProperty property;
private final DBObject target;
public PersistentPropertyWrapper(PersistentProperty property, DBObject target) {
this.property = property;
this.target = target;
}
public PersistentProperty getProperty() {
return property;
}
public DBObject getTarget() {
return target;
}
}
}
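
For orientation, here is a minimal standalone sketch of the custom type-hint round trip the converter above relies on. It uses only the driver's BasicDBObject; the literal key "_class" mirrors the value written in writeMapInternal(…) and is assumed to match CUSTOM_TYPE_KEY, which is not shown in this excerpt.

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

// Sketch only: how a stored class name is resolved back into a type,
// mirroring writeMapInternal(…) and findTypeToBeUsed(…) above.
public class TypeHintSketch {

  public static void main(String[] args) throws ClassNotFoundException {
    // Writing side: the converter stores the value's class name next to its fields.
    DBObject dbo = new BasicDBObject();
    dbo.put("_class", java.util.Locale.class.getName());

    // Reading side: resolve the stored name first, otherwise fall back to the declared property type.
    Object hint = dbo.get("_class");
    Class<?> type = (hint != null) ? Class.forName(hint.toString()) : null;
    System.out.println("Would map nested document into: " + type);
  }
}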

View File

@@ -1,94 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.convert;
import static org.springframework.beans.PropertyAccessorFactory.forBeanPropertyAccess;
import static org.springframework.beans.PropertyAccessorFactory.forDirectFieldAccess;
import org.springframework.beans.BeanWrapper;
import org.springframework.beans.ConfigurablePropertyAccessor;
import org.springframework.beans.NotWritablePropertyException;
import org.springframework.core.convert.ConversionService;
import org.springframework.data.document.mongodb.MongoPropertyDescriptors;
import org.springframework.data.document.mongodb.MongoPropertyDescriptors.MongoPropertyDescriptor;
import org.springframework.util.Assert;
/**
* Custom Mongo specific {@link BeanWrapper} to allow access to bean properties via {@link MongoPropertyDescriptor}s.
*
* @author Oliver Gierke
*/
class MongoBeanWrapper {
private final ConfigurablePropertyAccessor accessor;
private final MongoPropertyDescriptors descriptors;
private final boolean fieldAccess;
/**
* Creates a new {@link MongoBeanWrapper} for the given target object and {@link ConversionService}.
*
* @param target
* @param conversionService
* @param fieldAccess
*/
public MongoBeanWrapper(Object target, ConversionService conversionService, boolean fieldAccess) {
Assert.notNull(target);
Assert.notNull(conversionService);
this.fieldAccess = fieldAccess;
this.accessor = fieldAccess ? forDirectFieldAccess(target) : forBeanPropertyAccess(target);
this.accessor.setConversionService(conversionService);
this.descriptors = new MongoPropertyDescriptors(target.getClass());
}
/**
* Returns all {@link MongoPropertyDescriptors.MongoPropertyDescriptor}s for the underlying target object.
*
* @return
*/
public MongoPropertyDescriptors getDescriptors() {
return this.descriptors;
}
/**
* Returns the value of the underlying object for the given property.
*
* @param descriptor
* @return
*/
public Object getValue(MongoPropertyDescriptors.MongoPropertyDescriptor descriptor) {
Assert.notNull(descriptor);
return accessor.getPropertyValue(descriptor.getName());
}
/**
* Sets the property of the underlying object to the given value.
*
* @param descriptor
* @param value
*/
public void setValue(MongoPropertyDescriptors.MongoPropertyDescriptor descriptor, Object value) {
Assert.notNull(descriptor);
try {
accessor.setPropertyValue(descriptor.getName(), value);
} catch (NotWritablePropertyException e) {
if (!fieldAccess) {
throw e;
}
}
}
}
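
A hedged usage sketch for the wrapper above. The Person bean and the "name" property are hypothetical; since MongoBeanWrapper is package-private, the sketch assumes it lives in the same package.

package org.springframework.data.document.mongodb.convert;

import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.data.document.mongodb.MongoPropertyDescriptors.MongoPropertyDescriptor;

// Illustration only: placed in the convert package because MongoBeanWrapper is package-private.
class MongoBeanWrapperSketch {

  // Hypothetical bean used only for this example.
  public static class Person {
    private String name;
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
  }

  public static void main(String[] args) {
    MongoBeanWrapper wrapper = new MongoBeanWrapper(new Person(),
        ConversionServiceFactory.createDefaultConversionService(), false);
    // Iterate the descriptors and write through the property accessor.
    for (MongoPropertyDescriptor descriptor : wrapper.getDescriptors()) {
      if ("name".equals(descriptor.getName())) {
        wrapper.setValue(descriptor, "Dave");
      }
    }
  }
}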

View File

@@ -1,43 +0,0 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.convert;
import org.bson.types.ObjectId;
import org.springframework.data.document.mongodb.MongoReader;
import org.springframework.data.document.mongodb.MongoWriter;
public interface MongoConverter extends MongoWriter<Object>, MongoReader<Object> {
/**
* Converts the given {@link ObjectId} to the given target type.
*
* @param <T> the actual type to create
* @param id the source {@link ObjectId}
* @param targetType the target type to convert the {@link ObjectId} to
* @return
*/
public <T> T convertObjectId(ObjectId id, Class<T> targetType);
/**
* Returns the {@link ObjectId} instance for the given id.
*
* @param id
* @return
*/
public ObjectId convertObjectId(Object id);
}
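
A minimal sketch of the two conversion methods, assuming the SimpleMongoConverter implementation shown further down in this diff and the 2.x driver's ObjectId.

import org.bson.types.ObjectId;
import org.springframework.data.document.mongodb.convert.SimpleMongoConverter;

public class MongoConverterSketch {

  public static void main(String[] args) {
    SimpleMongoConverter converter = new SimpleMongoConverter();
    converter.afterPropertiesSet(); // registers the ObjectId <-> String/BigInteger converters

    ObjectId id = ObjectId.get();
    String asString = converter.convertObjectId(id, String.class); // ObjectId -> String
    ObjectId back = converter.convertObjectId(asString);           // String -> ObjectId
    System.out.println(id.equals(back));                           // true
  }
}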

View File

@@ -1,88 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.convert;
import java.math.BigInteger;
import org.bson.types.ObjectId;
import org.springframework.core.convert.converter.Converter;
/**
* Wrapper class to contain useful {@link ObjectId}-to-something-and-back converters.
*
* @author Oliver Gierke
*/
abstract class ObjectIdConverters {
/**
* Private constructor to prevent instantiation.
*/
private ObjectIdConverters() {
}
/**
* Simple singleton to convert {@link ObjectId}s to their {@link String} representation.
*
* @author Oliver Gierke
*/
public static enum ObjectIdToStringConverter implements Converter<ObjectId, String> {
INSTANCE;
public String convert(ObjectId id) {
return id.toString();
}
}
/**
* Simple singleton to convert {@link String}s to their {@link ObjectId} representation.
*
* @author Oliver Gierke
*/
public static enum StringToObjectIdConverter implements Converter<String, ObjectId> {
INSTANCE;
public ObjectId convert(String source) {
return new ObjectId(source);
}
}
/**
* Simple singleton to convert {@link ObjectId}s to their {@link java.math.BigInteger} representation.
*
* @author Oliver Gierke
*/
public static enum ObjectIdToBigIntegerConverter implements Converter<ObjectId, BigInteger> {
INSTANCE;
public BigInteger convert(ObjectId source) {
return new BigInteger(source.toString(), 16);
}
}
/**
* Simple singleton to convert {@link BigInteger}s to their {@link ObjectId} representation.
*
* @author Oliver Gierke
*/
public static enum BigIntegerToObjectIdConverter implements Converter<BigInteger, ObjectId> {
INSTANCE;
public ObjectId convert(BigInteger source) {
// Left-pad to 24 hex characters as BigInteger drops leading zeros.
return new ObjectId(String.format("%024x", source));
}
}
}
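
The converter enums above are package-private, but the underlying conversions are easy to reproduce. A standalone sketch, assuming the 2.x driver's ObjectId and the zero-padding used by the BigInteger-to-ObjectId converter above:

import java.math.BigInteger;
import org.bson.types.ObjectId;

public class ObjectIdRoundTripSketch {

  public static void main(String[] args) {
    ObjectId id = ObjectId.get();

    // ObjectId -> BigInteger: the 24-character hex form, interpreted base 16.
    BigInteger asBigInteger = new BigInteger(id.toString(), 16);

    // BigInteger -> ObjectId: re-pad to 24 hex characters, as BigInteger drops leading zeros.
    ObjectId back = new ObjectId(String.format("%024x", asBigInteger));

    System.out.println(id.equals(back)); // true
  }
}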

View File

@@ -1,541 +0,0 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.convert;
import static org.springframework.data.document.mongodb.convert.ObjectIdConverters.*;
import java.lang.reflect.*;
import java.math.BigInteger;
import java.util.*;
import java.util.Map.Entry;
import java.util.regex.Pattern;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.bson.types.CodeWScope;
import org.bson.types.ObjectId;
import org.springframework.beans.BeanUtils;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.core.CollectionFactory;
import org.springframework.core.convert.ConversionFailedException;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.convert.converter.ConverterFactory;
import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.document.mongodb.MongoPropertyDescriptors.MongoPropertyDescriptor;
import org.springframework.util.Assert;
import org.springframework.util.comparator.CompoundComparator;
/**
* Basic {@link MongoConverter} implementation to convert between domain classes and {@link DBObject}s.
*
* @author Mark Pollack
* @author Thomas Risberg
* @author Oliver Gierke
*/
public class SimpleMongoConverter implements MongoConverter, InitializingBean {
private static final Log LOG = LogFactory.getLog(SimpleMongoConverter.class);
@SuppressWarnings("unchecked")
private static final List<Class<?>> MONGO_TYPES = Arrays.asList(Number.class, Date.class, String.class, DBObject.class);
private static final Set<String> SIMPLE_TYPES;
static {
Set<String> basics = new HashSet<String>();
basics.add(boolean.class.getName());
basics.add(long.class.getName());
basics.add(short.class.getName());
basics.add(int.class.getName());
basics.add(byte.class.getName());
basics.add(float.class.getName());
basics.add(double.class.getName());
basics.add(char.class.getName());
basics.add(Boolean.class.getName());
basics.add(Long.class.getName());
basics.add(Short.class.getName());
basics.add(Integer.class.getName());
basics.add(Byte.class.getName());
basics.add(Float.class.getName());
basics.add(Double.class.getName());
basics.add(Character.class.getName());
basics.add(String.class.getName());
basics.add(java.util.Date.class.getName());
// basics.add(Time.class.getName());
// basics.add(Timestamp.class.getName());
// basics.add(java.sql.Date.class.getName());
// basics.add(BigDecimal.class.getName());
// basics.add(BigInteger.class.getName());
basics.add(Locale.class.getName());
// basics.add(Calendar.class.getName());
// basics.add(GregorianCalendar.class.getName());
// basics.add(java.util.Currency.class.getName());
// basics.add(TimeZone.class.getName());
// basics.add(Object.class.getName());
basics.add(Class.class.getName());
// basics.add(byte[].class.getName());
// basics.add(Byte[].class.getName());
// basics.add(char[].class.getName());
// basics.add(Character[].class.getName());
// basics.add(Blob.class.getName());
// basics.add(Clob.class.getName());
// basics.add(Serializable.class.getName());
// basics.add(URI.class.getName());
// basics.add(URL.class.getName());
basics.add(DBRef.class.getName());
basics.add(Pattern.class.getName());
basics.add(CodeWScope.class.getName());
basics.add(ObjectId.class.getName());
basics.add(Enum.class.getName());
SIMPLE_TYPES = Collections.unmodifiableSet(basics);
}
private final GenericConversionService conversionService;
/**
* Creates a {@link SimpleMongoConverter}.
*/
public SimpleMongoConverter() {
this.conversionService = ConversionServiceFactory.createDefaultConversionService();
this.conversionService.removeConvertible(Object.class, String.class);
}
/**
* Initializes additional converters that handle {@link ObjectId} conversion. Will register converters for the
* supported id types if none are registered for those conversions on the configured {@link GenericConversionService}
* already.
*/
private void initializeConverters() {
if (!conversionService.canConvert(ObjectId.class, String.class)) {
conversionService.addConverter(ObjectIdToStringConverter.INSTANCE);
}
if (!conversionService.canConvert(String.class, ObjectId.class)) {
conversionService.addConverter(StringToObjectIdConverter.INSTANCE);
}
if (!conversionService.canConvert(ObjectId.class, BigInteger.class)) {
conversionService.addConverter(ObjectIdToBigIntegerConverter.INSTANCE);
}
if (!conversionService.canConvert(BigInteger.class, ObjectId.class)) {
conversionService.addConverter(BigIntegerToObjectIdConverter.INSTANCE);
}
}
/**
* Adds custom {@link Converter} or {@link ConverterFactory} instances to be used. These take precedence over object
* traversal when converting an object to or from a {@link DBObject}.
*
* @param converters the custom converters to register
*/
public void setConverters(Set<?> converters) {
for (Object converter : converters) {
boolean added = false;
if (converter instanceof Converter) {
this.conversionService.addConverter((Converter<?, ?>) converter);
added = true;
}
if (converter instanceof ConverterFactory) {
this.conversionService.addConverterFactory((ConverterFactory<?, ?>) converter);
added = true;
}
if (!added) {
throw new IllegalArgumentException("Given set contains element that is neither Converter nor ConverterFactory!");
}
}
}
/*
* (non-Javadoc)
*
* @see org.springframework.data.document.mongodb.MongoWriter#write(java.lang.Object, com.mongodb.DBObject)
*/
@SuppressWarnings("rawtypes")
public void write(Object obj, DBObject dbo) {
MongoBeanWrapper beanWrapper = createWrapper(obj, false);
for (MongoPropertyDescriptor descriptor : beanWrapper.getDescriptors()) {
if (descriptor.isMappable()) {
Object value = beanWrapper.getValue(descriptor);
if (value == null) {
continue;
}
String keyToUse = descriptor.getKeyToMap();
if (descriptor.isEnum()) {
writeValue(dbo, keyToUse, ((Enum) value).name());
} else if (descriptor.isIdProperty() && descriptor.isOfIdType()) {
if (value instanceof String && ObjectId.isValid((String) value)) {
try {
writeValue(dbo, keyToUse, conversionService.convert(value, ObjectId.class));
} catch (ConversionFailedException iae) {
LOG.warn("Unable to convert the String " + value + " to an ObjectId");
writeValue(dbo, keyToUse, value);
}
} else {
// we can't convert this id - use as is
writeValue(dbo, keyToUse, value);
}
} else {
writeValue(dbo, keyToUse, value);
}
} else {
if (!"class".equals(descriptor.getName())) {
LOG.debug("Skipping property " + descriptor.getName() + " as it's not a mappable one.");
}
}
}
}
/**
* Writes the given value to the given {@link DBObject} under the given key. Expects a non-{@literal null} value;
* {@literal null} values are already skipped by the calling {@code write(…)} method.
*
* @param dbo
* @param keyToUse
* @param value
*/
private void writeValue(DBObject dbo, String keyToUse, Object value) {
if (!isSimpleType(value.getClass())) {
writeCompoundValue(dbo, keyToUse, value);
} else {
dbo.put(keyToUse, value);
}
}
/**
* Writes the given compound (non-simple) value to the given {@link DBObject} under the given key.
*
* @param dbo
* @param keyToUse
* @param value
*/
@SuppressWarnings("unchecked")
private void writeCompoundValue(DBObject dbo, String keyToUse, Object value) {
if (value instanceof Map) {
writeMap(dbo, keyToUse, (Map<String, Object>) value);
return;
}
if (value instanceof Collection) {
// Should write a collection!
writeArray(dbo, keyToUse, ((Collection<Object>) value).toArray());
return;
}
if (value instanceof Object[]) {
// Should write an array!
writeArray(dbo, keyToUse, (Object[]) value);
return;
}
Class<?> customTargetType = getCustomTargetType(value);
if (customTargetType != null) {
dbo.put(keyToUse, conversionService.convert(value, customTargetType));
return;
}
DBObject nestedDbo = new BasicDBObject();
write(value, nestedDbo);
dbo.put(keyToUse, nestedDbo);
}
/**
* Returns the MongoDB-supported type the {@link ConversionService} can convert the given object into through a
* custom {@link Converter}, or {@literal null} if no such converter is registered.
*
* @param obj
* @return
*/
private Class<?> getCustomTargetType(Object obj) {
for (Class<?> mongoType : MONGO_TYPES) {
if (conversionService.canConvert(obj.getClass(), mongoType)) {
return mongoType;
}
}
return null;
}
/**
* Writes the given {@link Map} to the given {@link DBObject}.
*
* @param dbo
* @param mapKey
* @param map
*/
protected void writeMap(DBObject dbo, String mapKey, Map<String, Object> map) {
// TODO support non-string based keys as long as there is a Spring Converter obj->string and (optionally)
// string->obj
DBObject dboToPopulate = null;
// TODO - Does that make sense? The newly created object only makes it out of this method through the
// put(mapKey, …) call at the end
if (mapKey != null) {
dboToPopulate = new BasicDBObject();
} else {
dboToPopulate = dbo;
}
if (map != null) {
for (Entry<String, Object> entry : map.entrySet()) {
Object entryValue = entry.getValue();
String entryKey = entry.getKey();
if (!isSimpleType(entryValue.getClass())) {
writeCompoundValue(dboToPopulate, entryKey, entryValue);
} else {
dboToPopulate.put(entryKey, entryValue);
}
}
dbo.put(mapKey, dboToPopulate);
}
}
/**
* Writes the given array to the given {@link DBObject}.
*
* @param dbo
* @param keyToUse
* @param array
*/
protected void writeArray(DBObject dbo, String keyToUse, Object[] array) {
Object[] dboValues;
if (array != null) {
dboValues = new Object[array.length];
int i = 0;
for (Object o : array) {
if (!isSimpleType(o.getClass())) {
DBObject dboValue = new BasicDBObject();
write(o, dboValue);
dboValues[i] = dboValue;
} else {
dboValues[i] = o;
}
i++;
}
dbo.put(keyToUse, dboValues);
}
}
/*
* (non-Javadoc)
*
* @see org.springframework.data.document.mongodb.MongoReader#read(java.lang.Class, com.mongodb.DBObject)
*/
public <S> S read(Class<S> clazz, DBObject source) {
if (source == null) {
return null;
}
Assert.notNull(clazz, "Mapped class was not specified");
S target = BeanUtils.instantiateClass(clazz);
MongoBeanWrapper bw = new MongoBeanWrapper(target, conversionService, true);
for (MongoPropertyDescriptor descriptor : bw.getDescriptors()) {
String keyToUse = descriptor.getKeyToMap();
if (source.containsField(keyToUse)) {
if (descriptor.isMappable()) {
Object value = source.get(keyToUse);
if (value == null) {
// Nothing to map for fields that were stored as null.
} else if (!isSimpleType(value.getClass())) {
if (value instanceof Object[]) {
bw.setValue(descriptor, readCollection(descriptor, Arrays.asList((Object[]) value))
.toArray());
} else if (value instanceof BasicDBList) {
bw.setValue(descriptor, readCollection(descriptor, (BasicDBList) value));
} else if (value instanceof DBObject) {
bw.setValue(descriptor, readCompoundValue(descriptor, (DBObject) value));
} else {
LOG.warn("Unable to map compound DBObject field " + keyToUse + " to property "
+ descriptor.getName()
+ ". The field value should have been a 'DBObject.class' but was "
+ value.getClass().getName());
}
} else {
bw.setValue(descriptor, value);
}
} else {
LOG.warn("Unable to map DBObject field " + keyToUse + " to property " + descriptor.getName()
+ ". Skipping.");
}
}
}
return target;
}
/**
* Reads the given collection values (that are {@link DBObject}s potentially) into a {@link Collection} of domain
* objects.
*
* @param descriptor
* @param values
* @return
*/
private Collection<Object> readCollection(MongoPropertyDescriptor descriptor, Collection<?> values) {
Class<?> targetCollectionType = descriptor.getPropertyType();
boolean targetIsArray = targetCollectionType.isArray();
@SuppressWarnings("unchecked")
Collection<Object> result = targetIsArray ? new ArrayList<Object>(values.size()) : CollectionFactory
.createCollection(targetCollectionType, values.size());
for (Object o : values) {
if (o instanceof DBObject) {
Class<?> type;
if (targetIsArray) {
type = targetCollectionType.getComponentType();
} else {
type = getGenericParameters(descriptor.getTypeToSet()).get(0);
}
result.add(read(type, (DBObject) o));
} else {
result.add(o);
}
}
return result;
}
/**
* Reads a compound value from the given {@link DBObject} for the given property.
*
* @param pd
* @param dbo
* @return
*/
private Object readCompoundValue(MongoPropertyDescriptor pd, DBObject dbo) {
Assert.isTrue(!pd.isCollection(), "Collections not supported!");
if (pd.isMap()) {
return readMap(pd, dbo, getGenericParameters(pd.getTypeToSet()).get(1));
} else {
return read(pd.getPropertyType(), dbo);
}
}
/**
* Create a {@link Map} instance. Will return a {@link HashMap} by default. Subclasses might want to override this
* method to use a custom {@link Map} implementation.
*
* @return
*/
protected Map<String, Object> createMap() {
return new HashMap<String, Object>();
}
/**
* Reads every key/value pair from the {@link DBObject} into a {@link Map} instance.
*
* @param pd
* @param dbo
* @param targetType
* @return
*/
protected Map<?, ?> readMap(MongoPropertyDescriptor pd, DBObject dbo, Class<?> targetType) {
Map<String, Object> map = createMap();
for (String key : dbo.keySet()) {
Object value = dbo.get(key);
if (!isSimpleType(value.getClass())) {
map.put(key, read(targetType, (DBObject) value));
// Can do some reflection tricks here -
// throw new RuntimeException("User types not supported yet as values for Maps");
} else {
map.put(key, conversionService.convert(value, targetType));
}
}
return map;
}
protected static boolean isSimpleType(Class<?> propertyType) {
if (propertyType == null) {
return false;
}
if (propertyType.isArray()) {
return isSimpleType(propertyType.getComponentType());
}
return SIMPLE_TYPES.contains(propertyType.getName());
}
/**
* Callback to allow customizing creation of a {@link MongoBeanWrapper}.
*
* @param target the target object to wrap
* @param fieldAccess whether to use field access or property access
* @return
*/
protected MongoBeanWrapper createWrapper(Object target, boolean fieldAccess) {
return new MongoBeanWrapper(target, conversionService, fieldAccess);
}
public List<Class<?>> getGenericParameters(Type genericParameterType) {
List<Class<?>> actualGenericParameterTypes = new ArrayList<Class<?>>();
if (genericParameterType instanceof ParameterizedType) {
ParameterizedType aType = (ParameterizedType) genericParameterType;
Type[] parameterArgTypes = aType.getActualTypeArguments();
for (Type parameterArgType : parameterArgTypes) {
if (parameterArgType instanceof GenericArrayType) {
Class<?> arrayType = (Class<?>) ((GenericArrayType) parameterArgType).getGenericComponentType();
actualGenericParameterTypes.add(Array.newInstance(arrayType, 0).getClass());
} else if (parameterArgType instanceof ParameterizedType) {
ParameterizedType paramTypeArgs = (ParameterizedType) parameterArgType;
actualGenericParameterTypes.add((Class<?>) paramTypeArgs.getRawType());
} else if (parameterArgType instanceof TypeVariable) {
throw new RuntimeException("Can not map " + ((TypeVariable<?>) parameterArgType).getName());
} else if (parameterArgType instanceof Class) {
actualGenericParameterTypes.add((Class<?>) parameterArgType);
} else {
throw new RuntimeException("Can not map " + parameterArgType);
}
}
}
return actualGenericParameterTypes;
}
/* (non-Javadoc)
* @see org.springframework.data.document.mongodb.convert.MongoConverter#convertObjectId(org.bson.types.ObjectId, java.lang.Class)
*/
public <T> T convertObjectId(ObjectId id, Class<T> targetType) {
return conversionService.convert(id, targetType);
}
/* (non-Javadoc)
* @see org.springframework.data.document.mongodb.convert.MongoConverter#convertObjectId(java.lang.Object)
*/
public ObjectId convertObjectId(Object id) {
return conversionService.convert(id, ObjectId.class);
}
/* (non-Javadoc)
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
*/
public void afterPropertiesSet() {
initializeConverters();
}
}
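
A hedged sketch of plugging a custom converter into the class above. The Email value object and Contact bean are hypothetical, and the expected output assumes the default key mapping plus the precedence described in setConverters(…) and getCustomTargetType(…); this is a sketch, not a definitive usage contract.

import java.util.Collections;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.document.mongodb.convert.SimpleMongoConverter;

public class CustomConverterSketch {

  // Hypothetical value object and owning bean, used only for illustration.
  public static class Email {
    private final String value;
    public Email(String value) { this.value = value; }
    public String getValue() { return value; }
  }

  public static class Contact {
    private Email email = new Email("dave@example.com");
    public Email getEmail() { return email; }
    public void setEmail(Email email) { this.email = email; }
  }

  public static void main(String[] args) {
    SimpleMongoConverter converter = new SimpleMongoConverter();
    // A Converter into one of the MongoDB-native types (here String) takes precedence over nested traversal.
    converter.setConverters(Collections.singleton(new Converter<Email, String>() {
      public String convert(Email source) {
        return source.getValue();
      }
    }));
    converter.afterPropertiesSet();

    DBObject dbo = new BasicDBObject();
    converter.write(new Contact(), dbo);
    System.out.println(dbo); // expected (assuming default key mapping): { "email" : "dave@example.com" }
  }
}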

View File

@@ -1,97 +0,0 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.geo;
/**
* Represents a geospatial box value
* @author Mark Pollack
*
*/
public class Box {
private double xmin;
private double ymin;
private double xmax;
private double ymax;
public Box(Point lowerLeft, Point upperRight) {
xmin = lowerLeft.getX();
ymin = lowerLeft.getY();
xmax = upperRight.getX();
ymax = upperRight.getY();
}
public Box(double[] lowerLeft, double[] upperRight) {
xmin = lowerLeft[0];
ymin = lowerLeft[1];
xmax = upperRight[0];
ymax = upperRight[1];
}
public Point getLowerLeft() {
return new Point(xmin, ymin);
}
public Point getUpperRight() {
return new Point(xmax, ymax);
}
@Override
public String toString() {
return "Box [xmin=" + xmin + ", ymin=" + ymin + ", xmax=" + xmax
+ ", ymax=" + ymax + "]";
}
@Override
public int hashCode() {
final int prime = 31;
int result = 1;
long temp;
temp = Double.doubleToLongBits(xmax);
result = prime * result + (int) (temp ^ (temp >>> 32));
temp = Double.doubleToLongBits(xmin);
result = prime * result + (int) (temp ^ (temp >>> 32));
temp = Double.doubleToLongBits(ymax);
result = prime * result + (int) (temp ^ (temp >>> 32));
temp = Double.doubleToLongBits(ymin);
result = prime * result + (int) (temp ^ (temp >>> 32));
return result;
}
@Override
public boolean equals(Object obj) {
if (this == obj)
return true;
if (obj == null)
return false;
if (getClass() != obj.getClass())
return false;
Box other = (Box) obj;
if (Double.doubleToLongBits(xmax) != Double.doubleToLongBits(other.xmax))
return false;
if (Double.doubleToLongBits(xmin) != Double.doubleToLongBits(other.xmin))
return false;
if (Double.doubleToLongBits(ymax) != Double.doubleToLongBits(other.ymax))
return false;
if (Double.doubleToLongBits(ymin) != Double.doubleToLongBits(other.ymin))
return false;
return true;
}
}

View File

@@ -1,50 +0,0 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.geo;
import java.util.Arrays;
/**
* Represents a geospatial circle value
* @author Mark Pollack
*
*/
public class Circle {
private double[] center;
private double radius;
public Circle(double centerX, double centerY, double radius) {
this.center = new double[] { centerX, centerY };
this.radius = radius;
}
public double[] getCenter() {
return center;
}
public double getRadius() {
return radius;
}
@Override
public String toString() {
return "Circle [center=" + Arrays.toString(center) + ", radius=" + radius
+ "]";
}
}

View File

@@ -1,85 +0,0 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.geo;
/**
* Represents a geospatial point value
* @author Mark Pollack
*
*/
public class Point {
private double x;
private double y;
public Point(double x, double y) {
this.x = x;
this.y = y;
}
public Point(Point point) {
this.x = point.x;
this.y = point.y;
}
public double getX() {
return x;
}
public double getY() {
return y;
}
@Override
public int hashCode() {
final int prime = 31;
int result = 1;
long temp;
temp = Double.doubleToLongBits(x);
result = prime * result + (int) (temp ^ (temp >>> 32));
temp = Double.doubleToLongBits(y);
result = prime * result + (int) (temp ^ (temp >>> 32));
return result;
}
@Override
public boolean equals(Object obj) {
if (this == obj)
return true;
if (obj == null)
return false;
if (getClass() != obj.getClass())
return false;
Point other = (Point) obj;
if (Double.doubleToLongBits(x) != Double
.doubleToLongBits(other.x))
return false;
if (Double.doubleToLongBits(y) != Double
.doubleToLongBits(other.y))
return false;
return true;
}
@Override
public String toString() {
return "Point [latitude=" + x + ", longitude=" + y + "]";
}
}
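
A short sketch constructing the three geo value types shown above. The coordinate values are arbitrary examples; the classes are plain holders, so no driver or running database is required.

import org.springframework.data.document.mongodb.geo.Box;
import org.springframework.data.document.mongodb.geo.Circle;
import org.springframework.data.document.mongodb.geo.Point;

public class GeoShapesSketch {

  public static void main(String[] args) {
    // Arbitrary example coordinates (x, y).
    Point lowerLeft = new Point(-73.99756, 40.73083);
    Point upperRight = new Point(-73.98806, 40.74117);

    Box box = new Box(lowerLeft, upperRight);
    Circle circle = new Circle(-73.99171, 40.738868, 0.01);

    System.out.println(box);    // Box [xmin=..., ymin=..., xmax=..., ymax=...]
    System.out.println(circle); // Circle [center=[...], radius=0.01]
  }
}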

Some files were not shown because too many files have changed in this diff.