Compare commits


161 Commits

Author SHA1 Message Date
Thomas Risberg
613b183b70 Preparing for 1.0.0.M5 MongoDB release
* Fixing spring-data-commons dependency
2011-11-04 09:20:18 -04:00
Thomas Risberg
5d1320a82a preparing for snapshot builds 2011-10-24 17:46:12 -04:00
Thomas Risberg
8ccaf61d12 preparing for 1.0.0.M5 MongoDB release 2011-10-24 17:39:28 -04:00
Thomas Risberg
2c46cfd8fe Updating documentation with project name changes 2011-10-24 17:32:41 -04:00
Thomas Risberg
45f2900d15 Updating documentation with package name changes and Criteria changes 2011-10-24 17:04:20 -04:00
Oliver Gierke
79934538b6 DATAMONGO-303 - Updated to Querydsl 2.2.4. 2011-10-24 14:52:01 -05:00
Oliver Gierke
0e4e0094a5 DATAMONGO-302, DATACMNS-91 - Added null-checks for CRUD methods where necessary.
CRUD methods in SimpleMongoRepository now consistently throw IllegalArgumentExceptions for null parameters handed to them.
2011-10-24 14:21:12 -05:00
Thomas Risberg
5df61563f4 DATADOC-300 Changing to use InvalidMongoDbApiUsageException 2011-10-20 15:44:23 -04:00
Thomas Risberg
04ebec993d DATADOC-300 Removing old 'or' method, adding JavaDoc 2011-10-20 15:34:51 -04:00
Thomas Risberg
caa245dd08 DATADOC-283 DATADOC-300 Refactoring the QueryCriteria implementations to better support $and, $or and $nor queries 2011-10-20 15:22:37 -04:00
Oliver Gierke
717ff38319 DATADOC-230 - Added method to remove objects from specific collection.
Added MongoOperations.remove(Object, String) and the corresponding MongoTemplate implementation to be able to explicitly define the collection an object should be removed from. This aligns with the method signatures we provide for all other methods as well.

Consolidated MongoTemplate.getIdPropertName(…) and ….getIdValue(…) into ….getIdQueryFor(…) as they were only used inside a method building a by-id-query for an object.
2011-10-20 10:02:36 +02:00
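A minimal sketch of the new overload in use, assuming a configured MongoOperations instance; the Person entity and the collection name are illustrative:

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

public class RemoveFromCollectionExample {

    public void removeFromNamedCollection(MongoOperations operations) {
        Person person = operations.findOne(
                new Query(Criteria.where("lastname").is("Matthews")), Person.class);
        // New overload: remove the object from an explicitly named collection
        // instead of the collection derived from the entity type
        operations.remove(person, "legacyPeople");
    }
}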
Oliver Gierke
33155ba0f4 DATADOC-295 - Added ability to setup a SimpleMongoDbFactory using a MongoURI.
Extended SimpleMongoDbFactory with a constructor to take a MongoURI instance. Exposed the uri attribute on the db-factory namespace element.
2011-10-14 13:32:46 +02:00
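A sketch of the new constructor in use; the URI and database are placeholders, and package names assume the post-rename layout:

import com.mongodb.MongoURI;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

public class MongoUriSetup {

    public MongoTemplate mongoTemplate() throws Exception {
        // Host, port, database and credentials all come from the URI
        MongoDbFactory factory = new SimpleMongoDbFactory(new MongoURI("mongodb://localhost/mydb"));
        return new MongoTemplate(factory);
    }
}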
Oliver Gierke
9f62efa47c Fixed typo in reference documentation. 2011-10-14 13:32:03 +02:00
Oliver Gierke
620fc876f4 Removed unused imports. 2011-10-14 13:32:03 +02:00
Oliver Gierke
cfcf839232 DATADOC-297 - Pruned irrelevant sub modules.
Removed CouchDB module as well as the generic document one. Renamed document-parent into mongodb-parent. Adapted poms accordingly.
2011-10-13 20:27:03 +02:00
Oliver Gierke
7ce1e5fbd3 DATADOC-294 - Overhaul of collection and type information handling.
Streamlined handling of when and how to write type information into DBObjects being created. Added handling for converting Collections on the top level.
2011-10-13 18:01:17 +02:00
Christoph Leiter
2d12ba38f8 DATADOC-289 - Filter AfterLoadEvent for specific domain type.
AfterLoadEvent can now be typed to a domain type again and will only be invoked if documents are loaded that shall be mapped onto the declared type.
2011-10-12 15:25:09 +02:00
Oliver Gierke
1554d489ca DATACMNS-73 - Fixed package imports in DocumentBacking aspect. 2011-10-12 15:05:02 +02:00
Oliver Gierke
f030d304f4 DATADOC-271, DATACMNS-73 - Adapted changes of SD Commons. 2011-10-12 14:40:42 +02:00
Oliver Gierke
105e1da82b DATADOC-271 - Added license headers to cross-store files. 2011-10-12 14:30:21 +02:00
Oliver Gierke
dbc7601d7b DATADOC-293 - Activated Polygon related tests.
CI system has been updated to Mongo 2.0 so we can activate the test invoking its features.
2011-10-12 14:27:22 +02:00
Thomas Risberg
ccf981b8fb DATADOC-271 Re-packaging Mongo cross-store support, removing 'document' in package name 2011-10-12 08:16:43 -04:00
Oliver Gierke
8a43b4bbc0 DATADOC-65 - Allow usage of SpEL in @Document.
@Document can now use SpEL expressions to let the collection an entity shall be stored in be calculated on the fly.
2011-10-12 13:38:14 +02:00
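A sketch of what this enables, assuming the collection attribute of @Document accepts a SpEL expression; the expression itself is purely illustrative:

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

// The collection name is calculated when the entity is persisted or read,
// here derived from a system property (illustrative expression)
@Document(collection = "#{'persons_' + T(java.lang.System).getProperty('tenant', 'default')}")
public class Person {

    @Id
    private String id;
    private String lastname;
}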
Thomas Risberg
1c2c592b96 DATADOC-271 Re-packaging Mongo cross-store support, removing 'document' in package name 2011-10-12 07:08:34 -04:00
Thomas Risberg
cd4b409bf2 DATADOC-283 - Adding query support for $and operator. 2011-10-12 11:42:36 +02:00
Oliver Gierke
f71477f17d DATADOC-289 - Fixed invocation of AfterLoadEvent in AbstractMongoEventListener.
Invoke events that are not bound to the domain type before the domain type check. Removed unnecessary generification of AfterLoadEvent.
2011-10-11 17:37:32 +02:00
Oliver Gierke
454df1e7f1 DATADOC-273 - Persisting type information for raw types as well.
We now trigger writing type information for subtypes of raw collections and maps as well.
2011-10-11 13:21:24 +02:00
Oliver Gierke
80641a0943 DATADOC-293 - Added Polygon abstraction to Criteria.
Introduced Polygon value object to capture a list of Points. Polished implementation of Circle (equals(…) and hashCode()) and API of Criteria. Added some additional unit tests. Introduced Shape interface to allow streamlining the implementation of building within-Criterias.

Ignoring the tests for polygons right now until we have updated the Mongo instance on the CI server to 2.0.
2011-10-11 12:31:26 +02:00
Oliver Gierke
e405bf574c DATADOC-291 - MongoQueryCreator now considers mapping information for query building.
Instead of using the pure PropertyPath of the PartTree we ask the MappingContext for a PersistentPropertyPath and create a field name based path expression from it.
2011-10-07 12:00:31 +02:00
Oliver Gierke
9f8e406aff DATADOC-275 - Fixed id handling for DBRefs.
Trying to convert target ids into ObjectId or String before using the actual type.
2011-09-30 11:50:44 +02:00
Oliver Gierke
7d26366352 Added test case to show usage of custom converters and querying. 2011-09-30 09:35:10 +02:00
Oliver Gierke
864ae831dd DATADOC-285 - Added test case for issue, doesn't fail right now. 2011-09-27 14:36:50 +02:00
Oliver Gierke
bfb13d99e3 DATADOC-280 - Added maxAutoConnectRetryTime config option.
Extended MongoOperationsFactoryBean to carry maxAutoConnectRetryTime property and exposed that through the XML namespace.
2011-09-27 13:59:50 +02:00
Oliver Gierke
f39de4c28e DATADOC-286 - Added support for GreaterThanEqual and LessThanEqual.
Using those keywords provided by Spring Data Commons will cause $gte and $lte criteria to be created.
2011-09-27 11:48:20 +02:00
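Roughly how the new keywords surface in derived query methods; repository and domain type are illustrative:

import java.util.List;
import org.springframework.data.mongodb.repository.MongoRepository;

public interface PersonRepository extends MongoRepository<Person, String> {

    // GreaterThanEqual is translated into an { "age" : { "$gte" : … } } criteria
    List<Person> findByAgeGreaterThanEqual(int age);

    // LessThanEqual is translated into an { "age" : { "$lte" : … } } criteria
    List<Person> findByAgeLessThanEqual(int age);
}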
Oliver Gierke
3bdeb68617 DATADOC-278 - QueryMapper now converts ids for $ne correctly. 2011-09-27 10:43:02 +02:00
Oliver Gierke
6b40a27c92 DATACMNS-76 - Adapt changes of Spring Data Commons. 2011-09-26 20:21:58 +02:00
Oliver Gierke
237cbec945 DATADOC-284 - Added custom SpringDataMongodbSerializer.
The custom MongodbSerializer considers mapping information when building keys for the Querydsl queries to be executed.
2011-09-26 20:18:47 +02:00
Oliver Gierke
39bc6771b7 DATADOC-183 - Added count(…) methods to MongoOperations.
MongoTemplate is now able to count documents using either an entity class or collection name.
2011-09-15 12:00:03 +02:00
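A sketch of the two flavours, assuming the signatures follow the entity-class/collection-name convention used elsewhere in MongoOperations:

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

public class CountExample {

    public void countDocuments(MongoOperations operations) {
        // Count by entity class, restricted by a query
        long adults = operations.count(new Query(Criteria.where("age").gte(18)), Person.class);
        // Count everything in an explicitly named collection (a null query is assumed to mean "no filter")
        long all = operations.count(null, "people");
    }
}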
Oliver Gierke
73b2d5a99c DATADOC-270 - Removed critical Sonar warnings from codebase. 2011-09-14 15:34:29 +02:00
Oliver Gierke
5dab0c721e DATADOC-277 - Upgraded to Querydsl 2.2.2. 2011-09-14 11:49:36 +02:00
Oliver Gierke
bf7c9663cf DATADOC-274, DATADOC-276 - Split up repository package to be consistent with Spring Data JPA.
Introduced dedicated config, query and support packages. Updated Sonargraph architecture description introducing subsystems for repository layer.
2011-09-14 11:47:55 +02:00
Oliver Gierke
135742b7e4 Fixed potential NullPointerException in debug level in MongoQueryCreator. 2011-09-14 11:38:58 +02:00
Mark Pollack
dee0307055 DATADOC-269 - XML configuration for replica sets is not working 2011-09-13 12:17:40 -04:00
Oliver Gierke
06fb4144e0 DATADOC-261 - Added id to geoNear section of reference Documentation. 2011-09-07 13:46:11 +02:00
Oliver Gierke
5fdc600570 DATADOC-272 - Moved XSD file into correct package. 2011-09-07 13:29:06 +02:00
Oliver Gierke
f54c69b6ef DATADOC-258 - Updated dependencies to the latest versions. 2011-09-07 11:53:54 +02:00
Oliver Gierke
decdcff79f Added fallback references to general repository documentation. 2011-09-07 11:51:28 +02:00
Oliver Gierke
be8daa5268 Removed obsolete MongoBeanWrapper and MongoPropertyDescriptors. 2011-09-07 11:50:04 +02:00
Oliver Gierke
c4cd074d4d DATADOC-259 - Fixed potential NullPointerException in MappingMongoConverter.writeMapInternal(…).
MappingMongoConverter.writeInternal(…) invoked ….writeMapInternal(…) handing in null for the TypeInformation which violated the implicit contract for the method. Made contract explicit in Javadoc and hand in plain Map TypeInformation.
2011-09-07 11:24:50 +02:00
Oliver Gierke
8b7521a93b DATADOC-268 - CustomConversion considers types only simple for registered write converters.
In cases where only a reading converter is registered (e.g. to manually instantiate the object instance) the type the reading converter is registered for must not be regarded as simple as it will be written to the DBObject as is.
2011-09-07 10:25:44 +02:00
Oliver Gierke
ba508f497c DATACMNS-72 - Adapt changes introduced in Spring Data Commons.
Removed references to MappingContextAware(BeanPostProcessor).
2011-09-07 08:14:48 +02:00
Oliver Gierke
091246a9aa Polished map-reduce tests and formatted documentation with XMLEditor. 2011-09-04 14:00:09 +02:00
Oliver Gierke
745e1f313d DATADOC-259 - MappingMongoConverter now converts Maps nested in Collections as well.
Fixed nested type handling in MappingMongoConverter.writeMapInternally(…). Force usage of ConversionService for simple values if read value doesn't match target type.
2011-09-04 13:57:43 +02:00
Mark Pollack
234e04f4a3 minor doc fixes 2011-09-02 19:27:36 -04:00
Thomas Risberg
15ab529596 preparing for snapshot builds 2011-09-01 20:37:40 -04:00
Thomas Risberg
00bea23eed preparing for 1.0.0.M4 MongoDB release 2011-09-01 20:32:10 -04:00
Thomas Risberg
c646c1f3b2 Updated changelog for 1.0.0.M4 MongoDB 2011-09-01 20:23:48 -04:00
Thomas Risberg
069f491603 DATADOC-232 added unit tests for multi set and inc updates 2011-09-01 18:56:16 -04:00
Thomas Risberg
e426ba4d43 DATADOC-210 removed Java 6 only method to provide Java 5 compatibility 2011-09-01 18:56:16 -04:00
Mark Pollack
756f886cef DATADOC-7 - Support for map-reduce operations in MongoTemplate 2011-09-01 18:03:32 -04:00
Thomas Risberg
9bb42245bb changed the test for index names since they don't seem to be consistent 2011-09-01 17:52:21 -04:00
Thomas Risberg
c15263a259 updated .gitignore 2011-09-01 17:51:46 -04:00
Mark Pollack
758ee97a8d DATADOC-7 - Support for map-reduce operations in MongoTemplate 2011-09-01 16:34:52 -04:00
Mark Pollack
4f92690f58 DATADOC-255 - Add executeCommand with an additional integer options argument to MongoOperations
DATADOC-256 - Update to use MongoDB driver version 2.6.5
DATADOC-7 - Support for map-reduce operations in MongoTemplate
2011-09-01 15:33:55 -04:00
Mark Pollack
595ed69820 DATADOC-255 - Add executeCommand with an additional integer options argument to MongoOperations
DATADOC-256 - Update to use MongoDB driver version 2.6.5
DATADOC-7 - Support for map-reduce operations in MongoTemplate
2011-09-01 15:33:13 -04:00
Oliver Gierke
48bf08afa3 DATADOC-68 - Updated documentation regarding geoNear usage with MongoOperations. 2011-09-01 17:32:27 +02:00
Mark Pollack
98bdae4a00 DATADOC-7 (Add initial options class for Map reduce) 2011-09-01 11:19:54 -04:00
Mark Pollack
f08b87d8d6 DATADOC-7 (Add initial options class for Map reduce) 2011-09-01 10:35:12 -04:00
Oliver Gierke
6c3ffeac5f Merge branch 'geo-repo'
* geo-repo:
  DATADOC-68 - Support for geo-near queries at repositories.
2011-09-01 00:33:38 +02:00
Oliver Gierke
490ddc4a0e DATADOC-254 - Reject invalid MongoDB database names.
Only letters, numbers, underscores and dashes are now allowed.
2011-09-01 00:29:55 +02:00
Oliver Gierke
ae26f4fea1 DATADOC-68 - Support for geo-near queries at repositories. 2011-09-01 00:28:48 +02:00
Oliver Gierke
e9ea756a3a DATADOC-253 - Upgraded to Spring 3.0.6. 2011-08-31 20:32:30 +02:00
Oliver Gierke
c4c95813b7 DATADOC-199 - Added caching for MongoPersistentProperty.isIdProperty and ….getFieldName.
Continuous field and annotation lookups in those methods have turned out to be hotspots in performance tests. Added a CachingMongoPersistentProperty that delegates to the actual implementation once and caches the result.
2011-08-31 19:41:04 +02:00
Mark Pollack
bfeb1b34c1 DATADOC-170 - Streamlined AbstractMongoEventListener (fix test) 2011-08-30 10:30:23 -04:00
Oliver Gierke
44def7dddb DATADOC-170 - Streamlined AbstractMongoEventListener.
Renamed Abstract{Mapping => Mongo}EventListener. Removed generic typing for the MongoMappingEvent; we now stick to domain type generification only.
2011-08-29 23:04:43 +02:00
Mark Pollack
df10bb2168 DATADOC-202 - Add a 'DocumentCallbackHandler' so that a callback can process each DBObject returned from a query (cherry-picking from commit 2ddc77c25e09a81ee61b3931337b35ae9a67b6e5) 2011-08-29 14:48:35 -04:00
Mark Pollack
f98607f5dc Add integration tests for mapping events 2011-08-29 11:46:05 -04:00
Oliver Gierke
da23133327 Yet another round of formatting.
Added Eclipse formatter settings.
2011-08-26 20:26:06 +02:00
Oliver Gierke
ce5046c35f DATADOC-63 - Added TypeMapper abstraction to customize how type information is written to a DBObject and retrieved from it.
Added TypeMapper abstraction to allow plugging in a custom implementation. Current type handling logic is now in DefaultTypeMapper which additionally allows customizing the key the type information is stored under or disabling writing type info by setting the key to null.

Added ConfigurableTypeMapper implementation that gets a Map configured to define Strings to be used to store type information.
2011-08-26 20:20:38 +02:00
Oliver Gierke
95245015bc DATADOC-245 - Default to custom target type if raw type is null.
Really only use the custom target type in case it is a real subtype of the basic one. Before that we used the plain custom one which was lacking generics information potentially available in the basic one.
2011-08-26 13:44:47 +02:00
Oliver Gierke
fc40e6b08c DATADOC-243 - Allow db-factory-ref attribute for converter-element. 2011-08-24 16:51:56 +02:00
Oliver Gierke
eac5cb8c46 DATADOC-247 - BigInteger ids handled properly in QueryMapper.
QueryMapper now tries to convert given ids to ObjectId and String.
2011-08-24 16:31:18 +02:00
Oliver Gierke
54377031bb DATADOC-216 - Added ability to configure a WriteConcern on DB level.
Added WriteConcern property to MongoDbFactory and expose it through the db-factory namespace element.
2011-08-24 09:16:06 +02:00
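In Java configuration terms this might look as follows; the setter name is an assumption based on the commit message:

import com.mongodb.Mongo;
import com.mongodb.WriteConcern;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

public class WriteConcernConfig {

    public MongoTemplate mongoTemplate() throws Exception {
        SimpleMongoDbFactory factory = new SimpleMongoDbFactory(new Mongo(), "mydb");
        // Assumed setter for the WriteConcern property added here and exposed via the db-factory element
        factory.setWriteConcern(WriteConcern.SAFE);
        return new MongoTemplate(factory);
    }
}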
Oliver Gierke
213963f2ff DATADOC-215 - Allow configuring a WriteConcern per MongoFactoryBean.
Added ability to configure a WriteConcern for an entire MongoFactoryBean and exposed the attribute via the namespace.
2011-08-23 20:47:19 +02:00
Oliver Gierke
0fb21aa2a1 DATADOC-248 - Customized MappingMongoEntityInformation to allow taking a custom collection.
In case a repository query method returns a domain type not assignable to the repository domain type we have to use the repository's domain type to determine the collection but still hand the returned domain type to the unmarshalling. Thus, we need to set up a custom MongoEntityInformation to reflect this scenario.

Extended MappingMongoEntityInformation to allow manually defining a custom collection name. EntityInformationCreator was extended accordingly and MongoQueryMethod now sets up the EntityInformation accordingly.
2011-08-23 13:42:32 +02:00
Oliver Gierke
e0da98ec51 DATADOC-217 - Consolidated collection handling when reading.
Extracted readCollectionOrArray(…) method and made sure it's used everywhere a Collection has to be read. Test cases now check that empty collections are correctly read when using @PersistenceConstructor as well.
2011-08-22 18:15:22 +02:00
Oliver Gierke
7610246a1d Fixed GeoResultsUnitTests. 2011-08-22 17:37:37 +02:00
Oliver Gierke
ce59d893ba DATACMNS-61 - Adapted changes in Spring Data Commons.
Use QueryMethod accessor methods instead of the dropped Type enum to determine query execution. Added custom MongoParameters and MongoParameter to allow discovering a Distance parameter for repository queries, which will transparently add a 'maxDistance' clause to *Near criteria in query methods. If the given Distance is equipped with a Metric we use $nearSphere rather than $near.

Added asList() to Point class to circumvent bug in BasicBSONObject.equals(…) which breaks equals(…) comparisons of DBObjects in case they use arrays as values. See [0] for details. Adapted usage of Point objects to use asList() over asArray().

[0] https://jira.mongodb.org/browse/JAVA-416
2011-08-21 16:46:11 +02:00
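A sketch of the repository-side usage this enables; names and the geo package location are illustrative:

import java.util.List;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.repository.MongoRepository;

public interface PersonRepository extends MongoRepository<Person, String> {

    // The trailing Distance parameter transparently adds a maxDistance clause;
    // a Distance carrying a Metric switches the query from $near to $nearSphere
    List<Person> findByLocationNear(Point location, Distance maxDistance);
}

A call such as findByLocationNear(new Point(-73.99, 40.73), new Distance(10, Metrics.KILOMETERS)) would then restrict results to roughly ten kilometres around the given point.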
Oliver Gierke
e130fb5a2d DATACMNS-61 - Adapted changes in Spring Data Commons.
Use QueryMethod accessor methods instead of the dropped Type enum to determine query execution. Added custom MongoParameters and MongoParameter to allow discovering a Distance parameter for repository queries, which will transparently add a 'maxDistance' clause to *Near criteria in query methods. If the given Distance is equipped with a Metric we use $nearSphere rather than $near.

Added asList() to Point class to circumvent bug in BasicBSONObject.equals(…) which breaks equals(…) comparisons of DBObjects in case they use arrays as values. See [0] for details. Adapted usage of Point objects to use asList() over asArray().

[0] https://jira.mongodb.org/browse/JAVA-416
2011-08-21 15:18:36 +02:00
Oliver Gierke
fedcbdae4f DATADOC-68 - Added support for geoNear command.
Introduced GeoResult value object as well as NearQuery. NearQuery allows definition of an origin and distances. Introduced a Metric interface and Metrics enum to carry commonly used metrics like kilometers and miles to ease the handling in NearQueries. Introduced Distance value object to capture distances in Metrics.
2011-08-20 17:56:26 +02:00
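A sketch of the template-level API introduced here, built from the value objects named above; coordinates and the geo package location are illustrative:

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.geo.Metrics;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.query.NearQuery;

public class GeoNearExample {

    public void findNearby(MongoOperations operations) {
        // Origin plus maximum distance in a well-known metric
        NearQuery query = NearQuery.near(new Point(-73.99, 40.73))
                .maxDistance(new Distance(10, Metrics.KILOMETERS));
        GeoResults<Person> results = operations.geoNear(query, Person.class);
        for (GeoResult<Person> result : results) {
            // Each result carries the matched entity and its distance from the origin
            Person person = result.getContent();
            Distance distance = result.getDistance();
        }
    }
}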
Oliver Gierke
7cd020ffa7 Removed some compiler warnings. 2011-08-20 13:27:08 +02:00
Oliver Gierke
b01e1a994b DATADOC-245 - MappingMongoConverter now reads nested, untyped Maps correctly.
Added some defaulting code in MappingMongoConverter that converts DBObject instances into Maps in case the raw property type is Object (see getMoreConcreteTargetType()).
2011-08-19 22:43:26 +02:00
Oliver Gierke
dd02338b5e DATADOC-246 - Added DBRef to Mongo simple types.
General overhaul of default setup of custom Mongo simple types. MongoTemplate.doUpdate(…) transparently handles null queries now as well.
2011-08-19 21:28:25 +02:00
Oliver Gierke
7084839df1 DATADOC-194 - Use OSGi valid version loading a class. 2011-08-19 09:33:39 +02:00
Oliver Gierke
09aad4343f DATADOC-241 - Made readMap(…) protected to allow overriding.
Aligned method signatures for reading and writing for consistent parameter order.
2011-08-16 14:25:48 +02:00
Oliver Gierke
c6a97ef407 DATADOC-240, DATADOC-212 - Overhaul of MongoTemplate.doUpdate(…).
Replaced manual ID conversion with delegating to QueryMapper. Added ObjectId as Mongo native type to CustomConversions and added unit tests around its handling.
2011-08-16 13:03:52 +02:00
Oliver Gierke
ceac760d19 DATADOC-236 - MongoQueryCreator now regards the Sort object from the PartTree on query completion. 2011-08-16 10:40:44 +02:00
Oliver Gierke
244e9bc6d8 DATADOC-238 - Reverted change in test case that introduced side effects. 2011-08-16 10:30:50 +02:00
Oliver Gierke
2016aab969 DATADOC-235 - Arbitrary Map values are now converted correctly.
Map values are now handled correctly regardless of the actual Map value type declaration (can be Object in the most open case). We now handle collections as Map value types correctly. BasicDBList instances are now hinted to become Lists by default (if not typed to another collection type by the property). 

Reduced visibility of MappingMongoConverter.addCustomTypeKeyIfNecessary(…) and made createCollectionDBObject(…) safe against null values for the TypeInformation.
2011-08-15 20:54:16 +02:00
Oliver Gierke
35f180f999 DATADOC-238 - Fixed applying pagination for manually defined queries.
BasicQuery accidentally shadowed the limit and skip fields of Query and introduced setters instead of builder-style mutators. This caused the getters not to return the values set through the mutators, which essentially turned off pagination for manually defined queries.

Removed the shadowing and created test case. Refactored constructors.
2011-08-15 19:59:08 +02:00
Oliver Gierke
ad8b6fccb4 DATADOC-237 - Index name considers field name if not configured explicitly. 2011-08-14 17:22:50 +02:00
Oliver Gierke
d237ee80c8 DATADOC-239 - Clarify documentation of passed in values for MongoReader. 2011-08-12 20:06:04 +02:00
Oliver Gierke
eb276841dd DATADOC-214 - Cleaned up MongoConverter interface and implementations.
Removed quite some obsolete methods from MongoConverter interface. Renamed maybeConvertObject(…) to convertToMongoType(…). Moved implementation of that method into MappingMongoConverter. Let the implementation transparently use custom Converters as well. Removed SimpleMongoConverter. Switched QueryMapper implementation from using a MongoConverter to use a ConversionService. Removed custom "maybe convert" logic from ConvertingParameterAccessor in favor of MongoWriter.convertToMongo(…).
2011-08-12 15:39:57 +02:00
Mark Pollack
1d51052ab1 Merge pull request #4 from georgecalm/master
DATADOC-231: setting optional deps in the manifest to make the bundle work in an OSGi server
2011-08-02 07:09:46 -07:00
Yuriy Nemtsov
58898e97c2 DATADOC-231 - Making mysema.query & collections15 deps optional in the manifest 2011-07-31 20:46:12 -04:00
Oliver Gierke
a5328da460 Added handling of BigInteger.
Added converter to handle BigInteger values. Adapted id handling to try converting an object to String before taking the id as is. Made custom converter implementations safe against invocations with null.
2011-07-28 13:29:32 +02:00
Oliver Gierke
fd7e41b753 DATADOC-228 - MappingMongoConverter now writes null for map values.
Fixes an NPE that one got when trying to persist a Map containing null as value for entries.
2011-07-28 11:17:47 +02:00
Oliver Gierke
f349f5ea10 DATADOC-226 - Added QuerydslRepositorySupport class to ease implementing repositories using Querydsl predicates. 2011-07-28 11:16:33 +02:00
Oliver Gierke
aa9d69d584 DATADOC-225 - BasicMongoPersistentEntity doesn't reject root entities without id anymore. 2011-07-27 09:55:56 +02:00
Oliver Gierke
764317635c Fixed annotation package pattern for APT processor. 2011-07-27 09:52:13 +02:00
Oliver Gierke
9ed5e6886c DATADOC-224 - Inspect value entity metadata in case it's a subtype of the declared property. 2011-07-26 23:35:05 +02:00
Oliver Gierke
94f36d27fc Upgraded Querydsl APT plugin to 1.0.2. 2011-07-26 21:22:11 +02:00
Oliver Gierke
a5fb4872b7 DATADOC-221 - Potentially convert values of maps on reading. 2011-07-26 21:13:51 +02:00
Oliver Gierke
2f577c9678 Updated Spring Data Commons dependency to 1.2.0.BUILD-SNAPSHOT. 2011-07-25 23:44:38 +02:00
Oliver Gierke
9bcd19866f DATADOC-211 - Guard potential NullPointerException in AbstractMongoConverter.maybeConvertObject(…).
Quickfix, will probably undergo a deeper cleanup as part of DATADOC-214.
2011-07-25 14:37:30 +02:00
Oliver Gierke
2af45518bd Formatting in MappingMongoConverter. 2011-07-22 17:53:04 +02:00
Oliver Gierke
e1daf36ed8 DATADOC-209 - MappingMongoConverter handles collections of enums correctly now. 2011-07-22 17:52:48 +02:00
Oliver Gierke
101064769c DATADOC-207 - Empty custom maps get read correctly.
Eagerly detect the map target type by using the TypeInformation instead of falling back to the raw Map interface.
2011-07-22 10:16:13 +02:00
Oliver Gierke
ac9c804aae DATADOC-206 - Upgraded to Querydsl 2.2.0. 2011-07-21 09:11:22 +02:00
Oliver Gierke
992f09d731 Removed unused imports, polished license headers and suppressed some compiler warnings. 2011-07-21 09:07:49 +02:00
Oliver Gierke
4324ed8231 Javadoc cleanups and a few code cleanups. 2011-07-20 19:03:08 +02:00
Oliver Gierke
eebe973209 Added *.sonar4clipseExternals, .springBeans and *.orig to .gitignore.
Removed accidentally checked in .classpath.orig file.
2011-07-20 18:59:14 +02:00
Oliver Gierke
bb01cccac5 Updated architecture description to allow core package to have access to geo. 2011-07-20 18:51:35 +02:00
Oliver Gierke
2bdd49e3b7 Beautify upper version ranges in manifest. 2011-07-14 21:34:32 +02:00
Oliver Gierke
f424b7c760 DATADOC-166 - Check for null on removing objects. 2011-07-14 21:34:11 +02:00
Oliver Gierke
58b9db28a8 Polished Sonargraph architecture description. 2011-07-13 12:42:56 +02:00
Oliver Gierke
3e56210c78 Polished test cases a little. 2011-07-13 12:01:32 +02:00
Oliver Gierke
523612a3a9 DATADOC-175 - Broke up cyclic dependencies and added architecture management file.
Added initial architecture description for Sonargraph. Moved some types around and introduced core package to break up cyclic dependencies.
2011-07-13 11:57:38 +02:00
Jon Brisbin
1604c80d32 DATADOC-176 - Added test case to make sure non-ObjectIds can be used in DBRefs 2011-07-12 10:18:10 -05:00
Jon Brisbin
d27a1ad310 DATADOC-176 - Changed from using ObjectId to using Object for IDs in DBRefs. 2011-07-12 09:33:31 -05:00
Oliver Gierke
d2cca2c52a DATADOC-192 - Tweak Array handling in processing collection-like structures in MappingMongoConverter.
Fixes broken tests.
2011-07-08 13:22:48 +02:00
Oliver Gierke
ce0539a3dc DATADOC-192 - MappingMongoConverter handles collections correctly now.
Replaced hard coded List creation with delegate to Spring's CollectionFactory. Although the List should get converted before setting the value we can prevent that additional step by looking up the correct collection type upfront.
2011-07-08 13:04:25 +02:00
Oliver Gierke
4b4c35b904 DATADOC-191 - Removed 'document' from package names.
Polished template.mf as well by using Maven version placeholders.
2011-07-08 11:25:43 +02:00
Oliver Gierke
f0df16a340 Added more unit tests for Box. 2011-07-08 11:00:06 +02:00
Oliver Gierke
a9670de959 DATADOC-134 - Added test case to show indexes with unique flag work. 2011-07-08 10:23:45 +02:00
Oliver Gierke
2d5f41f65c DATADOC-171 - Convert BigDecimals to String by default.
Unfortunately MongoDB can't handle BigDecimal instances by default, thus we have to provide a default serialization. We do so by simply serializing the value into a String and reading it back. This can be customized by registering a custom Converter with the MongoConverter.
2011-07-07 21:35:54 +02:00
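A sketch of the customization hook mentioned above: a reading/writing Converter pair that stores BigDecimal values as doubles instead of Strings. The registration path via CustomConversions and MappingMongoConverter, as well as the package names, are assumptions based on the surrounding commits:

import java.math.BigDecimal;
import java.util.Arrays;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;

public class BigDecimalConverters {

    // Writing side: persist BigDecimal as a double instead of the default String
    static class BigDecimalToDoubleConverter implements Converter<BigDecimal, Double> {
        public Double convert(BigDecimal source) {
            return source.doubleValue();
        }
    }

    // Reading side: map the stored double back to BigDecimal
    static class DoubleToBigDecimalConverter implements Converter<Double, BigDecimal> {
        public BigDecimal convert(Double source) {
            return BigDecimal.valueOf(source);
        }
    }

    public static void register(MappingMongoConverter converter) {
        // Assumed registration path
        converter.setCustomConversions(new CustomConversions(
                Arrays.asList(new BigDecimalToDoubleConverter(), new DoubleToBigDecimalConverter())));
    }
}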
Oliver Gierke
2e3f2c602c Made integration test less aggressive in cleaning up to avoid inter-test collisions. 2011-07-07 19:52:16 +02:00
Oliver Gierke
a2687b6688 DATADOC-188 - Allow controlling repository index creation.
Disabled index creation for repository query methods by default. Added property on MongoRepositoryFactoryBean to enable index creation and expose that via 'create-query-indexes' attribute on the namespace.
2011-07-07 13:22:25 +02:00
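In Java terms the switch might look as follows; the setter name on MongoRepositoryFactoryBean is a guess based on the commit message, and package names reflect the later package renames:

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.repository.support.MongoRepositoryFactoryBean;

public class RepositoryConfig {

    public MongoRepositoryFactoryBean<PersonRepository, Person, String> personRepository(MongoTemplate template) {
        MongoRepositoryFactoryBean<PersonRepository, Person, String> factory =
                new MongoRepositoryFactoryBean<PersonRepository, Person, String>();
        factory.setMongoTemplate(template);
        // Hypothetical setter; corresponds to create-query-indexes="true" on the namespace element
        factory.setCreateIndexesForQueryMethods(true);
        return factory;
    }
}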
Oliver Gierke
e89ea320bc Added @NoRepositoryBean to MongoRepository to prevent it being accidentally picked up during class path scanning. 2011-07-07 13:14:48 +02:00
Oliver Gierke
df24218a4f DATADOC-189 - Improved extensibility of MongoRepositoryFactoryBean.
Creation of MongoRepositoryFactory is now delegated into a template method that gets a MongoTemplate handed over.
2011-07-07 10:46:54 +02:00
Oliver Gierke
2d09b8b7d1 DATADOC-186 - Ensure index property order.
Use LinkedHashMap inside Index to make sure properties added are kept in the order of the addition.
2011-07-07 07:45:15 +02:00
Oliver Gierke
dd06973f50 DATADOC-190 - SimpleMongoRepository.exists(…) now works for entities with non-ObjectId id type.
Changed the implementation of the exists(…) method to make sure the query is handed to the QueryMapper to convert the id appropriately before executing the query.
2011-07-06 19:49:11 +02:00
Oliver Gierke
2256ebac1b DATADOC-172 - Allow defining property order for document mapping.
Entities can now use the @Field annotation to define the order using the order attribute. We simply start with the lower values and proceed to the higher ones. Unannotated properties are ordered behind all annotated (and order-declared) ones. No particular order should be assumed for fields where the order is not manually defined.

Removed @FieldName as it is superseded by @Field.
2011-07-06 19:09:24 +02:00
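A sketch of the mapping described above; package names reflect the later package renames and the domain type is illustrative:

import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Field;

@Document
public class Person {

    // Explicitly ordered properties are written first, lowest order value first
    @Field(order = 1)
    private String firstname;

    @Field(order = 2)
    private String lastname;

    // Unannotated properties end up after all explicitly ordered ones, in no guaranteed order
    private String nickname;
}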
Oliver Gierke
f4373957b3 DATADOC-177 - Sort now preserves order of individual sort properties.
Using a LinkedHashMap now to preserve the order of properties to be sorted upon.
2011-07-06 15:39:22 +02:00
Thomas Risberg
af0cd9049a DATADOC-178 removed System.out.println statement 2011-06-24 12:45:17 -04:00
Thomas Risberg
99edcb82cf reformatted code to use tabs 2011-06-24 11:50:28 -04:00
Thomas Risberg
7a98877e75 DATADOC-181 added destroy() method and DisposableBean interface to call close on Mongo 2011-06-24 11:45:42 -04:00
Oliver Gierke
d856a7cad9 DATACMNS-49, DATADOC-100 - Allow externalizing Mongo repository queries.
Adapted changes in Spring Data Commons for DATACMNS-49. We now automatically pick up classpath*:META-INF/mongo-named-queries.properties to find named queries when using the namespace. See AbstractMongoRepositoryIntegrationTests.findsPeopleByNamedQuery() for sample.
2011-06-23 19:12:22 +02:00
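Roughly how a named query is wired up; the properties location comes from the commit message, while the entry and repository below are illustrative (package names reflect the later renames):

// classpath*:META-INF/mongo-named-queries.properties (illustrative entry):
//   Person.findByLastname = { "lastname" : ?0 }

import java.util.List;
import org.springframework.data.mongodb.repository.MongoRepository;

public interface PersonRepository extends MongoRepository<Person, String> {

    // Resolved against the named query above instead of being derived from the method name
    List<Person> findByLastname(String lastname);
}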
Oliver Gierke
c1b396cca5 Replaced custom implementation over usage of TypeInformation.getActualType(). 2011-06-17 20:28:13 +02:00
Oliver Gierke
2e86012b3f Lowered log level to speed up test execution. 2011-06-13 20:25:36 -07:00
Oliver Gierke
538a62efce Adapted package refactorings in Spring Data Commons. 2011-06-13 20:25:15 -07:00
Oliver Gierke
3573851bfc Polishing.
Removed obsolete methods from MappingMongoConverter exposing internals. Added JavaDoc here and there. Polished GeospatialIndex value object and added some assertions to it. Fixed generics warnings in GeoIndexedAppConfig.
2011-06-13 14:53:07 -07:00
Oliver Gierke
e47ddf4bf2 Polished MongoPersistentEntityIndexCreator.
Fixed generics warnings; the index key is now set to the field name in any case, and setting the index name now only affects the name.
2011-06-13 13:25:06 -07:00
Oliver Gierke
c59711a409 Use findOne(…) for single entity queries inside repository abstraction. 2011-06-13 13:24:43 -07:00
Oliver Gierke
b1ab3f6ade DATADOC-162 - Added more test cases around Point class. 2011-06-13 09:53:11 -07:00
Thomas Risberg
fa094eef25 DATADOC-162 fixed formatting in toString method 2011-06-10 09:01:56 -04:00
Oliver Gierke
8ac2ff6b81 DATACMNS-169 - Revised custom converter handling.
Registering custom converters now automatically declares the custom-handled types as simple by using the SimpleTypeHolder built up by the CustomConversions instance. All the custom converter lookup logic is now in CustomConversions and AbstractMongoConverter uses that instead of handling it itself.
2011-06-09 16:35:04 -07:00
Oliver Gierke
46d6dffdd4 DATADOC-167 - @Document annotation is now inherited into subclasses 2011-06-07 11:42:53 -07:00
Oliver Gierke
06632cc3ae DATACMNS-42 - Fixed some bugs in nested collection handling. 2011-06-06 14:46:06 -07:00
Oliver Gierke
df0427442a DATACMNS-42 - Removed BigInteger as Mongo specific custom type as MappingBeanHelper covers Number implementations already. 2011-06-06 11:10:53 -07:00
Thomas Risberg
eb20aea7fe preparing for snapshot builds 2011-06-02 13:45:25 -04:00
388 changed files with 14728 additions and 9492 deletions

.gitignore (30 lines changed)

@@ -1,13 +1,17 @@
 .DS_Store
 *.iml
 *.ipr
 *.iws
+*.orig
 target
 .springBeans
+.sonar4clipse
+*.sonar4clipseExternals
 .ant-targets-build.xml
 .settings/
 .project
 .classpath
 src/ant/.ant-targets-upload-dist.xml
 atlassian-ide-plugin.xml
 /.gradle/
+/.idea/

pom.xml (17 lines changed)

@@ -3,16 +3,15 @@
 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
 <modelVersion>4.0.0</modelVersion>
 <groupId>org.springframework.data</groupId>
-<artifactId>spring-data-document-dist</artifactId>
-<name>Spring Data Document Distribution</name>
-<version>1.0.0.M3</version>
+<artifactId>spring-data-mongo-dist</artifactId>
+<name>Spring Data MongoDB Distribution</name>
+<version>1.0.0.M5</version>
 <packaging>pom</packaging>
 <modules>
-<module>spring-data-document-parent</module>
 <module>spring-data-mongodb</module>
 <module>spring-data-mongodb-cross-store</module>
 <module>spring-data-mongodb-log4j</module>
-<!-- <module>spring-data-couchdb</module> -->
+<module>spring-data-mongodb-parent</module>
 </modules>
 <developers>
@@ -90,7 +89,7 @@
 <properties>
 <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
 <!-- dist.* properties are used by the antrun tasks below -->
-<dist.id>spring-data-document</dist.id>
+<dist.id>spring-data-mongo</dist.id>
 <dist.name>Spring Data Mongo</dist.name>
 <dist.key>SDMONGO</dist.key>
 <dist.version>${project.version}</dist.version>
@@ -175,7 +174,7 @@
 </fileset>
 </copy>
 <move file="${project.basedir}/target/site/reference/pdf/index.pdf"
-tofile="${project.basedir}/target/site/reference/pdf/spring-data-document-reference.pdf"
+tofile="${project.basedir}/target/site/reference/pdf/spring-data-mongo-reference.pdf"
 failonerror="false"/>
 </postProcess>
 </configuration>
@@ -258,8 +257,8 @@
 </dependencies>
 </plugin>
 </plugins>
-<!-- the name of this project is 'spring-data-document-dist';
-make sure the zip file is just 'spring-data-document'. -->
+<!-- the name of this project is 'spring-data-mongo-dist';
+make sure the zip file is just 'spring-data-mongo'. -->
 <finalName>${dist.finalName}</finalName>
 </build>


@@ -1,156 +0,0 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-document-parent</artifactId>
<version>1.0.0.BUILD-SNAPSHOT</version>
<relativePath>../spring-data-document-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-couchdb</artifactId>
<packaging>jar</packaging>
<name>Spring Data CouchDB Support</name>
<licenses>
<license>
<name>The Apache Software License, Version 2.0</name>
<url>http://www.apache.org/licenses/LICENSE-2.0.txt</url>
<distribution>repo</distribution>
</license>
</licenses>
<developers>
<developer>
<id>tareq.abedrabbo</id>
<name>Tareq Abedrabbo</name>
<email>tareq.abedrabbo@opencredo.com</email>
<organization>OpenCredo</organization>
<organizationUrl>http://www.opencredo.org</organizationUrl>
<roles>
<role>Project Admin</role>
<role>Developer</role>
</roles>
<timezone>+0</timezone>
</developer>
<developer>
<id>tomas.lukosius</id>
<name>Tomas Lukosius</name>
<email>tomas.lukosius@opencredo.com</email>
<organization>OpenCredo</organization>
<organizationUrl>http://www.opencredo.org</organizationUrl>
<roles>
<role>Project Admin</role>
<role>Developer</role>
</roles>
<timezone>+0</timezone>
</developer>
</developers>
<dependencies>
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
<scope>test</scope>
</dependency>
<!-- Spring Data -->
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-commons-core</artifactId>
</dependency>
<!-- Dependencies for web analytics functionality - to me moved into spring framework -->
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
<version>2.5</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
<version>${org.springframework.version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-web</artifactId>
<version>${org.springframework.version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-webmvc</artifactId>
<version>${org.springframework.version}</version>
</dependency>
<!-- Jackson JSON -->
<dependency>
<groupId>org.codehaus.jackson</groupId>
<artifactId>jackson-core-asl</artifactId>
<version>1.6.1</version>
</dependency>
<dependency>
<groupId>org.codehaus.jackson</groupId>
<artifactId>jackson-mapper-asl</artifactId>
<version>1.6.1</version>
</dependency>
<dependency>
<groupId>javax.annotation</groupId>
<artifactId>jsr250-api</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.hamcrest</groupId>
<artifactId>hamcrest-all</artifactId>
<version>1.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-all</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
</dependency>
<!-- Couch DB -->
<dependency>
<groupId>com.google.code.jcouchdb</groupId>
<artifactId>jcouchdb</artifactId>
<version>0.11.0-1</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>com.springsource.bundlor</groupId>
<artifactId>com.springsource.bundlor.maven</artifactId>
</plugin>
</plugins>
</build>
</project>


@@ -1,34 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb;
import org.springframework.dao.InvalidDataAccessResourceUsageException;
import org.springframework.web.client.HttpServerErrorException;
public class CouchServerResourceUsageException extends InvalidDataAccessResourceUsageException {
/**
* Create a new CouchServerResourceUsageException,
* wrapping an arbitrary HttpServerErrorException.
*
* @param cause the HttpServerErrorException thrown
*/
public CouchServerResourceUsageException(HttpServerErrorException cause) {
super(cause != null ? cause.getMessage() : null, cause);
}
}


@@ -1,34 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.web.client.HttpClientErrorException;
public class CouchUsageException extends InvalidDataAccessApiUsageException {
/**
* Create a new CouchUsageException,
* wrapping an arbitrary HttpServerErrorException.
*
* @param cause the HttpServerErrorException thrown
*/
public CouchUsageException(HttpClientErrorException cause) {
super(cause != null ? cause.getMessage() : null, cause);
}
}


@@ -1,34 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.web.client.HttpStatusCodeException;
public class DocumentExistsException extends DataIntegrityViolationException {
/**
* Create a new DocumentExistsException,
* wrapping an arbitrary HttpServerErrorException.
*
* @param cause the HttpServerErrorException thrown
*/
public DocumentExistsException(String documentId, HttpStatusCodeException cause) {
super(cause != null ? cause.getMessage() : null, cause);
}
}


@@ -1,34 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb;
import org.springframework.dao.UncategorizedDataAccessException;
import org.springframework.web.client.RestClientException;
public class UncategorizedCouchDataAccessException extends UncategorizedDataAccessException {
/**
* Create a new HibernateSystemException,
* wrapping an arbitrary HibernateException.
*
* @param cause the HibernateException thrown
*/
public UncategorizedCouchDataAccessException(RestClientException cause) {
super(cause != null ? cause.getMessage() : null, cause);
}
}


@@ -1,64 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.admin;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import org.springframework.data.document.couchdb.support.CouchUtils;
import org.springframework.util.StringUtils;
import org.springframework.web.client.RestOperations;
import org.springframework.web.client.RestTemplate;
public class CouchAdmin implements CouchAdminOperations {
private String databaseUrl;
private RestOperations restOperations = new RestTemplate();
public CouchAdmin(String databaseUrl) {
if (!databaseUrl.trim().endsWith("/")) {
this.databaseUrl = databaseUrl.trim() + "/";
} else {
this.databaseUrl = databaseUrl.trim();
}
}
public List<String> listDatabases() {
String dbs = restOperations.getForObject(databaseUrl + "_all_dbs", String.class);
return Arrays.asList(StringUtils.commaDelimitedListToStringArray(dbs));
}
public void createDatabase(String dbName) {
org.springframework.util.Assert.hasText(dbName);
restOperations.put(databaseUrl + dbName, null);
}
public void deleteDatabase(String dbName) {
org.springframework.util.Assert.hasText(dbName);
restOperations.delete(CouchUtils.ensureTrailingSlash(databaseUrl + dbName));
}
public DbInfo getDatabaseInfo(String dbName) {
String url = CouchUtils.ensureTrailingSlash(databaseUrl + dbName);
Map dbInfoMap = (Map) restOperations.getForObject(url, Map.class);
return new DbInfo(dbInfoMap);
}
}


@@ -1,34 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.admin;
import java.util.List;
public interface CouchAdminOperations {
// functionality for /_special - replication, logs, UUIDs
List<String> listDatabases();
void createDatabase(String name);
void deleteDatabase(String name);
DbInfo getDatabaseInfo(String name);
}


@@ -1,72 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.admin;
import java.util.Collections;
import java.util.Map;
public class DbInfo {
private Map dbInfoMap;
public DbInfo(Map dbInfoMap) {
super();
this.dbInfoMap = dbInfoMap;
}
public boolean isCompactRunning() {
return (Boolean) this.dbInfoMap.get("compact_running");
}
public String getDbName() {
return (String) this.dbInfoMap.get("db_name");
}
public long getDiskFormatVersion() {
return (Long) this.dbInfoMap.get("disk_format_version");
}
public long getDiskSize() {
return (Long) this.dbInfoMap.get("disk_size");
}
public long getDocCount() {
return (Long) this.dbInfoMap.get("doc_count");
}
public long getDocDeleteCount() {
return (Long) this.dbInfoMap.get("doc_del_count");
}
public long getInstanceStartTime() {
return (Long) this.dbInfoMap.get("instance_start_time");
}
public long getPurgeSequence() {
return (Long) this.dbInfoMap.get("purge_seq");
}
public long getUpdateSequence() {
return (Long) this.dbInfoMap.get("update_seq");
}
public Map getDbInfoMap() {
return Collections.unmodifiableMap(dbInfoMap);
}
}


@@ -1,70 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.config;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.document.couchdb.monitor.ServerInfo;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
public class CouchJmxParser implements BeanDefinitionParser {
public BeanDefinition parse(Element element, ParserContext parserContext) {
String databaseUrl = element.getAttribute("database-url");
if (!StringUtils.hasText(databaseUrl)) {
databaseUrl = "http://localhost:5984";
}
registerJmxComponents(databaseUrl, element, parserContext);
return null;
}
protected void registerJmxComponents(String databaseUrl, Element element, ParserContext parserContext) {
Object eleSource = parserContext.extractSource(element);
CompositeComponentDefinition compositeDef = new CompositeComponentDefinition(element.getTagName(), eleSource);
/*
createBeanDefEntry(AssertMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(BackgroundFlushingMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(BtreeIndexCounters.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(ConnectionMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(GlobalLockMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(MemoryMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(OperationCounters.class, compositeDef, mongoRefName, eleSource, parserContext);
*/
createBeanDefEntry(ServerInfo.class, compositeDef, databaseUrl, eleSource, parserContext);
//createBeanDefEntry(MongoAdmin.class, compositeDef, mongoRefName, eleSource, parserContext);
parserContext.registerComponent(compositeDef);
}
protected void createBeanDefEntry(Class clazz, CompositeComponentDefinition compositeDef, String databaseUrl, Object eleSource, ParserContext parserContext) {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(clazz);
builder.getRawBeanDefinition().setSource(eleSource);
builder.addConstructorArg(databaseUrl);
BeanDefinition assertDef = builder.getBeanDefinition();
String assertName = parserContext.getReaderContext().registerWithGeneratedName(assertDef);
compositeDef.addNestedComponent(new BeanComponentDefinition(assertDef, assertName));
}
}


@@ -1,63 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.core;
import java.net.URI;
public interface CouchOperations {
/**
* Reads a document from the database and maps it a Java object.
* </p>
* This method is intended to work when a default database
* is set on the CouchDbDocumentOperations instance.
*
* @param id the id of the CouchDB document to read
* @param targetClass the target type to map to
* @return the mapped object
*/
<T> T findOne(String id, Class<T> targetClass);
/**
* Reads a document from the database and maps it a Java object.
*
* @param uri the full URI of the document to read
* @param targetClass the target type to map to
* @return the mapped object
*/
<T> T findOne(URI uri, Class<T> targetClass);
/**
* Maps a Java object to JSON and writes it to the database
* </p>
* This method is intended to work when a default database
* is set on the CouchDbDocumentOperations instance.
*
* @param id the id of the document to write
* @param document the object to write
*/
void save(String id, Object document);
/**
* Maps a Java object to JSON and writes it to the database
*
* @param uri the full URI of the document to write
* @param document the object to write
*/
void save(URI uri, Object document);
}


@@ -1,143 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.core;
import java.net.URI;
import java.util.Map;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.data.document.couchdb.CouchServerResourceUsageException;
import org.springframework.data.document.couchdb.CouchUsageException;
import org.springframework.data.document.couchdb.DocumentRetrievalFailureException;
import org.springframework.data.document.couchdb.UncategorizedCouchDataAccessException;
import org.springframework.data.document.couchdb.support.CouchUtils;
import org.springframework.http.*;
import org.springframework.util.Assert;
import org.springframework.web.client.*;
public class CouchTemplate implements CouchOperations {
protected final Log logger = LogFactory.getLog(this.getClass());
private String defaultDocumentUrl;
private RestOperations restOperations = new RestTemplate();
/**
* Constructs an instance of CouchDbDocumentTemplate with a default database
*
* @param defaultDatabaseUrl the default database to connect to
*/
public CouchTemplate(String defaultDatabaseUrl) {
Assert.hasText(defaultDatabaseUrl, "defaultDatabaseUrl must not be empty");
defaultDocumentUrl = CouchUtils.addId(defaultDatabaseUrl);
}
/**
* Constructs an instance of CouchDbDocumentTemplate with a default database
*
* @param defaultDatabaseUrl the default database to connect to
*/
public CouchTemplate(String defaultDatabaseUrl, RestOperations restOperations) {
this(defaultDatabaseUrl);
Assert.notNull(restOperations, "restOperations must not be null");
this.restOperations = restOperations;
}
public <T> T findOne(String id, Class<T> targetClass) {
Assert.state(defaultDocumentUrl != null, "defaultDatabaseUrl must be set to use this method");
try {
return restOperations.getForObject(defaultDocumentUrl, targetClass, id);
//TODO check this exception translation and centralize.
} catch (HttpClientErrorException clientError) {
if (clientError.getStatusCode() == HttpStatus.NOT_FOUND) {
throw new DocumentRetrievalFailureException(defaultDocumentUrl + "/" + id);
}
throw new CouchUsageException(clientError);
} catch (HttpServerErrorException serverError) {
throw new CouchServerResourceUsageException(serverError);
} catch (RestClientException otherError) {
throw new UncategorizedCouchDataAccessException(otherError);
}
}
public <T> T findOne(URI uri, Class<T> targetClass) {
Assert.state(uri != null, "uri must be set to use this method");
try {
return restOperations.getForObject(uri, targetClass);
//TODO check this exception translation and centralize.
} catch (HttpClientErrorException clientError) {
if (clientError.getStatusCode() == HttpStatus.NOT_FOUND) {
throw new DocumentRetrievalFailureException(uri.getPath());
}
throw new CouchUsageException(clientError);
} catch (HttpServerErrorException serverError) {
throw new CouchServerResourceUsageException(serverError);
} catch (RestClientException otherError) {
throw new UncategorizedCouchDataAccessException(otherError);
}
}
public void save(String id, Object document) {
Assert.notNull(document, "document must not be null for save");
HttpEntity<?> httpEntity = createHttpEntity(document);
try {
ResponseEntity<Map> response = restOperations.exchange(defaultDocumentUrl, HttpMethod.PUT, httpEntity, Map.class, id);
//TODO update the document revision id on the object from the returned value
//TODO better exception translation
} catch (RestClientException e) {
throw new UncategorizedCouchDataAccessException(e);
}
}
public void save(URI uri, Object document) {
Assert.notNull(document, "document must not be null for save");
Assert.notNull(uri, "URI must not be null for save");
HttpEntity<?> httpEntity = createHttpEntity(document);
try {
ResponseEntity<Map> response = restOperations.exchange(uri, HttpMethod.PUT, httpEntity, Map.class);
//TODO update the document revision id on the object from the returned value
//TODO better exception translation
} catch (RestClientException e) {
throw new UncategorizedCouchDataAccessException(e);
}
}
private HttpEntity<?> createHttpEntity(Object document) {
if (document instanceof HttpEntity) {
HttpEntity httpEntity = (HttpEntity) document;
Assert.isTrue(httpEntity.getHeaders().getContentType().equals(MediaType.APPLICATION_JSON),
"HttpEntity payload with non application/json content type found.");
return httpEntity;
}
HttpHeaders httpHeaders = new HttpHeaders();
httpHeaders.setContentType(MediaType.APPLICATION_JSON);
HttpEntity<Object> httpEntity = new HttpEntity<Object>(document, httpHeaders);
return httpEntity;
}
}
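For orientation, a minimal usage sketch of the CouchTemplate removed above. It assumes a locally running CouchDB, a placeholder database URL, and reuses the DummyDocument fixture from the removed test sources; only the constructor, save and findOne shown in the class are exercised.
import java.util.UUID;
import org.springframework.data.document.couchdb.DummyDocument;
import org.springframework.data.document.couchdb.core.CouchTemplate;
public class CouchTemplateUsageSketch {
  public static void main(String[] args) {
    // Placeholder database URL; point it at an existing CouchDB database.
    CouchTemplate couch = new CouchTemplate("http://127.0.0.1:5984/my_database");
    String id = UUID.randomUUID().toString();
    couch.save(id, new DummyDocument("hello"));                    // PUTs the document as application/json
    DummyDocument loaded = couch.findOne(id, DummyDocument.class); // GETs it back and maps the JSON body
    System.out.println(loaded.getMessage());
  }
}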

View File

@@ -1,315 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.core.support;
import java.io.IOException;
import java.nio.charset.Charset;
import java.util.*;
import org.codehaus.jackson.*;
import org.codehaus.jackson.map.JsonMappingException;
import org.codehaus.jackson.map.ObjectMapper;
import org.codehaus.jackson.map.type.TypeFactory;
import org.codehaus.jackson.type.JavaType;
import org.springframework.http.HttpInputMessage;
import org.springframework.http.HttpOutputMessage;
import org.springframework.http.MediaType;
import org.springframework.http.converter.AbstractHttpMessageConverter;
import org.springframework.http.converter.HttpMessageNotReadableException;
import org.springframework.http.converter.HttpMessageNotWritableException;
import org.springframework.util.Assert;
public class CouchDbMappingJacksonHttpMessageConverter extends
AbstractHttpMessageConverter<Object> {
public static final Charset DEFAULT_CHARSET = Charset.forName("UTF-8");
private static final String ROWS_FIELD_NAME = "rows";
private static final String VALUE_FIELD_NAME = "value";
private static final String INCLUDED_DOC_FIELD_NAME = "doc";
private static final String TOTAL_ROWS_FIELD_NAME = "total_rows";
private ObjectMapper objectMapper = new ObjectMapper();
private boolean prefixJson = false;
/**
* Construct a new {@code CouchDbMappingJacksonHttpMessageConverter}.
*/
public CouchDbMappingJacksonHttpMessageConverter() {
super(new MediaType("application", "json", DEFAULT_CHARSET));
}
/**
* Sets the {@code ObjectMapper} for this converter. If not set, a default
* {@link ObjectMapper#ObjectMapper() ObjectMapper} is used.
* <p/>
* Setting a custom-configured {@code ObjectMapper} is one way to take
* further control of the JSON serialization process. For example, an
* extended {@link org.codehaus.jackson.map.SerializerFactory} can be
* configured that provides custom serializers for specific types. The other
* option for refining the serialization process is to use Jackson's
* provided annotations on the types to be serialized, in which case a
* custom-configured ObjectMapper is unnecessary.
*/
public void setObjectMapper(ObjectMapper objectMapper) {
Assert.notNull(objectMapper, "'objectMapper' must not be null");
this.objectMapper = objectMapper;
}
/**
* Indicates whether the JSON output by this converter should be prefixed with
* "{} &&". Default is false.
* <p/>
* Prefixing the JSON string in this manner is used to help prevent JSON
* Hijacking. The prefix renders the string syntactically invalid as a
* script so that it cannot be hijacked. This prefix does not affect the
* evaluation of JSON, but if JSON validation is performed on the string,
* the prefix would need to be ignored.
*/
public void setPrefixJson(boolean prefixJson) {
this.prefixJson = prefixJson;
}
@Override
public boolean canRead(Class<?> clazz, MediaType mediaType) {
JavaType javaType = getJavaType(clazz);
return this.objectMapper.canDeserialize(javaType) && canRead(mediaType);
}
/**
* Returns the Jackson {@link JavaType} for the specific class.
* <p/>
* Default implementation returns
* {@link TypeFactory#type(java.lang.reflect.Type)}, but this can be
* overridden in subclasses, to allow for custom generic collection
* handling. For instance:
* <p/>
* <pre class="code">
* protected JavaType getJavaType(Class&lt;?&gt; clazz) {
* if (List.class.isAssignableFrom(clazz)) {
* return TypeFactory.collectionType(ArrayList.class, MyBean.class);
* } else {
* return super.getJavaType(clazz);
* }
* }
* </pre>
*
* @param clazz the class to return the java type for
* @return the java type
*/
protected JavaType getJavaType(Class<?> clazz) {
return TypeFactory.type(clazz);
}
@Override
public boolean canWrite(Class<?> clazz, MediaType mediaType) {
return this.objectMapper.canSerialize(clazz) && canWrite(mediaType);
}
@Override
protected boolean supports(Class<?> clazz) {
// should not be called, since we override canRead/Write instead
throw new UnsupportedOperationException();
}
@Override
protected Object readInternal(Class<?> clazz, HttpInputMessage inputMessage)
throws IOException, HttpMessageNotReadableException {
JavaType javaType = getJavaType(clazz);
try {
return success(clazz, inputMessage);
// return this.objectMapper.readValue(inputMessage.getBody(),
// javaType);
} catch (Exception ex) {
throw new HttpMessageNotReadableException("Could not read JSON: "
+ ex.getMessage(), ex);
}
}
private Object success(Class<?> clazz, HttpInputMessage inputMessage)
throws JsonParseException, IOException {
// Note: parsing code adapted from the Ektorp project
JsonParser jp = objectMapper.getJsonFactory().createJsonParser(
inputMessage.getBody());
if (jp.nextToken() != JsonToken.START_OBJECT) {
throw new RuntimeException("Expected data to start with an Object");
}
Map<String, Integer> fields = readHeaderFields(jp);
List result;
if (fields.containsKey(TOTAL_ROWS_FIELD_NAME)) {
int totalRows = fields.get(TOTAL_ROWS_FIELD_NAME);
if (totalRows == 0) {
return Collections.emptyList();
}
result = new ArrayList(totalRows);
} else {
result = new ArrayList();
}
ParseState state = new ParseState();
Object first = parseFirstRow(jp, state, clazz);
if (first == null) {
return Collections.emptyList();
} else {
result.add(first);
}
while (jp.getCurrentToken() != null) {
skipToField(jp, state.docFieldName, state);
if (atEndOfRows(jp)) {
return result;
}
result.add(jp.readValueAs(clazz));
endRow(jp, state);
}
return result;
}
private Object parseFirstRow(JsonParser jp, ParseState state, Class clazz)
throws JsonParseException, IOException, JsonProcessingException,
JsonMappingException {
skipToField(jp, VALUE_FIELD_NAME, state);
JsonNode value = null;
if (atObjectStart(jp)) {
value = jp.readValueAsTree();
jp.nextToken();
if (isEndOfRow(jp)) {
state.docFieldName = VALUE_FIELD_NAME;
Object doc = objectMapper.readValue(value, clazz);
endRow(jp, state);
return doc;
}
}
skipToField(jp, INCLUDED_DOC_FIELD_NAME, state);
if (atObjectStart(jp)) {
state.docFieldName = INCLUDED_DOC_FIELD_NAME;
Object doc = jp.readValueAs(clazz);
endRow(jp, state);
return doc;
}
return null;
}
private boolean isEndOfRow(JsonParser jp) {
return jp.getCurrentToken() == JsonToken.END_OBJECT;
}
private void endRow(JsonParser jp, ParseState state) throws IOException, JsonParseException {
state.inRow = false;
jp.nextToken();
}
private boolean atObjectStart(JsonParser jp) {
return jp.getCurrentToken() == JsonToken.START_OBJECT;
}
private boolean atEndOfRows(JsonParser jp) {
return jp.getCurrentToken() != JsonToken.START_OBJECT;
}
private void skipToField(JsonParser jp, String fieldName, ParseState state) throws JsonParseException, IOException {
String lastFieldName = null;
while (jp.getCurrentToken() != null) {
switch (jp.getCurrentToken()) {
case FIELD_NAME:
lastFieldName = jp.getCurrentName();
jp.nextToken();
break;
case START_OBJECT:
if (!state.inRow) {
state.inRow = true;
jp.nextToken();
} else {
if (isInField(fieldName, lastFieldName)) {
return;
} else {
jp.skipChildren();
}
}
break;
default:
if (isInField(fieldName, lastFieldName)) {
jp.nextToken();
return;
}
jp.nextToken();
break;
}
}
}
private boolean isInField(String fieldName, String lastFieldName) {
return lastFieldName != null && lastFieldName.equals(fieldName);
}
private Map<String, Integer> readHeaderFields(JsonParser jp)
throws JsonParseException, IOException {
Map<String, Integer> map = new HashMap<String, Integer>();
jp.nextToken();
String nextFieldName = jp.getCurrentName();
while (!nextFieldName.equals(ROWS_FIELD_NAME)) {
jp.nextToken();
map.put(nextFieldName, Integer.valueOf(jp.getIntValue()));
jp.nextToken();
nextFieldName = jp.getCurrentName();
}
return map;
}
@Override
protected void writeInternal(Object o, HttpOutputMessage outputMessage)
throws IOException, HttpMessageNotWritableException {
JsonEncoding encoding = getEncoding(outputMessage.getHeaders()
.getContentType());
JsonGenerator jsonGenerator = this.objectMapper.getJsonFactory()
.createJsonGenerator(outputMessage.getBody(), encoding);
try {
if (this.prefixJson) {
jsonGenerator.writeRaw("{} && ");
}
this.objectMapper.writeValue(jsonGenerator, o);
} catch (JsonGenerationException ex) {
throw new HttpMessageNotWritableException("Could not write JSON: "
+ ex.getMessage(), ex);
}
}
private JsonEncoding getEncoding(MediaType contentType) {
if (contentType != null && contentType.getCharSet() != null) {
Charset charset = contentType.getCharSet();
for (JsonEncoding encoding : JsonEncoding.values()) {
if (charset.name().equals(encoding.getJavaName())) {
return encoding;
}
}
}
return JsonEncoding.UTF8;
}
private static class ParseState {
boolean inRow;
String docFieldName = "";
}
}
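A hedged sketch of how this converter was meant to be plugged in: register it on a RestTemplate and hand that template to CouchTemplate through the RestOperations constructor shown earlier. The database URL is a placeholder, and the converter-list mutation relies on standard RestTemplate behaviour.
import org.springframework.data.document.couchdb.core.CouchTemplate;
import org.springframework.data.document.couchdb.core.support.CouchDbMappingJacksonHttpMessageConverter;
import org.springframework.web.client.RestTemplate;
public class ConverterWiringSketch {
  public static void main(String[] args) {
    RestTemplate restTemplate = new RestTemplate();
    // Register the CouchDB-aware converter first so view results are unwrapped into plain objects.
    restTemplate.getMessageConverters().add(0, new CouchDbMappingJacksonHttpMessageConverter());
    // Placeholder database URL; uses the CouchTemplate(String, RestOperations) constructor from above.
    CouchTemplate couch = new CouchTemplate("http://127.0.0.1:5984/my_database", restTemplate);
  }
}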

View File

@@ -1,41 +0,0 @@
/*
* Copyright 2002-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.monitor;
import org.springframework.web.client.RestTemplate;
/**
* Base class to encapsulate common configuration settings when connecting to a CouchDB database
*
* @author Mark Pollack
*/
public abstract class AbstractMonitor {
protected RestTemplate restTemplate;
protected String databaseUrl;
/**
* Gets the databaseUrl used to connect to CouchDB
*
* @return the database URL used to connect to CouchDB
*/
public String getDatabaseUrl() {
return this.databaseUrl;
}
}

View File

@@ -1,62 +0,0 @@
/*
* Copyright 2002-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.monitor;
import java.net.InetAddress;
import java.net.UnknownHostException;
import java.util.Map;
import org.springframework.jmx.export.annotation.ManagedOperation;
import org.springframework.jmx.export.annotation.ManagedResource;
import org.springframework.web.client.RestTemplate;
/**
* Expose basic server information via JMX
*
* @author Mark Pollack
*/
@ManagedResource(description = "Server Information")
public class ServerInfo extends AbstractMonitor {
public ServerInfo(String databaseUrl) {
this.databaseUrl = databaseUrl;
this.restTemplate = new RestTemplate();
}
@ManagedOperation(description = "Server host name")
public String getHostName() throws UnknownHostException {
return InetAddress.getLocalHost().getHostName();
}
@ManagedOperation(description = "CouchDB Server Version")
public String getVersion() {
return (String) getRoot().get("version");
}
@ManagedOperation(description = "Message of the day")
public String getMotd() {
return (String) getRoot().get("greeting");
}
public Map getRoot() {
Map map = restTemplate.getForObject(getDatabaseUrl(), Map.class);
return map;
}
}
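The monitor can also be called directly, outside JMX. A minimal sketch, assuming a CouchDB instance on the default port (the URL is a placeholder):
import org.springframework.data.document.couchdb.monitor.ServerInfo;
public class ServerInfoSketch {
  public static void main(String[] args) {
    ServerInfo info = new ServerInfo("http://127.0.0.1:5984/");
    System.out.println(info.getVersion()); // version reported by the CouchDB root resource
    System.out.println(info.getMotd());    // the "greeting" field of the same response
  }
}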

View File

@@ -1,4 +0,0 @@
/**
* CouchDB specific JMX monitoring support.
*/
package org.springframework.data.document.couchdb.monitor;

View File

@@ -1,81 +0,0 @@
/*
* Copyright 2010 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.support;
import org.springframework.dao.DataAccessException;
/**
* Helper class featuring utility methods for internal CouchDB classes.
* <p/>
* <p>Mainly intended for internal use within the framework.
*
* @author Thomas Risberg
* @author Tareq Abedrabbo
* @since 1.0
*/
public abstract class CouchUtils {
/**
* Convert the given runtime exception to an appropriate exception from the
* <code>org.springframework.dao</code> hierarchy.
* Return null if no translation is appropriate: any other exception may
* have resulted from user code, and should not be translated.
*
* @param ex runtime exception that occurred
* @return the corresponding DataAccessException instance,
* or <code>null</code> if the exception should not be translated
*/
public static DataAccessException translateCouchExceptionIfPossible(RuntimeException ex) {
return null;
}
/**
* Adds an id variable to a URL
*
* @param url the URL to modify
* @return the modified URL
*/
public static String addId(String url) {
return ensureTrailingSlash(url) + "{id}";
}
/**
* Adds a 'changes since' variable to a URL
*
* @param url the URL to modify
* @return the modified URL
*/
public static String addChangesSince(String url) {
return ensureTrailingSlash(url) + "_changes?since={seq}";
}
/**
* Ensures that a URL ends with a slash.
*
* @param url the URL to modify
* @return the modified URL
*/
public static String ensureTrailingSlash(String url) {
if (!url.endsWith("/")) {
url += "/";
}
return url;
}
}
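For reference, a short sketch of what the URL helpers above produce; the input URL is a placeholder and the values in the comments follow directly from the implementation.
import org.springframework.data.document.couchdb.support.CouchUtils;
public class CouchUtilsSketch {
  public static void main(String[] args) {
    String base = "http://127.0.0.1:5984/my_database";
    System.out.println(CouchUtils.addId(base));               // http://127.0.0.1:5984/my_database/{id}
    System.out.println(CouchUtils.addChangesSince(base));     // http://127.0.0.1:5984/my_database/_changes?since={seq}
    System.out.println(CouchUtils.ensureTrailingSlash(base)); // http://127.0.0.1:5984/my_database/
  }
}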

View File

@@ -1 +0,0 @@
http\://www.springframework.org/schema/data/couch=org.springframework.data.document.couchdb.config.CouchNamespaceHandler

View File

@@ -1,2 +0,0 @@
http\://www.springframework.org/schema/data/couch/spring-couch-1.0.xsd=org/springframework/data/document/couchdb/config/spring-couch-1.0.xsd
http\://www.springframework.org/schema/data/couch/spring-couch.xsd=org/springframework/data/document/couchdb/config/spring-couch-1.0.xsd

View File

@@ -1,4 +0,0 @@
# Tooling related information for the Couch DB namespace
http\://www.springframework.org/schema/data/couch@name=Couch Namespace
http\://www.springframework.org/schema/data/couch@prefix=couch
http\://www.springframework.org/schema/data/couch@icon=org/springframework/jdbc/config/spring-jdbc.gif

View File

@@ -1,33 +0,0 @@
<?xml version="1.0" encoding="UTF-8" ?>
<xsd:schema xmlns="http://www.springframework.org/schema/data/couch"
xmlns:xsd="http://www.w3.org/2001/XMLSchema"
xmlns:tool="http://www.springframework.org/schema/tool"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:repository="http://www.springframework.org/schema/data/repository"
targetNamespace="http://www.springframework.org/schema/data/couch"
elementFormDefault="qualified" attributeFormDefault="unqualified">
<xsd:import namespace="http://www.springframework.org/schema/tool"/>
<xsd:import namespace="http://www.springframework.org/schema/context"
schemaLocation="http://www.springframework.org/schema/context/spring-context.xsd"/>
<xsd:import namespace="http://www.springframework.org/schema/data/repository"
schemaLocation="http://www.springframework.org/schema/data/repository/spring-repository.xsd"/>
<xsd:element name="jmx">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines JMX Model MBeans for monitoring a CouchDB server.
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="database-url" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The database URL of the CouchDB server]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
</xsd:schema>

View File

@@ -1,77 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb;
import java.util.Date;
import org.codehaus.jackson.annotate.JsonIgnoreProperties;
/**
* @author Tareq Abedrabbo (tareq.abedrabbo@opencredo.com)
* @since 13/01/2011
*/
@JsonIgnoreProperties(ignoreUnknown = true)
public class DummyDocument {
private String message;
private String timestamp = new Date().toString();
public DummyDocument() {
}
public DummyDocument(String message) {
this.message = message;
}
public String getMessage() {
return message;
}
public void setMessage(String message) {
this.message = message;
}
public String getTimestamp() {
return timestamp;
}
@Override
public boolean equals(Object o) {
if (this == o) return true;
if (o == null || getClass() != o.getClass()) return false;
DummyDocument document = (DummyDocument) o;
if (message != null ? !message.equals(document.message) : document.message != null) return false;
return true;
}
@Override
public int hashCode() {
return message != null ? message.hashCode() : 0;
}
@Override
public String toString() {
return "DummyDocument{" +
"message='" + message + '\'' +
", timestamp=" + timestamp +
'}';
}
}

View File

@@ -1,52 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb;
import org.hamcrest.Description;
import org.hamcrest.Factory;
import org.hamcrest.Matcher;
import org.hamcrest.TypeSafeMatcher;
import org.springframework.http.HttpEntity;
/**
* Matches the content of the body of an HttpEntity.
*
* @author Tareq Abedrabbo
* @since 31/01/2011
*/
public class IsBodyEqual extends TypeSafeMatcher<HttpEntity> {
private Object object;
public IsBodyEqual(Object object) {
this.object = object;
}
@Override
public boolean matchesSafely(HttpEntity httpEntity) {
return httpEntity.getBody().equals(object);
}
public void describeTo(Description description) {
description.appendText("body equals ").appendValue(object);
}
@Factory
public static Matcher<HttpEntity> bodyEqual(Object object) {
return new IsBodyEqual(object);
}
}
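A brief usage sketch for the matcher inside a JUnit test; the document instance is illustrative only.
import static org.junit.Assert.assertThat;
import static org.springframework.data.document.couchdb.IsBodyEqual.bodyEqual;
import org.junit.Test;
import org.springframework.data.document.couchdb.DummyDocument;
import org.springframework.http.HttpEntity;
public class IsBodyEqualUsageSketch {
  @Test
  public void matchesEntityBody() {
    DummyDocument expected = new DummyDocument("hello");
    HttpEntity entity = new HttpEntity<DummyDocument>(expected);
    assertThat(entity, bodyEqual(expected)); // passes because the entity body equals the expected document
  }
}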

View File

@@ -1,38 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.admin;
import java.util.List;
import junit.framework.Assert;
import org.junit.Ignore;
import org.junit.Test;
import org.springframework.data.document.couchdb.core.CouchConstants;
public class CouchAdminIntegrationTests {
@Test
@Ignore("until CI has couch server running")
public void dbLifecycle() {
CouchAdmin admin = new CouchAdmin(CouchConstants.COUCHDB_URL);
admin.deleteDatabase("foo");
List<String> dbs = admin.listDatabases();
admin.createDatabase("foo");
List<String> newDbs = admin.listDatabases();
Assert.assertEquals(dbs.size() + 1, newDbs.size());
}
}

View File

@@ -1,118 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.core;
import static org.junit.Assume.assumeNoException;
import static org.junit.Assume.assumeTrue;
import static org.springframework.http.HttpStatus.OK;
import java.io.IOException;
import java.util.UUID;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.junit.Before;
import org.junit.BeforeClass;
import org.springframework.http.*;
import org.springframework.http.client.ClientHttpResponse;
import org.springframework.web.client.DefaultResponseErrorHandler;
import org.springframework.web.client.RestClientException;
import org.springframework.web.client.RestTemplate;
/**
* Base class for CouchDB integration tests. Checks whether CouchDB is available before running each test;
* if CouchDB is not available, the tests are ignored.
*
* @author Tareq Abedrabbo (tareq.abedrabbo@opencredo.com)
* @since 13/01/2011
*/
public abstract class AbstractCouchTemplateIntegrationTests {
protected static final Log log = LogFactory.getLog(AbstractCouchTemplateIntegrationTests.class);
protected static final RestTemplate restTemplate = new RestTemplate();
/**
* This method ensures that the database is running; otherwise, the test is ignored.
*/
@BeforeClass
public static void assumeDatabaseIsUpAndRunning() {
try {
ResponseEntity<String> responseEntity = restTemplate.getForEntity(CouchConstants.COUCHDB_URL, String.class);
assumeTrue(responseEntity.getStatusCode().equals(OK));
log.debug("CouchDB is running on " + CouchConstants.COUCHDB_URL +
" with status " + responseEntity.getStatusCode());
} catch (RestClientException e) {
log.debug("CouchDB is not running on " + CouchConstants.COUCHDB_URL);
assumeNoException(e);
}
}
@Before
public void setUpTestDatabase() throws Exception {
RestTemplate template = new RestTemplate();
template.setErrorHandler(new DefaultResponseErrorHandler() {
@Override
public void handleError(ClientHttpResponse response) throws IOException {
// do nothing, error status will be handled in the switch statement
}
});
ResponseEntity<String> response = template.getForEntity(CouchConstants.TEST_DATABASE_URL, String.class);
HttpStatus statusCode = response.getStatusCode();
switch (statusCode) {
case NOT_FOUND:
createNewTestDatabase();
break;
case OK:
deleteExistingTestDatabase();
createNewTestDatabase();
break;
default:
throw new IllegalStateException("Unsupported http status [" + statusCode + "]");
}
}
private void deleteExistingTestDatabase() {
restTemplate.delete(CouchConstants.TEST_DATABASE_URL);
}
private void createNewTestDatabase() {
restTemplate.put(CouchConstants.TEST_DATABASE_URL, null);
}
/**
* Reads a CouchDB document and converts it to the expected type.
*/
protected <T> T getDocument(String id, Class<T> expectedType) {
String url = CouchConstants.TEST_DATABASE_URL + "{id}";
return restTemplate.getForObject(url, expectedType, id);
}
/**
* Writes a CouchDB document
*/
protected String putDocument(Object document) {
HttpHeaders headers = new HttpHeaders();
headers.setContentType(MediaType.APPLICATION_JSON);
HttpEntity request = new HttpEntity(document, headers);
String id = UUID.randomUUID().toString();
restTemplate.put(CouchConstants.TEST_DATABASE_URL + "{id}", request, id);
return id;
}
}

View File

@@ -1,27 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.core;
public abstract class CouchConstants {
public static final String COUCHDB_URL = "http://127.0.0.1:5984/";
public static final String TEST_DATABASE_URL = COUCHDB_URL + "si_couchdb_test/";
public CouchConstants() {
// TODO Auto-generated constructor stub
}
}

View File

@@ -1,39 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.core;
import java.util.UUID;
import junit.framework.Assert;
import org.junit.Ignore;
import org.junit.Test;
import org.springframework.data.document.couchdb.DummyDocument;
public class CouchTemplateIntegrationTests extends AbstractCouchTemplateIntegrationTests {
@Test
@Ignore("until CI has couch server running")
public void saveAndFindTest() {
CouchTemplate template = new CouchTemplate(CouchConstants.TEST_DATABASE_URL);
DummyDocument document = new DummyDocument("hello");
String id = UUID.randomUUID().toString();
template.save(id, document);
DummyDocument foundDocument = template.findOne(id, DummyDocument.class);
Assert.assertEquals(document.getMessage(), foundDocument.getMessage());
}
}

View File

@@ -1,36 +0,0 @@
/*
* Copyright 2002-2010 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.monitor;
import org.springframework.context.support.ClassPathXmlApplicationContext;
/**
* Server application to test JMX functionality.
*
* @author Mark Pollack
*/
public class JmxServer {
public static void main(String[] args) {
new JmxServer().run();
}
public void run() {
new ClassPathXmlApplicationContext(new String[]{"server-jmx.xml"});
}
}

View File

@@ -1,13 +0,0 @@
log4j.rootCategory=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - <%m>%n
log4j.category.org.apache.activemq=ERROR
log4j.category.org.springframework.batch=DEBUG
log4j.category.org.springframework.transaction=INFO
log4j.category.org.hibernate.SQL=DEBUG
# for debugging datasource initialization
# log4j.category.test.jdbc=DEBUG

View File

@@ -1,24 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:p="http://www.springframework.org/schema/p"
xmlns:couch="http://www.springframework.org/schema/data/couch"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/data/couch http://www.springframework.org/schema/data/couch/spring-couch-1.0.xsd
http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.0.xsd">
<couch:jmx/>
<context:mbean-export/>
<bean id="registry" class="org.springframework.remoting.rmi.RmiRegistryFactoryBean"
p:port="1099"/>
<!-- Expose JMX over RMI -->
<bean id="serverConnector" class="org.springframework.jmx.support.ConnectorServerFactoryBean" depends-on="registry"
p:objectName="connector:name=rmi"
p:serviceUrl="service:jmx:rmi://localhost/jndi/rmi://localhost:1099/myconnector"/>
</beans>

View File

@@ -1,24 +0,0 @@
Bundle-SymbolicName: org.springframework.data.couchdb
Bundle-Name: Spring Data Couch DB Support
Bundle-Vendor: SpringSource
Bundle-ManifestVersion: 2
Import-Package:
sun.reflect;version="0";resolution:=optional
Import-Template:
org.springframework.beans.*;version="[3.0.0, 4.0.0)",
org.springframework.core.*;version="[3.0.0, 4.0.0)",
org.springframework.dao.*;version="[3.0.0, 4.0.0)",
org.springframework.http.*;version="[3.0.0, 4.0.0)",
org.springframework.web.*;version="[3.0.0, 4.0.0)",
org.springframework.util.*;version="[3.0.0, 4.0.0)",
org.springframework.context.*;version="[3.0.0, 4.0.0)",
org.springframework.jmx.*;version="[3.0.0, 4.0.0)",
org.springframework.remoting.*;version="[3.0.0, 4.0.0)",
org.springframework.data.core.*;version="[1.0.0, 2.0.0)",
org.springframework.data.document.*;version="[1.0.0, 2.0.0)",
org.codehaus.jackson.*;version="[1.0.0, 2.0.0)";resolution:=optional,
org.aopalliance.*;version="[1.0.0, 2.0.0)";resolution:=optional,
org.apache.commons.logging.*;version="[1.1.1, 2.0.0)",
org.w3c.dom.*;version="0"

View File

@@ -3,12 +3,11 @@
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-document-parent</artifactId>
<version>1.0.0.M3</version>
<relativePath>../spring-data-document-parent/pom.xml</relativePath>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.0.0.M5</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-cross-store</artifactId>
<packaging>jar</packaging>
<name>Spring Data MongoDB Cross-store Persistence Support</name>
<dependencies>

View File

@@ -1,30 +1,22 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.core;
import org.junit.Test;
/**
* Unit tests for CouchTemplate with mocks
*/
public class CouchTemplateTests {
@Test
public void foo() {
}
}
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore;
import org.springframework.data.crossstore.ChangeSetBacked;
public interface DocumentBacked extends ChangeSetBacked {
}

View File

@@ -0,0 +1,185 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManagerFactory;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.crossstore.ChangeSet;
import org.springframework.data.crossstore.ChangeSetBacked;
import org.springframework.data.crossstore.ChangeSetPersister;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.util.ClassUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoException;
public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_CLASS = "_entity_class";
private static final String ENTITY_ID = "_entity_id";
private static final String ENTITY_FIELD_NAME = "_entity_field_name";
private static final String ENTITY_FIELD_CLASS = "_entity_field_class";
protected final Log log = LogFactory.getLog(getClass());
private MongoTemplate mongoTemplate;
private EntityManagerFactory entityManagerFactory;
public void setMongoTemplate(MongoTemplate mongoTemplate) {
this.mongoTemplate = mongoTemplate;
}
public void setEntityManagerFactory(EntityManagerFactory entityManagerFactory) {
this.entityManagerFactory = entityManagerFactory;
}
public void getPersistentState(Class<? extends ChangeSetBacked> entityClass,
Object id, final ChangeSet changeSet)
throws DataAccessException, NotFoundException {
if (id == null) {
log.debug("Unable to load MongoDB data for null id");
return;
}
String collName = getCollectionNameForEntity(entityClass);
final DBObject dbk = new BasicDBObject();
dbk.put(ENTITY_ID, id);
dbk.put(ENTITY_CLASS, entityClass.getName());
if (log.isDebugEnabled()) {
log.debug("Loading MongoDB data for " + dbk);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
for (DBObject dbo : collection.find(dbk)) {
String key = (String) dbo.get(ENTITY_FIELD_NAME);
if (log.isDebugEnabled()) {
log.debug("Processing key: " + key);
}
if (!changeSet.getValues().containsKey(key)) {
String className = (String) dbo.get(ENTITY_FIELD_CLASS);
if (className == null) {
throw new DataIntegrityViolationException(
"Unble to convert property " + key
+ ": Invalid metadata, " + ENTITY_FIELD_CLASS + " not available");
}
Class<?> clazz = ClassUtils.resolveClassName(className, ClassUtils.getDefaultClassLoader());
Object value = mongoTemplate.getConverter().read(clazz, dbo);
if (log.isDebugEnabled()) {
log.debug("Adding to ChangeSet: " + key);
}
changeSet.set(key, value);
}
}
return null;
}
});
}
public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
log.debug("getPersistentId called on " + entity);
if (entityManagerFactory == null) {
throw new DataAccessResourceFailureException("EntityManagerFactory cannot be null");
}
Object o = entityManagerFactory.getPersistenceUnitUtil().getIdentifier(entity);
return o;
}
public Object persistState(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (cs == null) {
log.debug("Flush: changeset was null, nothing to flush.");
return 0L;
}
if (log.isDebugEnabled()) {
log.debug("Flush: changeset: " + cs.getValues());
}
String collName = getCollectionNameForEntity(entity.getClass());
if (mongoTemplate.getCollection(collName) == null) {
mongoTemplate.createCollection(collName);
}
for (String key : cs.getValues().keySet()) {
if (key != null && !key.startsWith("_") && !key.equals(ChangeSetPersister.ID_KEY)) {
Object value = cs.getValues().get(key);
final DBObject dbQuery = new BasicDBObject();
dbQuery.put(ENTITY_ID, getPersistentId(entity, cs));
dbQuery.put(ENTITY_CLASS, entity.getClass().getName());
dbQuery.put(ENTITY_FIELD_NAME, key);
DBObject dbId = mongoTemplate.execute(collName,
new CollectionCallback<DBObject>() {
public DBObject doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
return collection.findOne(dbQuery);
}
});
if (value == null) {
if (log.isDebugEnabled()) {
log.debug("Flush: removing: " + dbQuery);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
collection.remove(dbQuery);
return null;
}
});
}
else {
final DBObject dbDoc = new BasicDBObject();
dbDoc.putAll(dbQuery);
if (log.isDebugEnabled()) {
log.debug("Flush: saving: " + dbQuery);
}
mongoTemplate.getConverter().write(value, dbDoc);
dbDoc.put(ENTITY_FIELD_CLASS, value.getClass().getName());
if (dbId != null) {
dbDoc.put("_id", dbId.get("_id"));
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
collection.save(dbDoc);
return null;
}
});
}
}
}
return 0L;
}
private String getCollectionNameForEntity(Class<? extends ChangeSetBacked> entityClass) {
return ClassUtils.getQualifiedName(entityClass);
}
}
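A minimal wiring sketch for the persister added above, assuming an existing Mongo connection and a JPA persistence unit; the connection details and the persistence unit name are placeholders, not values taken from the project.
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.crossstore.MongoChangeSetPersister;
import com.mongodb.Mongo;
public class CrossStoreWiringSketch {
  public static void main(String[] args) throws Exception {
    // Placeholder connection details and persistence unit name.
    MongoTemplate mongoTemplate = new MongoTemplate(new Mongo("localhost"), "crossstore");
    EntityManagerFactory emf = Persistence.createEntityManagerFactory("crossstore");
    MongoChangeSetPersister persister = new MongoChangeSetPersister();
    persister.setMongoTemplate(mongoTemplate); // stores the @RelatedDocument fields, one document per entity field
    persister.setEntityManagerFactory(emf);    // used to resolve the JPA id of the owning entity
  }
}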

View File

@@ -1,4 +1,19 @@
package org.springframework.data.persistence.document.mongodb;
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore;
import java.lang.reflect.Field;
@@ -12,14 +27,14 @@ import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.reflect.FieldSignature;
import org.springframework.dao.DataAccessException;
import org.springframework.data.persistence.document.RelatedDocument;
import org.springframework.data.persistence.document.DocumentBacked;
import org.springframework.data.persistence.document.DocumentBackedTransactionSynchronization;
import org.springframework.data.persistence.ChangeSet;
import org.springframework.data.persistence.ChangeSetPersister;
import org.springframework.data.persistence.ChangeSetPersister.NotFoundException;
import org.springframework.data.persistence.HashMapChangeSet;
import org.springframework.data.mongodb.crossstore.RelatedDocument;
import org.springframework.data.mongodb.crossstore.DocumentBacked;
import org.springframework.data.crossstore.ChangeSetBackedTransactionSynchronization;
import org.springframework.data.crossstore.ChangeSet;
import org.springframework.data.crossstore.ChangeSetPersister;
import org.springframework.data.crossstore.ChangeSetPersister.NotFoundException;
import org.springframework.data.crossstore.HashMapChangeSet;
import org.springframework.transaction.support.TransactionSynchronizationManager;
/**
@@ -130,7 +145,7 @@ public aspect MongoDocumentBacking {
entity.setChangeSet(new HashMapChangeSet());
entity.itdChangeSetPersister = changeSetPersister;
entity.itdTransactionSynchronization =
new DocumentBackedTransactionSynchronization(changeSetPersister, entity);
new ChangeSetBackedTransactionSynchronization(changeSetPersister, entity);
//registerTransactionSynchronization(entity);
}
@@ -163,7 +178,7 @@ public aspect MongoDocumentBacking {
@Transient private ChangeSetPersister<?> DocumentBacked.itdChangeSetPersister;
@Transient private DocumentBackedTransactionSynchronization DocumentBacked.itdTransactionSynchronization;
@Transient private ChangeSetBackedTransactionSynchronization DocumentBacked.itdTransactionSynchronization;
public void DocumentBacked.setChangeSet(ChangeSet cs) {
this.changeSet = cs;

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,8 +13,7 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.persistence.document;
package org.springframework.data.mongodb.crossstore;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;

View File

@@ -1,7 +0,0 @@
package org.springframework.data.persistence.document;
import org.springframework.data.persistence.ChangeSetBacked;
public interface DocumentBacked extends ChangeSetBacked {
}

View File

@@ -1,63 +0,0 @@
package org.springframework.data.persistence.document;
//public class DocumentBackedTransactionSynchronization {
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.data.persistence.ChangeSetBacked;
import org.springframework.data.persistence.ChangeSetPersister;
import org.springframework.transaction.support.TransactionSynchronization;
public class DocumentBackedTransactionSynchronization implements TransactionSynchronization {
protected final Log log = LogFactory.getLog(getClass());
private ChangeSetPersister<Object> changeSetPersister;
private ChangeSetBacked entity;
private int changeSetTxStatus = -1;
public DocumentBackedTransactionSynchronization(ChangeSetPersister<Object> changeSetPersister, ChangeSetBacked entity) {
this.changeSetPersister = changeSetPersister;
this.entity = entity;
}
public void afterCommit() {
log.debug("After Commit called for " + entity);
changeSetPersister.persistState(entity, entity.getChangeSet());
changeSetTxStatus = 0;
}
public void afterCompletion(int status) {
log.debug("After Completion called with status = " + status);
if (changeSetTxStatus == 0) {
if (status == STATUS_COMMITTED) {
// this is good
log.debug("ChangedSetBackedTransactionSynchronization completed successfully for " + this.entity);
}
else {
// this could be bad - TODO: compensate
log.error("ChangedSetBackedTransactionSynchronization failed for " + this.entity);
}
}
}
public void beforeCommit(boolean readOnly) {
}
public void beforeCompletion() {
}
public void flush() {
}
public void resume() {
throw new IllegalStateException("ChangedSetBackedTransactionSynchronization does not support transaction suspension currently.");
}
public void suspend() {
throw new IllegalStateException("ChangedSetBackedTransactionSynchronization does not support transaction suspension currently.");
}
}

View File

@@ -1,169 +0,0 @@
package org.springframework.data.persistence.document.mongodb;
import javax.persistence.EntityManagerFactory;
import com.mongodb.BasicDBObject;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoException;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.document.mongodb.CollectionCallback;
import org.springframework.data.document.mongodb.MongoTemplate;
import org.springframework.data.persistence.ChangeSet;
import org.springframework.data.persistence.ChangeSetBacked;
import org.springframework.data.persistence.ChangeSetPersister;
import org.springframework.util.ClassUtils;
public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_CLASS = "_entity_class";
private static final String ENTITY_ID = "_entity_id";
private static final String ENTITY_FIELD_NAME = "_entity_field_name";
private static final String ENTITY_FIELD_CLASS = "_entity_field_class";
protected final Log log = LogFactory.getLog(getClass());
private MongoTemplate mongoTemplate;
private EntityManagerFactory entityManagerFactory;
public void setMongoTemplate(MongoTemplate mongoTemplate) {
this.mongoTemplate = mongoTemplate;
}
public void setEntityManagerFactory(EntityManagerFactory entityManagerFactory) {
this.entityManagerFactory = entityManagerFactory;
}
public void getPersistentState(Class<? extends ChangeSetBacked> entityClass,
Object id, final ChangeSet changeSet)
throws DataAccessException, NotFoundException {
if (id == null) {
log.debug("Unable to load MongoDB data for null id");
return;
}
String collName = getCollectionNameForEntity(entityClass);
final DBObject dbk = new BasicDBObject();
dbk.put(ENTITY_ID, id);
dbk.put(ENTITY_CLASS, entityClass.getName());
if (log.isDebugEnabled()) {
log.debug("Loading MongoDB data for " + dbk);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
for (DBObject dbo : collection.find(dbk)) {
String key = (String) dbo.get(ENTITY_FIELD_NAME);
if (log.isDebugEnabled()) {
log.debug("Processing key: " + key);
}
if (!changeSet.getValues().containsKey(key)) {
String className = (String) dbo.get(ENTITY_FIELD_CLASS);
if (className == null) {
throw new DataIntegrityViolationException(
"Unble to convert property " + key
+ ": Invalid metadata, " + ENTITY_FIELD_CLASS + " not available");
}
Class<?> clazz = ClassUtils.resolveClassName(className, ClassUtils.getDefaultClassLoader());
Object value = mongoTemplate.getConverter().read(clazz, dbo);
if (log.isDebugEnabled()) {
log.debug("Adding to ChangeSet: " + key);
}
changeSet.set(key, value);
}
}
return null;
}
});
}
public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
log.debug("getPersistentId called on " + entity);
if (entityManagerFactory == null) {
throw new DataAccessResourceFailureException("EntityManagerFactory cannot be null");
}
Object o = entityManagerFactory.getPersistenceUnitUtil().getIdentifier(entity);
return o;
}
public Object persistState(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (cs == null) {
log.debug("Flush: changeset was null, nothing to flush.");
return 0L;
}
if (log.isDebugEnabled()) {
log.debug("Flush: changeset: " + cs.getValues());
}
String collName = getCollectionNameForEntity(entity.getClass());
DBCollection dbc = mongoTemplate.getCollection(collName);
if (dbc == null) {
dbc = mongoTemplate.createCollection(collName);
}
for (String key : cs.getValues().keySet()) {
if (key != null && !key.startsWith("_") && !key.equals(ChangeSetPersister.ID_KEY)) {
Object value = cs.getValues().get(key);
final DBObject dbQuery = new BasicDBObject();
dbQuery.put(ENTITY_ID, getPersistentId(entity, cs));
dbQuery.put(ENTITY_CLASS, entity.getClass().getName());
dbQuery.put(ENTITY_FIELD_NAME, key);
DBObject dbId = mongoTemplate.execute(collName,
new CollectionCallback<DBObject>() {
public DBObject doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
return collection.findOne(dbQuery);
}
});
if (value == null) {
if (log.isDebugEnabled()) {
log.debug("Flush: removing: " + dbQuery);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
collection.remove(dbQuery);
return null;
}
});
}
else {
final DBObject dbDoc = new BasicDBObject();
dbDoc.putAll(dbQuery);
if (log.isDebugEnabled()) {
log.debug("Flush: saving: " + dbQuery);
}
mongoTemplate.getConverter().write(value, dbDoc);
dbDoc.put(ENTITY_FIELD_CLASS, value.getClass().getName());
if (dbId != null) {
dbDoc.put("_id", dbId.get("_id"));
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
collection.save(dbDoc);
return null;
}
});
}
}
}
return 0L;
}
private String getCollectionNameForEntity(Class<? extends ChangeSetBacked> entityClass) {
return ClassUtils.getQualifiedName(entityClass);
}
}

View File

@@ -1,154 +0,0 @@
package org.springframework.data.document.persistence;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import org.junit.Assert;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.document.mongodb.MongoTemplate;
import org.springframework.data.document.persistence.test.Address;
import org.springframework.data.document.persistence.test.Person;
import org.springframework.data.document.persistence.test.Resume;
import org.springframework.test.annotation.Rollback;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.TransactionStatus;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.transaction.support.TransactionCallback;
import org.springframework.transaction.support.TransactionTemplate;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = "classpath:/META-INF/spring/applicationContext.xml")
public class CrossStoreMongoTests {
@Autowired
private MongoTemplate mongoTemplate;
private EntityManager entityManager;
@Autowired
private PlatformTransactionManager transactionManager;
@PersistenceContext
public void setEntityManager(EntityManager entityManager) {
this.entityManager = entityManager;
}
private void clearData(String collectionName) {
DBCollection col = this.mongoTemplate.getCollection(collectionName);
if (col != null) {
this.mongoTemplate.dropCollection(collectionName);
}
}
@Test
@Transactional
@Rollback(false)
public void testCreateJpaToMongoEntityRelationship() {
clearData(Person.class.getName());
Person p = new Person("Thomas", 20);
Address a = new Address(12, "MAin St.", "Boston", "MA", "02101");
p.setAddress(a);
Resume r = new Resume();
r.addEducation("Skanstulls High School, 1975");
r.addEducation("Univ. of Stockholm, 1980");
r.addJob("DiMark, DBA, 1990-2000");
r.addJob("VMware, Developer, 2007-");
p.setResume(r);
p.setId(1L);
entityManager.persist(p);
}
@Test
@Transactional
@Rollback(false)
public void testReadJpaToMongoEntityRelationship() {
Person found = entityManager.find(Person.class, 1L);
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found.getResume());
Assert.assertEquals("DiMark, DBA, 1990-2000" + "; "
+ "VMware, Developer, 2007-", found.getResume().getJobs());
found.getResume().addJob("SpringDeveloper.com, Consultant, 2005-2006");
found.setAge(44);
}
@Test
@Transactional
@Rollback(false)
public void testUpdatedJpaToMongoEntityRelationship() {
Person found = entityManager.find(Person.class, 1L);
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found.getResume());
Assert.assertEquals("DiMark, DBA, 1990-2000" + "; "
+ "VMware, Developer, 2007-" + "; "
+ "SpringDeveloper.com, Consultant, 2005-2006", found.getResume().getJobs());
}
@Test
public void testMergeJpaEntityWithMongoDocument() {
TransactionTemplate txTemplate = new TransactionTemplate(transactionManager);
final Person detached = entityManager.find(Person.class, 1L);
detached.getResume().addJob("TargetRx, Developer, 2000-2005");
Person merged = txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
return entityManager.merge(detached);
}
});
Assert.assertTrue(detached.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
Assert.assertTrue(merged.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
final Person updated = entityManager.find(Person.class, 1L);
Assert.assertTrue(updated.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
}
@Test
public void testRemoveJpaEntityWithMongoDocument() {
TransactionTemplate txTemplate = new TransactionTemplate(transactionManager);
txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
Person p2 = new Person("Thomas", 20);
Resume r2 = new Resume();
r2.addEducation("Skanstulls High School, 1975");
r2.addJob("DiMark, DBA, 1990-2000");
p2.setResume(r2);
p2.setId(2L);
entityManager.persist(p2);
Person p3 = new Person("Thomas", 20);
Resume r3 = new Resume();
r3.addEducation("Univ. of Stockholm, 1980");
r3.addJob("VMware, Developer, 2007-");
p3.setResume(r3);
p3.setId(3L);
entityManager.persist(p3);
return null;
}
});
txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
final Person found2 = entityManager.find(Person.class, 2L);
entityManager.remove(found2);
return null;
}
});
boolean weFound3 = false;
for (DBObject dbo : this.mongoTemplate.getCollection(Person.class.getName()).find()) {
Assert.assertTrue(!dbo.get("_entity_id").equals(2L));
if (dbo.get("_entity_id").equals(3L)) {
weFound3 = true;
}
}
Assert.assertTrue(weFound3);
}
}

View File

@@ -1,87 +0,0 @@
package org.springframework.data.document.persistence.test;
import javax.persistence.Entity;
import javax.persistence.Id;
import org.springframework.data.persistence.document.RelatedDocument;
@Entity
public class Person {
@Id
Long id;
private String name;
private int age;
private java.util.Date birthDate;
@RelatedDocument
private Address address;
@RelatedDocument
private Resume resume;
public Person() {
}
public Person(String name, int age) {
this.name = name;
this.age = age;
this.birthDate = new java.util.Date();
}
public void birthday() {
++age;
}
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public int getAge() {
return age;
}
public void setAge(int age) {
this.age = age;
}
public java.util.Date getBirthDate() {
return birthDate;
}
public void setBirthDate(java.util.Date birthDate) {
this.birthDate = birthDate;
}
public Resume getResume() {
return resume;
}
public void setResume(Resume resume) {
this.resume = resume;
}
public Address getAddress() {
return address;
}
public void setAddress(Address address) {
this.address = address;
}
}

View File

@@ -1,48 +0,0 @@
package org.springframework.data.document.persistence.test;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.bson.types.ObjectId;
import org.springframework.data.annotation.Id;
import org.springframework.data.document.mongodb.mapping.Document;
@Document
public class Resume {
private static final Log LOGGER = LogFactory.getLog(Resume.class);
@Id
private ObjectId id;
private String education = "";
private String jobs = "";
public String getId() {
return id.toString();
}
public String getEducation() {
return education;
}
public void addEducation(String education) {
LOGGER.debug("Adding education " + education);
this.education = this.education + (this.education.length() > 0 ? "; " : "") + education;
}
public String getJobs() {
return jobs;
}
public void addJob(String job) {
LOGGER.debug("Adding job " + job);
this.jobs = this.jobs + (this.jobs.length() > 0 ? "; " : "") + job;
}
@Override
public String toString() {
return "Resume [education=" + education + ", jobs=" + jobs + "]";
}
}

View File

@@ -0,0 +1,169 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import org.junit.Assert;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.crossstore.test.Address;
import org.springframework.data.mongodb.crossstore.test.Person;
import org.springframework.data.mongodb.crossstore.test.Resume;
import org.springframework.test.annotation.Rollback;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.TransactionStatus;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.transaction.support.TransactionCallback;
import org.springframework.transaction.support.TransactionTemplate;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = "classpath:/META-INF/spring/applicationContext.xml")
public class CrossStoreMongoTests {
@Autowired
private MongoTemplate mongoTemplate;
private EntityManager entityManager;
@Autowired
private PlatformTransactionManager transactionManager;
@PersistenceContext
public void setEntityManager(EntityManager entityManager) {
this.entityManager = entityManager;
}
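// Drops the given collection (if it exists) so each cross-store test run starts from a clean state.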
private void clearData(String collectionName) {
DBCollection col = this.mongoTemplate.getCollection(collectionName);
if (col != null) {
this.mongoTemplate.dropCollection(collectionName);
}
}
@Test
@Transactional
@Rollback(false)
public void testCreateJpaToMongoEntityRelationship() {
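// Note: @Rollback(false) commits this data so the read/update tests below can look up Person 1 again.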
clearData(Person.class.getName());
Person p = new Person("Thomas", 20);
Address a = new Address(12, "MAin St.", "Boston", "MA", "02101");
p.setAddress(a);
Resume r = new Resume();
r.addEducation("Skanstulls High School, 1975");
r.addEducation("Univ. of Stockholm, 1980");
r.addJob("DiMark, DBA, 1990-2000");
r.addJob("VMware, Developer, 2007-");
p.setResume(r);
p.setId(1L);
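// Persisting the entity should also let the cross-store aspect write the @RelatedDocument fields (Address, Resume) to MongoDB.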
entityManager.persist(p);
}
@Test
@Transactional
@Rollback(false)
public void testReadJpaToMongoEntityRelationship() {
Person found = entityManager.find(Person.class, 1L);
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found.getResume());
Assert.assertEquals("DiMark, DBA, 1990-2000" + "; "
+ "VMware, Developer, 2007-", found.getResume().getJobs());
found.getResume().addJob("SpringDeveloper.com, Consultant, 2005-2006");
found.setAge(44);
}
@Test
@Transactional
@Rollback(false)
public void testUpdatedJpaToMongoEntityRelationship() {
Person found = entityManager.find(Person.class, 1L);
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found.getResume());
Assert.assertEquals("DiMark, DBA, 1990-2000" + "; "
+ "VMware, Developer, 2007-" + "; "
+ "SpringDeveloper.com, Consultant, 2005-2006", found.getResume().getJobs());
}
@Test
public void testMergeJpaEntityWithMongoDocument() {
TransactionTemplate txTemplate = new TransactionTemplate(transactionManager);
final Person detached = entityManager.find(Person.class, 1L);
detached.getResume().addJob("TargetRx, Developer, 2000-2005");
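// Merging the detached entity inside a programmatic transaction should flush the added job to the MongoDB-backed Resume as well.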
Person merged = txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
return entityManager.merge(detached);
}
});
Assert.assertTrue(detached.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
Assert.assertTrue(merged.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
final Person updated = entityManager.find(Person.class, 1L);
Assert.assertTrue(updated.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
}
@Test
public void testRemoveJpaEntityWithMongoDocument() {
TransactionTemplate txTemplate = new TransactionTemplate(transactionManager);
txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
Person p2 = new Person("Thomas", 20);
Resume r2 = new Resume();
r2.addEducation("Skanstulls High School, 1975");
r2.addJob("DiMark, DBA, 1990-2000");
p2.setResume(r2);
p2.setId(2L);
entityManager.persist(p2);
Person p3 = new Person("Thomas", 20);
Resume r3 = new Resume();
r3.addEducation("Univ. of Stockholm, 1980");
r3.addJob("VMware, Developer, 2007-");
p3.setResume(r3);
p3.setId(3L);
entityManager.persist(p3);
return null;
}
});
txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
final Person found2 = entityManager.find(Person.class, 2L);
entityManager.remove(found2);
return null;
}
});
boolean weFound3 = false;
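// Documents written by the change-set persister carry the owning JPA entity's id in "_entity_id": Person 2's document must be gone, Person 3's must remain.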
for (DBObject dbo : this.mongoTemplate.getCollection(Person.class.getName()).find()) {
Assert.assertTrue(!dbo.get("_entity_id").equals(2L));
if (dbo.get("_entity_id").equals(3L)) {
weFound3 = true;
}
}
Assert.assertTrue(weFound3);
}
}

View File

@@ -1,13 +1,28 @@
package org.springframework.data.document.persistence.test;
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore.test;
public class Address {
private Integer streetNumber;
private String streetName;
private String city;
private String state;
private String zip;
public Address(Integer streetNumber, String streetName, String city,
String state, String zip) {
super();
@@ -17,7 +32,7 @@ public class Address {
this.state = state;
this.zip = zip;
}
public Integer getStreetNumber() {
return streetNumber;
}
@@ -48,7 +63,7 @@ public class Address {
public void setZip(String zip) {
this.zip = zip;
}
}

View File

@@ -0,0 +1,102 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore.test;
import javax.persistence.Entity;
import javax.persistence.Id;
import org.springframework.data.mongodb.crossstore.RelatedDocument;
@Entity
public class Person {
@Id
Long id;
private String name;
private int age;
private java.util.Date birthDate;
@RelatedDocument
private Address address;
@RelatedDocument
private Resume resume;
public Person() {
}
public Person(String name, int age) {
this.name = name;
this.age = age;
this.birthDate = new java.util.Date();
}
public void birthday() {
++age;
}
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public int getAge() {
return age;
}
public void setAge(int age) {
this.age = age;
}
public java.util.Date getBirthDate() {
return birthDate;
}
public void setBirthDate(java.util.Date birthDate) {
this.birthDate = birthDate;
}
public Resume getResume() {
return resume;
}
public void setResume(Resume resume) {
this.resume = resume;
}
public Address getAddress() {
return address;
}
public void setAddress(Address address) {
this.address = address;
}
}

View File

@@ -0,0 +1,63 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore.test;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.bson.types.ObjectId;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
@Document
public class Resume {
private static final Log LOGGER = LogFactory.getLog(Resume.class);
@Id
private ObjectId id;
private String education = "";
private String jobs = "";
public String getId() {
return id.toString();
}
public String getEducation() {
return education;
}
public void addEducation(String education) {
LOGGER.debug("Adding education " + education);
this.education = this.education + (this.education.length() > 0 ? "; " : "") + education;
}
public String getJobs() {
return jobs;
}
public void addJob(String job) {
LOGGER.debug("Adding job " + job);
this.jobs = this.jobs + (this.jobs.length() > 0 ? "; " : "") + job;
}
@Override
public String toString() {
return "Resume [education=" + education + ", jobs=" + jobs + "]";
}
}

View File

@@ -4,7 +4,7 @@
xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">
<persistence-unit name="test" transaction-type="RESOURCE_LOCAL">
<provider>org.hibernate.ejb.HibernatePersistence</provider>
<class>org.springframework.data.document.persistence.test.Person</class>
<class>org.springframework.data.mongodb.crossstore.test.Person</class>
<properties>
<property name="hibernate.dialect" value="org.hibernate.dialect.HSQLDialect"/>
<!--value='create' to build a new database on each run; value='update' to modify an existing database; value='create-drop' means the same as 'create' but also drops tables when Hibernate closes; value='validate' makes no changes to the database-->

View File

@@ -13,37 +13,37 @@
<context:spring-configured/>
<context:component-scan base-package="org.springframework.persistence.test">
<context:component-scan base-package="org.springframework.persistence.mongodb.test">
<context:exclude-filter expression="org.springframework.stereotype.Controller" type="annotation"/>
</context:component-scan>
<mongo:mapping-converter/>
<!-- Mongo config -->
<bean id="mongo" class="org.springframework.data.document.mongodb.MongoFactoryBean">
<bean id="mongo" class="org.springframework.data.mongodb.core.MongoFactoryBean">
<property name="host" value="localhost"/>
<property name="port" value="27017"/>
</bean>
<bean id="mongoDbFactory" class="org.springframework.data.document.mongodb.SimpleMongoDbFactory">
<bean id="mongoDbFactory" class="org.springframework.data.mongodb.core.SimpleMongoDbFactory">
<constructor-arg name="mongo" ref="mongo"/>
<constructor-arg name="databaseName" value="database"/>
</bean>
<bean id="mongoTemplate" class="org.springframework.data.document.mongodb.MongoTemplate">
<bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg name="mongoDbFactory" ref="mongoDbFactory"/>
<constructor-arg name="mongoConverter" ref="mappingConverter"/>
</bean>
<bean class="org.springframework.data.document.mongodb.MongoExceptionTranslator"/>
<bean class="org.springframework.data.mongodb.core.MongoExceptionTranslator"/>
<!-- Mongo aspect config -->
<bean class="org.springframework.data.persistence.document.mongodb.MongoDocumentBacking"
<bean class="org.springframework.data.mongodb.crossstore.MongoDocumentBacking"
factory-method="aspectOf">
<property name="changeSetPersister" ref="mongoChangeSetPersister"/>
</bean>
<bean id="mongoChangeSetPersister"
class="org.springframework.data.persistence.document.mongodb.MongoChangeSetPersister">
class="org.springframework.data.mongodb.crossstore.MongoChangeSetPersister">
<property name="mongoTemplate" ref="mongoTemplate"/>
<property name="entityManagerFactory" ref="entityManagerFactory"/>
</bean>

View File

@@ -2,7 +2,7 @@ log4j.rootCategory=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - <%m>%n
log4j.appender.stdout.layout.ConversionPattern=%d{ABSOLUTE} %5p %40.40c:%4L - %m%n
log4j.category.org.springframework=INFO
log4j.category.org.springframework.data=TRACE

View File

@@ -4,12 +4,11 @@
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-document-parent</artifactId>
<version>1.0.0.M3</version>
<relativePath>../spring-data-document-parent/pom.xml</relativePath>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.0.0.M5</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-log4j</artifactId>
<packaging>jar</packaging>
<name>Spring Data MongoDB Log4J Appender</name>
<properties>

View File

@@ -3,10 +3,10 @@
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-document-parent</artifactId>
<name>Spring Data Document Parent</name>
<url>http://www.springsource.org/spring-data/data-document</url>
<version>1.0.0.M3</version>
<artifactId>spring-data-mongodb-parent</artifactId>
<name>Spring Data MongoDB Parent</name>
<url>http://www.springsource.org/spring-data/mongodb</url>
<version>1.0.0.M5</version>
<packaging>pom</packaging>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
@@ -16,8 +16,8 @@
<org.mockito.version>1.8.4</org.mockito.version>
<org.slf4j.version>1.5.10</org.slf4j.version>
<org.codehaus.jackson.version>1.6.1</org.codehaus.jackson.version>
<org.springframework.version>3.0.5.RELEASE</org.springframework.version>
<data.commons.version>1.1.0.M1</data.commons.version>
<org.springframework.version>3.0.6.RELEASE</org.springframework.version>
<data.commons.version>1.2.0.M2</data.commons.version>
<aspectj.version>1.6.11.RELEASE</aspectj.version>
</properties>
<profiles>
@@ -39,15 +39,15 @@
<distributionManagement>
<site>
<id>spring-site-staging</id>
<url>file:///${java.io.tmpdir}/spring-data/data-document/docs</url>
<url>file:///${java.io.tmpdir}/spring-data/mongodb/docs</url>
</site>
<repository>
<id>spring-milestone-staging</id>
<url>file:///${java.io.tmpdir}/spring-data/data-document/milestone</url>
<url>file:///${java.io.tmpdir}/spring-data/mongodb/milestone</url>
</repository>
<snapshotRepository>
<id>spring-snapshot-staging</id>
<url>file:///${java.io.tmpdir}/spring-data/data-document/snapshot</url>
<url>file:///${java.io.tmpdir}/spring-data/mongodb/snapshot</url>
</snapshotRepository>
</distributionManagement>
</profile>
@@ -63,7 +63,7 @@
<site>
<id>static.springframework.org</id>
<url>
scp://static.springframework.org/var/www/domains/springframework.org/static/htdocs/spring-data/data-document/docs/${project.version}
scp://static.springframework.org/var/www/domains/springframework.org/static/htdocs/spring-data/mongodb/docs/${project.version}
</url>
</site>
<repository>

View File

@@ -1,16 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry including="**/*.java" kind="src" output="target/classes" path="src/main/java"/>
<classpathentry excluding="**" kind="src" output="target/classes" path="src/main/resources"/>
<classpathentry including="**/*.java" kind="src" output="target/test-classes" path="src/test/java"/>
<classpathentry kind="src" output="target/test-classes" path="target/generated-sources/test-annotations"/>
<classpathentry excluding="**" kind="src" output="target/test-classes" path="src/test/resources"/>
<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/J2SE-1.5"/>
<classpathentry kind="con" path="org.maven.ide.eclipse.MAVEN2_CLASSPATH_CONTAINER">
<attributes>
<attribute name="org.eclipse.jst.component.nondependency" value=""/>
</attributes>
</classpathentry>
<classpathentry kind="con" path="org.eclipse.jst.j2ee.internal.module.container"/>
<classpathentry kind="output" path="target/classes"/>
</classpath>

View File

@@ -0,0 +1,140 @@
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<context version="7.0.3.1152">
<scope name="spring-data-mongodb" type="Project">
<element name="Filter" type="TypeFilterReferenceOverridden">
<element name="org.springframework.data.mongodb.**" type="IncludeTypePattern"/>
</element>
<architecture>
<element name="Config" type="Layer">
<element name="Assignment" type="TypeFilter">
<element name="**.config.**" type="WeakTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Monitoring"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories"/>
</element>
<element name="Repositories" type="Layer">
<element name="Assignment" type="TypeFilter">
<element name="**.repository.**" type="IncludeTypePattern"/>
</element>
<element name="API" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.repository.*" type="IncludeTypePattern"/>
</element>
</element>
<element name="Query" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.query.**" type="IncludeTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API"/>
</element>
<element name="Implementation" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.support.**" type="IncludeTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Query"/>
</element>
<element name="Config" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.config.**" type="IncludeTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Implementation"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core"/>
</element>
<element name="Monitoring" type="Layer">
<element name="Assignment" type="TypeFilter">
<element name="**.monitor.**" type="IncludeTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core"/>
</element>
<element name="Core" type="Layer">
<element name="Assignment" type="TypeFilter">
<element name="**.core.**" type="IncludeTypePattern"/>
</element>
<element name="Mapping" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.mapping.**" type="IncludeTypePattern"/>
</element>
</element>
<element name="Geospatial" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.geo.**" type="IncludeTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping"/>
</element>
<element name="Query" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.query.**" type="IncludeTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial"/>
</element>
<element name="Index" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.index.**" type="IncludeTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query"/>
</element>
<element name="Core" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.core.**" type="WeakTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Index"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query"/>
</element>
</element>
</architecture>
<workspace>
<element name="src/main/java" type="JavaRootDirectory">
<reference name="Project|spring-data-mongodb::BuildUnit|spring-data-mongodb"/>
</element>
<element name="target/classes" type="JavaRootDirectory">
<reference name="Project|spring-data-mongodb::BuildUnit|spring-data-mongodb"/>
</element>
</workspace>
<physical>
<element name="spring-data-mongodb" type="BuildUnit"/>
</physical>
</scope>
<scope name="External" type="External">
<element name="Filter" type="TypeFilter">
<element name="**" type="IncludeTypePattern"/>
<element name="java.**" type="ExcludeTypePattern"/>
<element name="javax.**" type="ExcludeTypePattern"/>
</element>
<architecture>
<element name="Spring" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="org.springframework.**" type="IncludeTypePattern"/>
<element name="org.springframework.data.**" type="ExcludeTypePattern"/>
</element>
</element>
<element name="Spring Data Core" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="org.springframework.data.**" type="IncludeTypePattern"/>
</element>
</element>
<element name="Mongo Java Driver" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="com.mongodb.**" type="IncludeTypePattern"/>
<element name="org.bson.**" type="IncludeTypePattern"/>
</element>
</element>
<element name="Querydsl" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="com.mysema.query.**" type="IncludeTypePattern"/>
</element>
</element>
</architecture>
</scope>
<scope name="Global" type="Global">
<element name="Configuration" type="Configuration"/>
<element name="Filter" type="TypeFilter">
<element name="**" type="IncludeTypePattern"/>
</element>
</scope>
</context>

View File

@@ -0,0 +1,291 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<profiles version="12">
<profile kind="CodeFormatterProfile" name="Spring Data" version="12">
<setting id="org.eclipse.jdt.core.formatter.comment.insert_new_line_before_root_tags" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.disabling_tag" value="@formatter:off"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_annotation" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_type_arguments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_anonymous_type_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_case" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_brace_in_array_initializer" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.new_lines_at_block_boundaries" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_annotation_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_before_closing_brace_in_array_initializer" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_annotation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_field" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_while" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.use_on_off_tags" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_annotation_type_member_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_before_else_in_if_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_prefix_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.keep_else_statement_on_same_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_ellipsis" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.insert_new_line_for_parameter" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_annotation_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_breaks_compare_to_cases" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_at_in_annotation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_multiple_fields" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_expressions_in_array_initializer" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_conditional_expression" value="80"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_for" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_binary_operator" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_question_in_wildcard" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_array_initializer" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_before_finally_in_try_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_local_variable" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_before_catch_in_try_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_while" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_after_package" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_type_parameters" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.continuation_indentation" value="2"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_postfix_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_method_invocation" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_angle_bracket_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_superinterfaces" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_new_chunk" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_binary_operator" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_package" value="0"/>
<setting id="org.eclipse.jdt.core.compiler.source" value="1.7"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_enum_constant_arguments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_constructor_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_line_comments" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_angle_bracket_in_type_arguments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_enum_declarations" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.join_wrapped_lines" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_block" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_explicit_constructor_call" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_method_invocation_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_member_type" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.align_type_members_on_columns" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_for" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_method_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_selector_in_method_invocation" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_switch" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_unary_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_case" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.indent_parameter_description" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_method_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_switch" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_enum_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_angle_bracket_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.clear_blank_lines_in_block_comment" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.lineSplit" value="120"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_if" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_brackets_in_array_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_parenthesized_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_explicitconstructorcall_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_constructor_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_first_class_body_declaration" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_method" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indentation.size" value="2"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_method_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.enabling_tag" value="@formatter:on"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_superclass_in_type_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_assignment" value="0"/>
<setting id="org.eclipse.jdt.core.compiler.problem.assertIdentifier" value="error"/>
<setting id="org.eclipse.jdt.core.formatter.tabulation.char" value="tab"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_constructor_declaration_parameters" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_semicolon_in_try_resources" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_prefix_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_statements_compare_to_body" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_method" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.wrap_outer_expressions_when_nested" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.format_guardian_clause_on_one_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_for" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_cast" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_parameters_in_constructor_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_labeled_statement" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_annotation_type_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_method_body" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_method_declaration" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_method_invocation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_try" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_bracket_in_array_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_enum_constant" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_annotation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_at_in_annotation_type_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_method_declaration_throws" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_if" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_switch" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_method_declaration_throws" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_parenthesized_expression_in_return" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_annotation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_question_in_conditional" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_question_in_wildcard" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_try" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_bracket_in_array_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.preserve_white_space_between_code_and_line_comments" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_parenthesized_expression_in_throw" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.compiler.problem.enumIdentifier" value="error"/>
<setting id="org.eclipse.jdt.core.formatter.indent_switchstatements_compare_to_switch" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_ellipsis" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_block" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_for_inits" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_method_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.compact_else_if" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.wrap_before_or_operator_multicatch" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_array_initializer" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_for_increments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.format_line_comment_starting_on_first_column" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_bracket_in_array_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_field" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_enum_constant" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.comment.indent_root_tags" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_enum_declarations" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_union_type_in_multicatch" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_explicitconstructorcall_arguments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_switch" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_method_declaration_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_superinterfaces" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.tabulation.size" value="2"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_bracket_in_array_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_opening_brace_in_array_initializer" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_brace_in_block" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_bracket_in_array_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_constant" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_angle_bracket_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_constructor_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_constructor_declaration_throws" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_if" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.clear_blank_lines_in_javadoc_comment" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_throws_clause_in_constructor_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_assignment_operator" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_assignment_operator" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_empty_lines" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_synchronized" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_paren_in_cast" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_method_declaration_parameters" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_block_in_case" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.number_of_empty_lines_to_preserve" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_method_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_catch" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_constructor_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_method_invocation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_bracket_in_array_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_qualified_allocation_expression" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_and_in_type_parameter" value="insert"/>
<setting id="org.eclipse.jdt.core.compiler.compliance" value="1.7"/>
<setting id="org.eclipse.jdt.core.formatter.continuation_indentation_for_array_initializer" value="2"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_brackets_in_array_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_at_in_annotation_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_allocation_expression" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_cast" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_unary_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_angle_bracket_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_anonymous_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.keep_empty_array_initializer_on_one_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.keep_imple_if_on_one_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_constructor_declaration_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_angle_bracket_in_type_parameters" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_at_end_of_file_if_missing" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_for" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_labeled_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_superinterfaces_in_type_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_binary_expression" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_enum_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_type" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_while" value="do not insert"/>
<setting id="org.eclipse.jdt.core.compiler.codegen.inlineJsrBytecode" value="enabled"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_try" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.put_empty_statement_on_new_line" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_label" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_parameter" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_angle_bracket_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_method_invocation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_before_while_in_do_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_enum_constant" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_javadoc_comments" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.comment.line_length" value="120"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_package" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_between_import_groups" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_enum_constant_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_semicolon" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_constructor_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.number_of_blank_lines_at_beginning_of_method_body" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_conditional" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_type_header" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_annotation_type_member_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.wrap_before_binary_operator" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_enum_declaration_header" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_between_type_declarations" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_synchronized" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_statements_compare_to_block" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_superinterfaces_in_enum_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.join_lines_in_comments" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_question_in_conditional" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_multiple_field_declarations" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_compact_if" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_for_inits" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_switchstatements_compare_to_cases" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_array_initializer" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_default" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_and_in_type_parameter" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_constructor_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_imports" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_assert" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_html" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_throws_clause_in_method_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_angle_bracket_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_bracket_in_array_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_anonymous_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_conditional" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_angle_bracket_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_for" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_postfix_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_source_code" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_synchronized" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_allocation_expression" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_constructor_declaration_throws" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_parameters_in_method_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_brace_in_array_initializer" value="insert"/>
<setting id="org.eclipse.jdt.core.compiler.codegen.targetPlatform" value="1.7"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_resources_in_try" value="80"/>
<setting id="org.eclipse.jdt.core.formatter.use_tabs_only_for_leading_indentations" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_annotation" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_header" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_block_comments" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_enum_constants" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_block" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_annotation_declaration_header" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_parenthesized_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_parenthesized_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_catch" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_multiple_local_declarations" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_switch" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_for_increments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_method_invocation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_assert" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_type_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_array_initializer" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_braces_in_array_initializer" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_method_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_semicolon_in_for" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_catch" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_angle_bracket_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_multiple_field_declarations" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_annotation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_parameterized_type_reference" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_method_invocation_arguments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.new_lines_at_javadoc_boundaries" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_after_imports" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_multiple_local_declarations" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_enum_constant_header" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_semicolon_in_for" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.never_indent_line_comments_on_first_column" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_semicolon_in_try_resources" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_angle_bracket_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.never_indent_block_comments_on_first_column" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.keep_then_statement_on_same_line" value="false"/>
</profile>
</profiles>

View File

@@ -4,17 +4,16 @@
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-document-parent</artifactId>
<version>1.0.0.M3</version>
<relativePath>../spring-data-document-parent/pom.xml</relativePath>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.0.0.M5</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb</artifactId>
<packaging>jar</packaging>
<name>Spring Data MongoDB Support</name>
<name>Spring Data MongoDB</name>
<properties>
<mongo.version>2.5.3</mongo.version>
<querydsl.version>2.2.0-beta4</querydsl.version>
<mongo.version>2.6.5</mongo.version>
<querydsl.version>2.2.4</querydsl.version>
</properties>
<dependencies>
@@ -139,7 +138,7 @@
<plugin>
<groupId>com.mysema.maven</groupId>
<artifactId>maven-apt-plugin</artifactId>
<version>1.0.1</version>
<version>1.0.2</version>
<executions>
<execution>
<phase>generate-test-sources</phase>
@@ -148,7 +147,7 @@
</goals>
<configuration>
<outputDirectory>target/generated-test-sources</outputDirectory>
<processor>org.springframework.data.document.mongodb.repository.MongoAnnotationProcessor</processor>
<processor>org.springframework.data.mongodb.repository.support.MongoAnnotationProcessor</processor>
</configuration>
</execution>
</executions>

View File

@@ -1,13 +0,0 @@
package org.springframework.data.document.mongodb;
import org.springframework.dao.DataAccessException;
import com.mongodb.DB;
public interface MongoDbFactory {
DB getDb() throws DataAccessException;
DB getDb(String dbName) throws DataAccessException;
}
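
For orientation, the removed contract was typically consumed as in the minimal sketch below; the host, port and database names are illustrative assumptions, not values taken from this change set.

import com.mongodb.DB;
import com.mongodb.Mongo;
import org.springframework.data.document.mongodb.MongoDbFactory;
import org.springframework.data.document.mongodb.SimpleMongoDbFactory;

public class MongoDbFactoryUsage {
  public static void main(String[] args) throws Exception {
    // Illustrative connection to a local mongod instance.
    Mongo mongo = new Mongo("localhost", 27017);
    MongoDbFactory factory = new SimpleMongoDbFactory(mongo, "test");
    DB defaultDb = factory.getDb();      // the database configured on the factory
    DB adminDb = factory.getDb("admin"); // an explicitly named database
    System.out.println(defaultDb.getName() + " / " + adminDb.getName());
    mongo.close();
  }
}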

View File

@@ -1,244 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb;
import java.beans.PropertyDescriptor;
import java.lang.reflect.Method;
import java.lang.reflect.Type;
import java.math.BigInteger;
import java.util.*;
import org.bson.types.ObjectId;
import org.springframework.beans.BeanUtils;
import org.springframework.util.Assert;
import org.springframework.util.ReflectionUtils;
/**
* An iterable of {@link MongoPropertyDescriptor}s that allows dedicated access to the {@link MongoPropertyDescriptor}
* that captures the id-property.
*
* @author Oliver Gierke
*/
public class MongoPropertyDescriptors implements Iterable<MongoPropertyDescriptors.MongoPropertyDescriptor> {
private final Collection<MongoPropertyDescriptors.MongoPropertyDescriptor> descriptors;
private final MongoPropertyDescriptors.MongoPropertyDescriptor idDescriptor;
/**
* Creates the {@link MongoPropertyDescriptors} for the given type.
*
* @param type
*/
public MongoPropertyDescriptors(Class<?> type) {
Assert.notNull(type);
Set<MongoPropertyDescriptors.MongoPropertyDescriptor> descriptors = new HashSet<MongoPropertyDescriptors.MongoPropertyDescriptor>();
MongoPropertyDescriptors.MongoPropertyDescriptor idDescriptor = null;
for (PropertyDescriptor candidate : BeanUtils.getPropertyDescriptors(type)) {
MongoPropertyDescriptor descriptor = new MongoPropertyDescriptors.MongoPropertyDescriptor(candidate, type);
descriptors.add(descriptor);
if (descriptor.isIdProperty()) {
idDescriptor = descriptor;
}
}
this.descriptors = Collections.unmodifiableSet(descriptors);
this.idDescriptor = idDescriptor;
}
/**
* Returns the {@link MongoPropertyDescriptor} for the id property.
*
* @return the idDescriptor
*/
public MongoPropertyDescriptors.MongoPropertyDescriptor getIdDescriptor() {
return idDescriptor;
}
/*
* (non-Javadoc)
*
* @see java.lang.Iterable#iterator()
*/
public Iterator<MongoPropertyDescriptors.MongoPropertyDescriptor> iterator() {
return descriptors.iterator();
}
/**
* Simple value object to have a more suitable abstraction for MongoDB specific property handling.
*
* @author Oliver Gierke
*/
public static class MongoPropertyDescriptor {
public static Collection<Class<?>> SUPPORTED_ID_CLASSES;
static {
Set<Class<?>> classes = new HashSet<Class<?>>();
classes.add(ObjectId.class);
classes.add(String.class);
classes.add(BigInteger.class);
SUPPORTED_ID_CLASSES = Collections.unmodifiableCollection(classes);
}
private static final String ID_PROPERTY = "id";
static final String ID_KEY = "_id";
private final PropertyDescriptor delegate;
private final Class<?> owningType;
/**
* Creates a new {@link MongoPropertyDescriptor} for the given {@link PropertyDescriptor}.
*
* @param descriptor
* @param owningType
*/
public MongoPropertyDescriptor(PropertyDescriptor descriptor, Class<?> owningType) {
Assert.notNull(descriptor);
this.delegate = descriptor;
this.owningType = owningType;
}
/**
* Returns whether the property is the id-property. Will be identified by name for now ({@value #ID_PROPERTY} or {@value #ID_KEY}).
*
* @return
*/
public boolean isIdProperty() {
return ID_PROPERTY.equals(delegate.getName()) || ID_KEY.equals(delegate.getName());
}
/**
* Returns whether the property is of one of the supported id types. Currently we support {@link String},
* {@link ObjectId} and {@link BigInteger}.
*
* @return
*/
public boolean isOfIdType() {
return SUPPORTED_ID_CLASSES.contains(delegate.getPropertyType());
}
/**
* Returns the key that shall be used for mapping. Will return {@value #ID_KEY} for the id property and the plain
* name for all other ones.
*
* @return
*/
public String getKeyToMap() {
return isIdProperty() ? ID_KEY : delegate.getName();
}
/**
* Returns the name of the property.
*
* @return
*/
public String getName() {
return delegate.getName();
}
/**
* Returns whether the underlying property is actually mappable. By default this will exclude the {@literal class}
* property and only include properties that have both a getter and a backing field.
*
* @return
*/
public boolean isMappable() {
boolean isNotClassAttribute = !delegate.getName().equals("class");
boolean hasGetter = delegate.getReadMethod() != null;
boolean hasField = ReflectionUtils.findField(owningType, delegate.getName()) != null;
return isNotClassAttribute && hasGetter && hasField;
}
/**
* Returns the plain property type.
*
* @return
*/
public Class<?> getPropertyType() {
return delegate.getPropertyType();
}
/**
* Returns the type to be set. Will return the setter method's parameter type and fall back to the getter method's return
* type in case no setter is available. Useful for further (generics) inspection.
*
* @return
*/
public Type getTypeToSet() {
Method method = delegate.getWriteMethod();
return method == null ? delegate.getReadMethod().getGenericReturnType() : method.getGenericParameterTypes()[0];
}
/**
* Returns whether the descriptor describes a {@link Map}.
*
* @return
*/
public boolean isMap() {
return Map.class.isAssignableFrom(getPropertyType());
}
/**
* Returns whether the descriptor is for a collection.
*
* @return
*/
public boolean isCollection() {
return Collection.class.isAssignableFrom(getPropertyType());
}
/**
* Returns whether the descriptor is for an {@link Enum}.
*
* @return
*/
public boolean isEnum() {
return Enum.class.isAssignableFrom(getPropertyType());
}
/*
* (non-Javadoc)
*
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (obj == this) {
return true;
}
if (obj == null || !getClass().equals(obj.getClass())) {
return false;
}
MongoPropertyDescriptor that = (MongoPropertyDescriptor) obj;
return that.delegate.equals(this.delegate);
}
/*
* (non-Javadoc)
*
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return delegate.hashCode();
}
}
}
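
To illustrate what the descriptor abstraction reports, the sketch below runs it against a hypothetical Person bean (id and name fields with getters); the class and its values are assumptions made up for this example.

import org.springframework.data.document.mongodb.MongoPropertyDescriptors;
import org.springframework.data.document.mongodb.MongoPropertyDescriptors.MongoPropertyDescriptor;

public class DescriptorExample {

  // Hypothetical bean used only for this illustration.
  public static class Person {
    private String id;
    private String name;
    public String getId() { return id; }
    public String getName() { return name; }
  }

  public static void main(String[] args) {
    MongoPropertyDescriptors descriptors = new MongoPropertyDescriptors(Person.class);
    for (MongoPropertyDescriptor descriptor : descriptors) {
      if (descriptor.isMappable()) {
        // "id" maps to the "_id" key, "name" maps to "name".
        System.out.println(descriptor.getName() + " -> " + descriptor.getKeyToMap());
      }
    }
    System.out.println("id property: " + descriptors.getIdDescriptor().getName());
  }
}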

View File

@@ -1,43 +0,0 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb;
import com.mongodb.DBObject;
/**
* A MongoReader is responsible for converting a native MongoDB DBObject to an object of type T.
*
* @param <T>
* the type of the object to convert from a DBObject
* @author Mark Pollack
* @author Thomas Risberg
* @author Oliver Gierke
*/
public interface MongoReader<T> {
/**
* Reads the native MongoDB DBObject representation into an instance of the class T. The given type has to be the
* starting point for marshalling the {@link DBObject} into it. So in case there's no valid data inside the
* {@link DBObject} for the given type, just return an empty instance of the given type.
*
* @param clazz
* the type of the return value
* @param dbo
* the DBObject to read from
* @return the converted object
*/
<S extends T> S read(Class<S> clazz, DBObject dbo);
}

View File

@@ -1,60 +0,0 @@
package org.springframework.data.document.mongodb;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.util.Assert;
import com.mongodb.DB;
import com.mongodb.Mongo;
public class SimpleMongoDbFactory implements MongoDbFactory {
/**
* Logger, available to subclasses.
*/
protected final Log logger = LogFactory.getLog(getClass());
private Mongo mongo;
private String databaseName;
private String username;
private String password;
/**
* Create an instance of SimpleMongoDbFactory given the Mongo instance and database name
* @param mongo Mongo instance, not null
* @param databaseName Database name, not null
*/
public SimpleMongoDbFactory(Mongo mongo, String databaseName) {
Assert.notNull(mongo, "Mongo must not be null");
Assert.hasText(databaseName, "Database name must not be empty");
this.mongo = mongo;
this.databaseName = databaseName;
}
/**
* Create an instance of SimpleMongoDbFactory given the Mongo instance, database name, and username/password
* @param mongo Mongo instance, not null
* @param databaseName Database name, not null
* @param userCredentials username and password
*/
public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials userCredentials) {
this(mongo, databaseName);
this.username = userCredentials.getUsername();
this.password = userCredentials.getPassword();
}
public DB getDb() throws DataAccessException {
Assert.notNull(mongo, "Mongo must not be null");
Assert.hasText(databaseName, "Database name must not be empty");
return MongoDbUtils.getDB(mongo, databaseName, username, password == null ? null : password.toCharArray());
}
public DB getDb(String dbName) throws DataAccessException {
Assert.notNull(mongo, "Mongo must not be null");
Assert.hasText(dbName, "Database name must not be empty");
return MongoDbUtils.getDB(mongo, dbName, username, password == null ? null : password.toCharArray());
}
}
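
When authentication is required, the second constructor takes a UserCredentials value object. A hedged sketch; the user name, password and database name below are made-up examples.

import com.mongodb.DB;
import com.mongodb.Mongo;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.document.mongodb.MongoDbFactory;
import org.springframework.data.document.mongodb.SimpleMongoDbFactory;

public class AuthenticatedFactoryExample {
  public static void main(String[] args) throws Exception {
    Mongo mongo = new Mongo("localhost", 27017);
    UserCredentials credentials = new UserCredentials("appUser", "secret");
    MongoDbFactory factory = new SimpleMongoDbFactory(mongo, "orders", credentials);
    DB db = factory.getDb(); // MongoDbUtils authenticates with the supplied username/password
    System.out.println(db.getName());
    mongo.close();
  }
}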

View File

@@ -1,101 +0,0 @@
/*
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.config;
import static org.springframework.data.document.mongodb.config.BeanNames.*;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.config.RuntimeBeanReference;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionReaderUtils;
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.document.mongodb.MongoFactoryBean;
import org.springframework.data.document.mongodb.SimpleMongoDbFactory;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
*/
public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext) throws BeanDefinitionStoreException {
String id = element.getAttribute("id");
if (!StringUtils.hasText(id)) {
id = DB_FACTORY;
}
return id;
}
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
BeanDefinitionBuilder dbFactoryBuilder = BeanDefinitionBuilder.genericBeanDefinition(SimpleMongoDbFactory.class);
// UserCredentials
BeanDefinitionBuilder userCredentialsBuilder = BeanDefinitionBuilder.genericBeanDefinition(UserCredentials.class);
String username = element.getAttribute("username");
if (StringUtils.hasText(username)) {
userCredentialsBuilder.addConstructorArgValue(username);
} else {
userCredentialsBuilder.addConstructorArgValue(null);
}
String password = element.getAttribute("password");
if (StringUtils.hasText(password)) {
userCredentialsBuilder.addConstructorArgValue(password);
} else {
userCredentialsBuilder.addConstructorArgValue(null);
}
// host and port
String host = element.getAttribute("host");
if (!StringUtils.hasText(host)) {
host = "localhost";
}
String port = element.getAttribute("port");
if (!StringUtils.hasText(port)) {
port = "27017";
}
// Database name
String dbname = element.getAttribute("dbname");
if (!StringUtils.hasText(dbname)) {
dbname = "db";
}
// com.mongodb.Mongo object
String mongoRef = element.getAttribute("mongo-ref");
if (!StringUtils.hasText(mongoRef)) {
//Create implicit com.mongodb.Mongo object and register under generated name
BeanDefinitionBuilder mongoBuilder = BeanDefinitionBuilder.genericBeanDefinition(MongoFactoryBean.class);
mongoBuilder.addPropertyValue("host", host);
mongoBuilder.addPropertyValue("port", port);
mongoRef = BeanDefinitionReaderUtils.registerWithGeneratedName(mongoBuilder.getBeanDefinition(), parserContext.getRegistry());
}
dbFactoryBuilder.addConstructorArgValue(new RuntimeBeanReference(mongoRef));
dbFactoryBuilder.addConstructorArgValue(dbname);
dbFactoryBuilder.addConstructorArgValue(userCredentialsBuilder.getBeanDefinition());
return dbFactoryBuilder.getRawBeanDefinition();
}
}

View File

@@ -1,97 +0,0 @@
/*
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.config;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedList;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.document.mongodb.MongoOptionsFactoryBean;
import org.springframework.util.StringUtils;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
import com.mongodb.ServerAddress;
abstract class ParsingUtils {
/**
* Parses the mongo replica-set element.
* @param parserContext the parser context
* @param element the mongo element
* @param mongoBuilder the bean definition builder to populate
* @return true if parsing actually occurred, false otherwise
*/
static boolean parseReplicaSet(ParserContext parserContext, Element element, BeanDefinitionBuilder mongoBuilder) {
String replicaSetString = element.getAttribute("replica-set");
if (StringUtils.hasText(replicaSetString)) {
ManagedList<Object> serverAddresses = new ManagedList<Object>();
String[] replicaSetStringArray = StringUtils.commaDelimitedListToStringArray(replicaSetString);
for (int i = 0; i < replicaSetStringArray.length; i++) {
String[] hostAndPort = StringUtils.delimitedListToStringArray(replicaSetStringArray[i], ":");
BeanDefinitionBuilder defBuilder = BeanDefinitionBuilder.genericBeanDefinition(ServerAddress.class);
defBuilder.addConstructorArgValue(hostAndPort[0]);
defBuilder.addConstructorArgValue(hostAndPort[1]);
serverAddresses.add(defBuilder.getBeanDefinition());
}
if (!serverAddresses.isEmpty()) {
mongoBuilder.addPropertyValue("replicaSetSeeds", serverAddresses);
}
}
return true;
}
/**
* Parses the mongo:options sub-element. Populates the given attribute factory with the proper attributes.
*
* @return true if parsing actually occurred, false otherwise
*/
static boolean parseMongoOptions(ParserContext parserContext, Element element, BeanDefinitionBuilder mongoBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "options");
if (optionsElement == null) {
return false;
}
BeanDefinitionBuilder optionsDefBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoOptionsFactoryBean.class);
setPropertyValue(optionsElement, optionsDefBuilder, "connections-per-host", "connectionsPerHost");
setPropertyValue(optionsElement, optionsDefBuilder, "threads-allowed-to-block-for-connection-multiplier",
"threadsAllowedToBlockForConnectionMultiplier");
setPropertyValue(optionsElement, optionsDefBuilder, "max-wait-time", "maxWaitTime");
setPropertyValue(optionsElement, optionsDefBuilder, "connect-timeout", "connectTimeout");
setPropertyValue(optionsElement, optionsDefBuilder, "socket-timeout", "socketTimeout");
setPropertyValue(optionsElement, optionsDefBuilder, "socket-keep-alive", "socketKeepAlive");
setPropertyValue(optionsElement, optionsDefBuilder, "auto-connect-retry", "autoConnectRetry");
setPropertyValue(optionsElement, optionsDefBuilder, "write-number", "writeNumber");
setPropertyValue(optionsElement, optionsDefBuilder, "write-timeout", "writeTimeout");
setPropertyValue(optionsElement, optionsDefBuilder, "write-fsync", "writeFsync");
setPropertyValue(optionsElement, optionsDefBuilder, "slave-ok", "slaveOk");
mongoBuilder.addPropertyValue("mongoOptions", optionsDefBuilder.getBeanDefinition());
return true;
}
static void setPropertyValue(Element element, BeanDefinitionBuilder builder, String attrName, String propertyName) {
String attr = element.getAttribute(attrName);
if (StringUtils.hasText(attr)) {
builder.addPropertyValue(propertyName, attr);
}
}
}

View File

@@ -1,258 +0,0 @@
/*
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.convert;
import static org.springframework.data.mapping.MappingBeanHelper.isSimpleType;
import java.math.BigInteger;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Date;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Iterator;
import java.util.List;
import java.util.Locale;
import java.util.Map;
import java.util.Set;
import org.bson.types.ObjectId;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.core.GenericTypeResolver;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.TypeDescriptor;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.convert.converter.ConverterFactory;
import org.springframework.core.convert.converter.GenericConverter;
import org.springframework.core.convert.converter.GenericConverter.ConvertiblePair;
import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.document.mongodb.convert.ObjectIdConverters.BigIntegerToObjectIdConverter;
import org.springframework.data.document.mongodb.convert.ObjectIdConverters.ObjectIdToBigIntegerConverter;
import org.springframework.data.document.mongodb.convert.ObjectIdConverters.ObjectIdToStringConverter;
import org.springframework.data.document.mongodb.convert.ObjectIdConverters.StringToObjectIdConverter;
import org.springframework.data.mapping.MappingBeanHelper;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
*/
public abstract class AbstractMongoConverter implements MongoConverter, InitializingBean {
@SuppressWarnings({ "unchecked" })
private static final List<Class<?>> MONGO_TYPES = Arrays.asList(Number.class, Date.class, String.class,
DBObject.class);
protected final GenericConversionService conversionService;
private final Set<ConvertiblePair> customTypeMapping = new HashSet<ConvertiblePair>();
public AbstractMongoConverter(GenericConversionService conversionService) {
this.conversionService = conversionService == null ? ConversionServiceFactory.createDefaultConversionService()
: conversionService;
this.conversionService.removeConvertible(Object.class, String.class);
registerConverter(CustomToStringConverter.INSTANCE);
}
/**
* Add custom {@link Converter} or {@link ConverterFactory} instances to be used that will take precedence over
* metadata-driven conversion of objects to/from DBObject
*
* @param converters
*/
public void setCustomConverters(Set<?> converters) {
if (null != converters) {
for (Object c : converters) {
registerConverter(c);
}
}
}
/**
* Registers converters for {@link ObjectId} handling, removes plain {@link #toString()} converter and promotes the
* configured {@link ConversionService} to {@link MappingBeanHelper}.
*/
private void initializeConverters() {
if (!conversionService.canConvert(ObjectId.class, String.class)) {
conversionService.addConverter(ObjectIdToStringConverter.INSTANCE);
}
if (!conversionService.canConvert(String.class, ObjectId.class)) {
conversionService.addConverter(StringToObjectIdConverter.INSTANCE);
}
if (!conversionService.canConvert(ObjectId.class, BigInteger.class)) {
conversionService.addConverter(ObjectIdToBigIntegerConverter.INSTANCE);
}
if (!conversionService.canConvert(BigInteger.class, ObjectId.class)) {
conversionService.addConverter(BigIntegerToObjectIdConverter.INSTANCE);
}
}
/**
* Inspects the given {@link Converter} for the types it can convert and registers the pair for custom type conversion
* in case the target type is a Mongo basic type.
*
* @param converter
*/
private void registerConverter(Object converter) {
if (converter instanceof GenericConverter) {
customTypeMapping.addAll(((GenericConverter) converter).getConvertibleTypes());
} else {
Class<?>[] arguments = GenericTypeResolver.resolveTypeArguments(converter.getClass(), Converter.class);
if (MONGO_TYPES.contains(arguments[1]) || MONGO_TYPES.contains(arguments[0])) {
customTypeMapping.add(new ConvertiblePair(arguments[0], arguments[1]));
}
}
boolean added = false;
if (converter instanceof Converter) {
this.conversionService.addConverter((Converter<?, ?>) converter);
added = true;
}
if (converter instanceof ConverterFactory) {
this.conversionService.addConverterFactory((ConverterFactory<?, ?>) converter);
added = true;
}
if (converter instanceof GenericConverter) {
this.conversionService.addConverter((GenericConverter) converter);
added = true;
}
if (!added) {
throw new IllegalArgumentException("Given set contains element that is neither Converter nor ConverterFactory!");
}
}
protected Class<?> getCustomTarget(Class<?> source, Class<?> expectedTargetType) {
for (ConvertiblePair typePair : customTypeMapping) {
if (typePair.getSourceType().isAssignableFrom(source)) {
Class<?> targetType = typePair.getTargetType();
if (targetType.equals(expectedTargetType) || expectedTargetType == null) {
return targetType;
}
}
}
return null;
}
/*
* (non-Javadoc)
* @see org.springframework.data.document.mongodb.convert.MongoConverter#getConversionService()
*/
public ConversionService getConversionService() {
return conversionService;
}
/* (non-Javadoc)
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
*/
public void afterPropertiesSet() {
initializeConverters();
}
@SuppressWarnings("unchecked")
public Object maybeConvertObject(Object obj) {
if (obj instanceof Enum<?>) {
return ((Enum<?>) obj).name();
}
if (null != obj && isSimpleType(obj.getClass())) {
// Doesn't need conversion
return obj;
}
if (obj instanceof BasicDBList) {
return maybeConvertList((BasicDBList) obj);
}
if (obj instanceof DBObject) {
DBObject newValueDbo = new BasicDBObject();
for (String vk : ((DBObject) obj).keySet()) {
Object o = ((DBObject) obj).get(vk);
newValueDbo.put(vk, maybeConvertObject(o));
}
return newValueDbo;
}
if (obj instanceof Map) {
Map<Object, Object> m = new HashMap<Object, Object>();
for (Map.Entry<Object, Object> entry : ((Map<Object, Object>) obj).entrySet()) {
m.put(entry.getKey(), maybeConvertObject(entry.getValue()));
}
return m;
}
if (obj instanceof List) {
List<?> l = (List<?>) obj;
List<Object> newList = new ArrayList<Object>();
for (Object o : l) {
newList.add(maybeConvertObject(o));
}
return newList;
}
if (obj.getClass().isArray()) {
return maybeConvertArray((Object[]) obj);
}
DBObject newDbo = new BasicDBObject();
this.write(obj, newDbo);
return newDbo;
}
public Object[] maybeConvertArray(Object[] src) {
Object[] newArr = new Object[src.length];
for (int i = 0; i < src.length; i++) {
newArr[i] = maybeConvertObject(src[i]);
}
return newArr;
}
public BasicDBList maybeConvertList(BasicDBList dbl) {
BasicDBList newDbl = new BasicDBList();
Iterator<?> iter = dbl.iterator();
while (iter.hasNext()) {
Object o = iter.next();
newDbl.add(maybeConvertObject(o));
}
return newDbl;
}
private enum CustomToStringConverter implements GenericConverter {
INSTANCE;
public Set<ConvertiblePair> getConvertibleTypes() {
ConvertiblePair localeToString = new ConvertiblePair(Locale.class, String.class);
ConvertiblePair characterToString = new ConvertiblePair(Character.class, String.class);
return new HashSet<ConvertiblePair>(Arrays.asList(localeToString, characterToString));
}
public Object convert(Object source, TypeDescriptor sourceType, TypeDescriptor targetType) {
return source.toString();
}
}
}
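
A custom Converter whose source or target is one of the Mongo basic types listed above (String here) ends up in the custom type mapping. The sketch uses the deprecated SimpleMongoConverter as a concrete subclass; the Email value type and its converter are entirely hypothetical.

import java.util.Collections;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.document.mongodb.convert.SimpleMongoConverter;

public class CustomConverterExample {

  // Hypothetical value object and converter, for illustration only.
  static class Email {
    final String address;
    Email(String address) { this.address = address; }
  }

  static class EmailToStringConverter implements Converter<Email, String> {
    public String convert(Email source) {
      return source.address; // persist the e-mail address as a plain string
    }
  }

  public static void main(String[] args) {
    SimpleMongoConverter converter = new SimpleMongoConverter();
    // Registers the Email -> String pair; String is one of the Mongo basic types.
    converter.setCustomConverters(Collections.singleton(new EmailToStringConverter()));
    converter.afterPropertiesSet(); // also registers the ObjectId <-> String/BigInteger converters
  }
}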

View File

@@ -1,94 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.convert;
import static org.springframework.beans.PropertyAccessorFactory.forBeanPropertyAccess;
import static org.springframework.beans.PropertyAccessorFactory.forDirectFieldAccess;
import org.springframework.beans.BeanWrapper;
import org.springframework.beans.ConfigurablePropertyAccessor;
import org.springframework.beans.NotWritablePropertyException;
import org.springframework.core.convert.ConversionService;
import org.springframework.data.document.mongodb.MongoPropertyDescriptors;
import org.springframework.data.document.mongodb.MongoPropertyDescriptors.MongoPropertyDescriptor;
import org.springframework.util.Assert;
/**
* Custom Mongo specific {@link BeanWrapper} to allow access to bean properties via {@link MongoPropertyDescriptor}s.
*
* @author Oliver Gierke
*/
class MongoBeanWrapper {
private final ConfigurablePropertyAccessor accessor;
private final MongoPropertyDescriptors descriptors;
private final boolean fieldAccess;
/**
* Creates a new {@link MongoBeanWrapper} for the given target object and {@link ConversionService}.
*
* @param target
* @param conversionService
* @param fieldAccess
*/
public MongoBeanWrapper(Object target, ConversionService conversionService, boolean fieldAccess) {
Assert.notNull(target);
Assert.notNull(conversionService);
this.fieldAccess = fieldAccess;
this.accessor = fieldAccess ? forDirectFieldAccess(target) : forBeanPropertyAccess(target);
this.accessor.setConversionService(conversionService);
this.descriptors = new MongoPropertyDescriptors(target.getClass());
}
/**
* Returns all {@link MongoPropertyDescriptors.MongoPropertyDescriptor}s for the underlying target object.
*
* @return
*/
public MongoPropertyDescriptors getDescriptors() {
return this.descriptors;
}
/**
* Returns the value of the underlying object for the given property.
*
* @param descriptor
* @return
*/
public Object getValue(MongoPropertyDescriptors.MongoPropertyDescriptor descriptor) {
Assert.notNull(descriptor);
return accessor.getPropertyValue(descriptor.getName());
}
/**
* Sets the property of the underlying object to the given value.
*
* @param descriptor
* @param value
*/
public void setValue(MongoPropertyDescriptors.MongoPropertyDescriptor descriptor, Object value) {
Assert.notNull(descriptor);
try {
accessor.setPropertyValue(descriptor.getName(), value);
} catch (NotWritablePropertyException e) {
if (!fieldAccess) {
throw e;
}
}
}
}

View File

@@ -1,59 +0,0 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.convert;
import com.mongodb.BasicDBList;
import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionService;
import org.springframework.data.document.mongodb.MongoReader;
import org.springframework.data.document.mongodb.MongoWriter;
import org.springframework.data.document.mongodb.mapping.MongoPersistentEntity;
import org.springframework.data.document.mongodb.mapping.MongoPersistentProperty;
import org.springframework.data.mapping.model.MappingContext;
public interface MongoConverter extends MongoWriter<Object>, MongoReader<Object> {
/**
* Converts the given {@link ObjectId} to the given target type.
*
* @param <T>
* the actual type to create
* @param id
* the source {@link ObjectId}
* @param targetType
* the target type to convert the {@link ObjectId} to
* @return
*/
public <T> T convertObjectId(ObjectId id, Class<T> targetType);
/**
* Returns the {@link ObjectId} instance for the given id.
*
* @param id
* @return
*/
public ObjectId convertObjectId(Object id);
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> getMappingContext();
Object maybeConvertObject(Object obj);
Object[] maybeConvertArray(Object[] src);
BasicDBList maybeConvertList(BasicDBList dbl);
ConversionService getConversionService();
}

View File

@@ -1,516 +0,0 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.convert;
import java.lang.reflect.Array;
import java.lang.reflect.GenericArrayType;
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.lang.reflect.TypeVariable;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.Date;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Locale;
import java.util.Map;
import java.util.Map.Entry;
import java.util.Set;
import java.util.regex.Pattern;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.bson.types.CodeWScope;
import org.bson.types.ObjectId;
import org.springframework.beans.BeanUtils;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.core.CollectionFactory;
import org.springframework.core.convert.ConversionFailedException;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.data.document.mongodb.MongoPropertyDescriptors.MongoPropertyDescriptor;
import org.springframework.data.document.mongodb.mapping.MongoPersistentEntity;
import org.springframework.data.document.mongodb.mapping.MongoPersistentProperty;
import org.springframework.data.document.mongodb.mapping.SimpleMongoMappingContext;
import org.springframework.data.mapping.model.MappingContext;
import org.springframework.util.Assert;
import org.springframework.util.comparator.CompoundComparator;
/**
* Basic {@link MongoConverter} implementation to convert between domain classes and {@link DBObject}s.
*
* @author Mark Pollack
* @author Thomas Risberg
* @author Oliver Gierke
*
* @deprecated since Spring 1.0 M3 in favor of {@link org.springframework.data.document.mongodb.convert.MappingMongoConverter}
* The MappingMongoConverter provides all the functionality of the SimpleMongoConverter and will replace it as the default
* converter used. The SimpleMongoConverter will be removed at some point before the GA release.
*/
@Deprecated
public class SimpleMongoConverter extends AbstractMongoConverter implements InitializingBean {
private static final Log LOG = LogFactory.getLog(SimpleMongoConverter.class);
@SuppressWarnings("unchecked")
private static final List<Class<?>> MONGO_TYPES = Arrays.asList(Number.class, Date.class, String.class,
DBObject.class);
private static final Set<String> SIMPLE_TYPES;
static {
Set<String> basics = new HashSet<String>();
basics.add(boolean.class.getName());
basics.add(long.class.getName());
basics.add(short.class.getName());
basics.add(int.class.getName());
basics.add(byte.class.getName());
basics.add(float.class.getName());
basics.add(double.class.getName());
basics.add(char.class.getName());
basics.add(Boolean.class.getName());
basics.add(Long.class.getName());
basics.add(Short.class.getName());
basics.add(Integer.class.getName());
basics.add(Byte.class.getName());
basics.add(Float.class.getName());
basics.add(Double.class.getName());
basics.add(Character.class.getName());
basics.add(String.class.getName());
basics.add(java.util.Date.class.getName());
// basics.add(Time.class.getName());
// basics.add(Timestamp.class.getName());
// basics.add(java.sql.Date.class.getName());
// basics.add(BigDecimal.class.getName());
// basics.add(BigInteger.class.getName());
basics.add(Locale.class.getName());
// basics.add(Calendar.class.getName());
// basics.add(GregorianCalendar.class.getName());
// basics.add(java.util.Currency.class.getName());
// basics.add(TimeZone.class.getName());
// basics.add(Object.class.getName());
basics.add(Class.class.getName());
// basics.add(byte[].class.getName());
// basics.add(Byte[].class.getName());
// basics.add(char[].class.getName());
// basics.add(Character[].class.getName());
// basics.add(Blob.class.getName());
// basics.add(Clob.class.getName());
// basics.add(Serializable.class.getName());
// basics.add(URI.class.getName());
// basics.add(URL.class.getName());
basics.add(DBRef.class.getName());
basics.add(Pattern.class.getName());
basics.add(CodeWScope.class.getName());
basics.add(ObjectId.class.getName());
basics.add(Enum.class.getName());
SIMPLE_TYPES = Collections.unmodifiableSet(basics);
}
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
/**
* Creates a {@link SimpleMongoConverter}.
*/
public SimpleMongoConverter() {
super(ConversionServiceFactory.createDefaultConversionService());
this.mappingContext = new SimpleMongoMappingContext();
}
/* (non-Javadoc)
* @see org.springframework.data.document.mongodb.convert.MongoConverter#getMappingContext()
*/
public MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> getMappingContext() {
return mappingContext;
}
/*
* (non-Javadoc)
*
* @see org.springframework.data.document.mongodb.MongoWriter#write(java.lang.Object, com.mongodb.DBObject)
*/
@SuppressWarnings("rawtypes")
public void write(Object obj, DBObject dbo) {
MongoBeanWrapper beanWrapper = createWrapper(obj, false);
for (MongoPropertyDescriptor descriptor : beanWrapper.getDescriptors()) {
if (descriptor.isMappable()) {
Object value = beanWrapper.getValue(descriptor);
if (value == null) {
continue;
}
String keyToUse = descriptor.getKeyToMap();
if (descriptor.isEnum()) {
writeValue(dbo, keyToUse, ((Enum) value).name());
} else if (descriptor.isIdProperty() && descriptor.isOfIdType()) {
if (value instanceof String && ObjectId.isValid((String) value)) {
try {
writeValue(dbo, keyToUse, conversionService.convert(value, ObjectId.class));
} catch (ConversionFailedException iae) {
LOG.warn("Unable to convert the String " + value + " to an ObjectId");
writeValue(dbo, keyToUse, value);
}
} else {
// we can't convert this id - use as is
writeValue(dbo, keyToUse, value);
}
} else {
writeValue(dbo, keyToUse, value);
}
} else {
if (!"class".equals(descriptor.getName())) {
LOG.debug("Skipping property " + descriptor.getName() + " as it's not a mappable one.");
}
}
}
}
/**
* Writes the given value to the given {@link DBObject}. Will skip {@literal null} values.
*
* @param dbo
* @param keyToUse
* @param value
*/
private void writeValue(DBObject dbo, String keyToUse, Object value) {
if (!isSimpleType(value.getClass())) {
writeCompoundValue(dbo, keyToUse, value);
} else {
dbo.put(keyToUse, value);
}
}
/**
* Writes the given compound (non-simple) value to the given {@link DBObject}.
*
* @param dbo
* @param keyToUse
* @param value
*/
@SuppressWarnings("unchecked")
private void writeCompoundValue(DBObject dbo, String keyToUse, Object value) {
if (value instanceof Map) {
writeMap(dbo, keyToUse, (Map<String, Object>) value);
return;
}
if (value instanceof Collection) {
// Should write a collection!
writeArray(dbo, keyToUse, ((Collection<Object>) value).toArray());
return;
}
if (value instanceof Object[]) {
// Should write an array!
writeArray(dbo, keyToUse, (Object[]) value);
return;
}
Class<?> customTargetType = getCustomTargetType(value);
if (customTargetType != null) {
dbo.put(keyToUse, conversionService.convert(value, customTargetType));
return;
}
DBObject nestedDbo = new BasicDBObject();
write(value, nestedDbo);
dbo.put(keyToUse, nestedDbo);
}
/**
* Returns the MongoDB-compatible type the given object can be converted into by a custom {@link Converter}
* registered with the {@link ConversionService}, or {@literal null} if no such converter is available.
*
* @param obj
* @return
*/
private Class<?> getCustomTargetType(Object obj) {
for (Class<?> mongoType : MONGO_TYPES) {
if (conversionService.canConvert(obj.getClass(), mongoType)) {
return mongoType;
}
}
return null;
}
/**
* Writes the given {@link Map} to the given {@link DBObject}.
*
* @param dbo
* @param mapKey
* @param map
*/
protected void writeMap(DBObject dbo, String mapKey, Map<String, Object> map) {
// TODO support non-string based keys as long as there is a Spring Converter obj->string and (optionally)
// string->obj
DBObject dboToPopulate = null;
// TODO - Does that make sense? If we create a new object here its content will never make it out of this
// method
if (mapKey != null) {
dboToPopulate = new BasicDBObject();
} else {
dboToPopulate = dbo;
}
if (map != null) {
for (Entry<String, Object> entry : map.entrySet()) {
Object entryValue = entry.getValue();
String entryKey = entry.getKey();
if (!isSimpleType(entryValue.getClass())) {
writeCompoundValue(dboToPopulate, entryKey, entryValue);
} else {
dboToPopulate.put(entryKey, entryValue);
}
}
dbo.put(mapKey, dboToPopulate);
}
}
/**
* Writes the given array to the given {@link DBObject}.
*
* @param dbo
* @param keyToUse
* @param array
*/
protected void writeArray(DBObject dbo, String keyToUse, Object[] array) {
Object[] dboValues;
if (array != null) {
dboValues = new Object[array.length];
int i = 0;
for (Object o : array) {
if (!isSimpleType(o.getClass())) {
DBObject dboValue = new BasicDBObject();
write(o, dboValue);
dboValues[i] = dboValue;
} else {
dboValues[i] = o;
}
i++;
}
dbo.put(keyToUse, dboValues);
}
}
/*
* (non-Javadoc)
*
* @see org.springframework.data.document.mongodb.MongoReader#read(java.lang.Class, com.mongodb.DBObject)
*/
public <S> S read(Class<S> clazz, DBObject source) {
if (source == null) {
return null;
}
Assert.notNull(clazz, "Mapped class was not specified");
S target = BeanUtils.instantiateClass(clazz);
MongoBeanWrapper bw = new MongoBeanWrapper(target, conversionService, true);
for (MongoPropertyDescriptor descriptor : bw.getDescriptors()) {
String keyToUse = descriptor.getKeyToMap();
if (source.containsField(keyToUse)) {
if (descriptor.isMappable()) {
Object value = source.get(keyToUse);
if (!isSimpleType(value.getClass())) {
if (value instanceof Object[]) {
bw.setValue(descriptor, readCollection(descriptor, Arrays.asList((Object[]) value)).toArray());
} else if (value instanceof BasicDBList) {
bw.setValue(descriptor, readCollection(descriptor, (BasicDBList) value));
} else if (value instanceof DBObject) {
bw.setValue(descriptor, readCompoundValue(descriptor, (DBObject) value));
} else {
LOG.warn("Unable to map compound DBObject field " + keyToUse + " to property " + descriptor.getName()
+ ". The field value should have been a 'DBObject.class' but was " + value.getClass().getName());
}
} else {
bw.setValue(descriptor, value);
}
} else {
LOG.warn("Unable to map DBObject field " + keyToUse + " to property " + descriptor.getName() + ". Skipping.");
}
}
}
return target;
}
/**
* Reads the given collection values (that are {@link DBObject}s potentially) into a {@link Collection} of domain
* objects.
*
* @param descriptor
* @param values
* @return
*/
private Collection<Object> readCollection(MongoPropertyDescriptor descriptor, Collection<?> values) {
Class<?> targetCollectionType = descriptor.getPropertyType();
boolean targetIsArray = targetCollectionType.isArray();
@SuppressWarnings("unchecked")
Collection<Object> result = targetIsArray ? new ArrayList<Object>(values.size()) : CollectionFactory
.createCollection(targetCollectionType, values.size());
for (Object o : values) {
if (o instanceof DBObject) {
Class<?> type;
if (targetIsArray) {
type = targetCollectionType.getComponentType();
} else {
type = getGenericParameters(descriptor.getTypeToSet()).get(0);
}
result.add(read(type, (DBObject) o));
} else {
result.add(o);
}
}
return result;
}
/**
* Reads a compound value from the given {@link DBObject} for the given property.
*
* @param pd
* @param dbo
* @return
*/
private Object readCompoundValue(MongoPropertyDescriptor pd, DBObject dbo) {
Assert.isTrue(!pd.isCollection(), "Collections not supported!");
if (pd.isMap()) {
return readMap(pd, dbo, getGenericParameters(pd.getTypeToSet()).get(1));
} else {
return read(pd.getPropertyType(), dbo);
}
}
/**
* Create a {@link Map} instance. Will return a {@link HashMap} by default. Subclasses might want to override this
* method to use a custom {@link Map} implementation.
*
* @return
*/
protected Map<String, Object> createMap() {
return new HashMap<String, Object>();
}
/**
* Reads every key/value pair from the {@link DBObject} into a {@link Map} instance.
*
* @param pd
* @param dbo
* @param targetType
* @return
*/
protected Map<?, ?> readMap(MongoPropertyDescriptor pd, DBObject dbo, Class<?> targetType) {
Map<String, Object> map = createMap();
for (String key : dbo.keySet()) {
Object value = dbo.get(key);
if (!isSimpleType(value.getClass())) {
map.put(key, read(targetType, (DBObject) value));
// Can do some reflection tricks here -
// throw new RuntimeException("User types not supported yet as values for Maps");
} else {
map.put(key, conversionService.convert(value, targetType));
}
}
return map;
}
protected static boolean isSimpleType(Class<?> propertyType) {
if (propertyType == null) {
return false;
}
if (propertyType.isArray()) {
return isSimpleType(propertyType.getComponentType());
}
return SIMPLE_TYPES.contains(propertyType.getName());
}
/**
* Callback to allow customizing creation of a {@link MongoBeanWrapper}.
*
* @param target
* the target object to wrap
* @param fieldAccess
* whether to use field access or property access
* @return
*/
protected MongoBeanWrapper createWrapper(Object target, boolean fieldAccess) {
return new MongoBeanWrapper(target, conversionService, fieldAccess);
}
public List<Class<?>> getGenericParameters(Type genericParameterType) {
List<Class<?>> actualGenericParameterTypes = new ArrayList<Class<?>>();
if (genericParameterType instanceof ParameterizedType) {
ParameterizedType aType = (ParameterizedType) genericParameterType;
Type[] parameterArgTypes = aType.getActualTypeArguments();
for (Type parameterArgType : parameterArgTypes) {
if (parameterArgType instanceof GenericArrayType) {
Class<?> arrayType = (Class<?>) ((GenericArrayType) parameterArgType).getGenericComponentType();
actualGenericParameterTypes.add(Array.newInstance(arrayType, 0).getClass());
} else {
if (parameterArgType instanceof ParameterizedType) {
ParameterizedType paramTypeArgs = (ParameterizedType) parameterArgType;
actualGenericParameterTypes.add((Class<?>) paramTypeArgs.getRawType());
} else {
if (parameterArgType instanceof TypeVariable) {
throw new RuntimeException("Can not map " + ((TypeVariable<?>) parameterArgType).getName());
} else {
if (parameterArgType instanceof Class) {
actualGenericParameterTypes.add((Class<?>) parameterArgType);
} else {
throw new RuntimeException("Can not map " + parameterArgType);
}
}
}
}
}
}
return actualGenericParameterTypes;
}
/* (non-Javadoc)
* @see org.springframework.data.document.mongodb.convert.MongoConverter#convertObjectId(org.bson.types.ObjectId, java.lang.Class)
*/
public <T> T convertObjectId(ObjectId id, Class<T> targetType) {
return conversionService.convert(id, targetType);
}
/* (non-Javadoc)
* @see org.springframework.data.document.mongodb.convert.MongoConverter#convertObjectId(java.lang.Object)
*/
public ObjectId convertObjectId(Object id) {
return conversionService.convert(id, ObjectId.class);
}
}
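
A write/read round trip through this (deprecated) converter, assuming a simple, hypothetical Person bean with a default constructor, backing fields and getters/setters:

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import org.springframework.data.document.mongodb.convert.SimpleMongoConverter;

public class SimpleConverterRoundTrip {

  // Hypothetical domain class for the example.
  public static class Person {
    private String id;
    private String name;
    private int age;
    public Person() {}
    public Person(String id, String name, int age) { this.id = id; this.name = name; this.age = age; }
    public String getId() { return id; }
    public void setId(String id) { this.id = id; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }
  }

  public static void main(String[] args) {
    SimpleMongoConverter converter = new SimpleMongoConverter();
    converter.afterPropertiesSet();

    DBObject dbo = new BasicDBObject();
    converter.write(new Person("4711", "Dave", 42), dbo); // the id property is written under the "_id" key
    Person copy = converter.read(Person.class, dbo);
    System.out.println(copy.getName() + ", " + copy.getAge());
  }
}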

View File

@@ -1,56 +0,0 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.geo;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.util.Assert;
/**
* Represents a geospatial circle value
*
* @author Mark Pollack
* @author Oliver Gierke
*/
public class Circle {
private Point center;
private double radius;
@PersistenceConstructor
public Circle(Point center, double radius) {
Assert.notNull(center);
Assert.isTrue(radius >= 0, "Radius must not be negative!");
this.center = center;
this.radius = radius;
}
public Circle(double centerX, double centerY, double radius) {
this(new Point(centerX, centerY), radius);
}
public Point getCenter() {
return center;
}
public double getRadius() {
return radius;
}
@Override
public String toString() {
return String.format("Circle [center=%s, radius=%d]", center, radius);
}
}
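
Construction works either from a Point or from raw coordinates, and the radius must be non-negative; the coordinates below are arbitrary illustration values.

import org.springframework.data.document.mongodb.geo.Circle;

public class CircleExample {
  public static void main(String[] args) {
    Circle circle = new Circle(-73.99171, 40.738868, 0.01); // centerX, centerY, radius
    System.out.println(circle.getCenter() + " / " + circle.getRadius());
    // new Circle(new Point(0, 0), -1) would fail the "Radius must not be negative!" assertion.
  }
}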

View File

@@ -1,80 +0,0 @@
/*
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.mapping;
import org.springframework.data.document.mongodb.MongoCollectionUtils;
import org.springframework.data.mapping.BasicPersistentEntity;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mapping.model.PersistentEntity;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.StringUtils;
/**
* Mongo specific {@link PersistentEntity} implementation that adds Mongo specific meta-data such as the collection name
* and the like.
*
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Oliver Gierke
*/
public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, MongoPersistentProperty> implements
MongoPersistentEntity<T> {
private final String collection;
private final boolean isRootEntity;
/**
* Creates a new {@link BasicMongoPersistentEntity} with the given {@link TypeInformation}. Will default the
* collection name to the one derived from the entity's simple type name.
*
* @param typeInformation
*/
public BasicMongoPersistentEntity(TypeInformation<T> typeInformation) {
super(typeInformation);
Class<?> rawType = typeInformation.getType();
String fallback = MongoCollectionUtils.getPreferredCollectionName(rawType);
if (rawType.isAnnotationPresent(Document.class)) {
Document d = rawType.getAnnotation(Document.class);
this.collection = StringUtils.hasText(d.collection()) ? d.collection() : fallback;
this.isRootEntity = true;
} else {
this.collection = fallback;
this.isRootEntity = false;
}
}
/**
* Returns the collection the entity should be stored in.
*
* @return
*/
public String getCollection() {
return collection;
}
/* (non-Javadoc)
* @see org.springframework.data.mapping.BasicPersistentEntity#verify()
*/
@Override
public void verify() {
if (isRootEntity && idProperty == null) {
throw new MappingException(String.format("Root entity %s has to have an id property!", getType().getName()));
}
}
}
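
The collection name resolution can be seen from a small, hypothetical domain class: with the annotation below the entity is stored in "people"; without @Document the fallback from MongoCollectionUtils (typically the decapitalized simple name, "person") applies, and as a root entity it must expose an id property or verify() fails.

import org.springframework.data.annotation.Id;
import org.springframework.data.document.mongodb.mapping.Document;

// Hypothetical domain type used only to illustrate the rules implemented above.
@Document(collection = "people")
public class Person {

  @Id
  private String id; // required for root entities, see verify() above

  private String name;
}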

View File

@@ -1,36 +0,0 @@
/*
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.mapping;
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
/**
* Annotation to allow defining the name of the field a property should use in a Mongo document. This will cause the
* annotated property to be persisted to a field with the configured name as well as being read from it.
*
* @author Oliver Gierke
*/
@Retention(RetentionPolicy.RUNTIME)
@Target({ ElementType.FIELD })
@Documented
public @interface FieldName {
String value();
}
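
Applied to a field of a hypothetical mapped class, the annotation changes the document key used for both writing and reading:

import org.springframework.data.document.mongodb.mapping.FieldName;

public class Contact {

  @FieldName("fn")
  private String firstname; // stored under the key "fn" instead of "firstname"

  private String lastname;  // stored under its property name, "lastname"
}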

View File

@@ -1,61 +0,0 @@
/*
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.mapping;
import java.beans.PropertyDescriptor;
import java.lang.reflect.Field;
import java.math.BigInteger;
import java.util.Set;
import org.bson.types.CodeWScope;
import org.bson.types.ObjectId;
import org.springframework.data.mapping.AbstractMappingContext;
import org.springframework.data.mapping.MappingBeanHelper;
import org.springframework.data.util.TypeInformation;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
*/
public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersistentEntity<?>, MongoPersistentProperty> {
public MongoMappingContext() {
augmentSimpleTypes();
}
protected void augmentSimpleTypes() {
// Augment simpleTypes with MongoDB-specific classes
Set<Class<?>> simpleTypes = MappingBeanHelper.getSimpleTypes();
simpleTypes.add(com.mongodb.DBRef.class);
simpleTypes.add(ObjectId.class);
simpleTypes.add(CodeWScope.class);
simpleTypes.add(Character.class);
simpleTypes.add(BigInteger.class);
}
@Override
public MongoPersistentProperty createPersistentProperty(Field field, PropertyDescriptor descriptor,
BasicMongoPersistentEntity<?> owner) {
return new BasicMongoPersistentProperty(field, descriptor, owner);
}
/* (non-Javadoc)
* @see org.springframework.data.mapping.BasicMappingContext#createPersistentEntity(org.springframework.data.util.TypeInformation, org.springframework.data.mapping.model.MappingContext)
*/
@Override
protected <T> BasicMongoPersistentEntity<T> createPersistentEntity(TypeInformation<T> typeInformation) {
return new BasicMongoPersistentEntity<T>(typeInformation);
}
}

View File

@@ -1,85 +0,0 @@
/*
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.mapping.event;
import com.mongodb.DBObject;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.context.ApplicationEvent;
import org.springframework.context.ApplicationListener;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
*/
public abstract class AbstractMappingEventListener<T extends ApplicationEvent, E> implements ApplicationListener<T> {
protected final Log log = LogFactory.getLog(getClass());
@SuppressWarnings({ "unchecked" })
public void onApplicationEvent(T appEvent) {
if (appEvent instanceof MongoMappingEvent) {
try {
MongoMappingEvent<E> event = (MongoMappingEvent<E>) appEvent;
if (event instanceof BeforeConvertEvent) {
onBeforeConvert(event.getSource());
} else if (event instanceof BeforeSaveEvent) {
onBeforeSave(event.getSource(), event.getDBObject());
} else if (event instanceof AfterSaveEvent) {
onAfterSave(event.getSource(), event.getDBObject());
} else if (event instanceof AfterLoadEvent) {
onAfterLoad((DBObject) event.getSource());
} else if (event instanceof AfterConvertEvent) {
onAfterConvert(event.getDBObject(), event.getSource());
}
} catch (ClassCastException e) {
// Not a mapping event for this entity, apparently.
// Just ignore it for now.
}
}
}
public void onBeforeConvert(E source) {
if (log.isDebugEnabled()) {
log.debug("onBeforeConvert(" + source + ")");
}
}
public void onBeforeSave(E source, DBObject dbo) {
if (log.isDebugEnabled()) {
log.debug("onBeforeSave(" + source + ", " + dbo + ")");
}
}
public void onAfterSave(E source, DBObject dbo) {
if (log.isDebugEnabled()) {
log.debug("onAfterSave(" + source + ", " + dbo + ")");
}
}
public void onAfterLoad(DBObject dbo) {
if (log.isDebugEnabled()) {
log.debug("onAfterLoad(" + dbo + ")");
}
}
public void onAfterConvert(DBObject dbo, E source) {
if (log.isDebugEnabled()) {
log.debug("onAfterConvert(" + dbo + "," + source + ")");
}
}
}
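For reference, a concrete listener built on this (now removed) base class only overrides the callbacks it cares about; the dispatch in onApplicationEvent(...) routes each event type to the matching override. A minimal sketch, with Person as a placeholder domain class:

import com.mongodb.DBObject;

// Person is a placeholder domain class, e.g. a mapped @Document type.
public class PersonSaveListener extends AbstractMappingEventListener<MongoMappingEvent<Person>, Person> {

    @Override
    public void onBeforeSave(Person source, DBObject dbo) {
        // Invoked via the BeforeSaveEvent branch of onApplicationEvent(...),
        // just before the DBObject is written to the collection.
        dbo.put("lastModified", System.currentTimeMillis());
    }
}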

View File

@@ -1,5 +0,0 @@
/**
* MongoDB specific JMX monitoring support.
*/
package org.springframework.data.document.mongodb.monitor;

View File

@@ -1,5 +0,0 @@
/**
* MongoDB core support.
*/
package org.springframework.data.document.mongodb;

View File

@@ -1,9 +0,0 @@
package org.springframework.data.document.mongodb.query;
public class NorQuery extends Query {
public NorQuery(Query... q) {
super.nor(q);
}
}

View File

@@ -1,51 +0,0 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.query;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import org.bson.types.BasicBSONList;
public class OrCriteria implements CriteriaDefinition {
Query[] queries = null;
public OrCriteria(Query[] queries) {
super();
this.queries = queries;
}
/*
* (non-Javadoc)
*
* @see org.springframework.datastore.document.mongodb.query.Criteria#
* getCriteriaObject(java.lang.String)
*/
public DBObject getCriteriaObject() {
DBObject dbo = new BasicDBObject();
BasicBSONList l = new BasicBSONList();
for (Query q : queries) {
l.add(q.getQueryObject());
}
dbo.put(getOperator(), l);
return dbo;
}
protected String getOperator() {
return "$or";
}
}

View File

@@ -1,9 +0,0 @@
package org.springframework.data.document.mongodb.query;
public class OrQuery extends Query {
public OrQuery(Query... q) {
super.or(q);
}
}
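Taken together with NorQuery and OrCriteria above, these convenience subclasses simply delegate to Query.or(...) and Query.nor(...). A hedged sketch of how they might be combined; both Query arguments are assumed to be built elsewhere:

public class QueryCompositionSketch {

    static Query anyOf(Query queryA, Query queryB) {
        // Resulting criteria document: { "$or" : [ <queryA object>, <queryB object> ] }
        return new OrQuery(queryA, queryB);
    }

    static Query noneOf(Query queryA, Query queryB) {
        // The $nor case is handled analogously to OrCriteria above.
        return new NorQuery(queryA, queryB);
    }
}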

View File

@@ -1,5 +0,0 @@
/**
* MongoDB specific query and update support.
*/
package org.springframework.data.document.mongodb.query;

View File

@@ -1,314 +0,0 @@
/*
* Copyright 2002-2010 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.repository;
import static org.springframework.data.querydsl.QueryDslUtils.*;
import java.io.Serializable;
import java.lang.reflect.Method;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.data.document.mongodb.MongoOperations;
import org.springframework.data.document.mongodb.MongoPropertyDescriptors.MongoPropertyDescriptor;
import org.springframework.data.document.mongodb.MongoTemplate;
import org.springframework.data.document.mongodb.mapping.MongoPersistentEntity;
import org.springframework.data.document.mongodb.mapping.MongoPersistentProperty;
import org.springframework.data.document.mongodb.query.Index;
import org.springframework.data.document.mongodb.query.Order;
import org.springframework.data.domain.Sort;
import org.springframework.data.mapping.model.MappingContext;
import org.springframework.data.querydsl.QueryDslPredicateExecutor;
import org.springframework.data.repository.Repository;
import org.springframework.data.repository.query.QueryLookupStrategy;
import org.springframework.data.repository.query.QueryLookupStrategy.Key;
import org.springframework.data.repository.query.RepositoryQuery;
import org.springframework.data.repository.query.parser.Part;
import org.springframework.data.repository.query.parser.Part.Type;
import org.springframework.data.repository.query.parser.PartTree;
import org.springframework.data.repository.support.QueryCreationListener;
import org.springframework.data.repository.support.RepositoryFactoryBeanSupport;
import org.springframework.data.repository.support.RepositoryFactorySupport;
import org.springframework.data.repository.support.RepositoryMetadata;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
/**
* {@link org.springframework.beans.factory.FactoryBean} to create {@link MongoRepository} instances.
*
* @author Oliver Gierke
*/
public class MongoRepositoryFactoryBean<T extends Repository<S, ID>, S, ID extends Serializable> extends
RepositoryFactoryBeanSupport<T, S, ID> {
private MongoTemplate template;
/**
* Configures the {@link MongoTemplate} to be used.
*
* @param template
* the template to set
*/
public void setTemplate(MongoTemplate template) {
this.template = template;
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.support.RepositoryFactoryBeanSupport
* #createRepositoryFactory()
*/
@Override
protected RepositoryFactorySupport createRepositoryFactory() {
MongoRepositoryFactory factory = new MongoRepositoryFactory(template);
factory.addQueryCreationListener(new IndexEnsuringQueryCreationListener(template));
return factory;
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.support.RepositoryFactoryBeanSupport
* #afterPropertiesSet()
*/
@Override
public void afterPropertiesSet() {
super.afterPropertiesSet();
Assert.notNull(template, "MongoTemplate must not be null!");
}
/**
* Factory to create {@link MongoRepository} instances.
*
* @author Oliver Gierke
*/
public static class MongoRepositoryFactory extends RepositoryFactorySupport {
private final MongoTemplate template;
private final EntityInformationCreator entityInformationCreator;
/**
* Creates a new {@link MongoRepositoryFactory} with the given {@link MongoTemplate}.
*
* @param template
* must not be {@literal null}
*/
public MongoRepositoryFactory(MongoTemplate template) {
Assert.notNull(template);
this.template = template;
this.entityInformationCreator = new EntityInformationCreator(template.getConverter()
.getMappingContext());
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.support.RepositoryFactorySupport
* #getRepositoryBaseClass()
*/
@Override
protected Class<?> getRepositoryBaseClass(RepositoryMetadata metadata) {
return isQueryDslRepository(metadata.getRepositoryInterface()) ? QueryDslMongoRepository.class
: SimpleMongoRepository.class;
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.support.RepositoryFactorySupport
* #getTargetRepository
* (org.springframework.data.repository.support.RepositoryMetadata)
*/
@Override
@SuppressWarnings({ "rawtypes", "unchecked" })
protected Object getTargetRepository(RepositoryMetadata metadata) {
Class<?> repositoryInterface = metadata.getRepositoryInterface();
MongoEntityInformation<?, Serializable> entityInformation = getEntityInformation(metadata.getDomainClass());
if (isQueryDslRepository(repositoryInterface)) {
return new QueryDslMongoRepository(entityInformation, template);
} else {
return new SimpleMongoRepository(entityInformation, template);
}
}
private static boolean isQueryDslRepository(Class<?> repositoryInterface) {
return QUERY_DSL_PRESENT && QueryDslPredicateExecutor.class.isAssignableFrom(repositoryInterface);
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.support.RepositoryFactorySupport
* #getQueryLookupStrategy
* (org.springframework.data.repository.query.QueryLookupStrategy.Key)
*/
@Override
protected QueryLookupStrategy getQueryLookupStrategy(Key key) {
return new MongoQueryLookupStrategy();
}
/**
* {@link QueryLookupStrategy} to create {@link PartTreeMongoQuery} or {@link StringBasedMongoQuery} instances.
*
* @author Oliver Gierke
*/
private class MongoQueryLookupStrategy implements QueryLookupStrategy {
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.query.QueryLookupStrategy
* #resolveQuery(java.lang.reflect.Method, java.lang.Class)
*/
public RepositoryQuery resolveQuery(Method method, RepositoryMetadata metadata) {
MongoQueryMethod queryMethod = new MongoQueryMethod(method, metadata, entityInformationCreator);
if (queryMethod.hasAnnotatedQuery()) {
return new StringBasedMongoQuery(queryMethod, template);
} else {
return new PartTreeMongoQuery(queryMethod, template);
}
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.support.RepositoryFactorySupport#validate(org.springframework.data.repository.support.RepositoryMetadata)
*/
@Override
protected void validate(RepositoryMetadata metadata) {
Class<?> idClass = metadata.getIdClass();
if (!MongoPropertyDescriptor.SUPPORTED_ID_CLASSES.contains(idClass)) {
throw new IllegalArgumentException(String.format("Unsupported id class! Only %s are supported!",
StringUtils.collectionToCommaDelimitedString(MongoPropertyDescriptor.SUPPORTED_ID_CLASSES)));
}
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.support.RepositoryFactorySupport
* #getEntityInformation(java.lang.Class)
*/
@Override
public <T, ID extends Serializable> MongoEntityInformation<T, ID> getEntityInformation(Class<T> domainClass) {
return entityInformationCreator.getEntityInformation(domainClass);
}
}
/**
* Simple wrapper to create {@link MongoEntityInformation} instances based on a {@link MappingContext}.
*
* @author Oliver Gierke
*/
static class EntityInformationCreator {
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
public EntityInformationCreator(MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
Assert.notNull(mappingContext);
this.mappingContext = mappingContext;
}
@SuppressWarnings("unchecked")
public <T, ID extends Serializable> MongoEntityInformation<T, ID> getEntityInformation(Class<T> domainClass) {
MongoPersistentEntity<T> persistentEntity = (MongoPersistentEntity<T>) mappingContext
.getPersistentEntity(domainClass);
return new MappingMongoEntityInformation<T, ID>(persistentEntity);
}
}
/**
* {@link QueryCreationListener} inspecting {@link PartTreeMongoQuery}s and creating an index for the properties it
* refers to.
*
* @author Oliver Gierke
*/
private static class IndexEnsuringQueryCreationListener implements QueryCreationListener<PartTreeMongoQuery> {
private static final Set<Type> GEOSPATIAL_TYPES = new HashSet<Part.Type>(Arrays.asList(Type.NEAR, Type.WITHIN));
private static final Log LOG = LogFactory.getLog(IndexEnsuringQueryCreationListener.class);
private final MongoOperations operations;
public IndexEnsuringQueryCreationListener(MongoOperations operations) {
this.operations = operations;
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.support.QueryCreationListener
* #onCreation(org.springframework.data.repository
* .query.RepositoryQuery)
*/
public void onCreation(PartTreeMongoQuery query) {
PartTree tree = query.getTree();
Index index = new Index();
index.named(query.getQueryMethod().getName());
Sort sort = tree.getSort();
for (Part part : tree.getParts()) {
if (GEOSPATIAL_TYPES.contains(part.getType())) {
return;
}
String property = part.getProperty().toDotPath();
Order order = toOrder(sort, property);
index.on(property, order);
}
MongoEntityInformation<?, ?> metadata = query.getQueryMethod().getEntityInformation();
operations.ensureIndex(index, metadata.getCollectionName());
LOG.debug(String.format("Created %s!", index));
}
private static Order toOrder(Sort sort, String property) {
if (sort == null) {
return Order.DESCENDING;
}
org.springframework.data.domain.Sort.Order order = sort.getOrderFor(property);
return order == null ? Order.DESCENDING : order.isAscending() ? Order.ASCENDING : Order.DESCENDING;
}
}
}
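A rough wiring sketch for the factory bean follows. PersonRepository, Person and the injected MongoTemplate are placeholders; setRepositoryInterface(...) and getObject() are assumed to come from the RepositoryFactoryBeanSupport base class rather than being shown in this diff.

public class RepositoryWiringSketch {

    // PersonRepository is a placeholder repository interface, e.g.
    // interface PersonRepository extends MongoRepository<Person, String> {}
    static PersonRepository createRepository(MongoTemplate template) throws Exception {
        MongoRepositoryFactoryBean<PersonRepository, Person, String> factory =
                new MongoRepositoryFactoryBean<PersonRepository, Person, String>();
        factory.setTemplate(template);                          // mandatory, checked in afterPropertiesSet()
        factory.setRepositoryInterface(PersonRepository.class); // inherited setter, assumed from the base class
        factory.afterPropertiesSet();
        return factory.getObject();
    }
}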

View File

@@ -1,5 +0,0 @@
/**
* MongoDB specific repository implementation.
*/
package org.springframework.data.document.mongodb.repository;

View File

@@ -13,7 +13,7 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb;
package org.springframework.data.mongodb;
import org.springframework.dao.DataAccessResourceFailureException;
@@ -35,10 +35,10 @@ public class CannotGetMongoDbConnectionException extends DataAccessResourceFailu
super(msg);
}
public CannotGetMongoDbConnectionException(String msg, String database, String username, char[] password2) {
public CannotGetMongoDbConnectionException(String msg, String database, String username, char[] password) {
super(msg);
this.username = username;
this.password = password2;
this.password = password == null ? null : password.clone();
this.database = database;
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010 the original author or authors.
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -13,19 +13,19 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb;
package org.springframework.data.mongodb;
import org.springframework.dao.InvalidDataAccessApiUsageException;
public class InvalidMongoDbApiUsageException extends InvalidDataAccessApiUsageException {
public InvalidMongoDbApiUsageException(String msg) {
super(msg);
}
private static final long serialVersionUID = 2034770973290508041L;
public InvalidMongoDbApiUsageException(String msg, Throwable cause) {
super(msg, cause);
}
public InvalidMongoDbApiUsageException(String msg) {
super(msg);
}
public InvalidMongoDbApiUsageException(String msg, Throwable cause) {
super(msg, cause);
}
}

View File

@@ -14,18 +14,16 @@
* limitations under the License.
*/
package org.springframework.data.document.mongodb;
package org.springframework.data.mongodb;
import org.springframework.util.StringUtils;
/**
* Helper class featuring helper methods for working with MongoDb collections.
* <p/>
* <p/>
* Mainly intended for internal use within the framework.
*
*
* @author Thomas Risberg
* @since 1.0
*/
@@ -40,7 +38,7 @@ public abstract class MongoCollectionUtils {
/**
* Obtains the collection name to use for the provided class
*
*
* @param entityClass The class to determine the preferred collection name for
* @return The preferred collection name
*/

View File

@@ -0,0 +1,30 @@
package org.springframework.data.mongodb;
import org.springframework.dao.DataAccessException;
import com.mongodb.DB;
/**
* Interface for factories creating {@link DB} instances.
*
* @author Mark Pollack
*/
public interface MongoDbFactory {
/**
* Creates a default {@link DB} instance.
*
* @return
* @throws DataAccessException
*/
DB getDb() throws DataAccessException;
/**
* Creates a {@link DB} instance to access the database with the given name.
*
* @param dbName must not be {@literal null} or empty.
* @return
* @throws DataAccessException
*/
DB getDb(String dbName) throws DataAccessException;
}
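A short usage sketch; SimpleMongoDbFactory serves as the implementation here, using the Mongo/database-name constructor that AbstractMongoConfiguration calls further down. Host and database names are illustrative.

import com.mongodb.DB;
import com.mongodb.Mongo;

public class DbFactorySketch {
    public static void main(String[] args) throws Exception {
        MongoDbFactory factory = new SimpleMongoDbFactory(new Mongo("localhost"), "demo");
        DB defaultDb = factory.getDb();        // the configured default database ("demo")
        DB other = factory.getDb("analytics"); // any other database by name
        System.out.println(defaultDb.getName() + ", " + other.getName());
    }
}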

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010 the original author or authors.
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -13,15 +13,15 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb;
package org.springframework.data.mongodb;
import org.springframework.dao.UncategorizedDataAccessException;
public class UncategorizedMongoDbException extends UncategorizedDataAccessException {
public UncategorizedMongoDbException(String msg, Throwable cause) {
super(msg, cause);
}
private static final long serialVersionUID = -2336595514062364929L;
public UncategorizedMongoDbException(String msg, Throwable cause) {
super(msg, cause);
}
}

View File

@@ -13,7 +13,7 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.config;
package org.springframework.data.mongodb.config;
import java.util.HashSet;
import java.util.Set;
@@ -26,13 +26,12 @@ import org.springframework.context.annotation.Configuration;
import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.document.mongodb.MongoDbFactory;
import org.springframework.data.document.mongodb.MongoTemplate;
import org.springframework.data.document.mongodb.SimpleMongoDbFactory;
import org.springframework.data.document.mongodb.convert.MappingMongoConverter;
import org.springframework.data.document.mongodb.mapping.Document;
import org.springframework.data.document.mongodb.mapping.MongoMappingContext;
import org.springframework.data.mapping.context.MappingContextAwareBeanPostProcessor;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
@@ -46,24 +45,24 @@ public abstract class AbstractMongoConfiguration {
@Bean
public MongoTemplate mongoTemplate() throws Exception {
return new MongoTemplate(mongoDbFactory(), mappingMongoConverter());
return new MongoTemplate(mongoDbFactory(), mappingMongoConverter());
}
@Bean
public MongoDbFactory mongoDbFactory() throws Exception {
if (getUserCredentials() == null) {
return new SimpleMongoDbFactory(mongo(), getDatabaseName());
} else {
return new SimpleMongoDbFactory(mongo(), getDatabaseName(), getUserCredentials());
}
public MongoDbFactory mongoDbFactory() throws Exception {
if (getUserCredentials() == null) {
return new SimpleMongoDbFactory(mongo(), getDatabaseName());
} else {
return new SimpleMongoDbFactory(mongo(), getDatabaseName(), getUserCredentials());
}
}
public String getMappingBasePackage() {
return "";
}
public UserCredentials getUserCredentials() {
return null;
return null;
}
@Bean
@@ -71,13 +70,15 @@ public abstract class AbstractMongoConfiguration {
MongoMappingContext mappingContext = new MongoMappingContext();
String basePackage = getMappingBasePackage();
if (StringUtils.hasText(basePackage)) {
ClassPathScanningCandidateComponentProvider componentProvider = new ClassPathScanningCandidateComponentProvider(false);
ClassPathScanningCandidateComponentProvider componentProvider = new ClassPathScanningCandidateComponentProvider(
false);
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Document.class));
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Persistent.class));
Set<Class<?>> initialEntitySet = new HashSet<Class<?>>();
for (BeanDefinition candidate : componentProvider.findCandidateComponents(basePackage)) {
initialEntitySet.add(ClassUtils.forName(candidate.getBeanClassName(), mappingContext.getClass().getClassLoader()));
initialEntitySet.add(ClassUtils.forName(candidate.getBeanClassName(), mappingContext.getClass()
.getClassLoader()));
}
mappingContext.setInitialEntitySet(initialEntitySet);
}
@@ -93,17 +94,9 @@ public abstract class AbstractMongoConfiguration {
/**
* Hook that allows post-processing after the MappingMongoConverter has been successfully created.
*
*
* @param converter
*/
protected void afterMappingMongoConverterCreation(MappingMongoConverter converter) {
}
@Bean
public MappingContextAwareBeanPostProcessor mappingContextAwareBeanPostProcessor() {
MappingContextAwareBeanPostProcessor bpp = new MappingContextAwareBeanPostProcessor();
bpp.setMappingContextBeanName("mongoMappingContext");
return bpp;
}
}
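For orientation, a typical subclass of this configuration class only supplies the database name and the Mongo instance; the method names follow the calls visible in mongoDbFactory() above, and the package name is illustrative.

import com.mongodb.Mongo;
import org.springframework.context.annotation.Configuration;

@Configuration
public class MyAppMongoConfig extends AbstractMongoConfiguration {

    @Override
    public String getDatabaseName() {
        return "demo";
    }

    @Override
    public Mongo mongo() throws Exception {
        return new Mongo("localhost");
    }

    @Override
    public String getMappingBasePackage() {
        // optional: scan this package for @Document / @Persistent classes
        return "com.example.domain";
    }
}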

View File

@@ -14,7 +14,7 @@
* limitations under the License.
*/
package org.springframework.data.document.mongodb.config;
package org.springframework.data.mongodb.config;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
@@ -25,6 +25,4 @@ public abstract class BeanNames {
static final String INDEX_HELPER = "indexCreationHelper";
static final String MONGO = "mongo";
static final String DB_FACTORY = "mongoDbFactory";
static final String POST_PROCESSOR = "mappingContextAwareBeanPostProcessor";
}

View File

@@ -14,9 +14,9 @@
* limitations under the License.
*/
package org.springframework.data.document.mongodb.config;
package org.springframework.data.mongodb.config;
import static org.springframework.data.document.mongodb.config.BeanNames.*;
import static org.springframework.data.mongodb.config.BeanNames.*;
import java.util.List;
import java.util.Set;
@@ -30,6 +30,7 @@ import org.springframework.beans.factory.config.RuntimeBeanReference;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.GenericBeanDefinition;
import org.springframework.beans.factory.support.ManagedList;
import org.springframework.beans.factory.support.ManagedSet;
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
@@ -37,11 +38,11 @@ import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider;
import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.document.mongodb.convert.MappingMongoConverter;
import org.springframework.data.document.mongodb.mapping.Document;
import org.springframework.data.document.mongodb.mapping.MongoMappingContext;
import org.springframework.data.document.mongodb.mapping.MongoPersistentEntityIndexCreator;
import org.springframework.data.mapping.context.MappingContextAwareBeanPostProcessor;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.util.StringUtils;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
@@ -65,6 +66,43 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
BeanDefinitionRegistry registry = parserContext.getRegistry();
BeanDefinition conversionsDefinition = getCustomConversions(element, parserContext);
String ctxRef = potentiallyCreateMappingContext(element, parserContext, conversionsDefinition);
// Need a reference to a Mongo instance
String dbFactoryRef = element.getAttribute("db-factory-ref");
if (!StringUtils.hasText(dbFactoryRef)) {
dbFactoryRef = DB_FACTORY;
}
// Converter
BeanDefinitionBuilder converterBuilder = BeanDefinitionBuilder.genericBeanDefinition(MappingMongoConverter.class);
converterBuilder.addConstructorArgReference(dbFactoryRef);
converterBuilder.addConstructorArgReference(ctxRef);
if (conversionsDefinition != null) {
converterBuilder.addPropertyValue("customConversions", conversionsDefinition);
}
try {
registry.getBeanDefinition(INDEX_HELPER);
} catch (NoSuchBeanDefinitionException ignored) {
if (!StringUtils.hasText(dbFactoryRef)) {
dbFactoryRef = DB_FACTORY;
}
BeanDefinitionBuilder indexHelperBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoPersistentEntityIndexCreator.class);
indexHelperBuilder.addConstructorArgValue(new RuntimeBeanReference(ctxRef));
indexHelperBuilder.addConstructorArgValue(new RuntimeBeanReference(dbFactoryRef));
registry.registerBeanDefinition(INDEX_HELPER, indexHelperBuilder.getBeanDefinition());
}
return converterBuilder.getBeanDefinition();
}
private String potentiallyCreateMappingContext(Element element, ParserContext parserContext,
BeanDefinition conversionsDefinition) {
String ctxRef = element.getAttribute("mapping-context-ref");
if (!StringUtils.hasText(ctxRef)) {
BeanDefinitionBuilder mappingContextBuilder = BeanDefinitionBuilder
@@ -75,57 +113,47 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
mappingContextBuilder.addPropertyValue("initialEntitySet", classesToAdd);
}
registry.registerBeanDefinition(MAPPING_CONTEXT, mappingContextBuilder.getBeanDefinition());
if (conversionsDefinition != null) {
AbstractBeanDefinition simpleTypesDefinition = new GenericBeanDefinition();
simpleTypesDefinition.setFactoryBeanName("customConversions");
simpleTypesDefinition.setFactoryMethodName("getSimpleTypeHolder");
mappingContextBuilder.addPropertyValue("simpleTypeHolder", simpleTypesDefinition);
}
parserContext.getRegistry().registerBeanDefinition(MAPPING_CONTEXT, mappingContextBuilder.getBeanDefinition());
ctxRef = MAPPING_CONTEXT;
}
try {
registry.getBeanDefinition(POST_PROCESSOR);
} catch (NoSuchBeanDefinitionException ignored) {
BeanDefinitionBuilder postProcBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MappingContextAwareBeanPostProcessor.class);
postProcBuilder.addPropertyValue("mappingContextBeanName", ctxRef);
registry.registerBeanDefinition(POST_PROCESSOR, postProcBuilder.getBeanDefinition());
}
return ctxRef;
}
// Need a reference to a Mongo instance
String dbFactoryRef = element.getAttribute("db-factory-ref");
if (!StringUtils.hasText(dbFactoryRef)) {
dbFactoryRef = DB_FACTORY;
}
BeanDefinitionBuilder converterBuilder = BeanDefinitionBuilder.genericBeanDefinition(MappingMongoConverter.class);
converterBuilder.addConstructorArgReference(dbFactoryRef);
converterBuilder.addConstructorArgReference(ctxRef);
try {
registry.getBeanDefinition(INDEX_HELPER);
} catch (NoSuchBeanDefinitionException ignored) {
if (!StringUtils.hasText(dbFactoryRef)) {
dbFactoryRef = DB_FACTORY;
}
BeanDefinitionBuilder indexHelperBuilder = BeanDefinitionBuilder.genericBeanDefinition(MongoPersistentEntityIndexCreator.class);
indexHelperBuilder.addConstructorArgValue(new RuntimeBeanReference(ctxRef));
indexHelperBuilder.addConstructorArgValue(new RuntimeBeanReference(dbFactoryRef));
registry.registerBeanDefinition(INDEX_HELPER, indexHelperBuilder.getBeanDefinition());
}
private BeanDefinition getCustomConversions(Element element, ParserContext parserContext) {
List<Element> customConvertersElements = DomUtils.getChildElementsByTagName(element, "custom-converters");
if (customConvertersElements.size() == 1) {
Element customerConvertersElement = customConvertersElements.get(0);
ManagedList<BeanMetadataElement> converterBeans = new ManagedList<BeanMetadataElement>();
List<Element> listenerElements = DomUtils.getChildElementsByTagName(customerConvertersElement, "converter");
if (listenerElements != null) {
for (Element listenerElement : listenerElements) {
List<Element> converterElements = DomUtils.getChildElementsByTagName(customerConvertersElement, "converter");
if (converterElements != null) {
for (Element listenerElement : converterElements) {
converterBeans.add(parseConverter(listenerElement, parserContext));
}
}
converterBuilder.addPropertyValue("customConverters", converterBeans);
BeanDefinitionBuilder conversionsBuilder = BeanDefinitionBuilder.rootBeanDefinition(CustomConversions.class);
conversionsBuilder.addConstructorArgValue(converterBeans);
AbstractBeanDefinition conversionsBean = conversionsBuilder.getBeanDefinition();
conversionsBean.setSource(parserContext.extractSource(element));
parserContext.getRegistry().registerBeanDefinition("customConversions", conversionsBean);
return conversionsBean;
}
return converterBuilder.getBeanDefinition();
return null;
}
public Set<String> getInititalEntityClasses(Element element, BeanDefinitionBuilder builder) {

View File

@@ -0,0 +1,145 @@
/*
* Copyright 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import static org.springframework.data.mongodb.config.BeanNames.*;
import static org.springframework.data.mongodb.config.ParsingUtils.*;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.RuntimeBeanReference;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionReaderUtils;
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.core.MongoFactoryBean;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
import com.mongodb.Mongo;
import com.mongodb.MongoURI;
/**
* {@link BeanDefinitionParser} to parse {@code db-factory} elements into {@link BeanDefinition}s.
*
* @author Jon Brisbin
* @author Oliver Gierke
*/
public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
String id = element.getAttribute("id");
if (!StringUtils.hasText(id)) {
id = DB_FACTORY;
}
return id;
}
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
String uri = element.getAttribute("uri");
String mongoRef = element.getAttribute("mongo-ref");
String dbname = element.getAttribute("dbname");
BeanDefinition userCredentials = getUserCredentialsBeanDefinition(element, parserContext);
// Common setup
BeanDefinitionBuilder dbFactoryBuilder = BeanDefinitionBuilder.genericBeanDefinition(SimpleMongoDbFactory.class);
ParsingUtils.setPropertyValue(element, dbFactoryBuilder, "write-concern", "writeConcern");
if (StringUtils.hasText(uri)) {
if(StringUtils.hasText(mongoRef) || StringUtils.hasText(dbname) || userCredentials != null) {
parserContext.getReaderContext().error("Configure either Mongo URI or details individually!", parserContext.extractSource(element));
}
dbFactoryBuilder.addConstructorArgValue(getMongoUri(uri));
return getSourceBeanDefinition(dbFactoryBuilder, parserContext, element);
}
// Defaulting
mongoRef = StringUtils.hasText(mongoRef) ? mongoRef : registerMongoBeanDefinition(element, parserContext);
dbname = StringUtils.hasText(dbname) ? dbname : "db";
dbFactoryBuilder.addConstructorArgValue(new RuntimeBeanReference(mongoRef));
dbFactoryBuilder.addConstructorArgValue(dbname);
if (userCredentials != null) {
dbFactoryBuilder.addConstructorArgValue(userCredentials);
}
return getSourceBeanDefinition(dbFactoryBuilder, parserContext, element);
}
/**
* Registers a default {@link BeanDefinition} of a {@link Mongo} instance and returns the name under which the
* {@link Mongo} instance was registered.
*
* @param element must not be {@literal null}.
* @param parserContext must not be {@literal null}.
* @return
*/
private String registerMongoBeanDefinition(Element element, ParserContext parserContext) {
BeanDefinitionBuilder mongoBuilder = BeanDefinitionBuilder.genericBeanDefinition(MongoFactoryBean.class);
ParsingUtils.setPropertyValue(element, mongoBuilder, "host");
ParsingUtils.setPropertyValue(element, mongoBuilder, "port");
return BeanDefinitionReaderUtils.registerWithGeneratedName(mongoBuilder.getBeanDefinition(),
parserContext.getRegistry());
}
/**
* Returns a {@link BeanDefinition} for a {@link UserCredentials} object.
*
* @param element
* @return the {@link BeanDefinition} or {@literal null} if neither username nor password given.
*/
private BeanDefinition getUserCredentialsBeanDefinition(Element element, ParserContext context) {
String username = element.getAttribute("username");
String password = element.getAttribute("password");
if (!StringUtils.hasText(username) && !StringUtils.hasText(password)) {
return null;
}
BeanDefinitionBuilder userCredentialsBuilder = BeanDefinitionBuilder.genericBeanDefinition(UserCredentials.class);
userCredentialsBuilder.addConstructorArgValue(StringUtils.hasText(username) ? username : null);
userCredentialsBuilder.addConstructorArgValue(StringUtils.hasText(password) ? password : null);
return getSourceBeanDefinition(userCredentialsBuilder, context, element);
}
/**
* Creates a {@link BeanDefinition} for a {@link MongoURI}.
*
* @param uri
* @return
*/
private BeanDefinition getMongoUri(String uri) {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(MongoURI.class);
builder.addConstructorArgValue(uri);
return builder.getBeanDefinition();
}
}
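The two configuration paths the parser supports boil down to the following direct constructions. The URI, host, port and credential values are illustrative; the single-argument MongoURI constructor of SimpleMongoDbFactory is the one targeted by getMongoUri(...) above, and the three-argument variant matches the mongo-ref/dbname/credentials path.

import com.mongodb.Mongo;
import com.mongodb.MongoURI;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

public class DbFactoryConfigSketch {

    // URI-based variant: host, port, database and credentials all come from the URI.
    static MongoDbFactory fromUri() throws Exception {
        return new SimpleMongoDbFactory(new MongoURI("mongodb://localhost/demo"));
    }

    // Individual-attributes variant: mongo-ref / dbname / username / password map to these arguments.
    static MongoDbFactory fromAttributes() throws Exception {
        return new SimpleMongoDbFactory(new Mongo("localhost", 27017), "demo",
                new UserCredentials("user", "secret"));
    }
}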

View File

@@ -13,7 +13,7 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.config;
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
@@ -21,8 +21,8 @@ import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.document.mongodb.MongoAdmin;
import org.springframework.data.document.mongodb.monitor.*;
import org.springframework.data.mongodb.core.MongoAdmin;
import org.springframework.data.mongodb.monitor.*;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;

View File

@@ -13,16 +13,17 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb.config;
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.xml.NamespaceHandlerSupport;
import org.springframework.data.mongodb.repository.config.MongoRepositoryConfigParser;
/**
* {@link org.springframework.beans.factory.xml.NamespaceHandler} for Mongo DB based repositories.
*
*
* @author Oliver Gierke
*/
public class MongoRepositoryNamespaceHandler extends NamespaceHandlerSupport {
public class MongoNamespaceHandler extends NamespaceHandlerSupport {
/*
* (non-Javadoc)

View File

@@ -14,29 +14,25 @@
* limitations under the License.
*/
package org.springframework.data.document.mongodb.config;
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedList;
import org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.document.mongodb.MongoFactoryBean;
import org.springframework.data.document.mongodb.MongoOptionsFactoryBean;
import org.springframework.data.mongodb.core.MongoFactoryBean;
import org.springframework.util.StringUtils;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
import com.mongodb.ServerAddress;
/**
* Parser for &lt;mongo&gt; definitions. If no name is given, a default bean id is resolved in {@code resolveId}.
*
*
* @author Mark Pollack
*/
public class MongoParser extends AbstractSingleBeanDefinitionParser {
@Override
protected Class<?> getBeanClass(Element element) {
return MongoFactoryBean.class;
}
@@ -47,16 +43,13 @@ public class MongoParser extends AbstractSingleBeanDefinitionParser {
ParsingUtils.setPropertyValue(element, builder, "port", "port");
ParsingUtils.setPropertyValue(element, builder, "host", "host");
ParsingUtils.setPropertyValue(element, builder, "write-concern", "writeConcern");
ParsingUtils.parseMongoOptions(parserContext, element, builder);
ParsingUtils.parseReplicaSet(parserContext, element, builder);
}
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
@@ -66,6 +59,4 @@ public class MongoParser extends AbstractSingleBeanDefinitionParser {
}
return name;
}
}

View File

@@ -0,0 +1,130 @@
/*
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedList;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.mongodb.core.MongoOptionsFactoryBean;
import org.springframework.util.StringUtils;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
import com.mongodb.ServerAddress;
abstract class ParsingUtils {
/**
* Parses the mongo replica-set element.
*
* @param parserContext the parser context
* @param element the mongo element
* @param mongoBuilder the bean definition builder to populate
* @return true if parsing actually occurred, false otherwise
*/
static boolean parseReplicaSet(ParserContext parserContext, Element element, BeanDefinitionBuilder mongoBuilder) {
String replicaSetString = element.getAttribute("replica-set");
if (StringUtils.hasText(replicaSetString)) {
ManagedList<Object> serverAddresses = new ManagedList<Object>();
String[] replicaSetStringArray = StringUtils.commaDelimitedListToStringArray(replicaSetString);
for (String element2 : replicaSetStringArray) {
String[] hostAndPort = StringUtils.delimitedListToStringArray(element2, ":");
BeanDefinitionBuilder defBuilder = BeanDefinitionBuilder.genericBeanDefinition(ServerAddress.class);
defBuilder.addConstructorArgValue(hostAndPort[0]);
defBuilder.addConstructorArgValue(hostAndPort[1]);
serverAddresses.add(defBuilder.getBeanDefinition());
}
if (!serverAddresses.isEmpty()) {
mongoBuilder.addPropertyValue("replicaSetSeeds", serverAddresses);
}
}
return true;
}
/**
* Parses the mongo:options sub-element and populates the given mongo builder with the resulting
* {@link MongoOptionsFactoryBean} definition.
*
* @return true if parsing actually occurred, false otherwise
*/
static boolean parseMongoOptions(ParserContext parserContext, Element element, BeanDefinitionBuilder mongoBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "options");
if (optionsElement == null) {
return false;
}
BeanDefinitionBuilder optionsDefBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoOptionsFactoryBean.class);
setPropertyValue(optionsElement, optionsDefBuilder, "connections-per-host", "connectionsPerHost");
setPropertyValue(optionsElement, optionsDefBuilder, "threads-allowed-to-block-for-connection-multiplier",
"threadsAllowedToBlockForConnectionMultiplier");
setPropertyValue(optionsElement, optionsDefBuilder, "max-wait-time", "maxWaitTime");
setPropertyValue(optionsElement, optionsDefBuilder, "connect-timeout", "connectTimeout");
setPropertyValue(optionsElement, optionsDefBuilder, "socket-timeout", "socketTimeout");
setPropertyValue(optionsElement, optionsDefBuilder, "socket-keep-alive", "socketKeepAlive");
setPropertyValue(optionsElement, optionsDefBuilder, "auto-connect-retry", "autoConnectRetry");
setPropertyValue(optionsElement, optionsDefBuilder, "max-auto-connect-retry-time", "maxAutoConnectRetryTime");
setPropertyValue(optionsElement, optionsDefBuilder, "write-number", "writeNumber");
setPropertyValue(optionsElement, optionsDefBuilder, "write-timeout", "writeTimeout");
setPropertyValue(optionsElement, optionsDefBuilder, "write-fsync", "writeFsync");
setPropertyValue(optionsElement, optionsDefBuilder, "slave-ok", "slaveOk");
mongoBuilder.addPropertyValue("mongoOptions", optionsDefBuilder.getBeanDefinition());
return true;
}
static void setPropertyValue(Element element, BeanDefinitionBuilder builder, String attrName, String propertyName) {
String attr = element.getAttribute(attrName);
if (StringUtils.hasText(attr)) {
builder.addPropertyValue(propertyName, attr);
}
}
/**
* Sets the property with the given attribute name on the given {@link BeanDefinitionBuilder} to the value of the
* attribute with the given name.
*
* @param element must not be {@literal null}.
* @param builder must not be {@literal null}.
* @param attrName must not be {@literal null} or empty.
*/
static void setPropertyValue(Element element, BeanDefinitionBuilder builder, String attrName) {
String attr = element.getAttribute(attrName);
if (StringUtils.hasText(attr)) {
builder.addPropertyValue(attrName, attr);
}
}
/**
* Returns the {@link BeanDefinition} built by the given {@link BeanDefinitionBuilder} enriched with source
* information derived from the given {@link Element}.
*
* @param builder must not be {@literal null}.
* @param context must not be {@literal null}.
* @param element must not be {@literal null}.
* @return
*/
static AbstractBeanDefinition getSourceBeanDefinition(BeanDefinitionBuilder builder, ParserContext context, Element element) {
AbstractBeanDefinition definition = builder.getBeanDefinition();
definition.setSource(context.extractSource(element));
return definition;
}
}
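To make the replica-set handling above concrete, this is roughly how a comma-delimited attribute value is broken down before each host/port pair becomes a ServerAddress bean definition; the address values are examples only.

import org.springframework.util.StringUtils;

public class ReplicaSetAttributeSketch {
    public static void main(String[] args) {
        String replicaSet = "127.0.0.1:27017,127.0.0.1:27018"; // example attribute value
        for (String entry : StringUtils.commaDelimitedListToStringArray(replicaSet)) {
            String[] hostAndPort = StringUtils.delimitedListToStringArray(entry, ":");
            // hostAndPort[0] is passed as the host, hostAndPort[1] as the port
            // constructor argument of the ServerAddress bean definition.
            System.out.println(hostAndPort[0] + " -> " + hostAndPort[1]);
        }
    }
}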

View File

@@ -1,5 +1,5 @@
/**
* Spring XML namespace configuration for MongoDB specific repositories.
*/
package org.springframework.data.document.mongodb.config;
package org.springframework.data.mongodb.config;

View File

@@ -13,7 +13,7 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb;
package org.springframework.data.mongodb.core;
import com.mongodb.DBCollection;
import com.mongodb.MongoException;

View File

@@ -13,7 +13,7 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb;
package org.springframework.data.mongodb.core;
/**
* Provides a simple wrapper to encapsulate the variety of settings you can use when creating a collection.
@@ -31,13 +31,10 @@ public class CollectionOptions {
/**
* Constructs a new <code>CollectionOptions</code> instance.
*
* @param size
* the collection size in bytes, this data space is preallocated
* @param maxDocuments
* the maximum number of documents in the collection.
* @param capped
* true to created a "capped" collection (fixed size with auto-FIFO behavior based on insertion order), false
* otherwise.
* @param size the collection size in bytes, this data space is preallocated
* @param maxDocuments the maximum number of documents in the collection.
* @param capped true to create a "capped" collection (fixed size with auto-FIFO behavior based on insertion order),
* false otherwise.
*/
public CollectionOptions(Integer size, Integer maxDocuments, Boolean capped) {
super();

View File

@@ -13,7 +13,7 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb;
package org.springframework.data.mongodb.core;
import com.mongodb.DBCursor;

View File

@@ -13,7 +13,7 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.mongodb;
package org.springframework.data.mongodb.core;
import com.mongodb.DB;
import com.mongodb.MongoException;

View File

@@ -1,4 +1,4 @@
package org.springframework.data.document.mongodb;
package org.springframework.data.mongodb.core;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

View File

@@ -0,0 +1,37 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.dao.DataAccessException;
import com.mongodb.DBObject;
import com.mongodb.MongoException;
/**
* An interface used by {@link MongoTemplate} for processing documents returned from a MongoDB query on a per-document basis.
* Implementations of this interface perform the actual work of processing each document but don't need to worry about
* exception handling. {@link MongoException}s will be caught and translated by the calling {@link MongoTemplate}.
*
* A DocumentCallbackHandler is typically stateful: it keeps the result state within the object, to be available for
* later inspection.
*
* @author Mark Pollack
*
*/
public interface DocumentCallbackHandler {
void processDocument(DBObject dbObject) throws MongoException, DataAccessException;
}
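A minimal, stateful implementation as described in the Javadoc might look like the following; the "name" field is purely illustrative.

import java.util.ArrayList;
import java.util.List;

import org.springframework.dao.DataAccessException;

import com.mongodb.DBObject;
import com.mongodb.MongoException;

public class NameCollectingCallbackHandler implements DocumentCallbackHandler {

    private final List<String> names = new ArrayList<String>();

    public void processDocument(DBObject dbObject) throws MongoException, DataAccessException {
        // Keep per-document state inside the handler for later inspection.
        names.add((String) dbObject.get("name"));
    }

    public List<String> getNames() {
        return names;
    }
}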

Some files were not shown because too many files have changed in this diff.