Compare commits


107 Commits

Author SHA1 Message Date
Spring Buildmaster
e27c01fe5b DATAMONGO-1021 - Release version 1.6.0.RC1 (Evans RC1). 2014-08-13 07:02:41 -07:00
Oliver Gierke
d639e58fb9 DATAMONGO-1021 - Prepare 1.6.0.RC1 (Evans RC1). 2014-08-13 15:37:48 +02:00
Oliver Gierke
0195c2cb48 DATAMONGO-1021 - Updated changelog. 2014-08-13 15:37:44 +02:00
Oliver Gierke
068e2ec49b DATAMONGO-1024 - Upgraded to MongoDB Java driver 2.12.3. 2014-08-13 15:36:08 +02:00
Christoph Strobl
a9306b99ec DATAMONGO-957 - Add support for query modifiers.
Using Meta allows defining $comment, $maxScan, $maxTimeMS and $snapshot on a query. When executed, we add the meta information to the cursor in use.

We’ve introduced the @Meta annotation that allows defining $comment, $maxScan, $maxTimeMS and $snapshot on a repository finder method.
Added tests to verify proper invocation of template methods.
Use DBCursor.copy() for CursorPreparer.

Original pull request: #216.
2014-08-13 14:56:10 +02:00
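
To illustrate the feature described above, a finder method using @Meta might look roughly like this; the repository, the Person type and the attribute names are assumptions, not taken from the commit:

public interface PersonRepository extends CrudRepository<Person, String> {

	// attribute names are assumed; they are meant to map to $maxTimeMS and $comment
	@Meta(maxExecutionTimeMs = 100, comment = "find by lastname")
	List<Person> findByLastname(String lastname);
}
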
Thomas Darimont
3597194742 DATAMONGO-1012 - Improved identifier initialization on DBRef proxies.
Identifier initialization is now only triggered if field access is used. Before that, the id initialization would have resolved the proxy eagerly, as the getter access performed by the BeanWrapper would have been intercepted by the proxy and is indistinguishable from a normal method call. This would have defeated the purpose of creating the proxies in the first place.

Added test case to check for non-initialization in the property access scenario.
2014-08-13 14:34:38 +02:00
Oliver Gierke
6f06ccec8e DATAMONGO-1012 - Identifier initialization for lazy DBRef proxies with field access.
We now initialize the ID property for proxies created for lazily initialized DBRefs. This will allow the lookup of ID properties for types that use field access without initializing the entire proxy.
2014-08-13 14:34:15 +02:00
Oliver Gierke
6fe7f220f9 DATAMONGO-1007 - Updated changelog. 2014-08-13 10:56:02 +02:00
Christoph Strobl
45e70d493d DATAMONGO-1016 - Remove deprecations in geospatial area.
Removed:
 - Box
 - Circle
 - CustomMetric
 - Distance
 - GeoPage
 - GeoResult
 - GeoResults
 - Metric
 - Metrics
 - Point
 - Polygon
 - Shape

Updated api doc.
Removed deprecation warnings.
2014-08-13 09:52:02 +02:00
Thomas Darimont
ce71ab83f2 DATAMONGO-1020 - LimitOperation is now a public class.
Original pull request: #218.
2014-08-12 12:30:41 +02:00
Oliver Gierke
bf85d8facd DATAMONGO-1005 - Polishing introduction of ObjectPath.
Simplified implementation of ObjectPath to use a static root instance and hand the path further down until final resolution in MappingMongoConverter.readValue(…). This removes a bit of boxing and unboxing code both in ObjectPath and the converter.

Introduced ObjectPath.getPathItem(…) to internalize the iteration to find a potentially already resolved object.

Renamed parameters and fields of type ObjectPath to path consistently. Removed obsolete method in MappingMongoConverter.

Original pull request: #209.
2014-08-12 08:12:31 +02:00
Thomas Darimont
c5ff7cdb2b DATAMONGO-1005 - Improve cycle-detection for DbRef's.
Introduced ObjectPath that collects the target objects while converting a DBObject to a domain object. If we detect that a potentially nested DBRef points to an object that is already under construction we simply return a reference to that object in order to avoid StackOverFlowErrors.

Original pull request: #209.
2014-08-12 08:10:47 +02:00
Mark Paluch
f9ccf4f532 DATAMONGO-1017 - Add support for custom implementations in CDI repositories.
Original pull request: #215.
2014-08-11 07:47:57 +02:00
Greg Turnquist
ab731f40a7 DATAMONGO-1019 - Corrected examples in reference documentation.
Examples were not properly converted. One table got dropped, so I added it back. Fixed the IMPORTANT notes.

Original pull requests: #214.
2014-08-10 16:04:45 +02:00
Oliver Gierke
d8434fffa8 DATAMONGO-1015 - Fixed link to Spring Data Commons reference docs. 2014-08-10 15:55:13 +02:00
Christoph Strobl
151b1d4510 DATAMONGO-973 - Add support for deriving full-text queries.
Added support for executing full-text queries on repositories. Query methods can now have a parameter of type TextCriteria, which will trigger a text search clause for the property annotated with @TextScore.

Retrieving the document score and sorting by score is only possible if the entity holds a property annotated with @TextScore. If present, any find execution will be enriched so that it ensures loading of the corresponding { $meta : textScore } field. The sort object will only be mapped if the sort property actually exists; in that case we replace the existing expression for the property with its $meta representation.

This allows for example the following:

TextCriteria criteria = TextCriteria.forDefaultLanguage().matching("term");

repository.findAllBy(criteria, new Sort("score"));
repository.findAllBy(criteria, new PageRequest(0, 10, Direction.DESC, "score"));
repository.findByFooOrderByScoreDesc("foo", criteria);

For more details and examples see the "Full text search queries" section in the reference manual.
2014-08-06 22:25:38 +02:00
Greg Turnquist
168cf3e1f6 DATAMONGO-1015 - Migrate reference documentation from Docbook to Asciidoctor. 2014-08-06 21:38:46 +02:00
Oliver Gierke
52dab0fa20 DATAMONGO-1008 - Polishing.
Slightly changed the implementation of the 2dsphere check. Minor refactorings in the test case.

Original pull request: #210.
2014-07-31 17:23:19 +02:00
Christoph Strobl
9257bab06e DATAMONGO-1008 - DefaultIndexOperations now considers 2dsphere, too.
We now also check for 2dsphere when inspecting index keys and create a geo IndexField in that case.

Original pull request: #210.
2014-07-31 17:23:19 +02:00
Oliver Gierke
27f0a6f27a DATAMONGO-1008 - Added repository type based checks to strict matching algorithm.
Repositories extending MongoRepository are now considered strict matches as well.

Related ticket: DATACMNS-526.
2014-07-31 16:20:26 +02:00
Oliver Gierke
5bedbef2f2 DATAMONGO-1009 - Adapt to new multi-store configuration detection.
We now consider repositories managing domain types annotated with @Document MongoDB specific ones.

Related ticket: DATACMNS-526.
2014-07-28 20:15:40 +02:00
Christoph Strobl
51e7be8aa0 DATAMONGO-1001 - Renamed LazyLoadingProxy.initialize() to getTarget().
Original pull request: #208.
2014-07-24 13:29:27 +02:00
Christoph Strobl
6c85bb39a3 DATAMONGO-1001 - Fix saving lazy loaded object.
We now resolve the target type for CGLib-proxied objects and initialize lazy loaded ones before saving. As it turns out, CustomConversions already knows how to deal with proxies correctly. We added an explicit test to assert that.

Original pull request: #208.
2014-07-24 13:28:36 +02:00
Oliver Gierke
07f7247707 DATAMONGO-1002 - Update.toString() now uses SerializationUtils.
A simple call of toString() on a DBObject might result in an exception if the DBObject contains objects that are non-native MongoDB types (i.e. types that need to be converted prior to persistence).

We now use SerializationUtils.serializeToJsonSafely(…) to avoid exceptions.
2014-07-23 12:36:15 +02:00
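
A minimal sketch of the effect described above; the field name and value are made up for the example:

Update update = new Update().set("lastModified", new java.util.Date());

// serializeToJsonSafely(…) falls back to a safe representation for values that are
// not native MongoDB types instead of throwing during toString()
String json = SerializationUtils.serializeToJsonSafely(update.getUpdateObject());
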
Thomas Darimont
f669711670 DATAMONGO-995 - Improve support of quote handling for custom query parameters.
Introduced ParameterBindingParser which exposes parameter references in query strings as ParameterBindings. This allows us to detect whether a parameter reference in a query string is already quoted avoiding wrongly double-quoting the parameter value.

Original pull request: #185.
Related ticket: DATAMONGO-420.
2014-07-21 20:15:18 +02:00
Oliver Gierke
5f3671f349 DATAMONGO-996 - Fixed boundary detection in pagination.
The fix for DATAMONGO-950 introduced a tiny glitch so that retrieving pages after the first one was broken in the repository query execution. We now correctly use the previously detected number of elements to detect whether the Pageable given is out of scope.

Related ticket: DATAMONGO-950.
2014-07-18 19:01:44 +02:00
Thomas Darimont
1335cb699b DATAMONGO-420 - Improve support of quote handling for custom query parameters.
Introduced ParameterBindingParser which exposes parameter references in query strings as ParameterBindings. This allows us to detect whether a parameter reference in a query string is already quoted avoiding wrongly double-quoting the parameter value.

Original pull request: #185.
2014-07-17 15:27:46 +02:00
Oliver Gierke
84414b87c0 DATAMONGO-987 - Some polishing in MappingMongoConverter.
Let getValueInternal(…) use the provided SpELExpressionEvaluator instead of relying on the MongoDbPropertyValueProvider to create a new one. Removed the obsolete constructor in MongoDbPropertyValueProvider.
2014-07-17 15:18:21 +02:00
Thomas Darimont
a1ecd4a501 DATAMONGO-987 - Avoid creation of lazy-loading proxies for null-values.
We now avoid creating a lazy-loading proxy if we detect that the property-value in the backing DbObject for a @Lazy(true) annotated field is null.

Original pull request: #207.
2014-07-17 15:18:21 +02:00
Thomas Darimont
d7e6f2ee41 DATAMONGO-989 - MatchOperation should accept CriteriaDefinition.
We replaced the constructor that accepted a Criteria with one that accepts a CriteriaDefinition to not force clients to extend Criteria.
Original pull request: #206.
2014-07-17 09:21:29 +02:00
Oliver Gierke
04870fb8b3 DATAMONGO-991 - Adapted to deprecation removals in Spring Data Commons.
Related ticket: DATACMNS-469.
2014-07-16 12:04:10 +02:00
Oliver Gierke
9d196b78f7 DATAMONGO-981 - After release cleanups. 2014-07-10 20:44:20 +02:00
Spring Buildmaster
4229525928 DATAMONGO-981 - Prepare next development iteration. 2014-07-10 10:38:58 -07:00
Spring Buildmaster
d861fecdb8 DATAMONGO-981 - Release version 1.6.0.M1. 2014-07-10 10:38:55 -07:00
Oliver Gierke
f280e23095 DATAMONGO-981 - Prepare 1.6.0.M1 (Evans M1). 2014-07-10 19:28:34 +02:00
Oliver Gierke
ed0e1d92c0 DATAMONGO-981 - Updated changelog. 2014-07-10 17:14:24 +02:00
Christoph Strobl
d82fc22659 DATAMONGO-944 - Add support for $currentDate to Update.
Added currentDate and currentTimestamp to Update.

Original pull request: #200.
2014-07-10 15:13:59 +02:00
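
A minimal sketch of the new Update methods; the field names, the template instance, the orderId value and the Order type are assumptions:

Update update = new Update()
	.currentDate("lastModifiedDate")     // renders a $currentDate entry of type date
	.currentTimestamp("lastModifiedTs"); // renders a $currentDate entry of type timestamp

template.updateFirst(query(where("id").is(orderId)), update, Order.class);
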
Thomas Darimont
6616d6788c DATAMONGO-975 - Add support for extracting date/time components from a field projection.
We added some extract-methods to ProjectionOperationBuilder to be able to extract date / time components from projected fields.

Original pull request: #204.
2014-07-10 12:45:17 +02:00
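
A rough example of the new extract methods; the field and alias names are made up and the exact method names should be treated as assumptions:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

Aggregation agg = newAggregation(
	project()
		.and("orderDate").extractYear().as("year")
		.and("orderDate").extractMonth().as("month")
		.and("orderDate").extractDayOfMonth().as("day"));
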
Christoph Strobl
322a7cf033 DATAMONGO-969 - Fixed nested id handling in SpringDataMongodbSerializer.
SpringDataMongodbSerializer now defensively triggers mapping of the DBObject created by the default serializer. This asserts that ids buried in nested structures like { "_id" : { "$in" : ["x", "y"] } } are converted correctly.

Original pull request: #202.
2014-07-09 21:48:02 +02:00
Christoph Strobl
0f487c10ba DATAMONGO-983 - Remove links to forum.spring.io.
Replace forum links with those to stackoverflow.

Original Pull Request: #205
2014-07-09 21:21:04 +02:00
Christoph Strobl
11417144bd DATAMONGO-980 - Use meta annotations from commons for @Score.
We now use Spring Data Commons' @ReadOnlyProperty to meta-annotate @Score to mark it as a read-only property.

Original pull request: #201.
Related tickets: DATACMNS-534.
2014-07-09 14:51:21 +02:00
Christoph Strobl
dafc59b163 DATAMONGO-972 - Querydsl integration now handles references correctly.
SpringDataMongodbSerializer now overrides the necessary methods to create the appropriate DBRef objects when serializing data via Querydsl.

We currently disable the test case, as the fix only takes effect with Querydsl 3.4.1, which unfortunately breaks Java 6 compatibility. We include the fix nonetheless to allow users on Java 7 to use the latest Querydsl.

Original pull request: #203.
Related tickets: querydsl/querydsl#803.
2014-07-09 13:47:04 +02:00
Oliver Gierke
566f9a80c4 DATAMONGO-982 - Added build profiles to build against next MongoDB driver versions.
Added build profile for MongoDB Java driver versions 2.12.3-SNAPSHOT and 3.0.0-SNAPSHOT. Added another property to be able to build manifests correctly as the snapshot versions aren't valid OSGi versions.

Adapted MongoExceptionTranslator to convert the new Exceptions being thrown for server timeouts and the deprecated values we currently handle.
2014-07-08 17:36:01 +02:00
Christoph Strobl
89a42c5648 DATAMONGO-976 - Add support for reading $meta projection on textScore into document.
We introduced @TextScore that can be used to mark a property to hold the { $meta : “textScore” } value. In contrast to @Transient, the value will be considered when reading documents.
The value will only be picked up if the score field is retrieved from the store.

Original pull request: #198.
2014-07-07 11:43:43 +02:00
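
A sketch of a domain type using the annotation; the class and field names are made up:

@Document
public class CookingRecipe {

	@Id String id;
	String title;

	@TextScore Float score; // populated from { $meta : "textScore" } when the score field is projected
}
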
Christoph Strobl
83ffbb00e8 DATAMONGO-850 - Add support for full text search via $text
Using TextQuery and TextCriteria allows creation of queries using $text $search.

{ $meta : “textScore” } can be included using TextQuery.includeScore. As the field name used for textScore need not be fixed to “score”, one can use the overload taking the field name to override the default.

Original pull request: #198.
2014-07-07 11:39:47 +02:00
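
For illustration; the CookingRecipe type and the template instance are assumptions, and the queryText(…) factory method name is an assumption as well:

TextCriteria criteria = TextCriteria.forDefaultLanguage().matching("coffee");

Query query = TextQuery.queryText(criteria)
	.includeScore("relevance"); // overload overriding the default "score" field name

List<CookingRecipe> recipes = template.find(query, CookingRecipe.class);
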
Christoph Strobl
84913cecab DATAMONGO-937 - Add support for creating text index.
We now support creating text index based on information gathered on domain types.

Using @TextIndexed marks properties to be considered for the full text index. Use the weight attribute to influence document scoring during search operations.

Please note that using @TextIndexed on entity properties forces all properties of any sub document to be considered part of the text index. Any weight set will in that case be propagated to the siblings, taking the most recent weight information into account, which means that the weight attribute can be overridden for properties in sub documents.

Setting the index default language can be done via @Document(language), while @Language can be used to define the language_override field.

As text search is disabled by default for MongoDB 2.4, we use a JUnit ClassRule to restrict integration tests that potentially create a text index (as the entities for testing are found in the classpath) to only be executed when a 2.6+ MongoDB server is present.

For usage hints please see section 6.3.4 (Text Indexes) of the reference manual.

Original pull request: #198.
2014-07-07 11:30:08 +02:00
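
A sketch of the corresponding mapping; names are made up, see section 6.3.4 of the reference manual for the authoritative examples:

@Document(language = "spanish")
public class CookingRecipe {

	@TextIndexed(weight = 3) String title; // weighted higher for scoring
	@TextIndexed String description;

	@Language String lang; // defines the language_override field
}
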
Christoph Strobl
998bb09a92 DATAMONGO-978 - Derived delete query should pass on type information.
We now pass on type information for derived delete queries to the corresponding delete operation. This propagates the information correctly to the corresponding Before and After events.

Before this change, the type would have been set to null in case of a non-collection-like method return type.

Original pull request: #199.
2014-07-03 14:16:57 +02:00
Oliver Gierke
cd68a8db54 DATAMONGO-977 - Removed reflective detection of Spring 4 in DBRef proxy creation.
After the Spring 4 upgrade we can now directly use its Objenesis infrastructure.
2014-07-02 09:23:19 +02:00
Oliver Gierke
df8477d180 DATAMONGO-955 - Updated changelog. 2014-06-30 10:57:12 +02:00
Christoph Strobl
244fbae0ce DATAMONGO-962 - Cycle guard should respect full path.
We now check for intersections of the given path with existing ones, considering not only types and contained property names but also the property's full path, which must not be present in already traversed paths.

Additionally, we now catch any CyclicPropertyReferenceException on the root level to prevent cycle detection from interfering with application startup.

Original pull request: #197.
2014-06-27 19:25:26 +02:00
Oliver Gierke
19e08a52c0 DATAMONGO-970 - MongoTemplate.remove(…) now correctly builds query for DBObjects.
If a DBObject was handed into MongoTemplate.remove(…) we previously failed to look up the id value to create a by-id-query. This commit adds explicit handling of DBObjects by looking up their _id field to obtain the id value.
2014-06-27 16:12:50 +02:00
Thomas Darimont
6389b1bb73 DATAMONGO-954 - Add support for system variables in aggregation operations.
Add assumes for appropriate MongoDB version.
2014-06-26 14:19:24 +02:00
Thomas Darimont
cadcbf6106 DATAMONGO-954 - Add support for system variables in aggregation operations.
We now support referring to system variables like for instance $$ROOT or $$CURRENT from within aggregation framework pipeline projection and group expressions.

Original pull request: #190.
2014-06-25 15:53:44 +02:00
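
A pipeline referring to $$ROOT might look roughly like this, assuming a ROOT constant exposed on Aggregation and a made-up Person type:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

TypedAggregation<Person> agg = newAggregation(Person.class,
	project("firstname", "lastname"),
	group("lastname").push(ROOT).as("people")); // ROOT renders as the $$ROOT system variable
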
Christoph Strobl
118f007ca6 DATAMONGO-963 - @CompoundIndex should treat expireAfterSeconds correctly.
We added an additional check on the fields used as key, so that TTL is ignored for a CompoundIndex with more than one field (which effectively renders it useless on @CompoundIndex).

Prior to this change, potentially invalid index structures would have been created for e.g. @CompoundIndex(def = "{'foo': 1, 'bar': 1}", expireAfterSeconds = 10), leading to MongoDB not being able to clean up the indexes (logs: "ERROR: key for ttl index can only have 1 field").

This fix is related to https://jira.mongodb.org/browse/SERVER-10075.

Original pull request: #196.
2014-06-25 15:12:32 +02:00
Christoph Strobl
cbb32bd29d DATAMONGO-950 - Add support for limiting the query result in the query derivation mechanism.
When deriving the query from its method name we check for the limit set on the PartTree and pass it on to the created query. PagedExecution now takes the overall limit into account, skips the query execution entirely (if the Pageable is completely out of scope) or alters the query limits accordingly.

Note that there has been significant rework of this compared to the pull request, to avoid new API in Query and extensive changes in MongoTemplate's QueryCursorPreparer.

Original pull request: #191.
2014-06-25 14:58:59 +02:00
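
Derived queries using the new limiting keywords might look like this; the repository and entity names are assumptions:

public interface PersonRepository extends CrudRepository<Person, String> {

	List<Person> findTop3ByLastnameOrderByFirstnameAsc(String lastname);

	// the overall limit is reconciled with the given Pageable by the PagedExecution
	Page<Person> findFirst10ByLastname(String lastname, Pageable pageable);
}
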
Thomas Darimont
9858dcd740 DATAMONGO-960 - Allow to pass options to the Aggregation Pipeline.
We introduced the AggregationOptions abstraction to conveniently construct option objects that can be handed to an Aggregation via the new Aggregation.withOptions(...) factory method. For more details, see the Builder class' JavaDoc.

Note that we exposed the "rawResults" in AggregationResults and put a null guard into MongoTemplate's aggregate(…) in order to support the "explain" option.

Original pull request: #195.
2014-06-25 13:25:57 +02:00
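
For example; the field names are made up and the newAggregationOptions() builder entry point is an assumption:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

AggregationOptions options = newAggregationOptions().allowDiskUse(true).build();

Aggregation agg = newAggregation(
	match(Criteria.where("status").is("ACTIVE")),
	group("lastname").count().as("total"))
	.withOptions(options);
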
Thomas Darimont
1fb76d135b DATAMONGO-953 - Add equals(…)/hashCode()/toString() to Update.
We now use the underlying updateObject to implement appropriate equals(…)/hashCode() and toString() methods.

Original pull request: #192.
2014-06-25 12:40:30 +02:00
Oliver Gierke
bb62c8b2f1 DATAMONGO-958 - Switch to FieldNamingStrategy SPI in Spring Data Commons. 2014-06-20 21:27:24 +02:00
Christoph Strobl
2cbe7bf885 DATAMONGO-952 - Derived queries should consider field specification in @Query.
PartTreeMongoQuery now explicitly checks for the presence of a manually defined field spec on the query method and creates a new Query if so.

Original pull request: #188.
2014-06-18 12:53:14 +02:00
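
A derived query combined with a manually defined field spec might look like this; the repository and field names are assumptions:

public interface PersonRepository extends CrudRepository<Person, String> {

	@Query(fields = "{ 'firstname' : 1, 'lastname' : 1 }")
	List<Person> findByLastname(String lastname); // query derived from the method name, fields taken from @Query
}
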
Christoph Strobl
6043f6b74d DATAMONGO-949 - CycleGuard should only match properties in word boundaries.
We modified the regular expression used for cycle detection to match the exact property name within the inspected path using word boundaries. This fix prevents subsequences of an existing property name (e.g. ‘sub’ would previously have matched ‘substr’) from being matched.

Along the way we fixed the (false) assertion in one of the tests, as we create the +1 cycle reference index before actually breaking the operation.
2014-06-18 08:32:30 +02:00
Christoph Strobl
ef1366592a DATAMONGO-948 - Sort should be taken as is when no type information available.
Object type mapping for the sort is skipped in case no type information is present when executing a query via the MongoTemplate.
2014-06-18 08:27:57 +02:00
Thomas Darimont
01cf9fb8f3 DATAMONGO-938 - Apply QueryMapper in MongoTemplate.mapReduce(…).
Previously MongoTemplate.mapReduce(...) didn't translate nested objects, e.g. GeoCommand, within the given query. That could lead to exceptions during query serialization. We now pass the query and sort object of the given Query through the QueryMapper to avoid such problems.

Original pull request: #184.
2014-06-18 08:27:54 +02:00
Thomas Darimont
285c406d5d DATAMONGO-745 - Added test cases for custom query with $in and pageable parameter.
Added test cases to verify that this works.

Original pull request: #186.
2014-06-18 07:49:06 +02:00
Oliver Gierke
ad29e52a57 DATAMONGO-936 - After release cleanups. 2014-05-20 19:54:03 +02:00
Spring Buildmaster
3cfe207c83 DATAMONGO-936 - Prepare next development iteration. 2014-05-20 09:35:38 -07:00
Spring Buildmaster
c7e65cbc40 DATAMONGO-936 - Release version 1.5.0.RELEASE. 2014-05-20 09:35:35 -07:00
Oliver Gierke
b8e02efb04 DATAMONGO-936 - Prepare 1.5 GA.
Removed obsolete readme.txt.
2014-05-20 17:14:04 +02:00
Oliver Gierke
c7f20fb836 DATAMONGO-936 - Prepare 1.5 GA. 2014-05-20 16:57:34 +02:00
Oliver Gierke
28a0202ef4 DATAMONGO-936 - Updated changelog. 2014-05-20 16:26:20 +02:00
Christoph Strobl
164e947045 DATAMONGO-926 - Avoid StackOverflowError while resolving index structures.
We now guard against cyclic non-transient, non-DBRef property references while inspecting domain types for potential index structures. To do so, we check the property's path and owning type to determine potential cycles. If found, we log a warning pointing to the path, entity and property potentially causing problems and skip processing for this path.

Original pull request: #180.
2014-05-19 19:09:40 +02:00
Christoph Strobl
7e65c0c87d DATAMONGO-367 - Nested @Indexed should not trigger creation of separate collection.
The issue has been solved along with DATAMONGO-888 (Pull Request: #162). We have created additional tests to explicitly check it has truly been fixed.
2014-05-19 17:19:28 +02:00
Christoph Strobl
9c1f753f17 DATAMONGO-929 - Use property path for keys of Indexed and CompoundIndex.
Index creation failed for @Indexed and @CompoundIndex as the resolved dotPath was not used for creation. We now not only resolve the dotPath but also use it within the key for index definition. In case of a nested compound index the key definition is enhanced by the provided path.

When leaving the key definition empty for a nested compound index, we create an index for the whole nested document. Trying to create a compound index on root level without providing key information leads to an InvalidDataAccessApiUsageException.

Original pull request: #179.
2014-05-19 17:14:36 +02:00
Oliver Gierke
0f821eb52d DATAMONGO-925, DATAMONGO-928 - Polishing.
Only reject the attribute setup if abbreviation is activated and a custom strategy is configured. Added test cases for the rejection case and for an over-configuration (explicitly setting abbreviation to false, which is the default anyway).

Related pull request: #177.
2014-05-19 17:07:50 +02:00
Ryan Tenney
e3aadd63ab DATAMONGO-928 - Removed explicit default value for abbreviate-field-names from namespace XSD.
The default for boolean attributes leaks into the evaluation of XML namespace attributes, which prevents us from detecting whether two attributes have been set in a conflicting way.

Fixed the documentation of the field-naming-strategy-ref attribute.

Original pull request: #183.
Related pull request: #177.
Related ticket: DATAMONGO-925.
2014-05-19 16:44:03 +02:00
Christoph Strobl
aa06d520df DATAMONGO-647 - Added test case to show that field names are mapped correctly.
Additional test added to check if the issue has truly been resolved by DATAMONGO-888.

Original pull request: #181.
Related pull Request: #162.
Related ticket: DATAMONGO-888.
2014-05-19 14:38:06 +02:00
Oliver Gierke
4fa1d4ba97 DATAMONGO-919 - After release cleanups. 2014-05-02 15:45:06 +02:00
Spring Buildmaster
ba9f11b345 DATAMONGO-919 - Prepare next development iteration. 2014-05-02 05:58:16 -07:00
Spring Buildmaster
4777dd2e5e DATAMONGO-919 - Release version 1.5.0.RC1. 2014-05-02 05:58:13 -07:00
Oliver Gierke
64b4591b72 DATAMONGO-919 - Prepare 1.5.0.RC1.
Upgraded to Spring Data Parent 1.4.0.RC1 and Spring Data Commons 1.8.0.RC1. Switched to milestone repository.
2014-05-02 14:50:48 +02:00
Oliver Gierke
bac5961fd8 DATAMONGO-919 - Updated changelog. 2014-05-02 14:50:48 +02:00
Thomas Darimont
b3cac862d8 DATAMONGO-924 - Improve aggregation field reference resolving.
Previously we didn't support referring to aliased fields defined in former stages of an aggregation pipeline. We now also consider field aliases during field reference lookup.

Original pull request: #176.
2014-05-02 14:46:13 +02:00
Oliver Gierke
b8ab2ad539 DATAMONGO-919 - Forward port changelog entries from bugfix branch. 2014-05-02 12:48:18 +02:00
Christoph Strobl
618d5bd5e9 DATAMONGO-919 - Prepare Release 1.5.0.RC1
Upgraded the AspectJ Maven plugin to a recent version, which enables building the module on Java 8 using AspectJ 1.8.
2014-05-01 21:02:01 +02:00
Oliver Gierke
096c3278b3 DATAMONGO-92 - Upgraded to MongoDB Java driver 2.12.1. 2014-04-30 08:51:18 +02:00
Kim Toms
c9d9976c22 DATAMONGO-920 - Improve debug message for delete events in AbstractMongoEventListener.
Adjusted debug message to reflect the actual operation.

Original pull request: #95.
2014-04-29 15:52:12 +02:00
Thomas Darimont
65497f93d4 DATAMONGO-827 - Allow index creation to use MongoDB auto-generated names.
We now support letting MongoDB generate index names by introducing the attribute "useGeneratedName" on the @Indexed, @GeoSpatialIndex and @CompoundIndex annotations.
2014-04-28 19:58:56 +02:00
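
For illustration; the domain type and fields are made up:

@Document
@CompoundIndex(def = "{ 'firstname' : 1, 'lastname' : 1 }", useGeneratedName = true)
public class Person {

	@Indexed(useGeneratedName = true) // let MongoDB generate the index name
	String email;
}
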
Christoph Strobl
af8a53bef6 DATAMONGO-909 - Compound index on inherited class should be created correctly.
With the overhaul of the index creation done in DATAMONGO-899, the CompoundIndex annotation is no longer just looked up on the concrete type but rather on all its interfaces and super classes. So we just added an additional test to verify this behaviour.
2014-04-28 19:58:56 +02:00
Oliver Gierke
916b856e97 DATAMONGO-899 - Polished API of new index creation abstractions.
Removed the introduction of the IndexDefinition being collection-aware again. The collection an index is created in is now held in the IndexDefinitionHolder. This is mostly due to the fact that the IndexDefinition implementations can be used with MongoTemplate and the index operations take a collection alongside the index definition.

Made the IndexResolver API package protected so that we can further change it going forward. We should think about deprecating the collectionName attributes on index annotations, as it doesn't make much sense to manually configure the collection name for indexes when the collection is predefined through the domain type. This would allow us to remove the entire collection handling code inside the IndexResolver implementation.

Turned IndexDefinitionHolder into a value object.

Original pull request: #168.
2014-04-28 19:58:50 +02:00
Christoph Strobl
7848da63f2 DATAMONGO-899 - Ensure proper creation of index structures for nested entities.
Index creation did not consider the property's path when creating the index. This led to broken index creation when nesting entities that might require index structures.

From now on, index creation traverses the entity's property path for all top-level entities (namely those carrying the @Document annotation) and creates the index using the full property path.

This is a breaking change, as carrying the @Document annotation has not been required for entities until now.

Original Pull Request: #168
2014-04-28 16:50:26 +02:00
Christoph Strobl
f359a1d31a DATAMONGO-847 - Allow usage of Query within an Update clause.
In case we detect a Query within a value used in an Update, we map the query itself to build the expression to use. This allows forming query statements for e.g. $pull using the same API as for the query itself.

Update update = new Update().pull("list", query(where("value").in("foo", "bar")));

Original Pull Request: #172.
2014-04-28 13:30:13 +02:00
Thomas Darimont
72d645feae DATAMONGO-917 - Improve Spring 4.0 framework version detection to avoid NullPointerExceptions.
We now check for the presence of DefaultParameterNameDiscoverer in order to determine whether we are running with Spring 4.0 or later. This avoids potential NullPointerExceptions in cases where the package version information is not available, e.g. when the application was bundled into an uberjar via the maven-shade-plugin.

Original pull request: #173.
2014-04-28 13:21:09 +02:00
Thomas Darimont
72fe382bba DATAMONGO-913 - Improve DBRef handling for LazyLoadingProxies.
We now use the captured DBRef of a given LazyLoadingProxy in MappingMongoConverter.toDBRef(..) in order to avoid creating a new DBRef, which would fail for the proxy.

Original pull request: #174.
2014-04-28 13:07:57 +02:00
Thomas Darimont
d25e840cf5 DATAMONGO-914 - Improve resolving of lazy-loading proxies for classes that override equals(…)/hashCode().
We now properly resolve lazy-loading proxies for @DBRefs when an overridden equals or hashCode method is called with Spring 4. We fall back to our old Objenesis proxy generation in order to circumvent the default handling for overridden hashCode() and equals(…) methods in CglibAopProxies generated by Spring 4.

If we detect that we run with Spring 4 we use the repacked Objenesis that is included in Spring 4. Previously the generated proxy used some generic hashCode() or equals(…) logic that did not trigger a proper lazy loading in such cases.

Original pull request: #171.
2014-04-23 09:29:33 +02:00
Thomas Darimont
df1775572a DATAMONGO-912 - Consider custom conversions in all stages of an aggregation pipeline.
We now consider custom Mongo conversions in all stages of an aggregation pipeline. Previously we did this only for the first stage and returned objects basically unmapped in later stages. We now pass the root AggregationOperationContext on to nested ExposedFieldsAggregationOperationContexts so that those can delegate any Mongo mapping to the root context.

Original pull request: #170.
2014-04-23 09:00:56 +02:00
Christoph Strobl
86670cd49f DATAMONGO-893 - Converter must not write "_class" information for known types.
We now actively pass on property type information to MetadataBackedField to ensure type hints get picked up correctly when converting a value to the according DBObject.

This has to be done as the fix for DATAMONGO-812 enforced proper writing of _class information for Updates, which caused trouble when querying documents by nested (complex) properties using an 'in' clause.

Original pull request: #169.
2014-04-15 17:36:11 +02:00
Christoph Strobl
31a4bf906e DATAMONGO-892 - Reject nested MappingMongoConverter declarations in XML.
Mapping information is potentially required by multiple instances and thus must not be registered as a nested bean. We now actively check for such an invalid scenario and explicitly reject it.

Original pull request: #165.
2014-04-15 09:11:12 +02:00
Christoph Strobl
599291e8b7 DATAMONGO-897 - Fixed potential NullPointerException in QueryMapper.
If an association property points to an interface not containing the id property, QueryMapper threw a NullPointerException in isAssociationConversionNecessary(…), as the lookup of the id property failed.

We now check for the presence of an id property on the target type and check for assignability to indicate the need for conversion (usually the case when developers use raw ids in their update clauses instead of the actual target instance).

Original pull request: #164.
2014-04-15 09:11:00 +02:00
Thomas Darimont
f5a04fb9fb DATAMONGO-908 - Support for nested field references in group operations.
We now allow referring to nested field expressions if the root segment of the nested field expression was exposed in earlier stages of the aggregation pipeline.

Original pull request: #167.
2014-04-15 07:55:57 +02:00
Oliver Gierke
88558b67c3 DATAMONGO-866 - Polishing for new field naming strategy configuration support.
Use the camel case split logic from Spring Data Commons (introduced for DATACMNS-486) in a common CamelCaseSplittingFieldNamingStrategy super class.

MappingMongoConverterParser now also rejects the configuration if both abbreviate-field-names and field-naming-strategy-ref are configured.
2014-04-10 18:36:37 +02:00
Ryan Tenney
d52cb255e0 DATAMONGO-866 - Add means to configure field naming strategy. 2014-04-10 18:03:55 +02:00
Oliver Gierke
6c214cbc37 DATAMONGO-910 - Upgraded to latest MongoDB Java driver 2.12.0.
Removed special mongo-osgi property as the driver version is now a valid OSGi version number.
2014-04-10 16:55:22 +02:00
Jeff Yemin
01012c1448 DATAMONGO-895, DATAMONGO-896 - Assert compatibility with latest MongoDB Java driver.
Upgrade next MongoDB driver version to 2.12.0. Strong upgrade coming in a subsequent commit to make sure we can backport the compatibility checks to the bugfix branch without forcing users into a driver upgrade.

Relaxing error message comparison in assertion so that it still matches against the message returned by MongoDB 2.6. When comparing the value of the version field, compare against a Long rather than an Integer, since the version field generated is a Long. This allows the test to pass against the upcoming 2.12.0 release of the Java driver, which has a stricter implementation of BasicDBObject.equals(…).

Original pull requests: #159, #160.
2014-04-10 15:57:45 +02:00
Christoph Strobl
4c7befb910 DATAMONGO-888 - Sorting now considers mapping information.
We now pipe the DBObject containing sorting information for queries through the QueryMapper to make sure potential field mappings are applied.

Original Pull Request: #162.
2014-04-10 15:43:51 +02:00
Christoph Strobl
b62669ec8f DATAMONGO-907 - Assert compatibility with mongodb 2.6.
Fixed the test to only check the parts of the expected error message that are common to both 2.4 and 2.6.

Original Pull Request: #166.
2014-04-10 13:33:34 +02:00
Oliver Gierke
0fb74caf9b DATAMONGO-905 - Removed obsolete dependency to CGLib from cross-store support.
Also, we now optionally depend on the Hibernate JPA API JAR so that using other persistence providers doesn't cause an API JAR duplication.
2014-04-09 12:12:29 +02:00
Oliver Gierke
9623dac01f DATAMONGO-901 - Fixed regression of not registering type predicting post processor.
The changes for DATAMONGO-843 introduced a regression by skipping the registration of the RepositoryInterfaceAwareBeanPostProcessor. This can cause the wiring of repository bean definitions to fail depending on the order in which the bean definitions get instantiated.

This change reintroduces the registration and adds an explicit test case for it.
2014-04-02 17:58:36 +02:00
Spring Buildmaster
695b27968c DATAMONGO-859 - Prepare next development iteration. 2014-03-31 17:10:40 +02:00
205 changed files with 15025 additions and 7698 deletions


@@ -11,7 +11,7 @@ For a comprehensive treatment of all the Spring Data MongoDB features, please re
* the [User Guide](http://docs.spring.io/spring-data/mongodb/docs/current/reference/html/)
* the [JavaDocs](http://docs.spring.io/spring-data/mongodb/docs/current/api/) have extensive comments in them as well.
* the home page of [Spring Data MongoDB](http://projects.spring.io/spring-data-mongodb) contains links to articles and other resources.
* for more detailed questions, use the [forum](http://forum.spring.io/forum/spring-projects/data/nosql).
* for more detailed questions, use [Spring Data Mongodb on Stackoverflow](http://stackoverflow.com/questions/tagged/spring-data-mongodb).
If you are new to Spring as well as to Spring Data, look for information about [Spring projects](http://projects.spring.io/).
@@ -26,7 +26,7 @@ Add the Maven dependency:
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.4.1.RELEASE</version>
<version>1.5.0.RELEASE</version>
</dependency>
```
@@ -36,7 +36,7 @@ If you'd rather like the latest snapshots of the upcoming major version, use our
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.5.0.BUILD-SNAPSHOT</version>
<version>1.6.0.BUILD-SNAPSHOT</version>
</dependency>
<repository>
@@ -139,7 +139,7 @@ public class MyService {
Here are some ways for you to get involved in the community:
* Get involved with the Spring community on the Spring Community Forums. Please help out on the [forum](http://forum.spring.io/forum/spring-projects/data/nosql) by responding to questions and joining the debate.
* Get involved with the Spring community on Stackoverflow and help out on the [spring-data-mongodb](http://stackoverflow.com/questions/tagged/spring-data-mongodb) tag by responding to questions and joining the debate.
* Create [JIRA](https://jira.springframework.org/browse/DATADOC) tickets for bugs and new features and comment and vote on the ones that you are interested in.
* Github is for social coding: if you want to write code, we encourage contributions through pull requests from [forks of this repository](http://help.github.com/forking/). If you want to contribute code this way, please reference a JIRA ticket as well covering the specific issue you are addressing.
* Watch for upcoming articles on Spring by [subscribing](http://spring.io/blog) to spring.io.

pom.xml

@@ -2,10 +2,10 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.5.0.M1</version>
<version>1.6.0.RC1</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>1.4.0.M1</version>
<version>1.5.0.RC1</version>
<relativePath>../spring-data-build/parent/pom.xml</relativePath>
</parent>
@@ -29,11 +29,11 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.8.0.M1</springdata.commons>
<mongo>2.11.4</mongo>
<mongo-osgi>${mongo}</mongo-osgi>
<springdata.commons>1.9.0.RC1</springdata.commons>
<mongo>2.12.3</mongo>
<mongo.osgi>2.12.3</mongo.osgi>
</properties>
<developers>
<developer>
<id>ogierke</id>
@@ -105,11 +105,35 @@
<profiles>
<profile>
<id>mongo-next</id>
<properties>
<mongo>2.12.0-rc0</mongo>
<mongo-osgi>2.12.0</mongo-osgi>
<mongo>2.12.5-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>mongo-3-next</id>
<properties>
<mongo>3.0.0-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
</profiles>
@@ -121,14 +145,14 @@
<version>${mongo}</version>
</dependency>
</dependencies>
<repositories>
<repository>
<id>spring-libs-milestone</id>
<url>http://repo.spring.io/libs-milestone/</url>
<url>http://repo.spring.io/libs-milestone</url>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
<id>spring-plugins-release</id>


@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.5.0.M1</version>
<version>1.6.0.RC1</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -48,7 +48,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.5.0.M1</version>
<version>1.6.0.RC1</version>
</dependency>
<dependency>
@@ -56,17 +56,13 @@
<artifactId>aspectjrt</artifactId>
<version>${aspectj}</version>
</dependency>
<dependency>
<groupId>cglib</groupId>
<artifactId>cglib</artifactId>
<version>2.2</version>
</dependency>
<!-- JPA -->
<dependency>
<groupId>org.hibernate.javax.persistence</groupId>
<artifactId>hibernate-jpa-2.0-api</artifactId>
<version>${jpa}</version>
<optional>true</optional>
</dependency>
<!-- For Tests -->
@@ -102,7 +98,7 @@
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
<version>1.4</version>
<version>1.6</version>
<dependencies>
<dependency>
<groupId>org.aspectj</groupId>
@@ -130,7 +126,8 @@
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
</aspectLibrary>
</aspectLibraries>
</aspectLibraries>
<complianceLevel>${source.level}</complianceLevel>
<source>${source.level}</source>
<target>${source.level}</target>
</configuration>


@@ -2,12 +2,12 @@ Bundle-SymbolicName: org.springframework.data.mongodb.crossstore
Bundle-Name: Spring Data MongoDB Cross Store Support
Bundle-Vendor: Pivotal Software, Inc.
Bundle-ManifestVersion: 2
Import-Package:
Import-Package:
sun.reflect;version="0";resolution:=optional
Export-Template:
org.springframework.data.mongodb.crossstore.*;version="${project.version}"
Import-Template:
com.mongodb.*;version="${mongo-osgi:[=.=.=,+1.0.0)}",
Import-Template:
com.mongodb.*;version="${mongo.osgi:[=.=.=,+1.0.0)}",
javax.persistence.*;version="${jpa:[=.=.=,+1.0.0)}",
org.aspectj.*;version="${aspectj:[1.0.0, 2.0.0)}",
org.bson.*;version="0",


@@ -2,7 +2,7 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<artifactId>spring-data-mongodb-distribution</artifactId>
<packaging>pom</packaging>
@@ -13,10 +13,10 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.5.0.M1</version>
<version>1.6.0.RC1</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<project.root>${basedir}/..</project.root>
<dist.key>SDMONGO</dist.key>
@@ -32,6 +32,10 @@
<groupId>org.codehaus.mojo</groupId>
<artifactId>wagon-maven-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.asciidoctor</groupId>
<artifactId>asciidoctor-maven-plugin</artifactId>
</plugin>
</plugins>
</build>


@@ -5,7 +5,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.5.0.M1</version>
<version>1.6.0.RC1</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -5,5 +5,5 @@ Bundle-ManifestVersion: 2
Import-Package:
sun.reflect;version="0";resolution:=optional
Import-Template:
com.mongodb.*;version="${mongo-osgi:[=.=.=,+1.0.0)}",
com.mongodb.*;version="${mongo.osgi:[=.=.=,+1.0.0)}",
org.apache.log4j.*;version="${log4j:[=.=.=,+1.0.0)}"


@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<context version="7.1.9.205">
<context version="7.1.10.209">
<scope type="Project" name="spring-data-mongodb">
<element type="TypeFilterReferenceOverridden" name="Filter">
<element type="IncludeTypePattern" name="org.springframework.data.mongodb.**"/>
@@ -32,6 +32,7 @@
<element type="IncludeTypePattern" name="**.config.**"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Implementation" type="AllowedDependency"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Config" type="AllowedDependency"/>


@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.5.0.M1</version>
<version>1.6.0.RC1</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -28,6 +28,9 @@ import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
@@ -35,7 +38,6 @@ import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.support.CachingIsNewStrategyFactory;
@@ -51,6 +53,7 @@ import com.mongodb.Mongo;
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Ryan Tenney
*/
@Configuration
public abstract class AbstractMongoConfiguration {
@@ -144,10 +147,7 @@ public abstract class AbstractMongoConfiguration {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(getInitialEntitySet());
mappingContext.setSimpleTypeHolder(customConversions().getSimpleTypeHolder());
if (abbreviateFieldNames()) {
mappingContext.setFieldNamingStrategy(new CamelCaseAbbreviatingFieldNamingStrategy());
}
mappingContext.setFieldNamingStrategy(fieldNamingStrategy());
return mappingContext;
}
@@ -232,4 +232,15 @@ public abstract class AbstractMongoConfiguration {
protected boolean abbreviateFieldNames() {
return false;
}
/**
* Configures a {@link FieldNamingStrategy} on the {@link MongoMappingContext} instance created.
*
* @return
* @since 1.5
*/
protected FieldNamingStrategy fieldNamingStrategy() {
return abbreviateFieldNames() ? new CamelCaseAbbreviatingFieldNamingStrategy()
: PropertyNameFieldNamingStrategy.INSTANCE;
}
}


@@ -30,6 +30,7 @@ import org.springframework.beans.factory.config.BeanDefinitionHolder;
import org.springframework.beans.factory.config.RuntimeBeanReference;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.parsing.ReaderContext;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
@@ -51,10 +52,10 @@ import org.springframework.core.type.filter.TypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.event.ValidatingMongoEventListener;
@@ -71,6 +72,7 @@ import org.w3c.dom.Element;
* @author Oliver Gierke
* @author Maciej Walkowiak
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class MappingMongoConverterParser implements BeanDefinitionParser {
@@ -83,8 +85,11 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
*/
public BeanDefinition parse(Element element, ParserContext parserContext) {
BeanDefinitionRegistry registry = parserContext.getRegistry();
if (parserContext.isNested()) {
parserContext.getReaderContext().error("Mongo Converter must not be defined as nested bean.", element);
}
BeanDefinitionRegistry registry = parserContext.getRegistry();
String id = element.getAttribute(AbstractBeanDefinitionParser.ID_ATTRIBUTE);
id = StringUtils.hasText(id) ? id : DEFAULT_CONVERTER_BEAN_NAME;
@@ -209,11 +214,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
mappingContextBuilder.addPropertyValue("simpleTypeHolder", simpleTypesDefinition);
}
String abbreviateFieldNames = element.getAttribute("abbreviate-field-names");
if ("true".equals(abbreviateFieldNames)) {
mappingContextBuilder.addPropertyValue("fieldNamingStrategy", new RootBeanDefinition(
CamelCaseAbbreviatingFieldNamingStrategy.class));
}
parseFieldNamingStrategy(element, parserContext.getReaderContext(), mappingContextBuilder);
ctxRef = converterId == null || DEFAULT_CONVERTER_BEAN_NAME.equals(converterId) ? MAPPING_CONTEXT_BEAN_NAME
: converterId + "." + MAPPING_CONTEXT_BEAN_NAME;
@@ -222,6 +223,34 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
return ctxRef;
}
private static void parseFieldNamingStrategy(Element element, ReaderContext context, BeanDefinitionBuilder builder) {
String abbreviateFieldNames = element.getAttribute("abbreviate-field-names");
String fieldNamingStrategy = element.getAttribute("field-naming-strategy-ref");
boolean fieldNamingStrategyReferenced = StringUtils.hasText(fieldNamingStrategy);
boolean abbreviationActivated = StringUtils.hasText(abbreviateFieldNames)
&& Boolean.parseBoolean(abbreviateFieldNames);
if (fieldNamingStrategyReferenced && abbreviationActivated) {
context.error("Field name abbreviation cannot be activated if a field-naming-strategy-ref is configured!",
element);
return;
}
Object value = null;
if ("true".equals(abbreviateFieldNames)) {
value = new RootBeanDefinition(CamelCaseAbbreviatingFieldNamingStrategy.class);
} else if (fieldNamingStrategyReferenced) {
value = new RuntimeBeanReference(fieldNamingStrategy);
}
if (value != null) {
builder.addPropertyValue("fieldNamingStrategy", value);
}
}
private BeanDefinition getCustomConversions(Element element, ParserContext parserContext) {
List<Element> customConvertersElements = DomUtils.getChildElementsByTagName(element, "custom-converters");


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,6 +18,8 @@ package org.springframework.data.mongodb.core;
import static org.springframework.data.domain.Sort.Direction.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.List;
import org.springframework.dao.DataAccessException;
@@ -36,11 +38,13 @@ import com.mongodb.MongoException;
* @author Mark Pollack
* @author Oliver Gierke
* @author Komi Innocent
* @author Christoph Strobl
*/
public class DefaultIndexOperations implements IndexOperations {
private static final Double ONE = Double.valueOf(1);
private static final Double MINUS_ONE = Double.valueOf(-1);
private static final Collection<String> TWO_D_IDENTIFIERS = Arrays.asList("2d", "2dsphere");
private final MongoOperations mongoOperations;
private final String collectionName;
@@ -140,8 +144,15 @@ public class DefaultIndexOperations implements IndexOperations {
Object value = keyDbObject.get(key);
if ("2d".equals(value)) {
if (TWO_D_IDENTIFIERS.contains(value)) {
indexFields.add(IndexField.geo(key));
} else if ("text".equals(value)) {
DBObject weights = (DBObject) ix.get("weights");
for (String fieldName : weights.keySet()) {
indexFields.add(IndexField.text(fieldName, Float.valueOf(weights.get(fieldName).toString())));
}
} else {
Double keyValue = new Double(value.toString());
@@ -159,8 +170,8 @@ public class DefaultIndexOperations implements IndexOperations {
boolean unique = ix.containsField("unique") ? (Boolean) ix.get("unique") : false;
boolean dropDuplicates = ix.containsField("dropDups") ? (Boolean) ix.get("dropDups") : false;
boolean sparse = ix.containsField("sparse") ? (Boolean) ix.get("sparse") : false;
indexInfoList.add(new IndexInfo(indexFields, name, unique, dropDuplicates, sparse));
String language = ix.containsField("default_language") ? (String) ix.get("default_language") : "";
indexInfoList.add(new IndexInfo(indexFields, name, unique, dropDuplicates, sparse, language));
}
return indexInfoList;


@@ -23,11 +23,15 @@ import org.springframework.dao.InvalidDataAccessResourceUsageException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.UncategorizedMongoDbException;
import com.mongodb.MongoCursorNotFoundException;
import com.mongodb.MongoException;
import com.mongodb.MongoException.CursorNotFound;
import com.mongodb.MongoException.DuplicateKey;
import com.mongodb.MongoException.Network;
import com.mongodb.MongoInternalException;
import com.mongodb.MongoServerSelectionException;
import com.mongodb.MongoSocketException;
import com.mongodb.MongoTimeoutException;
/**
* Simple {@link PersistenceExceptionTranslator} for Mongo. Convert the given runtime exception to an appropriate
@@ -47,21 +51,23 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
// Check for well-known MongoException subclasses.
// All other MongoExceptions
if (ex instanceof DuplicateKey) {
if (ex instanceof DuplicateKey || ex instanceof DuplicateKeyException) {
return new DuplicateKeyException(ex.getMessage(), ex);
}
if (ex instanceof Network) {
if (ex instanceof Network || ex instanceof MongoSocketException) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
if (ex instanceof CursorNotFound) {
if (ex instanceof CursorNotFound || ex instanceof MongoCursorNotFoundException) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
// Driver 2.12 throws this to indicate connection problems. String comparison to avoid hard dependency
if (ex.getClass().getName().equals("com.mongodb.MongoServerSelectionException")) {
if (ex instanceof MongoServerSelectionException) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
if (ex instanceof MongoTimeoutException) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
@@ -69,6 +75,7 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
return new InvalidDataAccessResourceUsageException(ex.getMessage(), ex);
}
// All other MongoExceptions
if (ex instanceof MongoException) {
int code = ((MongoException) ex).getCode();


@@ -19,11 +19,11 @@ import java.util.Collection;
import java.util.List;
import java.util.Set;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
import org.springframework.data.mongodb.core.mapreduce.GroupByResults;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
@@ -52,7 +52,6 @@ import com.mongodb.WriteResult;
* @author Christoph Strobl
* @author Thomas Darimont
*/
@SuppressWarnings("deprecation")
public interface MongoOperations {
/**


@@ -53,6 +53,7 @@ import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.convert.EntityReader;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.GeoResult;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.geo.Metric;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.BeanWrapper;
@@ -71,7 +72,6 @@ import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.index.MongoMappingEventPublisher;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
@@ -115,6 +115,7 @@ import com.mongodb.WriteConcern;
import com.mongodb.WriteResult;
import com.mongodb.util.JSON;
import com.mongodb.util.JSONParseException;
/**
* Primary implementation of {@link MongoOperations}.
*
@@ -354,7 +355,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public void executeQuery(Query query, String collectionName, DocumentCallbackHandler dch) {
executeQuery(query, collectionName, dch, new QueryCursorPreparer(query));
executeQuery(query, collectionName, dch, new QueryCursorPreparer(query, null));
}
/**
@@ -532,7 +533,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
return doFind(collectionName, query.getQueryObject(), query.getFieldsObject(), entityClass,
new QueryCursorPreparer(query));
new QueryCursorPreparer(query, entityClass));
}
public <T> T findById(Object id, Class<T> entityClass) {
@@ -614,8 +615,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
public <T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass,
String collectionName) {
return doFindAndModify(collectionName, query.getQueryObject(), query.getFieldsObject(), query.getSortObject(),
entityClass, update, options);
return doFindAndModify(collectionName, query.getQueryObject(), query.getFieldsObject(),
getMappedSortObject(query, entityClass), entityClass, update, options);
}
// Find methods that take a Query to express the query and that return a single object that is also removed from the
@@ -626,8 +627,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public <T> T findAndRemove(Query query, Class<T> entityClass, String collectionName) {
return doFindAndRemove(collectionName, query.getQueryObject(), query.getFieldsObject(), query.getSortObject(),
entityClass);
return doFindAndRemove(collectionName, query.getQueryObject(), query.getFieldsObject(),
getMappedSortObject(query, entityClass), entityClass);
}
public long count(Query query, Class<?> entityClass) {
@@ -1068,17 +1070,22 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
/**
* Returns {@link Entry} containing the {@link MongoPersistentProperty} defining the {@literal id} as
* {@link Entry#getKey()} and the {@link Id}s property value as its {@link Entry#getValue()}.
* Returns {@link Entry} containing the field name of the id property as {@link Entry#getKey()} and the {@link Id}s
* property value as its {@link Entry#getValue()}.
*
* @param object
* @return
*/
private Map.Entry<MongoPersistentProperty, Object> extractIdPropertyAndValue(Object object) {
private Entry<String, Object> extractIdPropertyAndValue(Object object) {
Assert.notNull(object, "Id cannot be extracted from 'null'.");
Class<?> objectType = object.getClass();
if (object instanceof DBObject) {
return Collections.singletonMap(ID_FIELD, ((DBObject) object).get(ID_FIELD)).entrySet().iterator().next();
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(objectType);
MongoPersistentProperty idProp = entity == null ? null : entity.getIdProperty();
@@ -1088,7 +1095,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
Object idValue = BeanWrapper.create(object, mongoConverter.getConversionService())
.getProperty(idProp, Object.class);
return Collections.singletonMap(idProp, idValue).entrySet().iterator().next();
return Collections.singletonMap(idProp.getFieldName(), idValue).entrySet().iterator().next();
}
/**
@@ -1099,8 +1106,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
*/
private Query getIdQueryFor(Object object) {
Map.Entry<MongoPersistentProperty, Object> id = extractIdPropertyAndValue(object);
return new Query(where(id.getKey().getFieldName()).is(id.getValue()));
Entry<String, Object> id = extractIdPropertyAndValue(object);
return new Query(where(id.getKey()).is(id.getValue()));
}
/**
@@ -1114,7 +1121,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
Assert.notEmpty(objects, "Cannot create Query for empty collection.");
Iterator<?> it = objects.iterator();
Map.Entry<MongoPersistentProperty, Object> firstEntry = extractIdPropertyAndValue(it.next());
Entry<String, Object> firstEntry = extractIdPropertyAndValue(it.next());
ArrayList<Object> ids = new ArrayList<Object>(objects.size());
ids.add(firstEntry.getValue());
@@ -1123,7 +1130,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
ids.add(extractIdPropertyAndValue(it.next()).getValue());
}
return new Query(where(firstEntry.getKey().getFieldName()).in(ids));
return new Query(where(firstEntry.getKey()).in(ids));
}
private void assertUpdateableIdIfNotSet(Object entity) {
@@ -1411,17 +1418,32 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
CommandResult commandResult = executeCommand(command);
handleCommandError(commandResult, command);
// map results
return new AggregationResults<O>(returnPotentiallyMappedResults(outputType, commandResult), commandResult);
}
/**
* Returns the potentially mapped results of the given {@code commandResult} if it contained any.
*
* @param outputType
* @param commandResult
* @return
*/
private <O> List<O> returnPotentiallyMappedResults(Class<O> outputType, CommandResult commandResult) {
@SuppressWarnings("unchecked")
Iterable<DBObject> resultSet = (Iterable<DBObject>) commandResult.get("result");
List<O> mappedResults = new ArrayList<O>();
if (resultSet == null) {
return Collections.emptyList();
}
DbObjectCallback<O> callback = new UnwrapAndReadDbObjectCallback<O>(mongoConverter, outputType);
List<O> mappedResults = new ArrayList<O>();
for (DBObject dbObject : resultSet) {
mappedResults.add(callback.doWith(dbObject));
}
return new AggregationResults<O>(mappedResults, commandResult);
return mappedResults;
}
protected String replaceWithResourceIfNecessary(String function) {
@@ -1453,13 +1475,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
"Can not use skip or field specification with map reduce operations");
}
if (query.getQueryObject() != null) {
copyMapReduceOptions.put("query", query.getQueryObject());
copyMapReduceOptions.put("query", queryMapper.getMappedObject(query.getQueryObject(), null));
}
if (query.getLimit() > 0) {
copyMapReduceOptions.put("limit", query.getLimit());
}
if (query.getSortObject() != null) {
copyMapReduceOptions.put("sort", query.getSortObject());
copyMapReduceOptions.put("sort", queryMapper.getMappedObject(query.getSortObject(), null));
}
}
return copyMapReduceOptions;
@@ -1594,7 +1616,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
CursorPreparer preparer, DbObjectCallback<T> objectCallback) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
DBObject mappedFields = fields == null ? null : queryMapper.getMappedObject(fields, entity);
DBObject mappedFields = queryMapper.getMappedFields(fields, entity);
DBObject mappedQuery = queryMapper.getMappedObject(query, entity);
if (LOGGER.isDebugEnabled()) {
@@ -1941,6 +1964,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return converter;
}
private DBObject getMappedSortObject(Query query, Class<?> type) {
if (query == null || query.getSortObject() == null) {
return null;
}
return queryMapper.getMappedSort(query.getSortObject(), mappingContext.getPersistentEntity(type));
}
// Callback implementations
/**
@@ -1998,7 +2030,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public DBCursor doInCollection(DBCollection collection) throws MongoException, DataAccessException {
if (fields == null) {
if (fields == null || fields.toMap().isEmpty()) {
return collection.find(query);
} else {
return collection.find(query, fields);
@@ -2134,9 +2167,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
class QueryCursorPreparer implements CursorPreparer {
private final Query query;
private final Class<?> type;
public QueryCursorPreparer(Query query, Class<?> type) {
public QueryCursorPreparer(Query query) {
this.query = query;
this.type = type;
}
/*
@@ -2150,11 +2186,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
if (query.getSkip() <= 0 && query.getLimit() <= 0 && query.getSortObject() == null
&& !StringUtils.hasText(query.getHint())) {
&& !StringUtils.hasText(query.getHint()) && !query.getMeta().hasValues()) {
return cursor;
}
DBCursor cursorToUse = cursor;
DBCursor cursorToUse = cursor.copy();
try {
if (query.getSkip() > 0) {
@@ -2164,11 +2200,18 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
cursorToUse = cursorToUse.limit(query.getLimit());
}
if (query.getSortObject() != null) {
cursorToUse = cursorToUse.sort(query.getSortObject());
DBObject sortDbo = type != null ? getMappedSortObject(query, type) : query.getSortObject();
cursorToUse = cursorToUse.sort(sortDbo);
}
if (StringUtils.hasText(query.getHint())) {
cursorToUse = cursorToUse.hint(query.getHint());
}
if (query.getMeta().hasValues()) {
for (Entry<String, Object> entry : query.getMeta().values()) {
cursorToUse = cursorToUse.addSpecial(entry.getKey(), entry.getValue());
}
}
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e);
}

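Two of the changes above are directly visible to callers: sort documents now run through the QueryMapper (so @Field aliases apply to sorting as well as to criteria), and Meta values on a Query are handed to the cursor via addSpecial. A hedged sketch, with Person and its @Field("lastname") mapping invented for illustration:

import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

class MappedSortSketch {

	static class Person {
		@Id String id;
		@Field("lastname") String lastName;
	}

	List<Person> findSorted(MongoOperations template) {

		Query query = new Query(Criteria.where("lastName").is("Matthews")) //
				.with(new Sort(Sort.Direction.ASC, "lastName"));

		// getMappedSortObject(...) rewrites the sort key to the mapped field name
		// ("lastname") before QueryCursorPreparer applies it to the cursor.
		return template.find(query, Person.class);
	}
}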
View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -44,9 +44,23 @@ import com.mongodb.DBObject;
*/
public class Aggregation {
public static final AggregationOperationContext DEFAULT_CONTEXT = new NoOpAggregationOperationContext();
/**
* References the root document, i.e. the top-level document, currently being processed in the aggregation pipeline
* stage.
*/
public static final String ROOT = SystemVariable.ROOT.toString();
private final List<AggregationOperation> operations;
/**
* References the start of the field path being processed in the aggregation pipeline stage. Unless documented
* otherwise, all stages start with CURRENT the same as ROOT.
*/
public static final String CURRENT = SystemVariable.CURRENT.toString();
public static final AggregationOperationContext DEFAULT_CONTEXT = new NoOpAggregationOperationContext();
public static final AggregationOptions DEFAULT_OPTIONS = newAggregationOptions().build();
protected final List<AggregationOperation> operations;
private final AggregationOptions options;
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
@@ -66,6 +80,20 @@ public class Aggregation {
return new Aggregation(operations);
}
/**
* Returns a copy of this {@link Aggregation} with the given {@link AggregationOptions} set. Note that options are
* supported in MongoDB version 2.6+.
*
* @param options must not be {@literal null}.
* @return
* @since 1.6
*/
public Aggregation withOptions(AggregationOptions options) {
Assert.notNull(options, "AggregationOptions must not be null.");
return new Aggregation(this.operations, options);
}
/**
* Creates a new {@link TypedAggregation} for the given type and {@link AggregationOperation}s.
*
@@ -92,11 +120,43 @@ public class Aggregation {
* @param aggregationOperations must not be {@literal null} or empty.
*/
protected Aggregation(AggregationOperation... aggregationOperations) {
this(asAggregationList(aggregationOperations));
}
/**
* @param aggregationOperations must not be {@literal null} or empty.
* @return
*/
protected static List<AggregationOperation> asAggregationList(AggregationOperation... aggregationOperations) {
Assert.notEmpty(aggregationOperations, "AggregationOperations must not be null or empty!");
return Arrays.asList(aggregationOperations);
}
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
* @param aggregationOperations must not be {@literal null} or empty.
*/
protected Aggregation(List<AggregationOperation> aggregationOperations) {
this(aggregationOperations, DEFAULT_OPTIONS);
}
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
* @param aggregationOperations must not be {@literal null} or empty.
* @param options must not be {@literal null} or empty.
*/
protected Aggregation(List<AggregationOperation> aggregationOperations, AggregationOptions options) {
Assert.notNull(aggregationOperations, "AggregationOperations must not be null!");
Assert.isTrue(aggregationOperations.length > 0, "At least one AggregationOperation has to be provided");
Assert.isTrue(aggregationOperations.size() > 0, "At least one AggregationOperation has to be provided");
Assert.notNull(options, "AggregationOptions must not be null!");
this.operations = Arrays.asList(aggregationOperations);
this.operations = aggregationOperations;
this.options = options;
}
/**
@@ -231,6 +291,16 @@ public class Aggregation {
return Fields.from(field(name, target));
}
/**
* Returns a new {@link AggregationOptions.Builder}.
*
* @return
* @since 1.6
*/
public static AggregationOptions.Builder newAggregationOptions() {
return new AggregationOptions.Builder();
}
/**
* Converts this {@link Aggregation} specification to a {@link DBObject}.
*
@@ -248,13 +318,15 @@ public class Aggregation {
if (operation instanceof FieldsExposingAggregationOperation) {
FieldsExposingAggregationOperation exposedFieldsOperation = (FieldsExposingAggregationOperation) operation;
context = new ExposedFieldsAggregationOperationContext(exposedFieldsOperation.getFields());
context = new ExposedFieldsAggregationOperationContext(exposedFieldsOperation.getFields(), rootContext);
}
}
DBObject command = new BasicDBObject("aggregate", inputCollectionName);
command.put("pipeline", operationDocuments);
command = options.applyAndReturnPotentiallyChangedCommand(command);
return command;
}
@@ -302,4 +374,51 @@ public class Aggregation {
return new FieldReference(new ExposedField(new AggregationField(name), true));
}
}
/**
* Describes the system variables available in MongoDB aggregation framework pipeline expressions.
*
* @author Thomas Darimont
* @see http://docs.mongodb.org/manual/reference/aggregation-variables
*/
enum SystemVariable {
ROOT, CURRENT;
private static final String PREFIX = "$$";
/**
* Return {@literal true} if the given {@code fieldRef} denotes a well-known system variable, {@literal false}
* otherwise.
*
* @param fieldRef may be {@literal null}.
* @return
*/
public static boolean isReferingToSystemVariable(String fieldRef) {
if (fieldRef == null || !fieldRef.startsWith(PREFIX) || fieldRef.length() <= 2) {
return false;
}
int indexOfFirstDot = fieldRef.indexOf('.');
String candidate = fieldRef.substring(2, indexOfFirstDot == -1 ? fieldRef.length() : indexOfFirstDot);
for (SystemVariable value : values()) {
if (value.name().equals(candidate)) {
return true;
}
}
return false;
}
/*
* (non-Javadoc)
* @see java.lang.Enum#toString()
*/
@Override
public String toString() {
return PREFIX.concat(name());
}
}
}

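A hedged sketch of the new pieces in this class: the $$ROOT system variable used as a group value and withOptions(...) for MongoDB 2.6+ servers. BooksPerAuthor and the "books" collection are invented for illustration:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;

class SystemVariableSketch {

	static class BooksPerAuthor {
		String id;
		Object books;
	}

	AggregationResults<BooksPerAuthor> booksPerAuthor(MongoOperations template) {

		Aggregation aggregation = newAggregation( //
				group("author").push(Aggregation.ROOT).as("books")) //
				.withOptions(newAggregationOptions().allowDiskUse(true).build());

		// SystemVariable.isReferingToSystemVariable(...) keeps "$$ROOT" from being rewritten
		// into an ordinary field reference when the pipeline is rendered.
		return template.aggregate(aggregation, "books", BooksPerAuthor.class);
	}
}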
View File

@@ -0,0 +1,189 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Holds a set of configurable aggregation options that can be used within an aggregation pipeline. A list of supported
* aggregation options can be found in the MongoDB reference documentation
* http://docs.mongodb.org/manual/reference/command/aggregate/#aggregate
*
* @author Thomas Darimont
* @author Oliver Gierke
* @see Aggregation#withOptions(AggregationOptions)
* @see TypedAggregation#withOptions(AggregationOptions)
* @since 1.6
*/
public class AggregationOptions {
private static final String CURSOR = "cursor";
private static final String EXPLAIN = "explain";
private static final String ALLOW_DISK_USE = "allowDiskUse";
private final boolean allowDiskUse;
private final boolean explain;
private final DBObject cursor;
/**
* Creates a new {@link AggregationOptions}.
*
* @param allowDiskUse whether to off-load intensive sort-operations to disk.
* @param explain whether to get the execution plan for the aggregation instead of the actual results.
* @param cursor can be {@literal null}, used to pass additional options to the aggregation.
*/
public AggregationOptions(boolean allowDiskUse, boolean explain, DBObject cursor) {
this.allowDiskUse = allowDiskUse;
this.explain = explain;
this.cursor = cursor;
}
/**
* Enables writing to temporary files. When set to true, aggregation stages can write data to the _tmp subdirectory in
* the dbPath directory.
*
* @return
*/
public boolean isAllowDiskUse() {
return allowDiskUse;
}
/**
* Specifies whether to return information on the processing of the pipeline.
*
* @return
*/
public boolean isExplain() {
return explain;
}
/**
* Specify a document that contains options that control the creation of the cursor object.
*
* @return
*/
public DBObject getCursor() {
return cursor;
}
/**
* Returns a new potentially adjusted copy for the given {@code aggregationCommandObject} with the configuration
* applied.
*
* @param command the aggregation command.
* @return
*/
DBObject applyAndReturnPotentiallyChangedCommand(DBObject command) {
DBObject result = new BasicDBObject(command.toMap());
if (allowDiskUse && !result.containsField(ALLOW_DISK_USE)) {
result.put(ALLOW_DISK_USE, allowDiskUse);
}
if (explain && !result.containsField(EXPLAIN)) {
result.put(EXPLAIN, explain);
}
if (cursor != null && !result.containsField(CURSOR)) {
result.put("cursor", cursor);
}
return result;
}
/**
* Returns a {@link DBObject} representation of this {@link AggregationOptions}.
*
* @return
*/
public DBObject toDbObject() {
DBObject dbo = new BasicDBObject();
dbo.put(ALLOW_DISK_USE, allowDiskUse);
dbo.put(EXPLAIN, explain);
dbo.put(CURSOR, cursor);
return dbo;
}
/* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return toDbObject().toString();
}
/**
* A Builder for {@link AggregationOptions}.
*
* @author Thomas Darimont
*/
public static class Builder {
private boolean allowDiskUse;
private boolean explain;
private DBObject cursor;
/**
* Defines whether to off-load intensive sort-operations to disk.
*
* @param allowDiskUse
* @return
*/
public Builder allowDiskUse(boolean allowDiskUse) {
this.allowDiskUse = allowDiskUse;
return this;
}
/**
* Defines whether to get the execution plan for the aggregation instead of the actual results.
*
* @param explain
* @return
*/
public Builder explain(boolean explain) {
this.explain = explain;
return this;
}
/**
* Additional options to the aggregation.
*
* @param cursor
* @return
*/
public Builder cursor(DBObject cursor) {
this.cursor = cursor;
return this;
}
/**
* Returns a new {@link AggregationOptions} instance with the given configuration.
*
* @return
*/
public AggregationOptions build() {
return new AggregationOptions(allowDiskUse, explain, cursor);
}
}
}

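A short sketch of the Builder; the batchSize cursor document is just one example of what the cursor option may carry:

import org.springframework.data.mongodb.core.aggregation.AggregationOptions;

import com.mongodb.BasicDBObject;

class AggregationOptionsSketch {

	AggregationOptions options() {
		return new AggregationOptions.Builder() //
				.allowDiskUse(true) //
				.explain(false) //
				.cursor(new BasicDBObject("batchSize", 100)) //
				.build();
		// toDbObject() then renders roughly to
		// { "allowDiskUse" : true , "explain" : false , "cursor" : { "batchSize" : 100 } }
	}
}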
View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -28,6 +28,7 @@ import com.mongodb.DBObject;
*
* @author Tobias Trelle
* @author Oliver Gierke
* @author Thomas Darimont
* @param <T> The class in which the results are mapped onto.
* @since 1.3
*/
@@ -90,6 +91,16 @@ public class AggregationResults<T> implements Iterable<T> {
return serverUsed;
}
/**
* Returns the raw result that was returned by the server.
*
* @return
* @since 1.6
*/
public DBObject getRawResults() {
return rawResults;
}
private String parseServerUsed() {
Object object = rawResults.get("serverUsed");

View File

@@ -268,14 +268,21 @@ public final class ExposedFields implements Iterable<ExposedField> {
return field.isAliased();
}
/**
* @return whether the field is synthetic.
*/
public boolean isSynthetic() {
return synthetic;
}
/**
* Returns whether the field can be referred to using the given name.
*
* @param input
* @param name
* @return
*/
public boolean canBeReferredToBy(String input) {
return getTarget().equals(input);
public boolean canBeReferredToBy(String name) {
return getName().equals(name) || getTarget().equals(name);
}
/*
@@ -340,6 +347,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
public FieldReference(ExposedField field) {
Assert.notNull(field, "ExposedField must not be null!");
this.field = field;
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -32,16 +32,22 @@ import com.mongodb.DBObject;
class ExposedFieldsAggregationOperationContext implements AggregationOperationContext {
private final ExposedFields exposedFields;
private final AggregationOperationContext rootContext;
/**
* Creates a new {@link ExposedFieldsAggregationOperationContext} from the given {@link ExposedFields}.
* Creates a new {@link ExposedFieldsAggregationOperationContext} from the given {@link ExposedFields}. Uses the given
* {@link AggregationOperationContext} to perform a mapping to mongo types if necessary.
*
* @param exposedFields must not be {@literal null}.
* @param rootContext must not be {@literal null}.
*/
public ExposedFieldsAggregationOperationContext(ExposedFields exposedFields) {
public ExposedFieldsAggregationOperationContext(ExposedFields exposedFields, AggregationOperationContext rootContext) {
Assert.notNull(exposedFields, "ExposedFields must not be null!");
Assert.notNull(rootContext, "RootContext must not be null!");
this.exposedFields = exposedFields;
this.rootContext = rootContext;
}
/*
@@ -50,7 +56,7 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
*/
@Override
public DBObject getMappedObject(DBObject dbObject) {
return dbObject;
return rootContext.getMappedObject(dbObject);
}
/*
@@ -59,7 +65,7 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
*/
@Override
public FieldReference getReference(Field field) {
return getReference(field.getTarget());
return getReference(field, field.getTarget());
}
/*
@@ -68,11 +74,42 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
*/
@Override
public FieldReference getReference(String name) {
return getReference(null, name);
}
ExposedField field = exposedFields.getField(name);
/**
* Returns a {@link FieldReference} to the given {@link Field} with the given {@code name}.
*
* @param field may be {@literal null}
* @param name must not be {@literal null}
* @return
*/
private FieldReference getReference(Field field, String name) {
if (field != null) {
return new FieldReference(field);
Assert.notNull(name, "Name must not be null!");
ExposedField exposedField = exposedFields.getField(name);
if (exposedField != null) {
if (field != null) {
// we return a FieldReference to the given field directly to make sure that we reference the proper alias here.
return new FieldReference(new ExposedField(field, exposedField.isSynthetic()));
}
return new FieldReference(exposedField);
}
if (name.contains(".")) {
// for nested field references we only check that the root field exists.
ExposedField rootField = exposedFields.getField(name.split("\\.")[0]);
if (rootField != null) {
// We have to set synthetic to true in order to render the field name as is.
return new FieldReference(new ExposedField(name, true));
}
}
throw new IllegalArgumentException(String.format("Invalid reference '%s'!", name));

View File

@@ -30,6 +30,7 @@ import org.springframework.util.StringUtils;
* Value object to capture a list of {@link Field} instances.
*
* @author Oliver Gierke
* @author Thomas Darimont
* @since 1.3
*/
public final class Fields implements Iterable<Field> {
@@ -186,7 +187,7 @@ public final class Fields implements Iterable<Field> {
private final String target;
/**
* Creates an aggregation fieldwith the given name. As no target is set explicitly, the name will be used as target
* Creates an aggregation field with the given name. As no target is set explicitly, the name will be used as target
* as well.
*
* @param key
@@ -217,6 +218,10 @@ public final class Fields implements Iterable<Field> {
return source;
}
if (Aggregation.SystemVariable.isReferingToSystemVariable(source)) {
return source;
}
int dollarIndex = source.lastIndexOf('$');
return dollarIndex == -1 ? source : source.substring(dollarIndex + 1);
}

View File

@@ -364,7 +364,16 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
}
public Object getValue(AggregationOperationContext context) {
return reference == null ? value : context.getReference(reference).toString();
if (reference == null) {
return value;
}
if (Aggregation.SystemVariable.isReferingToSystemVariable(reference)) {
return reference;
}
return context.getReference(reference).toString();
}
@Override

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -28,7 +28,7 @@ import com.mongodb.DBObject;
* @author Oliver Gierke
* @since 1.3
*/
class LimitOperation implements AggregationOperation {
public class LimitOperation implements AggregationOperation {
private final long maxElements;

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,7 +15,7 @@
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
@@ -32,17 +32,17 @@ import com.mongodb.DBObject;
*/
public class MatchOperation implements AggregationOperation {
private final Criteria criteria;
private final CriteriaDefinition criteriaDefinition;
/**
* Creates a new {@link MatchOperation} for the given {@link Criteria}.
* Creates a new {@link MatchOperation} for the given {@link CriteriaDefinition}.
*
* @param criteria must not be {@literal null}.
* @param criteriaDefinition must not be {@literal null}.
*/
public MatchOperation(Criteria criteria) {
public MatchOperation(CriteriaDefinition criteriaDefinition) {
Assert.notNull(criteria, "Criteria must not be null!");
this.criteria = criteria;
Assert.notNull(criteriaDefinition, "Criteria must not be null!");
this.criteriaDefinition = criteriaDefinition;
}
/*
@@ -51,6 +51,6 @@ public class MatchOperation implements AggregationOperation {
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$match", context.getMappedObject(criteria.getCriteriaObject()));
return new BasicDBObject("$match", context.getMappedObject(criteriaDefinition.getCriteriaObject()));
}
}

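With MatchOperation now accepting any CriteriaDefinition and LimitOperation made public, a plain Criteria-based pipeline reads as before; the status field below is hypothetical:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.query.Criteria;

class MatchSketch {

	Aggregation firstTenActive() {
		// Criteria implements CriteriaDefinition, so it can still be handed to match(...) directly.
		return newAggregation( //
				match(Criteria.where("status").is("ACTIVE")), //
				limit(10));
	}
}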
View File

@@ -24,7 +24,6 @@ import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedFi
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder.FieldProjection;
import org.springframework.util.Assert;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
@@ -235,26 +234,53 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
}
/**
* A {@link ProjectionOperationBuilder} that is used for SpEL expression based projections.
*
* @author Thomas Darimont
*/
public static class ExpressionProjectionOperationBuilder extends AbstractProjectionOperationBuilder {
public static class ExpressionProjectionOperationBuilder extends ProjectionOperationBuilder {
private final Object[] params;
private final String expression;
/**
* Creates a new {@link ExpressionProjectionOperationBuilder} for the given value, {@link ProjectionOperation} and
* parameters.
*
* @param value must not be {@literal null}.
* @param expression must not be {@literal null}.
* @param operation must not be {@literal null}.
* @param parameters
*/
public ExpressionProjectionOperationBuilder(Object value, ProjectionOperation operation, Object[] parameters) {
public ExpressionProjectionOperationBuilder(String expression, ProjectionOperation operation, Object[] parameters) {
super(value, operation);
super(expression, operation, null);
this.expression = expression;
this.params = parameters.clone();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder#project(java.lang.String, java.lang.Object[])
*/
@Override
public ProjectionOperationBuilder project(String operation, final Object... values) {
OperationProjection operationProjection = new OperationProjection(Fields.field(value.toString()), operation,
values) {
@Override
protected List<Object> getOperationArguments(AggregationOperationContext context) {
List<Object> result = new ArrayList<Object>(values.length + 1);
result.add(ExpressionProjection.toMongoExpression(context,
ExpressionProjectionOperationBuilder.this.expression, ExpressionProjectionOperationBuilder.this.params));
result.addAll(Arrays.asList(values));
return result;
}
};
return new ProjectionOperationBuilder(value, this.operation.and(operationProjection), operationProjection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.AbstractProjectionOperationBuilder#as(java.lang.String)
@@ -303,7 +329,11 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject(getExposedField().getName(), TRANSFORMER.transform(expression, context, params));
return new BasicDBObject(getExposedField().getName(), toMongoExpression(context, expression, params));
}
protected static Object toMongoExpression(AggregationOperationContext context, String expression, Object[] params) {
return TRANSFORMER.transform(expression, context, params);
}
}
}
@@ -320,7 +350,6 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
private static final String FIELD_REFERENCE_NOT_NULL = "Field reference must not be null!";
private final String name;
private final ProjectionOperation operation;
private final OperationProjection previousProjection;
/**
@@ -335,7 +364,23 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
super(name, operation);
this.name = name;
this.operation = operation;
this.previousProjection = previousProjection;
}
/**
* Creates a new {@link ProjectionOperationBuilder} for the field with the given value on top of the given
* {@link ProjectionOperation}.
*
* @param value
* @param operation
* @param previousProjection
*/
protected ProjectionOperationBuilder(Object value, ProjectionOperation operation,
OperationProjection previousProjection) {
super(value, operation);
this.name = null;
this.previousProjection = previousProjection;
}
@@ -521,8 +566,9 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @return
*/
public ProjectionOperationBuilder project(String operation, Object... values) {
OperationProjection projectionOperation = new OperationProjection(Fields.field(name), operation, values);
return new ProjectionOperationBuilder(name, this.operation.and(projectionOperation), projectionOperation);
OperationProjection operationProjection = new OperationProjection(Fields.field(value.toString()), operation,
values);
return new ProjectionOperationBuilder(value, this.operation.and(operationProjection), operationProjection);
}
/**
@@ -627,6 +673,10 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
// implicit reference or explicit include?
if (value == null || Boolean.TRUE.equals(value)) {
if (Aggregation.SystemVariable.isReferingToSystemVariable(field.getTarget())) {
return field.getTarget();
}
// check whether referenced field exists in the context
return context.getReference(field).getReferenceValue();
@@ -649,7 +699,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/**
* Creates a new {@link OperationProjection} for the given field.
*
* @param name the name of the field to add the operation projection for, must not be {@literal null} or empty.
* @param field the name of the field to add the operation projection for, must not be {@literal null} or empty.
* @param operation the actual operation key, must not be {@literal null} or empty.
* @param values the values to pass into the operation, must not be {@literal null}.
*/
@@ -672,18 +722,15 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
@Override
public DBObject toDBObject(AggregationOperationContext context) {
BasicDBList values = new BasicDBList();
values.addAll(buildReferences(context));
DBObject inner = new BasicDBObject("$" + operation, getOperationArguments(context));
DBObject inner = new BasicDBObject("$" + operation, values);
return new BasicDBObject(this.field.getName(), inner);
return new BasicDBObject(getField().getName(), inner);
}
private List<Object> buildReferences(AggregationOperationContext context) {
protected List<Object> getOperationArguments(AggregationOperationContext context) {
List<Object> result = new ArrayList<Object>(values.size());
result.add(context.getReference(field.getTarget()).toString());
result.add(context.getReference(getField().getName()).toString());
for (Object element : values) {
result.add(element instanceof Field ? context.getReference((Field) element).toString() : element);
@@ -692,6 +739,15 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
return result;
}
/**
* Returns the field that holds the {@link OperationProjection}.
*
* @return
*/
protected Field getField() {
return field;
}
/**
* Creates a new instance of this {@link OperationProjection} with the given alias.
*
@@ -699,7 +755,27 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @return
*/
public OperationProjection withAlias(String alias) {
return new OperationProjection(Fields.field(alias, this.field.getName()), operation, values.toArray());
final Field aliasedField = Fields.field(alias, this.field.getName());
return new OperationProjection(aliasedField, operation, values.toArray()) {
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder.OperationProjection#getField()
*/
@Override
protected Field getField() {
return aliasedField;
}
@Override
protected List<Object> getOperationArguments(AggregationOperationContext context) {
// We have to make sure that we use the arguments from the "previous" OperationProjection that we replace
// with this new instance.
return OperationProjection.this.getOperationArguments(context);
}
};
}
}
@@ -731,6 +807,96 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
return new BasicDBObject(name, nestedObject);
}
}
/**
* Extracts the minute from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractMinute() {
return project("minute");
}
/**
* Extracts the hour from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractHour() {
return project("hour");
}
/**
* Extracts the second from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractSecond() {
return project("second");
}
/**
* Extracts the millisecond from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractMillisecond() {
return project("millisecond");
}
/**
* Extracts the year from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractYear() {
return project("year");
}
/**
* Extracts the month from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractMonth() {
return project("month");
}
/**
* Extracts the week from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractWeek() {
return project("week");
}
/**
* Extracts the dayOfYear from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractDayOfYear() {
return project("dayOfYear");
}
/**
* Extracts the dayOfMonth from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractDayOfMonth() {
return project("dayOfMonth");
}
/**
* Extracts the dayOfWeek from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractDayOfWeek() {
return project("dayOfWeek");
}
}
/**
@@ -771,5 +937,4 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
*/
public abstract DBObject toDBObject(AggregationOperationContext context);
}
}

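A hedged sketch of the new date-extraction helpers on ProjectionOperationBuilder; birthDate and the projected field names are invented:

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation;

class DateProjectionSketch {

	ProjectionOperation birthdayFields() {
		// renders roughly to { $project : { birthYear : { $year : [ "$birthDate" ] },
		//                                   birthDayOfWeek : { $dayOfWeek : [ "$birthDate" ] } } }
		return Aggregation.project() //
				.and("birthDate").extractYear().as("birthYear") //
				.and("birthDate").extractDayOfWeek().as("birthDayOfWeek");
	}
}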
View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.List;
import org.springframework.util.Assert;
/**
@@ -30,11 +32,34 @@ public class TypedAggregation<I> extends Aggregation {
/**
* Creates a new {@link TypedAggregation} from the given {@link AggregationOperation}s.
*
* @param inputType must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
*/
public TypedAggregation(Class<I> inputType, AggregationOperation... operations) {
this(inputType, asAggregationList(operations));
}
super(operations);
/**
* Creates a new {@link TypedAggregation} from the given {@link AggregationOperation}s.
*
* @param inputType must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
*/
public TypedAggregation(Class<I> inputType, List<AggregationOperation> operations) {
this(inputType, operations, DEFAULT_OPTIONS);
}
/**
* Creates a new {@link TypedAggregation} from the given {@link AggregationOperation}s and the given
* {@link AggregationOptions}.
*
* @param inputType must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
* @param options must not be {@literal null}.
*/
public TypedAggregation(Class<I> inputType, List<AggregationOperation> operations, AggregationOptions options) {
super(operations, options);
Assert.notNull(inputType, "Input type must not be null!");
this.inputType = inputType;
@@ -48,4 +73,14 @@ public class TypedAggregation<I> extends Aggregation {
public Class<I> getInputType() {
return inputType;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Aggregation#withOptions(org.springframework.data.mongodb.core.aggregation.AggregationOptions)
*/
public TypedAggregation<I> withOptions(AggregationOptions options) {
Assert.notNull(options, "AggregationOptions must not be null.");
return new TypedAggregation<I>(inputType, operations, options);
}
}

View File

@@ -45,6 +45,7 @@ import org.springframework.data.mongodb.core.convert.MongoConverters.DBObjectToS
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigDecimalConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigIntegerConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToURLConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.TermToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.URLToStringConverter;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.util.Assert;
@@ -58,6 +59,7 @@ import org.springframework.util.Assert;
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class CustomConversions {
@@ -106,7 +108,8 @@ public class CustomConversions {
toRegister.add(URLToStringConverter.INSTANCE);
toRegister.add(StringToURLConverter.INSTANCE);
toRegister.add(DBObjectToStringConverter.INSTANCE);
toRegister.add(TermToStringConverter.INSTANCE);
toRegister.addAll(JodaTimeConverters.getConvertersToRegister());
toRegister.addAll(GeoConverters.getConvertersToRegister());

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,15 +13,16 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.DBRef;
/**
* Interface for {@link Metric}s that can be applied to a base scale.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Metric}. This class is scheduled to be
* removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public interface Metric extends org.springframework.data.geo.Metric {}
public interface DbRefProxyHandler {
Object populateId(MongoPersistentProperty property, DBRef source, Object proxy);
}

View File

@@ -39,7 +39,8 @@ public interface DbRefResolver {
* @param callback will never be {@literal null}.
* @return
*/
Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback);
Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback,
DbRefProxyHandler proxyHandler);
/**
* Creates a {@link DBRef} instance for the given {@link org.springframework.data.mongodb.core.mapping.DBRef}

View File

@@ -0,0 +1,79 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.BeanWrapper;
import org.springframework.data.mapping.model.DefaultSpELExpressionEvaluator;
import org.springframework.data.mapping.model.SpELContext;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
/**
* @author Oliver Gierke
*/
class DefaultDbRefProxyHandler implements DbRefProxyHandler {
private final SpELContext spELContext;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final ValueResolver resolver;
/**
* @param spELContext must not be {@literal null}.
* @param conversionService must not be {@literal null}.
* @param mappingContext must not be {@literal null}.
*/
public DefaultDbRefProxyHandler(SpELContext spELContext,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext, ValueResolver resolver) {
this.spELContext = spELContext;
this.mappingContext = mappingContext;
this.resolver = resolver;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.DbRefProxyHandler#populateId(com.mongodb.DBRef, java.lang.Object)
*/
@Override
public Object populateId(MongoPersistentProperty property, DBRef source, Object proxy) {
if (source == null) {
return proxy;
}
MongoPersistentEntity<?> persistentEntity = mappingContext.getPersistentEntity(property);
MongoPersistentProperty idProperty = persistentEntity.getIdProperty();
if(idProperty.usePropertyAccess()) {
return proxy;
}
SpELExpressionEvaluator evaluator = new DefaultSpELExpressionEvaluator(proxy, spELContext);
BeanWrapper<Object> proxyWrapper = BeanWrapper.create(proxy, null);
DBObject object = new BasicDBObject(idProperty.getFieldName(), source.getId());
ObjectPath objectPath = ObjectPath.ROOT.push(proxy, persistentEntity, null);
proxyWrapper.setProperty(idProperty, resolver.getValueInternal(idProperty, object, evaluator, objectPath));
return proxy;
}
}

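The proxy handler above back-fills the identifier onto lazily resolved DBRef proxies when the id property uses field access. A hypothetical domain model illustrating the scenario:

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
class Account {
	@Id String id;
}

@Document
class Customer {

	@Id String id;

	// With field access, looking up the id of the referenced Account on the proxy no longer
	// initializes it, because populateId(...) writes the id into the proxy up front.
	@DBRef(lazy = true) Account account;
}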
View File

@@ -25,22 +25,19 @@ import java.lang.reflect.Method;
import org.aopalliance.intercept.MethodInterceptor;
import org.aopalliance.intercept.MethodInvocation;
import org.objenesis.Objenesis;
import org.objenesis.ObjenesisStd;
import org.springframework.aop.framework.ProxyFactory;
import org.springframework.cglib.proxy.Callback;
import org.springframework.cglib.proxy.Enhancer;
import org.springframework.cglib.proxy.Factory;
import org.springframework.cglib.proxy.MethodProxy;
import org.springframework.core.SpringVersion;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.LazyLoadingException;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.objenesis.ObjenesisStd;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.ReflectionUtils;
import org.springframework.util.StringUtils;
@@ -57,11 +54,9 @@ import com.mongodb.DBRef;
*/
public class DefaultDbRefResolver implements DbRefResolver {
private static final boolean IS_SPRING_4_OR_BETTER = SpringVersion.getVersion().startsWith("4");
private static final boolean OBJENESIS_PRESENT = ClassUtils.isPresent("org.objenesis.Objenesis", null);
private final MongoDbFactory mongoDbFactory;
private final PersistenceExceptionTranslator exceptionTranslator;
private final ObjenesisStd objenesis;
/**
* Creates a new {@link DefaultDbRefResolver} with the given {@link MongoDbFactory}.
@@ -74,6 +69,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
this.mongoDbFactory = mongoDbFactory;
this.exceptionTranslator = mongoDbFactory.getExceptionTranslator();
this.objenesis = new ObjenesisStd(true);
}
/*
@@ -81,13 +77,14 @@ public class DefaultDbRefResolver implements DbRefResolver {
* @see org.springframework.data.mongodb.core.convert.DbRefResolver#resolveDbRef(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty, org.springframework.data.mongodb.core.convert.DbRefResolverCallback)
*/
@Override
public Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback) {
public Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback,
DbRefProxyHandler handler) {
Assert.notNull(property, "Property must not be null!");
Assert.notNull(callback, "Callback must not be null!");
if (isLazyDbRef(property)) {
return createLazyLoadingProxy(property, dbref, callback);
return createLazyLoadingProxy(property, dbref, callback, handler);
}
return callback.resolve(property);
@@ -116,34 +113,47 @@ public class DefaultDbRefResolver implements DbRefResolver {
* @param callback must not be {@literal null}.
* @return
*/
private Object createLazyLoadingProxy(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback) {
private Object createLazyLoadingProxy(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback,
DbRefProxyHandler handler) {
Class<?> propertyType = property.getType();
LazyLoadingInterceptor interceptor = new LazyLoadingInterceptor(property, dbref, exceptionTranslator, callback);
if (!propertyType.isInterface()) {
Factory factory = (Factory) objenesis.newInstance(getEnhancedTypeFor(propertyType));
factory.setCallbacks(new Callback[] { interceptor });
return handler.populateId(property, dbref, factory);
}
ProxyFactory proxyFactory = new ProxyFactory();
Class<?> propertyType = property.getType();
for (Class<?> type : propertyType.getInterfaces()) {
proxyFactory.addInterface(type);
}
LazyLoadingInterceptor interceptor = new LazyLoadingInterceptor(property, dbref, exceptionTranslator, callback);
proxyFactory.addInterface(LazyLoadingProxy.class);
proxyFactory.addInterface(propertyType);
proxyFactory.addAdvice(interceptor);
if (propertyType.isInterface()) {
proxyFactory.addInterface(propertyType);
proxyFactory.addAdvice(interceptor);
return proxyFactory.getProxy();
}
return handler.populateId(property, dbref, proxyFactory.getProxy());
}
proxyFactory.setProxyTargetClass(true);
proxyFactory.setTargetClass(propertyType);
/**
* Returns the CGLib enhanced type for the given source type.
*
* @param type
* @return
*/
private Class<?> getEnhancedTypeFor(Class<?> type) {
if (IS_SPRING_4_OR_BETTER || !OBJENESIS_PRESENT) {
proxyFactory.addAdvice(interceptor);
return proxyFactory.getProxy();
}
Enhancer enhancer = new Enhancer();
enhancer.setSuperclass(type);
enhancer.setCallbackType(org.springframework.cglib.proxy.MethodInterceptor.class);
enhancer.setInterfaces(new Class[] { LazyLoadingProxy.class });
return ObjenesisProxyEnhancer.enhanceAndGet(proxyFactory, propertyType, interceptor);
return enhancer.createClass();
}
/**
@@ -163,6 +173,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
*/
static class LazyLoadingInterceptor implements MethodInterceptor, org.springframework.cglib.proxy.MethodInterceptor,
Serializable {
@@ -179,7 +190,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
static {
try {
INITIALIZE_METHOD = LazyLoadingProxy.class.getMethod("initialize");
INITIALIZE_METHOD = LazyLoadingProxy.class.getMethod("getTarget");
TO_DBREF_METHOD = LazyLoadingProxy.class.getMethod("toDBRef");
} catch (Exception e) {
throw new RuntimeException(e);
@@ -372,27 +383,4 @@ public class DefaultDbRefResolver implements DbRefResolver {
return result;
}
}
/**
* Static class to accommodate optional dependency on Objenesis.
*
* @author Oliver Gierke
*/
private static class ObjenesisProxyEnhancer {
private static final Objenesis OBJENESIS = new ObjenesisStd(true);
public static Object enhanceAndGet(ProxyFactory proxyFactory, Class<?> type,
org.springframework.cglib.proxy.MethodInterceptor interceptor) {
Enhancer enhancer = new Enhancer();
enhancer.setSuperclass(type);
enhancer.setCallbackType(org.springframework.cglib.proxy.MethodInterceptor.class);
enhancer.setInterfaces(new Class[] { LazyLoadingProxy.class });
Factory factory = (Factory) OBJENESIS.newInstance(enhancer.createClass());
factory.setCallbacks(new Callback[] { interceptor });
return factory;
}
}
}

View File

@@ -0,0 +1,61 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.DBObject;
/**
* Default implementation of {@link DbRefResolverCallback}.
*
* @author Oliver Gierke
*/
class DefaultDbRefResolverCallback implements DbRefResolverCallback {
private final DBObject surroundingObject;
private final ObjectPath path;
private final ValueResolver resolver;
private final SpELExpressionEvaluator evaluator;
/**
* Creates a new {@link DefaultDbRefResolverCallback} using the given {@link DBObject}, {@link ObjectPath},
* {@link ValueResolver} and {@link SpELExpressionEvaluator}.
*
* @param surroundingObject must not be {@literal null}.
* @param path must not be {@literal null}.
* @param evaluator must not be {@literal null}.
* @param resolver must not be {@literal null}.
*/
public DefaultDbRefResolverCallback(DBObject surroundingObject, ObjectPath path, SpELExpressionEvaluator evaluator,
ValueResolver resolver) {
this.surroundingObject = surroundingObject;
this.path = path;
this.resolver = resolver;
this.evaluator = evaluator;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.DbRefResolverCallback#resolve(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty)
*/
@Override
public Object resolve(MongoPersistentProperty property) {
return resolver.getValueInternal(property, surroundingObject, evaluator, path);
}
}

View File

@@ -57,18 +57,15 @@ abstract class GeoConverters {
*
* @return
*/
@SuppressWarnings("unchecked")
public static Collection<? extends Object> getConvertersToRegister() {
return Arrays.asList( //
BoxToDbObjectConverter.INSTANCE //
, PolygonToDbObjectConverter.INSTANCE //
, CircleToDbObjectConverter.INSTANCE //
, LegacyCircleToDbObjectConverter.INSTANCE //
, SphereToDbObjectConverter.INSTANCE //
, DbObjectToBoxConverter.INSTANCE //
, DbObjectToPolygonConverter.INSTANCE //
, DbObjectToCircleConverter.INSTANCE //
, DbObjectToLegacyCircleConverter.INSTANCE //
, DbObjectToSphereConverter.INSTANCE //
, DbObjectToPointConverter.INSTANCE //
, PointToDbObjectConverter.INSTANCE //
@@ -91,13 +88,11 @@ abstract class GeoConverters {
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
@SuppressWarnings("deprecation")
public Point convert(DBObject source) {
Assert.isTrue(source.keySet().size() == 2, "Source must contain 2 elements");
return source == null ? null : new org.springframework.data.mongodb.core.geo.Point((Double) source.get("x"),
(Double) source.get("y"));
return source == null ? null : new Point((Double) source.get("x"), (Double) source.get("y"));
}
}
@@ -151,7 +146,7 @@ abstract class GeoConverters {
}
/**
* Converts a {@link BasicDBList} into a {@link org.springframework.data.mongodb.core.geo.Box}.
* Converts a {@link BasicDBList} into a {@link Box}.
*
* @author Thomas Darimont
* @since 1.5
@@ -166,7 +161,6 @@ abstract class GeoConverters {
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
@SuppressWarnings("deprecation")
public Box convert(DBObject source) {
if (source == null) {
@@ -176,7 +170,7 @@ abstract class GeoConverters {
Point first = DbObjectToPointConverter.INSTANCE.convert((DBObject) source.get("first"));
Point second = DbObjectToPointConverter.INSTANCE.convert((DBObject) source.get("second"));
return new org.springframework.data.mongodb.core.geo.Box(first, second);
return new Box(first, second);
}
}
@@ -210,7 +204,7 @@ abstract class GeoConverters {
}
/**
* Converts a {@link DBObject} into a {@link org.springframework.data.mongodb.core.geo.Circle}.
* Converts a {@link DBObject} into a {@link Circle}.
*
* @author Thomas Darimont
* @since 1.5
@@ -251,71 +245,6 @@ abstract class GeoConverters {
}
}
/**
* Converts a {@link Circle} into a {@link BasicDBList}.
*
* @author Thomas Darimont
* @since 1.5
*/
@SuppressWarnings("deprecation")
public static enum LegacyCircleToDbObjectConverter implements
Converter<org.springframework.data.mongodb.core.geo.Circle, DBObject> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public DBObject convert(org.springframework.data.mongodb.core.geo.Circle source) {
if (source == null) {
return null;
}
DBObject result = new BasicDBObject();
result.put("center", PointToDbObjectConverter.INSTANCE.convert(source.getCenter()));
result.put("radius", source.getRadius());
return result;
}
}
/**
* Converts a {@link BasicDBList} into a {@link org.springframework.data.mongodb.core.geo.Circle}.
*
* @author Thomas Darimont
* @since 1.5
*/
@ReadingConverter
@SuppressWarnings("deprecation")
public static enum DbObjectToLegacyCircleConverter implements
Converter<DBObject, org.springframework.data.mongodb.core.geo.Circle> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public org.springframework.data.mongodb.core.geo.Circle convert(DBObject source) {
if (source == null) {
return null;
}
DBObject centerSource = (DBObject) source.get("center");
Double radius = (Double) source.get("radius");
Assert.notNull(centerSource, "Center must not be null!");
Assert.notNull(radius, "Radius must not be null!");
Point center = DbObjectToPointConverter.INSTANCE.convert(centerSource);
return new org.springframework.data.mongodb.core.geo.Circle(center, radius);
}
}
/**
* Converts a {@link Sphere} into a {@link BasicDBList}.
*
@@ -422,7 +351,7 @@ abstract class GeoConverters {
}
/**
* Converts a {@link BasicDBList} into a {@link org.springframework.data.mongodb.core.geo.Polygon}.
* Converts a {@link BasicDBList} into a {@link Polygon}.
*
* @author Thomas Darimont
* @since 1.5
@@ -437,7 +366,7 @@ abstract class GeoConverters {
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
@SuppressWarnings({ "deprecation", "unchecked" })
@SuppressWarnings({ "unchecked" })
public Polygon convert(DBObject source) {
if (source == null) {
@@ -453,7 +382,7 @@ abstract class GeoConverters {
newPoints.add(DbObjectToPointConverter.INSTANCE.convert(element));
}
return new org.springframework.data.mongodb.core.geo.Polygon(newPoints);
return new Polygon(newPoints);
}
}
@@ -472,7 +401,6 @@ abstract class GeoConverters {
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
@SuppressWarnings("deprecation")
public DBObject convert(GeoCommand source) {
if (source == null) {
@@ -493,10 +421,10 @@ abstract class GeoConverters {
argument.add(toList(((Circle) shape).getCenter()));
argument.add(((Circle) shape).getRadius().getNormalizedValue());
} else if (shape instanceof org.springframework.data.mongodb.core.geo.Circle) {
} else if (shape instanceof Circle) {
argument.add(toList(((org.springframework.data.mongodb.core.geo.Circle) shape).getCenter()));
argument.add(((org.springframework.data.mongodb.core.geo.Circle) shape).getRadius());
argument.add(toList(((Circle) shape).getCenter()));
argument.add(((Circle) shape).getRadius());
} else if (shape instanceof Polygon) {
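The converters in this class are plain Converter enums, so inside the org.springframework.data.mongodb.core.convert package they can be exercised directly; a small round-trip sketch using the Spring Data Commons Box and Point types (illustrative only, in normal use they are registered through getConvertersToRegister() shown above):

    Box box = new Box(new Point(0, 0), new Point(10, 10));
    DBObject dbo = GeoConverters.BoxToDbObjectConverter.INSTANCE.convert(box);
    Box restored = GeoConverters.DbObjectToBoxConverter.INSTANCE.convert(dbo); // reads the "first"/"second" corners back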

View File

@@ -23,6 +23,7 @@ import com.mongodb.DBRef;
* Allows direct interaction with the underlying {@link LazyLoadingInterceptor}.
*
* @author Thomas Darimont
* @author Christoph Strobl
* @since 1.5
*/
public interface LazyLoadingProxy {
@@ -33,7 +34,7 @@ public interface LazyLoadingProxy {
* @return
* @since 1.5
*/
Object initialize();
Object getTarget();
/**
* Returns the {@link DBRef} represented by this {@link LazyLoadingProxy}, may be null.
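With initialize() renamed to getTarget(), client code that wants to force resolution of a lazily loaded reference might look like the following sketch; Account and its customer property are a hypothetical domain model mapped with @DBRef(lazy = true):

    Object customer = account.getCustomer();      // returns a LazyLoadingProxy until resolved
    if (customer instanceof LazyLoadingProxy) {
        LazyLoadingProxy proxy = (LazyLoadingProxy) customer;
        DBRef dbRef = proxy.toDBRef();            // the raw DBRef, may be null
        Object target = proxy.getTarget();        // resolves and returns the backing object
    }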

View File

@@ -31,7 +31,7 @@ import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.core.convert.ConversionException;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.data.convert.CollectionFactory;
import org.springframework.data.convert.EntityInstantiator;
import org.springframework.data.convert.TypeMapper;
@@ -56,6 +56,7 @@ import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.CollectionUtils;
import com.mongodb.BasicDBList;
@@ -73,7 +74,7 @@ import com.mongodb.DBRef;
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware {
public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware, ValueResolver {
protected static final Logger LOGGER = LoggerFactory.getLogger(MappingMongoConverter.class);
@@ -81,6 +82,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
protected final SpelExpressionParser spelExpressionParser = new SpelExpressionParser();
protected final QueryMapper idMapper;
protected final DbRefResolver dbRefResolver;
protected ApplicationContext applicationContext;
protected MongoTypeMapper typeMapper;
protected String mapKeyDotReplacement = null;
@@ -93,11 +95,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param mongoDbFactory must not be {@literal null}.
* @param mappingContext must not be {@literal null}.
*/
@SuppressWarnings("deprecation")
public MappingMongoConverter(DbRefResolver dbRefResolver,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
super(ConversionServiceFactory.createDefaultConversionService());
super(new DefaultConversionService());
Assert.notNull(dbRefResolver, "DbRefResolver must not be null!");
Assert.notNull(mappingContext, "MappingContext must not be null!");
@@ -184,11 +185,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
protected <S extends Object> S read(TypeInformation<S> type, DBObject dbo) {
return read(type, dbo, null);
return read(type, dbo, ObjectPath.ROOT);
}
@SuppressWarnings("unchecked")
protected <S extends Object> S read(TypeInformation<S> type, DBObject dbo, Object parent) {
private <S extends Object> S read(TypeInformation<S> type, DBObject dbo, ObjectPath path) {
if (null == dbo) {
return null;
@@ -206,11 +207,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
if (typeToUse.isCollectionLike() && dbo instanceof BasicDBList) {
return (S) readCollectionOrArray(typeToUse, (BasicDBList) dbo, parent);
return (S) readCollectionOrArray(typeToUse, (BasicDBList) dbo, path);
}
if (typeToUse.isMap()) {
return (S) readMap(typeToUse, dbo, parent);
return (S) readMap(typeToUse, dbo, path);
}
// Retrieve persistent entity info
@@ -220,41 +221,55 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
throw new MappingException("No mapping metadata found for " + rawType.getName());
}
return read(persistentEntity, dbo, parent);
return read(persistentEntity, dbo, path);
}
private ParameterValueProvider<MongoPersistentProperty> getParameterProvider(MongoPersistentEntity<?> entity,
DBObject source, DefaultSpELExpressionEvaluator evaluator, Object parent) {
DBObject source, DefaultSpELExpressionEvaluator evaluator, ObjectPath path) {
MongoDbPropertyValueProvider provider = new MongoDbPropertyValueProvider(source, evaluator, parent);
MongoDbPropertyValueProvider provider = new MongoDbPropertyValueProvider(source, evaluator, path);
PersistentEntityParameterValueProvider<MongoPersistentProperty> parameterProvider = new PersistentEntityParameterValueProvider<MongoPersistentProperty>(
entity, provider, parent);
entity, provider, path.getCurrentObject());
return new ConverterAwareSpELExpressionParameterValueProvider(evaluator, conversionService, parameterProvider,
parent);
return new ConverterAwareSpELExpressionParameterValueProvider(evaluator, conversionService, parameterProvider, path);
}
private <S extends Object> S read(final MongoPersistentEntity<S> entity, final DBObject dbo, final Object parent) {
private <S extends Object> S read(final MongoPersistentEntity<S> entity, final DBObject dbo, final ObjectPath path) {
final DefaultSpELExpressionEvaluator evaluator = new DefaultSpELExpressionEvaluator(dbo, spELContext);
ParameterValueProvider<MongoPersistentProperty> provider = getParameterProvider(entity, dbo, evaluator, parent);
ParameterValueProvider<MongoPersistentProperty> provider = getParameterProvider(entity, dbo, evaluator, path);
EntityInstantiator instantiator = instantiators.getInstantiatorFor(entity);
S instance = instantiator.createInstance(entity, provider);
final BeanWrapper<S> wrapper = BeanWrapper.create(instance, conversionService);
final MongoPersistentProperty idProperty = entity.getIdProperty();
final S result = wrapper.getBean();
// make sure id property is set before all other properties
Object idValue = null;
if (idProperty != null) {
idValue = getValueInternal(idProperty, dbo, evaluator, path);
wrapper.setProperty(idProperty, idValue);
}
final ObjectPath currentPath = path.push(result, entity, idValue);
// Set properties not already set in the constructor
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
public void doWithPersistentProperty(MongoPersistentProperty prop) {
// we skip the id property since it was already set
if (idProperty != null && idProperty.equals(prop)) {
return;
}
if (!dbo.containsField(prop.getFieldName()) || entity.isConstructorArgument(prop)) {
return;
}
Object obj = getValueInternal(prop, dbo, evaluator, result);
wrapper.setProperty(prop, obj);
wrapper.setProperty(prop, getValueInternal(prop, dbo, evaluator, currentPath));
}
});
@@ -262,19 +277,21 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
entity.doWithAssociations(new AssociationHandler<MongoPersistentProperty>() {
public void doWithAssociation(Association<MongoPersistentProperty> association) {
MongoPersistentProperty property = association.getInverse();
final MongoPersistentProperty property = association.getInverse();
Object value = dbo.get(property.getName());
if (value == null) {
return;
}
DBRef dbref = value instanceof DBRef ? (DBRef) value : null;
Object obj = dbRefResolver.resolveDbRef(property, dbref, new DbRefResolverCallback() {
@Override
public Object resolve(MongoPersistentProperty property) {
return getValueInternal(property, dbo, evaluator, parent);
}
});
DbRefProxyHandler handler = new DefaultDbRefProxyHandler(spELContext, mappingContext,
MappingMongoConverter.this);
DbRefResolverCallback callback = new DefaultDbRefResolverCallback(dbo, currentPath, evaluator,
MappingMongoConverter.this);
wrapper.setProperty(property, obj);
wrapper.setProperty(property, dbRefResolver.resolveDbRef(property, dbref, callback, handler));
}
});
@@ -294,6 +311,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Assert.isTrue(annotation != null, "The referenced property has to be mapped with @DBRef!");
}
// @see DATAMONGO-913
if (object instanceof LazyLoadingProxy) {
return ((LazyLoadingProxy) object).toDBRef();
}
return createDBRef(object, referingProperty);
}
@@ -309,14 +331,17 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return;
}
boolean handledByCustomConverter = conversions.getCustomWriteTarget(obj.getClass(), DBObject.class) != null;
TypeInformation<? extends Object> type = ClassTypeInformation.from(obj.getClass());
Class<?> entityType = obj.getClass();
boolean handledByCustomConverter = conversions.getCustomWriteTarget(entityType, DBObject.class) != null;
TypeInformation<? extends Object> type = ClassTypeInformation.from(entityType);
if (!handledByCustomConverter && !(dbo instanceof BasicDBList)) {
typeMapper.writeType(type, dbo);
}
writeInternal(obj, dbo, type);
Object target = obj instanceof LazyLoadingProxy ? ((LazyLoadingProxy) obj).getTarget() : obj;
writeInternal(target, dbo, type);
}
/**
@@ -332,7 +357,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return;
}
Class<?> customTarget = conversions.getCustomWriteTarget(obj.getClass(), DBObject.class);
Class<?> entityType = obj.getClass();
Class<?> customTarget = conversions.getCustomWriteTarget(entityType, DBObject.class);
if (customTarget != null) {
DBObject result = conversionService.convert(obj, DBObject.class);
@@ -340,17 +366,17 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return;
}
if (Map.class.isAssignableFrom(obj.getClass())) {
if (Map.class.isAssignableFrom(entityType)) {
writeMapInternal((Map<Object, Object>) obj, dbo, ClassTypeInformation.MAP);
return;
}
if (Collection.class.isAssignableFrom(obj.getClass())) {
if (Collection.class.isAssignableFrom(entityType)) {
writeCollectionInternal((Collection<?>) obj, ClassTypeInformation.LIST, (BasicDBList) dbo);
return;
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(obj.getClass());
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityType);
writeInternal(obj, dbo, entity);
addCustomTypeKeyIfNecessary(typeHint, obj, dbo);
}
@@ -380,7 +406,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
public void doWithPersistentProperty(MongoPersistentProperty prop) {
if (prop.equals(idProperty)) {
if (prop.equals(idProperty) || !prop.isWritable()) {
return;
}
@@ -457,7 +483,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* If we have a LazyLoadingProxy we make sure it is initialized first.
*/
if (obj instanceof LazyLoadingProxy) {
obj = ((LazyLoadingProxy) obj).initialize();
obj = ((LazyLoadingProxy) obj).getTarget();
}
// Lookup potential custom target type
@@ -677,10 +703,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
TypeInformation<?> actualType = type != null ? type.getActualType() : null;
Class<?> reference = actualType == null ? Object.class : actualType.getType();
Class<?> valueType = ClassUtils.getUserClass(value.getClass());
boolean notTheSameClass = !value.getClass().equals(reference);
boolean notTheSameClass = !valueType.equals(reference);
if (notTheSameClass) {
typeMapper.writeType(value.getClass(), dbObject);
typeMapper.writeType(valueType, dbObject);
}
}
@@ -786,11 +813,14 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
idMapper.convertId(id));
}
protected Object getValueInternal(MongoPersistentProperty prop, DBObject dbo, SpELExpressionEvaluator eval,
Object parent) {
MongoDbPropertyValueProvider provider = new MongoDbPropertyValueProvider(dbo, spELContext, parent);
return provider.getPropertyValue(prop);
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.ValueResolver#getValueInternal(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty, com.mongodb.DBObject, org.springframework.data.mapping.model.SpELExpressionEvaluator, java.lang.Object)
*/
@Override
public Object getValueInternal(MongoPersistentProperty prop, DBObject dbo, SpELExpressionEvaluator evaluator,
ObjectPath path) {
return new MongoDbPropertyValueProvider(dbo, evaluator, path).getPropertyValue(prop);
}
/**
@@ -798,11 +828,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*
* @param targetType must not be {@literal null}.
* @param sourceValue must not be {@literal null}.
* @param path must not be {@literal null}.
* @return the converted {@link Collection} or array, will never be {@literal null}.
*/
private Object readCollectionOrArray(TypeInformation<?> targetType, BasicDBList sourceValue, Object parent) {
private Object readCollectionOrArray(TypeInformation<?> targetType, BasicDBList sourceValue, ObjectPath path) {
Assert.notNull(targetType);
Assert.notNull(targetType, "Target type must not be null!");
Assert.notNull(path, "Object path must not be null!");
Class<?> collectionType = targetType.getType();
@@ -823,9 +855,9 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (dbObjItem instanceof DBRef) {
items.add(DBRef.class.equals(rawComponentType) ? dbObjItem : read(componentType, readRef((DBRef) dbObjItem),
parent));
path));
} else if (dbObjItem instanceof DBObject) {
items.add(read(componentType, (DBObject) dbObjItem, parent));
items.add(read(componentType, (DBObject) dbObjItem, path));
} else {
items.add(getPotentiallyConvertedSimpleRead(dbObjItem, rawComponentType));
}
@@ -838,13 +870,15 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* Reads the given {@link DBObject} into a {@link Map}. Will recursively resolve nested {@link Map}s as well.
*
* @param type the {@link Map} {@link TypeInformation} to be used to unmarshall this {@link DBObject}.
* @param dbObject
* @param dbObject must not be {@literal null}
* @param path must not be {@literal null}
* @return
*/
@SuppressWarnings("unchecked")
protected Map<Object, Object> readMap(TypeInformation<?> type, DBObject dbObject, Object parent) {
protected Map<Object, Object> readMap(TypeInformation<?> type, DBObject dbObject, ObjectPath path) {
Assert.notNull(dbObject);
Assert.notNull(dbObject, "DBObject must not be null!");
Assert.notNull(path, "Object path must not be null!");
Class<?> mapType = typeMapper.readType(dbObject, type).getType();
@@ -871,7 +905,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Object value = entry.getValue();
if (value instanceof DBObject) {
map.put(key, read(valueType, (DBObject) value, parent));
map.put(key, read(valueType, (DBObject) value, path));
} else if (value instanceof DBRef) {
map.put(key, DBRef.class.equals(rawValueType) ? value : read(valueType, readRef((DBRef) value)));
} else {
@@ -883,21 +917,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return map;
}
protected <T> List<?> unwrapList(BasicDBList dbList, TypeInformation<T> targetType) {
List<Object> rootList = new ArrayList<Object>();
for (int i = 0; i < dbList.size(); i++) {
Object obj = dbList.get(i);
if (obj instanceof BasicDBList) {
rootList.add(unwrapList((BasicDBList) obj, targetType.getComponentType()));
} else if (obj instanceof DBObject) {
rootList.add(read(targetType, (DBObject) obj));
} else {
rootList.add(obj);
}
}
return rootList;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.MongoWriter#convertToMongoType(java.lang.Object, org.springframework.data.util.TypeInformation)
@@ -919,7 +938,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return getPotentiallyConvertedSimpleWrite(obj);
}
TypeInformation<?> typeHint = typeInformation == null ? null : ClassTypeInformation.OBJECT;
TypeInformation<?> typeHint = typeInformation == null ? ClassTypeInformation.OBJECT : typeInformation;
if (obj instanceof BasicDBList) {
return maybeConvertList((BasicDBList) obj, typeHint);
@@ -1007,24 +1026,34 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return dbObject;
}
/**
* {@link PropertyValueProvider} to evaluate a SpEL expression if present on the property or simply accesses the field
* of the configured source {@link DBObject}.
*
* @author Oliver Gierke
*/
private class MongoDbPropertyValueProvider implements PropertyValueProvider<MongoPersistentProperty> {
private final DBObjectAccessor source;
private final SpELExpressionEvaluator evaluator;
private final Object parent;
private final ObjectPath path;
public MongoDbPropertyValueProvider(DBObject source, SpELContext factory, Object parent) {
this(source, new DefaultSpELExpressionEvaluator(source, factory), parent);
}
public MongoDbPropertyValueProvider(DBObject source, DefaultSpELExpressionEvaluator evaluator, Object parent) {
/**
* Creates a new {@link MongoDbPropertyValueProvider} for the given source, {@link SpELExpressionEvaluator} and
* {@link ObjectPath}.
*
* @param source must not be {@literal null}.
* @param evaluator must not be {@literal null}.
* @param path can be {@literal null}.
*/
public MongoDbPropertyValueProvider(DBObject source, SpELExpressionEvaluator evaluator, ObjectPath path) {
Assert.notNull(source);
Assert.notNull(evaluator);
this.source = new DBObjectAccessor(source);
this.evaluator = evaluator;
this.parent = parent;
this.path = path;
}
/*
@@ -1040,7 +1069,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return null;
}
return readValue(value, property.getTypeInformation(), parent);
return readValue(value, property.getTypeInformation(), path);
}
}
@@ -1053,7 +1082,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
private class ConverterAwareSpELExpressionParameterValueProvider extends
SpELExpressionParameterValueProvider<MongoPersistentProperty> {
private final Object parent;
private final ObjectPath path;
/**
* Creates a new {@link ConverterAwareSpELExpressionParameterValueProvider}.
@@ -1063,10 +1092,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param delegate must not be {@literal null}.
*/
public ConverterAwareSpELExpressionParameterValueProvider(SpELExpressionEvaluator evaluator,
ConversionService conversionService, ParameterValueProvider<MongoPersistentProperty> delegate, Object parent) {
ConversionService conversionService, ParameterValueProvider<MongoPersistentProperty> delegate, ObjectPath path) {
super(evaluator, conversionService, delegate);
this.parent = parent;
this.path = path;
}
/*
@@ -1075,28 +1104,44 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*/
@Override
protected <T> T potentiallyConvertSpelValue(Object object, Parameter<T, MongoPersistentProperty> parameter) {
return readValue(object, parameter.getType(), parent);
return readValue(object, parameter.getType(), path);
}
}
@SuppressWarnings("unchecked")
private <T> T readValue(Object value, TypeInformation<?> type, Object parent) {
private <T> T readValue(Object value, TypeInformation<?> type, ObjectPath path) {
Class<?> rawType = type.getType();
if (conversions.hasCustomReadTarget(value.getClass(), rawType)) {
return (T) conversionService.convert(value, rawType);
} else if (value instanceof DBRef) {
return (T) (rawType.equals(DBRef.class) ? value : read(type, readRef((DBRef) value), parent));
return potentiallyReadOrResolveDbRef((DBRef) value, type, path, rawType);
} else if (value instanceof BasicDBList) {
return (T) readCollectionOrArray(type, (BasicDBList) value, parent);
return (T) readCollectionOrArray(type, (BasicDBList) value, path);
} else if (value instanceof DBObject) {
return (T) read(type, (DBObject) value, parent);
return (T) read(type, (DBObject) value, path);
} else {
return (T) getPotentiallyConvertedSimpleRead(value, rawType);
}
}
@SuppressWarnings("unchecked")
private <T> T potentiallyReadOrResolveDbRef(DBRef dbref, TypeInformation<?> type, ObjectPath path, Class<?> rawType) {
if (rawType.equals(DBRef.class)) {
return (T) dbref;
}
Object object = dbref == null ? null : path.getPathItem(dbref.getId(), dbref.getRef());
if (object != null) {
return (T) object;
}
return (T) (object != null ? object : read(type, readRef(dbref), path));
}
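potentiallyReadOrResolveDbRef(…) above consults the ObjectPath before converting a DBRef, so a nested reference that points back to an object already under construction is returned from the path instead of being read again. A hypothetical pair of entities that exercises this:

    // Hypothetical domain model: when a Book's author DBRef points back at the Author
    // currently being read, the instance is taken from the ObjectPath rather than re-resolved.
    class Author {
        @Id String id;
        @DBRef Book favourite;
    }

    class Book {
        @Id String id;
        @DBRef Author author;
    }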
/**
* Performs the fetch operation for the given {@link DBRef}.
*

View File

@@ -25,6 +25,8 @@ import org.springframework.core.convert.ConversionFailedException;
import org.springframework.core.convert.TypeDescriptor;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mongodb.core.query.Term;
import org.springframework.util.StringUtils;
import com.mongodb.DBObject;
@@ -34,6 +36,7 @@ import com.mongodb.DBObject;
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
abstract class MongoConverters {
@@ -160,4 +163,19 @@ abstract class MongoConverters {
return source == null ? null : source.toString();
}
}
/**
* Converts a {@link Term} into its formatted {@link String} representation.
*
* @author Christoph Strobl
* @since 1.6
*/
@WritingConverter
public static enum TermToStringConverter implements Converter<Term, String> {
INSTANCE;
@Override
public String convert(Term source) {
return source == null ? null : source.getFormatted();
}
}
}

View File

@@ -0,0 +1,161 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
/**
* A path of objects nested into each other. The type allows access to all parent objects currently in creation even
* when resolving more nested objects. This makes it possible to avoid re-resolving object instances that are logically
* equivalent to already resolved ones.
* <p>
* An immutable ordered set of target objects for {@link DBObject} to {@link Object} conversions. Object paths start at
* {@link #ROOT} and are extended via {@link #push(Object, MongoPersistentEntity, Object)}.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.6
*/
class ObjectPath {
public static final ObjectPath ROOT = new ObjectPath();
private final List<ObjectPathItem> items;
private ObjectPath() {
this.items = Collections.emptyList();
}
/**
* Creates a new {@link ObjectPath} from the given parent {@link ObjectPath} by adding the provided
* {@link ObjectPathItem} to it.
*
* @param parent must not be {@literal null}.
* @param item must not be {@literal null}.
*/
private ObjectPath(ObjectPath parent, ObjectPath.ObjectPathItem item) {
List<ObjectPath.ObjectPathItem> items = new ArrayList<ObjectPath.ObjectPathItem>(parent.items);
items.add(item);
this.items = Collections.unmodifiableList(items);
}
/**
* Returns a copy of the {@link ObjectPath} with the given {@link Object} as current object.
*
* @param object must not be {@literal null}.
* @param entity must not be {@literal null}.
* @param id can be {@literal null}.
* @return
*/
public ObjectPath push(Object object, MongoPersistentEntity<?> entity, Object id) {
Assert.notNull(object, "Object must not be null!");
Assert.notNull(entity, "MongoPersistentEntity must not be null!");
ObjectPathItem item = new ObjectPathItem(object, id, entity.getCollection());
return new ObjectPath(this, item);
}
/**
* Returns the object with the given id, stored in the given collection, if it is contained in the {@link ObjectPath}.
*
* @param id must not be {@literal null}.
* @param collection must not be {@literal null} or empty.
* @return
*/
public Object getPathItem(Object id, String collection) {
Assert.notNull(id, "Id must not be null!");
Assert.hasText(collection, "Collection name must not be null!");
for (ObjectPathItem item : items) {
Object object = item.getObject();
if (object == null) {
continue;
}
if (item.getIdValue() == null) {
continue;
}
if (collection.equals(item.getCollection()) && id.equals(item.getIdValue())) {
return object;
}
}
return null;
}
/**
* Returns the current object of the {@link ObjectPath} or {@literal null} if the path is empty.
*
* @return
*/
public Object getCurrentObject() {
return items.isEmpty() ? null : items.get(items.size() - 1).getObject();
}
/**
* An item in an {@link ObjectPath}.
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
private static class ObjectPathItem {
private final Object object;
private final Object idValue;
private final String collection;
/**
* Creates a new {@link ObjectPathItem}.
*
* @param object
* @param idValue
* @param collection
*/
ObjectPathItem(Object object, Object idValue, String collection) {
this.object = object;
this.idValue = idValue;
this.collection = collection;
}
public Object getObject() {
return object;
}
public Object getIdValue() {
return idValue;
}
public String getCollection() {
return collection;
}
}
}
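A short sketch of the lookup semantics, assuming an already converted author object, its MongoPersistentEntity (mapped to an "authors" collection) and its id value are at hand:

    ObjectPath path = ObjectPath.ROOT.push(author, authorEntity, "42");
    Object hit = path.getPathItem("42", "authors");  // returns the author instance pushed above
    Object miss = path.getPathItem("42", "books");   // null: same id, but a different collection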

View File

@@ -57,6 +57,11 @@ import com.mongodb.DBRef;
public class QueryMapper {
private static final List<String> DEFAULT_ID_NAMES = Arrays.asList("id", "_id");
private static final DBObject META_TEXT_SCORE = new BasicDBObject("$meta", "textScore");
private enum MetaMapping {
FORCE, WHEN_PRESENT, IGNORE;
}
private final ConversionService conversionService;
private final MongoConverter converter;
@@ -119,6 +124,61 @@ public class QueryMapper {
return result;
}
/**
* Maps fields used for sorting to the {@link MongoPersistentEntity}'s properties. <br />
* Also converts properties to their {@code $meta} representation if present.
*
* @param sortObject
* @param entity
* @return
* @since 1.6
*/
public DBObject getMappedSort(DBObject sortObject, MongoPersistentEntity<?> entity) {
if (sortObject == null) {
return null;
}
DBObject mappedSort = getMappedObject(sortObject, entity);
mapMetaAttributes(mappedSort, entity, MetaMapping.WHEN_PRESENT);
return mappedSort;
}
/**
* Maps fields to retrieve to the {@link MongoPersistentEntity}'s properties. <br />
* Also converts properties to their {@code $meta} representation and adds it where missing.
*
* @param fieldsObject
* @param entity
* @return
* @since 1.6
*/
public DBObject getMappedFields(DBObject fieldsObject, MongoPersistentEntity<?> entity) {
DBObject mappedFields = fieldsObject != null ? getMappedObject(fieldsObject, entity) : new BasicDBObject();
mapMetaAttributes(mappedFields, entity, MetaMapping.FORCE);
return mappedFields.keySet().isEmpty() ? null : mappedFields;
}
private void mapMetaAttributes(DBObject source, MongoPersistentEntity<?> entity, MetaMapping metaMapping) {
if (entity == null || source == null) {
return;
}
if (entity.hasTextScoreProperty() && !MetaMapping.IGNORE.equals(metaMapping)) {
MongoPersistentProperty textScoreProperty = entity.getTextScoreProperty();
if (MetaMapping.FORCE.equals(metaMapping)
|| (MetaMapping.WHEN_PRESENT.equals(metaMapping) && source.containsField(textScoreProperty.getFieldName()))) {
source.putAll(getMappedTextScoreField(textScoreProperty));
}
}
}
private DBObject getMappedTextScoreField(MongoPersistentProperty property) {
return new BasicDBObject(property.getFieldName(), META_TEXT_SCORE);
}
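For an entity with a text-score property, getMappedFields(…) above is expected to force the corresponding $meta projection even when no fields document was supplied. A hypothetical entity using the @TextIndexed/@TextScore mapping annotations for illustration:

    // "score" is not a stored field; the projection {"score": {"$meta": "textScore"}} added by
    // getMappedFields(…) populates it from the text search relevance.
    @Document
    class CookingRecipe {
        @Id String id;
        @TextIndexed String title;
        @TextScore Float score;
    }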
/**
* Extracts the mapped object value for the given field out of rawValue, taking nested {@link Keyword}s into account.
*
@@ -250,14 +310,31 @@ public class QueryMapper {
* type of the given value is compatible with the type of the given document field in order to deal with potential
* query field exclusions, since MongoDB uses the {@code int} {@literal 0} as an indicator for an excluded field.
*
* @param documentField
* @param documentField must not be {@literal null}.
* @param value
* @return
*/
protected boolean isAssociationConversionNecessary(Field documentField, Object value) {
return documentField.isAssociation() && value != null
&& (documentField.getProperty().getActualType().isAssignableFrom(value.getClass()) //
|| documentField.getPropertyEntity().getIdProperty().getActualType().isAssignableFrom(value.getClass()));
Assert.notNull(documentField, "Document field must not be null!");
if (value == null) {
return false;
}
if (!documentField.isAssociation()) {
return false;
}
Class<? extends Object> type = value.getClass();
MongoPersistentProperty property = documentField.getProperty();
if (property.getActualType().isAssignableFrom(type)) {
return true;
}
MongoPersistentEntity<?> entity = documentField.getPropertyEntity();
return entity.hasIdProperty() && entity.getIdProperty().getActualType().isAssignableFrom(type);
}
/**
@@ -289,7 +366,7 @@ public class QueryMapper {
* @return the converted mongo type or null if source is null
*/
protected Object delegateConvertToMongoType(Object source, MongoPersistentEntity<?> entity) {
return converter.convertToMongoType(source);
return converter.convertToMongoType(source, entity == null ? null : entity.getTypeInformation());
}
protected Object convertAssociation(Object source, Field field) {
@@ -594,6 +671,21 @@ public class QueryMapper {
*/
public MetadataBackedField(String name, MongoPersistentEntity<?> entity,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context) {
this(name, entity, context, null);
}
/**
* Creates a new {@link MetadataBackedField} with the given name, {@link MongoPersistentEntity} and
* {@link MappingContext} with the given {@link MongoPersistentProperty}.
*
* @param name must not be {@literal null} or empty.
* @param entity must not be {@literal null}.
* @param context must not be {@literal null}.
* @param property may be {@literal null}.
*/
public MetadataBackedField(String name, MongoPersistentEntity<?> entity,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context,
MongoPersistentProperty property) {
super(name);
@@ -603,7 +695,7 @@ public class QueryMapper {
this.mappingContext = context;
this.path = getPath(name);
this.property = path == null ? null : path.getLeafProperty();
this.property = path == null ? property : path.getLeafProperty();
this.association = findAssociation();
}
@@ -613,7 +705,7 @@ public class QueryMapper {
*/
@Override
public MetadataBackedField with(String name) {
return new MetadataBackedField(name, entity, mappingContext);
return new MetadataBackedField(name, entity, mappingContext, property);
}
/*

View File

@@ -25,6 +25,7 @@ import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty.PropertyToFieldNameConverter;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update.Modifier;
import org.springframework.data.mongodb.core.query.Update.Modifiers;
import org.springframework.data.util.ClassTypeInformation;
@@ -79,10 +80,19 @@ public class UpdateMapper extends QueryMapper {
return createMapEntry(field, convertSimpleOrDBObject(rawValue, field.getPropertyEntity()));
}
if (!isUpdateModifier(rawValue)) {
return super.getMappedObjectForField(field, getMappedValue(field, rawValue));
if (isQuery(rawValue)) {
return createMapEntry(field,
super.getMappedObject(((Query) rawValue).getQueryObject(), field.getPropertyEntity()));
}
if (isUpdateModifier(rawValue)) {
return getMappedUpdateModifier(field, rawValue);
}
return super.getMappedObjectForField(field, getMappedValue(field, rawValue));
}
private Entry<String, Object> getMappedUpdateModifier(Field field, Object rawValue) {
Object value = null;
if (rawValue instanceof Modifier) {
@@ -99,7 +109,6 @@ public class UpdateMapper extends QueryMapper {
value = modificationOperations;
} else {
throw new IllegalArgumentException(String.format("Unable to map value of type '%s'!", rawValue.getClass()));
}
@@ -119,6 +128,10 @@ public class UpdateMapper extends QueryMapper {
return value instanceof Modifier || value instanceof Modifiers;
}
private boolean isQuery(Object value) {
return value instanceof Query;
}
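The new isQuery(…) branch means an update value that is itself a Query (for example the condition handed to a $pull) now has its query object mapped against the property's entity as well. A sketch, where updateMapper and personEntity are assumed to exist and "comments" is a collection property whose elements carry an "author" field:

    Update update = new Update().pull("comments", Query.query(Criteria.where("author").is("bob")));
    // the nested query's fields and values are mapped against the element type of "comments"
    DBObject mappedUpdate = updateMapper.getMappedObject(update.getUpdateObject(), personEntity);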
private DBObject getMappedValue(Modifier modifier) {
Object value = converter.convertToMongoType(modifier.getValue(), ClassTypeInformation.OBJECT);

View File

@@ -0,0 +1,42 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.DBObject;
/**
* Internal API to trigger the resolution of properties.
*
* @author Oliver Gierke
*/
interface ValueResolver {
/**
* Resolves the value for the given {@link MongoPersistentProperty} within the given {@link DBObject} using the given
* {@link SpELExpressionEvaluator} and {@link ObjectPath}.
*
* @param prop the property to resolve the value for.
* @param dbo the source {@link DBObject}.
* @param evaluator the {@link SpELExpressionEvaluator} to use.
* @param parent the current {@link ObjectPath}.
* @return the resolved value.
*/
Object getValueInternal(MongoPersistentProperty prop, DBObject dbo, SpELExpressionEvaluator evaluator,
ObjectPath parent);
}

View File

@@ -1,75 +0,0 @@
/*
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.geo.Point;
/**
* Represents a geospatial box value.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Box}. This class is scheduled to be
* removed in the next major release.
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class Box extends org.springframework.data.geo.Box implements Shape {
public static final String COMMAND = "$box";
public Box(Point lowerLeft, Point upperRight) {
super(lowerLeft, upperRight);
}
public Box(double[] lowerLeft, double[] upperRight) {
super(lowerLeft, upperRight);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
*/
public List<? extends Object> asList() {
List<List<Double>> list = new ArrayList<List<Double>>();
list.add(Arrays.asList(getFirst().getX(), getFirst().getY()));
list.add(Arrays.asList(getSecond().getX(), getSecond().getY()));
return list;
}
public org.springframework.data.mongodb.core.geo.Point getLowerLeft() {
return new org.springframework.data.mongodb.core.geo.Point(getFirst());
}
public org.springframework.data.mongodb.core.geo.Point getUpperRight() {
return new org.springframework.data.mongodb.core.geo.Point(getSecond());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
*/
public String getCommand() {
return COMMAND;
}
}
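This deletion, together with the removal of the remaining deprecated geo classes in the files below, leaves only the Spring Data Commons types. A migration sketch assuming imports from org.springframework.data.geo:

    Box box = new Box(new Point(0, 0), new Point(10, 10));
    Circle circle = new Circle(new Point(1, 1), new Distance(5, Metrics.KILOMETERS));
    Polygon polygon = new Polygon(new Point(0, 0), new Point(0, 5), new Point(5, 5));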

View File

@@ -1,154 +0,0 @@
/*
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.util.Assert;
/**
* Represents a geospatial circle value.
* <p>
* Note: We deliberately do not extend org.springframework.data.geo.Circle because introducing its distance concept
* would break clients that use the old Circle API.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Circle}. This class is scheduled to be
* removed in the next major release.
*/
@Deprecated
public class Circle implements Shape {
public static final String COMMAND = "$center";
private final Point center;
private final double radius;
/**
* Creates a new {@link Circle} from the given {@link Point} and radius.
*
* @param center must not be {@literal null}.
* @param radius must be greater or equal to zero.
*/
@PersistenceConstructor
public Circle(Point center, double radius) {
Assert.notNull(center);
Assert.isTrue(radius >= 0, "Radius must not be negative!");
this.center = center;
this.radius = radius;
}
/**
* Creates a new {@link Circle} from the given coordinates and radius as {@link Distance} with a
* {@link Metrics#NEUTRAL}.
*
* @param centerX
* @param centerY
* @param radius must be greater or equal to zero.
*/
public Circle(double centerX, double centerY, double radius) {
this(new Point(centerX, centerY), radius);
}
/**
* Returns the center of the {@link Circle}.
*
* @return will never be {@literal null}.
*/
public Point getCenter() {
return center;
}
/**
* Returns the radius of the {@link Circle}.
*
* @return
*/
public double getRadius() {
return radius;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
*/
public List<Object> asList() {
List<Object> result = new ArrayList<Object>();
result.add(Arrays.asList(getCenter().getX(), getCenter().getY()));
result.add(getRadius());
return result;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
*/
public String getCommand() {
return COMMAND;
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return String.format("Circle [center=%s, radius=%f]", center, radius);
}
/* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || !getClass().equals(obj.getClass())) {
return false;
}
Circle that = (Circle) obj;
return this.center.equals(that.center) && this.radius == that.radius;
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += 31 * center.hashCode();
result += 31 * radius;
return result;
}
}

View File

@@ -1,37 +0,0 @@
/*
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
/**
* Value object to create custom {@link Metric}s on the fly.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Metric}. This class is scheduled to be
* removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class CustomMetric extends org.springframework.data.geo.CustomMetric implements Metric {
/**
* Creates a custom {@link Metric} using the given multiplier.
*
* @param multiplier
*/
public CustomMetric(double multiplier) {
super(multiplier);
}
}

View File

@@ -1,44 +0,0 @@
/*
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import org.springframework.data.geo.Metric;
import org.springframework.data.geo.Metrics;
/**
* Value object to represent distances in a given metric.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Distance}. This class is scheduled to
* be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class Distance extends org.springframework.data.geo.Distance {
/**
* Creates a new {@link Distance}.
*
* @param value
*/
public Distance(double value) {
this(value, Metrics.NEUTRAL);
}
public Distance(double value, Metric metric) {
super(value, metric);
}
}

View File

@@ -1,54 +0,0 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.Pageable;
/**
* Custom {@link Page} to carry the average distance retrieved from the {@link GeoResults} the {@link GeoPage} is set up
* from.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.GeoPage}. This class is scheduled to
* be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class GeoPage<T> extends org.springframework.data.geo.GeoPage<T> {
private static final long serialVersionUID = 23421312312412L;
/**
* Creates a new {@link GeoPage} from the given {@link GeoResults}.
*
* @param content must not be {@literal null}.
*/
public GeoPage(GeoResults<T> results) {
super(results);
}
/**
* Creates a new {@link GeoPage} from the given {@link GeoResults}, {@link Pageable} and total.
*
* @param results must not be {@literal null}.
* @param pageable must not be {@literal null}.
* @param total
*/
public GeoPage(GeoResults<T> results, Pageable pageable, long total) {
super(results, pageable, total);
}
}

View File

@@ -1,38 +0,0 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
/**
* Value object capturing some arbitrary object plus a distance.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.GeoResult}. This class is scheduled to
* be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class GeoResult<T> extends org.springframework.data.geo.GeoResult<T> {
/**
* Creates a new {@link GeoResult} for the given content and distance.
*
* @param content must not be {@literal null}.
* @param distance must not be {@literal null}.
*/
public GeoResult(T content, Distance distance) {
super(content, distance);
}
}

View File

@@ -1,60 +0,0 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.List;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.GeoResult;
import org.springframework.data.geo.Metric;
/**
* Value object to capture {@link GeoResult}s as well as the average distance they have.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.GeoResults}. This class is scheduled
* to be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class GeoResults<T> extends org.springframework.data.geo.GeoResults<T> {
/**
* Creates a new {@link GeoResults} instance manually calculating the average distance from the distance values of the
* given {@link GeoResult}s.
*
* @param results must not be {@literal null}.
*/
public GeoResults(List<? extends GeoResult<T>> results) {
super(results);
}
public GeoResults(List<? extends GeoResult<T>> results, Metric metric) {
super(results, metric);
}
/**
* Creates a new {@link GeoResults} instance from the given {@link GeoResult}s and average distance.
*
* @param results must not be {@literal null}.
* @param averageDistance
*/
@PersistenceConstructor
public GeoResults(List<? extends GeoResult<T>> results, Distance averageDistance) {
super(results, averageDistance);
}
}

View File

@@ -1,48 +0,0 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import org.springframework.data.mongodb.core.query.NearQuery;
/**
* Commonly used {@link Metrics} for {@link NearQuery}s.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Metrics}. This class is scheduled to
* be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public enum Metrics implements Metric {
KILOMETERS(org.springframework.data.geo.Metrics.KILOMETERS.getMultiplier()), //
MILES(org.springframework.data.geo.Metrics.MILES.getMultiplier()), //
NEUTRAL(org.springframework.data.geo.Metrics.NEUTRAL.getMultiplier()); //
private final double multiplier;
private Metrics(double multiplier) {
this.multiplier = multiplier;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Metric#getMultiplier()
*/
public double getMultiplier() {
return multiplier;
}
}

View File

@@ -1,55 +0,0 @@
/*
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.annotation.PersistenceConstructor;
/**
* Represents a geospatial point value.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Point}. This class is scheduled to be
* removed in the next major release.
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class Point extends org.springframework.data.geo.Point {
@PersistenceConstructor
public Point(double x, double y) {
super(x, y);
}
public Point(org.springframework.data.geo.Point point) {
super(point);
}
public double[] asArray() {
return new double[] { getX(), getY() };
}
public List<Double> asList() {
return asList(this);
}
public static List<Double> asList(org.springframework.data.geo.Point point) {
return Arrays.asList(point.getX(), point.getY());
}
}

View File

@@ -1,92 +0,0 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.geo.Point;
/**
* Simple value object to represent a {@link Polygon}.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Polygon}. This class is scheduled to be
* removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class Polygon extends org.springframework.data.geo.Polygon implements Shape {
public static final String COMMAND = "$polygon";
/**
* Creates a new {@link Polygon} for the given Points.
*
* @param x
* @param y
* @param z
* @param others
*/
public <P extends Point> Polygon(P x, P y, P z, P... others) {
super(x, y, z, others);
}
/**
* Creates a new {@link Polygon} for the given Points.
*
* @param points
*/
public <P extends Point> Polygon(List<P> points) {
super(points);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
*/
public String getCommand() {
return COMMAND;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
*/
@Override
public List<? extends Object> asList() {
return asList(this);
}
/**
* Returns a {@link List} of x,y-coordinate tuples of {@link Point}s from the given {@link Polygon}.
*
* @param polygon
* @return
*/
public static List<? extends Object> asList(org.springframework.data.geo.Polygon polygon) {
List<Point> points = polygon.getPoints();
List<List<Double>> tuples = new ArrayList<List<Double>>(points.size());
for (Point point : points) {
tuples.add(Arrays.asList(point.getX(), point.getY()));
}
return tuples;
}
}

View File

@@ -1,45 +0,0 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.List;
/**
* Common interface for all shapes. Allows building MongoDB representations of them.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Shape}. This class is scheduled to be
* removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public interface Shape extends org.springframework.data.geo.Shape {
/**
* Returns the {@link Shape} as a list, usually of {@link Double}s or {@link List}s of {@link Double}s. The wildcard
* bound allows implementations to return a more concrete element type.
*
* @return
*/
List<? extends Object> asList();
/**
* Returns the command to be used to create the {@literal $within} criterion.
*
* @return
*/
String getCommand();
}

View File

@@ -22,6 +22,7 @@ import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.geo.Circle;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Point;
import org.springframework.data.geo.Shape;
import org.springframework.util.Assert;
/**
@@ -30,7 +31,6 @@ import org.springframework.util.Assert;
* @author Thomas Darimont
* @since 1.5
*/
@SuppressWarnings("deprecation")
public class Sphere implements Shape {
public static final String COMMAND = "$centerSphere";
@@ -73,23 +73,13 @@ public class Sphere implements Shape {
this(circle.getCenter(), circle.getRadius());
}
/**
* Creates a Sphere from the given {@link Circle}.
*
* @param circle
*/
@Deprecated
public Sphere(org.springframework.data.mongodb.core.geo.Circle circle) {
this(circle.getCenter(), circle.getRadius());
}
/**
* Returns the center of the {@link Circle}.
*
* @return will never be {@literal null}.
*/
public org.springframework.data.mongodb.core.geo.Point getCenter() {
return new org.springframework.data.mongodb.core.geo.Point(this.center);
public Point getCenter() {
return new Point(this.center);
}
/**
@@ -141,20 +131,21 @@ public class Sphere implements Shape {
return result;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
/**
* Returns the {@link Shape} as a list, usually of {@link Double}s or {@link List}s of {@link Double}s. The wildcard
* bound allows implementations to return a more concrete element type.
*
* @return
*/
@Override
public List<? extends Object> asList() {
return Arrays.asList(Arrays.asList(center.getX(), center.getY()), this.radius.getValue());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
/**
* Returns the command to be used to create the {@literal $within} criterion.
*
* @return
*/
@Override
public String getCommand() {
return COMMAND;
}
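A brief sketch (values invented, assuming the (Point, Distance) constructor the delegation above points to): Sphere still implements the local Shape contract, so asList() and getCommand() yield the $centerSphere representation; Point, Distance and Metrics are the org.springframework.data.geo types.

// Hypothetical usage of Sphere as a $centerSphere shape.
Sphere sphere = new Sphere(new Point(-73.99, 40.73), new Distance(10, Metrics.KILOMETERS));
List<? extends Object> shape = sphere.asList();   // [[-73.99, 40.73], 10.0]
String command = sphere.getCommand();             // "$centerSphere"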

View File

@@ -36,11 +36,12 @@ public @interface CompoundIndex {
/**
* The actual index definition in JSON format. The keys of the JSON document are the fields to be indexed, the values
* define the index direction (1 for ascending, -1 for descending).
* define the index direction (1 for ascending, -1 for descending). <br />
* If left empty on a nested document, the whole document will be indexed.
*
* @return
*/
String def();
String def() default "";
/**
* It does not actually make sense to use that attribute as the direction has to be defined in the {@link #def()}
@@ -78,6 +79,15 @@ public @interface CompoundIndex {
*/
String name() default "";
/**
* If set to {@literal true} then MongoDB will ignore the given index name and instead generate a new name. Defaults
* to {@literal false}.
*
* @return
* @since 1.5
*/
boolean useGeneratedName() default false;
/**
* The collection the index will be created in. Will default to the collection the annotated domain class will be
* stored in.
@@ -97,8 +107,11 @@ public @interface CompoundIndex {
/**
* Configures the number of seconds after which the collection should expire. Defaults to -1 for no expiry.
*
* @deprecated TTL cannot be defined for a {@link CompoundIndex} that has more than one field as key. Will be removed
* in 1.6.
* @see http://docs.mongodb.org/manual/tutorial/expire-data/
* @return
*/
@Deprecated
int expireAfterSeconds() default -1;
}
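A hedged mapping sketch (class and field names invented) showing the JSON key definition the attribute documents; with def() now defaulting to an empty string, the annotation can also be placed on a nested document type without spelling out the keys.

// Hypothetical compound index declaration; 1 = ascending, -1 = descending.
@Document
@CompoundIndexes({
        @CompoundIndex(name = "last_first_idx", def = "{'lastname': 1, 'firstname': -1}")
})
class Person {
        String firstname;
        String lastname;
}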

View File

@@ -0,0 +1,56 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Index definition to span multiple keys.
*
* @author Christoph Strobl
* @since 1.5
*/
public class CompoundIndexDefinition extends Index {
private DBObject keys;
/**
* Creates a new {@link CompoundIndexDefinition} for the given keys.
*
* @param keys must not be {@literal null}.
*/
public CompoundIndexDefinition(DBObject keys) {
Assert.notNull(keys, "Keys must not be null!");
this.keys = keys;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.Index#getIndexKeys()
*/
@Override
public DBObject getIndexKeys() {
BasicDBObject dbo = new BasicDBObject();
dbo.putAll(this.keys);
dbo.putAll(super.getIndexKeys());
return dbo;
}
}
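The programmatic counterpart, sketched under the assumption that a MongoTemplate instance named template and the Person class above are available:

// Hypothetical programmatic equivalent of the annotation-based compound index.
DBObject keys = new BasicDBObject("lastname", 1).append("firstname", -1);
CompoundIndexDefinition definition = new CompoundIndexDefinition(keys);
template.indexOps(Person.class).ensureIndex(definition.named("last_first_idx"));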

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -25,6 +25,7 @@ import java.lang.annotation.Target;
*
* @author Jon Brisbin
* @author Laurent Canet
* @author Thomas Darimont
*/
@Target(ElementType.FIELD)
@Retention(RetentionPolicy.RUNTIME)
@@ -37,6 +38,15 @@ public @interface GeoSpatialIndexed {
*/
String name() default "";
/**
* If set to {@literal true} then MongoDB will ignore the given index name and instead generate a new name. Defaults
* to {@literal false}.
*
* @return
* @since 1.5
*/
boolean useGeneratedName() default false;
/**
* Name of the collection in which to create the index.
*

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,6 +27,7 @@ import com.mongodb.DBObject;
* @author Jon Brisbin
* @author Oliver Gierke
* @author Laurent Canet
* @author Christoph Strobl
*/
public class GeospatialIndex implements IndexDefinition {
@@ -57,8 +58,6 @@ public class GeospatialIndex implements IndexDefinition {
*/
public GeospatialIndex named(String name) {
Assert.hasText(name, "Name must have text!");
this.name = name;
return this;
}
@@ -151,12 +150,12 @@ public class GeospatialIndex implements IndexDefinition {
public DBObject getIndexOptions() {
if (name == null && min == null && max == null && bucketSize == null) {
if (!StringUtils.hasText(name) && min == null && max == null && bucketSize == null) {
return null;
}
DBObject dbo = new BasicDBObject();
if (name != null) {
if (StringUtils.hasText(name)) {
dbo.put("name", name);
}
@@ -186,6 +185,7 @@ public class GeospatialIndex implements IndexDefinition {
}
break;
}
return dbo;
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,13 +17,20 @@ package org.springframework.data.mongodb.core.index;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.concurrent.TimeUnit;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* @author Oliver Gierke
* @author Christoph Strobl
*/
@SuppressWarnings("deprecation")
public class Index implements IndexDefinition {
@@ -41,6 +48,10 @@ public class Index implements IndexDefinition {
private boolean sparse = false;
private boolean background = false;
private long expire = -1;
public Index() {}
public Index(String key, Direction direction) {
@@ -104,6 +115,44 @@ public class Index implements IndexDefinition {
return this;
}
/**
* Build the index in background (non blocking).
*
* @return
* @since 1.5
*/
public Index background() {
this.background = true;
return this;
}
/**
* Specifies TTL in seconds.
*
* @param value
* @return
* @since 1.5
*/
public Index expire(long value) {
return expire(value, TimeUnit.SECONDS);
}
/**
* Specifies TTL with given {@link TimeUnit}.
*
* @param value
* @param unit
* @return
* @since 1.5
*/
public Index expire(long value, TimeUnit unit) {
Assert.notNull(unit, "TimeUnit for expiration must not be null.");
this.expire = unit.toSeconds(value);
return this;
}
/**
* @see http://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping
* @param duplicates
@@ -125,11 +174,9 @@ public class Index implements IndexDefinition {
}
public DBObject getIndexOptions() {
if (name == null && !unique) {
return null;
}
DBObject dbo = new BasicDBObject();
if (name != null) {
if (StringUtils.hasText(name)) {
dbo.put("name", name);
}
if (unique) {
@@ -141,6 +188,13 @@ public class Index implements IndexDefinition {
if (sparse) {
dbo.put("sparse", true);
}
if (background) {
dbo.put("background", true);
}
if (expire >= 0) {
dbo.put("expireAfterSeconds", expire);
}
return dbo;
}
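A short sketch of the new fluent options (names invented, template assumed to be a MongoTemplate): background() and expire(…) surface as the background and expireAfterSeconds options emitted by getIndexOptions().

// Hypothetical TTL index built in the background, expiring documents 30 days after lastModified.
Index index = new Index().on("lastModified", Direction.ASC)
        .named("lastModified_ttl_idx")
        .background()
        .expire(30, TimeUnit.DAYS);
template.indexOps(AuditEntry.class).ensureIndex(index);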

View File

@@ -1,5 +1,5 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright (c) 2011-2014 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,11 +20,11 @@ import com.mongodb.DBObject;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Christoph Strobl
*/
public interface IndexDefinition {
DBObject getIndexKeys();
DBObject getIndexOptions();
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012-2013 the original author or authors.
* Copyright 2012-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,22 +24,33 @@ import org.springframework.util.ObjectUtils;
* Value object for an index field.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
@SuppressWarnings("deprecation")
public final class IndexField {
enum Type {
GEO, TEXT, DEFAULT;
}
private final String key;
private final Direction direction;
private final boolean isGeo;
private final Type type;
private final Float weight;
private IndexField(String key, Direction direction, boolean isGeo) {
private IndexField(String key, Direction direction, Type type) {
this(key, direction, type, Float.NaN);
}
private IndexField(String key, Direction direction, Type type, Float weight) {
Assert.hasText(key);
Assert.isTrue(direction != null ^ isGeo);
Assert.isTrue(direction != null ^ (Type.GEO.equals(type) || Type.TEXT.equals(type)));
this.key = key;
this.direction = direction;
this.isGeo = isGeo;
this.type = type == null ? Type.DEFAULT : type;
this.weight = weight == null ? Float.NaN : weight;
}
/**
@@ -53,12 +64,12 @@ public final class IndexField {
@Deprecated
public static IndexField create(String key, Order order) {
Assert.notNull(order);
return new IndexField(key, order.toDirection(), false);
return new IndexField(key, order.toDirection(), Type.DEFAULT);
}
public static IndexField create(String key, Direction order) {
Assert.notNull(order);
return new IndexField(key, order, false);
return new IndexField(key, order, Type.DEFAULT);
}
/**
@@ -68,7 +79,16 @@ public final class IndexField {
* @return
*/
public static IndexField geo(String key) {
return new IndexField(key, null, true);
return new IndexField(key, null, Type.GEO);
}
/**
* Creates a text {@link IndexField} for the given key.
*
* @since 1.6
*/
public static IndexField text(String key, Float weight) {
return new IndexField(key, null, Type.TEXT, weight);
}
/**
@@ -101,10 +121,20 @@ public final class IndexField {
/**
* Returns whether the {@link IndexField} is a geo index field.
*
* @return the isGeo
* @return true if type is {@link Type#GEO}.
*/
public boolean isGeo() {
return isGeo;
return Type.GEO.equals(type);
}
/**
* Returns whether the {@link IndexField} is a text index field.
*
* @return true if type is {@link Type#TEXT}
* @since 1.6
*/
public boolean isText() {
return Type.TEXT.equals(type);
}
/*
@@ -125,7 +155,7 @@ public final class IndexField {
IndexField that = (IndexField) obj;
return this.key.equals(that.key) && ObjectUtils.nullSafeEquals(this.direction, that.direction)
&& this.isGeo == that.isGeo;
&& this.type == that.type;
}
/*
@@ -138,7 +168,8 @@ public final class IndexField {
int result = 17;
result += 31 * ObjectUtils.nullSafeHashCode(key);
result += 31 * ObjectUtils.nullSafeHashCode(direction);
result += 31 * ObjectUtils.nullSafeHashCode(isGeo);
result += 31 * ObjectUtils.nullSafeHashCode(type);
result += 31 * ObjectUtils.nullSafeHashCode(weight);
return result;
}
@@ -148,6 +179,7 @@ public final class IndexField {
*/
@Override
public String toString() {
return String.format("IndexField [ key: %s, direction: %s, isGeo: %s]", key, direction, isGeo);
return String.format("IndexField [ key: %s, direction: %s, type: %s, weight: %s]", key, direction, type, weight);
}
}
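A small sketch of the new text field type (key names invented); Direction is org.springframework.data.domain.Sort.Direction.

// Hypothetical IndexField instances as they would show up for a weighted text index.
IndexField weighted = IndexField.text("description", 2.0F);
boolean isText = weighted.isText();    // true
boolean isGeo = weighted.isGeo();      // false
IndexField regular = IndexField.create("lastname", Direction.ASC);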

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2002-2011 the original author or authors.
* Copyright 2002-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -23,6 +23,11 @@ import java.util.List;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* @author Mark Pollack
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class IndexInfo {
private final List<IndexField> indexFields;
@@ -31,14 +36,30 @@ public class IndexInfo {
private final boolean unique;
private final boolean dropDuplicates;
private final boolean sparse;
private final String language;
/**
* @deprecated Will be removed in 1.7. Please use {@link #IndexInfo(List, String, boolean, boolean, boolean, String)}
* @param indexFields
* @param name
* @param unique
* @param dropDuplicates
* @param sparse
*/
@Deprecated
public IndexInfo(List<IndexField> indexFields, String name, boolean unique, boolean dropDuplicates, boolean sparse) {
this(indexFields, name, unique, dropDuplicates, sparse, "");
}
public IndexInfo(List<IndexField> indexFields, String name, boolean unique, boolean dropDuplicates, boolean sparse,
String language) {
this.indexFields = Collections.unmodifiableList(indexFields);
this.name = name;
this.unique = unique;
this.dropDuplicates = dropDuplicates;
this.sparse = sparse;
this.language = language;
}
/**
@@ -84,14 +105,23 @@ public class IndexInfo {
return sparse;
}
/**
* @return
* @since 1.6
*/
public String getLanguage() {
return language;
}
@Override
public String toString() {
return "IndexInfo [indexFields=" + indexFields + ", name=" + name + ", unique=" + unique + ", dropDuplicates="
+ dropDuplicates + ", sparse=" + sparse + "]";
+ dropDuplicates + ", sparse=" + sparse + ", language=" + language + "]";
}
@Override
public int hashCode() {
final int prime = 31;
int result = 1;
result = prime * result + (dropDuplicates ? 1231 : 1237);
@@ -99,6 +129,7 @@ public class IndexInfo {
result = prime * result + ((name == null) ? 0 : name.hashCode());
result = prime * result + (sparse ? 1231 : 1237);
result = prime * result + (unique ? 1231 : 1237);
result = prime * result + ObjectUtils.nullSafeHashCode(language);
return result;
}
@@ -137,6 +168,9 @@ public class IndexInfo {
if (unique != other.unique) {
return false;
}
if (!ObjectUtils.nullSafeEquals(language, other.language)) {
return false;
}
return true;
}
}
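A hedged construction sketch for the new language-aware constructor (all values invented):

// Hypothetical IndexInfo as it might be read back for a text index with default language "english".
IndexInfo info = new IndexInfo(
        Collections.singletonList(IndexField.text("description", 2.0F)),
        "description_text_idx", false, false, false, "english");
String language = info.getLanguage();   // "english"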

View File

@@ -0,0 +1,37 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexResolver.IndexDefinitionHolder;
/**
* {@link IndexResolver} finds those {@link IndexDefinition}s to be created for a given class.
*
* @author Christoph Strobl
* @since 1.5
*/
interface IndexResolver {
/**
* Find and create {@link IndexDefinition}s for properties of given {@code type}. {@link IndexDefinition}s are created
* for properties and types with {@link Indexed}, {@link CompoundIndexes} or {@link GeoSpatialIndexed}.
*
* @param type
* @return Empty {@link Iterable} in case no {@link IndexDefinition} could be resolved for type.
*/
Iterable<? extends IndexDefinitionHolder> resolveIndexForClass(Class<?> type);
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,6 +27,7 @@ import java.lang.annotation.Target;
* @author Oliver Gierke
* @author Philipp Schneider
* @author Johno Crawford
* @author Thomas Darimont
*/
@Target(ElementType.FIELD)
@Retention(RetentionPolicy.RUNTIME)
@@ -63,6 +64,15 @@ public @interface Indexed {
*/
String name() default "";
/**
* If set to {@literal true} then MongoDB will ignore the given index name and instead generate a new name. Defaults
* to {@literal false}.
*
* @return
* @since 1.5
*/
boolean useGeneratedName() default false;
/**
* Collection name for the index to be created in.
*

View File

@@ -22,19 +22,15 @@ import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.context.ApplicationListener;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.MappingContextEvent;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexResolver.IndexDefinitionHolder;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mongodb.util.JSON;
/**
* Component that inspects {@link MongoPersistentEntity} instances contained in the given {@link MongoMappingContext}
@@ -45,6 +41,7 @@ import com.mongodb.util.JSON;
* @author Philipp Schneider
* @author Johno Crawford
* @author Laurent Canet
* @author Christoph Strobl
*/
public class MongoPersistentEntityIndexCreator implements
ApplicationListener<MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty>> {
@@ -54,21 +51,37 @@ public class MongoPersistentEntityIndexCreator implements
private final Map<Class<?>, Boolean> classesSeen = new ConcurrentHashMap<Class<?>, Boolean>();
private final MongoDbFactory mongoDbFactory;
private final MongoMappingContext mappingContext;
private final IndexResolver indexResolver;
/**
* Creates a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* {@link MongoDbFactory}.
*
* @param mappingContext must not be {@literal null}
* @param mongoDbFactory must not be {@literal null}
* @param mappingContext must not be {@literal null}.
* @param mongoDbFactory must not be {@literal null}.
*/
public MongoPersistentEntityIndexCreator(MongoMappingContext mappingContext, MongoDbFactory mongoDbFactory) {
this(mappingContext, mongoDbFactory, new MongoPersistentEntityIndexResolver(mappingContext));
}
/**
* Creates a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* {@link MongoDbFactory}.
*
* @param mappingContext must not be {@literal null}.
* @param mongoDbFactory must not be {@literal null}.
* @param indexResolver must not be {@literal null}.
*/
public MongoPersistentEntityIndexCreator(MongoMappingContext mappingContext, MongoDbFactory mongoDbFactory,
IndexResolver indexResolver) {
Assert.notNull(mongoDbFactory);
Assert.notNull(mappingContext);
Assert.notNull(indexResolver);
this.mongoDbFactory = mongoDbFactory;
this.mappingContext = mappingContext;
this.indexResolver = indexResolver;
for (MongoPersistentEntity<?> entity : mappingContext.getPersistentEntities()) {
checkForIndexes(entity);
@@ -93,87 +106,36 @@ public class MongoPersistentEntityIndexCreator implements
}
}
protected void checkForIndexes(final MongoPersistentEntity<?> entity) {
final Class<?> type = entity.getType();
private void checkForIndexes(final MongoPersistentEntity<?> entity) {
Class<?> type = entity.getType();
if (!classesSeen.containsKey(type)) {
this.classesSeen.put(type, Boolean.TRUE);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Analyzing class " + type + " for index information.");
}
// Make sure indexes get created
if (type.isAnnotationPresent(CompoundIndexes.class)) {
CompoundIndexes indexes = type.getAnnotation(CompoundIndexes.class);
for (CompoundIndex index : indexes.value()) {
String indexColl = StringUtils.hasText(index.collection()) ? index.collection() : entity.getCollection();
DBObject definition = (DBObject) JSON.parse(index.def());
ensureIndex(indexColl, index.name(), definition, index.unique(), index.dropDups(), index.sparse(),
index.background(), index.expireAfterSeconds());
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Created compound index " + index);
}
}
}
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
public void doWithPersistentProperty(MongoPersistentProperty property) {
if (property.isAnnotationPresent(Indexed.class)) {
Indexed index = property.findAnnotation(Indexed.class);
String name = index.name();
if (!StringUtils.hasText(name)) {
name = property.getFieldName();
} else {
if (!name.equals(property.getName()) && index.unique() && !index.sparse()) {
// Names don't match, and sparse is not true. This situation will generate an error on the server.
if (LOGGER.isWarnEnabled()) {
LOGGER.warn("The index name " + name + " doesn't match this property name: " + property.getName()
+ ". Setting sparse=true on this index will prevent errors when inserting documents.");
}
}
}
String collection = StringUtils.hasText(index.collection()) ? index.collection() : entity.getCollection();
int direction = index.direction() == IndexDirection.ASCENDING ? 1 : -1;
DBObject definition = new BasicDBObject(property.getFieldName(), direction);
ensureIndex(collection, name, definition, index.unique(), index.dropDups(), index.sparse(),
index.background(), index.expireAfterSeconds());
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Created property index " + index);
}
} else if (property.isAnnotationPresent(GeoSpatialIndexed.class)) {
GeoSpatialIndexed index = property.findAnnotation(GeoSpatialIndexed.class);
GeospatialIndex indexObject = new GeospatialIndex(property.getFieldName());
indexObject.withMin(index.min()).withMax(index.max());
indexObject.named(StringUtils.hasText(index.name()) ? index.name() : property.getName());
indexObject.typed(index.type()).withBucketSize(index.bucketSize())
.withAdditionalField(index.additionalField());
String collection = StringUtils.hasText(index.collection()) ? index.collection() : entity.getCollection();
mongoDbFactory.getDb().getCollection(collection)
.ensureIndex(indexObject.getIndexKeys(), indexObject.getIndexOptions());
if (LOGGER.isDebugEnabled()) {
LOGGER.debug(String.format("Created %s for entity %s in collection %s! ", indexObject, entity.getType(),
collection));
}
}
}
});
classesSeen.put(type, true);
checkForAndCreateIndexes(entity);
}
}
private void checkForAndCreateIndexes(MongoPersistentEntity<?> entity) {
if (entity.findAnnotation(Document.class) != null) {
for (IndexDefinitionHolder indexToCreate : indexResolver.resolveIndexForClass(entity.getType())) {
createIndex(indexToCreate);
}
}
}
private void createIndex(IndexDefinitionHolder indexDefinition) {
mongoDbFactory.getDb().getCollection(indexDefinition.getCollection())
.createIndex(indexDefinition.getIndexKeys(), indexDefinition.getIndexOptions());
}
/**
* Returns whether the current index creator was registered for the given {@link MappingContext}.
*
@@ -183,33 +145,4 @@ public class MongoPersistentEntityIndexCreator implements
public boolean isIndexCreatorFor(MappingContext<?, ?> context) {
return this.mappingContext.equals(context);
}
/**
* Triggers the actual index creation.
*
* @param collection the collection to create the index in
* @param name the name of the index about to be created
* @param indexDefinition the index definition
* @param unique whether it shall be a unique index
* @param dropDups whether to drop duplicates
* @param sparse sparse or not
* @param background whether the index will be created in the background
* @param expireAfterSeconds the time to live for documents in the collection
*/
protected void ensureIndex(String collection, String name, DBObject indexDefinition, boolean unique,
boolean dropDups, boolean sparse, boolean background, int expireAfterSeconds) {
DBObject opts = new BasicDBObject();
opts.put("name", name);
opts.put("dropDups", dropDups);
opts.put("sparse", sparse);
opts.put("unique", unique);
opts.put("background", background);
if (expireAfterSeconds != -1) {
opts.put("expireAfterSeconds", expireAfterSeconds);
}
mongoDbFactory.getDb().getCollection(collection).ensureIndex(indexDefinition, opts);
}
}
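A wiring sketch, assuming an existing MongoDbFactory named mongoDbFactory and the Person class from above: the creator now delegates all annotation inspection to the IndexResolver passed in (or to a default MongoPersistentEntityIndexResolver).

// Hypothetical setup; the creator's constructor scans the context and creates the resolved indexes.
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(Collections.singleton(Person.class));
mappingContext.initialize();
MongoPersistentEntityIndexCreator indexCreator = new MongoPersistentEntityIndexCreator(
        mappingContext, mongoDbFactory, new MongoPersistentEntityIndexResolver(mappingContext));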

View File

@@ -0,0 +1,670 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.TimeUnit;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.core.annotation.AnnotationUtils;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.domain.Sort;
import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mongodb.core.index.Index.Duplicates;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexResolver.TextIndexIncludeOptions.IncludeStrategy;
import org.springframework.data.mongodb.core.index.TextIndexDefinition.TextIndexDefinitionBuilder;
import org.springframework.data.mongodb.core.index.TextIndexDefinition.TextIndexedFieldSpec;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.BasicDBObjectBuilder;
import com.mongodb.DBObject;
import com.mongodb.util.JSON;
/**
* {@link IndexResolver} implementation inspecting a {@link MongoPersistentEntity} for properties and types to be
* indexed. <br />
* All {@link MongoPersistentProperty} of the {@link MongoPersistentEntity} are inspected for potential indexes by
* scanning related annotations.
*
* @author Christoph Strobl
* @since 1.5
*/
public class MongoPersistentEntityIndexResolver implements IndexResolver {
private static final Logger LOGGER = LoggerFactory.getLogger(MongoPersistentEntityIndexResolver.class);
private final MongoMappingContext mappingContext;
/**
* Create new {@link MongoPersistentEntityIndexResolver}.
*
* @param mappingContext must not be {@literal null}.
*/
public MongoPersistentEntityIndexResolver(MongoMappingContext mappingContext) {
Assert.notNull(mappingContext, "Mapping context must not be null in order to resolve index definitions");
this.mappingContext = mappingContext;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexResolver#resolveIndexForClass(java.lang.Class)
*/
@Override
public List<IndexDefinitionHolder> resolveIndexForClass(Class<?> type) {
return resolveIndexForEntity(mappingContext.getPersistentEntity(type));
}
/**
* Resolve the {@link IndexDefinition}s for the given {@literal root} entity by traversing its
* {@link MongoPersistentProperty}s, scanning for the index annotations {@link Indexed}, {@link CompoundIndex} and
* {@link GeospatialIndex}. The given {@literal root} therefore has to be annotated with {@link Document}.
*
* @param root must not be null.
* @return List of {@link IndexDefinitionHolder}. Will never be {@code null}.
* @throws IllegalArgumentException in case of missing {@link Document} annotation marking root entities.
*/
public List<IndexDefinitionHolder> resolveIndexForEntity(final MongoPersistentEntity<?> root) {
Assert.notNull(root, "Index cannot be resolved for given 'null' entity.");
Document document = root.findAnnotation(Document.class);
Assert.notNull(document, "Given entity is not collection root.");
final List<IndexDefinitionHolder> indexInformation = new ArrayList<MongoPersistentEntityIndexResolver.IndexDefinitionHolder>();
indexInformation.addAll(potentiallyCreateCompoundIndexDefinitions("", root.getCollection(), root.getType()));
indexInformation.addAll(potentiallyCreateTextIndexDefinition(root));
final CycleGuard guard = new CycleGuard();
root.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
@Override
public void doWithPersistentProperty(MongoPersistentProperty persistentProperty) {
try {
if (persistentProperty.isEntity()) {
indexInformation.addAll(resolveIndexForClass(persistentProperty.getActualType(),
persistentProperty.getFieldName(), root.getCollection(), guard));
}
IndexDefinitionHolder indexDefinitionHolder = createIndexDefinitionHolderForProperty(
persistentProperty.getFieldName(), root.getCollection(), persistentProperty);
if (indexDefinitionHolder != null) {
indexInformation.add(indexDefinitionHolder);
}
} catch (CyclicPropertyReferenceException e) {
LOGGER.warn(e.getMessage());
}
}
});
return indexInformation;
}
/**
* Recursively resolve and inspect properties of given {@literal type} for indexes to be created.
*
* @param type
* @param path The properties {@literal "dot"} path.
* @param collection
* @return List of {@link IndexDefinitionHolder} representing indexes for given type and its referenced property
* types. Will never be {@code null}.
*/
private List<IndexDefinitionHolder> resolveIndexForClass(final Class<?> type, final String path,
final String collection, final CycleGuard guard) {
final List<IndexDefinitionHolder> indexInformation = new ArrayList<MongoPersistentEntityIndexResolver.IndexDefinitionHolder>();
indexInformation.addAll(potentiallyCreateCompoundIndexDefinitions(path, collection, type));
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(type);
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
@Override
public void doWithPersistentProperty(MongoPersistentProperty persistentProperty) {
String propertyDotPath = (StringUtils.hasText(path) ? path + "." : "") + persistentProperty.getFieldName();
guard.protect(persistentProperty, path);
if (persistentProperty.isEntity()) {
try {
indexInformation.addAll(resolveIndexForClass(persistentProperty.getActualType(), propertyDotPath,
collection, guard));
} catch (CyclicPropertyReferenceException e) {
LOGGER.warn(e.getMessage());
}
}
IndexDefinitionHolder indexDefinitionHolder = createIndexDefinitionHolderForProperty(propertyDotPath,
collection, persistentProperty);
if (indexDefinitionHolder != null) {
indexInformation.add(indexDefinitionHolder);
}
}
});
return indexInformation;
}
private IndexDefinitionHolder createIndexDefinitionHolderForProperty(String dotPath, String collection,
MongoPersistentProperty persistentProperty) {
if (persistentProperty.isAnnotationPresent(Indexed.class)) {
return createIndexDefinition(dotPath, collection, persistentProperty);
} else if (persistentProperty.isAnnotationPresent(GeoSpatialIndexed.class)) {
return createGeoSpatialIndexDefinition(dotPath, collection, persistentProperty);
}
return null;
}
private List<IndexDefinitionHolder> potentiallyCreateCompoundIndexDefinitions(String dotPath, String collection,
Class<?> type) {
if (AnnotationUtils.findAnnotation(type, CompoundIndexes.class) == null
&& AnnotationUtils.findAnnotation(type, CompoundIndex.class) == null) {
return Collections.emptyList();
}
return createCompoundIndexDefinitions(dotPath, collection, type);
}
private Collection<? extends IndexDefinitionHolder> potentiallyCreateTextIndexDefinition(MongoPersistentEntity<?> root) {
TextIndexDefinitionBuilder indexDefinitionBuilder = new TextIndexDefinitionBuilder().named(root.getType()
.getSimpleName() + "_TextIndex");
if (StringUtils.hasText(root.getLanguage())) {
indexDefinitionBuilder.withDefaultLanguage(root.getLanguage());
}
try {
appendTextIndexInformation("", indexDefinitionBuilder, root,
new TextIndexIncludeOptions(IncludeStrategy.DEFAULT), new CycleGuard());
} catch (CyclicPropertyReferenceException e) {
LOGGER.warn(e.getMessage());
}
TextIndexDefinition indexDefinition = indexDefinitionBuilder.build();
if (!indexDefinition.hasFieldSpec()) {
return Collections.emptyList();
}
IndexDefinitionHolder holder = new IndexDefinitionHolder("", indexDefinition, root.getCollection());
return Collections.singletonList(holder);
}
private void appendTextIndexInformation(final String dotPath,
final TextIndexDefinitionBuilder indexDefinitionBuilder, MongoPersistentEntity<?> entity,
final TextIndexIncludeOptions includeOptions, final CycleGuard guard) {
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
@Override
public void doWithPersistentProperty(MongoPersistentProperty persistentProperty) {
guard.protect(persistentProperty, dotPath);
if (persistentProperty.isLanguageProperty()) {
indexDefinitionBuilder.withLanguageOverride(persistentProperty.getFieldName());
}
TextIndexed indexed = persistentProperty.findAnnotation(TextIndexed.class);
if (includeOptions.isForce() || indexed != null || persistentProperty.isEntity()) {
String propertyDotPath = (StringUtils.hasText(dotPath) ? dotPath + "." : "")
+ persistentProperty.getFieldName();
Float weight = indexed != null ? indexed.weight()
: (includeOptions.getParentFieldSpec() != null ? includeOptions.getParentFieldSpec().getWeight() : 1.0F);
if (persistentProperty.isEntity()) {
TextIndexIncludeOptions optionsForNestedType = includeOptions;
if (!IncludeStrategy.FORCE.equals(includeOptions.getStrategy()) && indexed != null) {
optionsForNestedType = new TextIndexIncludeOptions(IncludeStrategy.FORCE, new TextIndexedFieldSpec(
propertyDotPath, weight));
}
try {
appendTextIndexInformation(propertyDotPath, indexDefinitionBuilder,
mappingContext.getPersistentEntity(persistentProperty.getActualType()), optionsForNestedType, guard);
} catch (CyclicPropertyReferenceException e) {
LOGGER.warn(e.getMessage(), e);
}
} else if (includeOptions.isForce() || indexed != null) {
indexDefinitionBuilder.onField(propertyDotPath, weight);
}
}
}
});
}
/**
* Create {@link IndexDefinition} wrapped in {@link IndexDefinitionHolder} for {@link CompoundIndexes} of given type.
*
* @param dotPath The properties {@literal "dot"} path representation from its document root.
* @param fallbackCollection
* @param type
* @return
*/
protected List<IndexDefinitionHolder> createCompoundIndexDefinitions(String dotPath, String fallbackCollection,
Class<?> type) {
List<IndexDefinitionHolder> indexDefinitions = new ArrayList<MongoPersistentEntityIndexResolver.IndexDefinitionHolder>();
CompoundIndexes indexes = AnnotationUtils.findAnnotation(type, CompoundIndexes.class);
if (indexes != null) {
for (CompoundIndex index : indexes.value()) {
indexDefinitions.add(createCompoundIndexDefinition(dotPath, fallbackCollection, index));
}
}
CompoundIndex index = AnnotationUtils.findAnnotation(type, CompoundIndex.class);
if (index != null) {
indexDefinitions.add(createCompoundIndexDefinition(dotPath, fallbackCollection, index));
}
return indexDefinitions;
}
@SuppressWarnings("deprecation")
protected IndexDefinitionHolder createCompoundIndexDefinition(String dotPath, String fallbackCollection,
CompoundIndex index) {
CompoundIndexDefinition indexDefinition = new CompoundIndexDefinition(resolveCompoundIndexKeyFromStringDefinition(
dotPath, index.def()));
if (!index.useGeneratedName()) {
indexDefinition.named(index.name());
}
if (index.unique()) {
indexDefinition.unique(index.dropDups() ? Duplicates.DROP : Duplicates.RETAIN);
}
if (index.sparse()) {
indexDefinition.sparse();
}
if (index.background()) {
indexDefinition.background();
}
int ttl = index.expireAfterSeconds();
if (ttl >= 0) {
if (indexDefinition.getIndexKeys().keySet().size() > 1) {
LOGGER.warn("TTL is not supported for compound index with more than one key. TTL={} will be ignored.", ttl);
} else {
indexDefinition.expire(ttl, TimeUnit.SECONDS);
}
}
String collection = StringUtils.hasText(index.collection()) ? index.collection() : fallbackCollection;
return new IndexDefinitionHolder(dotPath, indexDefinition, collection);
}
private DBObject resolveCompoundIndexKeyFromStringDefinition(String dotPath, String keyDefinitionString) {
if (!StringUtils.hasText(dotPath) && !StringUtils.hasText(keyDefinitionString)) {
throw new InvalidDataAccessApiUsageException("Cannot create index on root level for empty keys.");
}
if (!StringUtils.hasText(keyDefinitionString)) {
return new BasicDBObject(dotPath, 1);
}
DBObject dbo = (DBObject) JSON.parse(keyDefinitionString);
if (!StringUtils.hasText(dotPath)) {
return dbo;
}
BasicDBObjectBuilder dboBuilder = new BasicDBObjectBuilder();
for (String key : dbo.keySet()) {
dboBuilder.add(dotPath + "." + key, dbo.get(key));
}
return dboBuilder.get();
}
/**
* Creates {@link IndexDefinition} wrapped in {@link IndexDefinitionHolder} out of {@link Indexed} for given
* {@link MongoPersistentProperty}.
*
* @param dotPath The properties {@literal "dot"} path representation from its document root.
* @param fallbackCollection
* @param persistentProperty
* @return
*/
protected IndexDefinitionHolder createIndexDefinition(String dotPath, String fallbackCollection,
MongoPersistentProperty persistentProperty) {
Indexed index = persistentProperty.findAnnotation(Indexed.class);
String collection = StringUtils.hasText(index.collection()) ? index.collection() : fallbackCollection;
Index indexDefinition = new Index().on(dotPath,
IndexDirection.ASCENDING.equals(index.direction()) ? Sort.Direction.ASC : Sort.Direction.DESC);
if (!index.useGeneratedName()) {
indexDefinition.named(StringUtils.hasText(index.name()) ? index.name() : dotPath);
}
if (index.unique()) {
indexDefinition.unique(index.dropDups() ? Duplicates.DROP : Duplicates.RETAIN);
}
if (index.sparse()) {
indexDefinition.sparse();
}
if (index.background()) {
indexDefinition.background();
}
if (index.expireAfterSeconds() >= 0) {
indexDefinition.expire(index.expireAfterSeconds(), TimeUnit.SECONDS);
}
return new IndexDefinitionHolder(dotPath, indexDefinition, collection);
}
/**
* Creates {@link IndexDefinition} wrapped in {@link IndexDefinitionHolder} out of {@link GeoSpatialIndexed} for
* {@link MongoPersistentProperty}.
*
* @param dotPath The properties {@literal "dot"} path representation from its document root.
* @param fallbackCollection
* @param persistentProperty
* @return
*/
protected IndexDefinitionHolder createGeoSpatialIndexDefinition(String dotPath, String fallbackCollection,
MongoPersistentProperty persistentProperty) {
GeoSpatialIndexed index = persistentProperty.findAnnotation(GeoSpatialIndexed.class);
String collection = StringUtils.hasText(index.collection()) ? index.collection() : fallbackCollection;
GeospatialIndex indexDefinition = new GeospatialIndex(dotPath);
indexDefinition.withBits(index.bits());
indexDefinition.withMin(index.min()).withMax(index.max());
if (!index.useGeneratedName()) {
indexDefinition.named(StringUtils.hasText(index.name()) ? index.name() : persistentProperty.getName());
}
indexDefinition.typed(index.type()).withBucketSize(index.bucketSize()).withAdditionalField(index.additionalField());
return new IndexDefinitionHolder(dotPath, indexDefinition, collection);
}
/**
* {@link CycleGuard} holds information about properties and the paths for accessing those. This information is used
* to detect potential cycles within the references.
*
* @author Christoph Strobl
*/
static class CycleGuard {
private final Map<String, List<Path>> propertyTypeMap;
CycleGuard() {
this.propertyTypeMap = new LinkedHashMap<String, List<Path>>();
}
/**
* @param property The property to inspect
* @param path The path under which the property can be reached.
* @throws CyclicPropertyReferenceException in case a potential cycle is detected.
* @see Path#cycles(MongoPersistentProperty, String)
*/
void protect(MongoPersistentProperty property, String path) throws CyclicPropertyReferenceException {
String propertyTypeKey = createMapKey(property);
if (propertyTypeMap.containsKey(propertyTypeKey)) {
List<Path> paths = propertyTypeMap.get(propertyTypeKey);
for (Path existingPath : paths) {
if (existingPath.cycles(property, path)) {
paths.add(new Path(property, path));
throw new CyclicPropertyReferenceException(property.getFieldName(), property.getOwner().getType(),
existingPath.getPath());
}
}
paths.add(new Path(property, path));
} else {
ArrayList<Path> paths = new ArrayList<Path>();
paths.add(new Path(property, path));
propertyTypeMap.put(propertyTypeKey, paths);
}
}
private String createMapKey(MongoPersistentProperty property) {
return property.getOwner().getType().getSimpleName() + ":" + property.getFieldName();
}
/**
* Path defines the property and its full path from the document root. <br />
* A {@link Path} with {@literal spring.data.mongodb} would be created for the property {@code Three.mongodb}.
*
* <pre>
* <code>
* &#64;Document
* class One {
* Two spring;
* }
*
* class Two {
* Three data;
* }
*
* class Three {
* String mongodb;
* }
* </code>
* </pre>
*
* @author Christoph Strobl
*/
static class Path {
private final MongoPersistentProperty property;
private final String path;
Path(MongoPersistentProperty property, String path) {
this.property = property;
this.path = path;
}
public String getPath() {
return path;
}
/**
* Checks whether the given property is owned by the same entity and if it has been already visited by a subset of
* the current path. Given {@literal foo.bar.bar} cycles if {@literal foo.bar} has already been visited and
* {@code class Bar} contains a property of type {@code Bar}. The previously mentioned path would not cycle if
* {@code class Bar} contained a property of type {@code SomeEntity} named {@literal bar}.
*
* @param property
* @param path
* @return
*/
boolean cycles(MongoPersistentProperty property, String path) {
if (!property.getOwner().equals(this.property.getOwner())) {
return false;
}
return path.contains(this.path);
}
}
}
/**
* @author Christoph Strobl
* @since 1.5
*/
public static class CyclicPropertyReferenceException extends RuntimeException {
private static final long serialVersionUID = -3762979307658772277L;
private final String propertyName;
private final Class<?> type;
private final String dotPath;
public CyclicPropertyReferenceException(String propertyName, Class<?> type, String dotPath) {
this.propertyName = propertyName;
this.type = type;
this.dotPath = dotPath;
}
/*
* (non-Javadoc)
* @see java.lang.Throwable#getMessage()
*/
@Override
public String getMessage() {
return String.format("Found cycle for field '%s' in type '%s' for path '%s'", propertyName,
type != null ? type.getSimpleName() : "unknown", dotPath);
}
}
/**
* Implementation of {@link IndexDefinition} holding additional (property)path information used for creating the
* index. The path itself is the properties {@literal "dot"} path representation from its root document.
*
* @author Christoph Strobl
* @since 1.5
*/
public static class IndexDefinitionHolder implements IndexDefinition {
private final String path;
private final IndexDefinition indexDefinition;
private final String collection;
/**
* Creates a new {@link IndexDefinitionHolder} for the given dot path, {@link IndexDefinition} and collection.
*
* @param path
* @param definition
* @param collection
*/
public IndexDefinitionHolder(String path, IndexDefinition definition, String collection) {
this.path = path;
this.indexDefinition = definition;
this.collection = collection;
}
public String getCollection() {
return collection;
}
/**
* Get the {@literal "dot"} path used to create the index.
*
* @return
*/
public String getPath() {
return path;
}
/**
* Get the {@literal raw} {@link IndexDefinition}.
*
* @return
*/
public IndexDefinition getIndexDefinition() {
return indexDefinition;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexDefinition#getIndexKeys()
*/
@Override
public DBObject getIndexKeys() {
return indexDefinition.getIndexKeys();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexDefinition#getIndexOptions()
*/
@Override
public DBObject getIndexOptions() {
return indexDefinition.getIndexOptions();
}
}
/**
* @author Christoph Strobl
* @since 1.6
*/
static class TextIndexIncludeOptions {
enum IncludeStrategy {
FORCE, DEFAULT;
}
private final IncludeStrategy strategy;
private final TextIndexedFieldSpec parentFieldSpec;
public TextIndexIncludeOptions(IncludeStrategy strategy, TextIndexedFieldSpec parentFieldSpec) {
this.strategy = strategy;
this.parentFieldSpec = parentFieldSpec;
}
public TextIndexIncludeOptions(IncludeStrategy strategy) {
this(strategy, null);
}
public IncludeStrategy getStrategy() {
return strategy;
}
public TextIndexedFieldSpec getParentFieldSpec() {
return parentFieldSpec;
}
public boolean isForce() {
return IncludeStrategy.FORCE.equals(strategy);
}
}
}
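A quick sketch of driving the resolver directly (Person is assumed to be a @Document-annotated class): each IndexDefinitionHolder carries the target collection, the dot path and the raw keys/options.

// Hypothetical inspection of the index definitions derived from the annotations on Person.
MongoMappingContext mappingContext = new MongoMappingContext();
MongoPersistentEntityIndexResolver resolver = new MongoPersistentEntityIndexResolver(mappingContext);
for (IndexDefinitionHolder holder : resolver.resolveIndexForClass(Person.class)) {
        System.out.println(holder.getCollection() + ": " + holder.getIndexKeys() + " " + holder.getIndexOptions());
}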

View File

@@ -0,0 +1,336 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import java.util.Collection;
import java.util.LinkedHashSet;
import java.util.Set;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* {@link IndexDefinition} to span multiple keys for text search.
*
* @author Christoph Strobl
* @since 1.6
*/
public class TextIndexDefinition implements IndexDefinition {
private String name;
private Set<TextIndexedFieldSpec> fieldSpecs;
private String defaultLanguage;
private String languageOverride;
TextIndexDefinition() {
fieldSpecs = new LinkedHashSet<TextIndexedFieldSpec>();
}
/**
* Creates a {@link TextIndexDefinition} for all fields in the document.
*
* @return
*/
public static TextIndexDefinition forAllFields() {
return new TextIndexDefinitionBuilder().onAllFields().build();
}
/**
* Get {@link TextIndexDefinitionBuilder} to create {@link TextIndexDefinition}.
*
* @return
*/
public static TextIndexDefinitionBuilder builder() {
return new TextIndexDefinitionBuilder();
}
/**
* @param fieldSpec
*/
public void addFieldSpec(TextIndexedFieldSpec fieldSpec) {
this.fieldSpecs.add(fieldSpec);
}
/**
* @param fieldSpecs
*/
public void addFieldSpecs(Collection<TextIndexedFieldSpec> fieldSpecs) {
this.fieldSpecs.addAll(fieldSpecs);
}
/**
* Returns whether the {@link TextIndexDefinition} has fields assigned.
*
* @return
*/
public boolean hasFieldSpec() {
return !fieldSpecs.isEmpty();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexDefinition#getIndexKeys()
*/
@Override
public DBObject getIndexKeys() {
DBObject keys = new BasicDBObject();
for (TextIndexedFieldSpec fieldSpec : fieldSpecs) {
keys.put(fieldSpec.fieldname, "text");
}
return keys;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexDefinition#getIndexOptions()
*/
@Override
public DBObject getIndexOptions() {
DBObject options = new BasicDBObject();
if (StringUtils.hasText(name)) {
options.put("name", name);
}
if (StringUtils.hasText(defaultLanguage)) {
options.put("default_language", defaultLanguage);
}
BasicDBObject weightsDbo = new BasicDBObject();
for (TextIndexedFieldSpec fieldSpec : fieldSpecs) {
if (fieldSpec.isWeighted()) {
weightsDbo.put(fieldSpec.getFieldname(), fieldSpec.getWeight());
}
}
if (!weightsDbo.isEmpty()) {
options.put("weights", weightsDbo);
}
if (StringUtils.hasText(languageOverride)) {
options.put("language_override", languageOverride);
}
return options;
}
/**
* @author Christoph Strobl
* @since 1.6
*/
public static class TextIndexedFieldSpec {
private final String fieldname;
private final Float weight;
/**
* Creates a new {@link TextIndexedFieldSpec} for the given fieldname using the default weight of {@literal 1}.
*
* @param fieldname
*/
public TextIndexedFieldSpec(String fieldname) {
this(fieldname, 1.0F);
}
/**
* Create new {@link TextIndexedFieldSpec} for given fieldname and weight.
*
* @param fieldname
* @param weight
*/
public TextIndexedFieldSpec(String fieldname, Float weight) {
Assert.hasText(fieldname, "Text index field cannot be blank.");
this.fieldname = fieldname;
this.weight = weight != null ? weight : 1.0F;
}
/**
* Get the fieldname associated with the {@link TextIndexedFieldSpec}.
*
* @return
*/
public String getFieldname() {
return fieldname;
}
/**
* Get the weight associated with the {@link TextIndexedFieldSpec}.
*
* @return
*/
public Float getWeight() {
return weight;
}
/**
* @return true if {@link #weight} is set to a value other than the default of {@literal 1}.
*/
public boolean isWeighted() {
return this.weight != null && this.weight.compareTo(1.0F) != 0;
}
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(fieldname);
}
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null) {
return false;
}
if (!(obj instanceof TextIndexedFieldSpec)) {
return false;
}
TextIndexedFieldSpec other = (TextIndexedFieldSpec) obj;
return ObjectUtils.nullSafeEquals(this.fieldname, other.fieldname);
}
}
/**
* {@link TextIndexDefinitionBuilder} helps defining options for creating {@link TextIndexDefinition}.
*
* @author Christoph Strobl
* @since 1.6
*/
public static class TextIndexDefinitionBuilder {
private TextIndexDefinition instance;
private static final TextIndexedFieldSpec ALL_FIELDS = new TextIndexedFieldSpec("$**");
public TextIndexDefinitionBuilder() {
this.instance = new TextIndexDefinition();
}
/**
* Define the name to be used when creating the index in the store.
*
* @param name
* @return
*/
public TextIndexDefinitionBuilder named(String name) {
this.instance.name = name;
return this;
}
/**
* Define the index to span all fields using a wildcard. <br/>
* <strong>NOTE</strong> {@link TextIndexDefinition} cannot contain any other fields when defined with wildcard.
*
* @return
*/
public TextIndexDefinitionBuilder onAllFields() {
if (!instance.fieldSpecs.isEmpty()) {
throw new InvalidDataAccessApiUsageException("Cannot add wildcard fieldspect to non empty.");
}
this.instance.fieldSpecs.add(ALL_FIELDS);
return this;
}
/**
* Include given fields with default weight.
*
* @param fieldnames
* @return
*/
public TextIndexDefinitionBuilder onFields(String... fieldnames) {
for (String fieldname : fieldnames) {
onField(fieldname);
}
return this;
}
/**
* Include given field with default weight.
*
* @param fieldname
* @return
*/
public TextIndexDefinitionBuilder onField(String fieldname) {
return onField(fieldname, 1F);
}
/**
* Include given field with weight.
*
* @param fieldname
* @return
*/
public TextIndexDefinitionBuilder onField(String fieldname, Float weight) {
if (this.instance.fieldSpecs.contains(ALL_FIELDS)) {
throw new InvalidDataAccessApiUsageException(String.format("Cannot add %s to field spec for all fields.",
fieldname));
}
this.instance.fieldSpecs.add(new TextIndexedFieldSpec(fieldname, weight));
return this;
}
/**
* Define the default language to be used when indexing documents.
*
* @param language
* @see http://docs.mongodb.org/manual/tutorial/specify-language-for-text-index/#specify-default-language-text-index
* @return
*/
public TextIndexDefinitionBuilder withDefaultLanguage(String language) {
this.instance.defaultLanguage = language;
return this;
}
/**
* Define field for language override.
*
* @param fieldname
* @return
*/
public TextIndexDefinitionBuilder withLanguageOverride(String fieldname) {
if (StringUtils.hasText(this.instance.languageOverride)) {
throw new InvalidDataAccessApiUsageException(String.format(
"Cannot set language override on %s as it is already defined on %s.", fieldname,
this.instance.languageOverride));
}
this.instance.languageOverride = fieldname;
return this;
}
public TextIndexDefinition build() {
return this.instance;
}
}
}
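A builder sketch (field names invented, template assumed to be a MongoTemplate):

// Hypothetical weighted text index over two fields with an explicit default language.
TextIndexDefinition textIndex = new TextIndexDefinitionBuilder()
        .named("BlogPost_TextIndex")
        .onField("title", 3F)
        .onField("content")
        .withDefaultLanguage("english")
        .build();
template.indexOps(BlogPost.class).ensureIndex(textIndex);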

View File

@@ -0,0 +1,44 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
/**
* {@link TextIndexed} marks a field to be part of the text index. As there can be only one text index per collection,
* all fields marked with {@link TextIndexed} are combined into a single index. <br />
*
* @author Christoph Strobl
* @since 1.6
*/
@Documented
@Target({ ElementType.FIELD })
@Retention(RetentionPolicy.RUNTIME)
public @interface TextIndexed {
/**
* Defines the significance of the field relative to other indexed fields. The value directly influences the document's
* score. <br/>
* Defaults to {@literal 1.0}.
*
* @return
*/
float weight() default 1.0F;
}
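A mapping sketch the resolver above would pick up (class and field names invented); the document-level language corresponds to the Document#language() attribute read by BasicMongoPersistentEntity below.

// Hypothetical domain type: both fields end up in the single per-collection text index.
@Document(language = "english")
class BlogPost {
        @TextIndexed(weight = 3) String title;
        @TextIndexed String content;
}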

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -38,6 +38,7 @@ import org.springframework.expression.ParserContext;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.expression.spel.support.StandardEvaluationContext;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
/**
@@ -47,12 +48,14 @@ import org.springframework.util.StringUtils;
* @author Jon Brisbin
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, MongoPersistentProperty> implements
MongoPersistentEntity<T>, ApplicationContextAware {
private static final String AMBIGUOUS_FIELD_MAPPING = "Ambiguous field mapping detected! Both %s and %s map to the same field name %s! Disambiguate using @DocumentField annotation!";
private final String collection;
private final String language;
private final SpelExpressionParser parser;
private final StandardEvaluationContext context;
@@ -75,8 +78,10 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
if (rawType.isAnnotationPresent(Document.class)) {
Document d = rawType.getAnnotation(Document.class);
this.collection = StringUtils.hasText(d.collection()) ? d.collection() : fallback;
this.language = StringUtils.hasText(d.language()) ? d.language() : "";
} else {
this.collection = fallback;
this.language = "";
}
}
@@ -100,6 +105,33 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
return expression.getValue(context, String.class);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentEntity#getLanguage()
*/
@Override
public String getLanguage() {
return this.language;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentEntity#getTextScoreProperty()
*/
@Override
public MongoPersistentProperty getTextScoreProperty() {
return getPersistentProperty(TextScore.class);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentEntity#hasTextScoreProperty()
*/
@Override
public boolean hasTextScoreProperty() {
return getTextScoreProperty() != null;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.model.BasicPersistentEntity#verify()
@@ -107,12 +139,22 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
@Override
public void verify() {
verifyFieldUniqueness();
verifyFieldTypes();
}
private void verifyFieldUniqueness() {
AssertFieldNameUniquenessHandler handler = new AssertFieldNameUniquenessHandler();
doWithProperties(handler);
doWithAssociations(handler);
}
private void verifyFieldTypes() {
doWithProperties(new PropertyTypeAssertionHandler());
}
/**
* {@link Comparator} implementation inspecting the {@link MongoPersistentProperty}'s order.
*
@@ -226,4 +268,46 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
properties.put(fieldName, property);
}
}
/**
* @author Christoph Strobl
* @since 1.6
*/
private static class PropertyTypeAssertionHandler implements PropertyHandler<MongoPersistentProperty> {
@Override
public void doWithPersistentProperty(MongoPersistentProperty persistentProperty) {
potentiallyAssertTextScoreType(persistentProperty);
potentiallyAssertLanguageType(persistentProperty);
}
private void potentiallyAssertLanguageType(MongoPersistentProperty persistentProperty) {
if (persistentProperty.isLanguageProperty()) {
assertPropertyType(persistentProperty, String.class);
}
}
private void potentiallyAssertTextScoreType(MongoPersistentProperty persistentProperty) {
if (persistentProperty.isTextScoreProperty()) {
assertPropertyType(persistentProperty, Float.class, Double.class);
}
}
private void assertPropertyType(MongoPersistentProperty persistentProperty, Class<?>... validMatches) {
for (Class<?> potentialMatch : validMatches) {
if (ClassUtils.isAssignable(potentialMatch, persistentProperty.getActualType())) {
return;
}
}
throw new MappingException(String.format("Missmatching types for %s. Found %s expected one of %s.",
persistentProperty.getField(), persistentProperty.getActualType(),
StringUtils.arrayToCommaDelimitedString(validMatches)));
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,7 +27,9 @@ import org.slf4j.LoggerFactory;
import org.springframework.data.annotation.Id;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.model.AnnotationBasedPersistentProperty;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.util.StringUtils;
@@ -39,6 +41,7 @@ import com.mongodb.DBObject;
* @author Oliver Gierke
* @author Patryk Wasik
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class BasicMongoPersistentProperty extends AnnotationBasedPersistentProperty<MongoPersistentProperty> implements
MongoPersistentProperty {
@@ -46,6 +49,7 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
private static final Logger LOG = LoggerFactory.getLogger(BasicMongoPersistentProperty.class);
private static final String ID_FIELD_NAME = "_id";
private static final String LANGUAGE_FIELD_NAME = "language";
private static final Set<Class<?>> SUPPORTED_ID_TYPES = new HashSet<Class<?>>();
private static final Set<String> SUPPORTED_ID_PROPERTY_NAMES = new HashSet<String>();
@@ -179,4 +183,22 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
public DBRef getDBRef() {
return findAnnotation(DBRef.class);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#isLanguageProperty()
*/
@Override
public boolean isLanguageProperty() {
return getFieldName().equals(LANGUAGE_FIELD_NAME) || isAnnotationPresent(Language.class);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#isTextScoreProperty()
*/
@Override
public boolean isTextScoreProperty() {
return isAnnotationPresent(TextScore.class);
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.core.mapping;
import java.beans.PropertyDescriptor;
import java.lang.reflect.Field;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.SimpleTypeHolder;
/**

View File

@@ -1,46 +0,0 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
import java.util.Locale;
/**
* {@link FieldNamingStrategy} that abbreviates field names by using the very first letter of the camel case parts of
* the {@link MongoPersistentProperty}'s name.
*
* @since 1.3
* @author Oliver Gierke
*/
public class CamelCaseAbbreviatingFieldNamingStrategy implements FieldNamingStrategy {
private static final String CAMEL_CASE_PATTERN = "(?<!(^|[A-Z]))(?=[A-Z])|(?<!^)(?=[A-Z][a-z])";
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.FieldNamingStrategy#getFieldName(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty)
*/
public String getFieldName(MongoPersistentProperty property) {
String[] parts = property.getName().split(CAMEL_CASE_PATTERN);
StringBuilder builder = new StringBuilder();
for (String part : parts) {
builder.append(part.substring(0, 1).toLowerCase(Locale.US));
}
return builder.toString();
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright (c) 2011-2014 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -29,6 +29,7 @@ import org.springframework.data.annotation.Persistent;
*
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Oliver Gierke ogierke@vmware.com
* @author Christoph Strobl
*/
@Persistent
@Inherited
@@ -37,4 +38,13 @@ import org.springframework.data.annotation.Persistent;
public @interface Document {
String collection() default "";
/**
* Defines the default language to be used with this document.
*
* @since 1.6
* @return
*/
String language() default "";
}
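
A minimal sketch of the new language attribute (type and field names are invented): documents of this type are text-indexed with Spanish language settings unless an individual document overrides the language.

@Document(language = "spanish")
class Articulo {

    @TextIndexed
    String contenido;
}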

View File

@@ -1,36 +0,0 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
/**
* SPI interface to determine how to name document fields in cases the field name is not manually defined.
*
* @see DocumentField
* @see PropertyNameFieldNamingStrategy
* @see CamelCaseAbbreviatingFieldNamingStrategy
* @since 1.3
* @author Oliver Gierke
*/
public interface FieldNamingStrategy {
/**
* Returns the field name to be used for the given {@link MongoPersistentProperty}.
*
* @param property must not be {@literal null} or empty;
* @return
*/
String getFieldName(MongoPersistentProperty property);
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,21 +15,21 @@
*/
package org.springframework.data.mongodb.core.mapping;
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
/**
* {@link FieldNamingStrategy} simply using the {@link MongoPersistentProperty}'s name.
* Mark property as language field.
*
* @since 1.3
* @author Oliver Gierke
* @author Christoph Strobl
* @since 1.6
*/
public enum PropertyNameFieldNamingStrategy implements FieldNamingStrategy {
@Documented
@Target({ ElementType.FIELD })
@Retention(RetentionPolicy.RUNTIME)
public @interface Language {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.FieldNamingStrategy#getFieldName(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty)
*/
public String getFieldName(MongoPersistentProperty property) {
return property.getName();
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,6 +24,8 @@ import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.data.mapping.context.AbstractMappingContext;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.data.util.TypeInformation;

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,6 +21,7 @@ import org.springframework.data.mapping.PersistentEntity;
* MongoDB specific {@link PersistentEntity} abstraction.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
public interface MongoPersistentEntity<T> extends PersistentEntity<T, MongoPersistentProperty> {
@@ -30,4 +31,30 @@ public interface MongoPersistentEntity<T> extends PersistentEntity<T, MongoPersi
* @return
*/
String getCollection();
/**
* Returns the default language to be used for this entity.
*
* @since 1.6
* @return
*/
String getLanguage();
/**
* Returns the property holding text score value.
*
* @since 1.6
* @see #hasTextScoreProperty()
* @return {@literal null} if not present.
*/
MongoPersistentProperty getTextScoreProperty();
/**
* Returns whether the entity has a {@link TextScore} property.
*
* @since 1.6
* @return true if property annotated with {@link TextScore} is present.
*/
boolean hasTextScoreProperty();
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,6 +26,7 @@ import org.springframework.data.mapping.PersistentProperty;
* @author Oliver Gierke
* @author Patryk Wasik
* @author Thomas Darimont
* @author Christoph Strobl
*/
public interface MongoPersistentProperty extends PersistentProperty<MongoPersistentProperty> {
@@ -59,6 +60,24 @@ public interface MongoPersistentProperty extends PersistentProperty<MongoPersist
*/
boolean isExplicitIdProperty();
/**
* Returns whether the property indicates the document's language either by having a {@link #getFieldName()} equal to
* {@literal language} or being annotated with {@link Language}.
*
* @return
* @since 1.6
*/
boolean isLanguageProperty();
/**
* Returns whether the property holds the document's score calculated by text search. <br/>
* It's marked with {@link TextScore}.
*
* @return
* @since 1.6
*/
boolean isTextScoreProperty();
/**
* Returns the {@link DBRef} if the property is a reference.
*

View File

@@ -0,0 +1,40 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import org.springframework.data.annotation.ReadOnlyProperty;
/**
* {@link TextScore} marks the property holding the server-calculated {@literal textScore} when doing
* full text search. <br />
* <b>NOTE:</b> The property will not be written when saving the entity.
*
* @author Christoph Strobl
* @since 1.6
*/
@ReadOnlyProperty
@Documented
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
public @interface TextScore {
}
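
Combined with @Language and the type checks added to BasicMongoPersistentEntity above (the language field must be a String, the score a Float or Double), a valid entity could look like this sketch (names invented):

@Document
class Recipe {

    @TextIndexed
    String title;

    @Language       // per-document language override, must be a String
    String lang;

    @TextScore      // populated from the server-calculated textScore, never written back
    Float score;
}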

View File

@@ -126,13 +126,13 @@ public abstract class AbstractMongoEventListener<E> implements ApplicationListen
public void onAfterDelete(DBObject dbo) {
if (LOG.isDebugEnabled()) {
LOG.debug("onAfterConvert({})", dbo);
LOG.debug("onAfterDelete({})", dbo);
}
}
public void onBeforeDelete(DBObject dbo) {
if (LOG.isDebugEnabled()) {
LOG.debug("onAfterConvert({})", dbo);
LOG.debug("onBeforeDelete({})", dbo);
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,11 +24,12 @@ import com.mongodb.util.JSON;
*
* @author Thomas Risberg
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class BasicQuery extends Query {
private final DBObject queryObject;
private final DBObject fieldsObject;
private DBObject fieldsObject;
private DBObject sortObject;
public BasicQuery(String query) {
@@ -49,8 +50,12 @@ public class BasicQuery extends Query {
this.fieldsObject = fieldsObject;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.Query#addCriteria(org.springframework.data.mongodb.core.query.CriteriaDefinition)
*/
@Override
public Query addCriteria(Criteria criteria) {
public Query addCriteria(CriteriaDefinition criteria) {
this.queryObject.putAll(criteria.getCriteriaObject());
return this;
}
@@ -84,4 +89,12 @@ public class BasicQuery extends Query {
public void setSortObject(DBObject sortObject) {
this.sortObject = sortObject;
}
/**
* @since 1.6
* @param fieldsObject
*/
protected void setFieldsObject(DBObject fieldsObject) {
this.fieldsObject = fieldsObject;
}
}

View File

@@ -388,20 +388,6 @@ public class Criteria implements CriteriaDefinition {
return this;
}
/**
* @see Criteria#withinSphere(Circle)
* @param circle
* @return
* @deprecated As of 1.5, Use {@link #withinSphere(Circle)}. This method is scheduled to be removed in the next major
* release.
*/
@Deprecated
public Criteria withinSphere(org.springframework.data.mongodb.core.geo.Circle circle) {
Assert.notNull(circle);
criteria.put("$within", new GeoCommand(new Sphere(circle)));
return this;
}
/**
* Creates a geospatial criterion using a {@literal $within} operation.
*

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,8 +17,25 @@ package org.springframework.data.mongodb.core.query;
import com.mongodb.DBObject;
/**
* @author Oliver Gierke
* @author Christoph Strobl
*/
public interface CriteriaDefinition {
/**
* Get {@link DBObject} representation.
*
* @return
*/
DBObject getCriteriaObject();
}
/**
* Get the identifying {@literal key}.
*
* @return
* @since 1.6
*/
String getKey();
}

View File

@@ -66,17 +66,16 @@ public class GeoCommand {
* @param shape must not be {@literal null}.
* @return
*/
@SuppressWarnings("deprecation")
private String getCommand(Shape shape) {
Assert.notNull(shape, "Shape must not be null!");
if (shape instanceof Box) {
return org.springframework.data.mongodb.core.geo.Box.COMMAND;
} else if (shape instanceof Circle || shape instanceof org.springframework.data.mongodb.core.geo.Circle) {
return org.springframework.data.mongodb.core.geo.Circle.COMMAND;
return "$box";
} else if (shape instanceof Circle) {
return "$center";
} else if (shape instanceof Polygon) {
return org.springframework.data.mongodb.core.geo.Polygon.COMMAND;
return "$polygon";
} else if (shape instanceof Sphere) {
return org.springframework.data.mongodb.core.geo.Sphere.COMMAND;
}

View File

@@ -0,0 +1,193 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.query;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Map.Entry;
import java.util.concurrent.TimeUnit;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
/**
* Meta-data for {@link Query} instances.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.6
*/
public class Meta {
private enum MetaKey {
MAX_TIME_MS("$maxTimeMS"), MAX_SCAN("$maxScan"), COMMENT("$comment"), SNAPSHOT("$snapshot");
private String key;
private MetaKey(String key) {
this.key = key;
}
}
private final Map<String, Object> values = new LinkedHashMap<String, Object>(2);
/**
* @return {@literal null} if not set.
*/
public Long getMaxTimeMsec() {
return getValue(MetaKey.MAX_TIME_MS.key);
}
/**
* Set the maximum time limit in milliseconds for processing operations.
*
* @param maxTimeMsec
*/
public void setMaxTimeMsec(long maxTimeMsec) {
setMaxTime(maxTimeMsec, TimeUnit.MILLISECONDS);
}
/**
* Set the maximum time limit for processing operations.
*
* @param timeout
* @param timeUnit
*/
public void setMaxTime(long timeout, TimeUnit timeUnit) {
setValue(MetaKey.MAX_TIME_MS.key, (timeUnit != null ? timeUnit : TimeUnit.MILLISECONDS).toMillis(timeout));
}
/**
* @return {@literal null} if not set.
*/
public Long getMaxScan() {
return getValue(MetaKey.MAX_SCAN.key);
}
/**
* Only scan the specified number of documents.
*
* @param maxScan
*/
public void setMaxScan(long maxScan) {
setValue(MetaKey.MAX_SCAN.key, maxScan);
}
/**
* Add a comment to the query.
*
* @param comment
*/
public void setComment(String comment) {
setValue(MetaKey.COMMENT.key, comment);
}
/**
* @return {@literal null} if not set.
*/
public String getComment() {
return getValue(MetaKey.COMMENT.key);
}
/**
* Using snapshot prevents the cursor from returning a document more than once.
*
* @param useSnapshot
*/
public void setSnapshot(boolean useSnapshot) {
setValue(MetaKey.SNAPSHOT.key, useSnapshot);
}
/**
* @return {@literal false} if not set.
*/
public boolean getSnapshot() {
return getValue(MetaKey.SNAPSHOT.key, false);
}
/**
* @return
*/
public boolean hasValues() {
return !this.values.isEmpty();
}
/**
* Get {@link Iterable} of set meta values.
*
* @return
*/
public Iterable<Entry<String, Object>> values() {
return Collections.unmodifiableSet(this.values.entrySet());
}
/**
* Sets or removes the value in case of {@literal null} or empty {@link String}.
*
* @param key must not be {@literal null} or empty.
* @param value
*/
private void setValue(String key, Object value) {
Assert.hasText(key, "Meta key must not be 'null' or blank.");
if (value == null || (value instanceof String && !StringUtils.hasText((String) value))) {
// remove the entry and return so that null/empty values are not re-added below
this.values.remove(key);
return;
}
this.values.put(key, value);
}
@SuppressWarnings("unchecked")
private <T> T getValue(String key) {
return (T) this.values.get(key);
}
private <T> T getValue(String key, T defaultValue) {
T value = getValue(key);
return value != null ? value : defaultValue;
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(this.values);
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (!(obj instanceof Meta)) {
return false;
}
Meta other = (Meta) obj;
return ObjectUtils.nullSafeEquals(this.values, other.values);
}
}
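
A sketch of using Meta directly; it is attached to a query via Query.setMeta(…) introduced in the Query changes below (imports omitted, collection and criteria are illustrative):

Meta meta = new Meta();
meta.setMaxTimeMsec(500);             // abort server-side processing after 500 ms
meta.setComment("find active users");
meta.setSnapshot(true);

Query query = new Query(Criteria.where("active").is(true));
query.setMeta(meta);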

View File

@@ -343,7 +343,6 @@ public final class NearQuery {
*
* @return
*/
@SuppressWarnings("deprecation")
public DBObject toDBObject() {
BasicDBObject dbObject = new BasicDBObject();

View File

@@ -25,6 +25,7 @@ import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.TimeUnit;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
@@ -39,56 +40,65 @@ import com.mongodb.DBObject;
* @author Thomas Risberg
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class Query {
private static final String RESTRICTED_TYPES_KEY = "_$RESTRICTED_TYPES";
private final Set<Class<?>> restrictedTypes = new HashSet<Class<?>>();
private final Map<String, Criteria> criteria = new LinkedHashMap<String, Criteria>();
private final Map<String, CriteriaDefinition> criteria = new LinkedHashMap<String, CriteriaDefinition>();
private Field fieldSpec;
private Sort sort;
private int skip;
private int limit;
private String hint;
private Meta meta = new Meta();
/**
* Static factory method to create a {@link Query} using the provided {@link Criteria}.
* Static factory method to create a {@link Query} using the provided {@link CriteriaDefinition}.
*
* @param criteria must not be {@literal null}.
* @param criteriaDefinition must not be {@literal null}.
* @return
* @since 1.6
*/
public static Query query(Criteria criteria) {
return new Query(criteria);
public static Query query(CriteriaDefinition criteriaDefinition) {
return new Query(criteriaDefinition);
}
public Query() {}
/**
* Creates a new {@link Query} using the given {@link Criteria}.
* Creates a new {@link Query} using the given {@link CriteriaDefinition}.
*
* @param criteria must not be {@literal null}.
* @param criteriaDefinition must not be {@literal null}.
* @since 1.6
*/
public Query(Criteria criteria) {
addCriteria(criteria);
public Query(CriteriaDefinition criteriaDefinition) {
addCriteria(criteriaDefinition);
}
/**
* Adds the given {@link Criteria} to the current {@link Query}.
* Adds the given {@link CriteriaDefinition} to the current {@link Query}.
*
* @param criteria must not be {@literal null}.
* @param criteriaDefinition must not be {@literal null}.
* @return
* @since 1.6
*/
public Query addCriteria(Criteria criteria) {
CriteriaDefinition existing = this.criteria.get(criteria.getKey());
String key = criteria.getKey();
public Query addCriteria(CriteriaDefinition criteriaDefinition) {
CriteriaDefinition existing = this.criteria.get(criteriaDefinition.getKey());
String key = criteriaDefinition.getKey();
if (existing == null) {
this.criteria.put(key, criteria);
this.criteria.put(key, criteriaDefinition);
} else {
throw new InvalidMongoDbApiUsageException("Due to limitations of the com.mongodb.BasicDBObject, "
+ "you can't add a second '" + key + "' criteria. " + "Query already contains '"
+ existing.getCriteriaObject() + "'.");
}
return this;
}
@@ -268,8 +278,86 @@ public class Query {
return hint;
}
protected List<Criteria> getCriteria() {
return new ArrayList<Criteria>(this.criteria.values());
/**
* @param maxTimeMsec
* @return
* @see Meta#setMaxTimeMsec(long)
* @since 1.6
*/
public Query maxTimeMsec(long maxTimeMsec) {
meta.setMaxTimeMsec(maxTimeMsec);
return this;
}
/**
* @param timeout
* @param timeUnit
* @return
* @see Meta#setMaxTime(long, TimeUnit)
* @since 1.6
*/
public Query maxTime(long timeout, TimeUnit timeUnit) {
meta.setMaxTime(timeout, timeUnit);
return this;
}
/**
* @param maxScan
* @return
* @see Meta#setMaxScan(long)
* @since 1.6
*/
public Query maxScan(long maxScan) {
meta.setMaxScan(maxScan);
return this;
}
/**
* @param comment
* @return
* @see Meta#setComment(String)
* @since 1.6
*/
public Query comment(String comment) {
meta.setComment(comment);
return this;
}
/**
* @return
* @see Meta#setSnapshot(boolean)
* @since 1.6
*/
public Query useSnapshot() {
meta.setSnapshot(true);
return this;
}
/**
* @return never {@literal null}.
* @since 1.6
*/
public Meta getMeta() {
return meta;
}
/**
* @param meta must not be {@literal null}.
* @since 1.6
*/
public void setMeta(Meta meta) {
Assert.notNull(meta, "Query meta might be empty but must not be null.");
this.meta = meta;
}
protected List<CriteriaDefinition> getCriteria() {
return new ArrayList<CriteriaDefinition>(this.criteria.values());
}
/*
@@ -305,8 +393,9 @@ public class Query {
boolean hintEqual = this.hint == null ? that.hint == null : this.hint.equals(that.hint);
boolean skipEqual = this.skip == that.skip;
boolean limitEqual = this.limit == that.limit;
boolean metaEqual = nullSafeEquals(this.meta, that.meta);
return criteriaEqual && fieldsEqual && sortEqual && hintEqual && skipEqual && limitEqual;
return criteriaEqual && fieldsEqual && sortEqual && hintEqual && skipEqual && limitEqual && metaEqual;
}
/*
@@ -324,6 +413,7 @@ public class Query {
result += 31 * nullSafeHashCode(hint);
result += 31 * skip;
result += 31 * limit;
result += 31 * nullSafeHashCode(meta);
return result;
}
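
The same options are also available through the fluent methods added to Query itself; a short sketch (imports omitted):

Query query = Query.query(Criteria.where("status").is("OPEN"))
    .maxTime(2, TimeUnit.SECONDS)    // becomes $maxTimeMS
    .maxScan(10000)
    .comment("dashboard query")
    .useSnapshot();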

View File

@@ -0,0 +1,102 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.query;
/**
* A {@link Term} defines one or multiple words ({@link Type#WORD}) or phrases ({@link Type#PHRASE}) to be used in the
* context of full text search.
*
* @author Christoph Strobl
* @since 1.6
*/
public class Term {
public enum Type {
WORD, PHRASE;
}
private final Type type;
private final String raw;
private boolean negated;
/**
* Creates a new {@link Term} of {@link Type#WORD}.
*
* @param raw
*/
public Term(String raw) {
this(raw, Type.WORD);
}
/**
* Creates a new {@link Term} of given {@link Type}.
*
* @param raw
* @param type defaulted to {@link Type#WORD} if {@literal null}.
*/
public Term(String raw, Type type) {
this.raw = raw;
this.type = type == null ? Type.WORD : type;
}
/**
* Negates the term.
*
* @return
*/
public Term negate() {
this.negated = true;
return this;
}
/**
* @return true if the term is negated.
*/
public boolean isNegated() {
return negated;
}
/**
* @return type of term. Never {@literal null}.
*/
public Type getType() {
return type;
}
/**
* Get formatted representation of term.
*
* @return
*/
public String getFormatted() {
String formatted = Type.PHRASE.equals(type) ? quotePhrase(raw) : raw;
return negated ? negateRaw(formatted) : formatted;
}
@Override
public String toString() {
return getFormatted();
}
protected String quotePhrase(String raw) {
return "\"" + raw + "\"";
}
protected String negateRaw(String raw) {
return "-" + raw;
}
}
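
A quick sketch of how Term formats its raw value, based on quotePhrase(…) and negateRaw(…) above:

Term word = new Term("coffee");                            // getFormatted() -> coffee
Term phrase = new Term("coffee shop", Term.Type.PHRASE);   // getFormatted() -> "coffee shop"
Term negated = new Term("decaf").negate();                 // getFormatted() -> -decaf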

View File

@@ -0,0 +1,211 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.query;
import java.util.ArrayList;
import java.util.List;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.BasicDBObjectBuilder;
import com.mongodb.DBObject;
/**
* Implementation of {@link CriteriaDefinition} to be used for full text search.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.6
*/
public class TextCriteria implements CriteriaDefinition {
private final List<Term> terms;
private String language;
/**
* Creates a new {@link TextCriteria}.
*
* @see #forDefaultLanguage()
* @see #forLanguage(String)
*/
public TextCriteria() {
this(null);
}
private TextCriteria(String language) {
this.language = language;
this.terms = new ArrayList<Term>();
}
/**
* Returns a new {@link TextCriteria} for the default language.
*
* @return
*/
public static TextCriteria forDefaultLanguage() {
return new TextCriteria();
}
/**
* For a full list of supported languages see the MongoDB reference manual for <a
* href="http://docs.mongodb.org/manual/reference/text-search-languages/">Text Search Languages</a>.
*
* @param language
* @return
*/
public static TextCriteria forLanguage(String language) {
Assert.hasText(language, "Language must not be null or empty!");
return new TextCriteria(language);
}
/**
* Configures the {@link TextCriteria} to match any of the given words.
*
* @param words the words to match.
* @return
*/
public TextCriteria matchingAny(String... words) {
for (String word : words) {
matching(word);
}
return this;
}
/**
* Adds given {@link Term} to criteria.
*
* @param term must not be {@literal null}.
*/
public TextCriteria matching(Term term) {
Assert.notNull(term, "Term to add must not be null.");
this.terms.add(term);
return this;
}
/**
* @param term
* @return
*/
public TextCriteria matching(String term) {
if (StringUtils.hasText(term)) {
matching(new Term(term));
}
return this;
}
/**
* @param term
* @return
*/
public TextCriteria notMatching(String term) {
if (StringUtils.hasText(term)) {
matching(new Term(term, Term.Type.WORD).negate());
}
return this;
}
/**
* @param words
* @return
*/
public TextCriteria notMatchingAny(String... words) {
for (String word : words) {
notMatching(word);
}
return this;
}
/**
* The given value will be treated as a single phrase.
*
* @param phrase
* @return
*/
public TextCriteria notMatchingPhrase(String phrase) {
if (StringUtils.hasText(phrase)) {
matching(new Term(phrase, Term.Type.PHRASE).negate());
}
return this;
}
/**
* The given value will be treated as a single phrase.
*
* @param phrase
* @return
*/
public TextCriteria matchingPhrase(String phrase) {
if (StringUtils.hasText(phrase)) {
matching(new Term(phrase, Term.Type.PHRASE));
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.CriteriaDefinition#getKey()
*/
@Override
public String getKey() {
return "$text";
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.CriteriaDefinition#getCriteriaObject()
*/
@Override
public DBObject getCriteriaObject() {
BasicDBObjectBuilder builder = new BasicDBObjectBuilder();
if (StringUtils.hasText(language)) {
builder.add("$language", language);
}
if (!terms.isEmpty()) {
builder.add("$search", join(terms));
}
return new BasicDBObject("$text", builder.get());
}
private String join(Iterable<Term> terms) {
List<String> result = new ArrayList<String>();
for (Term term : terms) {
if (term != null) {
result.add(term.getFormatted());
}
}
return StringUtils.collectionToDelimitedString(result, " ");
}
}
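
A sketch of the criteria object produced by chaining the methods above:

TextCriteria criteria = TextCriteria.forDefaultLanguage()
    .matchingAny("coffee", "cake")
    .notMatching("decaf")
    .matchingPhrase("coffee shop");

// criteria.getCriteriaObject() roughly yields:
// { "$text" : { "$search" : "coffee cake -decaf \"coffee shop\"" } }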

View File

@@ -0,0 +1,188 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.query;
import java.util.Locale;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* {@link Query} implementation to be used for performing full text searches.
*
* @author Christoph Strobl
* @since 1.6
*/
public class TextQuery extends Query {
private final String DEFAULT_SCORE_FIELD_FIELDNAME = "score";
private final DBObject META_TEXT_SCORE = new BasicDBObject("$meta", "textScore");
private String scoreFieldName = DEFAULT_SCORE_FIELD_FIELDNAME;
private boolean includeScore = false;
private boolean sortByScore = false;
/**
* Creates a new {@link TextQuery} using the given {@code wordsAndPhrases} with {@link TextCriteria}.
*
* @param wordsAndPhrases
* @see TextCriteria#matching(String)
*/
public TextQuery(String wordsAndPhrases) {
super(TextCriteria.forDefaultLanguage().matching(wordsAndPhrases));
}
/**
* Creates a new {@link TextQuery} in the given {@code language}. <br />
* For a full list of supported languages see the MongoDB reference manual for <a
* href="http://docs.mongodb.org/manual/reference/text-search-languages/">Text Search Languages</a>.
*
* @param wordsAndPhrases
* @param language
* @see TextCriteria#forLanguage(String)
* @see TextCriteria#matching(String)
*/
public TextQuery(String wordsAndPhrases, String language) {
super(TextCriteria.forLanguage(language).matching(wordsAndPhrases));
}
/**
* Creates a new {@link TextQuery} using the {@code locale}'s language.<br />
* For a full list of supported languages see the MongoDB reference manual for <a
* href="http://docs.mongodb.org/manual/reference/text-search-languages/">Text Search Languages</a>.
*
* @param wordsAndPhrases
* @param locale
*/
public TextQuery(String wordsAndPhrases, Locale locale) {
this(wordsAndPhrases, locale != null ? locale.getLanguage() : (String) null);
}
/**
* Creates a new {@link TextQuery} for the given {@link TextCriteria}.
*
* @param criteria must not be {@literal null}.
*/
public TextQuery(TextCriteria criteria) {
super(criteria);
}
/**
* Creates a new {@link TextQuery} searching for the given {@link TextCriteria}.
*
* @param criteria
* @return
*/
public static TextQuery queryText(TextCriteria criteria) {
return new TextQuery(criteria);
}
/**
* Add sorting by text score. Will also add text score to returned fields.
*
* @see TextQuery#includeScore()
* @return
*/
public TextQuery sortByScore() {
this.includeScore();
this.sortByScore = true;
return this;
}
/**
* Add a field {@literal score} holding the document's textScore to the returned fields.
*
* @return
*/
public TextQuery includeScore() {
this.includeScore = true;
return this;
}
/**
* Include text search document score in returned fields using the given fieldname.
*
* @param fieldname
* @return
*/
public TextQuery includeScore(String fieldname) {
setScoreFieldName(fieldname);
includeScore();
return this;
}
/**
* Set the fieldname used for scoring.
*
* @param fieldName
*/
public void setScoreFieldName(String fieldName) {
this.scoreFieldName = fieldName;
}
/**
* Get the fieldname used for scoring
*
* @return
*/
public String getScoreFieldName() {
return scoreFieldName;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.Query#getFieldsObject()
*/
@Override
public DBObject getFieldsObject() {
if (!this.includeScore) {
return super.getFieldsObject();
}
DBObject fields = super.getFieldsObject();
if (fields == null) {
fields = new BasicDBObject();
}
fields.put(getScoreFieldName(), META_TEXT_SCORE);
return fields;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.Query#getSortObject()
*/
@Override
public DBObject getSortObject() {
DBObject sort = new BasicDBObject();
if (this.sortByScore) {
sort.put(getScoreFieldName(), META_TEXT_SCORE);
}
if (super.getSortObject() != null) {
sort.putAll(super.getSortObject());
}
return sort;
}
}
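
A sketch of running a full text search with TextQuery; the MyDocument type and the MongoOperations instance named operations are assumptions (imports omitted):

TextQuery query = TextQuery.queryText(TextCriteria.forDefaultLanguage().matching("coffee"))
    .sortByScore()                 // adds the score to the projection and sorts by it
    .includeScore("textScore");    // expose the score under a custom field name

List<MyDocument> result = operations.find(query, MyDocument.class);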

View File

@@ -15,10 +15,11 @@
*/
package org.springframework.data.mongodb.core.query;
import static org.springframework.util.ObjectUtils.*;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.LinkedHashMap;
import java.util.List;
@@ -40,6 +41,7 @@ import com.mongodb.DBObject;
* @author Oliver Gierke
* @author Becca Gaspard
* @author Christoph Strobl
* @author Thomas Darimont
*/
public class Update {
@@ -278,7 +280,36 @@ public class Update {
return this;
}
/**
* Update given key to current date using {@literal $currentDate} modifier.
*
* @see http://docs.mongodb.org/manual/reference/operator/update/currentDate/
* @param key
* @return
* @since 1.6
*/
public Update currentDate(String key) {
addMultiFieldOperation("$currentDate", key, true);
return this;
}
/**
* Update given key to current date using {@literal $currentDate : &#123; $type : "timestamp" &#125;} modifier.
*
* @see http://docs.mongodb.org/manual/reference/operator/update/currentDate/
* @param key
* @return
* @since 1.6
*/
public Update currentTimestamp(String key) {
addMultiFieldOperation("$currentDate", key, new BasicDBObject("$type", "timestamp"));
return this;
}
public DBObject getUpdateObject() {
DBObject dbo = new BasicDBObject();
for (String k : modifierOps.keySet()) {
dbo.put(k, modifierOps.get(k));
@@ -335,14 +366,52 @@ public class Update {
return StringUtils.startsWithIgnoreCase(key, "$");
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return getUpdateObject().hashCode();
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || getClass() != obj.getClass()) {
return false;
}
Update that = (Update) obj;
return this.getUpdateObject().equals(that.getUpdateObject());
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return SerializationUtils.serializeToJsonSafely(getUpdateObject());
}
/**
* Modifiers holds a distinct collection of {@link Modifier} instances.
*
* @author Christoph Strobl
* @author Thomas Darimont
*/
public static class Modifiers {
private HashMap<String, Modifier> modifiers;
private Map<String, Modifier> modifiers;
public Modifiers() {
this.modifiers = new LinkedHashMap<String, Modifier>(1);
@@ -355,6 +424,33 @@ public class Update {
public void addModifier(Modifier modifier) {
this.modifiers.put(modifier.getKey(), modifier);
}
/* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return nullSafeHashCode(modifiers);
}
/* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || getClass() != obj.getClass()) {
return false;
}
Modifiers that = (Modifiers) obj;
return this.modifiers.equals(that.modifiers);
}
}
/**
@@ -379,6 +475,7 @@ public class Update {
* Implementation of {@link Modifier} representing {@code $each}.
*
* @author Christoph Strobl
* @author Thomas Darimont
*/
private static class Each implements Modifier {
@@ -399,6 +496,7 @@ public class Update {
}
Object[] convertedValues = new Object[values.length];
for (int i = 0; i < values.length; i++) {
convertedValues[i] = values[i];
}
@@ -406,21 +504,57 @@ public class Update {
return convertedValues;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.Update.Modifier#getKey()
*/
@Override
public String getKey() {
return "$each";
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.Update.Modifier#getValue()
*/
@Override
public Object getValue() {
return this.values;
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return nullSafeHashCode(values);
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object that) {
if (this == that) {
return true;
}
if (that == null || getClass() != that.getClass()) {
return false;
}
return nullSafeEquals(values, ((Each) that).values);
}
}
/**
* Builder for creating {@code $push} modifiers
*
* @author Christoph Strobl
* @author Thomas Darimont
*/
public class PushOperatorBuilder {
@@ -453,6 +587,50 @@ public class Update {
public Update value(Object value) {
return Update.this.push(key, value);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += 31 * result + getOuterType().hashCode();
result += 31 * result + nullSafeHashCode(key);
result += 31 * result + nullSafeHashCode(modifiers);
return result;
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || getClass() != obj.getClass()) {
return false;
}
PushOperatorBuilder that = (PushOperatorBuilder) obj;
if (!getOuterType().equals(that.getOuterType())) {
return false;
}
return nullSafeEquals(this.key, that.key) && nullSafeEquals(this.modifiers, that.modifiers);
}
private Update getOuterType() {
return Update.this;
}
}
/**
@@ -488,7 +666,5 @@ public class Update {
public Update value(Object value) {
return Update.this.addToSet(this.key, value);
}
}
}
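
A short sketch of the new $currentDate support added above; field names are illustrative:

Update update = new Update()
    .currentDate("lastModified")
    .currentTimestamp("syncedAt");

// update.getUpdateObject() roughly yields:
// { "$currentDate" : { "lastModified" : true, "syncedAt" : { "$type" : "timestamp" } } }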

View File

@@ -0,0 +1,64 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository;
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import org.springframework.data.annotation.QueryAnnotation;
/**
* @author Christoph Strobl
* @since 1.6
*/
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@Documented
@QueryAnnotation
public @interface Meta {
/**
* Set the maximum time limit in milliseconds for processing operations.
*
* @return
*/
long maxExcecutionTime() default -1;
/**
* Only scan the specified number of documents.
*
* @return
*/
long maxScanDocuments() default -1;
/**
* Add a comment to the query.
*
* @return
*/
String comment() default "";
/**
* Using snapshot prevents the cursor from returning a document more than once.
*
* @return
*/
boolean snapshot() default false;
}
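
A sketch of applying the annotation to a repository finder method; the repository and domain type are invented, and the attribute names are used exactly as declared above:

interface PersonRepository extends MongoRepository<Person, String> {

    @Meta(comment = "find by lastname", maxExcecutionTime = 500, snapshot = true)
    List<Person> findByLastname(String lastname);
}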

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2012 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,6 +26,7 @@ import org.springframework.data.repository.PagingAndSortingRepository;
* Mongo specific {@link org.springframework.data.repository.Repository} interface.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
@NoRepositoryBean
public interface MongoRepository<T, ID extends Serializable> extends PagingAndSortingRepository<T, ID> {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012 the original author or authors.
* Copyright 2012-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -31,6 +31,7 @@ import org.springframework.util.Assert;
* {@link CdiRepositoryBean} to create Mongo repository instances.
*
* @author Oliver Gierke
* @author Mark Paluch
*/
public class MongoRepositoryBean<T> extends CdiRepositoryBean<T> {
@@ -43,11 +44,13 @@ public class MongoRepositoryBean<T> extends CdiRepositoryBean<T> {
* @param qualifiers must not be {@literal null}.
* @param repositoryType must not be {@literal null}.
* @param beanManager must not be {@literal null}.
* @param customImplementationBean the bean for the custom implementation of the
* {@link org.springframework.data.repository.Repository}, can be {@literal null}.
*/
public MongoRepositoryBean(Bean<MongoOperations> operations, Set<Annotation> qualifiers, Class<T> repositoryType,
BeanManager beanManager) {
BeanManager beanManager, Bean<?> customImplementationBean) {
super(qualifiers, repositoryType, beanManager);
super(qualifiers, repositoryType, beanManager, customImplementationBean);
Assert.notNull(operations);
this.operations = operations;
@@ -58,11 +61,11 @@ public class MongoRepositoryBean<T> extends CdiRepositoryBean<T> {
* @see org.springframework.data.repository.cdi.CdiRepositoryBean#create(javax.enterprise.context.spi.CreationalContext, java.lang.Class)
*/
@Override
protected T create(CreationalContext<T> creationalContext, Class<T> repositoryType) {
protected T create(CreationalContext<T> creationalContext, Class<T> repositoryType, Object customImplementation) {
MongoOperations mongoOperations = getDependencyInstance(operations, MongoOperations.class);
MongoRepositoryFactory factory = new MongoRepositoryFactory(mongoOperations);
return factory.getRepository(repositoryType);
return factory.getRepository(repositoryType, customImplementation);
}
}

View File

@@ -40,6 +40,7 @@ import org.springframework.data.repository.cdi.CdiRepositoryExtensionSupport;
* CDI extension to export Mongo repositories.
*
* @author Oliver Gierke
* @author Mark Paluch
*/
public class MongoRepositoryExtension extends CdiRepositoryExtensionSupport {
@@ -109,7 +110,10 @@ public class MongoRepositoryExtension extends CdiRepositoryExtensionSupport {
MongoOperations.class.getName(), qualifiers));
}
Bean<?> customImplementationBean = getCustomImplementationBean(repositoryType, beanManager, qualifiers);
// Construct and return the repository bean.
return new MongoRepositoryBean<T>(mongoOperations, qualifiers, repositoryType, beanManager);
return new MongoRepositoryBean<T>(mongoOperations, qualifiers, repositoryType, beanManager,
customImplementationBean);
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012 the original author or authors.
* Copyright 2012-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,6 +15,10 @@
*/
package org.springframework.data.mongodb.repository.config;
import java.lang.annotation.Annotation;
import java.util.Collection;
import java.util.Collections;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
@@ -22,7 +26,9 @@ import org.springframework.beans.factory.support.RootBeanDefinition;
import org.springframework.core.annotation.AnnotationAttributes;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mongodb.config.BeanNames;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.support.MongoRepositoryFactoryBean;
import org.springframework.data.repository.config.AnnotationRepositoryConfigurationSource;
import org.springframework.data.repository.config.RepositoryConfigurationExtension;
@@ -43,6 +49,15 @@ public class MongoRepositoryConfigurationExtension extends RepositoryConfigurati
private boolean fallbackMappingContextCreated = false;
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport#getModuleName()
*/
@Override
public String getModuleName() {
return "MongoDB";
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport#getModulePrefix()
@@ -60,6 +75,24 @@ public class MongoRepositoryConfigurationExtension extends RepositoryConfigurati
return MongoRepositoryFactoryBean.class.getName();
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport#getIdentifyingAnnotations()
*/
@Override
protected Collection<Class<? extends Annotation>> getIdentifyingAnnotations() {
return Collections.<Class<? extends Annotation>> singleton(Document.class);
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport#getIdentifyingTypes()
*/
@Override
protected Collection<Class<?>> getIdentifyingTypes() {
return Collections.<Class<?>> singleton(MongoRepository.class);
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport#postProcess(org.springframework.beans.factory.support.BeanDefinitionBuilder, org.springframework.data.repository.config.RepositoryConfigurationSource)
@@ -105,6 +138,8 @@ public class MongoRepositoryConfigurationExtension extends RepositoryConfigurati
@Override
public void registerBeansForRoot(BeanDefinitionRegistry registry, RepositoryConfigurationSource configurationSource) {
super.registerBeansForRoot(registry, configurationSource);
if (!registry.containsBeanDefinition(BeanNames.MAPPING_CONTEXT_BEAN_NAME)) {
RootBeanDefinition definition = new RootBeanDefinition(MongoMappingContext.class);

View File

@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.repository.query;
import java.util.Collections;
import java.util.List;
import org.springframework.core.convert.ConversionService;
@@ -25,6 +26,7 @@ import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Slice;
import org.springframework.data.domain.SliceImpl;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.GeoPage;
import org.springframework.data.geo.GeoResult;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.geo.Point;
@@ -84,6 +86,8 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
MongoParameterAccessor accessor = new MongoParametersParameterAccessor(method, parameters);
Query query = createQuery(new ConvertingParameterAccessor(operations.getConverter(), accessor));
applyQueryMetaAttributesWhenPresent(query);
Object result = null;
if (isDeleteQuery()) {
@@ -119,6 +123,14 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
return CONVERSION_SERVICE.convert(result, expectedReturnType);
}
private Query applyQueryMetaAttributesWhenPresent(Query query) {
if (method.hasQueryMetaAttributes()) {
query.setMeta(method.getQueryMetaAttributes());
}
return query;
}
/**
* Creates a {@link Query} instance using the given {@link ConvertingParameterAccessor}. Will delegate to
* {@link #createQuery(ConvertingParameterAccessor)} by default but allows customization of the count query to be
@@ -128,7 +140,12 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
* @return
*/
protected Query createCountQuery(ConvertingParameterAccessor accessor) {
return createQuery(accessor);
Query query = createQuery(accessor);
applyQueryMetaAttributesWhenPresent(query);
return query;
}
/**
@@ -254,10 +271,26 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
Object execute(Query query) {
MongoEntityMetadata<?> metadata = method.getEntityInformation();
int overallLimit = query.getLimit();
long count = operations.count(query, metadata.getCollectionName());
count = overallLimit != 0 ? Math.min(count, query.getLimit()) : count;
List<?> result = operations.find(query.with(pageable), metadata.getJavaType(), metadata.getCollectionName());
boolean pageableOutOfScope = pageable.getOffset() > count;
if (pageableOutOfScope) {
return new PageImpl<Object>(Collections.emptyList(), pageable, count);
}
// Apply raw pagination
query = query.with(pageable);
// Adjust limit if page would exceed the overall limit
if (overallLimit != 0 && pageable.getOffset() + pageable.getPageSize() > overallLimit) {
query.limit(overallLimit - pageable.getOffset());
}
List<?> result = operations.find(query, metadata.getJavaType(), metadata.getCollectionName());
return new PageImpl(result, pageable, count);
}
}
@@ -293,7 +326,6 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
*
* @author Oliver Gierke
*/
@SuppressWarnings("deprecation")
final class GeoNearExecution extends Execution {
private final MongoParameterAccessor accessor;
@@ -325,12 +357,11 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
MongoEntityMetadata<?> metadata = method.getEntityInformation();
long count = operations.count(countQuery, metadata.getCollectionName());
return new org.springframework.data.mongodb.core.geo.GeoPage<Object>(doExecuteQuery(query),
accessor.getPageable(), count);
return new GeoPage<Object>(doExecuteQuery(query), accessor.getPageable(), count);
}
@SuppressWarnings("unchecked")
private org.springframework.data.mongodb.core.geo.GeoResults<Object> doExecuteQuery(Query query) {
private GeoResults<Object> doExecuteQuery(Query query) {
Point nearLocation = accessor.getGeoNearLocation();
NearQuery nearQuery = NearQuery.near(nearLocation);
@@ -350,8 +381,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
}
MongoEntityMetadata<?> metadata = method.getEntityInformation();
return (org.springframework.data.mongodb.core.geo.GeoResults<Object>) operations.geoNear(nearQuery,
metadata.getJavaType(), metadata.getCollectionName());
return (GeoResults<Object>) operations.geoNear(nearQuery, metadata.getJavaType(), metadata.getCollectionName());
}
private boolean isListOfGeoResult() {
@@ -391,7 +421,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
return operations.findAllAndRemove(query, metadata.getJavaType());
}
WriteResult writeResult = operations.remove(query, metadata.getCollectionName());
WriteResult writeResult = operations.remove(query, metadata.getJavaType(), metadata.getCollectionName());
return writeResult != null ? writeResult.getN() : 0L;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,6 +27,7 @@ import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.query.TextCriteria;
import org.springframework.data.repository.query.ParameterAccessor;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
@@ -38,6 +39,7 @@ import com.mongodb.DBRef;
* Custom {@link ParameterAccessor} that uses a {@link MongoWriter} to serialize parameters into Mongo format.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class ConvertingParameterAccessor implements MongoParameterAccessor {
@@ -110,6 +112,14 @@ public class ConvertingParameterAccessor implements MongoParameterAccessor {
return delegate.getGeoNearLocation();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.MongoParameterAccessor#getFullText()
*/
public TextCriteria getFullText() {
return delegate.getFullText();
}
/**
* Converts the given value with the underlying {@link MongoWriter}.
*

View File

@@ -17,12 +17,14 @@ package org.springframework.data.mongodb.repository.query;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.query.TextCriteria;
import org.springframework.data.repository.query.ParameterAccessor;
/**
* Mongo-specific {@link ParameterAccessor} exposing a maximum distance parameter.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
public interface MongoParameterAccessor extends ParameterAccessor {
@@ -40,4 +42,12 @@ public interface MongoParameterAccessor extends ParameterAccessor {
* @return
*/
Point getGeoNearLocation();
/**
* Returns the {@link TextCriteria} to be used for a full text query.
*
* @return null if not set.
* @since 1.6
*/
TextCriteria getFullText();
}
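
A hypothetical repository method whose TextCriteria argument would be surfaced through getFullText(); the repository and domain type names are assumptions, shown only as a sketch:

    // org.springframework.data.mongodb.repository.MongoRepository
    interface PersonRepository extends MongoRepository<Person, String> {
        List<Person> findAllBy(TextCriteria criteria); // resolved via getFullText() at query creation time
    }
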

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,6 +22,7 @@ import java.util.List;
import org.springframework.core.MethodParameter;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.query.TextCriteria;
import org.springframework.data.mongodb.repository.Near;
import org.springframework.data.mongodb.repository.query.MongoParameters.MongoParameter;
import org.springframework.data.repository.query.Parameter;
@@ -31,10 +32,13 @@ import org.springframework.data.repository.query.Parameters;
* Custom extension of {@link Parameters} discovering additional MongoDB-specific parameter types.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class MongoParameters extends Parameters<MongoParameters, MongoParameter> {
private final Integer distanceIndex;
private final Integer fullTextIndex;
private Integer nearIndex;
/**
@@ -48,6 +52,7 @@ public class MongoParameters extends Parameters<MongoParameters, MongoParameter>
super(method);
List<Class<?>> parameterTypes = Arrays.asList(method.getParameterTypes());
this.distanceIndex = parameterTypes.indexOf(Distance.class);
this.fullTextIndex = parameterTypes.indexOf(TextCriteria.class);
if (this.nearIndex == null && isGeoNearMethod) {
this.nearIndex = getNearIndex(parameterTypes);
@@ -56,19 +61,19 @@ public class MongoParameters extends Parameters<MongoParameters, MongoParameter>
}
}
private MongoParameters(List<MongoParameter> parameters, Integer distanceIndex, Integer nearIndex) {
private MongoParameters(List<MongoParameter> parameters, Integer distanceIndex, Integer nearIndex,
Integer fullTextIndex) {
super(parameters);
this.distanceIndex = distanceIndex;
this.nearIndex = nearIndex;
this.fullTextIndex = fullTextIndex;
}
@SuppressWarnings({ "unchecked", "deprecation" })
private final int getNearIndex(List<Class<?>> parameterTypes) {
for (Class<?> reference : Arrays.asList(Point.class, org.springframework.data.mongodb.core.geo.Point.class,
double[].class)) {
for (Class<?> reference : Arrays.asList(Point.class, double[].class)) {
int nearIndex = parameterTypes.indexOf(reference);
@@ -124,13 +129,31 @@ public class MongoParameters extends Parameters<MongoParameters, MongoParameter>
return nearIndex;
}
/**
* Returns the index of the parameter to be used as a text query parameter.
*
* @return
* @since 1.6
*/
public int getFullTextParameterIndex() {
return fullTextIndex != null ? fullTextIndex.intValue() : -1;
}
/**
* @return
* @since 1.6
*/
public boolean hasFullTextParameter() {
return this.fullTextIndex != null && this.fullTextIndex.intValue() >= 0;
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.Parameters#createFrom(java.util.List)
*/
@Override
protected MongoParameters createFrom(List<MongoParameter> parameters) {
return new MongoParameters(parameters, this.distanceIndex, this.nearIndex);
return new MongoParameters(parameters, this.distanceIndex, this.nearIndex, this.fullTextIndex);
}
/**
@@ -162,7 +185,8 @@ public class MongoParameters extends Parameters<MongoParameters, MongoParameter>
*/
@Override
public boolean isSpecialParameter() {
return super.isSpecialParameter() || Distance.class.isAssignableFrom(getType()) || isNearParameter();
return super.isSpecialParameter() || Distance.class.isAssignableFrom(getType()) || isNearParameter()
|| TextCriteria.class.isAssignableFrom(getType());
}
private boolean isNearParameter() {
@@ -181,5 +205,7 @@ public class MongoParameters extends Parameters<MongoParameters, MongoParameter>
private boolean hasNearAnnotation() {
return parameter.getParameterAnnotation(Near.class) != null;
}
}
}

View File

@@ -17,12 +17,17 @@ package org.springframework.data.mongodb.repository.query;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.query.Term;
import org.springframework.data.mongodb.core.query.TextCriteria;
import org.springframework.data.repository.query.ParametersParameterAccessor;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
/**
* Mongo-specific {@link ParametersParameterAccessor} to allow access to the {@link Distance} parameter.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class MongoParametersParameterAccessor extends ParametersParameterAccessor implements MongoParameterAccessor {
@@ -77,4 +82,35 @@ public class MongoParametersParameterAccessor extends ParametersParameterAccesso
return (Point) value;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.MongoParameterAccessor#getFullText()
*/
@Override
public TextCriteria getFullText() {
int index = method.getParameters().getFullTextParameterIndex();
return index >= 0 ? potentiallyConvertFullText(getValue(index)) : null;
}
protected TextCriteria potentiallyConvertFullText(Object fullText) {
Assert.notNull(fullText, "Fulltext parameter must not be 'null'.");
if (fullText instanceof String) {
return TextCriteria.forDefaultLanguage().matching((String) fullText);
}
if (fullText instanceof Term) {
return TextCriteria.forDefaultLanguage().matching((Term) fullText);
}
if (fullText instanceof TextCriteria) {
return ((TextCriteria) fullText);
}
throw new IllegalArgumentException(String.format(
"Expected full text parameter to be one of String, Term or TextCriteria but found %s.",
ClassUtils.getShortName(fullText.getClass())));
}
}
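
Illustrative values showing the three argument types the conversion above accepts (the sample search terms are made up):

    TextCriteria fromString = TextCriteria.forDefaultLanguage().matching("coffee");
    TextCriteria fromTerm   = TextCriteria.forDefaultLanguage().matching(new Term("coffee"));
    TextCriteria passedAsIs = TextCriteria.forDefaultLanguage().matchingAny("coffee", "cake");
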

View File

@@ -146,11 +146,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
@Override
protected Query complete(Criteria criteria, Sort sort) {
if (criteria == null) {
return null;
}
Query query = new Query(criteria).with(sort);
Query query = (criteria == null ? new Query() : new Query(criteria)).with(sort);
if (LOG.isDebugEnabled()) {
LOG.debug("Created query " + query);

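With the change above, a derived query without any criteria now yields a match-all Query rather than null. A minimal sketch of the resulting behavior; the sort property is an assumption:

    Query query = new Query().with(new Sort(Sort.Direction.ASC, "lastname")); // matches every document, sorted by lastname
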
Some files were not shown because too many files have changed in this diff.