Compare commits


56 Commits

Author SHA1 Message Date
Spring Buildmaster
1a46abfbb9 DATAMONGO-740 - Release version 1.3.0.RELEASE. 2013-09-09 15:56:20 -07:00
Thomas Darimont
61284228dd DATAMONGO-740 - Prepare 1.3.0 RELEASE. 2013-09-09 15:50:04 -07:00
Thomas Darimont
8cb92de1ee DATAMONGO-445 - Allow to skip unnecessary elements in NearQuery.
Added support for skipping elements for NearQuery in MongoTemplate. As MongoDB currently (2.4.4) doesn't support the skipping of elements in geoNear queries, we skip the unnecessary elements ourselves, using the limit & skip information from the given query or an explicitly passed Pageable.

Original pull request: #64.
2013-08-22 09:20:19 +02:00
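The client-side skipping described in the commit above can be sketched independently of the Spring API; `applySkipAndLimit` and its parameters are illustrative names, not part of Spring Data MongoDB:

```java
import java.util.Arrays;
import java.util.List;

public class GeoNearSkipSketch {

	// Emulates client-side paging over a geoNear result list, since
	// MongoDB 2.4.4 applies no skip to geoNear results itself.
	static <T> List<T> applySkipAndLimit(List<T> results, int skip, int limit) {
		int from = Math.min(skip, results.size());
		int to = Math.min(from + limit, results.size());
		return results.subList(from, to);
	}

	public static void main(String[] args) {
		List<String> hits = Arrays.asList("a", "b", "c", "d", "e");
		System.out.println(applySkipAndLimit(hits, 2, 2)); // [c, d]
	}
}
```

Clamping both bounds to the list size keeps the sketch safe when the requested page lies beyond the result set.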
Oliver Gierke
5d3cc3fa04 DATAMONGO-743 - Added DBObjectToStringConverter.
Added a reading converter that dumps DBObject instances directly into a String, to enable executing queries against MongoDB and obtaining a String version of the query result without marshaling the DBObject into a Map first.
2013-08-22 08:55:07 +02:00
Thomas Darimont
c0b99740dc DATAMONGO-742 - Document CDI integration in reference documentation.
Added chapter for CDI Integration under the new chapter Miscellaneous.

Original pull request: #63.
2013-08-13 12:20:58 +02:00
Randy Watler
595bbd3aa7 DATAMONGO-737 - Register TransactionSynchronization holder once per Mongo instance.
Original pull request: #62.
2013-08-12 09:51:02 +02:00
Chuong Ngo
5d2fc31164 DATAMONGO-738 - Allow to pass collectionName along with entityClass as parameter to update methods in MongoTemplate.
Added appropriate overloaded methods to MongoOperations and MongoTemplate. Applied pull request from Chuong Ngo <chuong.h.ngo.ctr@mail.mil>.
Original pull request: #57.
2013-08-09 15:30:00 +02:00
Thomas Darimont
a9dc0fae69 DATAMONGO-725 - Improve configurability and documentation of TypeMapper on MappingMongoConverter.
Added new attribute type-mapper-ref to the mapping-converter element in spring-mongo-1.3.xsd in order to support the configuration of custom type mappers. Removed the unsupported attributes "mongo-ref" and "mongo-template-ref" from the mapping-converter element in spring-mongo-1.3.xsd because they are no longer considered.

Updated MappingMongoConverterParser to be aware of the new attribute. Added examples for configuring a custom MongoTypeMapper and the usage of @TypeAlias to the reference documentation.

Original pull request: #61.
2013-08-09 13:44:53 +02:00
Thomas Darimont
0605c7b753 DATAMONGO-507 - Reject incorrect usage of Criteria#not().
Added a guard to Criteria#(and|or|nor)Operator to prevent wrapping $and, $or or $nor expressions in a $not expression as mongodb currently doesn't support this. Added test case to CriteriaTests to verify that not() works as specified.

Original pull request: #60.
2013-08-09 12:23:25 +02:00
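The guard described above can be sketched with plain maps instead of the actual Criteria class; the class and method names here are illustrative, not the Spring Data API:

```java
import java.util.Collections;
import java.util.Map;

public class NotGuardSketch {

	// MongoDB does not allow $not to wrap $and/$or/$nor expressions,
	// so reject them up front instead of sending an invalid query.
	static Map<String, Object> not(Map<String, Object> criteria) {
		for (String key : criteria.keySet()) {
			if (key.equals("$and") || key.equals("$or") || key.equals("$nor")) {
				throw new IllegalArgumentException(
						"Operator " + key + " must not be wrapped in $not");
			}
		}
		return Collections.singletonMap("$not", (Object) criteria);
	}
}
```

Failing fast on the Java side surfaces the misuse at build time of the query object rather than as an opaque server error.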
Thomas Darimont
21352a8829 DATAMONGO-602 - Added test case to MongoTemplate tests to verify that querying by BigInteger ids works.
Original pull request: #59.
2013-08-07 18:58:32 +02:00
Oliver Gierke
58e1d2dbd9 DATAMONGO-741 - Fixed check for nested property references in aggregation framework.
Fixed using the actual field reference instead of the field name on resolving. Added equals(…) and hashCode() methods to value objects. Added unit tests for TypeBasedAggregationOperationContext.
2013-08-07 10:21:48 +02:00
Spring Buildmaster
4f7821e3c2 DATAMONGO-732 - Prepare next development iteration. 2013-08-05 08:48:22 -07:00
Spring Buildmaster
9dd866e34a DATAMONGO-732 - Release version 1.3.0.RC1. 2013-08-05 08:48:15 -07:00
Oliver Gierke
def6079795 DATAMONGO-732 - Prepare 1.3.0.RC1.
Updated changelog, notice and readme. Upgraded to Spring Data Build parent 1.1.1.RELEASE. Upgraded to Spring Data Commons RC1. Switched to milestone repository. Updated links to the parts of the Spring Data Commons reference documentation.
2013-08-05 17:33:58 +02:00
Thomas Darimont
f3f537c1a6 DATAMONGO-586 - Adjusted examples in reference documentation.
Modified formatting and moved the detailed descriptions below the example code.
Added example for arithmetic operations in projection operations.
2013-08-05 17:20:37 +02:00
Thomas Darimont
ad44db386b DATAMONGO-586 - Adjusted examples in reference documentation.
Changed examples in reference documentation to reflect the new DSL style.
2013-08-05 14:59:15 +02:00
Thomas Darimont
bcc3bf61b6 DATAMONGO-586 - Add initial support for arithmetic expressions.
Added test cases to ProjectionOperationUnitTests. Adjusted DSL for GroupOperation to be similar to ProjectionOperation. Introduced GroupOperationBuilder to GroupOperation to be able to define an alias for the current GroupOperation. Adjusted test cases in AggregationTests to the new DSL style accordingly. Added test cases to GroupOperationUnitTests for push and addToSet.
2013-08-05 14:58:10 +02:00
Oliver Gierke
1a28a294d1 DATAMONGO-586 - A bit of Jürgenization on ProjectionOperations. 2013-08-05 14:58:09 +02:00
Thomas Darimont
14623a3655 DATAMONGO-586 - Add initial support for arithmetic expressions.
ProjectionOperationBuilder now implements AggregationOperation in order to support aliased as well as non-aliased projection operation expressions. Added test case for arithmetic operations to AggregationTests. Added Product domain class to be able to demonstrate some meaningful arithmetic operations. Applied changes from code review. Added internal private remove method to ProjectionOperation to allow the previous operation to support aliasing.
2013-08-05 14:56:45 +02:00
Thomas Darimont
6dcaa31897 DATAMONGO-586 - Added chapter for Aggregation Framework support to the reference documentation.
Added chapter to mongodb.xml.
Added myself to the authors list in index.xml
2013-08-05 14:56:45 +02:00
Oliver Gierke
e57fe346c0 DATAMONGO-586 - Next round of polishing.
Refined the way the aggregation pipeline gets rendered into a DBObject. More tests. Added $avg and shortcut methods to GroupOperations. Fixed ProjectionOperation to use 1 for implicit references. Made ProjectionOperation publicly visible. Added automatic result unwrapping. API consistency, tests, JavaDoc polish.
2013-08-05 14:55:36 +02:00
Thomas Darimont
7dd94949d5 DATAMONGO-586 - Initial support for automatic field reference resolution.
Added automatic field reference resolution, which removes the need for in-depth knowledge of how an aggregation step structures its output.
Introduced AggregateOperationContext abstraction to hold the information of available fields for an aggregation step.
Introduced ContextConsumingAggregateOperation and ContextProducingAggregateOperation abstractions to be able to distinguish operations.
Updated test cases to reflect the API changes.
2013-08-05 14:55:35 +02:00
Oliver Gierke
966f971bee DATAMONGO-586 - Some refactorings in the integration test. 2013-08-05 14:55:34 +02:00
Thomas Darimont
aa23c579e8 DATAMONGO-586 - Added test case for complex aggregation framework use case.
Added Zipcode sample dataset from 10gen. Allow Projections to be used in conjunction with GroupOperations. Integrated and refactored the GitHub contribution by Tobias Trelle and Sebastian Herold. Switched from builder-style to static-factory-based DSL construction of aggregation specifications. Introduced embedded DSL for convenient construction of aggregation specifications. Added test cases based on MongoDB aggregation framework examples. Added more test cases and additional JavaDoc. Added test case for unwind operation (returnFiveMostCommonLikes) in AggregationTests. Other test cases should now also run in CI environments, due to deterministic result ordering. Adjusted write concern to ensure persistence of sample data.

Introduced TypedAggregation which holds type information of the input type of an aggregation. Cleaned up aggregate methods on MongoOperations. Removed HasToDBObject interface. Cleaned up constructors for Aggregation and TypedAggregation.
2013-08-05 14:49:48 +02:00
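Independently of the actual Aggregation DSL, the core idea — rendering a sequence of pipeline operations into the stage documents MongoDB expects — can be sketched with plain maps (all names here are illustrative):

```java
import java.util.AbstractMap;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Map;

public class PipelineSketch {

	// Renders each stage as a single-key document, e.g. {"$match": {...}},
	// and collects the documents in pipeline order.
	static List<Map<String, Object>> render(List<Map.Entry<String, Object>> stages) {
		List<Map<String, Object>> pipeline = new ArrayList<Map<String, Object>>();
		for (Map.Entry<String, Object> stage : stages) {
			pipeline.add(Collections.singletonMap(stage.getKey(), stage.getValue()));
		}
		return pipeline;
	}

	public static void main(String[] args) {
		List<Map.Entry<String, Object>> stages = new ArrayList<Map.Entry<String, Object>>();
		stages.add(new AbstractMap.SimpleEntry<String, Object>("$match",
				Collections.singletonMap("state", "MN")));
		stages.add(new AbstractMap.SimpleEntry<String, Object>("$limit", 5));
		System.out.println(render(stages));
	}
}
```

The real implementation additionally threads an operation context through the stages so later stages can resolve fields the earlier stages exposed.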
Oliver Gierke
6b634d08ce DATAMONGO-586 - Yet another round of polish.
Added Apache license headers where missing. Removed separate package. Reduced visibility of ReferenceUtil as it is for internal use only. Fixed formatting.
2013-08-05 14:49:25 +02:00
Sebastian Herold
b7b61405f9 DATAMONGO-586 - Further enhancements to Aggregation API.
Fixed parameter names in comments. Added static factory method. Implemented basic aggregation operation join point. Implemented match operation. Extracted ReferenceUtil. Created starting point of $group operation with _id field definition and $addToSet fields.
2013-08-05 14:49:24 +02:00
Oliver Gierke
4d65aa7207 DATAMONGO-586 - First round of polish.
Fixed or added copyright headers where necessary. Added Tobias as author where necessary. Added @since tags to newly introduced classes and methods. Documented non-nullability of parameters. Polished test cases a bit.
2013-08-05 14:49:24 +02:00
Tobias Trelle
c129c706a3 DATAMONGO-586 - Initial commit for support of the aggregation framework.
Fluent interface for AggregationPipeline, tests. Added type-safe versions for aggregation operations $match and $sort. Not-null assertions + auto-prefix field in $unwind operation. Type-safe implementation for projections (first version). Support for $add and $subtract in projection.
2013-08-05 14:49:24 +02:00
Johno Crawford
7823385ac7 DATAMONGO-544 - Added support for TTL collection via indexing annotation.
Added expireAfterSeconds attribute to @Indexed and @CompoundIndex annotations. Adapted MongoPersistentEntityIndexCreator to evaluate the attribute and configure the index about to be created if the attribute was set to a non-default value.

Original pull request: #55.
2013-07-31 09:17:50 +02:00
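The index options document MongoDB expects for such a TTL index can be sketched with plain maps; `buildTtlIndexOptions` and the `-1` "not set" default are illustrative, not the Spring Data API:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class TtlIndexSketch {

	// MongoDB removes documents once the indexed date field is older than
	// expireAfterSeconds; a negative value here mirrors "attribute not set",
	// in which case no TTL option is added to the index document.
	static Map<String, Object> buildTtlIndexOptions(String name, int expireAfterSeconds) {
		Map<String, Object> options = new LinkedHashMap<String, Object>();
		options.put("name", name);
		if (expireAfterSeconds >= 0) {
			options.put("expireAfterSeconds", expireAfterSeconds);
		}
		return options;
	}
}
```

Only emitting the option when it was explicitly configured matches the "something non-default" check described in the commit.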
Oliver Gierke
21fcfe11c2 DATAMONGO-731 - Adapted API changes for Parameters in Spring Data Commons.
Adapted API changes introduced for DATACMNS-350.
2013-07-30 15:18:25 +02:00
Thomas Darimont
bfe33a446c DATAMONGO-728 - Added missing package-info files. 2013-07-23 16:32:38 +02:00
Thomas Darimont
9be50316c3 DATAMONGO-709 - Added support for restricting results by document types.
Added restrict(…) method to the Query API that generates appropriate filter criteria to restrict the result to certain types only. 

Type restrictions in query expressions are now applied in QueryMapper via a MongoTypeMapper based on information passed in through the query object in a "special" key. Exposed MongoTypeMapper in MongoConverter and MappingMongoConverter. Merged DefaultTypeMapper and DefaultMongoTypeMapper.

Original pull request: #53.
2013-07-23 15:14:09 +02:00
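The generated filter criteria can be sketched as a plain document: restricting results to certain types amounts to an `$in` filter on the type key (`_class` by default in Spring Data MongoDB); the helper name is illustrative:

```java
import java.util.Collections;
import java.util.List;
import java.util.Map;

public class TypeRestrictionSketch {

	// Builds {"_class": {"$in": [aliases...]}} so that only documents
	// written with one of the given type aliases match the query.
	static Map<String, Object> restrictedTo(List<String> typeAliases) {
		Map<String, Object> in = Collections.singletonMap("$in", (Object) typeAliases);
		return Collections.singletonMap("_class", (Object) in);
	}
}
```

In the actual implementation the type key is supplied by the configured MongoTypeMapper rather than hard-coded.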
Thomas Darimont
30513267af DATAMONGO-721 - Updates retain type information for collection properties.
Added test case to demonstrate that the fix for DATACMNS-345 will propagate into Spring Data MongoDB.

Original pull request: #52.
2013-07-16 13:39:03 +02:00
Oliver Gierke
d3d480e79b DATAMONGO-724 - Adapted test case to show type information written by converters gets regarded. 2013-07-16 11:51:50 +02:00
Oliver Gierke
c39ad1bbc4 DATAMONGO-724 - Added test case to show registered converters work.
Added test case that shows that if a custom converter doesn't write type information on its own, the managed type can't be used in polymorphic scenarios. Direct type mappings still work as expected.
2013-07-16 10:09:55 +02:00
Oliver Gierke
fcdc29df49 DATAMONGO-723 - Cleaned up a few misnamed test cases.
Reactivated test cases that were named Test instead of Tests by accident.
2013-07-12 18:44:13 +02:00
Oliver Gierke
de7120d8dd DATAMONGO-717 - Forward port of test case.
Added a test case to show that the lifecycle callbacks are invoked in the appropriate order.
2013-07-10 23:07:08 +02:00
Thomas Darimont
84df02ae38 DATAMONGO-685 - ServerInfo should return used hostname reported by MongoDB.
Added test case getHostNameShouldReturnServerNameReportedByMongo() to MongoMonitorIntegrationTests. Modified MongoMonitorIntegrationTests to use common mongo-infrastructure configuration. ServerInfo.getHostName() is now derived from serverStatus.serverUsed.

Original pull request: #51.
2013-07-10 14:30:51 +02:00
Thomas Darimont
d6c5907940 DATAMONGO-702 - Allow usage of property names in field specifications.
MongoTemplate now translates property names used in a Query's field specification into the corresponding field names. Refactored delegation in various doFind(…) methods and polished JavaDoc.

Original pull request: #50.
2013-07-09 11:26:20 +02:00
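The translation described above can be sketched as a simple lookup from property names to mapped field names; the helper and the mapping table are illustrative, not the actual QueryMapper internals:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class FieldNameSketch {

	// Rewrites a field specification like {"firstName": 1} into the
	// mapped form {"fname": 1}, falling back to the original key when
	// no property-to-field mapping is known.
	static Map<String, Object> mapFields(Map<String, Object> fieldSpec,
			Map<String, String> propertyToField) {
		Map<String, Object> mapped = new LinkedHashMap<String, Object>();
		for (Map.Entry<String, Object> entry : fieldSpec.entrySet()) {
			String field = propertyToField.get(entry.getKey());
			mapped.put(field != null ? field : entry.getKey(), entry.getValue());
		}
		return mapped;
	}
}
```

In the real code the property-to-field mapping is derived from the entity's mapping metadata rather than passed in as a map.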
Thomas Darimont
b2fe54c0a1 DATAMONGO-392 - Updates now retain type information.
Extracted mongo type conversion in QueryMapper into delegateConvertToMongoType(…). Introduced QueryMapper subclass UpdateQueryMapper to retain type information in delegateConvertToMongoType(…). Added test case updatesShouldRetainTypeInformation to MongoTemplateTests.

Original pull request: #49.
2013-07-08 15:07:10 +02:00
Oliver Gierke
47a198c688 DATAMONGO-540 - Fixed test case.
Apparently the test case was not working on MongoDB instances in version 2.2.
2013-07-08 14:16:07 +02:00
Thomas Darimont
5d9dbda03b DATAMONGO-688 - Improve detection of id properties.
Added support for precedence of explicit id property mapping over implicit property mappings. Changed BasicMongoPersistentProperty.getFieldName() to return the mongo _id field name only for the "effective" id property, considering the owner entity (if already set). Added some test cases for all possible cases to MongoMappingContextUnitTests.

Original pull request: #48.
2013-07-08 14:16:00 +02:00
Thomas Darimont
36d52862bc DATAMONGO-671 - Improve test case to not fail on fast machines. 2013-07-05 11:57:15 +02:00
Thomas Darimont
0afbf6fe19 DATAMONGO-540 - Added test case to show findOne(…) works after upsert. 2013-07-05 11:55:37 +02:00
Oliver Gierke
b0bf8cb718 DATAMONGO-714 - Updated formatter to latest changes. 2013-07-05 11:51:20 +02:00
Thomas Darimont
567a8d9d5b DATAMONGO-713 - Fixed some spelling errors in README.md. 2013-07-05 10:29:35 +02:00
Oliver Gierke
ceef18d7a4 DATAMONGO-671 - Added integration tests to show lookups by date are working. 2013-07-03 17:48:04 +02:00
Thomas Darimont
4f57712f12 DATAMONGO-693 - More robust handling of host and replica set config in MongoFactoryBean.
MongoFactoryBean now considers empty strings for the replicaPair property as not set at all. The ServerAddressPropertyEditor also returns null as value for empty text strings. Deprecated the setter for the replica pair on MongoFactoryBean.
2013-07-02 16:31:02 +02:00
Oliver Gierke
478396c503 DATAMONGO-675 - MongoTemplate now maps Update in findAndModify(…).
The Update object handed to ….findAndModify(…) is now handed through the QueryMapper before being executed.
2013-07-01 08:28:40 +02:00
Oliver Gierke
aa80d1ad0a DATAMONGO-706 - Fixed DBRef conversion for nested keywords. 2013-06-29 13:07:34 +02:00
Oliver Gierke
fd28ab4d33 DATAMONGO-705 - Fixed handling of DBRefs in QueryMapper.
QueryMapper now converts values to become DBRefs correctly in getMappedKeyword(…). Added an exclusion path for the value handling in case we have an $exists keyword.
2013-06-25 19:25:13 +02:00
Andrew Duncan
187c80dfcc DATAMONGO-701 - Improve performance of starts-with and ends-with queries.
This changes the starts-with regex to the prefixed form using ^ so that any index on the queried field can be used. Also changes ends-with queries to use the $ anchor.
2013-06-25 17:56:07 +02:00
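The effect of anchoring can be shown with plain java.util.regex: a pattern anchored with ^ is a prefix match (which MongoDB can serve from an index on the field), while $ anchors a suffix match:

```java
import java.util.regex.Pattern;

public class AnchoredRegexSketch {

	// starts-with: ^ anchors the pattern to the beginning of the value
	static boolean startsWith(String value, String prefix) {
		return Pattern.compile("^" + Pattern.quote(prefix)).matcher(value).find();
	}

	// ends-with: $ anchors the pattern to the end of the value
	static boolean endsWith(String value, String suffix) {
		return Pattern.compile(Pattern.quote(suffix) + "$").matcher(value).find();
	}

	public static void main(String[] args) {
		System.out.println(startsWith("springdata", "spring")); // true
		System.out.println(endsWith("springdata", "data"));     // true
	}
}
```

Pattern.quote escapes the user-supplied fragment so that regex metacharacters in it are matched literally.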
Ivan Sopov
389a3ac066 DATAMONGO-704 - Removed references to SimpleMongoConverter from JavaDoc.
In commit 2832b524d3, MappingMongoConverter was made the default converter instead of SimpleMongoConverter, and SimpleMongoConverter was removed entirely between the 1.0.0.M3 and 1.0.0.M4 releases. This updates the JavaDocs that still referenced SimpleMongoConverter as the default MongoConverter.
2013-06-25 17:31:32 +02:00
Oliver Gierke
297bd3e3dd DATAMONGO-690 - Upgraded to Spring Data Commons snapshots. 2013-06-24 13:39:13 +02:00
Oliver Gierke
b11fba3321 DATAMONGO-694 - Added test case to show overriding accessors works. 2013-06-14 12:44:24 +02:00
Spring Buildmaster
3c68671d86 DATAMONGO-690 - Prepare next development iteration. 2013-06-04 11:37:17 -07:00
138 changed files with 36999 additions and 585 deletions

View File

@@ -6,7 +6,7 @@ The Spring Data MongoDB project aims to provide a familiar and consistent Spring
## Getting Help
For a comprehensive treatmet of all the Spring Data MongoDB features, please refer to:
For a comprehensive treatment of all the Spring Data MongoDB features, please refer to:
* the [User Guide](http://static.springsource.org/spring-data/data-mongodb/docs/current/reference/html/)
* the [JavaDocs](http://static.springsource.org/spring-data/data-mongodb/docs/current/api/) have extensive comments in them as well.
@@ -26,7 +26,7 @@ Add the Maven dependency:
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.2.1.RELEASE</version>
<version>1.2.3.RELEASE</version>
</dependency>
```
@@ -36,7 +36,7 @@ If you'd rather like the latest snapshots of the upcoming major version, use our
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.2.1.RELEASE</version>
<version>1.3.0.BUILD-SNAPSHOT</version>
</dependency>
<repository>
@@ -57,7 +57,7 @@ MongoTemplate is the central support class for Mongo database operations. It pro
### Spring Data repositories
To simplify the creation of data repositories Sprin Data MongoDB provides a generic repository programming model. It will automatically create a repository proxy for you that adds implementations of finder methods you specify on an interface.
To simplify the creation of data repositories Spring Data MongoDB provides a generic repository programming model. It will automatically create a repository proxy for you that adds implementations of finder methods you specify on an interface.
For example, given a `Person` class with first and last name properties, a `PersonRepository` interface that can query for `Person` by last name and when the first name matches a like expression is shown below:
@@ -70,7 +70,7 @@ public interface PersonRepository extends CrudRepository<Person, Long> {
}
```
The queries issued on execution will be derived from the method name. Exending `CrudRepository` causes CRUD methods being pulled into the interface so that you can easily save and find single entities and collections of them.
The queries issued on execution will be derived from the method name. Extending `CrudRepository` causes CRUD methods being pulled into the interface so that you can easily save and find single entities and collections of them.
You can have Spring automatically create a proxy for the interface by using the following JavaConfig:

View File

@@ -73,7 +73,7 @@
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_enum_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_angle_bracket_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.clear_blank_lines_in_block_comment" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_type_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.lineSplit" value="120"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_if" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_brackets_in_array_type_reference" value="do not insert"/>
@@ -102,7 +102,7 @@
<setting id="org.eclipse.jdt.core.formatter.alignment_for_parameters_in_constructor_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_labeled_statement" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_annotation_type_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_method_body" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_method_body" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_method_declaration" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_method_invocation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_try" value="do not insert"/>
@@ -124,7 +124,7 @@
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_parenthesized_expression_in_throw" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.compiler.problem.enumIdentifier" value="error"/>
<setting id="org.eclipse.jdt.core.formatter.indent_switchstatements_compare_to_switch" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.indent_switchstatements_compare_to_switch" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_ellipsis" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_block" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_for_inits" value="do not insert"/>
@@ -135,9 +135,9 @@
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_for_increments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.format_line_comment_starting_on_first_column" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_bracket_in_array_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_field" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_enum_constant" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_field" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.indent_root_tags" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_enum_constant" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_enum_declarations" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_union_type_in_multicatch" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_explicitconstructorcall_arguments" value="insert"/>
@@ -150,12 +150,12 @@
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_opening_brace_in_array_initializer" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_brace_in_block" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_bracket_in_array_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_constant" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_angle_bracket_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_constructor_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_constructor_declaration_throws" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_if" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.clear_blank_lines_in_javadoc_comment" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.comment.clear_blank_lines_in_javadoc_comment" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_throws_clause_in_constructor_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_assignment_operator" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_assignment_operator" value="insert"/>
@@ -182,11 +182,11 @@
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_angle_bracket_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_anonymous_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.keep_empty_array_initializer_on_one_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.keep_imple_if_on_one_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_constructor_declaration_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_angle_bracket_in_type_parameters" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_at_end_of_file_if_missing" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_at_end_of_file_if_missing" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_for" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_labeled_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_parameterized_type_reference" value="do not insert"/>
@@ -237,7 +237,7 @@
<setting id="org.eclipse.jdt.core.formatter.alignment_for_throws_clause_in_method_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_angle_bracket_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_bracket_in_array_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_anonymous_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_anonymous_type_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_conditional" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_angle_bracket_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_for" value="do not insert"/>
@@ -251,12 +251,12 @@
<setting id="org.eclipse.jdt.core.compiler.codegen.targetPlatform" value="1.7"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_resources_in_try" value="80"/>
<setting id="org.eclipse.jdt.core.formatter.use_tabs_only_for_leading_indentations" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_annotation" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_annotation" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_header" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_block_comments" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_enum_constants" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_block" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_block" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_annotation_declaration_header" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_parenthesized_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_parenthesized_expression" value="do not insert"/>

pom.xml
View File

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.3.0.M1</version>
<version>1.3.0.RELEASE</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>1.1.0.RELEASE</version>
<version>1.2.0.RELEASE</version>
<relativePath>../spring-data-build/parent/pom.xml</relativePath>
</parent>
@@ -29,7 +29,7 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.6.0.M1</springdata.commons>
<springdata.commons>1.6.0.RELEASE</springdata.commons>
<mongo>2.10.1</mongo>
</properties>
@@ -91,8 +91,8 @@
<repositories>
<repository>
<id>spring-lib-milestone</id>
<url>http://repo.springsource.org/libs-milestone-local</url>
<id>spring-lib-release</id>
<url>http://repo.springsource.org/libs-release-local</url>
</repository>
</repositories>

View File

@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.3.0.M1</version>
<version>1.3.0.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -52,7 +52,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.3.0.M1</version>
<version>1.3.0.RELEASE</version>
</dependency>
<dependency>

View File

@@ -0,0 +1,5 @@
/**
* Infrastructure for Spring Data's MongoDB cross store support.
*/
package org.springframework.data.mongodb.crossstore;

View File

@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.3.0.M1</version>
<version>1.3.0.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -32,7 +32,7 @@ An example log entry might look like:
{
"_id" : ObjectId("4d89341a8ef397e06940d5cd"),
"applicationId" : "my.application",
"name" : "org.springframework.data.mongodb.log4j.AppenderTest",
"name" : "org.springframework.data.mongodb.log4j.MongoLog4jAppenderIntegrationTests",
"level" : "DEBUG",
"timestamp" : ISODate("2011-03-23T16:53:46.778Z"),
"properties" : {

View File

@@ -5,7 +5,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.3.0.M1</version>
<version>1.3.0.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -0,0 +1,5 @@
/**
* Infrastructure for to use MongoDB as a logging sink.
*/
package org.springframework.data.mongodb.log4j;

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,63 +13,65 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.log4j;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.net.UnknownHostException;
import java.util.Calendar;
import com.mongodb.DB;
import com.mongodb.DBCursor;
import com.mongodb.Mongo;
import org.apache.log4j.Logger;
import org.apache.log4j.MDC;
import org.junit.Before;
import org.junit.Test;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
*/
public class AppenderTest {
import com.mongodb.DB;
import com.mongodb.DBCursor;
import com.mongodb.Mongo;
private static final String NAME = AppenderTest.class.getName();
private Logger log = Logger.getLogger(NAME);
private Mongo mongo;
private DB db;
private String collection;
/**
* Integration tests for {@link MongoLog4jAppender}.
*
* @author Jon Brisbin
* @author Oliver Gierke
*/
public class MongoLog4jAppenderIntegrationTests {
static final String NAME = MongoLog4jAppenderIntegrationTests.class.getName();
Logger log = Logger.getLogger(NAME);
Mongo mongo;
DB db;
String collection;
@Before
public void setup() {
try {
mongo = new Mongo("localhost", 27017);
db = mongo.getDB("logs");
Calendar now = Calendar.getInstance();
collection = String.valueOf(now.get(Calendar.YEAR)) + String.format("%1$02d", now.get(Calendar.MONTH) + 1);
db.getCollection(collection).drop();
} catch (UnknownHostException e) {
throw new RuntimeException(e.getMessage(), e);
}
public void setUp() throws Exception {
mongo = new Mongo("localhost", 27017);
db = mongo.getDB("logs");
Calendar now = Calendar.getInstance();
collection = String.valueOf(now.get(Calendar.YEAR)) + String.format("%1$02d", now.get(Calendar.MONTH) + 1);
db.getCollection(collection).drop();
}
@Test
public void testLogging() {
log.debug("DEBUG message");
log.info("INFO message");
log.warn("WARN message");
log.error("ERROR message");
DBCursor msgs = db.getCollection(collection).find();
assertThat(msgs.count(), is(4));
}
@Test
public void testProperties() {
MDC.put("property", "one");
log.debug("DEBUG message");
}
}
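The per-month collection name built in setUp (the year followed by the zero-padded, 1-based month) can be illustrated standalone. `CollectionNameDemo` below is a hypothetical sketch that mirrors that expression, not part of the commit:

```java
import java.util.Calendar;

public class CollectionNameDemo {

    // Builds the appender's per-month collection name, e.g. "201309" for
    // September 2013: the year followed by the zero-padded 1-based month.
    static String collectionNameFor(Calendar calendar) {
        return String.valueOf(calendar.get(Calendar.YEAR))
                + String.format("%1$02d", calendar.get(Calendar.MONTH) + 1);
    }

    public static void main(String[] args) {
        Calendar september2013 = Calendar.getInstance();
        september2013.set(2013, Calendar.SEPTEMBER, 9);
        System.out.println(collectionNameFor(september2013)); // prints "201309"
    }
}
```

Note that `Calendar.MONTH` is zero-based, hence the `+ 1` before padding.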

View File

@@ -10,11 +10,4 @@ log4j.appender.stdout.collectionPattern = %X{year}%X{month}
log4j.appender.stdout.applicationId = my.application
log4j.appender.stdout.warnOrHigherWriteConcern = FSYNC_SAFE
log4j.category.org.apache.activemq=ERROR
log4j.category.org.springframework.batch=DEBUG
log4j.category.org.springframework.data.document.mongodb=DEBUG
log4j.category.org.springframework.transaction=INFO
log4j.category.org.hibernate.SQL=DEBUG
# for debugging datasource initialization
# log4j.category.test.jdbc=DEBUG
log4j.category.org.springframework.data.mongodb=DEBUG

View File

@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.3.0.M1</version>
<version>1.3.0.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -70,6 +70,7 @@ import org.w3c.dom.Element;
* @author Jon Brisbin
* @author Oliver Gierke
* @author Maciej Walkowiak
* @author Thomas Darimont
*/
public class MappingMongoConverterParser implements BeanDefinitionParser {
@@ -105,6 +106,11 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
converterBuilder.addConstructorArgReference(dbFactoryRef);
converterBuilder.addConstructorArgReference(ctxRef);
String typeMapperRef = element.getAttribute("type-mapper-ref");
if (StringUtils.hasText(typeMapperRef)) {
converterBuilder.addPropertyReference("typeMapper", typeMapperRef);
}
if (conversionsDefinition != null) {
converterBuilder.addPropertyValue("customConversions", conversionsDefinition);
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -31,6 +31,7 @@ import com.mongodb.ServerAddress;
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class ServerAddressPropertyEditor extends PropertyEditorSupport {
@@ -43,6 +44,11 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
@Override
public void setAsText(String replicaSetString) {
if (!StringUtils.hasText(replicaSetString)) {
setValue(null);
return;
}
String[] replicaSetStringArray = StringUtils.commaDelimitedListToStringArray(replicaSetString);
Set<ServerAddress> serverAddresses = new HashSet<ServerAddress>(replicaSetStringArray.length);
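As a rough sketch of what setAsText does with the comma-delimited replica set string: each entry is split into a host/port pair. The real editor uses Spring's StringUtils and builds `ServerAddress` instances with error handling; the `parse` helper below is a simplified, hypothetical stand-in:

```java
import java.util.ArrayList;
import java.util.List;

public class ReplicaSetParseDemo {

    // Simplified version of the editor's parsing: "host1:27017,host2:27018"
    // becomes a list of [host, port] pairs. Malformed-entry handling omitted.
    static List<String[]> parse(String replicaSetString) {
        List<String[]> addresses = new ArrayList<String[]>();
        if (replicaSetString == null || replicaSetString.trim().isEmpty()) {
            return addresses; // the editor sets a null value in this case
        }
        for (String entry : replicaSetString.split(",")) {
            addresses.add(entry.trim().split(":"));
        }
        return addresses;
    }

    public static void main(String[] args) {
        List<String[]> parsed = parse("127.0.0.1:27017, localhost:27018");
        System.out.println(parsed.size()); // prints 2
    }
}
```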

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,14 +26,13 @@ import com.mongodb.DB;
import com.mongodb.Mongo;
/**
* Helper class featuring helper methods for internal MongoDb classes.
* <p/>
* <p>
* Mainly intended for internal use within the framework.
* Helper class featuring helper methods for internal MongoDb classes. Mainly intended for internal use within the
* framework.
*
* @author Thomas Risberg
* @author Graeme Rocher
* @author Oliver Gierke
* @author Randy Watler
* @since 1.0
*/
public abstract class MongoDbUtils {
@@ -131,8 +130,11 @@ public abstract class MongoDbUtils {
holderToUse.addDB(databaseName, db);
}
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(holderToUse, mongo));
holderToUse.setSynchronizedWithTransaction(true);
// synchronize holder only if not yet synchronized
if (!holderToUse.isSynchronizedWithTransaction()) {
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(holderToUse, mongo));
holderToUse.setSynchronizedWithTransaction(true);
}
if (holderToUse != dbHolder) {
TransactionSynchronizationManager.bindResource(mongo, holderToUse);
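The intent of the DATAMONGO-737 guard above — register at most one synchronization per holder and transaction — can be sketched without Spring's TransactionSynchronizationManager. `Holder` and `registerIfNecessary` are hypothetical stand-ins for the real holder and registration call:

```java
public class RegisterOnceDemo {

    // Hypothetical stand-in for the DbHolder used by MongoDbUtils.
    static class Holder {
        private boolean synchronizedWithTransaction;

        boolean isSynchronizedWithTransaction() {
            return synchronizedWithTransaction;
        }

        void setSynchronizedWithTransaction(boolean flag) {
            this.synchronizedWithTransaction = flag;
        }
    }

    // Mirrors the fix: only register (returns true) if the holder is not yet
    // synchronized, so repeated getDB() calls within one transaction end up
    // with a single registered synchronization.
    static boolean registerIfNecessary(Holder holder) {
        if (!holder.isSynchronizedWithTransaction()) {
            holder.setSynchronizedWithTransaction(true); // register here
            return true;
        }
        return false;
    }

    public static void main(String[] args) {
        Holder holder = new Holder();
        System.out.println(registerIfNecessary(holder)); // prints true
        System.out.println(registerIfNecessary(holder)); // prints false
    }
}
```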

View File

@@ -15,7 +15,9 @@
*/
package org.springframework.data.mongodb.core;
import java.util.Arrays;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
import org.springframework.beans.factory.DisposableBean;
@@ -24,6 +26,7 @@ import org.springframework.beans.factory.InitializingBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoOptions;
@@ -36,6 +39,7 @@ import com.mongodb.WriteConcern;
* @author Thomas Risberg
* @author Graeme Rocher
* @author Oliver Gierke
* @author Thomas Darimont
* @since 1.0
*/
public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, DisposableBean,
@@ -57,11 +61,38 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
}
public void setReplicaSetSeeds(ServerAddress[] replicaSetSeeds) {
this.replicaSetSeeds = Arrays.asList(replicaSetSeeds);
this.replicaSetSeeds = filterNonNullElementsAsList(replicaSetSeeds);
}
/**
* @deprecated use {@link #setReplicaSetSeeds(ServerAddress[])} instead
*
* @param replicaPair
*/
@Deprecated
public void setReplicaPair(ServerAddress[] replicaPair) {
this.replicaPair = Arrays.asList(replicaPair);
this.replicaPair = filterNonNullElementsAsList(replicaPair);
}
/**
* @param elements the elements to filter
* @return a new unmodifiable {@link List} from the given elements without nulls
*/
private <T> List<T> filterNonNullElementsAsList(T[] elements) {
if (elements == null) {
return Collections.emptyList();
}
List<T> candidateElements = new ArrayList<T>();
for (T element : elements) {
if (element != null) {
candidateElements.add(element);
}
}
return Collections.unmodifiableList(candidateElements);
}
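Standalone, the null-filtering helper above behaves like the following sketch: a null array yields an empty list, null elements are dropped, and the result is unmodifiable.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class FilterDemo {

    // Same semantics as MongoFactoryBean's filterNonNullElementsAsList.
    static <T> List<T> filterNonNullElementsAsList(T[] elements) {
        if (elements == null) {
            return Collections.emptyList(); // null array -> empty list
        }
        List<T> candidateElements = new ArrayList<T>();
        for (T element : elements) {
            if (element != null) {
                candidateElements.add(element); // drop null entries
            }
        }
        return Collections.unmodifiableList(candidateElements);
    }

    public static void main(String[] args) {
        System.out.println(filterNonNullElementsAsList(new String[] { "a", null, "b" })); // prints [a, b]
    }
}
```

This is why the setters can accept arrays containing nulls (e.g. from lenient XML configuration) without later tripping the replica checks.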
public void setHost(String host) {
@@ -127,15 +158,15 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
mongoOptions = new MongoOptions();
}
if (replicaPair != null) {
if (!isNullOrEmpty(replicaPair)) {
if (replicaPair.size() < 2) {
throw new CannotGetMongoDbConnectionException("A replica pair must have two server entries");
}
mongo = new Mongo(replicaPair.get(0), replicaPair.get(1), mongoOptions);
} else if (replicaSetSeeds != null) {
} else if (!isNullOrEmpty(replicaSetSeeds)) {
mongo = new Mongo(replicaSetSeeds, mongoOptions);
} else {
String mongoHost = host != null ? host : defaultOptions.getHost();
String mongoHost = StringUtils.hasText(host) ? host : defaultOptions.getHost();
mongo = port != null ? new Mongo(new ServerAddress(mongoHost, port), mongoOptions) : new Mongo(mongoHost,
mongoOptions);
}
@@ -147,6 +178,10 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
this.mongo = mongo;
}
private boolean isNullOrEmpty(Collection<?> elements) {
return elements == null || elements.isEmpty();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.DisposableBean#destroy()

View File

@@ -19,6 +19,9 @@ import java.util.Collection;
import java.util.List;
import java.util.Set;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
@@ -45,6 +48,8 @@ import com.mongodb.WriteResult;
* @author Thomas Risberg
* @author Mark Pollack
* @author Oliver Gierke
* @author Tobias Trelle
* @author Chuong Ngo
*/
public interface MongoOperations {
@@ -247,7 +252,7 @@ public interface MongoOperations {
* Query for a list of objects of type T from the collection used by the entity class.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
@@ -261,7 +266,7 @@ public interface MongoOperations {
* Query for a list of objects of type T from the specified collection.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
@@ -301,6 +306,57 @@ public interface MongoOperations {
*/
<T> GroupByResults<T> group(Criteria criteria, String inputCollectionName, GroupBy groupBy, Class<T> entityClass);
/**
* Execute an aggregation operation. The raw results will be mapped to the given entity class and read from the
* explicitly given input collection.
*
* @param aggregation The {@link TypedAggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param collectionName The name of the input collection to use for the aggregation.
* @param outputType The parameterized type of the returned list, must not be {@literal null}.
* @return The results of the aggregation operation.
* @since 1.3
*/
<O> AggregationResults<O> aggregate(TypedAggregation<?> aggregation, String collectionName, Class<O> outputType);
/**
* Execute an aggregation operation. The raw results will be mapped to the given entity class. The name of the
* inputCollection is derived from the inputType of the aggregation.
*
* @param aggregation The {@link TypedAggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param outputType The parameterized type of the returned list, must not be {@literal null}.
* @return The results of the aggregation operation.
* @since 1.3
*/
<O> AggregationResults<O> aggregate(TypedAggregation<?> aggregation, Class<O> outputType);
/**
* Execute an aggregation operation. The raw results will be mapped to the given entity class.
*
* @param aggregation The {@link Aggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param inputType the inputType where the aggregation operation will read from, must not be {@literal null} or
* empty.
* @param outputType The parameterized type of the returned list, must not be {@literal null}.
* @return The results of the aggregation operation.
* @since 1.3
*/
<O> AggregationResults<O> aggregate(Aggregation aggregation, Class<?> inputType, Class<O> outputType);
/**
* Execute an aggregation operation. The raw results will be mapped to the given entity class.
*
* @param aggregation The {@link Aggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param collectionName the collection where the aggregation operation will read from, must not be {@literal null} or
* empty.
* @param outputType The parameterized type of the returned list, must not be {@literal null}.
* @return The results of the aggregation operation.
* @since 1.3
*/
<O> AggregationResults<O> aggregate(Aggregation aggregation, String collectionName, Class<O> outputType);
/**
* Execute a map-reduce operation. The map-reduce operation will be formed with an output type of INLINE
*
@@ -382,7 +438,7 @@ public interface MongoOperations {
* specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
@@ -399,7 +455,7 @@ public interface MongoOperations {
* type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
@@ -422,7 +478,7 @@ public interface MongoOperations {
* Map the results of an ad-hoc query on the collection for the entity class to a List of the specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
@@ -438,7 +494,7 @@ public interface MongoOperations {
* Map the results of an ad-hoc query on the specified collection to a List of the specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
@@ -504,7 +560,7 @@ public interface MongoOperations {
* type. The first document that matches the query is returned and also removed from the collection in the database.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
@@ -557,7 +613,7 @@ public interface MongoOperations {
* Insert the object into the specified collection.
* <p/>
* The object is converted to the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* Insert is used to initially store the object into the database. To update an existing object use the save method.
*
@@ -595,7 +651,7 @@ public interface MongoOperations {
* object is not already present, that is an 'upsert'.
* <p/>
* The object is converted to the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
@@ -612,7 +668,7 @@ public interface MongoOperations {
* is an 'upsert'.
* <p/>
* The object is converted to the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
@@ -648,6 +704,18 @@ public interface MongoOperations {
*/
WriteResult upsert(Query query, Update update, String collectionName);
/**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document.
*
* @param query the query document that specifies the criteria used to select a record to be upserted
* @param update the update document that contains the updated object or $ operators to manipulate the existing object
* @param entityClass class of the pojo to be operated on
* @param collectionName name of the collection to update the object in
* @return the WriteResult which lets you access the results of the previous write.
*/
WriteResult upsert(Query query, Update update, Class<?> entityClass, String collectionName);
/**
* Updates the first object that is found in the collection of the entity class that matches the query document with
* the provided update document.
@@ -672,6 +740,19 @@ public interface MongoOperations {
*/
WriteResult updateFirst(Query query, Update update, String collectionName);
/**
* Updates the first object that is found in the specified collection that matches the query document criteria with
* the provided updated document.
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
* @param entityClass class of the pojo to be operated on
* @param collectionName name of the collection to update the object in
* @return the WriteResult which lets you access the results of the previous write.
*/
WriteResult updateFirst(Query query, Update update, Class<?> entityClass, String collectionName);
/**
* Updates all objects that are found in the collection for the entity class that matches the query document criteria
* with the provided updated document.
@@ -696,6 +777,19 @@ public interface MongoOperations {
*/
WriteResult updateMulti(Query query, Update update, String collectionName);
/**
* Updates all objects that are found in the collection for the entity class that matches the query document criteria
* with the provided updated document.
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
* @param entityClass class of the pojo to be operated on
* @param collectionName name of the collection to update the object in
* @return the WriteResult which lets you access the results of the previous write.
*/
WriteResult updateMulti(final Query query, final Update update, Class<?> entityClass, String collectionName);
/**
* Remove the given object from the collection by id.
*

View File

@@ -53,10 +53,17 @@ import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.BeanWrapper;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.Fields;
import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
@@ -113,6 +120,10 @@ import com.mongodb.util.JSONParseException;
* @author Oliver Gierke
* @author Amol Nayak
* @author Patryk Wasik
* @author Tobias Trelle
* @author Sebastian Herold
* @author Thomas Darimont
* @author Chuong Ngo
*/
public class MongoTemplate implements MongoOperations, ApplicationContextAware {
@@ -135,7 +146,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final MongoDbFactory mongoDbFactory;
private final MongoExceptionTranslator exceptionTranslator = new MongoExceptionTranslator();
private final QueryMapper mapper;
private final QueryMapper queryMapper;
private final UpdateMapper updateMapper;
private WriteConcern writeConcern;
private WriteConcernResolver writeConcernResolver = DefaultWriteConcernResolver.INSTANCE;
@@ -188,7 +200,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
this.mongoDbFactory = mongoDbFactory;
this.mongoConverter = mongoConverter == null ? getDefaultMongoConverter(mongoDbFactory) : mongoConverter;
this.mapper = new QueryMapper(this.mongoConverter);
this.queryMapper = new QueryMapper(this.mongoConverter);
this.updateMapper = new UpdateMapper(this.mongoConverter);
// We always have a mapping context in the converter, whether it's a simple one or not
mappingContext = this.mongoConverter.getMappingContext();
@@ -495,7 +508,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
throw new InvalidDataAccessApiUsageException("Query passed in to exist can't be null");
}
DBObject mappedQuery = mapper.getMappedObject(query.getQueryObject(), getPersistentEntity(entityClass));
DBObject mappedQuery = queryMapper.getMappedObject(query.getQueryObject(), getPersistentEntity(entityClass));
return execute(collectionName, new FindCallback(mappedQuery)).hasNext();
}
@@ -553,8 +566,26 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
mongoConverter, entityClass), near.getMetric());
List<GeoResult<T>> result = new ArrayList<GeoResult<T>>(results.size());
int index = 0;
int elementsToSkip = near.getSkip() != null ? near.getSkip() : 0;
for (Object element : results) {
result.add(callback.doWith((DBObject) element));
/*
* As MongoDB currently (2.4.4) doesn't support skipping elements in near queries,
* we skip the elements ourselves to at least avoid the document-to-object mapping overhead.
*
* @see https://jira.mongodb.org/browse/SERVER-3925
*/
if (index >= elementsToSkip) {
result.add(callback.doWith((DBObject) element));
}
index++;
}
if (elementsToSkip > 0) {
// as we skipped some elements we have to calculate the averageDistance ourselves:
return new GeoResults<T>(result, near.getMetric());
}
DBObject stats = (DBObject) commandResult.get("stats");
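The client-side skipping applied in the loop above can be shown in isolation. `skip` below is a hypothetical helper over a plain list: it drops the first `elementsToSkip` raw results before any mapping happens, which is exactly what the loop does with the raw DBObject elements:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ClientSideSkipDemo {

    // Since MongoDB 2.4.x ignores skip for geoNear, the first elementsToSkip
    // raw results are dropped client-side before being kept/mapped.
    static <T> List<T> skip(List<T> rawResults, int elementsToSkip) {
        List<T> result = new ArrayList<T>();
        int index = 0;
        for (T element : rawResults) {
            if (index >= elementsToSkip) {
                result.add(element); // only elements past the skip offset survive
            }
            index++;
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(skip(Arrays.asList("a", "b", "c", "d"), 2)); // prints [c, d]
    }
}
```

In the template, skipping also means the server-reported averageDistance no longer applies, which is why the skipped case recomputes the `GeoResults` without the server stats.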
@@ -604,7 +635,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
private long count(Query query, Class<?> entityClass, String collectionName) {
Assert.hasText(collectionName);
final DBObject dbObject = query == null ? null : mapper.getMappedObject(query.getQueryObject(),
final DBObject dbObject = query == null ? null : queryMapper.getMappedObject(query.getQueryObject(),
entityClass == null ? null : mappingContext.getPersistentEntity(entityClass));
return execute(collectionName, new CollectionCallback<Long>() {
@@ -916,6 +947,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return doUpdate(collectionName, query, update, null, true, false);
}
public WriteResult upsert(Query query, Update update, Class<?> entityClass, String collectionName) {
return doUpdate(collectionName, query, update, entityClass, true, false);
}
public WriteResult updateFirst(Query query, Update update, Class<?> entityClass) {
return doUpdate(determineCollectionName(entityClass), query, update, entityClass, false, false);
}
@@ -924,6 +959,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return doUpdate(collectionName, query, update, null, false, false);
}
public WriteResult updateFirst(Query query, Update update, Class<?> entityClass, String collectionName) {
return doUpdate(collectionName, query, update, entityClass, false, false);
}
public WriteResult updateMulti(Query query, Update update, Class<?> entityClass) {
return doUpdate(determineCollectionName(entityClass), query, update, entityClass, false, true);
}
@@ -932,6 +971,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return doUpdate(collectionName, query, update, null, false, true);
}
public WriteResult updateMulti(final Query query, final Update update, Class<?> entityClass, String collectionName) {
return doUpdate(collectionName, query, update, entityClass, false, true);
}
protected WriteResult doUpdate(final String collectionName, final Query query, final Update update,
final Class<?> entityClass, final boolean upsert, final boolean multi) {
@@ -940,10 +983,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoPersistentEntity<?> entity = entityClass == null ? null : getPersistentEntity(entityClass);
DBObject queryObj = query == null ? new BasicDBObject()
: mapper.getMappedObject(query.getQueryObject(), entity);
DBObject updateObj = update == null ? new BasicDBObject() : mapper.getMappedObject(update.getUpdateObject(),
DBObject queryObj = query == null ? new BasicDBObject() : queryMapper.getMappedObject(query.getQueryObject(),
entity);
DBObject updateObj = update == null ? new BasicDBObject() : updateMapper.getMappedObject(
update.getUpdateObject(), entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Calling update using query: " + queryObj + " and update: " + updateObj + " in collection: "
@@ -1061,7 +1104,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
maybeEmitEvent(new BeforeDeleteEvent<T>(queryObject, entityClass));
DBObject dboq = mapper.getMappedObject(queryObject, entity);
DBObject dboq = queryMapper.getMappedObject(queryObject, entity);
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.REMOVE, collectionName,
entityClass, null, queryObject);
@@ -1156,7 +1199,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
if (criteria == null) {
dbo.put("cond", null);
} else {
dbo.put("cond", mapper.getMappedObject(criteria.getCriteriaObject(), null));
dbo.put("cond", queryMapper.getMappedObject(criteria.getCriteriaObject(), null));
}
// If initial document was a JavaScript string, potentially loaded by Spring's Resource abstraction, load it and
// convert to DBObject
@@ -1205,6 +1248,64 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
@Override
public <O> AggregationResults<O> aggregate(TypedAggregation<?> aggregation, Class<O> outputType) {
return aggregate(aggregation, determineCollectionName(aggregation.getInputType()), outputType);
}
@Override
public <O> AggregationResults<O> aggregate(TypedAggregation<?> aggregation, String inputCollectionName,
Class<O> outputType) {
Assert.notNull(aggregation, "Aggregation pipeline must not be null!");
AggregationOperationContext context = new TypeBasedAggregationOperationContext(aggregation.getInputType(),
mappingContext, queryMapper);
return aggregate(aggregation, inputCollectionName, outputType, context);
}
@Override
public <O> AggregationResults<O> aggregate(Aggregation aggregation, Class<?> inputType, Class<O> outputType) {
return aggregate(aggregation, determineCollectionName(inputType), outputType,
new TypeBasedAggregationOperationContext(inputType, mappingContext, queryMapper));
}
@Override
public <O> AggregationResults<O> aggregate(Aggregation aggregation, String collectionName, Class<O> outputType) {
return aggregate(aggregation, collectionName, outputType, null);
}
protected <O> AggregationResults<O> aggregate(Aggregation aggregation, String collectionName, Class<O> outputType,
AggregationOperationContext context) {
Assert.hasText(collectionName, "Collection name must not be null or empty!");
Assert.notNull(aggregation, "Aggregation pipeline must not be null!");
Assert.notNull(outputType, "Output type must not be null!");
AggregationOperationContext rootContext = context == null ? Aggregation.DEFAULT_CONTEXT : context;
DBObject command = aggregation.toDbObject(collectionName, rootContext);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Executing aggregation: {}", serializeToJsonSafely(command));
}
CommandResult commandResult = executeCommand(command);
handleCommandError(commandResult, command);
// map results
@SuppressWarnings("unchecked")
Iterable<DBObject> resultSet = (Iterable<DBObject>) commandResult.get("result");
List<O> mappedResults = new ArrayList<O>();
DbObjectCallback<O> callback = new UnwrapAndReadDbObjectCallback<O>(mongoConverter, outputType);
for (DBObject dbObject : resultSet) {
mappedResults.add(callback.doWith(dbObject));
}
return new AggregationResults<O>(mappedResults, commandResult);
}
protected String replaceWithResourceIfNecessary(String function) {
String func = function;
@@ -1313,57 +1414,28 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
/**
* Map the results of an ad-hoc query on the default MongoDB collection to an object using the template's converter
* <p/>
* The query document is specified as a standard DBObject and so is the fields specification.
* Map the results of an ad-hoc query on the default MongoDB collection to an object using the template's converter.
* The query document is specified as a standard {@link DBObject} and so is the fields specification.
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query document that specifies the criteria used to find a record
* @param fields the document that specifies the fields to be returned
* @param collectionName name of the collection to retrieve the objects from.
* @param query the query document that specifies the criteria used to find a record.
* @param fields the document that specifies the fields to be returned.
* @param entityClass the parameterized type of the returned list.
* @return the List of converted objects.
* @return the {@link List} of converted objects.
*/
protected <T> T doFindOne(String collectionName, DBObject query, DBObject fields, Class<T> entityClass) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
DBObject mappedQuery = queryMapper.getMappedObject(query, entity);
DBObject mappedFields = fields == null ? null : queryMapper.getMappedObject(fields, entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug(String.format("findOne using query: %s fields: %s for class: %s in collection: %s",
serializeToJsonSafely(query), mappedFields, entityClass, collectionName));
}
return executeFindOneInternal(new FindOneCallback(mappedQuery, mappedFields), new ReadDbObjectCallback<T>(
this.mongoConverter, entityClass), collectionName);
}
/**
@@ -1377,14 +1449,43 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* @return the List of converted objects.
*/
protected <T> List<T> doFind(String collectionName, DBObject query, DBObject fields, Class<T> entityClass) {
return doFind(collectionName, query, fields, entityClass, null, new ReadDbObjectCallback<T>(this.mongoConverter,
entityClass));
}
/**
* Map the results of an ad-hoc query on the default MongoDB collection to a List of the specified type. The object is
* converted from the MongoDB native representation using an instance of {@link MongoConverter}. The query document is
* specified as a standard {@link DBObject} and so is the fields specification.
*
* @param collectionName name of the collection to retrieve the objects from.
* @param query the query document that specifies the criteria used to find a record.
* @param fields the document that specifies the fields to be returned.
* @param entityClass the parameterized type of the returned list.
* @param preparer allows for customization of the {@link DBCursor} used when iterating over the result set (apply
* limits, skips and so on).
* @return the {@link List} of converted objects.
*/
protected <T> List<T> doFind(String collectionName, DBObject query, DBObject fields, Class<T> entityClass,
CursorPreparer preparer) {
return doFind(collectionName, query, fields, entityClass, preparer, new ReadDbObjectCallback<T>(mongoConverter,
entityClass));
}
protected <S, T> List<T> doFind(String collectionName, DBObject query, DBObject fields, Class<S> entityClass,
CursorPreparer preparer, DbObjectCallback<T> objectCallback) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
DBObject mappedFields = fields == null ? null : queryMapper.getMappedObject(fields, entity);
DBObject mappedQuery = queryMapper.getMappedObject(query, entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug(String.format("find using query: %s fields: %s for class: %s in collection: %s",
serializeToJsonSafely(query), mappedFields, entityClass, collectionName));
}
return executeFindMultiInternal(new FindCallback(mappedQuery, mappedFields), preparer, objectCallback,
collectionName);
}
protected DBObject convertToDbObject(CollectionOptions collectionOptions) {
@@ -1422,7 +1523,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
+ entityClass + " in collection: " + collectionName);
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
return executeFindOneInternal(new FindAndRemoveCallback(queryMapper.getMappedObject(query, entity), fields, sort),
new ReadDbObjectCallback<T>(readerToUse, entityClass), collectionName);
}
@@ -1437,19 +1538,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
DBObject mappedUpdate = queryMapper.getMappedObject(update.getUpdateObject(), entity);
DBObject mappedQuery = queryMapper.getMappedObject(query, entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("findAndModify using query: " + mappedQuery + " fields: " + fields + " sort: " + sort
+ " for class: " + entityClass + " and update: " + mappedUpdate + " in collection: " + collectionName);
}
return executeFindOneInternal(new FindAndModifyCallback(mappedQuery, fields, sort, mappedUpdate, options),
new ReadDbObjectCallback<T>(readerToUse, entityClass), collectionName);
}
@@ -1874,6 +1971,35 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
}
class UnwrapAndReadDbObjectCallback<T> extends ReadDbObjectCallback<T> {
public UnwrapAndReadDbObjectCallback(EntityReader<? super T, DBObject> reader, Class<T> type) {
super(reader, type);
}
@Override
public T doWith(DBObject object) {
Object idField = object.get(Fields.UNDERSCORE_ID);
if (!(idField instanceof DBObject)) {
return super.doWith(object);
}
DBObject toMap = new BasicDBObject();
DBObject nested = (DBObject) idField;
toMap.putAll(nested);
for (String key : object.keySet()) {
if (!Fields.UNDERSCORE_ID.equals(key)) {
toMap.put(key, object.get(key));
}
}
return super.doWith(toMap);
}
}
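The `_id` unwrapping performed by `UnwrapAndReadDbObjectCallback` above can be sketched with plain `Map`s standing in for `DBObject` (a hedged illustration of the hoisting rule, not the shipped implementation; the class name is hypothetical):

```java
import java.util.LinkedHashMap;
import java.util.Map;

class UnwrapSketch {

	// Mirrors UnwrapAndReadDbObjectCallback.doWith: when the "_id" entry of a
	// group result holds a nested document, hoist its entries to the top level
	// so the converter can map them like regular properties; otherwise the
	// document is passed through unchanged.
	@SuppressWarnings("unchecked")
	static Map<String, Object> unwrap(Map<String, Object> object) {

		Object idField = object.get("_id");

		if (!(idField instanceof Map)) {
			return object;
		}

		Map<String, Object> result = new LinkedHashMap<String, Object>();
		result.putAll((Map<String, Object>) idField);

		for (Map.Entry<String, Object> entry : object.entrySet()) {
			if (!"_id".equals(entry.getKey())) {
				result.put(entry.getKey(), entry.getValue());
			}
		}

		return result;
	}
}
```

Note that top-level keys win over keys hoisted from `_id` only when they come later in iteration order; the shipped callback iterates the same way.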
private enum DefaultWriteConcernResolver implements WriteConcernResolver {
INSTANCE;


@@ -0,0 +1,285 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import static org.springframework.data.mongodb.core.aggregation.Fields.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* An {@code Aggregation} is a representation of a list of aggregation steps to be performed by the MongoDB Aggregation
* Framework.
*
* @author Tobias Trelle
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
*/
public class Aggregation {
public static final AggregationOperationContext DEFAULT_CONTEXT = new NoOpAggregationOperationContext();
private final List<AggregationOperation> operations;
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
* @param operations must not be {@literal null} or empty.
*/
public static Aggregation newAggregation(AggregationOperation... operations) {
return new Aggregation(operations);
}
/**
* Creates a new {@link TypedAggregation} for the given type and {@link AggregationOperation}s.
*
* @param type must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
*/
public static <T> TypedAggregation<T> newAggregation(Class<T> type, AggregationOperation... operations) {
return new TypedAggregation<T>(type, operations);
}
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
* @param aggregationOperations must not be {@literal null} or empty.
*/
protected Aggregation(AggregationOperation... aggregationOperations) {
Assert.notNull(aggregationOperations, "AggregationOperations must not be null!");
Assert.isTrue(aggregationOperations.length > 0, "At least one AggregationOperation has to be provided");
this.operations = Arrays.asList(aggregationOperations);
}
/**
* A pointer to the previous {@link AggregationOperation}.
*
* @return
*/
public static String previousOperation() {
return "_id";
}
/**
* Creates a new {@link ProjectionOperation} including the given fields.
*
* @param fields must not be {@literal null}.
* @return
*/
public static ProjectionOperation project(String... fields) {
return project(fields(fields));
}
/**
* Creates a new {@link ProjectionOperation} including the given {@link Fields}.
*
* @param fields must not be {@literal null}.
* @return
*/
public static ProjectionOperation project(Fields fields) {
return new ProjectionOperation(fields);
}
/**
* Factory method to create a new {@link UnwindOperation} for the field with the given name.
*
* @param field must not be {@literal null} or empty.
* @return
*/
public static UnwindOperation unwind(String field) {
return new UnwindOperation(field(field));
}
/**
* Creates a new {@link GroupOperation} for the given fields.
*
* @param fields must not be {@literal null}.
* @return
*/
public static GroupOperation group(String... fields) {
return group(fields(fields));
}
/**
* Creates a new {@link GroupOperation} for the given {@link Fields}.
*
* @param fields must not be {@literal null}.
* @return
*/
public static GroupOperation group(Fields fields) {
return new GroupOperation(fields);
}
/**
* Factory method to create a new {@link SortOperation} for the given {@link Sort}.
*
* @param sort must not be {@literal null}.
* @return
*/
public static SortOperation sort(Sort sort) {
return new SortOperation(sort);
}
/**
* Factory method to create a new {@link SortOperation} for the given sort {@link Direction} and {@code fields}.
*
* @param direction must not be {@literal null}.
* @param fields must not be {@literal null}.
* @return
*/
public static SortOperation sort(Direction direction, String... fields) {
return new SortOperation(new Sort(direction, fields));
}
/**
* Creates a new {@link SkipOperation} skipping the given number of elements.
*
* @param elementsToSkip must not be less than zero.
* @return
*/
public static SkipOperation skip(int elementsToSkip) {
return new SkipOperation(elementsToSkip);
}
/**
* Creates a new {@link LimitOperation} limiting the result to the given number of elements.
*
* @param maxElements must not be less than zero.
* @return
*/
public static LimitOperation limit(long maxElements) {
return new LimitOperation(maxElements);
}
/**
* Creates a new {@link MatchOperation} using the given {@link Criteria}.
*
* @param criteria must not be {@literal null}.
* @return
*/
public static MatchOperation match(Criteria criteria) {
return new MatchOperation(criteria);
}
/**
* Creates a new {@link Fields} instance for the given field names.
*
* @see Fields#fields(String...)
* @param fields must not be {@literal null}.
* @return
*/
public static Fields fields(String... fields) {
return Fields.fields(fields);
}
/**
* Creates a new {@link Fields} instance from the given field name and target reference.
*
* @param name must not be {@literal null} or empty.
* @param target must not be {@literal null} or empty.
* @return
*/
public static Fields bind(String name, String target) {
return Fields.from(field(name, target));
}
/**
* Converts this {@link Aggregation} specification to a {@link DBObject}.
*
* @param inputCollectionName the name of the input collection.
* @param rootContext the root {@link AggregationOperationContext} to render the first operation against.
* @return the {@code DBObject} representing this aggregation.
*/
public DBObject toDbObject(String inputCollectionName, AggregationOperationContext rootContext) {
AggregationOperationContext context = rootContext;
List<DBObject> operationDocuments = new ArrayList<DBObject>(operations.size());
for (AggregationOperation operation : operations) {
operationDocuments.add(operation.toDBObject(context));
if (operation instanceof AggregationOperationContext) {
context = (AggregationOperationContext) operation;
}
}
DBObject command = new BasicDBObject("aggregate", inputCollectionName);
command.put("pipeline", operationDocuments);
return command;
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return SerializationUtils
.serializeToJsonSafely(toDbObject("__collection__", new NoOpAggregationOperationContext()));
}
/**
* Simple {@link AggregationOperationContext} that just returns {@link FieldReference}s as is.
*
* @author Oliver Gierke
*/
private static class NoOpAggregationOperationContext implements AggregationOperationContext {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getMappedObject(com.mongodb.DBObject)
*/
@Override
public DBObject getMappedObject(DBObject dbObject) {
return dbObject;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.ExposedFields.AvailableField)
*/
@Override
public FieldReference getReference(Field field) {
return new FieldReference(new ExposedField(field, true));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(java.lang.String)
*/
@Override
public FieldReference getReference(String name) {
return new FieldReference(new ExposedField(new AggregationField(name), true));
}
}
}
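The command document that `toDbObject` assembles has a simple shape: the input collection name under `aggregate` and the ordered stage documents under `pipeline`. A minimal sketch of that shape with plain `Map`s (the helper class is hypothetical, shown only to make the command layout concrete):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

class AggregateCommandSketch {

	// Mirrors Aggregation.toDbObject: each AggregationOperation contributes one
	// stage document; the stages are collected in order into the "pipeline"
	// field of the aggregate command.
	static Map<String, Object> toCommand(String inputCollectionName, List<Map<String, Object>> stages) {

		Map<String, Object> command = new LinkedHashMap<String, Object>();
		command.put("aggregate", inputCollectionName);
		command.put("pipeline", stages);
		return command;
	}
}
```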


@@ -0,0 +1,37 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import com.mongodb.DBObject;
/**
* Represents one single operation in an aggregation pipeline.
*
* @author Sebastian Herold
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
*/
public interface AggregationOperation {
/**
* Turns the {@link AggregationOperation} into a {@link DBObject} by using the given
* {@link AggregationOperationContext}.
*
* @return the DBObject
*/
DBObject toDBObject(AggregationOperationContext context);
}


@@ -0,0 +1,55 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import com.mongodb.DBObject;
/**
* The context for an {@link AggregationOperation}.
*
* @author Oliver Gierke
* @since 1.3
*/
public interface AggregationOperationContext {
/**
* Returns the mapped {@link DBObject}, potentially converting the source considering mapping metadata etc.
*
* @param dbObject will never be {@literal null}.
* @return must not be {@literal null}.
*/
DBObject getMappedObject(DBObject dbObject);
/**
* Returns a {@link FieldReference} for the given field or {@literal null} if the context does not expose the given
* field.
*
* @param field must not be {@literal null}.
* @return
*/
FieldReference getReference(Field field);
/**
* Returns the {@link FieldReference} for the field with the given name or {@literal null} if the context does not
* expose a field with the given name.
*
* @param name must not be {@literal null} or empty.
* @return
*/
FieldReference getReference(String name);
}


@@ -0,0 +1,98 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.Collections;
import java.util.Iterator;
import java.util.List;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
/**
* Collects the results of executing an aggregation operation.
*
* @author Tobias Trelle
* @author Oliver Gierke
* @param <T> the type the results are mapped onto.
* @since 1.3
*/
public class AggregationResults<T> implements Iterable<T> {
private final List<T> mappedResults;
private final DBObject rawResults;
private final String serverUsed;
/**
* Creates a new {@link AggregationResults} instance from the given mapped and raw results.
*
* @param mappedResults must not be {@literal null}.
* @param rawResults must not be {@literal null}.
*/
public AggregationResults(List<T> mappedResults, DBObject rawResults) {
Assert.notNull(mappedResults);
Assert.notNull(rawResults);
this.mappedResults = Collections.unmodifiableList(mappedResults);
this.rawResults = rawResults;
this.serverUsed = parseServerUsed();
}
/**
* Returns the aggregation results.
*
* @return
*/
public List<T> getMappedResults() {
return mappedResults;
}
/**
* Returns the unique mapped result. Assumes no result or exactly one.
*
* @return
* @throws IllegalArgumentException in case more than one result is available.
*/
public T getUniqueMappedResult() {
Assert.isTrue(mappedResults.size() < 2, "Expected unique result or null, but got more than one!");
return mappedResults.size() == 1 ? mappedResults.get(0) : null;
}
/*
* (non-Javadoc)
* @see java.lang.Iterable#iterator()
*/
public Iterator<T> iterator() {
return mappedResults.iterator();
}
/**
* Returns the server that has been used to perform the aggregation.
*
* @return
*/
public String getServerUsed() {
return serverUsed;
}
private String parseServerUsed() {
Object object = rawResults.get("serverUsed");
return object instanceof String ? (String) object : null;
}
}
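The unique-result contract of `getUniqueMappedResult` above (an empty result yields `null`, a single result is returned, anything larger is rejected) can be sketched independently of the Mongo types (hypothetical helper, not the class above):

```java
import java.util.List;

class UniqueResultSketch {

	// Mirrors AggregationResults.getUniqueMappedResult: tolerate an empty
	// result, return a single result, and reject more than one.
	static <T> T uniqueOrNull(List<T> results) {

		if (results.size() > 1) {
			throw new IllegalArgumentException("Expected unique result or null, but got more than one!");
		}

		return results.size() == 1 ? results.get(0) : null;
	}
}
```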


@@ -0,0 +1,350 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.Iterator;
import java.util.List;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.util.Assert;
import org.springframework.util.CompositeIterator;
/**
* Value object to capture the fields exposed by an {@link AggregationOperation}.
*
* @author Oliver Gierke
* @since 1.3
*/
public class ExposedFields implements Iterable<ExposedField> {
private static final List<ExposedField> NO_FIELDS = Collections.emptyList();
private static final ExposedFields EMPTY = new ExposedFields(NO_FIELDS, NO_FIELDS);
private final List<ExposedField> originalFields;
private final List<ExposedField> syntheticFields;
/**
* Creates a new {@link ExposedFields} instance from the given {@link ExposedField}s.
*
* @param fields must not be {@literal null}.
* @return
*/
public static ExposedFields from(ExposedField... fields) {
return from(Arrays.asList(fields));
}
/**
* Creates a new {@link ExposedFields} instance from the given {@link ExposedField}s.
*
* @param fields must not be {@literal null}.
* @return
*/
private static ExposedFields from(List<ExposedField> fields) {
ExposedFields result = EMPTY;
for (ExposedField field : fields) {
result = result.and(field);
}
return result;
}
/**
* Creates synthetic {@link ExposedFields} from the given {@link Fields}.
*
* @param fields must not be {@literal null}.
* @return
*/
public static ExposedFields synthetic(Fields fields) {
return createFields(fields, true);
}
/**
* Creates non-synthetic {@link ExposedFields} from the given {@link Fields}.
*
* @param fields must not be {@literal null}.
* @return
*/
public static ExposedFields nonSynthetic(Fields fields) {
return createFields(fields, false);
}
/**
* Creates a new {@link ExposedFields} instance for the given fields in either a synthetic or non-synthetic way.
*
* @param fields must not be {@literal null}.
* @param synthetic
* @return
*/
private static ExposedFields createFields(Fields fields, boolean synthetic) {
Assert.notNull(fields, "Fields must not be null!");
List<ExposedField> result = new ArrayList<ExposedField>();
for (Field field : fields) {
result.add(new ExposedField(field, synthetic));
}
return ExposedFields.from(result);
}
/**
* Creates a new {@link ExposedFields} with the given originals and synthetics.
*
* @param originals must not be {@literal null}.
* @param synthetic must not be {@literal null}.
*/
private ExposedFields(List<ExposedField> originals, List<ExposedField> synthetic) {
this.originalFields = originals;
this.syntheticFields = synthetic;
}
/**
* Creates a new {@link ExposedFields} adding the given {@link ExposedField}.
*
* @param field must not be {@literal null}.
* @return
*/
public ExposedFields and(ExposedField field) {
Assert.notNull(field, "Exposed field must not be null!");
ArrayList<ExposedField> result = new ArrayList<ExposedField>();
result.addAll(field.synthetic ? syntheticFields : originalFields);
result.add(field);
return new ExposedFields(field.synthetic ? originalFields : result, field.synthetic ? result : syntheticFields);
}
/**
* Returns the field with the given name or {@literal null} if no field with the given name is available.
*
* @param name
* @return
*/
public ExposedField getField(String name) {
for (ExposedField field : this) {
if (field.canBeReferredToBy(name)) {
return field;
}
}
return null;
}
/**
* Returns whether the {@link ExposedFields} exposes a single field only.
*
* @return
*/
public boolean exposesSingleFieldOnly() {
return originalFields.size() + syntheticFields.size() == 1;
}
/*
* (non-Javadoc)
* @see java.lang.Iterable#iterator()
*/
@Override
public Iterator<ExposedField> iterator() {
CompositeIterator<ExposedField> iterator = new CompositeIterator<ExposedField>();
iterator.add(syntheticFields.iterator());
iterator.add(originalFields.iterator());
return iterator;
}
/**
* A single exposed field.
*
* @author Oliver Gierke
*/
static class ExposedField implements Field {
private final boolean synthetic;
private final Field field;
/**
* Creates a new {@link ExposedField} with the given key.
*
* @param key must not be {@literal null} or empty.
* @param synthetic whether the exposed field is synthetic.
*/
public ExposedField(String key, boolean synthetic) {
this(Fields.field(key), synthetic);
}
/**
* Creates a new {@link ExposedField} for the given {@link Field}.
*
* @param delegate must not be {@literal null}.
* @param synthetic whether the exposed field is synthetic.
*/
public ExposedField(Field delegate, boolean synthetic) {
this.field = delegate;
this.synthetic = synthetic;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Field#getKey()
*/
@Override
public String getName() {
return field.getName();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Field#getTarget()
*/
@Override
public String getTarget() {
return field.getTarget();
}
/**
* Returns whether the field can be referred to using the given name.
*
* @param input
* @return
*/
public boolean canBeReferredToBy(String input) {
return getTarget().equals(input);
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return String.format("AggregationField: %s, synthetic: %s", field, synthetic);
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (!(obj instanceof ExposedField)) {
return false;
}
ExposedField that = (ExposedField) obj;
return this.field.equals(that.field) && this.synthetic == that.synthetic;
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += 31 * field.hashCode();
result += 31 * (synthetic ? 0 : 1);
return result;
}
}
/**
* A reference to an {@link ExposedField}.
*
* @author Oliver Gierke
*/
static class FieldReference {
private final ExposedField field;
/**
* Creates a new {@link FieldReference} for the given {@link ExposedField}.
*
* @param field must not be {@literal null}.
*/
public FieldReference(ExposedField field) {
Assert.notNull(field, "ExposedField must not be null!");
this.field = field;
}
/**
* Returns the raw, unqualified reference, i.e. the field reference without a {@literal $} prefix.
*
* @return
*/
public String getRaw() {
String target = field.getTarget();
return field.synthetic ? target : String.format("%s.%s", Fields.UNDERSCORE_ID, target);
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return String.format("$%s", getRaw());
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (!(obj instanceof FieldReference)) {
return false;
}
FieldReference that = (FieldReference) obj;
return this.field.equals(that.field);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return field.hashCode();
}
}
}
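The reference-rendering rule in `FieldReference` above boils down to: synthetic fields are addressed by their target directly, non-synthetic (group-id) fields are qualified with `_id.`, and `toString` then adds the `$` prefix. A condensed sketch of just that rule (class and method names are hypothetical):

```java
class FieldReferenceSketch {

	// Mirrors FieldReference.getRaw(): synthetic fields keep their target
	// as-is, non-synthetic ones are nested under the group "_id".
	static String raw(String target, boolean synthetic) {
		return synthetic ? target : "_id." + target;
	}

	// Mirrors FieldReference.toString(): the rendered reference is the raw
	// reference with a "$" prefix.
	static String reference(String target, boolean synthetic) {
		return "$" + raw(target, synthetic);
	}
}
```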


@@ -0,0 +1,67 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import com.mongodb.DBObject;
/**
* Support class to implement {@link AggregationOperation}s that also act as an {@link AggregationOperationContext}
* by defining {@link ExposedFields}.
*
* @author Oliver Gierke
* @since 1.3
*/
public abstract class ExposedFieldsAggregationOperationContext implements AggregationOperationContext {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getMappedObject(com.mongodb.DBObject)
*/
@Override
public DBObject getMappedObject(DBObject dbObject) {
return dbObject;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.ExposedFields.AvailableField)
*/
@Override
public FieldReference getReference(Field field) {
return getReference(field.getTarget());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(java.lang.String)
*/
@Override
public FieldReference getReference(String name) {
ExposedField field = getFields().getField(name);
if (field != null) {
return new FieldReference(field);
}
throw new IllegalArgumentException(String.format("Invalid reference '%s'!", name));
}
protected abstract ExposedFields getFields();
}


@@ -0,0 +1,39 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
/**
* Abstraction for a field.
*
* @author Oliver Gierke
* @since 1.3
*/
public interface Field {
/**
* Returns the name of the field.
*
* @return must not be {@literal null}.
*/
String getName();
/**
* Returns the target of the field. In case no explicit target is available {@link #getName()} should be returned.
*
* @return must not be {@literal null}.
*/
String getTarget();
}
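The interface contract above requires `getTarget()` to fall back to `getName()` when no explicit target is set. A minimal, hypothetical implementation (illustrative only, not part of the library) showing that fallback:

```java
// Hypothetical minimal Field-style implementation: the target defaults to
// the name whenever no explicit target was provided, as the contract suggests.
public class SimpleField {

    private final String name;
    private final String target; // may be null when no explicit target exists

    SimpleField(String name, String target) {
        this.name = name;
        this.target = target;
    }

    String getName() {
        return name;
    }

    String getTarget() {
        // fall back to the name when no explicit target is available
        return target != null ? target : name;
    }

    public static void main(String[] args) {
        System.out.println(new SimpleField("count", null).getTarget()); // count
    }
}
```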


@@ -0,0 +1,271 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
/**
* Value object to capture a list of {@link Field} instances.
*
* @author Oliver Gierke
* @since 1.3
*/
public class Fields implements Iterable<Field> {
private static final String AMBIGUOUS_EXCEPTION = "Found two fields both using '%s' as name: %s and %s! Please "
+ "customize your field definitions to get to unique field names!";
public static final String UNDERSCORE_ID = "_id";
public static final String UNDERSCORE_ID_REF = "$_id";
private final List<Field> fields;
/**
* Creates a new {@link Fields} instance from the given {@link Field}s.
*
* @param fields must not be {@literal null} or empty.
* @return
*/
public static Fields from(Field... fields) {
Assert.notNull(fields, "Fields must not be null!");
return new Fields(Arrays.asList(fields));
}
/**
* Creates a new {@link Fields} instance for {@link Field}s with the given names.
*
* @param names must not be {@literal null}.
* @return
*/
public static Fields fields(String... names) {
Assert.notNull(names, "Field names must not be null!");
List<Field> fields = new ArrayList<Field>();
for (String name : names) {
fields.add(field(name));
}
return new Fields(fields);
}
/**
* Creates a {@link Field} with the given name.
*
* @param name must not be {@literal null} or empty.
* @return
*/
public static Field field(String name) {
return new AggregationField(name);
}
public static Field field(String name, String target) {
Assert.hasText(target, "Target must not be null or empty!");
return new AggregationField(name, target);
}
/**
* Creates a new {@link Fields} instance using the given {@link Field}s.
*
* @param fields must not be {@literal null}.
*/
private Fields(List<Field> fields) {
Assert.notNull(fields, "Fields must not be null!");
this.fields = verify(fields);
}
private static final List<Field> verify(List<Field> fields) {
Map<String, Field> reference = new HashMap<String, Field>();
for (Field field : fields) {
String name = field.getName();
Field found = reference.get(name);
if (found != null) {
throw new IllegalArgumentException(String.format(AMBIGUOUS_EXCEPTION, name, found, field));
}
reference.put(name, field);
}
return fields;
}
private Fields(Fields existing, Field tail) {
this.fields = new ArrayList<Field>(existing.fields.size() + 1);
this.fields.addAll(existing.fields);
this.fields.add(tail);
}
/**
* Creates a new {@link Fields} instance with a new {@link Field} of the given name added.
*
* @param name must not be {@literal null}.
* @return
*/
public Fields and(String name) {
return and(new AggregationField(name));
}
public Fields and(String name, String target) {
return and(new AggregationField(name, target));
}
public Fields and(Field field) {
return new Fields(this, field);
}
public Fields and(Fields fields) {
Fields result = this;
for (Field field : fields) {
result = result.and(field);
}
return result;
}
public Field getField(String name) {
for (Field field : fields) {
if (field.getName().equals(name)) {
return field;
}
}
return null;
}
/*
* (non-Javadoc)
* @see java.lang.Iterable#iterator()
*/
@Override
public Iterator<Field> iterator() {
return fields.iterator();
}
/**
* Value object to encapsulate a field in an aggregation operation.
*
* @author Oliver Gierke
*/
static class AggregationField implements Field {
private final String name;
private final String target;
/**
* Creates an aggregation field with the given name. As no target is set explicitly, the name will be used as target
* as well.
*
* @param key
*/
public AggregationField(String key) {
this(key, null);
}
public AggregationField(String name, String target) {
Assert.hasText(name, "AggregationField name must not be null or empty!");
if (target == null && name.contains(".")) {
this.name = name.substring(name.indexOf(".") + 1);
this.target = name;
} else {
this.name = name;
this.target = target;
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Field#getName()
*/
public String getName() {
return name;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Field#getTarget()
*/
public String getTarget() {
return StringUtils.hasText(this.target) ? this.target : this.name;
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return String.format("AggregationField - name: %s, target: %s", name, target);
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (!(obj instanceof AggregationField)) {
return false;
}
AggregationField that = (AggregationField) obj;
return this.name.equals(that.name) && ObjectUtils.nullSafeEquals(this.target, that.target);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += 31 * name.hashCode();
result += 31 * ObjectUtils.nullSafeHashCode(target);
return result;
}
}
}
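The `AggregationField` constructor above splits a dotted path: the part after the first dot becomes the field name, while the full path is kept as the target. A standalone sketch of that rule using only the JDK (class and method names are illustrative, not library API):

```java
// Standalone sketch of AggregationField's name/target derivation: when no
// explicit target is given and the name contains a dot, the name becomes the
// path segment after the first dot and the target the full path.
public class FieldNaming {

    static String[] nameAndTarget(String name, String target) {
        if (target == null && name.contains(".")) {
            // "address.city" -> name "city", target "address.city"
            return new String[] { name.substring(name.indexOf(".") + 1), name };
        }
        // otherwise the target defaults to the name when absent
        return new String[] { name, target != null ? target : name };
    }

    public static void main(String[] args) {
        String[] dotted = nameAndTarget("address.city", null);
        System.out.println(dotted[0] + " / " + dotted[1]); // city / address.city
    }
}
```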


@@ -0,0 +1,46 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $geoNear}-operation.
*
* @author Thomas Darimont
* @since 1.3
*/
public class GeoNearOperation implements AggregationOperation {
private final NearQuery nearQuery;
public GeoNearOperation(NearQuery nearQuery) {
Assert.notNull(nearQuery, "NearQuery must not be null!");
this.nearQuery = nearQuery;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$geoNear", context.getMappedObject(nearQuery.toDBObject()));
}
}
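`toDBObject(...)` above wraps the mapped near query in a single-key document, the pattern every stage in this package follows. A stdlib-only sketch of that shape, with `LinkedHashMap` standing in for the driver's `BasicDBObject` (an assumption for illustration; the real code builds driver `DBObject`s):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of the single-key stage document produced by toDBObject():
// { "$geoNear": { ...mapped near query... } }, using Map instead of DBObject.
public class StageShape {

    static Map<String, Object> stage(String operator, Object payload) {
        Map<String, Object> stage = new LinkedHashMap<>();
        stage.put(operator, payload); // the operator is the only top-level key
        return stage;
    }

    public static void main(String[] args) {
        Map<String, Object> nearQuery = new LinkedHashMap<>();
        nearQuery.put("near", new double[] { -73.99, 40.73 });
        nearQuery.put("distanceField", "dist");
        System.out.println(stage("$geoNear", nearQuery).keySet()); // [$geoNear]
    }
}
```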


@@ -0,0 +1,367 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.Locale;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $group}-operation.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/group/#stage._S_group
* @author Sebastian Herold
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
*/
public class GroupOperation extends ExposedFieldsAggregationOperationContext implements AggregationOperation {
private final ExposedFields nonSynthecticFields;
private final List<Operation> operations;
/**
* Creates a new {@link GroupOperation} including the given {@link Fields}.
*
* @param fields must not be {@literal null}.
*/
public GroupOperation(Fields fields) {
this.nonSynthecticFields = ExposedFields.nonSynthetic(fields);
this.operations = new ArrayList<Operation>();
}
/**
* Creates a new {@link GroupOperation} from the given {@link GroupOperation}.
*
* @param groupOperation must not be {@literal null}.
*/
protected GroupOperation(GroupOperation groupOperation) {
this(groupOperation, Collections.<Operation> emptyList());
}
/**
* Creates a new {@link GroupOperation} from the given {@link GroupOperation} and the given {@link Operation}s.
*
* @param groupOperation
* @param nextOperations
*/
private GroupOperation(GroupOperation groupOperation, List<Operation> nextOperations) {
Assert.notNull(groupOperation, "GroupOperation must not be null!");
Assert.notNull(nextOperations, "NextOperations must not be null!");
this.nonSynthecticFields = groupOperation.nonSynthecticFields;
this.operations = new ArrayList<Operation>(nextOperations.size() + 1);
this.operations.addAll(groupOperation.operations);
this.operations.addAll(nextOperations);
}
/**
* Creates a new {@link GroupOperation} from the current one adding the given {@link Operation}.
*
* @param operation must not be {@literal null}.
* @return
*/
protected GroupOperation and(Operation operation) {
return new GroupOperation(this, Arrays.asList(operation));
}
/**
* Builder for {@link GroupOperation}s on a field.
*
* @author Thomas Darimont
*/
public class GroupOperationBuilder {
private final GroupOperation groupOperation;
private final Operation operation;
/**
* Creates a new {@link GroupOperationBuilder} from the given {@link GroupOperation} and {@link Operation}.
*
* @param groupOperation
* @param operation
*/
private GroupOperationBuilder(GroupOperation groupOperation, Operation operation) {
Assert.notNull(groupOperation, "GroupOperation must not be null!");
Assert.notNull(operation, "Operation must not be null!");
this.groupOperation = groupOperation;
this.operation = operation;
}
/**
* Allows to specify an alias for the new operation.
*
* @param alias
* @return
*/
public GroupOperation as(String alias) {
return this.groupOperation.and(operation.withAlias(alias));
}
}
/**
* Generates a {@link GroupOperationBuilder} for a {@code $sum}-expression.
* <p>
* Count expressions are emulated via {@code $sum: 1}.
*
* @return
*/
public GroupOperationBuilder count() {
return newBuilder(GroupOps.SUM, null, 1);
}
/**
* Generates a {@link GroupOperationBuilder} for a {@code $sum}-expression for the given field-reference.
*
* @param reference
* @return
*/
public GroupOperationBuilder sum(String reference) {
return sum(reference, null);
}
private GroupOperationBuilder sum(String reference, Object value) {
return newBuilder(GroupOps.SUM, reference, value);
}
/**
* Generates a {@link GroupOperationBuilder} for an {@code $addToSet}-expression for the given field-reference.
*
* @param reference
* @return
*/
public GroupOperationBuilder addToSet(String reference) {
return addToSet(reference, null);
}
/**
* Generates a {@link GroupOperationBuilder} for an {@code $addToSet}-expression for the given value.
*
* @param value
* @return
*/
public GroupOperationBuilder addToSet(Object value) {
return addToSet(null, value);
}
private GroupOperationBuilder addToSet(String reference, Object value) {
return newBuilder(GroupOps.ADD_TO_SET, reference, value);
}
/**
* Generates a {@link GroupOperationBuilder} for a {@code $last}-expression for the given field-reference.
*
* @param reference
* @return
*/
public GroupOperationBuilder last(String reference) {
return newBuilder(GroupOps.LAST, reference, null);
}
/**
* Generates a {@link GroupOperationBuilder} for a {@code $first}-expression for the given field-reference.
*
* @param reference
* @return
*/
public GroupOperationBuilder first(String reference) {
return newBuilder(GroupOps.FIRST, reference, null);
}
/**
* Generates a {@link GroupOperationBuilder} for an {@code $avg}-expression for the given field-reference.
*
* @param reference
* @return
*/
public GroupOperationBuilder avg(String reference) {
return newBuilder(GroupOps.AVG, reference, null);
}
/**
* Generates a {@link GroupOperationBuilder} for a {@code $push}-expression for the given field-reference.
*
* @param reference
* @return
*/
public GroupOperationBuilder push(String reference) {
return push(reference, null);
}
/**
* Generates a {@link GroupOperationBuilder} for a {@code $push}-expression for the given value.
*
* @param value
* @return
*/
public GroupOperationBuilder push(Object value) {
return push(null, value);
}
private GroupOperationBuilder push(String reference, Object value) {
return newBuilder(GroupOps.PUSH, reference, value);
}
/**
* Generates a {@link GroupOperationBuilder} for a {@code $min}-expression for the given field-reference.
*
* @param reference
* @return
*/
public GroupOperationBuilder min(String reference) {
return newBuilder(GroupOps.MIN, reference, null);
}
/**
* Generates a {@link GroupOperationBuilder} for a {@code $max}-expression for the given field-reference.
*
* @param reference
* @return
*/
public GroupOperationBuilder max(String reference) {
return newBuilder(GroupOps.MAX, reference, null);
}
private GroupOperationBuilder newBuilder(Keyword keyword, String reference, Object value) {
return new GroupOperationBuilder(this, new Operation(keyword, null, reference, value));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ExposedFieldsAggregationOperationContext#getFields()
*/
@Override
public ExposedFields getFields() {
ExposedFields fields = this.nonSynthecticFields.and(new ExposedField(Fields.UNDERSCORE_ID, true));
for (Operation operation : operations) {
fields = fields.and(operation.asField());
}
return fields;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public com.mongodb.DBObject toDBObject(AggregationOperationContext context) {
BasicDBObject operationObject = new BasicDBObject();
if (nonSynthecticFields.exposesSingleFieldOnly()) {
FieldReference reference = context.getReference(nonSynthecticFields.iterator().next());
operationObject.put(Fields.UNDERSCORE_ID, reference.toString());
} else {
BasicDBObject inner = new BasicDBObject();
for (ExposedField field : nonSynthecticFields) {
FieldReference reference = context.getReference(field);
inner.put(field.getName(), reference.toString());
}
operationObject.put(Fields.UNDERSCORE_ID, inner);
}
for (Operation operation : operations) {
operationObject.putAll(operation.toDBObject(context));
}
return new BasicDBObject("$group", operationObject);
}
interface Keyword {
String toString();
}
private static enum GroupOps implements Keyword {
SUM, LAST, FIRST, PUSH, AVG, MIN, MAX, ADD_TO_SET, COUNT;
@Override
public String toString() {
String[] parts = name().split("_");
StringBuilder builder = new StringBuilder();
for (String part : parts) {
String lowerCase = part.toLowerCase(Locale.US);
builder.append(builder.length() == 0 ? lowerCase : StringUtils.capitalize(lowerCase));
}
return "$" + builder.toString();
}
}
static class Operation implements AggregationOperation {
private final Keyword op;
private final String key;
private final String reference;
private final Object value;
public Operation(Keyword op, String key, String reference, Object value) {
this.op = op;
this.key = key;
this.reference = reference;
this.value = value;
}
public Operation withAlias(String key) {
return new Operation(op, key, reference, value);
}
public ExposedField asField() {
return new ExposedField(key, true);
}
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject(key, new BasicDBObject(op.toString(), getValue(context)));
}
public Object getValue(AggregationOperationContext context) {
return reference == null ? value : context.getReference(reference).toString();
}
@Override
public String toString() {
return "Operation [op=" + op + ", key=" + key + ", reference=" + reference + ", value=" + value + "]";
}
}
}
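`GroupOps.toString()` above derives the MongoDB operator keyword from the enum constant by lower-casing and camel-casing its underscore-separated parts, so `ADD_TO_SET` renders as `$addToSet`. A standalone sketch of that conversion, substituting `Character.toUpperCase` for Spring's `StringUtils.capitalize`:

```java
import java.util.Locale;

// Standalone sketch of GroupOps.toString(): split the constant name on '_',
// lower-case each part, capitalize all but the first, and prefix with '$'.
public class KeywordRendering {

    static String toKeyword(String constantName) {
        StringBuilder builder = new StringBuilder();
        for (String part : constantName.split("_")) {
            String lowerCase = part.toLowerCase(Locale.US);
            builder.append(builder.length() == 0 ? lowerCase
                    : Character.toUpperCase(lowerCase.charAt(0)) + lowerCase.substring(1));
        }
        return "$" + builder;
    }

    public static void main(String[] args) {
        System.out.println(toKeyword("ADD_TO_SET")); // $addToSet
        System.out.println(toKeyword("SUM"));        // $sum
    }
}
```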


@@ -0,0 +1,52 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the {@code $limit}-operation.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/limit/
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
*/
class LimitOperation implements AggregationOperation {
private final long maxElements;
/**
* @param maxElements Number of documents to consider.
*/
public LimitOperation(long maxElements) {
Assert.isTrue(maxElements >= 0, "Maximum number of elements must be greater than or equal to zero!");
this.maxElements = maxElements;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$limit", maxElements);
}
}


@@ -0,0 +1,56 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the {@code $match}-operation.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/match/
* @author Sebastian Herold
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
*/
public class MatchOperation implements AggregationOperation {
private final Criteria criteria;
/**
* Creates a new {@link MatchOperation} for the given {@link Criteria}.
*
* @param criteria must not be {@literal null}.
*/
public MatchOperation(Criteria criteria) {
Assert.notNull(criteria, "Criteria must not be null!");
this.criteria = criteria;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$match", context.getMappedObject(criteria.getCriteriaObject()));
}
}


@@ -0,0 +1,556 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder.FieldProjection;
import org.springframework.util.Assert;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $project}-operation. A projection defines the fields to be used in an
* {@link Aggregation}. It is similar to a {@link Field} inclusion/exclusion, but more powerful: it can generate new
* fields and change the values of given fields.
* <p>
*
* @see http://docs.mongodb.org/manual/reference/aggregation/project/
* @author Tobias Trelle
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
*/
public class ProjectionOperation extends ExposedFieldsAggregationOperationContext implements AggregationOperation {
private static final List<Projection> NONE = Collections.emptyList();
private final List<Projection> projections;
/**
* Creates a new empty {@link ProjectionOperation}.
*/
public ProjectionOperation() {
this(NONE, NONE);
}
/**
* Creates a new {@link ProjectionOperation} including the given {@link Fields}.
*
* @param fields must not be {@literal null}.
*/
public ProjectionOperation(Fields fields) {
this(NONE, ProjectionOperationBuilder.FieldProjection.from(fields, true));
}
/**
* Copy constructor to allow building up {@link ProjectionOperation} instances from already existing
* {@link Projection}s.
*
* @param current must not be {@literal null}.
* @param projections must not be {@literal null}.
*/
private ProjectionOperation(List<? extends Projection> current, List<? extends Projection> projections) {
Assert.notNull(current, "Current projections must not be null!");
Assert.notNull(projections, "Projections must not be null!");
this.projections = new ArrayList<ProjectionOperation.Projection>(current.size() + projections.size());
this.projections.addAll(current);
this.projections.addAll(projections);
}
/**
* Creates a new {@link ProjectionOperation} with the current {@link Projection}s and the given one.
*
* @param projection must not be {@literal null}.
* @return
*/
private ProjectionOperation and(Projection projection) {
return new ProjectionOperation(this.projections, Arrays.asList(projection));
}
/**
* Creates a new {@link ProjectionOperation} with the current {@link Projection}s replacing the last current one with
* the given one.
*
* @param projection must not be {@literal null}.
* @return
*/
private ProjectionOperation andReplaceLastOneWith(Projection projection) {
List<Projection> projections = this.projections.isEmpty() ? Collections.<Projection> emptyList() : this.projections
.subList(0, this.projections.size() - 1);
return new ProjectionOperation(projections, Arrays.asList(projection));
}
/**
* Creates a new {@link ProjectionOperationBuilder} to define a projection for the field with the given name.
*
* @param name must not be {@literal null} or empty.
* @return
*/
public ProjectionOperationBuilder and(String name) {
return new ProjectionOperationBuilder(name, this, null);
}
/**
* Excludes the given fields from the projection.
*
* @param fields must not be {@literal null}.
* @return
*/
public ProjectionOperation andExclude(String... fields) {
List<FieldProjection> excludeProjections = FieldProjection.from(Fields.fields(fields), false);
return new ProjectionOperation(this.projections, excludeProjections);
}
/**
* Includes the given fields into the projection.
*
* @param fields must not be {@literal null}.
* @return
*/
public ProjectionOperation andInclude(String... fields) {
List<FieldProjection> projections = FieldProjection.from(Fields.fields(fields), true);
return new ProjectionOperation(this.projections, projections);
}
/**
* Includes the given fields into the projection.
*
* @param fields must not be {@literal null}.
* @return
*/
public ProjectionOperation andInclude(Fields fields) {
return new ProjectionOperation(this.projections, FieldProjection.from(fields, true));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ExposedFieldsAggregationOperationContext#getFields()
*/
@Override
protected ExposedFields getFields() {
ExposedFields fields = null;
for (Projection projection : projections) {
ExposedField field = projection.getExposedField();
fields = fields == null ? ExposedFields.from(field) : fields.and(field);
}
return fields;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
BasicDBObject fieldObject = new BasicDBObject();
for (Projection projection : projections) {
fieldObject.putAll(projection.toDBObject(context));
}
return new BasicDBObject("$project", fieldObject);
}
/**
* Builder for {@link ProjectionOperation}s on a field.
*
* @author Oliver Gierke
*/
public static class ProjectionOperationBuilder implements AggregationOperation {
private final String name;
private final ProjectionOperation operation;
private final OperationProjection previousProjection;
/**
* Creates a new {@link ProjectionOperationBuilder} for the field with the given name on top of the given
* {@link ProjectionOperation}.
*
* @param name must not be {@literal null} or empty.
* @param operation must not be {@literal null}.
* @param previousProjection the previous operation projection, may be {@literal null}.
*/
public ProjectionOperationBuilder(String name, ProjectionOperation operation, OperationProjection previousProjection) {
Assert.hasText(name, "Field name must not be null or empty!");
Assert.notNull(operation, "ProjectionOperation must not be null!");
this.name = name;
this.operation = operation;
this.previousProjection = previousProjection;
}
/**
* Projects the result of the previous operation onto the current field. Will automatically add an exclusion for
* {@code _id}, as its default content will now go into the field just projected into.
*
* @return
*/
public ProjectionOperation previousOperation() {
return this.operation.andExclude(Fields.UNDERSCORE_ID) //
.and(new PreviousOperationProjection(name));
}
/**
* Defines a nested field binding for the current field.
*
* @param fields must not be {@literal null}.
* @return
*/
public ProjectionOperation nested(Fields fields) {
return this.operation.and(new NestedFieldProjection(name, fields));
}
/**
* Allows to specify an alias for the previous projection operation.
*
* @param alias
* @return
*/
public ProjectionOperation as(String alias) {
if (previousProjection != null) {
return this.operation.andReplaceLastOneWith(previousProjection.withAlias(alias));
} else {
return this.operation.and(new FieldProjection(Fields.field(alias, name), null));
}
}
/**
* Generates an {@code $add} expression that adds the given number to the previously mentioned field.
*
* @param number
* @return
*/
public ProjectionOperationBuilder plus(Number number) {
Assert.notNull(number, "Number must not be null!");
return project("add", number);
}
/**
* Generates a {@code $subtract} expression that subtracts the given number from the previously mentioned field.
*
* @param number
* @return
*/
public ProjectionOperationBuilder minus(Number number) {
Assert.notNull(number, "Number must not be null!");
return project("subtract", number);
}
/**
* Generates a {@code $multiply} expression that multiplies the given number with the previously mentioned field.
*
* @param number
* @return
*/
public ProjectionOperationBuilder multiply(Number number) {
Assert.notNull(number, "Number must not be null!");
return project("multiply", number);
}
/**
* Generates a {@code $divide} expression that divides the previously mentioned field by the given number.
*
* @param number
* @return
*/
public ProjectionOperationBuilder divide(Number number) {
Assert.notNull(number, "Number must not be null!");
Assert.isTrue(number.doubleValue() != 0, "Number must not be zero!");
return project("divide", number);
}
/**
* Generates a {@code $mod} expression that divides the previously mentioned field by the given number and returns
* the remainder.
*
* @param number
* @return
*/
public ProjectionOperationBuilder mod(Number number) {
Assert.notNull(number, "Number must not be null!");
Assert.isTrue(number.doubleValue() != 0, "Number must not be zero!");
return project("mod", number);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return this.operation.toDBObject(context);
}
/**
* Adds a generic projection for the current field.
*
* @param operation the operation key, e.g. {@code $add}.
* @param values the values to be set for the projection operation.
* @return
*/
public ProjectionOperationBuilder project(String operation, Object... values) {
OperationProjection projectionOperation = new OperationProjection(Fields.field(name), operation, values);
return new ProjectionOperationBuilder(name, this.operation.and(projectionOperation), projectionOperation);
}
/**
* A {@link Projection} to pull in the result of the previous operation.
*
* @author Oliver Gierke
*/
static class PreviousOperationProjection extends Projection {
private final String name;
/**
* Creates a new {@link PreviousOperationProjection} for the field with the given name.
*
* @param name must not be {@literal null} or empty.
*/
public PreviousOperationProjection(String name) {
super(Fields.field(name));
this.name = name;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject(name, Fields.UNDERSCORE_ID_REF);
}
}
/**
* A {@link FieldProjection} to map a result of a previous {@link AggregationOperation} to a new field.
*
* @author Oliver Gierke
*/
static class FieldProjection extends Projection {
private final Field field;
private final Object value;
/**
* Creates a new {@link FieldProjection} for the field of the given name, assigning the given value.
*
* @param name must not be {@literal null} or empty.
* @param value
*/
public FieldProjection(String name, Object value) {
this(Fields.field(name), value);
}
private FieldProjection(Field field, Object value) {
super(field);
this.field = field;
this.value = value;
}
/**
* Factory method to easily create {@link FieldProjection}s for the given {@link Fields}.
*
* @param fields the {@link Fields} to in- or exclude, must not be {@literal null}.
* @param include whether to include or exclude the fields.
* @return
*/
public static List<FieldProjection> from(Fields fields, boolean include) {
Assert.notNull(fields, "Fields must not be null!");
List<FieldProjection> projections = new ArrayList<FieldProjection>();
for (Field field : fields) {
projections.add(new FieldProjection(field, include ? null : 0));
}
return projections;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
if (value != null) {
return new BasicDBObject(field.getName(), value);
}
FieldReference reference = context.getReference(field.getTarget());
return new BasicDBObject(field.getName(), reference.toString());
}
}
static class OperationProjection extends Projection {
private final Field field;
private final String operation;
private final List<Object> values;
/**
* Creates a new {@link OperationProjection} for the given field.
*
* @param field the field to add the operation projection for, must not be {@literal null}.
* @param operation the actual operation key, must not be {@literal null} or empty.
* @param values the values to pass into the operation, must not be {@literal null}.
*/
public OperationProjection(Field field, String operation, Object[] values) {
super(field);
Assert.hasText(operation, "Operation must not be null or empty!");
Assert.notNull(values, "Values must not be null!");
this.field = field;
this.operation = operation;
this.values = Arrays.asList(values);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
BasicDBList values = new BasicDBList();
values.addAll(buildReferences(context));
DBObject inner = new BasicDBObject("$" + operation, values);
return new BasicDBObject(this.field.getName(), inner);
}
private List<Object> buildReferences(AggregationOperationContext context) {
List<Object> result = new ArrayList<Object>(values.size());
result.add(context.getReference(field.getTarget()).toString());
for (Object element : values) {
result.add(element instanceof Field ? context.getReference((Field) element).toString() : element);
}
return result;
}
/**
* Creates a new instance of this {@link OperationProjection} with the given alias.
*
* @param alias the alias to set
* @return
*/
public OperationProjection withAlias(String alias) {
return new OperationProjection(Fields.field(alias, this.field.getName()), operation, values.toArray());
}
}
static class NestedFieldProjection extends Projection {
private final String name;
private final Fields fields;
public NestedFieldProjection(String name, Fields fields) {
super(Fields.field(name));
this.name = name;
this.fields = fields;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
DBObject nestedObject = new BasicDBObject();
for (Field field : fields) {
nestedObject.put(field.getName(), context.getReference(field.getTarget()).toString());
}
return new BasicDBObject(name, nestedObject);
}
}
}
/**
* Base class for {@link Projection} implementations.
*
* @author Oliver Gierke
*/
private static abstract class Projection {
private final ExposedField field;
/**
* Creates new {@link Projection} for the given {@link Field}.
*
* @param field must not be {@literal null}.
*/
public Projection(Field field) {
Assert.notNull(field, "Field must not be null!");
this.field = new ExposedField(field, true);
}
/**
* Returns the field exposed by the {@link Projection}.
*
* @return will never be {@literal null}.
*/
public ExposedField getExposedField() {
return field;
}
/**
* Renders the current {@link Projection} into a {@link DBObject} based on the given
* {@link AggregationOperationContext}.
*
* @param context will never be {@literal null}.
* @return
*/
public abstract DBObject toDBObject(AggregationOperationContext context);
}
}

@@ -0,0 +1,54 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $skip}-operation.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/skip/
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
*/
public class SkipOperation implements AggregationOperation {
private final long skipCount;
/**
* Creates a new {@link SkipOperation} skipping the given number of elements.
*
* @param skipCount number of documents to skip.
*/
public SkipOperation(long skipCount) {
Assert.isTrue(skipCount >= 0, "Skip count must not be negative!");
this.skipCount = skipCount;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$skip", skipCount);
}
}
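
For reference, the `$skip` stage rendered by `toDBObject` above is a single-entry document. A minimal sketch of that output, assuming plain `java.util` maps in place of the driver's `BasicDBObject` (the `skipStage` helper is hypothetical, for illustration only):

```java
import java.util.Collections;
import java.util.Map;

public class SkipStageSketch {

	// Mirrors SkipOperation.toDBObject: render {"$skip": skipCount}.
	static Map<String, Object> skipStage(long skipCount) {
		if (skipCount < 0) {
			throw new IllegalArgumentException("Skip count must not be negative!");
		}
		return Collections.<String, Object> singletonMap("$skip", skipCount);
	}

	public static void main(String[] args) {
		System.out.println(skipStage(5)); // {$skip=5}
	}
}
```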

@@ -0,0 +1,76 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.domain.Sort.Order;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $sort}-operation.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/sort/#pipe._S_sort
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
*/
public class SortOperation implements AggregationOperation {
private final Sort sort;
/**
* Creates a new {@link SortOperation} for the given {@link Sort} instance.
*
* @param sort must not be {@literal null}.
*/
public SortOperation(Sort sort) {
Assert.notNull(sort, "Sort must not be null!");
this.sort = sort;
}
public SortOperation and(Direction direction, String... fields) {
return and(new Sort(direction, fields));
}
public SortOperation and(Sort sort) {
return new SortOperation(this.sort.and(sort));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
BasicDBObject object = new BasicDBObject();
for (Order order : sort) {
// Resolving the reference fails fast for properties not exposed by previous operations
FieldReference reference = context.getReference(order.getProperty());
object.put(reference.getRaw(), order.isAscending() ? 1 : -1);
}
return new BasicDBObject("$sort", object);
}
}
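
The `$sort` document built above maps each property to `1` (ascending) or `-1` (descending). A hedged sketch of that rendering, with plain maps standing in for `BasicDBObject` (`sortStage` is a made-up helper, not part of the API):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class SortStageSketch {

	// Mirrors SortOperation.toDBObject: ascending orders render as 1, descending as -1.
	static Map<String, Integer> sortStage(Map<String, Boolean> ascendingByField) {
		Map<String, Integer> sort = new LinkedHashMap<>();
		for (Map.Entry<String, Boolean> order : ascendingByField.entrySet()) {
			sort.put(order.getKey(), order.getValue() ? 1 : -1);
		}
		return sort;
	}

	public static void main(String[] args) {
		Map<String, Boolean> orders = new LinkedHashMap<>();
		orders.put("age", false);
		orders.put("name", true);
		System.out.println(sortStage(orders)); // {age=-1, name=1}
	}
}
```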

@@ -0,0 +1,102 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import static org.springframework.data.mongodb.core.aggregation.Fields.*;
import org.springframework.data.mapping.PropertyPath;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.PersistentPropertyPath;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
/**
* {@link AggregationOperationContext} aware of a particular type and a {@link MappingContext} to potentially translate
* property references into document field names.
*
* @author Oliver Gierke
* @since 1.3
*/
public class TypeBasedAggregationOperationContext implements AggregationOperationContext {
private final Class<?> type;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final QueryMapper mapper;
/**
* Creates a new {@link TypeBasedAggregationOperationContext} for the given type, {@link MappingContext} and
* {@link QueryMapper}.
*
* @param type must not be {@literal null}.
* @param mappingContext must not be {@literal null}.
* @param mapper must not be {@literal null}.
*/
public TypeBasedAggregationOperationContext(Class<?> type,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext, QueryMapper mapper) {
Assert.notNull(type, "Type must not be null!");
Assert.notNull(mappingContext, "MappingContext must not be null!");
Assert.notNull(mapper, "QueryMapper must not be null!");
this.type = type;
this.mappingContext = mappingContext;
this.mapper = mapper;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getMappedObject(com.mongodb.DBObject)
*/
@Override
public DBObject getMappedObject(DBObject dbObject) {
return mapper.getMappedObject(dbObject, mappingContext.getPersistentEntity(type));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.ExposedFields.AvailableField)
*/
@Override
public FieldReference getReference(Field field) {
// Triggers validation of the property path against the configured type
PropertyPath.from(field.getTarget(), type);
return getReferenceFor(field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(java.lang.String)
*/
@Override
public FieldReference getReference(String name) {
PropertyPath path = PropertyPath.from(name, type);
PersistentPropertyPath<MongoPersistentProperty> propertyPath = mappingContext.getPersistentPropertyPath(path);
return getReferenceFor(field(path.getLeafProperty().getSegment(),
propertyPath.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE)));
}
private FieldReference getReferenceFor(Field field) {
return new FieldReference(new ExposedField(field, true));
}
}

@@ -0,0 +1,51 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.util.Assert;
/**
* A {@code TypedAggregation} is a special {@link Aggregation} that holds information of the input aggregation type.
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
public class TypedAggregation<I> extends Aggregation {
private final Class<I> inputType;
/**
* Creates a new {@link TypedAggregation} from the given {@link AggregationOperation}s.
*
* @param inputType must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
*/
public TypedAggregation(Class<I> inputType, AggregationOperation... operations) {
super(operations);
Assert.notNull(inputType, "Input type must not be null!");
this.inputType = inputType;
}
/**
* Returns the input type for the {@link Aggregation}.
*
* @return the inputType will never be {@literal null}.
*/
public Class<I> getInputType() {
return inputType;
}
}

@@ -0,0 +1,64 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $unwind}-operation.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/unwind/#pipe._S_unwind
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
*/
public class UnwindOperation extends ExposedFieldsAggregationOperationContext implements AggregationOperation {
private final ExposedField field;
/**
* Creates a new {@link UnwindOperation} for the given {@link Field}.
*
* @param field must not be {@literal null}.
*/
public UnwindOperation(Field field) {
Assert.notNull(field, "Field must not be null!");
this.field = new ExposedField(field, true);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ExposedFieldsAggregationOperationContext#getFields()
*/
@Override
protected ExposedFields getFields() {
return ExposedFields.from(field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$unwind", context.getReference(field).toString());
}
}
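
Unlike `$skip` and `$sort`, `$unwind` takes a `$`-prefixed field reference string rather than a nested document, which is why `toDBObject` stringifies the resolved reference. A minimal illustration with plain maps (names are hypothetical):

```java
import java.util.Collections;
import java.util.Map;

public class UnwindStageSketch {

	// Mirrors UnwindOperation.toDBObject: the stage value is the "$"-prefixed field reference.
	static Map<String, String> unwindStage(String fieldName) {
		return Collections.singletonMap("$unwind", "$" + fieldName);
	}

	public static void main(String[] args) {
		System.out.println(unwindStage("tags")); // {$unwind=$tags}
	}
}
```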

@@ -0,0 +1,5 @@
/**
* Support for the MongoDB aggregation framework.
* @since 1.3
*/
package org.springframework.data.mongodb.core.aggregation;

@@ -39,6 +39,7 @@ import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigDecimalToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.DBObjectToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigDecimalConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigIntegerConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToURLConverter;
@@ -98,6 +99,7 @@ public class CustomConversions {
this.converters.add(StringToBigIntegerConverter.INSTANCE);
this.converters.add(URLToStringConverter.INSTANCE);
this.converters.add(StringToURLConverter.INSTANCE);
this.converters.add(DBObjectToStringConverter.INSTANCE);
this.converters.addAll(JodaTimeConverters.getConvertersToRegister());
this.converters.addAll(converters);

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,16 +18,19 @@ package org.springframework.data.mongodb.core.convert;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.Set;
import org.springframework.data.convert.SimpleTypeInformationMapper;
import org.springframework.data.convert.DefaultTypeMapper;
import org.springframework.data.convert.SimpleTypeInformationMapper;
import org.springframework.data.convert.TypeAliasAccessor;
import org.springframework.data.convert.TypeInformationMapper;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
@@ -37,33 +40,43 @@ import com.mongodb.DBObject;
* respectively.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class DefaultMongoTypeMapper extends DefaultTypeMapper<DBObject> implements MongoTypeMapper {
public static final String DEFAULT_TYPE_KEY = "_class";
@SuppressWarnings("rawtypes")
@SuppressWarnings("rawtypes")//
private static final TypeInformation<List> LIST_TYPE_INFO = ClassTypeInformation.from(List.class);
@SuppressWarnings("rawtypes")
@SuppressWarnings("rawtypes")//
private static final TypeInformation<Map> MAP_TYPE_INFO = ClassTypeInformation.from(Map.class);
private String typeKey = DEFAULT_TYPE_KEY;
private final TypeAliasAccessor<DBObject> accessor;
private final String typeKey;
public DefaultMongoTypeMapper() {
this(DEFAULT_TYPE_KEY, Arrays.asList(SimpleTypeInformationMapper.INSTANCE));
this(DEFAULT_TYPE_KEY);
}
public DefaultMongoTypeMapper(String typeKey) {
super(new DBObjectTypeAliasAccessor(typeKey));
this.typeKey = typeKey;
this(typeKey, Arrays.asList(SimpleTypeInformationMapper.INSTANCE));
}
public DefaultMongoTypeMapper(String typeKey, MappingContext<? extends PersistentEntity<?, ?>, ?> mappingContext) {
super(new DBObjectTypeAliasAccessor(typeKey), mappingContext, Arrays.asList(SimpleTypeInformationMapper.INSTANCE));
this.typeKey = typeKey;
this(typeKey, new DBObjectTypeAliasAccessor(typeKey), mappingContext, Arrays
.asList(SimpleTypeInformationMapper.INSTANCE));
}
public DefaultMongoTypeMapper(String typeKey, List<? extends TypeInformationMapper> mappers) {
super(new DBObjectTypeAliasAccessor(typeKey), mappers);
this(typeKey, new DBObjectTypeAliasAccessor(typeKey), null, mappers);
}
private DefaultMongoTypeMapper(String typeKey, TypeAliasAccessor<DBObject> accessor,
MappingContext<? extends PersistentEntity<?, ?>, ?> mappingContext, List<? extends TypeInformationMapper> mappers) {
super(accessor, mappingContext, mappers);
this.typeKey = typeKey;
this.accessor = accessor;
}
/*
@@ -74,6 +87,31 @@ public class DefaultMongoTypeMapper extends DefaultTypeMapper<DBObject> implemen
return typeKey == null ? false : typeKey.equals(key);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.MongoTypeMapper#writeTypeRestrictions(java.util.Set)
*/
@Override
public void writeTypeRestrictions(DBObject result, Set<Class<?>> restrictedTypes) {
if (restrictedTypes == null || restrictedTypes.isEmpty()) {
return;
}
BasicDBList restrictedMappedTypes = new BasicDBList();
for (Class<?> restrictedType : restrictedTypes) {
Object typeAlias = getAliasFor(ClassTypeInformation.from(restrictedType));
if (typeAlias != null) {
restrictedMappedTypes.add(typeAlias);
}
}
accessor.writeTypeTo(result, new BasicDBObject("$in", restrictedMappedTypes));
}
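
The effect of `writeTypeRestrictions` can be pictured without the driver: each restricted type contributes its alias to an `$in` clause stored under the type key. A sketch assuming the fully-qualified class name as the alias (in the real implementation the alias comes from `getAliasFor`, and the type key is e.g. `_class`):

```java
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
import java.util.Map;

public class TypeRestrictionSketch {

	// Mirrors writeTypeRestrictions: collect one alias per restricted type into an $in clause.
	static Map<String, Object> typeRestriction(String typeKey, Collection<Class<?>> restrictedTypes) {
		List<String> aliases = new ArrayList<>();
		for (Class<?> type : restrictedTypes) {
			aliases.add(type.getName()); // stand-in for the mapped type alias
		}
		return Collections.<String, Object> singletonMap(typeKey, Collections.singletonMap("$in", aliases));
	}

	public static void main(String[] args) {
		System.out.println(typeRestriction("_class", Collections.<Class<?>> singletonList(String.class)));
		// {_class={$in=[java.lang.String]}}
	}
}
```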
/* (non-Javadoc)
* @see org.springframework.data.convert.DefaultTypeMapper#getFallbackTypeFor(java.lang.Object)
*/
@@ -83,6 +121,7 @@ public class DefaultMongoTypeMapper extends DefaultTypeMapper<DBObject> implemen
}
/**
* {@link TypeAliasAccessor} to store aliases in a {@link DBObject}.
*
* @author Oliver Gierke
*/

@@ -72,6 +72,7 @@ import com.mongodb.DBRef;
* @author Oliver Gierke
* @author Jon Brisbin
* @author Patrik Wasik
* @author Thomas Darimont
*/
public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware {
@@ -124,6 +125,15 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
mappingContext) : typeMapper;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.MongoConverter#getTypeMapper()
*/
@Override
public MongoTypeMapper getTypeMapper() {
return this.typeMapper;
}
/**
* Configure the characters that dots contained in {@link Map} keys shall be replaced with. By default we don't do
* any translation but rather reject a {@link Map} with keys containing dots, causing the conversion for the entire
@@ -356,8 +366,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
try {
Object id = wrapper.getProperty(idProperty, Object.class, fieldAccessOnly);
dbo.put("_id", idMapper.convertId(id));
} catch (ConversionException ignored) {
}
} catch (ConversionException ignored) {}
}
// Write the properties

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.core.convert;
import org.springframework.data.convert.EntityConverter;
import org.springframework.data.convert.EntityReader;
import org.springframework.data.convert.TypeMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
@@ -26,9 +27,17 @@ import com.mongodb.DBObject;
* Central Mongo specific converter interface which combines {@link MongoWriter} and {@link MongoReader}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public interface MongoConverter extends
EntityConverter<MongoPersistentEntity<?>, MongoPersistentProperty, Object, DBObject>, MongoWriter<Object>,
EntityReader<Object, DBObject> {
/**
* Returns the {@link TypeMapper} being used to write type information into {@link DBObject}s created with that
* converter.
*
* @return will never be {@literal null}.
*/
MongoTypeMapper getTypeMapper();
}

@@ -24,8 +24,11 @@ import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionFailedException;
import org.springframework.core.convert.TypeDescriptor;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.util.StringUtils;
import com.mongodb.DBObject;
/**
* Wrapper class to contain useful converters for the usage with Mongo.
*
@@ -147,4 +150,15 @@ abstract class MongoConverters {
}
}
}
@ReadingConverter
public static enum DBObjectToStringConverter implements Converter<DBObject, String> {
INSTANCE;
@Override
public String convert(DBObject source) {
return source == null ? null : source.toString();
}
}
}
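
The converter added above is deliberately trivial: a null-safe delegation to `toString()`, which for a `DBObject` yields its JSON-like serialization. The same pattern in isolation, with plain `Object` standing in for `com.mongodb.DBObject`:

```java
public class DBObjectToStringSketch {

	// Null-safe conversion, as in DBObjectToStringConverter.convert.
	static String convert(Object source) {
		return source == null ? null : source.toString();
	}

	public static void main(String[] args) {
		System.out.println(convert(null)); // null
		System.out.println(convert(java.util.Collections.singletonMap("a", 1))); // {a=1}
	}
}
```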

@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core.convert;
import java.util.Set;
import org.springframework.data.convert.TypeMapper;
import com.mongodb.DBObject;
@@ -32,4 +34,14 @@ public interface MongoTypeMapper extends TypeMapper<DBObject> {
* @return
*/
boolean isTypeKey(String key);
/**
* Writes type restrictions to the given {@link DBObject}. This usually results in an {@code $in}-clause to be
* generated that restricts the type-key (e.g. {@code _class}) to be in the set of type aliases for the given
* {@code restrictedTypes}.
*
* @param result must not be {@literal null}
* @param restrictedTypes must not be {@literal null}
*/
void writeTypeRestrictions(DBObject result, Set<Class<?>> restrictedTypes);
}

@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Set;
import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionException;
@@ -29,6 +30,7 @@ import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.PersistentPropertyPath;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.util.Assert;
import com.mongodb.BasicDBList;
@@ -74,6 +76,7 @@ public class QueryMapper {
* @param entity can be {@literal null}.
* @return
*/
@SuppressWarnings("deprecation")
public DBObject getMappedObject(DBObject query, MongoPersistentEntity<?> entity) {
if (Keyword.isKeyword(query)) {
@@ -84,8 +87,32 @@ public class QueryMapper {
for (String key : query.keySet()) {
// TODO: remove one once QueryMapper can work with Query instances directly
if (Query.isRestrictedTypeKey(key)) {
@SuppressWarnings("unchecked")
Set<Class<?>> restrictedTypes = (Set<Class<?>>) query.get(key);
this.converter.getTypeMapper().writeTypeRestrictions(result, restrictedTypes);
continue;
}
if (Keyword.isKeyword(key)) {
result.putAll(getMappedKeyword(new Keyword(query, key), entity));
continue;
}
Field field = entity == null ? new Field(key) : new MetadataBackedField(key, entity, mappingContext);
result.put(field.getMappedKey(), getMappedValue(query.get(key), field));
Object rawValue = query.get(key);
String newKey = field.getMappedKey();
if (Keyword.isKeyword(rawValue) && !field.isIdField()) {
Keyword keyword = new Keyword((DBObject) rawValue);
result.put(newKey, getMappedKeyword(field, keyword));
} else {
result.put(newKey, getMappedValue(field, rawValue));
}
}
return result;
@@ -101,13 +128,14 @@ public class QueryMapper {
private DBObject getMappedKeyword(Keyword query, MongoPersistentEntity<?> entity) {
// $or/$nor
if (query.key.matches(N_OR_PATTERN)) {
if (query.key.matches(N_OR_PATTERN) || query.value instanceof Iterable) {
Iterable<?> conditions = (Iterable<?>) query.value;
BasicDBList newConditions = new BasicDBList();
for (Object condition : conditions) {
newConditions.add(getMappedObject((DBObject) condition, entity));
newConditions.add(condition instanceof DBObject ? getMappedObject((DBObject) condition, entity)
: convertSimpleOrDBObject(condition, entity));
}
return new BasicDBObject(query.key, newConditions);
@@ -119,34 +147,34 @@ public class QueryMapper {
/**
* Returns the mapped keyword considered defining a criteria for the given property.
*
* @param keyword
* @param property
* @param keyword
* @return
*/
public DBObject getMappedKeyword(Keyword keyword, Field property) {
private DBObject getMappedKeyword(Field property, Keyword keyword) {
if (property.isAssociation()) {
convertAssociation(keyword.value, property.getProperty());
}
boolean needsAssociationConversion = property.isAssociation() && !keyword.isExists();
Object value = needsAssociationConversion ? convertAssociation(keyword.value, property.getProperty())
: getMappedValue(property.with(keyword.key), keyword.value);
return new BasicDBObject(keyword.key, getMappedValue(keyword.value, property.with(keyword.key)));
return new BasicDBObject(keyword.key, value);
}
/**
* Returns the mapped value for the given source object assuming it's a value for the given
* {@link MongoPersistentProperty}.
*
* @param source the source object to be mapped
* @param value the source object to be mapped
* @param property the property the value is a value for
* @param newKey the key the value will be bound to eventually
* @return
*/
private Object getMappedValue(Object source, Field key) {
private Object getMappedValue(Field documentField, Object value) {
if (key.isIdField()) {
if (documentField.isIdField()) {
if (source instanceof DBObject) {
DBObject valueDbo = (DBObject) source;
if (value instanceof DBObject) {
DBObject valueDbo = (DBObject) value;
if (valueDbo.containsField("$in") || valueDbo.containsField("$nin")) {
String inKey = valueDbo.containsField("$in") ? "$in" : "$nin";
List<Object> ids = new ArrayList<Object>();
@@ -157,22 +185,25 @@ public class QueryMapper {
} else if (valueDbo.containsField("$ne")) {
valueDbo.put("$ne", convertId(valueDbo.get("$ne")));
} else {
return getMappedObject((DBObject) source, null);
return getMappedObject((DBObject) value, null);
}
return valueDbo;
} else {
return convertId(source);
return convertId(value);
}
}
if (key.isAssociation()) {
return Keyword.isKeyword(source) ? getMappedKeyword(new Keyword(source), key) : convertAssociation(source,
key.getProperty());
if (Keyword.isKeyword(value)) {
return getMappedKeyword(new Keyword((DBObject) value), null);
}
return convertSimpleOrDBObject(source, key.getPropertyEntity());
if (documentField.isAssociation()) {
return convertAssociation(value, documentField.getProperty());
}
return convertSimpleOrDBObject(value, documentField.getPropertyEntity());
}
/**
@@ -185,18 +216,30 @@ public class QueryMapper {
private Object convertSimpleOrDBObject(Object source, MongoPersistentEntity<?> entity) {
if (source instanceof BasicDBList) {
return converter.convertToMongoType(source);
return delegateConvertToMongoType(source, entity);
}
if (source instanceof DBObject) {
return getMappedObject((DBObject) source, entity);
}
return delegateConvertToMongoType(source, entity);
}
/**
* Converts the given source Object to a mongo type with the type information of the original source type omitted.
* Subclasses may overwrite this method to retain the type information of the source type on the resulting mongo type.
*
* @param source
* @param entity
* @return the converted mongo type or null if source is null
*/
protected Object delegateConvertToMongoType(Object source, MongoPersistentEntity<?> entity) {
return converter.convertToMongoType(source);
}
/**
* Converts the given source assuming it's actually an association to anoter object.
* Converts the given source assuming it's actually an association to another object.
*
* @param source
* @param property
@@ -243,7 +286,7 @@ public class QueryMapper {
// Ignore
}
return converter.convertToMongoType(id);
return delegateConvertToMongoType(id, null);
}
/**
@@ -256,16 +299,27 @@ public class QueryMapper {
String key;
Object value;
Keyword(Object source) {
public Keyword(DBObject source, String key) {
this.key = key;
this.value = source.get(key);
}
Assert.isInstanceOf(DBObject.class, source);
public Keyword(DBObject dbObject) {
DBObject value = (DBObject) source;
Set<String> keys = dbObject.keySet();
Assert.isTrue(keys.size() == 1, "Can only use a single value DBObject!");
Assert.isTrue(value.keySet().size() == 1, "Keyword must have a single key only!");
this.key = keys.iterator().next();
this.value = dbObject.get(key);
}
this.key = value.keySet().iterator().next();
this.value = value.get(key);
/**
* Returns whether the current keyword is the {@code $exists} keyword.
*
* @return
*/
public boolean isExists() {
return "$exists".equalsIgnoreCase(key);
}
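The reworked `Keyword` wrapper above enforces that a keyword document carries exactly one `$`-prefixed key. A minimal plain-Java sketch of that contract, using a `Map` instead of the Mongo driver's `DBObject` (class and method names here are hypothetical, not part of the commit):

```java
import java.util.Collections;
import java.util.Map;

// Sketch of the Keyword invariants: exactly one entry, and isKeyword
// recognizes "$"-prefixed strings as operator keywords.
public class KeywordSketch {

    public final String key;
    public final Object value;

    public KeywordSketch(Map<String, ?> dbObject) {
        if (dbObject.size() != 1) {
            throw new IllegalArgumentException("Can only use a single value DBObject!");
        }
        this.key = dbObject.keySet().iterator().next();
        this.value = dbObject.get(key);
    }

    public static boolean isKeyword(Object value) {
        return value instanceof String && ((String) value).startsWith("$");
    }
}
```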
/**
@@ -275,7 +329,11 @@ public class QueryMapper {
* @param value
* @return
*/
static boolean isKeyword(Object value) {
public static boolean isKeyword(Object value) {
if (value instanceof String) {
return ((String) value).startsWith("$");
}
if (!(value instanceof DBObject)) {
return false;
@@ -298,7 +356,7 @@ public class QueryMapper {
protected final String name;
/**
* Creates a new {@link Field} without meta-information but the given name.
* Creates a new {@link DocumentField} without meta-information but the given name.
*
* @param name must not be {@literal null} or empty.
*/
@@ -309,7 +367,7 @@ public class QueryMapper {
}
/**
* Returns a new {@link Field} with the given name.
* Returns a new {@link DocumentField} with the given name.
*
* @param name must not be {@literal null} or empty.
* @return
@@ -365,7 +423,7 @@ public class QueryMapper {
}
/**
* Extension of {@link Field} to be backed with mapping metadata.
* Extension of {@link DocumentField} to be backed with mapping metadata.
*
* @author Oliver Gierke
*/


@@ -0,0 +1,52 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
/**
* A subclass of {@link QueryMapper} that retains type information on the mongo types.
*
* @author Thomas Darimont
*/
public class UpdateMapper extends QueryMapper {
private final MongoWriter<?> converter;
/**
* Creates a new {@link UpdateMapper} using the given {@link MongoConverter}.
*
* @param converter must not be {@literal null}.
*/
public UpdateMapper(MongoConverter converter) {
super(converter);
this.converter = converter;
}
/**
* Converts the given source object to a mongo type retaining the original type information of the source type on the
* mongo type.
*
* @see org.springframework.data.mongodb.core.convert.QueryMapper#delegateConvertToMongoType(java.lang.Object,
* org.springframework.data.mongodb.core.mapping.MongoPersistentEntity)
*/
@Override
protected Object delegateConvertToMongoType(Object source, MongoPersistentEntity<?> entity) {
return entity == null ? super.delegateConvertToMongoType(source, null) : converter.convertToMongoType(source,
entity.getTypeInformation());
}
}
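`UpdateMapper` uses `delegateConvertToMongoType` as a template-method hook: the base `QueryMapper` converts without type information, while the subclass retains it for updates. A hedged plain-Java sketch of that pattern, with stand-in classes and a `_class` entry standing in for the type hint (no Spring or Mongo dependencies; names are illustrative only):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Stand-in for QueryMapper: converts a value with no type metadata.
class SimpleQueryMapper {
    public Map<String, Object> delegateConvert(Object source) {
        Map<String, Object> converted = new LinkedHashMap<>();
        converted.put("value", source);
        return converted; // plain queries omit type information
    }
}

// Stand-in for UpdateMapper: same conversion, but the hook adds a type hint,
// mirroring how updates retain the original source type.
class SimpleUpdateMapper extends SimpleQueryMapper {
    @Override
    public Map<String, Object> delegateConvert(Object source) {
        Map<String, Object> converted = super.delegateConvert(source);
        converted.put("_class", source.getClass().getName());
        return converted;
    }
}
```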


@@ -0,0 +1,5 @@
/**
* Spring Data MongoDB specific converter infrastructure.
*/
package org.springframework.data.mongodb.core.convert;


@@ -131,7 +131,7 @@ public class GeoResults<T> implements Iterable<GeoResult<T>> {
private static Distance calculateAverageDistance(List<? extends GeoResult<?>> results, Metric metric) {
if (results.isEmpty()) {
return new Distance(0, null);
return new Distance(0, metric);
}
double averageDistance = 0;


@@ -0,0 +1,5 @@
/**
* Support for MongoDB geo-spatial queries.
*/
package org.springframework.data.mongodb.core.geo;


@@ -27,6 +27,7 @@ import java.lang.annotation.Target;
* @author Jon Brisbin
* @author Oliver Gierke
* @author Philipp Schneider
* @author Johno Crawford
*/
@Target({ ElementType.TYPE })
@Documented
@@ -78,4 +79,12 @@ public @interface CompoundIndex {
* @return
*/
boolean background() default false;
/**
 * Configures the number of seconds after which documents in the collection should expire. Defaults to -1 for no expiry.
*
* @see http://docs.mongodb.org/manual/tutorial/expire-data/
* @return
*/
int expireAfterSeconds() default -1;
}


@@ -26,6 +26,7 @@ import java.lang.annotation.Target;
* @author Jon Brisbin
* @author Oliver Gierke
* @author Philipp Schneider
* @author Johno Crawford
*/
@Target(ElementType.FIELD)
@Retention(RetentionPolicy.RUNTIME)
@@ -50,4 +51,12 @@ public @interface Indexed {
* @return
*/
boolean background() default false;
/**
 * Configures the number of seconds after which documents in the collection should expire. Defaults to -1 for no expiry.
*
* @see http://docs.mongodb.org/manual/tutorial/expire-data/
* @return
*/
int expireAfterSeconds() default -1;
}


@@ -44,6 +44,7 @@ import com.mongodb.util.JSON;
* @author Jon Brisbin
* @author Oliver Gierke
* @author Philipp Schneider
* @author Johno Crawford
*/
public class MongoPersistentEntityIndexCreator implements
ApplicationListener<MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty>> {
@@ -108,7 +109,7 @@ public class MongoPersistentEntityIndexCreator implements
DBObject definition = (DBObject) JSON.parse(index.def());
ensureIndex(indexColl, index.name(), definition, index.unique(), index.dropDups(), index.sparse(),
index.background());
index.background(), index.expireAfterSeconds());
if (log.isDebugEnabled()) {
log.debug("Created compound index " + index);
@@ -143,7 +144,7 @@ public class MongoPersistentEntityIndexCreator implements
DBObject definition = new BasicDBObject(persistentProperty.getFieldName(), direction);
ensureIndex(collection, name, definition, index.unique(), index.dropDups(), index.sparse(),
index.background());
index.background(), index.expireAfterSeconds());
if (log.isDebugEnabled()) {
log.debug("Created property index " + index);
@@ -192,9 +193,11 @@ public class MongoPersistentEntityIndexCreator implements
* @param unique whether it shall be a unique index
* @param dropDups whether to drop duplicates
* @param sparse sparse or not
* @param background whether the index will be created in the background
* @param expireAfterSeconds the time to live for documents in the collection
*/
protected void ensureIndex(String collection, String name, DBObject indexDefinition, boolean unique,
boolean dropDups, boolean sparse, boolean background) {
boolean dropDups, boolean sparse, boolean background, int expireAfterSeconds) {
DBObject opts = new BasicDBObject();
opts.put("name", name);
@@ -203,6 +206,10 @@ public class MongoPersistentEntityIndexCreator implements
opts.put("unique", unique);
opts.put("background", background);
if (expireAfterSeconds != -1) {
opts.put("expireAfterSeconds", expireAfterSeconds);
}
mongoDbFactory.getDb().getCollection(collection).ensureIndex(indexDefinition, opts);
}
}
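The patched `ensureIndex` only writes `expireAfterSeconds` into the index options when it differs from the `-1` "no expiry" default, so non-TTL indexes are built exactly as before. A minimal sketch of that option assembly with a plain map instead of `BasicDBObject` (names hypothetical):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of the index-options document built by the changed ensureIndex:
// expireAfterSeconds is conditional, everything else is written as before.
public class IndexOptionsSketch {

    public static Map<String, Object> indexOptions(String name, boolean unique,
            boolean background, int expireAfterSeconds) {

        Map<String, Object> opts = new LinkedHashMap<>();
        opts.put("name", name);
        opts.put("unique", unique);
        opts.put("background", background);

        if (expireAfterSeconds != -1) {
            opts.put("expireAfterSeconds", expireAfterSeconds); // TTL index
        }
        return opts;
    }
}
```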


@@ -0,0 +1,5 @@
/**
* Support for MongoDB document indexing.
*/
package org.springframework.data.mongodb.core.index;


@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.core.mapping;
import java.lang.reflect.Field;
import java.util.Comparator;
import java.util.HashMap;
import java.util.Map;
@@ -24,6 +25,7 @@ import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.context.expression.BeanFactoryAccessor;
import org.springframework.context.expression.BeanFactoryResolver;
import org.springframework.data.annotation.Id;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.AssociationHandler;
import org.springframework.data.mapping.PropertyHandler;
@@ -35,6 +37,7 @@ import org.springframework.expression.Expression;
import org.springframework.expression.ParserContext;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.expression.spel.support.StandardEvaluationContext;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
/**
@@ -43,11 +46,12 @@ import org.springframework.util.StringUtils;
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, MongoPersistentProperty> implements
MongoPersistentEntity<T>, ApplicationContextAware {
private static final String AMBIGUOUS_FIELD_MAPPING = "Ambiguous field mapping detected! Both %s and %s map to the same field name %s! Disambiguate using @Field annotation!";
private static final String AMBIGUOUS_FIELD_MAPPING = "Ambiguous field mapping detected! Both %s and %s map to the same field name %s! Disambiguate using @DocumentField annotation!";
private final String collection;
private final SpelExpressionParser parser;
private final StandardEvaluationContext context;
@@ -136,6 +140,60 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
}
}
/**
* As a general note: an implicit id property has a name that matches "id" or "_id"; an explicit id property is one
* that is annotated with {@link Id}. The id property is updated according to the following rules: 1) an explicitly
* defined id property takes precedence over an implicitly defined one. 2) In case of any ambiguity a
* {@link MappingException} is thrown.
*
* @param property - the new id property candidate
* @return
*/
@Override
protected MongoPersistentProperty returnPropertyIfBetterIdPropertyCandidateOrNull(MongoPersistentProperty property) {
Assert.notNull(property);
if (!property.isIdProperty()) {
return null;
}
MongoPersistentProperty currentIdProperty = getIdProperty();
boolean currentIdPropertyIsSet = currentIdProperty != null;
@SuppressWarnings("null")
boolean currentIdPropertyIsExplicit = currentIdPropertyIsSet ? currentIdProperty.isExplicitIdProperty() : false;
boolean newIdPropertyIsExplicit = property.isExplicitIdProperty();
if (!currentIdPropertyIsSet) {
return property;
}
@SuppressWarnings("null")
Field currentIdPropertyField = currentIdProperty.getField();
if (newIdPropertyIsExplicit && currentIdPropertyIsExplicit) {
throw new MappingException(String.format(
"Attempt to add explicit id property %s but already have a property %s registered "
+ "as explicit id. Check your mapping configuration!", property.getField(), currentIdPropertyField));
} else if (newIdPropertyIsExplicit && !currentIdPropertyIsExplicit) {
// explicit id property takes precedence over implicit id property
return property;
} else if (!newIdPropertyIsExplicit && currentIdPropertyIsExplicit) {
// no id property override - current property is explicitly defined
} else {
throw new MappingException(String.format(
"Attempt to add id property %s but already have a property %s registered "
+ "as id. Check your mapping configuration!", property.getField(), currentIdPropertyField));
}
return null;
}
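The precedence rules implemented by `returnPropertyIfBetterIdPropertyCandidateOrNull` reduce to a small decision over three booleans. A runnable sketch of just that decision logic (the method and result names are hypothetical; the real method returns a property or throws a `MappingException`):

```java
// Id-property precedence as described above: an explicit @Id wins over an
// implicit "id"/"_id" property; two explicit or two implicit candidates
// are ambiguous and rejected.
public class IdPrecedence {

    public static String decide(boolean currentSet, boolean currentExplicit, boolean newExplicit) {
        if (!currentSet) {
            return "use-new";                  // no id property registered yet
        }
        if (newExplicit && currentExplicit) {
            return "error-two-explicit-ids";   // ambiguous explicit mapping
        }
        if (newExplicit) {
            return "use-new";                  // explicit id replaces implicit one
        }
        if (currentExplicit) {
            return "keep-current";             // explicit id is not overridden
        }
        return "error-two-implicit-ids";       // two implicit candidates
    }
}
```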
/**
* Handler to collect {@link MongoPersistentProperty} instances and check that each of them is mapped to a distinct
* field name.


@@ -24,6 +24,7 @@ import java.util.Set;
import org.bson.types.ObjectId;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.annotation.Id;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.model.AnnotationBasedPersistentProperty;
import org.springframework.data.mapping.model.MappingException;
@@ -38,6 +39,7 @@ import com.mongodb.DBObject;
*
* @author Oliver Gierke
* @author Patryk Wasik
* @author Thomas Darimont
*/
public class BasicMongoPersistentProperty extends AnnotationBasedPersistentProperty<MongoPersistentProperty> implements
MongoPersistentProperty {
@@ -109,6 +111,15 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
return SUPPORTED_ID_PROPERTY_NAMES.contains(field.getName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#isExplicitIdProperty()
*/
@Override
public boolean isExplicitIdProperty() {
return isAnnotationPresent(Id.class);
}
/**
* Returns the key to be used to store the value of the property inside a Mongo {@link DBObject}.
*
@@ -117,7 +128,18 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
public String getFieldName() {
if (isIdProperty()) {
return ID_FIELD_NAME;
if (owner == null) {
return ID_FIELD_NAME;
}
if (owner.getIdProperty() == null) {
return ID_FIELD_NAME;
}
if (owner.isIdProperty(this)) {
return ID_FIELD_NAME;
}
}
org.springframework.data.mongodb.core.mapping.Field annotation = findAnnotation(org.springframework.data.mongodb.core.mapping.Field.class);


@@ -18,7 +18,7 @@ package org.springframework.data.mongodb.core.mapping;
/**
* SPI interface to determine how to name document fields in cases the field name is not manually defined.
*
* @see Field
* @see DocumentField
* @see PropertyNameFieldNamingStrategy
* @see CamelCaseAbbreviatingFieldNamingStrategy
* @since 1.3


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,6 +16,8 @@
package org.springframework.data.mongodb.core.mapping;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.annotation.Id;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PersistentProperty;
/**
@@ -23,6 +25,7 @@ import org.springframework.data.mapping.PersistentProperty;
*
* @author Oliver Gierke
* @author Patryk Wasik
* @author Thomas Darimont
*/
public interface MongoPersistentProperty extends PersistentProperty<MongoPersistentProperty> {
@@ -48,6 +51,14 @@ public interface MongoPersistentProperty extends PersistentProperty<MongoPersist
*/
boolean isDbReference();
/**
* Returns whether the property is explicitly marked as an identifier property of the owning {@link PersistentEntity}.
 * A property is an explicit id property if it is annotated with {@link Id}.
*
* @return
*/
boolean isExplicitIdProperty();
/**
* Returns the {@link DBRef} if the property is a reference.
*


@@ -0,0 +1,5 @@
/**
* Mapping event callback infrastructure for the MongoDB document-to-object mapping subsystem.
*/
package org.springframework.data.mongodb.core.mapping.event;


@@ -0,0 +1,5 @@
/**
* Infrastructure for the MongoDB document-to-object mapping subsystem.
*/
package org.springframework.data.mongodb.core.mapping;


@@ -0,0 +1,5 @@
/**
* Support for MongoDB map-reduce operations.
*/
package org.springframework.data.mongodb.core.mapreduce;


@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -39,6 +39,10 @@ import com.mongodb.DBObject;
/**
* Central class for creating queries. It follows a fluent API style so that you can easily chain together multiple
* criteria. Static import of the 'Criteria.where' method will improve readability.
*
* @author Thomas Risberg
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class Criteria implements CriteriaDefinition {
@@ -396,34 +400,54 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates an 'or' criteria using the $or operator for all of the provided criteria
* <p>
* Note that mongodb does not support wrapping an $or operator in a $not operator.
* <p>
*
* @throws IllegalArgumentException if {@link #orOperator(Criteria...)} follows a not() call directly.
* @param criteria
*/
public Criteria orOperator(Criteria... criteria) {
BasicDBList bsonList = createCriteriaList(criteria);
criteriaChain.add(new Criteria("$or").is(bsonList));
return this;
return registerCriteriaChainElement(new Criteria("$or").is(bsonList));
}
/**
* Creates a 'nor' criteria using the $nor operator for all of the provided criteria
* Creates a 'nor' criteria using the $nor operator for all of the provided criteria.
* <p>
* Note that mongodb does not support wrapping a $nor operator in a $not operator.
* <p>
*
* @throws IllegalArgumentException if {@link #norOperator(Criteria...)} follows a not() call directly.
* @param criteria
*/
public Criteria norOperator(Criteria... criteria) {
BasicDBList bsonList = createCriteriaList(criteria);
criteriaChain.add(new Criteria("$nor").is(bsonList));
return this;
return registerCriteriaChainElement(new Criteria("$nor").is(bsonList));
}
/**
* Creates an 'and' criteria using the $and operator for all of the provided criteria
* Creates an 'and' criteria using the $and operator for all of the provided criteria.
* <p>
* Note that mongodb does not support wrapping an $and operator in a $not operator.
* <p>
*
* @throws IllegalArgumentException if {@link #andOperator(Criteria...)} follows a not() call directly.
* @param criteria
*/
public Criteria andOperator(Criteria... criteria) {
BasicDBList bsonList = createCriteriaList(criteria);
criteriaChain.add(new Criteria("$and").is(bsonList));
return registerCriteriaChainElement(new Criteria("$and").is(bsonList));
}
private Criteria registerCriteriaChainElement(Criteria criteria) {
if (lastOperatorWasNot()) {
throw new IllegalArgumentException("operator $not is not allowed around criteria chain element: "
+ criteria.getCriteriaObject());
} else {
criteriaChain.add(criteria);
}
return this;
}
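The new `registerCriteriaChainElement` guard rejects chaining `$or`/`$nor`/`$and` directly after `not()`, because MongoDB does not allow those operators inside `$not`. A stripped-down, runnable sketch of the guard (class and chain representation are hypothetical stand-ins for the real `Criteria`):

```java
import java.util.ArrayList;
import java.util.List;

// Minimal model of the criteria chain guard: not() flags the chain, and the
// next registered $or/$nor/$and element is rejected.
public class CriteriaChainSketch {

    private final List<String> chain = new ArrayList<>();
    private boolean lastOperatorWasNot;

    public CriteriaChainSketch not() {
        lastOperatorWasNot = true;
        return this;
    }

    public CriteriaChainSketch orOperator() {
        return register("$or");
    }

    private CriteriaChainSketch register(String operator) {
        if (lastOperatorWasNot) {
            throw new IllegalArgumentException(
                    "operator $not is not allowed around criteria chain element: " + operator);
        }
        chain.add(operator);
        return this;
    }

    public List<String> getChain() {
        return chain;
    }
}
```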
@@ -468,6 +492,7 @@ public class Criteria implements CriteriaDefinition {
}
}
}
DBObject queryCriteria = new BasicDBObject();
if (isValue != NOT_SET) {
queryCriteria.put(this.key, this.isValue);


@@ -73,7 +73,7 @@ public class Field {
*/
public Field position(String field, int value) {
Assert.hasText(field, "Field must not be null or empty!");
Assert.hasText(field, "DocumentField must not be null or empty!");
postionKey = field;
positionValue = value;


@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.core.query;
import org.springframework.data.domain.Pageable;
import org.springframework.data.mongodb.core.geo.CustomMetric;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Metric;
@@ -29,6 +30,7 @@ import com.mongodb.DBObject;
* Builder class to build near-queries.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class NearQuery {
@@ -38,6 +40,7 @@ public class NearQuery {
private Metric metric;
private boolean spherical;
private Integer num;
private Integer skip;
/**
* Creates a new {@link NearQuery}.
@@ -116,7 +119,7 @@ public class NearQuery {
}
/**
* Configures the number of results to return.
* Configures the maximum number of results to return.
*
* @param num
* @return
@@ -126,6 +129,29 @@ public class NearQuery {
return this;
}
/**
* Configures the number of results to skip.
*
* @param skip
* @return
*/
public NearQuery skip(int skip) {
this.skip = skip;
return this;
}
/**
* Configures the {@link Pageable} to use.
*
* @param pageable
* @return
*/
public NearQuery with(Pageable pageable) {
this.num = pageable.getOffset() + pageable.getPageSize();
this.skip = pageable.getOffset();
return this;
}
/**
* Sets the max distance results shall have from the configured origin. If a {@link Metric} was set before the given
* value will be interpreted as being a value in that metric. E.g.
@@ -290,9 +316,18 @@ public class NearQuery {
*/
public NearQuery query(Query query) {
this.query = query;
this.skip = query.getSkip();
this.num = query.getLimit();
return this;
}
/**
* @return the number of elements to skip.
*/
public Integer getSkip() {
return skip;
}
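The `with(Pageable)` and `query(Query)` additions encode a workaround: since `geoNear` in MongoDB 2.4.x cannot skip server-side, the near query fetches `offset + pageSize` results and the unwanted leading elements are skipped client-side. The arithmetic in isolation (method names hypothetical; `offset` plays the role of `Pageable.getOffset()`):

```java
// Paging arithmetic behind NearQuery.with(Pageable): fetch enough results to
// cover the requested page, then skip everything before its offset.
public class NearPaging {

    public static int limitFor(int page, int pageSize) {
        int offset = page * pageSize; // Pageable.getOffset()
        return offset + pageSize;     // becomes NearQuery.num
    }

    public static int skipFor(int page, int pageSize) {
        return page * pageSize;       // becomes NearQuery.skip
    }
}
```

For page 2 of size 10, this asks geoNear for 30 results and discards the first 20 locally.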
/**
* Returns the {@link DBObject} built by the {@link NearQuery}.
*


@@ -19,8 +19,11 @@ import static org.springframework.data.mongodb.core.query.SerializationUtils.*;
import static org.springframework.util.ObjectUtils.*;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashSet;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Set;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
@@ -34,9 +37,13 @@ import com.mongodb.DBObject;
/**
* @author Thomas Risberg
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class Query {
private final static String RESTRICTED_TYPES_KEY = "_$RESTRICTED_TYPES";
private final Set<Class<?>> restrictedTypes = new HashSet<Class<?>>();
private LinkedHashMap<String, Criteria> criteria = new LinkedHashMap<String, Criteria>();
private Field fieldSpec;
private Sort sort;
@@ -54,8 +61,7 @@ public class Query {
return new Query(criteria);
}
public Query() {
}
public Query() {}
/**
* Creates a new {@link Query} using the given {@link Criteria}.
@@ -161,13 +167,46 @@ public class Query {
return this;
}
/**
* @return the restrictedTypes
*/
public Set<Class<?>> getRestrictedTypes() {
return restrictedTypes == null ? Collections.<Class<?>> emptySet() : restrictedTypes;
}
/**
* Restricts the query to only return documents instances that are exactly of the given types.
*
* @param type may not be {@literal null}
* @param additionalTypes may not be {@literal null}
* @return
*/
public Query restrict(Class<?> type, Class<?>... additionalTypes) {
Assert.notNull(type, "Type must not be null!");
Assert.notNull(additionalTypes, "AdditionalTypes must not be null");
restrictedTypes.add(type);
for (Class<?> additionalType : additionalTypes) {
restrictedTypes.add(additionalType);
}
return this;
}
public DBObject getQueryObject() {
DBObject dbo = new BasicDBObject();
for (String k : criteria.keySet()) {
CriteriaDefinition c = criteria.get(k);
DBObject cl = c.getCriteriaObject();
dbo.putAll(cl);
}
if (!restrictedTypes.isEmpty()) {
dbo.put(RESTRICTED_TYPES_KEY, getRestrictedTypes());
}
return dbo;
}
@@ -266,4 +305,17 @@ public class Query {
return result;
}
/**
* Returns whether the given key is the one used to hold the type restriction information.
*
* @deprecated don't call this method as the restricted type handling will undergo some significant changes going
* forward.
* @param key
* @return
*/
@Deprecated
public static boolean isRestrictedTypeKey(String key) {
return RESTRICTED_TYPES_KEY.equals(key);
}
}
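The type-restriction additions to `Query` accumulate classes in a set and serialize them under the reserved `_$RESTRICTED_TYPES` marker key that `isRestrictedTypeKey` tests for. A plain-map sketch of that bookkeeping (a stand-in class, not the real `Query`; `Map` replaces `DBObject`):

```java
import java.util.LinkedHashMap;
import java.util.LinkedHashSet;
import java.util.Map;
import java.util.Set;

// Sketch of Query's restricted-type handling: restrict(...) collects types,
// and getQueryObject() emits them under the reserved marker key only when
// any restriction was registered.
public class TypeRestrictionSketch {

    static final String RESTRICTED_TYPES_KEY = "_$RESTRICTED_TYPES";

    private final Set<Class<?>> restrictedTypes = new LinkedHashSet<>();

    public TypeRestrictionSketch restrict(Class<?> type, Class<?>... additionalTypes) {
        restrictedTypes.add(type);
        for (Class<?> additional : additionalTypes) {
            restrictedTypes.add(additional);
        }
        return this;
    }

    public Map<String, Object> getQueryObject() {
        Map<String, Object> dbo = new LinkedHashMap<>();
        if (!restrictedTypes.isEmpty()) {
            dbo.put(RESTRICTED_TYPES_KEY, restrictedTypes);
        }
        return dbo;
    }
}
```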


@@ -217,22 +217,23 @@ public class Update {
return dbo;
}
@SuppressWarnings("unchecked")
protected void addMultiFieldOperation(String operator, String key, Object value) {
Object existingValue = this.modifierOps.get(operator);
LinkedHashMap<String, Object> keyValueMap;
DBObject keyValueMap;
if (existingValue == null) {
keyValueMap = new LinkedHashMap<String, Object>();
keyValueMap = new BasicDBObject();
this.modifierOps.put(operator, keyValueMap);
} else {
if (existingValue instanceof LinkedHashMap) {
keyValueMap = (LinkedHashMap<String, Object>) existingValue;
if (existingValue instanceof BasicDBObject) {
keyValueMap = (BasicDBObject) existingValue;
} else {
throw new InvalidDataAccessApiUsageException("Modifier Operations should be a LinkedHashMap but was "
+ existingValue.getClass());
}
}
keyValueMap.put(key, value);
}
}
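Regardless of whether the per-operator container is a `LinkedHashMap` or, after this change, a `BasicDBObject`, the behavior of `addMultiFieldOperation` is the same: successive calls for one operator merge into a single document. A plain-map sketch of that merge (hypothetical class, no Mongo driver):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of addMultiFieldOperation: two set-like calls for the same operator
// end up in one entry, e.g. {"$set": {"a": 1, "b": 2}} rather than two $set
// documents.
public class MultiFieldOps {

    private final Map<String, Map<String, Object>> modifierOps = new LinkedHashMap<>();

    public void add(String operator, String key, Object value) {
        modifierOps.computeIfAbsent(operator, op -> new LinkedHashMap<>()).put(key, value);
    }

    public Map<String, Map<String, Object>> getOps() {
        return modifierOps;
    }
}
```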


@@ -0,0 +1,5 @@
/**
* Support for MongoDB GridFS feature.
*/
package org.springframework.data.mongodb.gridfs;


@@ -1,5 +1,5 @@
/*
* Copyright 2002-2011 the original author or authors.
* Copyright 2002-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,19 +15,20 @@
*/
package org.springframework.data.mongodb.monitor;
import java.net.InetAddress;
import java.net.UnknownHostException;
import com.mongodb.Mongo;
import org.springframework.jmx.export.annotation.ManagedMetric;
import org.springframework.jmx.export.annotation.ManagedOperation;
import org.springframework.jmx.export.annotation.ManagedResource;
import org.springframework.jmx.support.MetricType;
import com.mongodb.Mongo;
/**
* Expose basic server information via JMX
*
* @author Mark Pollack
* @author Thomas Darimont
*/
@ManagedResource(description = "Server Information")
public class ServerInfo extends AbstractMonitor {
@@ -36,9 +37,20 @@ public class ServerInfo extends AbstractMonitor {
this.mongo = mongo;
}
/**
* Returns the hostname of the server in use, as reported by mongo.
*
* @return the reported hostname, which may also be an IP address.
* @throws UnknownHostException
*/
@ManagedOperation(description = "Server host name")
public String getHostName() throws UnknownHostException {
return InetAddress.getLocalHost().getHostName();
/*
* UnknownHostException is not necessary anymore, but clients could have
* called this method in a try..catch(UnknownHostException) already
*/
return getServerStatus().getServerUsed().getHost();
}
@ManagedMetric(displayName = "Uptime Estimate")


@@ -0,0 +1,5 @@
/**
* Spring Data's MongoDB abstraction.
*/
package org.springframework.data.mongodb;


@@ -0,0 +1,5 @@
/**
* CDI support for MongoDB specific repository implementation.
*/
package org.springframework.data.mongodb.repository.cdi;


@@ -0,0 +1,5 @@
/**
* Support infrastructure for the configuration of MongoDB specific repositories.
*/
package org.springframework.data.mongodb.repository.config;


@@ -38,6 +38,7 @@ import org.springframework.util.Assert;
* Base class for {@link RepositoryQuery} implementations for Mongo.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public abstract class AbstractMongoQuery implements RepositoryQuery {
@@ -287,6 +288,11 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
nearQuery.maxDistance(maxDistance).in(maxDistance.getMetric());
}
Pageable pageable = accessor.getPageable();
if (pageable != null) {
nearQuery.with(pageable);
}
MongoEntityMetadata<?> metadata = method.getEntityInformation();
return (GeoResults<Object>) operations.geoNear(nearQuery, metadata.getJavaType(), metadata.getCollectionName());
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -23,6 +23,7 @@ import org.springframework.core.MethodParameter;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.repository.Near;
import org.springframework.data.mongodb.repository.query.MongoParameters.MongoParameter;
import org.springframework.data.repository.query.Parameter;
import org.springframework.data.repository.query.Parameters;
@@ -31,7 +32,7 @@ import org.springframework.data.repository.query.Parameters;
*
* @author Oliver Gierke
*/
public class MongoParameters extends Parameters {
public class MongoParameters extends Parameters<MongoParameters, MongoParameter> {
private final Integer distanceIndex;
private Integer nearIndex;
@@ -55,6 +56,14 @@ public class MongoParameters extends Parameters {
}
}
private MongoParameters(List<MongoParameter> parameters, Integer distanceIndex, Integer nearIndex) {
super(parameters);
this.distanceIndex = distanceIndex;
this.nearIndex = nearIndex;
}
@SuppressWarnings("unchecked")
private final int getNearIndex(List<Class<?>> parameterTypes) {
@@ -81,7 +90,7 @@ public class MongoParameters extends Parameters {
* @see org.springframework.data.repository.query.Parameters#createParameter(org.springframework.core.MethodParameter)
*/
@Override
protected Parameter createParameter(MethodParameter parameter) {
protected MongoParameter createParameter(MethodParameter parameter) {
MongoParameter mongoParameter = new MongoParameter(parameter);
@@ -114,6 +123,15 @@ public class MongoParameters extends Parameters {
return nearIndex;
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.Parameters#createFrom(java.util.List)
*/
@Override
protected MongoParameters createFrom(List<MongoParameter> parameters) {
return new MongoParameters(parameters, this.distanceIndex, this.nearIndex);
}
/**
* Custom {@link Parameter} implementation adding parameters of type {@link Distance} to the special ones.
*


@@ -269,10 +269,10 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
switch (type) {
case STARTING_WITH:
source = source + "*";
source = "^" + source;
break;
case ENDING_WITH:
source = "*" + source;
source = source + "$";
break;
case CONTAINING:
source = "*" + source + "*";
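The hunk above replaces glob-style `*` wildcards with proper regex anchors: in a regular expression a trailing `*` repeats the previous character rather than matching anything, so `^` and `$` are the correct way to express starts-with and ends-with. The effect can be checked with `java.util.regex`, whose anchor semantics match MongoDB's `$regex` for this case (helper names are illustrative):

```java
import java.util.regex.Pattern;

// Anchored regexes as produced by the fixed MongoQueryCreator cases.
public class RegexAnchors {

    public static String startingWith(String source) {
        return "^" + source;   // STARTING_WITH: anchored at the beginning
    }

    public static String endingWith(String source) {
        return source + "$";   // ENDING_WITH: anchored at the end
    }

    public static boolean findsIn(String regex, String candidate) {
        return Pattern.compile(regex).matcher(candidate).find();
    }
}
```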


@@ -28,7 +28,6 @@ import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.repository.core.RepositoryMetadata;
import org.springframework.data.repository.query.Parameters;
import org.springframework.data.repository.query.QueryMethod;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;
@@ -42,9 +41,8 @@ import org.springframework.util.StringUtils;
*/
public class MongoQueryMethod extends QueryMethod {
@SuppressWarnings("unchecked")
private static final List<Class<?>> GEO_NEAR_RESULTS = Arrays
.asList(GeoResult.class, GeoResults.class, GeoPage.class);
@SuppressWarnings("unchecked") private static final List<Class<?>> GEO_NEAR_RESULTS = Arrays.asList(GeoResult.class,
GeoResults.class, GeoPage.class);
private final Method method;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
@@ -72,7 +70,7 @@ public class MongoQueryMethod extends QueryMethod {
* @see org.springframework.data.repository.query.QueryMethod#getParameters(java.lang.reflect.Method)
*/
@Override
protected Parameters createParameters(Method method) {
protected MongoParameters createParameters(Method method) {
return new MongoParameters(method, isGeoNearQuery(method));
}


@@ -0,0 +1,5 @@
/**
* Query derivation mechanism for MongoDB specific repositories.
*/
package org.springframework.data.mongodb.repository.query;


@@ -0,0 +1,5 @@
/**
* Support infrastructure for query derivation of MongoDB specific repositories.
*/
package org.springframework.data.mongodb.repository.support;


@@ -181,10 +181,10 @@ The base package in which to scan for entities annotated with @Document
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="mongo-ref" type="mongoRef" use="optional">
<xsd:attribute name="type-mapper-ref" type="typeMapperRef" use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a Mongo. Will default to 'mongo'.
The reference to a MongoTypeMapper to be used by this MappingMongoConverter.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
@@ -195,13 +195,6 @@ The base package in which to scan for entities annotated with @Document
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="mongo-template-ref" type="mongoTemplateRef" use="optional">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.core.MongoTemplate">
The reference to a MongoTemplate. Will default to 'mongoTemplate'.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="disable-validation" use="optional">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.mapping.event.ValidatingMongoEventListener">
@@ -257,6 +250,17 @@ The name of the Mongo object that determines what server to monitor. (by default
</xsd:complexType>
</xsd:element>
<xsd:simpleType name="typeMapperRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.convert.MongoTypeMapper"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="mappingContextRef">
<xsd:annotation>
<xsd:appinfo>

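The new `type-mapper-ref` attribute introduced above (DATAMONGO-725) is wired up in namespace configuration roughly as follows. This is a sketch only: the bean id `customTypeMapper` is illustrative, and `DefaultMongoTypeMapper` stands in for whatever `MongoTypeMapper` implementation you register.

```xml
<!-- Sketch: wiring a custom MongoTypeMapper via the new type-mapper-ref
     attribute; the bean id "customTypeMapper" is illustrative. -->
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:mongo="http://www.springframework.org/schema/data/mongo"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="
         http://www.springframework.org/schema/beans
         http://www.springframework.org/schema/beans/spring-beans.xsd
         http://www.springframework.org/schema/data/mongo
         http://www.springframework.org/schema/data/mongo/spring-mongo-1.3.xsd">

  <!-- Any bean assignable to o.s.d.mongodb.core.convert.MongoTypeMapper -->
  <bean id="customTypeMapper"
        class="org.springframework.data.mongodb.core.convert.DefaultMongoTypeMapper"/>

  <!-- The converter picks up the custom type mapper via the new attribute -->
  <mongo:mapping-converter id="converter" type-mapper-ref="customTypeMapper"/>
</beans>
```

The `tool:assignable-to` declaration in the schema above lets IDEs validate that the referenced bean actually implements `MongoTypeMapper`.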

@@ -1,5 +1,5 @@
/*
* Copyright 2012 the original author or authors.
* Copyright 2012-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -23,10 +23,16 @@ import org.junit.Test;
import org.junit.rules.ExpectedException;
import org.springframework.beans.factory.NoSuchBeanDefinitionException;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoTypeMapper;
import org.springframework.data.mongodb.core.mapping.BasicMongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.expression.spel.support.StandardEvaluationContext;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.Mongo;
@@ -34,11 +40,11 @@ import com.mongodb.Mongo;
* Unit tests for {@link AbstractMongoConfiguration}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class AbstractMongoConfigurationUnitTests {
@Rule
public ExpectedException exception = ExpectedException.none();
@Rule public ExpectedException exception = ExpectedException.none();
/**
* @see DATAMONGO-496
@@ -97,6 +103,34 @@ public class AbstractMongoConfigurationUnitTests {
assertThat(context.getPersistentEntities(), is(not(emptyIterable())));
}
/**
* @see DATAMONGO-717
*/
@Test
public void lifecycleCallbacksAreInvokedInAppropriateOrder() {
AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext(SampleMongoConfiguration.class);
MongoMappingContext mappingContext = context.getBean(MongoMappingContext.class);
BasicMongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(Entity.class);
StandardEvaluationContext spElContext = (StandardEvaluationContext) ReflectionTestUtils.getField(entity, "context");
assertThat(spElContext.getBeanResolver(), is(notNullValue()));
}
/**
* @see DATAMONGO-725
*/
@Test
public void shouldBeAbleToConfigureCustomTypeMapperViaJavaConfig() {
AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext(SampleMongoConfiguration.class);
MongoTypeMapper typeMapper = context.getBean(CustomMongoTypeMapper.class);
MappingMongoConverter mmc = context.getBean(MappingMongoConverter.class);
assertThat(mmc, is(notNullValue()));
assertThat(mmc.getTypeMapper(), is(typeMapper));
}
private static void assertScanningDisabled(final String value) throws ClassNotFoundException {
AbstractMongoConfiguration configuration = new SampleMongoConfiguration() {
@@ -122,6 +156,19 @@ public class AbstractMongoConfigurationUnitTests {
public Mongo mongo() throws Exception {
return new Mongo();
}
@Bean
@Override
public MappingMongoConverter mappingMongoConverter() throws Exception {
MappingMongoConverter mmc = super.mappingMongoConverter();
mmc.setTypeMapper(typeMapper());
return mmc;
}
@Bean
public MongoTypeMapper typeMapper() {
return new CustomMongoTypeMapper();
}
}
@Document


@@ -0,0 +1,23 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.data.mongodb.core.convert.DefaultMongoTypeMapper;
/**
* @author Thomas Darimont
*/
class CustomMongoTypeMapper extends DefaultMongoTypeMapper {}


@@ -31,6 +31,8 @@ import org.springframework.core.convert.converter.Converter;
import org.springframework.core.convert.converter.GenericConverter;
import org.springframework.core.io.ClassPathResource;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoTypeMapper;
import org.springframework.data.mongodb.core.mapping.Account;
import org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.repository.Person;
@@ -42,6 +44,7 @@ import com.mongodb.DBObject;
* Integration tests for {@link MappingMongoConverterParser}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class MappingMongoConverterParserIntegrationTests {
@@ -61,6 +64,15 @@ public class MappingMongoConverterParserIntegrationTests {
factory.getBean("converter");
}
@Test
public void hasCustomTypeMapper() {
MappingMongoConverter converter = factory.getBean("converter", MappingMongoConverter.class);
MongoTypeMapper customMongoTypeMapper = factory.getBean(CustomMongoTypeMapper.class);
assertThat(converter.getTypeMapper(), is(customMongoTypeMapper));
}
@Test
public void scansForConverterAndSetsUpCustomConversionsAccordingly() {


@@ -1,5 +1,5 @@
/*
* Copyright 2012 the original author or authors.
* Copyright 2012-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -33,6 +33,7 @@ import org.springframework.core.io.ClassPathResource;
*
* @see DATAMONGO-36
* @author Maciej Walkowiak
* @author Thomas Darimont
*/
public class MappingMongoConverterParserValidationIntegrationTests {
@@ -65,4 +66,11 @@ public class MappingMongoConverterParserValidationIntegrationTests {
reader.loadBeanDefinitions(new ClassPathResource("namespace/converter-validation-disabled.xml"));
factory.getBean(BeanNames.VALIDATING_EVENT_LISTENER);
}
@Test
public void validatingEventListenerCreatedWithCustomTypeMapperConfig() {
reader.loadBeanDefinitions(new ClassPathResource("namespace/converter-custom-typeMapper.xml"));
assertThat(factory.getBean(BeanNames.VALIDATING_EVENT_LISTENER), is(not(nullValue())));
}
}


@@ -1,5 +1,5 @@
/*
* Copyright 2012 the original author or authors.
* Copyright 2012-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -31,6 +31,7 @@ import com.mongodb.ServerAddress;
* Unit tests for {@link ServerAddressPropertyEditor}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class ServerAddressPropertyEditorUnitTests {
@@ -70,6 +71,16 @@ public class ServerAddressPropertyEditorUnitTests {
assertSingleAddressOfLocalhost(editor.getValue());
}
/**
* @see DATAMONGO-693
*/
@Test
public void interpretEmptyStringAsNull() {
editor.setAsText("");
assertNull(editor.getValue());
}
private static void assertSingleAddressOfLocalhost(Object result) throws UnknownHostException {
assertThat(result, is(instanceOf(ServerAddress[].class)));


@@ -27,7 +27,7 @@ import com.mongodb.WriteConcern;
*
* @author Oliver Gierke
*/
public class StringToWriteConcernConverterUnitTest {
public class StringToWriteConcernConverterUnitTests {
StringToWriteConcernConverter converter = new StringToWriteConcernConverter();


@@ -0,0 +1,8 @@
package org.springframework.data.mongodb.core;
import org.springframework.data.annotation.Id;
public class BaseDoc {
@Id String id;
String value;
}


@@ -1,5 +1,5 @@
/*
* Copyright 2012 the original author or authors.
* Copyright 2012-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,6 +20,8 @@ import static org.junit.Assert.*;
import static org.mockito.Matchers.*;
import static org.mockito.Mockito.*;
import java.util.List;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
@@ -28,7 +30,9 @@ import org.mockito.Mock;
import org.mockito.invocation.InvocationOnMock;
import org.mockito.runners.MockitoJUnitRunner;
import org.mockito.stubbing.Answer;
import org.springframework.transaction.support.TransactionSynchronization;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import org.springframework.transaction.support.TransactionSynchronizationUtils;
import com.mongodb.DB;
import com.mongodb.Mongo;
@@ -37,12 +41,12 @@ import com.mongodb.Mongo;
* Unit tests for {@link MongoDbUtils}.
*
* @author Oliver Gierke
* @author Randy Watler
*/
@RunWith(MockitoJUnitRunner.class)
public class MongoDbUtilsUnitTests {
@Mock
Mongo mongo;
@Mock Mongo mongo;
@Before
public void setUp() throws Exception {
@@ -81,4 +85,94 @@ public class MongoDbUtilsUnitTests {
assertThat(first, is(notNullValue()));
assertThat(MongoDbUtils.getDB(mongo, "first"), is(sameInstance(first)));
}
/**
* @see DATAMONGO-737
*/
@Test
public void handlesTransactionSynchronizationLifecycle() {
// ensure transaction synchronization manager has no registered
// transaction synchronizations or bound resources at start of test
assertThat(TransactionSynchronizationManager.getSynchronizations().isEmpty(), is(true));
assertThat(TransactionSynchronizationManager.getResourceMap().isEmpty(), is(true));
// access database for one mongo instance, (registers transaction
// synchronization and binds transaction resource)
MongoDbUtils.getDB(mongo, "first");
// ensure transaction synchronization manager has registered
// transaction synchronizations and bound resources
assertThat(TransactionSynchronizationManager.getSynchronizations().isEmpty(), is(false));
assertThat(TransactionSynchronizationManager.getResourceMap().isEmpty(), is(false));
// simulate transaction completion, (unbinds transaction resource)
try {
simulateTransactionCompletion();
} catch (Exception e) {
fail("Unexpected exception thrown during transaction completion: " + e);
}
// ensure transaction synchronization manager has no bound resources
// at end of test
assertThat(TransactionSynchronizationManager.getResourceMap().isEmpty(), is(true));
}
/**
* @see DATAMONGO-737
*/
@Test
public void handlesTransactionSynchronizationsLifecycle() {
// ensure transaction synchronization manager has no registered
// transaction synchronizations or bound resources at start of test
assertThat(TransactionSynchronizationManager.getSynchronizations().isEmpty(), is(true));
assertThat(TransactionSynchronizationManager.getResourceMap().isEmpty(), is(true));
// access multiple databases for one mongo instance, (registers
// transaction synchronizations and binds transaction resources)
MongoDbUtils.getDB(mongo, "first");
MongoDbUtils.getDB(mongo, "second");
// ensure transaction synchronization manager has registered
// transaction synchronizations and bound resources
assertThat(TransactionSynchronizationManager.getSynchronizations().isEmpty(), is(false));
assertThat(TransactionSynchronizationManager.getResourceMap().isEmpty(), is(false));
// simulate transaction completion, (unbinds transaction resources)
try {
simulateTransactionCompletion();
} catch (Exception e) {
fail("Unexpected exception thrown during transaction completion: " + e);
}
// ensure transaction synchronization manager has no bound
// transaction resources at end of test
assertThat(TransactionSynchronizationManager.getResourceMap().isEmpty(), is(true));
}
/**
* Simulate transaction rollback/commit completion protocol on managed transaction synchronizations which will unbind
* managed transaction resources. Does not swallow exceptions for testing purposes.
*
* @see TransactionSynchronizationUtils#triggerBeforeCompletion()
* @see TransactionSynchronizationUtils#triggerAfterCompletion(int)
*/
private void simulateTransactionCompletion() {
// triggerBeforeCompletion() implementation without swallowed exceptions
List<TransactionSynchronization> synchronizations = TransactionSynchronizationManager.getSynchronizations();
for (TransactionSynchronization synchronization : synchronizations) {
synchronization.beforeCompletion();
}
// triggerAfterCompletion() implementation without swallowed exceptions
List<TransactionSynchronization> remainingSynchronizations = TransactionSynchronizationManager
.getSynchronizations();
if (remainingSynchronizations != null) {
for (TransactionSynchronization remainingSynchronization : remainingSynchronizations) {
remainingSynchronization.afterCompletion(TransactionSynchronization.STATUS_ROLLED_BACK);
}
}
}
}


@@ -1,5 +1,5 @@
/*
* Copyright 2012 the original author or authors.
* Copyright 2012-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,17 +21,21 @@ import static org.junit.Assert.*;
import org.junit.Test;
import org.springframework.beans.factory.support.DefaultListableBeanFactory;
import org.springframework.beans.factory.support.RootBeanDefinition;
import org.springframework.data.mongodb.config.ServerAddressPropertyEditor;
import org.springframework.data.mongodb.config.WriteConcernPropertyEditor;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.Mongo;
import com.mongodb.ServerAddress;
import com.mongodb.WriteConcern;
/**
* Integration tests for {@link MongoFactoryBean}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class MongoFactoryBeanIntegrationTest {
public class MongoFactoryBeanIntegrationTests {
/**
* @see DATAMONGO-408
@@ -49,4 +53,22 @@ public class MongoFactoryBeanIntegrationTest {
MongoFactoryBean bean = factory.getBean("&factory", MongoFactoryBean.class);
assertThat(ReflectionTestUtils.getField(bean, "writeConcern"), is((Object) WriteConcern.SAFE));
}
/**
* @see DATAMONGO-693
*/
@Test
public void createMongoInstanceWithHostAndEmptyReplicaSets() {
RootBeanDefinition definition = new RootBeanDefinition(MongoFactoryBean.class);
definition.getPropertyValues().addPropertyValue("host", "localhost");
definition.getPropertyValues().addPropertyValue("replicaPair", "");
DefaultListableBeanFactory factory = new DefaultListableBeanFactory();
factory.registerCustomEditor(ServerAddress.class, ServerAddressPropertyEditor.class);
factory.registerBeanDefinition("factory", definition);
Mongo mongo = factory.getBean(Mongo.class);
assertNotNull(mongo);
}
}


@@ -30,6 +30,7 @@ import org.springframework.dao.DataAccessException;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.convert.AbstractMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.MongoTypeMapper;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
@@ -45,14 +46,13 @@ import com.mongodb.DBRef;
* instances of their implementation and thus can see if it correctly implements the {@link MongoOperations} interface.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
@RunWith(MockitoJUnitRunner.class)
public abstract class MongoOperationsUnitTests {
@Mock
CollectionCallback<Object> collectionCallback;
@Mock
DbCallback<Object> dbCallback;
@Mock CollectionCallback<Object> collectionCallback;
@Mock DbCallback<Object> dbCallback;
MongoConverter converter;
Person person;
@@ -86,6 +86,11 @@ public abstract class MongoOperationsUnitTests {
public DBRef toDBRef(Object object, MongoPersistentProperty referingProperty) {
return null;
}
@Override
public MongoTypeMapper getTypeMapper() {
return null;
}
};
}


@@ -22,6 +22,7 @@ import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import static org.springframework.data.mongodb.core.query.Update.*;
import java.math.BigInteger;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Date;
@@ -61,6 +62,7 @@ import org.springframework.data.mongodb.core.index.Index;
import org.springframework.data.mongodb.core.index.Index.Duplicates;
import org.springframework.data.mongodb.core.index.IndexField;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Criteria;
@@ -87,20 +89,18 @@ import com.mongodb.WriteResult;
* @author Thomas Risberg
* @author Amol Nayak
* @author Patryk Wasik
* @author Thomas Darimont
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:infrastructure.xml")
public class MongoTemplateTests {
@Autowired
MongoTemplate template;
@Autowired
MongoDbFactory factory;
@Autowired MongoTemplate template;
@Autowired MongoDbFactory factory;
MongoTemplate mappingTemplate;
@Rule
public ExpectedException thrown = ExpectedException.none();
@Rule public ExpectedException thrown = ExpectedException.none();
@Autowired
@SuppressWarnings("unchecked")
@@ -113,8 +113,8 @@ public class MongoTemplateTests {
mappingContext.setInitialEntitySet(new HashSet<Class<?>>(Arrays.asList(PersonWith_idPropertyOfTypeObjectId.class,
PersonWith_idPropertyOfTypeString.class, PersonWithIdPropertyOfTypeObjectId.class,
PersonWithIdPropertyOfTypeString.class, PersonWithIdPropertyOfTypeInteger.class,
PersonWithIdPropertyOfPrimitiveInt.class, PersonWithIdPropertyOfTypeLong.class,
PersonWithIdPropertyOfPrimitiveLong.class)));
PersonWithIdPropertyOfTypeBigInteger.class, PersonWithIdPropertyOfPrimitiveInt.class,
PersonWithIdPropertyOfTypeLong.class, PersonWithIdPropertyOfPrimitiveLong.class)));
mappingContext.setSimpleTypeHolder(conversions.getSimpleTypeHolder());
mappingContext.initialize();
@@ -143,6 +143,7 @@ public class MongoTemplateTests {
template.dropCollection(PersonWithIdPropertyOfTypeObjectId.class);
template.dropCollection(PersonWithIdPropertyOfTypeString.class);
template.dropCollection(PersonWithIdPropertyOfTypeInteger.class);
template.dropCollection(PersonWithIdPropertyOfTypeBigInteger.class);
template.dropCollection(PersonWithIdPropertyOfPrimitiveInt.class);
template.dropCollection(PersonWithIdPropertyOfTypeLong.class);
template.dropCollection(PersonWithIdPropertyOfPrimitiveLong.class);
@@ -150,8 +151,14 @@ public class MongoTemplateTests {
template.dropCollection(TestClass.class);
template.dropCollection(Sample.class);
template.dropCollection(MyPerson.class);
template.dropCollection(TypeWithFieldAnnotation.class);
template.dropCollection(TypeWithDate.class);
template.dropCollection("collection");
template.dropCollection("personX");
template.dropCollection(Document.class);
template.dropCollection(ObjectWith3AliasedFields.class);
template.dropCollection(ObjectWith3AliasedFieldsAndNestedAddress.class);
template.dropCollection(BaseDoc.class);
}
@Test
@@ -473,6 +480,25 @@ public class MongoTemplateTests {
assertThat(p9q.getId(), is(p9.getId()));
checkCollectionContents(PersonWithIdPropertyOfTypeInteger.class, 1);
/*
* @see DATAMONGO-602
*/
// BigInteger id - provided
PersonWithIdPropertyOfTypeBigInteger p9bi = new PersonWithIdPropertyOfTypeBigInteger();
p9bi.setFirstName("Sven_9bi");
p9bi.setAge(22);
p9bi.setId(BigInteger.valueOf(12345));
// insert
mongoTemplate.insert(p9bi);
// also try save
mongoTemplate.save(p9bi);
assertThat(p9bi.getId(), notNullValue());
PersonWithIdPropertyOfTypeBigInteger p9qbi = mongoTemplate.findOne(new Query(where("id").in(p9bi.getId())),
PersonWithIdPropertyOfTypeBigInteger.class);
assertThat(p9qbi, notNullValue());
assertThat(p9qbi.getId(), is(p9bi.getId()));
checkCollectionContents(PersonWithIdPropertyOfTypeBigInteger.class, 1);
// int id - provided
PersonWithIdPropertyOfPrimitiveInt p10 = new PersonWithIdPropertyOfPrimitiveInt();
p10.setFirstName("Sven_10");
@@ -697,6 +723,47 @@ public class MongoTemplateTests {
assertThat(results3.size(), is(2));
}
/**
* @see DATAMONGO-602
*/
@Test
public void testUsingAnInQueryWithBigIntegerId() throws Exception {
template.remove(new Query(), PersonWithIdPropertyOfTypeBigInteger.class);
PersonWithIdPropertyOfTypeBigInteger p1 = new PersonWithIdPropertyOfTypeBigInteger();
p1.setFirstName("Sven");
p1.setAge(11);
p1.setId(new BigInteger("2666666666666666665069473312490162649510603601"));
template.insert(p1);
PersonWithIdPropertyOfTypeBigInteger p2 = new PersonWithIdPropertyOfTypeBigInteger();
p2.setFirstName("Mary");
p2.setAge(21);
p2.setId(new BigInteger("2666666666666666665069473312490162649510603602"));
template.insert(p2);
PersonWithIdPropertyOfTypeBigInteger p3 = new PersonWithIdPropertyOfTypeBigInteger();
p3.setFirstName("Ann");
p3.setAge(31);
p3.setId(new BigInteger("2666666666666666665069473312490162649510603603"));
template.insert(p3);
PersonWithIdPropertyOfTypeBigInteger p4 = new PersonWithIdPropertyOfTypeBigInteger();
p4.setFirstName("John");
p4.setAge(41);
p4.setId(new BigInteger("2666666666666666665069473312490162649510603604"));
template.insert(p4);
Query q1 = new Query(Criteria.where("age").in(11, 21, 41));
List<PersonWithIdPropertyOfTypeBigInteger> results1 = template.find(q1, PersonWithIdPropertyOfTypeBigInteger.class);
Query q2 = new Query(Criteria.where("firstName").in("Ann", "Mary"));
List<PersonWithIdPropertyOfTypeBigInteger> results2 = template.find(q2, PersonWithIdPropertyOfTypeBigInteger.class);
Query q3 = new Query(Criteria.where("id").in(new BigInteger("2666666666666666665069473312490162649510603601"),
new BigInteger("2666666666666666665069473312490162649510603604")));
List<PersonWithIdPropertyOfTypeBigInteger> results3 = template.find(q3, PersonWithIdPropertyOfTypeBigInteger.class);
assertThat(results1.size(), is(3));
assertThat(results2.size(), is(2));
assertThat(results3.size(), is(2));
}
@Test
public void testUsingAnInQueryWithPrimitiveIntId() throws Exception {
@@ -772,8 +839,7 @@ public class MongoTemplateTests {
Query q3 = new Query(Criteria.where("age").in(l1, l2));
template.find(q3, PersonWithIdPropertyOfTypeObjectId.class);
Assert.fail("Should have thrown an InvalidDocumentStoreApiUsageException");
} catch (InvalidMongoDbApiUsageException e) {
}
} catch (InvalidMongoDbApiUsageException e) {}
}
@Test
@@ -1625,6 +1691,339 @@ public class MongoTemplateTests {
assertThat(template.exists(query, Sample.class, template.getCollectionName(Sample.class)), is(true));
}
/**
* @see DATAMONGO-675
*/
@Test
public void updateConsidersMappingAnnotations() {
TypeWithFieldAnnotation entity = new TypeWithFieldAnnotation();
entity.emailAddress = "old";
template.save(entity);
Query query = query(where("_id").is(entity.id));
Update update = Update.update("emailAddress", "new");
FindAndModifyOptions options = new FindAndModifyOptions().returnNew(true);
TypeWithFieldAnnotation result = template.findAndModify(query, update, options, TypeWithFieldAnnotation.class);
assertThat(result.emailAddress, is("new"));
}
/**
* @see DATAMONGO-671
*/
@Test
public void findsEntityByDateReference() {
TypeWithDate entity = new TypeWithDate();
entity.date = new Date(System.currentTimeMillis() - 10);
template.save(entity);
Query query = query(where("date").lt(new Date()));
List<TypeWithDate> result = template.find(query, TypeWithDate.class);
assertThat(result, hasSize(1));
assertThat(result.get(0).date, is(notNullValue()));
}
/**
* @see DATAMONGO-540
*/
@Test
public void findOneAfterUpsertForNonExistingObjectReturnsTheInsertedObject() {
String idValue = "4711";
Query query = new Query(Criteria.where("id").is(idValue));
String fieldValue = "bubu";
Update update = Update.update("field", fieldValue);
template.upsert(query, update, Sample.class);
Sample result = template.findOne(query, Sample.class);
assertThat(result, is(notNullValue()));
assertThat(result.field, is(fieldValue));
assertThat(result.id, is(idValue));
}
/**
* @see DATAMONGO-392
*/
@Test
public void updatesShouldRetainTypeInformation() {
Document doc = new Document();
doc.id = "4711";
doc.model = new ModelA().withValue("foo");
template.insert(doc);
Query query = new Query(Criteria.where("id").is(doc.id));
String newModelValue = "bar";
Update update = Update.update("model", new ModelA().withValue(newModelValue));
template.updateFirst(query, update, Document.class);
Document result = template.findOne(query, Document.class);
assertThat(result, is(notNullValue()));
assertThat(result.id, is(doc.id));
assertThat(result.model, is(notNullValue()));
assertThat(result.model.value(), is(newModelValue));
}
/**
* @see DATAMONGO-702
*/
@Test
public void queryShouldSupportRealAndAliasedPropertyNamesForFieldInclusions() {
ObjectWith3AliasedFields obj = new ObjectWith3AliasedFields();
obj.id = "4711";
obj.property1 = "P1";
obj.property2 = "P2";
obj.property3 = "P3";
template.insert(obj);
Query query = new Query(Criteria.where("id").is(obj.id));
query.fields() //
.include("property2") // real property name
.include("prop3"); // aliased property name
ObjectWith3AliasedFields result = template.findOne(query, ObjectWith3AliasedFields.class);
assertThat(result.id, is(obj.id));
assertThat(result.property1, is(nullValue()));
assertThat(result.property2, is(obj.property2));
assertThat(result.property3, is(obj.property3));
}
/**
* @see DATAMONGO-702
*/
@Test
public void queryShouldSupportRealAndAliasedPropertyNamesForFieldExclusions() {
ObjectWith3AliasedFields obj = new ObjectWith3AliasedFields();
obj.id = "4711";
obj.property1 = "P1";
obj.property2 = "P2";
obj.property3 = "P3";
template.insert(obj);
Query query = new Query(Criteria.where("id").is(obj.id));
query.fields() //
.exclude("property2") // real property name
.exclude("prop3"); // aliased property name
ObjectWith3AliasedFields result = template.findOne(query, ObjectWith3AliasedFields.class);
assertThat(result.id, is(obj.id));
assertThat(result.property1, is(obj.property1));
assertThat(result.property2, is(nullValue()));
assertThat(result.property3, is(nullValue()));
}
/**
* @see DATAMONGO-702
*/
@Test
public void findMultipleWithQueryShouldSupportRealAndAliasedPropertyNamesForFieldExclusions() {
ObjectWith3AliasedFields obj0 = new ObjectWith3AliasedFields();
obj0.id = "4711";
obj0.property1 = "P10";
obj0.property2 = "P20";
obj0.property3 = "P30";
ObjectWith3AliasedFields obj1 = new ObjectWith3AliasedFields();
obj1.id = "4712";
obj1.property1 = "P11";
obj1.property2 = "P21";
obj1.property3 = "P31";
template.insert(obj0);
template.insert(obj1);
Query query = new Query(Criteria.where("id").in(obj0.id, obj1.id));
query.fields() //
.exclude("property2") // real property name
.exclude("prop3"); // aliased property name
List<ObjectWith3AliasedFields> results = template.find(query, ObjectWith3AliasedFields.class);
assertThat(results, is(notNullValue()));
assertThat(results.size(), is(2));
ObjectWith3AliasedFields result0 = results.get(0);
assertThat(result0, is(notNullValue()));
assertThat(result0.id, is(obj0.id));
assertThat(result0.property1, is(obj0.property1));
assertThat(result0.property2, is(nullValue()));
assertThat(result0.property3, is(nullValue()));
ObjectWith3AliasedFields result1 = results.get(1);
assertThat(result1, is(notNullValue()));
assertThat(result1.id, is(obj1.id));
assertThat(result1.property1, is(obj1.property1));
assertThat(result1.property2, is(nullValue()));
assertThat(result1.property3, is(nullValue()));
}
/**
* @see DATAMONGO-702
*/
@Test
public void queryShouldSupportNestedPropertyNamesForFieldInclusions() {
ObjectWith3AliasedFieldsAndNestedAddress obj = new ObjectWith3AliasedFieldsAndNestedAddress();
obj.id = "4711";
obj.property1 = "P1";
obj.property2 = "P2";
obj.property3 = "P3";
Address address = new Address();
String stateValue = "WA";
address.state = stateValue;
address.city = "Washington";
obj.address = address;
template.insert(obj);
Query query = new Query(Criteria.where("id").is(obj.id));
query.fields() //
.include("property2") // real property name
.include("address.state"); // aliased property name
ObjectWith3AliasedFieldsAndNestedAddress result = template.findOne(query,
ObjectWith3AliasedFieldsAndNestedAddress.class);
assertThat(result.id, is(obj.id));
assertThat(result.property1, is(nullValue()));
assertThat(result.property2, is(obj.property2));
assertThat(result.property3, is(nullValue()));
assertThat(result.address, is(notNullValue()));
assertThat(result.address.city, is(nullValue()));
assertThat(result.address.state, is(stateValue));
}
/**
* @see DATAMONGO-709
*/
@Test
public void aQueryRestrictedWithOneRestrictedResultTypeShouldReturnOnlyInstancesOfTheRestrictedType() {
BaseDoc doc0 = new BaseDoc();
doc0.value = "foo";
SpecialDoc doc1 = new SpecialDoc();
doc1.value = "foo";
doc1.specialValue = "specialfoo";
VerySpecialDoc doc2 = new VerySpecialDoc();
doc2.value = "foo";
doc2.specialValue = "specialfoo";
doc2.verySpecialValue = 4711;
String collectionName = template.getCollectionName(BaseDoc.class);
template.insert(doc0, collectionName);
template.insert(doc1, collectionName);
template.insert(doc2, collectionName);
Query query = Query.query(where("value").is("foo")).restrict(SpecialDoc.class);
List<BaseDoc> result = template.find(query, BaseDoc.class);
assertThat(result, is(notNullValue()));
assertThat(result.size(), is(1));
assertThat(result.get(0), is(instanceOf(SpecialDoc.class)));
}
/**
* @see DATAMONGO-709
*/
@Test
public void aQueryRestrictedWithMultipleRestrictedResultTypesShouldReturnOnlyInstancesOfTheRestrictedTypes() {
BaseDoc doc0 = new BaseDoc();
doc0.value = "foo";
SpecialDoc doc1 = new SpecialDoc();
doc1.value = "foo";
doc1.specialValue = "specialfoo";
VerySpecialDoc doc2 = new VerySpecialDoc();
doc2.value = "foo";
doc2.specialValue = "specialfoo";
doc2.verySpecialValue = 4711;
String collectionName = template.getCollectionName(BaseDoc.class);
template.insert(doc0, collectionName);
template.insert(doc1, collectionName);
template.insert(doc2, collectionName);
Query query = Query.query(where("value").is("foo")).restrict(BaseDoc.class, VerySpecialDoc.class);
List<BaseDoc> result = template.find(query, BaseDoc.class);
assertThat(result, is(notNullValue()));
assertThat(result.size(), is(2));
assertThat(result.get(0).getClass(), is((Object) BaseDoc.class));
assertThat(result.get(1).getClass(), is((Object) VerySpecialDoc.class));
}
/**
* @see DATAMONGO-709
*/
@Test
public void aQueryWithNoRestrictedResultTypesShouldReturnAllInstancesWithinTheGivenCollection() {
BaseDoc doc0 = new BaseDoc();
doc0.value = "foo";
SpecialDoc doc1 = new SpecialDoc();
doc1.value = "foo";
doc1.specialValue = "specialfoo";
VerySpecialDoc doc2 = new VerySpecialDoc();
doc2.value = "foo";
doc2.specialValue = "specialfoo";
doc2.verySpecialValue = 4711;
String collectionName = template.getCollectionName(BaseDoc.class);
template.insert(doc0, collectionName);
template.insert(doc1, collectionName);
template.insert(doc2, collectionName);
Query query = Query.query(where("value").is("foo"));
List<BaseDoc> result = template.find(query, BaseDoc.class);
assertThat(result, is(notNullValue()));
assertThat(result.size(), is(3));
assertThat(result.get(0).getClass(), is((Object) BaseDoc.class));
assertThat(result.get(1).getClass(), is((Object) SpecialDoc.class));
assertThat(result.get(2).getClass(), is((Object) VerySpecialDoc.class));
}
static interface Model {
String value();
Model withValue(String value);
}
static class ModelA implements Model {
private String value;
@Override
public String value() {
return this.value;
}
@Override
public Model withValue(String value) {
this.value = value;
return this;
}
}
static class Document {
@Id public String id;
public Model model;
}
static class MyId {
String first;
@@ -1633,14 +2032,12 @@ public class MongoTemplateTests {
static class TypeWithMyId {
@Id MyId id;
}
static class Sample {
@Id String id;
String field;
}
@@ -1697,8 +2094,31 @@ public class MongoTemplateTests {
static class VersionedPerson {
@Version Long version;
String id, firstname, lastname;
}
static class TypeWithFieldAnnotation {
@Id ObjectId id;
@Field("email") String emailAddress;
}
static class TypeWithDate {
@Id String id;
Date date;
}
static class ObjectWith3AliasedFields {
@Id String id;
@Field("prop1") String property1;
@Field("prop2") String property2;
@Field("prop3") String property3;
}
static class ObjectWith3AliasedFieldsAndNestedAddress extends ObjectWith3AliasedFields {
@Field("adr") Address address;
}
}


@@ -0,0 +1,57 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.math.BigInteger;
public class PersonWithIdPropertyOfTypeBigInteger {
private BigInteger id;
private String firstName;
private int age;
public BigInteger getId() {
return id;
}
public void setId(BigInteger id) {
this.id = id;
}
public String getFirstName() {
return firstName;
}
public void setFirstName(String firstName) {
this.firstName = firstName;
}
public int getAge() {
return age;
}
public void setAge(int age) {
this.age = age;
}
@Override
public String toString() {
return "PersonWithIdPropertyOfTypeBigInteger [id=" + id + ", firstName=" + firstName + ", age=" + age + "]";
}
}


@@ -1,64 +0,0 @@
package org.springframework.data.mongodb.core;
public class SomeEnumTest {
public enum StringEnum {
ONE, TWO, FIVE;
}
public enum NumberEnum {
ONE(1), TWO(2), FIVE(5);
private int value;
public int value() {
return value;
}
NumberEnum(int value) {
this.value = value;
}
}
private StringEnum stringEnum;
private NumberEnum numberEnum;
private String id;
private String name;
public StringEnum getStringEnum() {
return stringEnum;
}
public void setStringEnum(StringEnum stringEnum) {
this.stringEnum = stringEnum;
}
public NumberEnum getNumberEnum() {
return numberEnum;
}
public void setNumberEnum(NumberEnum numberEnum) {
this.numberEnum = numberEnum;
}
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
}


@@ -0,0 +1,5 @@
package org.springframework.data.mongodb.core;
public class SpecialDoc extends BaseDoc {
String specialValue;
}


@@ -0,0 +1,95 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate.UnwrapAndReadDbObjectCallback;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import com.mongodb.BasicDBObject;
/**
* Unit tests for {@link UnwrapAndReadDbObjectCallback}.
*
* @author Oliver Gierke
*/
@RunWith(MockitoJUnitRunner.class)
public class UnwrapAndReadDbObjectCallbackUnitTests {
@Mock MongoDbFactory factory;
UnwrapAndReadDbObjectCallback<Target> callback;
@Before
public void setUp() {
MongoTemplate template = new MongoTemplate(factory);
MappingMongoConverter converter = new MappingMongoConverter(factory, new MongoMappingContext());
this.callback = template.new UnwrapAndReadDbObjectCallback<Target>(converter, Target.class);
}
@Test
public void usesFirstLevelValues() {
Target target = callback.doWith(new BasicDBObject("foo", "bar"));
assertThat(target.id, is(nullValue()));
assertThat(target.foo, is("bar"));
}
@Test
public void unwrapsUnderscoreIdIfBasicDBObject() {
Target target = callback.doWith(new BasicDBObject("_id", new BasicDBObject("foo", "bar")));
assertThat(target.id, is(nullValue()));
assertThat(target.foo, is("bar"));
}
@Test
public void firstLevelPropertiesTrumpNestedOnes() {
Target target = callback.doWith(new BasicDBObject("_id", new BasicDBObject("foo", "bar")).append("foo", "foobar"));
assertThat(target.id, is(nullValue()));
assertThat(target.foo, is("foobar"));
}
@Test
public void keepsUnderscoreIdIfScalarValue() {
Target target = callback.doWith(new BasicDBObject("_id", "bar").append("foo", "foo"));
assertThat(target.id, is("bar"));
assertThat(target.foo, is("foo"));
}
static class Target {
String id;
String foo;
}
}
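The four unit tests above pin down the unwrapping contract: a nested `_id` document is flattened into the top level, first-level keys win on collision, and a scalar `_id` is kept as-is. A minimal sketch of that merge logic using plain `java.util.Map` (behavior inferred from the assertions above; this is not the actual `UnwrapAndReadDbObjectCallback` source):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class UnwrapSketch {

	// Flattens a document nested under "_id" into the top level.
	// First-level entries take precedence over unwrapped ones; a scalar
	// "_id" is left untouched. Inferred from the tests above.
	@SuppressWarnings("unchecked")
	public static Map<String, Object> unwrap(Map<String, Object> source) {

		Object id = source.get("_id");

		if (!(id instanceof Map)) {
			return source; // scalar or absent _id: nothing to unwrap
		}

		Map<String, Object> result = new LinkedHashMap<>((Map<String, Object>) id);
		result.putAll(source); // first-level values trump nested ones
		result.remove("_id"); // the nested document has been inlined
		return result;
	}
}
```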


@@ -0,0 +1,5 @@
package org.springframework.data.mongodb.core;
public class VerySpecialDoc extends SpecialDoc {
int verySpecialValue;
}


@@ -0,0 +1,505 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import static org.springframework.data.domain.Sort.Direction.*;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import java.io.BufferedInputStream;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import java.util.Scanner;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.io.ClassPathResource;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import com.mongodb.BasicDBObject;
import com.mongodb.CommandResult;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoException;
import com.mongodb.util.JSON;
/**
* Tests for {@link MongoTemplate#aggregate(String, AggregationPipeline, Class)}.
*
* @see DATAMONGO-586
* @author Tobias Trelle
* @author Thomas Darimont
* @author Oliver Gierke
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:infrastructure.xml")
public class AggregationTests {
private static final String INPUT_COLLECTION = "aggregation_test_collection";
private static final Logger LOGGER = LoggerFactory.getLogger(AggregationTests.class);
private static boolean initialized = false;
@Autowired MongoTemplate mongoTemplate;
@Before
public void setUp() {
cleanDb();
initSampleDataIfNecessary();
}
@After
public void cleanUp() {
cleanDb();
}
private void cleanDb() {
mongoTemplate.dropCollection(INPUT_COLLECTION);
mongoTemplate.dropCollection(Product.class);
mongoTemplate.dropCollection(UserWithLikes.class);
}
/**
* Imports the sample dataset (zips.json) if necessary (e.g. if it doesn't exist yet). The dataset can originally be
* found on the mongodb aggregation framework example website:
*
* @see http://docs.mongodb.org/manual/tutorial/aggregation-examples/.
*/
private void initSampleDataIfNecessary() {
if (!initialized) {
CommandResult result = mongoTemplate.executeCommand("{ buildInfo: 1 }");
Object version = result.get("version");
LOGGER.debug("Server uses MongoDB Version: {}", version);
mongoTemplate.dropCollection(ZipInfo.class);
mongoTemplate.execute(ZipInfo.class, new CollectionCallback<Void>() {
@Override
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
Scanner scanner = null;
try {
scanner = new Scanner(new BufferedInputStream(new ClassPathResource("zips.json").getInputStream()));
while (scanner.hasNextLine()) {
String zipInfoRecord = scanner.nextLine();
collection.save((DBObject) JSON.parse(zipInfoRecord));
}
} catch (Exception e) {
throw new RuntimeException("Could not load mongodb sample dataset!", e);
} finally {
if (scanner != null) {
scanner.close();
}
}
return null;
}
});
long count = mongoTemplate.count(new Query(), ZipInfo.class);
assertThat(count, is(29467L));
initialized = true;
}
}
@Test(expected = IllegalArgumentException.class)
public void shouldHandleMissingInputCollection() {
mongoTemplate.aggregate(newAggregation(), (String) null, TagCount.class);
}
@Test(expected = IllegalArgumentException.class)
public void shouldHandleMissingAggregationPipeline() {
mongoTemplate.aggregate(null, INPUT_COLLECTION, TagCount.class);
}
@Test(expected = IllegalArgumentException.class)
public void shouldHandleMissingEntityClass() {
mongoTemplate.aggregate(newAggregation(), INPUT_COLLECTION, null);
}
@Test
public void shouldAggregate() {
createTagDocuments();
Aggregation agg = newAggregation( //
project("tags"), //
unwind("tags"), //
group("tags") //
.count().as("n"), //
project("n") //
.and("tag").previousOperation(), //
sort(DESC, "n") //
);
AggregationResults<TagCount> results = mongoTemplate.aggregate(agg, INPUT_COLLECTION, TagCount.class);
assertThat(results, is(notNullValue()));
assertThat(results.getServerUsed(), is("/127.0.0.1:27017"));
List<TagCount> tagCount = results.getMappedResults();
assertThat(tagCount, is(notNullValue()));
assertThat(tagCount.size(), is(3));
assertTagCount("spring", 3, tagCount.get(0));
assertTagCount("mongodb", 2, tagCount.get(1));
assertTagCount("nosql", 1, tagCount.get(2));
}
@Test
public void shouldAggregateEmptyCollection() {
Aggregation aggregation = newAggregation(//
project("tags"), //
unwind("tags"), //
group("tags") //
.count().as("n"), //
project("n") //
.and("tag").previousOperation(), //
sort(DESC, "n") //
);
AggregationResults<TagCount> results = mongoTemplate.aggregate(aggregation, INPUT_COLLECTION, TagCount.class);
assertThat(results, is(notNullValue()));
assertThat(results.getServerUsed(), is("/127.0.0.1:27017"));
List<TagCount> tagCount = results.getMappedResults();
assertThat(tagCount, is(notNullValue()));
assertThat(tagCount.size(), is(0));
}
@Test
public void shouldDetectResultMismatch() {
createTagDocuments();
Aggregation aggregation = newAggregation( //
project("tags"), //
unwind("tags"), //
group("tags") //
.count().as("count"), // count field not present
limit(2) //
);
AggregationResults<TagCount> results = mongoTemplate.aggregate(aggregation, INPUT_COLLECTION, TagCount.class);
assertThat(results, is(notNullValue()));
assertThat(results.getServerUsed(), is("/127.0.0.1:27017"));
List<TagCount> tagCount = results.getMappedResults();
assertThat(tagCount, is(notNullValue()));
assertThat(tagCount.size(), is(2));
assertTagCount(null, 0, tagCount.get(0));
assertTagCount(null, 0, tagCount.get(1));
}
@Test
public void complexAggregationFrameworkUsageLargestAndSmallestCitiesByState() {
/*
//complex mongodb aggregation framework example from http://docs.mongodb.org/manual/tutorial/aggregation-examples/#largest-and-smallest-cities-by-state
db.zipInfo.aggregate(
{
$group: {
_id: {
state: '$state',
city: '$city'
},
pop: {
$sum: '$pop'
}
}
},
{
$sort: {
pop: 1,
'_id.state': 1,
'_id.city': 1
}
},
{
$group: {
_id: '$_id.state',
biggestCity: {
$last: '$_id.city'
},
biggestPop: {
$last: '$pop'
},
smallestCity: {
$first: '$_id.city'
},
smallestPop: {
$first: '$pop'
}
}
},
{
$project: {
_id: 0,
state: '$_id',
biggestCity: {
name: '$biggestCity',
pop: '$biggestPop'
},
smallestCity: {
name: '$smallestCity',
pop: '$smallestPop'
}
}
},
{
$sort: {
state: 1
}
}
)
*/
TypedAggregation<ZipInfo> aggregation = newAggregation(ZipInfo.class, //
group("state", "city").sum("population").as("pop"), //
sort(ASC, "pop", "state", "city"), //
group("state") //
.last("city").as("biggestCity") //
.last("pop").as("biggestPop") //
.first("city").as("smallestCity") //
.first("pop").as("smallestPop"), //
project() //
.and("state").previousOperation() //
.and("biggestCity").nested(bind("name", "biggestCity").and("population", "biggestPop")) //
.and("smallestCity").nested(bind("name", "smallestCity").and("population", "smallestPop")), //
sort(ASC, "state") //
);
assertThat(aggregation, is(notNullValue()));
assertThat(aggregation.toString(), is(notNullValue()));
AggregationResults<ZipInfoStats> result = mongoTemplate.aggregate(aggregation, ZipInfoStats.class);
assertThat(result, is(notNullValue()));
assertThat(result.getMappedResults(), is(notNullValue()));
assertThat(result.getMappedResults().size(), is(51));
ZipInfoStats firstZipInfoStats = result.getMappedResults().get(0);
assertThat(firstZipInfoStats, is(notNullValue()));
assertThat(firstZipInfoStats.id, is(nullValue()));
assertThat(firstZipInfoStats.state, is("AK"));
assertThat(firstZipInfoStats.smallestCity, is(notNullValue()));
assertThat(firstZipInfoStats.smallestCity.name, is("CHEVAK"));
assertThat(firstZipInfoStats.smallestCity.population, is(0));
assertThat(firstZipInfoStats.biggestCity, is(notNullValue()));
assertThat(firstZipInfoStats.biggestCity.name, is("ANCHORAGE"));
assertThat(firstZipInfoStats.biggestCity.population, is(183987));
ZipInfoStats lastZipInfoStats = result.getMappedResults().get(50);
assertThat(lastZipInfoStats, is(notNullValue()));
assertThat(lastZipInfoStats.id, is(nullValue()));
assertThat(lastZipInfoStats.state, is("WY"));
assertThat(lastZipInfoStats.smallestCity, is(notNullValue()));
assertThat(lastZipInfoStats.smallestCity.name, is("LOST SPRINGS"));
assertThat(lastZipInfoStats.smallestCity.population, is(6));
assertThat(lastZipInfoStats.biggestCity, is(notNullValue()));
assertThat(lastZipInfoStats.biggestCity.name, is("CHEYENNE"));
assertThat(lastZipInfoStats.biggestCity.population, is(70185));
}
@Test
public void findStatesWithPopulationOver10MillionAggregationExample() {
/*
//complex mongodb aggregation framework example from
http://docs.mongodb.org/manual/tutorial/aggregation-examples/#states-with-populations-over-10-million
db.zipcodes.aggregate(
{
$group: {
_id:"$state",
totalPop:{ $sum:"$pop"}
}
},
{
$sort: { _id: 1, "totalPop": 1 }
},
{
$match: {
totalPop: { $gte:10*1000*1000 }
}
}
)
*/
TypedAggregation<ZipInfo> agg = newAggregation(ZipInfo.class, //
group("state") //
.sum("population").as("totalPop"), //
sort(ASC, previousOperation(), "totalPop"), //
match(where("totalPop").gte(10 * 1000 * 1000)) //
);
assertThat(agg, is(notNullValue()));
assertThat(agg.toString(), is(notNullValue()));
AggregationResults<StateStats> result = mongoTemplate.aggregate(agg, StateStats.class);
assertThat(result, is(notNullValue()));
assertThat(result.getMappedResults(), is(notNullValue()));
assertThat(result.getMappedResults().size(), is(7));
StateStats stateStats = result.getMappedResults().get(0);
assertThat(stateStats, is(notNullValue()));
assertThat(stateStats.id, is("CA"));
assertThat(stateStats.state, is(nullValue()));
assertThat(stateStats.totalPopulation, is(29760021));
}
/**
* @see http://docs.mongodb.org/manual/tutorial/aggregation-examples/#return-the-five-most-common-likes
*/
@Test
public void returnFiveMostCommonLikesAggregationFrameworkExample() {
createUserWithLikesDocuments();
/*
...
$group: {
_id:"$like",
number:{ $sum:1}
}
...
*/
TypedAggregation<UserWithLikes> agg = newAggregation(UserWithLikes.class, //
unwind("likes"), //
group("likes").count().as("number"), //
sort(DESC, "number"), //
limit(5), //
sort(ASC, previousOperation()) //
);
assertThat(agg, is(notNullValue()));
assertThat(agg.toString(), is(notNullValue()));
AggregationResults<LikeStats> result = mongoTemplate.aggregate(agg, LikeStats.class);
assertThat(result, is(notNullValue()));
assertThat(result.getMappedResults(), is(notNullValue()));
assertThat(result.getMappedResults().size(), is(5));
assertLikeStats(result.getMappedResults().get(0), "a", 4);
assertLikeStats(result.getMappedResults().get(1), "b", 2);
assertLikeStats(result.getMappedResults().get(2), "c", 4);
assertLikeStats(result.getMappedResults().get(3), "d", 2);
assertLikeStats(result.getMappedResults().get(4), "e", 3);
}
@Test
public void arithmenticOperatorsInProjectionExample() {
double taxRate = 0.19;
double netPrice = 1.99;
double discountRate = 0.05;
int spaceUnits = 3;
String productId = "P1";
String productName = "A";
mongoTemplate.insert(new Product(productId, productName, netPrice, spaceUnits, discountRate, taxRate));
TypedAggregation<Product> agg = newAggregation(Product.class, //
project("name", "netPrice") //
.and("netPrice").plus(1).as("netPricePlus1") //
.and("netPrice").minus(1).as("netPriceMinus1") //
.and("netPrice").multiply(2).as("netPriceMul2") //
.and("netPrice").divide(1.19).as("netPriceDiv119") //
.and("spaceUnits").mod(2).as("spaceUnitsMod2") //
);
AggregationResults<DBObject> result = mongoTemplate.aggregate(agg, DBObject.class);
List<DBObject> resultList = result.getMappedResults();
assertThat(resultList, is(notNullValue()));
assertThat((String) resultList.get(0).get("_id"), is(productId));
assertThat((String) resultList.get(0).get("name"), is(productName));
assertThat((Double) resultList.get(0).get("netPricePlus1"), is(netPrice + 1));
assertThat((Double) resultList.get(0).get("netPriceMinus1"), is(netPrice - 1));
assertThat((Double) resultList.get(0).get("netPriceMul2"), is(netPrice * 2));
assertThat((Double) resultList.get(0).get("netPriceDiv119"), is(netPrice / 1.19));
assertThat((Integer) resultList.get(0).get("spaceUnitsMod2"), is(spaceUnits % 2));
}
private void assertLikeStats(LikeStats like, String id, long count) {
assertThat(like, is(notNullValue()));
assertThat(like.id, is(id));
assertThat(like.count, is(count));
}
private void createUserWithLikesDocuments() {
mongoTemplate.insert(new UserWithLikes("u1", new Date(), "a", "b", "c"));
mongoTemplate.insert(new UserWithLikes("u2", new Date(), "a"));
mongoTemplate.insert(new UserWithLikes("u3", new Date(), "b", "c"));
mongoTemplate.insert(new UserWithLikes("u4", new Date(), "c", "d", "e"));
mongoTemplate.insert(new UserWithLikes("u5", new Date(), "a", "e", "c"));
mongoTemplate.insert(new UserWithLikes("u6", new Date()));
mongoTemplate.insert(new UserWithLikes("u7", new Date(), "a"));
mongoTemplate.insert(new UserWithLikes("u8", new Date(), "x", "e"));
mongoTemplate.insert(new UserWithLikes("u9", new Date(), "y", "d"));
}
private void createTagDocuments() {
DBCollection coll = mongoTemplate.getCollection(INPUT_COLLECTION);
coll.insert(createDocument("Doc1", "spring", "mongodb", "nosql"));
coll.insert(createDocument("Doc2", "spring", "mongodb"));
coll.insert(createDocument("Doc3", "spring"));
}
private static DBObject createDocument(String title, String... tags) {
DBObject doc = new BasicDBObject("title", title);
List<String> tagList = new ArrayList<String>();
for (String tag : tags) {
tagList.add(tag);
}
doc.put("tags", tagList);
return doc;
}
private static void assertTagCount(String tag, int n, TagCount tagCount) {
assertThat(tagCount.getTag(), is(tag));
assertThat(tagCount.getN(), is(n));
}
}
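For reference, the tag-count pipeline that `shouldAggregate` builds through the fluent API corresponds to four raw aggregation stages. A stdlib-only sketch of those stages as plain Maps (field names are taken from the test; this mirrors, rather than replaces, the driver's `DBObject` types):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class TagCountPipelineSketch {

	// The four stages behind project("tags"), unwind("tags"),
	// group("tags").count().as("n"), sort(DESC, "n").
	public static List<Map<String, Object>> stages() {

		Map<String, Object> project = stage("$project", Map.of("tags", 1));
		Map<String, Object> unwind = stage("$unwind", "$tags");

		Map<String, Object> groupBody = new LinkedHashMap<>();
		groupBody.put("_id", "$tags");
		groupBody.put("n", Map.of("$sum", 1));
		Map<String, Object> group = stage("$group", groupBody);

		Map<String, Object> sort = stage("$sort", Map.of("n", -1));

		return List.of(project, unwind, group, sort);
	}

	private static Map<String, Object> stage(String name, Object body) {
		Map<String, Object> stage = new LinkedHashMap<>();
		stage.put(name, body);
		return stage;
	}
}
```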


@@ -0,0 +1,46 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.junit.Test;
/**
* Unit tests for {@link Aggregation}.
*
* @author Oliver Gierke
*/
public class AggregationUnitTests {
@Test(expected = IllegalArgumentException.class)
public void rejectsNullAggregationOperation() {
Aggregation.newAggregation((AggregationOperation[]) null);
}
@Test(expected = IllegalArgumentException.class)
public void rejectsNullTypedAggregationOperation() {
Aggregation.newAggregation(String.class, (AggregationOperation[]) null);
}
@Test(expected = IllegalArgumentException.class)
public void rejectsNoAggregationOperation() {
Aggregation.newAggregation(new AggregationOperation[0]);
}
@Test(expected = IllegalArgumentException.class)
public void rejectsNoTypedAggregationOperation() {
Aggregation.newAggregation(String.class, new AggregationOperation[0]);
}
}


@@ -0,0 +1,11 @@
package org.springframework.data.mongodb.core.aggregation;
class City {
String name;
int population;
public String toString() {
return "City [name=" + name + ", population=" + population + "]";
}
}


@@ -0,0 +1,55 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.junit.Test;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
/**
* Unit tests for {@link ExposedFields}.
*
* @author Oliver Gierke
*/
public class ExposedFieldsUnitTests {
@Test(expected = IllegalArgumentException.class)
public void rejectsNullFields() {
ExposedFields.from((ExposedField) null);
}
@Test(expected = IllegalArgumentException.class)
public void rejectsNullFieldsForSynthetics() {
ExposedFields.synthetic(null);
}
@Test(expected = IllegalArgumentException.class)
public void rejectsNullFieldsForNonSynthetics() {
ExposedFields.nonSynthetic(null);
}
@Test
public void exposesSingleField() {
ExposedFields fields = ExposedFields.synthetic(Fields.fields("foo"));
assertThat(fields.exposesSingleFieldOnly(), is(true));
fields = fields.and(new ExposedField("bar", true));
assertThat(fields.exposesSingleFieldOnly(), is(false));
}
}


@@ -0,0 +1,115 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import static org.springframework.data.mongodb.core.aggregation.Fields.*;
import org.hamcrest.Matchers;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.ExpectedException;
import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
/**
* Unit tests for {@link Fields}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class FieldsUnitTests {
@Rule public ExpectedException exception = ExpectedException.none();
@Test(expected = IllegalArgumentException.class)
public void rejectsNullFieldVarArgs() {
Fields.from((Field[]) null);
}
@Test(expected = IllegalArgumentException.class)
public void rejectsNullFieldNameVarArgs() {
Fields.fields((String[]) null);
}
@Test
public void createsFieldFromNameOnly() {
verify(Fields.field("foo"), "foo", null);
}
@Test
public void createsFieldFromNameAndTarget() {
verify(Fields.field("foo", "bar"), "foo", "bar");
}
@Test(expected = IllegalArgumentException.class)
public void rejectsNullFieldName() {
Fields.field(null);
}
@Test(expected = IllegalArgumentException.class)
public void rejectsNullFieldNameIfTargetGiven() {
Fields.field(null, "foo");
}
@Test(expected = IllegalArgumentException.class)
public void rejectsEmptyFieldName() {
Fields.field("");
}
@Test
public void createsFieldsFromFieldInstances() {
AggregationField reference = new AggregationField("foo");
Fields fields = Fields.from(reference);
assertThat(fields, is(Matchers.<Field> iterableWithSize(1)));
assertThat(fields, hasItem(reference));
}
@Test
public void aliasesPathExpressionsIntoLeafForImplicits() {
verify(Fields.field("foo.bar"), "bar", "foo.bar");
}
@Test
public void fieldsFactoryMethod() {
Fields fields = fields("a", "b").and("c").and("d", "e");
assertThat(fields, is(Matchers.<Field> iterableWithSize(4)));
verify(fields.getField("a"), "a", null);
verify(fields.getField("b"), "b", null);
verify(fields.getField("c"), "c", null);
verify(fields.getField("d"), "d", "e");
}
@Test
public void rejectsAmbiguousFieldNames() {
exception.expect(IllegalArgumentException.class);
fields("b", "a.b");
}
private static void verify(Field field, String name, String target) {
assertThat(field, is(notNullValue()));
assertThat(field.getName(), is(name));
assertThat(field.getTarget(), is(target != null ? target : name));
}
}


@@ -0,0 +1,44 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.junit.Test;
import org.springframework.data.mongodb.core.DBObjectUtils;
import org.springframework.data.mongodb.core.query.NearQuery;
import com.mongodb.DBObject;
/**
* Unit tests for {@link GeoNearOperation}.
*
* @author Oliver Gierke
*/
public class GeoNearOperationUnitTests {
@Test
public void rendersNearQueryAsAggregationOperation() {
NearQuery query = NearQuery.near(10.0, 10.0);
GeoNearOperation operation = new GeoNearOperation(query);
DBObject dbObject = operation.toDBObject(Aggregation.DEFAULT_CONTEXT);
DBObject nearClause = DBObjectUtils.getAsDBObject(dbObject, "$geoNear");
assertThat(nearClause, is(query.toDBObject()));
}
}


@@ -0,0 +1,159 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import static org.springframework.data.mongodb.core.aggregation.Fields.*;
import org.junit.Test;
import org.springframework.data.mongodb.core.DBObjectUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Unit tests for {@link GroupOperation}.
*
* @author Oliver Gierke
*/
public class GroupOperationUnitTests {
@Test(expected = IllegalArgumentException.class)
public void rejectsNullFields() {
new GroupOperation((Fields) null);
}
@Test
public void createsGroupOperationWithSingleField() {
GroupOperation operation = new GroupOperation(fields("a"));
DBObject groupClause = extractDbObjectFromGroupOperation(operation);
		assertThat(groupClause.get(UNDERSCORE_ID), is((Object) "$a"));
	}

	@Test
	public void createsGroupOperationWithMultipleFields() {

		GroupOperation operation = new GroupOperation(fields("a").and("b", "c"));

		DBObject groupClause = extractDbObjectFromGroupOperation(operation);
		DBObject idClause = DBObjectUtils.getAsDBObject(groupClause, UNDERSCORE_ID);

		assertThat(idClause.get("a"), is((Object) "$a"));
		assertThat(idClause.get("b"), is((Object) "$c"));
	}

	@Test
	public void groupFactoryMethodWithMultipleFieldsAndSumOperation() {

		GroupOperation groupOperation = Aggregation.group(fields("a", "b").and("c")) //
				.sum("e").as("e");

		DBObject groupClause = extractDbObjectFromGroupOperation(groupOperation);
		DBObject eOp = DBObjectUtils.getAsDBObject(groupClause, "e");

		assertThat(eOp, is((DBObject) new BasicDBObject("$sum", "$e")));
	}

	@Test
	public void groupFactoryMethodWithMultipleFieldsAndSumOperationWithAlias() {

		GroupOperation groupOperation = Aggregation.group(fields("a", "b").and("c")) //
				.sum("e").as("ee");

		DBObject groupClause = extractDbObjectFromGroupOperation(groupOperation);
		DBObject eOp = DBObjectUtils.getAsDBObject(groupClause, "ee");

		assertThat(eOp, is((DBObject) new BasicDBObject("$sum", "$e")));
	}

	@Test
	public void groupFactoryMethodWithMultipleFieldsAndCountOperationWithout() {

		GroupOperation groupOperation = Aggregation.group(fields("a", "b").and("c")) //
				.count().as("count");

		DBObject groupClause = extractDbObjectFromGroupOperation(groupOperation);
		DBObject eOp = DBObjectUtils.getAsDBObject(groupClause, "count");

		assertThat(eOp, is((DBObject) new BasicDBObject("$sum", 1)));
	}

	@Test
	public void groupFactoryMethodWithMultipleFieldsAndMultipleAggregateOperationsWithAlias() {

		GroupOperation groupOperation = Aggregation.group(fields("a", "b").and("c")) //
				.sum("e").as("sum") //
				.min("e").as("min");

		DBObject groupClause = extractDbObjectFromGroupOperation(groupOperation);

		DBObject sum = DBObjectUtils.getAsDBObject(groupClause, "sum");
		assertThat(sum, is((DBObject) new BasicDBObject("$sum", "$e")));

		DBObject min = DBObjectUtils.getAsDBObject(groupClause, "min");
		assertThat(min, is((DBObject) new BasicDBObject("$min", "$e")));
	}

	@Test
	public void groupOperationPushWithValue() {

		GroupOperation groupOperation = Aggregation.group("a", "b").push(1).as("x");

		DBObject groupClause = extractDbObjectFromGroupOperation(groupOperation);
		DBObject push = DBObjectUtils.getAsDBObject(groupClause, "x");

		assertThat(push, is((DBObject) new BasicDBObject("$push", 1)));
	}

	@Test
	public void groupOperationPushWithReference() {

		GroupOperation groupOperation = Aggregation.group("a", "b").push("ref").as("x");

		DBObject groupClause = extractDbObjectFromGroupOperation(groupOperation);
		DBObject push = DBObjectUtils.getAsDBObject(groupClause, "x");

		assertThat(push, is((DBObject) new BasicDBObject("$push", "$ref")));
	}

	@Test
	public void groupOperationAddToSetWithReference() {

		GroupOperation groupOperation = Aggregation.group("a", "b").addToSet("ref").as("x");

		DBObject groupClause = extractDbObjectFromGroupOperation(groupOperation);
		DBObject addToSet = DBObjectUtils.getAsDBObject(groupClause, "x");

		assertThat(addToSet, is((DBObject) new BasicDBObject("$addToSet", "$ref")));
	}

	@Test
	public void groupOperationAddToSetWithValue() {

		GroupOperation groupOperation = Aggregation.group("a", "b").addToSet(42).as("x");

		DBObject groupClause = extractDbObjectFromGroupOperation(groupOperation);
		DBObject addToSet = DBObjectUtils.getAsDBObject(groupClause, "x");

		assertThat(addToSet, is((DBObject) new BasicDBObject("$addToSet", 42)));
	}

	private DBObject extractDbObjectFromGroupOperation(GroupOperation groupOperation) {

		DBObject dbObject = groupOperation.toDBObject(Aggregation.DEFAULT_CONTEXT);
		DBObject groupClause = DBObjectUtils.getAsDBObject(dbObject, "$group");
		return groupClause;
	}
}
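The assertions above all pivot on the shape of the `$group` pipeline stage that `GroupOperation` renders: a `_id` sub-document holding the grouping fields, plus one entry per accumulator. A minimal, dependency-free sketch of that shape using plain `java.util.Map` (the class name `GroupStageSketch` and method `buildStage` are illustrative and not part of Spring Data):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class GroupStageSketch {

	// Shape equivalent to:
	// Aggregation.group(fields("a", "b").and("c")).sum("e").as("sum").min("e").as("min")
	static Map<String, Object> buildStage() {
		Map<String, Object> id = new LinkedHashMap<>();
		id.put("a", "$a");
		id.put("b", "$b");
		id.put("c", "$c");

		Map<String, Object> group = new LinkedHashMap<>();
		group.put("_id", id);                    // grouping key: the referenced fields
		group.put("sum", Map.of("$sum", "$e")); // .sum("e").as("sum")
		group.put("min", Map.of("$min", "$e")); // .min("e").as("min")

		return Map.of("$group", group);
	}

	public static void main(String[] args) {
		System.out.println(buildStage());
	}
}
```

Note that `count().as("count")` has no dedicated MongoDB operator in this shape; as the test asserts, it is rendered as `{"$sum": 1}`.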


@@ -0,0 +1,27 @@
/*
 * Copyright 2013 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core.aggregation;

import org.springframework.data.mongodb.core.mapping.Field;

/**
 * @author Thomas Darimont
 */
public class LikeStats {

	String id;
	@Field("number") long count;
}
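The `@Field("number")` annotation above tells the mapping layer to store the Java property `count` under the document key `number`. A rough, self-contained sketch of that mechanism, using a hypothetical stand-in annotation (`StoredAs`, `toDocument`, and `FieldMappingSketch` are illustrative names; Spring Data's actual converter is considerably richer):

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;
import java.util.LinkedHashMap;
import java.util.Map;

public class FieldMappingSketch {

	// Hypothetical stand-in for Spring Data's @Field annotation.
	@Retention(RetentionPolicy.RUNTIME)
	@Target(ElementType.FIELD)
	@interface StoredAs {
		String value();
	}

	static class LikeStats {
		String id;
		@StoredAs("number") long count;
	}

	// Maps Java fields to document keys, honoring the annotation override.
	static Map<String, Object> toDocument(Object entity) {
		Map<String, Object> doc = new LinkedHashMap<>();
		try {
			for (Field field : entity.getClass().getDeclaredFields()) {
				field.setAccessible(true);
				StoredAs alias = field.getAnnotation(StoredAs.class);
				String key = alias != null ? alias.value() : field.getName();
				doc.put(key, field.get(entity));
			}
		} catch (IllegalAccessException e) {
			throw new RuntimeException(e);
		}
		return doc;
	}

	public static void main(String[] args) {
		LikeStats stats = new LikeStats();
		stats.id = "user-1";
		stats.count = 42L;
		System.out.println(toDocument(stats));
	}
}
```

With this mapping, the resulting document carries a `number` key rather than `count`, which is why the aggregation tests refer to the stored key.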

Some files were not shown because too many files have changed in this diff.