Compare commits

...

86 Commits

Author SHA1 Message Date
Oliver Gierke
4ca16f8d15 DATAMONGO-456 - Fixed <db-factory /> element documentation in XSD. 2012-09-17 12:20:52 +02:00
Oliver Gierke
1b3d49e466 DATAMONGO-457 - Fixed links in reference documentation. 2012-09-17 12:09:05 +02:00
Oliver Gierke
bf89cce43c DATAMONGO-539 - Fixed MongoTemplate.remove(object, collectionName).
If the entity being removed via MongoTemplate.remove(object, collectionName) contained an id that could be converted into an ObjectId, it was not removed correctly. This was caused by the intermediate call not handing over the entity type, so the id conversion failed, which in turn caused the query not to match the previously saved object.
2012-09-17 11:44:30 +02:00
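For illustration, a minimal sketch of the affected operation, assuming a hypothetical Person document and an already configured MongoOperations instance:

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.MongoOperations;

class Person {
  @Id String id; // holds a value convertible into an ObjectId
  String name;
}

class RemoveFromExplicitCollection {
  // Removes the entity from the explicitly named collection; before the fix the id
  // was not converted into an ObjectId here, so the removal query matched nothing.
  static void remove(MongoOperations operations, Person person) {
    operations.remove(person, "people");
  }
}
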
Oliver Gierke
aef493fdb8 DATAMONGO-539 - Added test case to show removing entity from explicit collection works. 2012-09-13 17:24:27 +02:00
Oliver Gierke
eabd47ae8d DATAMONGO-532 - Synchronize DB authentication.
In multithreaded environments Mongo database authentication can be triggered twice if two or more threads refer to the same db instance. This is now prevented by synchronizing calls to db.authenticate(…).
2012-09-12 13:00:46 +02:00
Oliver Gierke
1d9ee9a28f DATAMONGO-537 - Work around compiler issues with generics. 2012-09-12 10:40:19 +02:00
Oliver Gierke
e0b0792643 DATAMONGO-537 - Guard index creation tests against changing method orders. 2012-09-12 10:11:26 +02:00
Oliver Gierke
d5f8285d51 DATAMONGO-529 - Update Querydsl setup to use 1.0.4.
Raised Maven compiler plugin version to 2.5.1.
2012-09-11 18:17:20 +02:00
Oliver Gierke
4ce8da7ebf DATAMONGO-536 - Fixed package cycle introduced by SerializationUtils. 2012-09-11 18:08:56 +02:00
Oliver Gierke
c1030abe96 DATAMONGO-512 - Fixed broken handling of AND in query methods.
Apparently the fix for DATAMONGO-469 was messed up later on.
2012-09-04 07:32:27 +02:00
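For illustration, a sketch of the kind of derived query method affected, assuming a hypothetical Person document with firstname and lastname fields:

import java.util.List;

import org.springframework.data.mongodb.repository.MongoRepository;

class Person {
  String id;
  String firstname;
  String lastname;
}

// The And keyword derives the query { "firstname" : ?0, "lastname" : ?1 }.
interface PersonRepository extends MongoRepository<Person, String> {
  List<Person> findByFirstnameAndLastname(String firstname, String lastname);
}
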
Oliver Gierke
2a8f13d5d5 DATAMONGO-527 - Fixed Criteria.equals(…). 2012-09-04 07:32:27 +02:00
Spring Buildmaster
e4adc0ce23 DATAMONGO-514 - Prepare next development iteration. 2012-08-24 01:54:30 -07:00
Spring Buildmaster
beeed68873 DATAMONGO-514 - Release 1.0.4.RELEASE. 2012-08-24 01:54:26 -07:00
Oliver Gierke
22872f97dc DATAMONGO-514 - Prepare changelog for 1.0.4.RELEASE. 2012-08-24 10:43:54 +02:00
Oliver Gierke
323de58efc DATAMONGO-499 - Fixed namespace reference to repository XSD. 2012-07-31 11:00:23 +02:00
Oliver Gierke
58f12b8d8f DATAMONGO-494 - QueryMapper now forwards entity metadata into nested $(n)or criteria.
Introduced helper class to ease assertions on DBObjects as well.
2012-07-27 16:16:57 +02:00
Oliver Gierke
48cb155f6c DATAMONGO-493 - Fixed broken $ne handling in QueryMapper.
$ne expressions are now only converted into an ObjectId if they follow an id property. Previously the conversion was attempted in every case, which might have led to Strings that accidentally were valid ObjectIds being converted even though they didn't represent an id at all.
2012-07-27 16:16:29 +02:00
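To illustrate the distinction, a sketch using the Criteria API with made-up field names; only the criterion targeting the id property should have its value massaged into an ObjectId:

import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

class NeExamples {
  // $ne against the id property: converting the String into an ObjectId is desired.
  static Query notThisId(String id) {
    return new Query(Criteria.where("id").ne(id));
  }

  // $ne against a plain String property: the value is now left untouched,
  // even if it happens to be a valid ObjectId.
  static Query notThisCode(String code) {
    return new Query(Criteria.where("code").ne(code));
  }
}
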
Oliver Gierke
594ddbd1c1 DATAMONGO-495 - Fixed debug output in MongoTemplate.doFind(…).
Using SerializationUtils to safely output the query to be executed.
2012-07-26 10:29:11 +02:00
Spring Buildmaster
2c2bbf415b DATAMONGO-492 - Prepare next development iteration. 2012-07-24 07:01:24 -07:00
Spring Buildmaster
9375e7b981 DATAMONGO-492 - Release 1.0.3.RELEASE. 2012-07-24 07:01:21 -07:00
Oliver Gierke
30a4682369 DATAMONGO-492 - Prepare changelog for 1.0.3.RELEASE. 2012-07-24 15:34:33 +02:00
Oliver Gierke
356e6acd43 DATAMONGO-474 - Populating ids after save now inspects the field only.
So far, the algorithm that inspects whether an id property has to be set after a save(…) operation used the plain BeanWrapper.getProperty(PersistentProperty property) method. This caused problems in case the getter of the id field returned something completely different (to be precise: a complex type not convertible out of the box).

We now inspect the id field only to retrieve the value.
2012-07-24 13:30:04 +02:00
Oliver Gierke
09ed4aaf24 DATAMONGO-489 - Ensure read collections get converted to appropriate target type.
When reading BasicDBLists we now make sure the resulting collection is eventually converted into the actual target type. It might be an array and thus need an additional round of massaging before being returned as a value.
2012-07-23 16:41:28 +02:00
Oliver Gierke
c7995eb462 DATAMONGO-485 - Added test case to show complex ids are working. 2012-07-17 12:46:34 +02:00
Amol Nayak
2e6a9a6ee7 DATAMONGO-480 - Consider WriteResult for insert(…) and save(…) methods. 2012-07-16 18:53:38 +02:00
Oliver Gierke
a82fbade95 DATAMONGO-483 - Indexes now use the field name even if index name is defined. 2012-07-16 18:36:56 +02:00
Oliver Gierke
7a9ba3fe3e DATAMONGO-482 - Fixed typo in reference documentation. 2012-07-16 17:43:01 +02:00
Oliver Gierke
134e7762a7 DATAMONGO-474 - Fixed criteria mapping for MongoTemplate.group(…).
The criteria object handed to the group operation needs to be mapped correctly in order to map complex values. Improved error handling along the way.
2012-07-16 16:37:18 +02:00
Oliver Gierke
e41299ff38 DATAMONGO-475 - Fixed debug output in map-reduce operations.
Using SerializationUtils.serializeToJsonSafely(…) instead of plain toString(), as the latter might cause SerializationExceptions for complex objects.
2012-07-16 14:55:32 +02:00
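The safe serialization is used roughly as in the following sketch (query content made up):

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

import org.springframework.data.mongodb.core.query.SerializationUtils;

class DebugOutputExample {
  // serializeToJsonSafely(…) falls back to a { $java : object.toString() } style
  // rendering for values that cannot be serialized to JSON instead of throwing.
  static String describe(DBObject query) {
    return "Executing query: " + SerializationUtils.serializeToJsonSafely(query);
  }

  public static void main(String[] args) {
    System.out.println(describe(new BasicDBObject("lastname", "Matthews")));
  }
}
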
Oliver Gierke
5cf7a86023 DATAMONGO-469 - Fixed parsing of And keyword in derived queries. 2012-06-26 13:44:22 +02:00
Oliver Gierke
0aacb887de DATAMONGO-470 - Implemented equals(…) and hashCode() for Query and Criteria. 2012-06-26 13:36:00 +02:00
Oliver Gierke
ba81f21aba DATAMONGO-467 - Fix identifier handling for Querydsl.
As we try to massage the value of the id property into an ObjectId if possible, we need to do so when mapping the Querydsl query as well. Adapted SpringDataMongoDbSerializer accordingly.
2012-06-25 13:11:37 +02:00
Oliver Gierke
43dee69fe0 DATAMONGO-464 - Fixed resource synchronization in MongoDbUtils.
MongoDbUtils now correctly returns DB instances for databases other than the first one bound. So far, the lookup for an alternate database resulted in the first one bound being returned.
2012-06-25 11:16:31 +02:00
Oliver Gierke
1be1297ef9 DATAMONGO-466 - QueryMapper now only tries id conversion for top level document.
So far, the QueryMapper tried to map id properties of nested documents to ObjectIds, which it actually shouldn't do.
2012-06-22 15:14:37 +02:00
Spring Buildmaster
dad0789356 DATAMONGO-463 - Prepare next development iteration. 2012-06-20 03:53:31 -07:00
Spring Buildmaster
80ee7d9553 DATAMONGO-463 - Release 1.0.2.RELEASE. 2012-06-20 03:53:28 -07:00
Oliver Gierke
7e3dfa5504 DATAMONGO-463 - Polished pom.
Upgrade to Log4J 1.2.16 to remove the need for the exclusions. Replaced ${pom.version} with ${project.version}.
2012-06-20 12:48:14 +02:00
Oliver Gierke
b5b11772b6 DATAMONGO-463 - Prepare 1.0.2 release.
Polished pom and updated changelog.
2012-06-20 12:36:48 +02:00
Oliver Gierke
416dc563f2 DATAMONGO-463 - Update reference documentation to point to 1.0.2.RELEASE JARS. 2012-06-20 12:24:27 +02:00
Oliver Gierke
a41b877081 DATAMONGO-455 - Documentation mentions BasicQuery. 2012-06-20 12:23:32 +02:00
Oliver Gierke
c4c8e368ca DATAMONGO-454 - Improvements to ServerAddressPropertyEditor.
ServerAddressPropertyEditor now only fails if none of the configured addresses can be parsed correctly. Strengthened the parsing implementation to not fail for host-only entries or accidental double commas. Cleaned up test dependency setup.
2012-06-20 11:27:37 +02:00
Oliver Gierke
11f0c515b0 DATAMONGO-378 - Fixed potential ClassCastException for MapReduceResults and upcoming MongoDB release.
The type of the value returned for the total field of the timing map in map-reduce results has apparently changed from Integer to Long as of MongoDB 2.1.0. Changed MapReduceResults to accommodate both Integer and Long values.
2012-06-20 10:05:53 +02:00
Oliver Gierke
25a94bc45e DATAMONGO-462 - Added custom converters for URL.
So far URL instances were treated as entities and serialized as nested documents. As there was no custom converter registered to re-instantiate the objects and URL does not have a no-arg constructor, reading the instances back in resulted in an ugly exception in ReflectionEntityInstantiator. We now register a custom Converter to serialize URL instances as their plain toString() representation. This makes reading work out of the box, as the StringToObjectConverter registered by default happens to use URL's constructor taking a String. To make sure this keeps working, we added an explicit StringToURLConverter to implement symmetric conversion.
2012-06-19 18:55:19 +02:00
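The converter pair described above roughly corresponds to the following sketch (illustrative only; the actual converters ship with the library):

import java.net.MalformedURLException;
import java.net.URL;

import org.springframework.core.convert.converter.Converter;

// Writes a URL as its plain String representation instead of a nested document.
class UrlToStringConverter implements Converter<URL, String> {
  public String convert(URL source) {
    return source.toString();
  }
}

// Reads the String back into a URL, making the conversion symmetric.
class StringToUrlConverter implements Converter<String, URL> {
  public URL convert(String source) {
    try {
      return new URL(source);
    } catch (MalformedURLException e) {
      throw new IllegalArgumentException("Cannot convert '" + source + "' into a URL!", e);
    }
  }
}
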
Oliver Gierke
783cec0325 DATAMONGO-461 - Fixed potential NullPointerException in MappedConstructor.
Reject PersistentEntity instances that don't have a PersistenceConstructor, as this indicates a mapping problem (either a constructor needs to be annotated or a custom converter registered).
2012-06-19 17:46:54 +02:00
Oliver Gierke
b02e81c481 DATAMONGO-428 - Fixed parsing of output collection for complex MapReduce result.
The raw result of a map-reduce operation might contain a complex element holding the output collection in case the original request configured an output database as an option. Adapted the parsing of the output collection to accommodate both scenarios (a plain String value as well as a DBObject wrapper).
2012-06-19 09:29:56 +02:00
Oliver Gierke
3c90b4987d Added missing dependency declarations.
The build has worked so far because we relied on the dependencies being pulled in transitively, but it's best practice not to do so.
2012-06-15 19:25:31 +02:00
Oliver Gierke
a4a03b0164 DATAMONGO-450 - Log output uses mapped query for debug logging. 2012-06-14 12:36:42 +02:00
Oliver Gierke
651255ca58 DATAMONGO-447 - Fixed broken log output in debug level.
The debug output now uses the already mapped query object when concatenating the log string. Improved applying the id after save operations by inspecting whether the object already has the id set before trying to set it; this could have caused problems in case you use a complex id and don't provide a custom converter, as it can be serialized out of the box. Fixed a minor glitch in MappingMongoConverter which was not really a bug, as another path through the code covered the scenario later on. Introduced a SerializationUtils class that provides a method to safely serialize objects to pseudo JSON. Pseudo in the sense that it simply renders a complex object as { $java : object.toString() }. This is useful for debug output before the DBObject has been mapped into Mongo-native types.
2012-06-14 12:27:19 +02:00
Maciej Walkowiak
ccf006e41b DATAMONGO-446 - Fixed bug in paging query methods returning Lists.
Using List as the return type for paginating query methods didn't work so far. Fixed by inspecting the Pageable parameter potentially handed into them and restricting the result set accordingly.
2012-05-15 13:24:51 +02:00
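A sketch of the query method shape this enables, assuming a hypothetical Person document; the Pageable handed in now restricts the List to the requested page:

import java.util.List;

import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;
import org.springframework.data.mongodb.repository.MongoRepository;

class Person {
  String id;
  String lastname;
}

interface PersonRepository extends MongoRepository<Person, String> {
  // Returns a plain List but still honours the Pageable handed in.
  List<Person> findByLastname(String lastname, Pageable pageable);
}

class PagingUsage {
  // Requests the second page of ten results.
  static List<Person> secondPage(PersonRepository repository) {
    return repository.findByLastname("Matthews", new PageRequest(1, 10));
  }
}
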
Oliver Gierke
cb6a1b7110 DATAMONGO-429 - Fixed handling of nested arrays in QueryMapper.
QueryMapper now correctly transforms arrays, no longer converting them into BasicDBObjects.
2012-04-16 15:22:00 +02:00
Oliver Gierke
ba5a764f5d DATAMONGO-423 - Fixed handling of negated regular expressions.
So far, combining the not() method with the regex(…) methods on Criteria created an invalid query. Fixed the regex(…) method to always transform the regex expression and options into a Pattern instance and render it according to the $not state.
2012-04-16 15:22:00 +02:00
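The affected combination, sketched with a made-up field name:

import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

class NegatedRegexExample {
  // Matches documents whose name does not start with "Mat"; the expression is
  // turned into a Pattern and rendered according to the $not state.
  static Query nameNotStartingWithMat() {
    return new Query(Criteria.where("name").not().regex("^Mat"));
  }
}
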
Oliver Gierke
3be35cba2d DATAMONGO-425 - Fixed parameter binding for Dates and manually defined queries.
Replaced manual JSON serialization for special parameters inside StringBasedMongoQuery by calling JSON.serialize(…).
2012-04-16 15:22:00 +02:00
Oliver Gierke
9421c45c5a DATAMONGO-181 - Improved resource handling for Mongo instance.
SimpleMongoDbFactory now only closes the Mongo instance if it created it itself. Removed the public getter for WriteConcern and now hold a UserCredentials instance instead of its parts.
2012-04-16 15:21:52 +02:00
Oliver Gierke
885c1b0f2c DATAMONGO-422 - Fixed invalid UUID conversion.
Removed UUIDToBinaryConverter and BinaryToUUIDConverter as the MongoDB Java driver can handle it itself. Added UUID as Mongo-simple type. Added integration test for reading and writing a UUID property.
2012-04-16 15:21:52 +02:00
Oliver Gierke
c8bb46ffb3 DATAMONGO-413 - Fixed bug in MongoQueryCreator.
MongoQueryCreator used the outdated OrQuery class to concatenate parts with OR. Refactored the class to use the Criteria.orOperator(…) method and deprecated the OrQuery class to be removed in the 1.1.x branch.
2012-04-16 15:21:51 +02:00
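The replacement API in a short sketch with made-up field names:

import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

class OrOperatorExample {
  // Concatenates the parts with $or instead of using the deprecated OrQuery class:
  // { "$or" : [ { "lastname" : "Matthews" }, { "age" : 25 } ] }
  static Query lastnameOrAge() {
    return new Query(new Criteria().orOperator(
        Criteria.where("lastname").is("Matthews"),
        Criteria.where("age").is(25)));
  }
}
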
Oliver Gierke
f82de367c8 DATAMONGO-366 - Polished reference documentation.
Fixed link to bug tracker. Polished docbook files a bit.
2012-04-16 15:21:51 +02:00
Oliver Gierke
5e62675bae DATAMONGO-412 - Fixed duplicate invocation of getUserCredentials(). 2012-04-16 15:21:51 +02:00
Oliver Gierke
c805d9ccae DATAMONGO-273, DATAMONGO-294 - Re-enabled accidentally disabled test case. 2012-04-16 15:21:51 +02:00
Oliver Gierke
2d97288917 DATAMONGO-360 - Fixed index information creation for geo indexes.
Fixed a ClassCastException that occurred because we didn't consider index information of geo indexes (they return "2d" as direction). Introduced new IndexField abstraction that supersedes the fieldSpec Map in IndexInfo.
2012-04-16 15:21:51 +02:00
Oliver Gierke
e38448a569 DATAMONGO-382 - Fixed potential ClassCastException in MappingMongoConverter.
MappingMongoConverter's convertToMongoType(…) now deals with Sets (and more generally all Collections) correctly.
2012-04-16 15:21:50 +02:00
Oliver Gierke
b1065b8f2d DATAMONGO-408 - Added StringToWriteConverter for XML setup convenience.
When using a PropertyPlaceHolderConfigurer to set WriteConcerns on a MongoFactoryBean just like this:

<bean class="….mongodb.core.MongoFactoryBean">
  <property name="writeConcern" value="${mongodb.writeConcern}"/>
</bean>

we might create invalid WriteConcerns, as the BeanFactory will by default use WriteConcern's constructor taking a String to create the instance. To make Spring use the valueOf(…) method instead, one needs to register either our already existing WriteConcernPropertyEditor or the newly introduced StringToWriteConcernConverter in Spring's ConversionService.
2012-04-16 15:21:50 +02:00
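Registered with a ConversionService, the converter resolves well-known constant names through WriteConcern.valueOf(…) before falling back to the String constructor; a minimal programmatic sketch:

import com.mongodb.WriteConcern;

import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.data.mongodb.config.StringToWriteConcernConverter;

class WriteConcernConversionExample {
  public static void main(String[] args) {
    DefaultConversionService conversionService = new DefaultConversionService();
    conversionService.addConverter(new StringToWriteConcernConverter());

    // Resolves to the well-known constant via WriteConcern.valueOf("SAFE").
    WriteConcern safe = conversionService.convert("SAFE", WriteConcern.class);
    System.out.println(safe);
  }
}
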
Oliver Gierke
8cac1d9368 DATAMONGO-411 - Double check type of PersistentEntity for index creation.
The Spring container does not check nested generic types of the type parameter of ApplicationEvent<T>. As T is parameterized in our case as well (PersistentEntity<…, …>), we can code an event listener against that fully parameterized type but might run into ClassCastExceptions, as other implementations might get handed into the method at runtime. We now do an instanceof check to invoke checkForIndexes(…) safely, only in case we get the correct event type.
2012-04-16 15:21:44 +02:00
Spring Buildmaster
7184950f8a Prepare next development version. 2012-02-11 06:41:56 -08:00
Spring Buildmaster
edd71cac78 Release version 1.0.1.RELEASE. 2012-02-11 06:41:53 -08:00
Oliver Gierke
82bd7a69eb DATAMONGO-395 - Updated changelog for 1.0.1. 2012-02-11 14:33:30 +01:00
Oliver Gierke
b434a0810e DATAMONGO-401 - Fixed NullPointerException in StringBasedMongoQuery. 2012-02-11 14:32:22 +01:00
Oliver Gierke
40236d4099 DATAMONGO-380 - Improved map handling for keys containing dots.
MappingMongoConverter now rejects objects that would result in field keys containing a dot, as we cannot reliably escape and unescape them without potentially wrecking correct keys on reading. However, I added a property mapKeyReplacement that can be set to e.g. ~ to have all dots in map keys replaced with ~. This will of course cause ~ to be transformed back into dots when reading. If further customization is necessary, override potentiallyEscapeMapKey(…) and potentiallyUnescapeMapKey(…).
2012-02-10 18:20:43 +01:00
Oliver Gierke
8f6d940036 DATAMONGO-397 - Replaced references to MongoTemplate with MongoOperations in repository package.
The reference to MongoTemplate in MongoRepositoryFactoryBean was unnecessary and could cause problems in case the MongoTemplate is proxied, as it then can't be wired into the factory anymore.
2012-02-10 17:11:27 +01:00
Oliver Gierke
95a92ccf5d DATAMONGO-395 - Removed invalid XML element. 2012-02-08 21:24:34 +01:00
Oliver Gierke
a6db24554f DATAMONGO-395 - Upgrade to Spring Data Commons 1.2.1. 2012-02-08 21:24:34 +01:00
Oliver Gierke
2f6c61ef9c DATAMONGO-395 - Polished pom.xml for Maven Central release. 2012-02-08 21:24:29 +01:00
Oliver Gierke
d8bf7ebf3f DATAMONGO-395 - Refer to latest core repository documentation. 2012-02-08 20:20:09 +01:00
Oliver Gierke
ce42783e73 DATACMNS-390 - Added UUIDToBinaryConverter to be able to handle UUIDs by default. 2012-02-06 15:16:24 +01:00
Oliver Gierke
69474327c6 DATAMONGO-358 - Fixed collection reading when the property type is not a collection.
If you have a property of type Object and it contains a collection, we didn't properly read it back in, as creating the collection instance failed due to an invalid call to CollectionFactory. We now default the parameter handed to that call to List in case the property type is not a Collection at all.
2012-02-01 16:26:37 +01:00
Oliver Gierke
1bbe2e8247 DATAMONGO-385 - Added test case to show repositories working with Long id. 2012-02-01 15:30:02 +01:00
Oliver Gierke
94af898ae3 DATAMONGO-387 - Repository query execution for GeoPage results now works correctly.
Added special handling of GeoPage return types for repository query methods.
2012-02-01 14:42:49 +01:00
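A sketch of a repository method using the GeoPage return type, assuming a hypothetical Venue document with a geo-indexed location field (the signature is illustrative):

import org.springframework.data.domain.Pageable;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.GeoPage;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.repository.MongoRepository;

class Venue {
  String id;
  double[] location; // geo-indexed coordinates
}

interface VenueRepository extends MongoRepository<Venue, String> {
  // Returns a page of results enriched with the distance from the given point.
  GeoPage<Venue> findByLocationNear(Point point, Distance maxDistance, Pageable pageable);
}
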
Oliver Gierke
f6298f7005 DATAMONGO-375 - Polished XSD by removing unnecessary version numbers. 2012-02-01 10:00:19 +01:00
Oliver Gierke
d5b3c651b2 Consolidate Maven repository usage to use repo.springsource.org/libs-snapshot. 2012-01-31 18:47:52 +01:00
Oliver Gierke
33dd00f0b8 DATAMONGO-379 - Improved entity instantiation.
Huge refactoring of the way MappingMongoConverter instantiates entities. The constructor arguments now have to mirror a property exactly in terms of name. Thus we can pick up mapping information from the property to look up the correct value from the source document. The @Value annotation can be used to either inject completely arbitrary values into the instance (e.g. by referring to a Spring bean) or simply define an expression against the DBObject's fields:

class Sample {
  String foo;
  String bar;

  Sample(String foo, @Value("#root._bar") String bar) {
    this.foo = foo;
    this.bar = bar;
  }
}

trying to create an instance of this class from

{ "foo" : "FOO" } -> new Sample("FOO", null)
{ "_bar" : "BAR" } -> new Sample(null, "BAR").
2012-01-16 18:37:00 +00:00
Oliver Gierke
3207a81555 DATAMONGO-368 - MappingMongoConverter does not remove null values from collections anymore. 2012-01-16 18:36:47 +00:00
Oliver Gierke
d231519012 DATAMONGO-376 - Fixed potential NPE in SpringDataMongodbSerializer. 2012-01-12 12:26:39 +01:00
Oliver Gierke
e052ecc9a4 Polished formatting and Javadoc. 2012-01-12 12:26:10 +01:00
Oliver Gierke
071f2934a1 DATAMONGO-369 - Fixed query mapping when a DBObject is included in the query object.
Replaced a premature return with continue so that the for loop proceeds appropriately.
2012-01-12 11:20:16 +01:00
Oliver Gierke
d2a18e9b11 DATAMONGO-373 - Fixed potential ClassCastException in QueryMapper.
QueryMapper assumed it would find a BasicBSONList for $(n)or operators. This is generally true if the DBObject was created through our Query abstraction, but it fails if you use the MongoDB driver's QueryBuilder. We now only insist on an Iterable, which fixes the issue.
2012-01-12 11:20:05 +01:00
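The scenario described, sketched with the plain driver QueryBuilder; the list it produces for $or need not be a BasicBSONList, so QueryMapper now only requires an Iterable:

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mongodb.QueryBuilder;

class DriverQueryBuilderExample {
  // Builds { "$or" : [ { "firstname" : "Dave" }, { "lastname" : "Matthews" } ] }
  // using the driver's QueryBuilder instead of the Query/Criteria abstraction.
  static DBObject firstnameOrLastname() {
    return QueryBuilder.start()
        .or(new BasicDBObject("firstname", "Dave"), new BasicDBObject("lastname", "Matthews"))
        .get();
  }
}
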
Oliver Gierke
d684fa1f8e Fixed broken integration test. 2012-01-12 11:19:43 +01:00
Oliver Gierke
1a0077231d DATAMONGO-357 - Prepare 1.0.1 development branch. 2012-01-12 11:15:39 +01:00
161 changed files with 4585 additions and 2032 deletions

49
pom.xml
View File

@@ -1,11 +1,12 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-dist</artifactId>
<name>Spring Data MongoDB Distribution</name>
<version>1.0.0.RELEASE</version>
<description>Spring Data project for MongoDB</description>
<url>http://www.springsource.org/spring-data/mongodb</url>
<version>1.0.5.BUILD-SNAPSHOT</version>
<packaging>pom</packaging>
<modules>
<module>spring-data-mongodb</module>
@@ -15,6 +16,17 @@
</modules>
<developers>
<developer>
<id>ogierke</id>
<name>Oliver Gierke</name>
<email>ogierke at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<roles>
<role>Project lead</role>
</roles>
<timezone>+1</timezone>
</developer>
<developer>
<id>trisberg</id>
<name>Thomas Risberg</name>
@@ -39,17 +51,6 @@
</roles>
<timezone>-5</timezone>
</developer>
<developer>
<id>ogierke</id>
<name>Oliver Gierke</name>
<email>ogierke at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>+1</timezone>
</developer>
<developer>
<id>jbrisbin</id>
<name>Jon Brisbin</name>
@@ -145,12 +146,12 @@
<htmlCustomization>${project.basedir}/src/docbkx/resources/xsl/html.xsl</htmlCustomization>
<useExtensions>1</useExtensions>
<highlightSource>1</highlightSource>
<highlightDefaultLanguage></highlightDefaultLanguage>
<highlightDefaultLanguage />
<!-- callouts -->
<entities>
<entity>
<name>version</name>
<value>${pom.version}</value>
<value>${project.version}</value>
</entity>
</entities>
<postProcess>
@@ -173,9 +174,7 @@
<include name="*.png" />
</fileset>
</copy>
<move file="${project.basedir}/target/site/reference/pdf/index.pdf"
tofile="${project.basedir}/target/site/reference/pdf/spring-data-mongo-reference.pdf"
failonerror="false"/>
<move file="${project.basedir}/target/site/reference/pdf/index.pdf" tofile="${project.basedir}/target/site/reference/pdf/spring-data-mongo-reference.pdf" failonerror="false" />
</postProcess>
</configuration>
</plugin>
@@ -184,7 +183,7 @@
<artifactId>maven-javadoc-plugin</artifactId>
<version>2.5</version>
<configuration>
<javadoc:aggregate>true</javadoc:aggregate>
<aggregate>true</aggregate>
<breakiterator>true</breakiterator>
<header>Spring Data Document</header>
<source>1.5</source>
@@ -270,9 +269,9 @@
<url>http://repository.springsource.com/maven/bundles/release</url>
</pluginRepository>
<pluginRepository>
<id>repository.springframework.maven.release</id>
<id>spring-libs-release</id>
<name>Spring Framework Maven Release Repository</name>
<url>http://repo.springsource.org/release</url>
<url>http://repo.springsource.org/libs-release</url>
</pluginRepository>
</pluginRepositories>
@@ -282,7 +281,7 @@
<site>
<id>static.springframework.org</id>
<url>
scp://static.springframework.org/var/www/domains/springframework.org/static/htdocs/spring-data/data-mongodb/docs/${project.version}
scp://static.springframework.org/var/www/domains/springframework.org/static/htdocs/spring-data/data-mongodb/snapshot-site
</url>
</site>
<repository>
@@ -296,5 +295,7 @@
<url>s3://maven.springframework.org/snapshot</url>
</snapshotRepository>
</distributionManagement>
<scm>
<url>https://github.com/SpringSource/spring-data-mongodb</url>
</scm>
</project>

View File

@@ -1,10 +1,10 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.0.0.RELEASE</version>
<version>1.0.5.BUILD-SNAPSHOT</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-cross-store</artifactId>
@@ -68,24 +68,6 @@
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<exclusions>
<exclusion>
<groupId>javax.mail</groupId>
<artifactId>mail</artifactId>
</exclusion>
<exclusion>
<groupId>javax.jms</groupId>
<artifactId>jms</artifactId>
</exclusion>
<exclusion>
<groupId>com.sun.jdmk</groupId>
<artifactId>jmxtools</artifactId>
</exclusion>
<exclusion>
<groupId>com.sun.jmx</groupId>
<artifactId>jmxri</artifactId>
</exclusion>
</exclusions>
<scope>runtime</scope>
</dependency>
@@ -95,17 +77,6 @@
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-all</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
@@ -151,25 +122,7 @@
</dependency>
</dependencies>
<repositories>
<repository>
<id>jboss-repository</id>
<name>JBoss Public Repository</name>
<url>http://repository.jboss.org/nexus/content/groups/public-jboss</url>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
<id>spring-maven-milestones</id>
<name>Springframework Maven Milestone Repository</name>
<url>http://maven.springframework.org/milestone</url>
</pluginRepository>
<pluginRepository>
<id>spring-maven-release</id>
<name>Springframework Maven Release Repository</name>
<url>http://maven.springframework.org/release</url>
</pluginRepository>
</pluginRepositories>
<build>
<plugins>
<plugin>

View File

@@ -1,11 +1,10 @@
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.0.0.RELEASE</version>
<version>1.0.5.BUILD-SNAPSHOT</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-log4j</artifactId>
@@ -28,47 +27,9 @@
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<exclusions>
<exclusion>
<groupId>javax.mail</groupId>
<artifactId>mail</artifactId>
</exclusion>
<exclusion>
<groupId>javax.jms</groupId>
<artifactId>jms</artifactId>
</exclusion>
<exclusion>
<groupId>com.sun.jdmk</groupId>
<artifactId>jmxtools</artifactId>
</exclusion>
<exclusion>
<groupId>com.sun.jmx</groupId>
<artifactId>jmxri</artifactId>
</exclusion>
</exclusions>
<scope>compile</scope>
</dependency>
<!-- Test dependencies -->
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-all</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.hamcrest</groupId>
<artifactId>hamcrest-all</artifactId>
<version>1.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>

View File

@@ -1,63 +1,101 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<name>Spring Data MongoDB Parent</name>
<description>Spring Data project for MongoDB</description>
<url>http://www.springsource.org/spring-data/mongodb</url>
<version>1.0.0.RELEASE</version>
<version>1.0.5.BUILD-SNAPSHOT</version>
<packaging>pom</packaging>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<!-- versions for commonly-used dependencies -->
<junit.version>4.8.1</junit.version>
<log4j.version>1.2.15</log4j.version>
<junit.version>4.10</junit.version>
<log4j.version>1.2.16</log4j.version>
<org.mockito.version>1.8.4</org.mockito.version>
<hamcrest.version>1.2.1</hamcrest.version>
<org.slf4j.version>1.5.10</org.slf4j.version>
<org.codehaus.jackson.version>1.6.1</org.codehaus.jackson.version>
<org.springframework.version.30>3.0.7.RELEASE</org.springframework.version.30>
<org.springframework.version.40>4.0.0.RELEASE</org.springframework.version.40>
<org.springframework.version.range>[${org.springframework.version.30}, ${org.springframework.version.40})</org.springframework.version.range>
<data.commons.version>1.2.0.RELEASE</data.commons.version>
<data.commons.version>1.2.1.RELEASE</data.commons.version>
<aspectj.version>1.6.11.RELEASE</aspectj.version>
</properties>
<profiles>
<profile>
<id>strict</id>
<properties>
<maven.test.failure.ignore>false</maven.test.failure.ignore>
</properties>
</profile>
<profile>
<id>fast</id>
<properties>
<maven.test.skip>true</maven.test.skip>
<maven.javadoc.skip>true</maven.javadoc.skip>
</properties>
</profile>
<profile>
<id>staging</id>
<distributionManagement>
<site>
<id>spring-site-staging</id>
<url>file:///${java.io.tmpdir}/spring-data/mongodb/docs</url>
</site>
<repository>
<id>spring-milestone-staging</id>
<url>file:///${java.io.tmpdir}/spring-data/mongodb/milestone</url>
</repository>
<snapshotRepository>
<id>spring-snapshot-staging</id>
<url>file:///${java.io.tmpdir}/spring-data/mongodb/snapshot</url>
</snapshotRepository>
</distributionManagement>
</profile>
<profile>
<id>bootstrap</id>
<!-- TODO: move the repositories in here before release -->
</profile>
</profiles>
<developers>
<developer>
<id>ogierke</id>
<name>Oliver Gierke</name>
<email>ogierke at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<roles>
<role>Project lead</role>
</roles>
<timezone>+1</timezone>
</developer>
<developer>
<id>trisberg</id>
<name>Thomas Risberg</name>
<email>trisberg at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.SpringSource.com</organizationUrl>
<roles>
<role>Project Admin</role>
<role>Developer</role>
</roles>
<timezone>-5</timezone>
</developer>
<developer>
<id>mpollack</id>
<name>Mark Pollack</name>
<email>mpollack at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.SpringSource.com</organizationUrl>
<roles>
<role>Project Admin</role>
<role>Developer</role>
</roles>
<timezone>-5</timezone>
</developer>
<developer>
<id>jbrisbin</id>
<name>Jon Brisbin</name>
<email>jbrisbin at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>-6</timezone>
</developer>
</developers>
<licenses>
<license>
<name>Apache License, Version 2.0</name>
<url>http://www.apache.org/licenses/LICENSE-2.0</url>
<comments>
Copyright 2010 the original author or authors.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied.
See the License for the specific language governing permissions and
limitations under the License.
</comments>
</license>
</licenses>
<distributionManagement>
<!-- see 'staging' profile for dry-run deployment settings -->
<downloadUrl>http://www.springsource.com/download/community
@@ -65,7 +103,7 @@
<site>
<id>static.springframework.org</id>
<url>
scp://static.springframework.org/var/www/domains/springframework.org/static/htdocs/spring-data/data-mongodb/docs/${project.version}
scp://static.springframework.org/var/www/domains/springframework.org/static/htdocs/spring-data/data-mongodb/snapshot-site
</url>
</site>
<repository>
@@ -79,6 +117,9 @@
<url>s3://maven.springframework.org/snapshot</url>
</snapshotRepository>
</distributionManagement>
<scm>
<url>https://github.com/SpringSource/spring-data-mongodb</url>
</scm>
<dependencyManagement>
<!--
inheritable <dependency> declarations for child poms. children still
@@ -111,6 +152,11 @@
<artifactId>spring-tx</artifactId>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context</artifactId>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-orm</artifactId>
@@ -152,7 +198,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>${project.version}</version>
<version>1.0.5.BUILD-SNAPSHOT</version>
</dependency>
<!-- Logging -->
@@ -178,24 +224,6 @@
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j.version}</version>
<exclusions>
<exclusion>
<groupId>javax.mail</groupId>
<artifactId>mail</artifactId>
</exclusion>
<exclusion>
<groupId>javax.jms</groupId>
<artifactId>jms</artifactId>
</exclusion>
<exclusion>
<groupId>com.sun.jdmk</groupId>
<artifactId>jmxtools</artifactId>
</exclusion>
<exclusion>
<groupId>com.sun.jmx</groupId>
<artifactId>jmxri</artifactId>
</exclusion>
</exclusions>
<scope>runtime</scope>
</dependency>
@@ -208,14 +236,14 @@
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-all</artifactId>
<artifactId>mockito-core</artifactId>
<version>${org.mockito.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<artifactId>junit-dep</artifactId>
<version>${junit.version}</version>
<scope>test</scope>
</dependency>
@@ -223,19 +251,34 @@
</dependencies>
</dependencyManagement>
<dependencies>
<!--
dependency definitions to be inherited by child poms. any
<dependency> declarations here will automatically show up on child
project classpaths. only items that are truly common across all
projects (modules and samples) should go here. otherwise, consider
<dependencyManagement> above
-->
<!-- Test dependencies -->
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.hamcrest</groupId>
<artifactId>hamcrest-library</artifactId>
<version>${hamcrest.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-core</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit-dep</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
@@ -250,48 +293,11 @@
<version>3.1.0.RELEASE</version>
</extension>
</extensions>
<resources>
<resource>
<directory>${project.basedir}/src/main/java</directory>
<includes>
<include>**/*</include>
</includes>
<excludes>
<exclude>**/*.java</exclude>
</excludes>
</resource>
<resource>
<directory>${project.basedir}/src/main/resources</directory>
<includes>
<include>**/*</include>
</includes>
</resource>
</resources>
<testResources>
<testResource>
<directory>${project.basedir}/src/test/java</directory>
<includes>
<include>**/*</include>
</includes>
<excludes>
<exclude>**/*.java</exclude>
</excludes>
</testResource>
<testResource>
<directory>${project.basedir}/src/test/resources</directory>
<includes>
<include>**/*</include>
</includes>
<excludes>
<exclude>**/*.java</exclude>
</excludes>
</testResource>
</testResources>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.3.2</version>
<version>2.5.1</version>
<configuration>
<source>1.5</source>
<target>1.5</target>
@@ -311,13 +317,13 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.8</version>
<version>2.10</version>
<configuration>
<useFile>false</useFile>
<includes>
<include>**/*Tests.java</include>
</includes>
<junitArtifactName>junit:junit</junitArtifactName>
<junitArtifactName>junit:junit-dep</junitArtifactName>
</configuration>
</plugin>
<plugin>
@@ -369,22 +375,18 @@
</build>
<pluginRepositories>
<pluginRepository>
<!-- necessary for bundlor and utils -->
<id>repository.plugin.springsource.release</id>
<name>SpringSource Maven Repository</name>
<url>http://repository.springsource.com/maven/bundles/release</url>
<id>spring-plugins-release</id>
<url>http://repo.springsource.org/plugins-release</url>
</pluginRepository>
<pluginRepository>
<id>repository.springframework.maven.release</id>
<name>Spring Framework Maven Release Repository</name>
<url>http://repo.springsource.org/release</url>
<id>querydsl</id>
<url>http://source.mysema.com/maven2/releases</url>
</pluginRepository>
</pluginRepositories>
<repositories>
<repository>
<id>repository.springframework.maven.release</id>
<name>Spring Framework Maven Release Repository</name>
<url>http://repo.springsource.org/release</url>
<id>spring-libs-release</id>
<url>http://repo.springsource.org/libs-release</url>
</repository>
</repositories>
<reporting>

View File

@@ -1,11 +1,10 @@
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.0.0.RELEASE</version>
<version>1.0.5.BUILD-SNAPSHOT</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb</artifactId>
@@ -19,10 +18,22 @@
<dependencies>
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-expression</artifactId>
@@ -58,19 +69,6 @@
<artifactId>querydsl-mongodb</artifactId>
<version>${querydsl.version}</version>
<optional>true</optional>
<exclusions>
<exclusion>
<groupId>com.google.code.morphia</groupId>
<artifactId>morphia</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>com.mysema.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>${querydsl.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
@@ -80,25 +78,6 @@
</dependency>
<!-- Test dependencies -->
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-all</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.hamcrest</groupId>
<artifactId>hamcrest-all</artifactId>
<version>1.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
@@ -109,8 +88,8 @@
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jcl-over-slf4j</artifactId>
@@ -128,6 +107,7 @@
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
@@ -138,7 +118,14 @@
<plugin>
<groupId>com.mysema.maven</groupId>
<artifactId>maven-apt-plugin</artifactId>
<version>1.0.2</version>
<version>1.0.4</version>
<dependencies>
<dependency>
<groupId>com.mysema.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>${querydsl.version}</version>
</dependency>
</dependencies>
<executions>
<execution>
<phase>generate-test-sources</phase>
@@ -155,14 +142,4 @@
</plugins>
</build>
<repositories>
<repository>
<id>querydsl</id>
<name>Mysema QueryDsl</name>
<url>http://source.mysema.com/maven2/releases</url>
<snapshots>
<enabled>false</enabled>
</snapshots>
</repository>
</repositories>
</project>

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,7 +18,6 @@ package org.springframework.data.mongodb.config;
import java.util.HashSet;
import java.util.Set;
import com.mongodb.Mongo;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider;
@@ -35,6 +34,14 @@ import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
/**
* Abstract base class to ease JavaConfig setup for Spring Data MongoDB.
*
* @author Mark Pollack
* @author Oliver Gierke
*/
@Configuration
public abstract class AbstractMongoConfiguration {
@@ -50,10 +57,11 @@ public abstract class AbstractMongoConfiguration {
@Bean
public MongoDbFactory mongoDbFactory() throws Exception {
if (getUserCredentials() == null) {
UserCredentials credentials = getUserCredentials();
if (credentials == null) {
return new SimpleMongoDbFactory(mongo(), getDatabaseName());
} else {
return new SimpleMongoDbFactory(mongo(), getDatabaseName(), getUserCredentials());
return new SimpleMongoDbFactory(mongo(), getDatabaseName(), credentials);
}
}

View File

@@ -158,8 +158,8 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
String packageToScan = customerConvertersElement.getAttribute(BASE_PACKAGE);
if (StringUtils.hasText(packageToScan)) {
ClassPathScanningCandidateComponentProvider provider = new ClassPathScanningCandidateComponentProvider(true);
provider.addExcludeFilter(new NegatingFilter(new AssignableTypeFilter(Converter.class), new AssignableTypeFilter(
GenericConverter.class)));
provider.addExcludeFilter(new NegatingFilter(new AssignableTypeFilter(Converter.class),
new AssignableTypeFilter(GenericConverter.class)));
for (BeanDefinition candidate : provider.findCandidateComponents(packageToScan)) {
converterBeans.add(candidate);

View File

@@ -17,7 +17,11 @@ package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import java.net.UnknownHostException;
import java.util.HashSet;
import java.util.Set;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.util.StringUtils;
import com.mongodb.ServerAddress;
@@ -30,6 +34,8 @@ import com.mongodb.ServerAddress;
*/
public class ServerAddressPropertyEditor extends PropertyEditorSupport {
private static final Log LOG = LogFactory.getLog(ServerAddressPropertyEditor.class);
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
@@ -38,21 +44,49 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
public void setAsText(String replicaSetString) {
String[] replicaSetStringArray = StringUtils.commaDelimitedListToStringArray(replicaSetString);
ServerAddress[] serverAddresses = new ServerAddress[replicaSetStringArray.length];
Set<ServerAddress> serverAddresses = new HashSet<ServerAddress>(replicaSetStringArray.length);
for (int i = 0; i < replicaSetStringArray.length; i++) {
for (String element : replicaSetStringArray) {
String[] hostAndPort = StringUtils.delimitedListToStringArray(replicaSetStringArray[i], ":");
ServerAddress address = parseServerAddress(element);
if (address != null) {
serverAddresses.add(address);
}
}
if (serverAddresses.isEmpty()) {
throw new IllegalArgumentException(
"Could not resolve at least one server of the replica set configuration! Validate your config!");
}
setValue(serverAddresses.toArray(new ServerAddress[serverAddresses.size()]));
}
/**
* Parses the given source into a {@link ServerAddress}.
*
* @param source
* @return the
*/
private ServerAddress parseServerAddress(String source) {
String[] hostAndPort = StringUtils.delimitedListToStringArray(source.trim(), ":");
if (!StringUtils.hasText(source) || hostAndPort.length > 2) {
LOG.warn(String.format("Could not parse address source '%s'. Check your replica set configuration!", source));
return null;
}
try {
serverAddresses[i] = new ServerAddress(hostAndPort[0], Integer.parseInt(hostAndPort[1]));
} catch (NumberFormatException e) {
throw new IllegalArgumentException("Could not parse port " + hostAndPort[1], e);
return hostAndPort.length == 1 ? new ServerAddress(hostAndPort[0]) : new ServerAddress(hostAndPort[0],
Integer.parseInt(hostAndPort[1]));
} catch (UnknownHostException e) {
throw new IllegalArgumentException("Could not parse host " + hostAndPort[0], e);
}
LOG.warn(String.format("Could not parse host '%s'. Check your replica set configuration!", hostAndPort[0]));
} catch (NumberFormatException e) {
LOG.warn(String.format("Could not parse port '%s'. Check your replica set configuration!", hostAndPort[1]));
}
setValue(serverAddresses);
return null;
}
}

View File

@@ -0,0 +1,38 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.core.convert.converter.Converter;
import com.mongodb.WriteConcern;
/**
* Converter to create {@link WriteConcern} instances from String representations.
*
* @author Oliver Gierke
*/
public class StringToWriteConcernConverter implements Converter<String, WriteConcern> {
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
public WriteConcern convert(String source) {
WriteConcern writeConcern = WriteConcern.valueOf(source);
return writeConcern != null ? writeConcern : new WriteConcern(source);
}
}

View File

@@ -16,14 +16,13 @@
package org.springframework.data.mongodb.core;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexField;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.util.Assert;
@@ -123,25 +122,30 @@ public class DefaultIndexOperations implements IndexOperations {
return getIndexData(dbObjectList);
}
@SuppressWarnings("unchecked")
private List<IndexInfo> getIndexData(List<DBObject> dbObjectList) {
List<IndexInfo> indexInfoList = new ArrayList<IndexInfo>();
for (DBObject ix : dbObjectList) {
Map<String, Order> keyOrderMap = new LinkedHashMap<String, Order>();
DBObject keyDbObject = (DBObject) ix.get("key");
Iterator<?> entries = keyDbObject.toMap().entrySet().iterator();
int numberOfElements = keyDbObject.keySet().size();
while (entries.hasNext()) {
Entry<Object, Integer> thisEntry = (Entry<Object, Integer>) entries.next();
String key = thisEntry.getKey().toString();
int value = thisEntry.getValue();
if (value == 1) {
Map<String, Order> keyOrderMap = new HashMap<String, Order>();
List<IndexField> indexFields = new ArrayList<IndexField>(numberOfElements);
for (String key : keyDbObject.keySet()) {
Object value = keyDbObject.get(key);
if (Integer.valueOf(1).equals(value)) {
keyOrderMap.put(key, Order.ASCENDING);
} else {
indexFields.add(IndexField.create(key, Order.ASCENDING));
} else if (Integer.valueOf(-1).equals(value)) {
keyOrderMap.put(key, Order.DESCENDING);
indexFields.add(IndexField.create(key, Order.DESCENDING));
} else if ("2d".equals(value)) {
indexFields.add(IndexField.geo(key));
}
}
@@ -151,12 +155,11 @@ public class DefaultIndexOperations implements IndexOperations {
boolean dropDuplicates = ix.containsField("dropDups") ? (Boolean) ix.get("dropDups") : false;
boolean sparse = ix.containsField("sparse") ? (Boolean) ix.get("sparse") : false;
indexInfoList.add(new IndexInfo(keyOrderMap, name, unique, dropDuplicates, sparse));
indexInfoList.add(new IndexInfo(keyOrderMap, indexFields, name, unique, dropDuplicates, sparse));
}
return indexInfoList;
}
});
}
}

View File

@@ -21,12 +21,13 @@ import com.mongodb.DBObject;
import com.mongodb.MongoException;
/**
* An interface used by {@link MongoTemplate} for processing documents returned from a MongoDB query on a per-document basis.
* Implementations of this interface perform the actual work of prcoessing each document but don't need to worry about
* exception handling. {@MongoException}s will be caught and translated by the calling MongoTemplate
* An interface used by {@link MongoTemplate} for processing documents returned from a MongoDB query on a per-document
* basis. Implementations of this interface perform the actual work of prcoessing each document but don't need to worry
* about exception handling. {@MongoException}s will be caught and translated by the calling
* MongoTemplate
*
* An DocumentCallbackHandler is typically stateful: It keeps the result state within the object, to be available later for later
* inspection.
* An DocumentCallbackHandler is typically stateful: It keeps the result state within the object, to be available later
* for later inspection.
*
* @author Mark Pollack
*

View File

@@ -59,6 +59,4 @@ public class FindAndModifyOptions {
return remove;
}
}

View File

@@ -23,12 +23,14 @@ import com.mongodb.WriteConcern;
* Represents an action taken against the collection. Used by {@link WriteConcernResolver} to determine a custom
* WriteConcern based on this information.
*
* Properties that will always be not-null are collectionName and defaultWriteConcern.
* The EntityClass is null only for the MongoActionOperaton.INSERT_LIST.
* Properties that will always be not-null are collectionName and defaultWriteConcern. The EntityClass is null only for
* the MongoActionOperaton.INSERT_LIST.
*
* INSERT, SAVE have null query,
* REMOVE has null document
* INSERT_LIST has null entityClass, document, and query.
* <ul>
* <li>INSERT, SAVE have null query</li>
* <li>REMOVE has null document</li>
* <li>INSERT_LIST has null entityClass, document, and query</li>
* </ul>
*
* @author Mark Pollack
*
@@ -49,6 +51,7 @@ public class MongoAction {
/**
* Create an instance of a MongoAction
*
* @param defaultWriteConcern the default write concern
* @param mongoActionOperation action being taken against the collection
* @param collectionName the collection name
@@ -91,6 +94,4 @@ public class MongoAction {
return document;
}
}

View File

@@ -16,8 +16,8 @@
package org.springframework.data.mongodb.core;
/**
* Enumeration for operations on a collection. Used with {@link MongoAction} to help determine the
* WriteConcern to use for a given mutating operation
* Enumeration for operations on a collection. Used with {@link MongoAction} to help determine the WriteConcern to use
* for a given mutating operation
*
* @author Mark Pollack
* @see MongoAction
@@ -25,9 +25,5 @@ package org.springframework.data.mongodb.core;
*/
public enum MongoActionOperation {
REMOVE,
UPDATE,
INSERT,
INSERT_LIST,
SAVE
REMOVE, UPDATE, INSERT, INSERT_LIST, SAVE
}

View File

@@ -15,14 +15,15 @@
*/
package org.springframework.data.mongodb.core;
import com.mongodb.DB;
import com.mongodb.Mongo;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import org.springframework.util.Assert;
import com.mongodb.DB;
import com.mongodb.Mongo;
/**
* Helper class featuring helper methods for internal MongoDb classes.
* <p/>
@@ -78,7 +79,7 @@ public abstract class MongoDbUtils {
DB db = null;
if (TransactionSynchronizationManager.isSynchronizationActive() && dbHolder.doesNotHoldNonDefaultDB()) {
// Spring transaction management is active ->
db = dbHolder.getDB();
db = dbHolder.getDB(databaseName);
if (db != null && !dbHolder.isSynchronizedWithTransaction()) {
LOGGER.debug("Registering Spring transaction synchronization for existing Mongo DB");
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(dbHolder, mongo));
@@ -96,12 +97,14 @@ public abstract class MongoDbUtils {
boolean credentialsGiven = username != null && password != null;
if (credentialsGiven && !db.isAuthenticated()) {
// Note, can only authenticate once against the same com.mongodb.DB object.
synchronized (db) {
if (!db.authenticate(username, password)) {
throw new CannotGetMongoDbConnectionException("Failed to authenticate to database [" + databaseName
+ "], username = [" + username + "], password = [" + new String(password) + "]", databaseName, username,
password);
}
}
}
// Use same Session for further Mongo actions within the transaction.
// Thread object will get removed by synchronization at transaction completion.
@@ -110,9 +113,9 @@ public abstract class MongoDbUtils {
LOGGER.debug("Registering Spring transaction synchronization for new Hibernate Session");
DbHolder holderToUse = dbHolder;
if (holderToUse == null) {
holderToUse = new DbHolder(db);
holderToUse = new DbHolder(databaseName, db);
} else {
holderToUse.addDB(db);
holderToUse.addDB(databaseName, db);
}
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(holderToUse, mongo));
holderToUse.setSynchronizedWithTransaction(true);
@@ -143,7 +146,7 @@ public abstract class MongoDbUtils {
return false;
}
DbHolder dbHolder = (DbHolder) TransactionSynchronizationManager.getResource(mongo);
return (dbHolder != null && dbHolder.containsDB(db));
return dbHolder != null && dbHolder.containsDB(db);
}
/**

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -13,15 +13,14 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.Arrays;
import java.util.List;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.beans.factory.DisposableBean;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
@@ -39,12 +38,10 @@ import com.mongodb.WriteConcern;
* @author Oliver Gierke
* @since 1.0
*/
public class MongoFactoryBean implements FactoryBean<Mongo>, PersistenceExceptionTranslator {
public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, DisposableBean,
PersistenceExceptionTranslator {
/**
* Logger, available to subclasses.
*/
protected final Log logger = LogFactory.getLog(getClass());
private Mongo mongo;
private MongoOptions mongoOptions;
private String host;
@@ -89,34 +86,7 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, PersistenceExceptio
}
public Mongo getObject() throws Exception {
Mongo mongo;
ServerAddress defaultOptions = new ServerAddress();
if (mongoOptions == null) {
mongoOptions = new MongoOptions();
}
if (replicaPair != null) {
if (replicaPair.size() < 2) {
throw new CannotGetMongoDbConnectionException("A replica pair must have two server entries");
}
mongo = new Mongo(replicaPair.get(0), replicaPair.get(1), mongoOptions);
} else if (replicaSetSeeds != null) {
mongo = new Mongo(replicaSetSeeds, mongoOptions);
} else {
String mongoHost = host != null ? host : defaultOptions.getHost();
mongo = port != null ? new Mongo(new ServerAddress(mongoHost, port), mongoOptions) : new Mongo(mongoHost,
mongoOptions);
}
if (writeConcern != null) {
mongo.setWriteConcern(writeConcern);
}
return mongo;
}
/*
@@ -142,4 +112,45 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, PersistenceExceptio
public DataAccessException translateExceptionIfPossible(RuntimeException ex) {
return exceptionTranslator.translateExceptionIfPossible(ex);
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
*/
public void afterPropertiesSet() throws Exception {
Mongo mongo;
ServerAddress defaultOptions = new ServerAddress();
if (mongoOptions == null) {
mongoOptions = new MongoOptions();
}
if (replicaPair != null) {
if (replicaPair.size() < 2) {
throw new CannotGetMongoDbConnectionException("A replica pair must have two server entries");
}
mongo = new Mongo(replicaPair.get(0), replicaPair.get(1), mongoOptions);
} else if (replicaSetSeeds != null) {
mongo = new Mongo(replicaSetSeeds, mongoOptions);
} else {
String mongoHost = host != null ? host : defaultOptions.getHost();
mongo = port != null ? new Mongo(new ServerAddress(mongoHost, port), mongoOptions) : new Mongo(mongoHost,
mongoOptions);
}
if (writeConcern != null) {
mongo.setWriteConcern(writeConcern);
}
this.mongo = mongo;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.DisposableBean#destroy()
*/
public void destroy() throws Exception {
this.mongo.close();
}
}
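Editor's note: with the move to InitializingBean/DisposableBean above, the Mongo instance is now created once in afterPropertiesSet() and closed in destroy(), while getObject() simply hands out the cached instance. A hedged usage sketch outside a Spring container (host and port values are illustrative):

static Mongo createStandaloneMongo() throws Exception {
  MongoFactoryBean factory = new MongoFactoryBean();
  factory.setHost("localhost");      // illustrative host
  factory.setPort(27017);            // illustrative port
  factory.afterPropertiesSet();      // creates the underlying com.mongodb.Mongo
  return factory.getObject();        // returns the instance created above; destroy() closes it later
}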

View File

@@ -1,11 +1,7 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
@@ -20,6 +16,7 @@
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.SerializationUtils.*;
import java.io.IOException;
import java.lang.reflect.InvocationTargetException;
@@ -50,6 +47,7 @@ import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.convert.EntityReader;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.BeanWrapper;
import org.springframework.data.mapping.model.MappingException;
@@ -108,6 +106,7 @@ import com.mongodb.util.JSON;
* @author Graeme Rocher
* @author Mark Pollack
* @author Oliver Gierke
* @author Amol Nayak
*/
public class MongoTemplate implements MongoOperations, ApplicationContextAware {
@@ -333,11 +332,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
Assert.notNull(query);
DBObject queryObject = query.getQueryObject();
DBObject sortObject = query.getSortObject();
DBObject fieldsObject = query.getFieldsObject();
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("find using query: " + queryObject + " fields: " + fieldsObject + " in collection: "
+ collectionName);
LOGGER.debug(String.format("Executing query: %s sort: %s fields: %s in collection: $s",
serializeToJsonSafely(queryObject), sortObject, fieldsObject, collectionName));
}
this.executeQueryInternal(new FindCallback(queryObject, fieldsObject), preparer, dch, collectionName);
@@ -456,7 +456,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
} else {
query.limit(1);
List<T> results = find(query, entityClass, collectionName);
return (results.isEmpty() ? null : results.get(0));
return results.isEmpty() ? null : results.get(0);
}
}
@@ -732,11 +732,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.INSERT, collectionName,
entityClass, dbDoc, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
WriteResult wr;
if (writeConcernToUse == null) {
collection.insert(dbDoc);
wr = collection.insert(dbDoc);
} else {
collection.insert(dbDoc, writeConcernToUse);
wr = collection.insert(dbDoc, writeConcernToUse);
}
handleAnyWriteResultErrors(wr, dbDoc, "insert");
return dbDoc.get(ID);
}
});
@@ -755,11 +757,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.INSERT_LIST, collectionName, null,
null, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
WriteResult wr;
if (writeConcernToUse == null) {
collection.insert(dbDocList);
wr = collection.insert(dbDocList);
} else {
collection.insert(dbDocList.toArray((DBObject[]) new BasicDBObject[dbDocList.size()]), writeConcernToUse);
wr = collection.insert(dbDocList.toArray((DBObject[]) new BasicDBObject[dbDocList.size()]), writeConcernToUse);
}
handleAnyWriteResultErrors(wr, null, "insert_list");
return null;
}
});
@@ -786,11 +790,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.SAVE, collectionName, entityClass,
dbDoc, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
WriteResult wr;
if (writeConcernToUse == null) {
collection.save(dbDoc);
wr = collection.save(dbDoc);
} else {
collection.save(dbDoc, writeConcernToUse);
wr = collection.save(dbDoc, writeConcernToUse);
}
handleAnyWriteResultErrors(wr, dbDoc, "save");
return dbDoc.get(ID);
}
});
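Editor's note: capturing the WriteResult in the insert/save callbacks above is what makes handleAnyWriteResultErrors(…) effective for these operations. A minimal sketch of opting into exception-based checking (database name and domain class are made up):

MongoTemplate template = new MongoTemplate(mongo, "database");
template.setWriteResultChecking(WriteResultChecking.EXCEPTION);

// A write error reported by the server now surfaces as a DataIntegrityViolationException
// instead of being silently dropped.
template.save(new Person("Dave", "Matthews"));   // Person is a hypothetical domain class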
@@ -873,7 +879,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return;
}
remove(getIdQueryFor(object), collection);
doRemove(collection, getIdQueryFor(object), object.getClass());
}
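Editor's note: routing remove(Object, String) through doRemove(…) with the entity type means a String id that happens to be a valid ObjectId is converted before the delete query is issued. Usage stays the same; a sketch with hypothetical type and collection names:

Query byLastname = new Query(Criteria.where("lastname").is("Matthews"));
Person person = template.findOne(byLastname, Person.class, "people");

// The id query derived from the entity is now mapped using Person's metadata,
// so ObjectId-backed ids match the previously saved document.
template.remove(person, "people");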
/**
@@ -938,7 +944,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
entityClass, null, queryObject);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("remove using query: " + queryObject + " in collection: " + collection.getName());
LOGGER.debug("remove using query: " + dboq + " in collection: " + collection.getName());
}
if (writeConcernToUse == null) {
wr = collection.remove(dboq);
@@ -996,26 +1002,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
LOGGER.debug("Executing MapReduce on collection [" + command.getInput() + "], mapFunction [" + mapFunc
+ "], reduceFunction [" + reduceFunc + "]");
}
CommandResult commandResult = null;
try {
if (command.getOutputType() == MapReduceCommand.OutputType.INLINE) {
commandResult = executeCommand(commandObject, getDb().getOptions());
} else {
commandResult = executeCommand(commandObject);
}
commandResult.throwOnError();
} catch (RuntimeException ex) {
this.potentiallyConvertRuntimeException(ex);
}
String error = commandResult.getErrorMessage();
if (error != null) {
throw new InvalidDataAccessApiUsageException("Command execution failed: Error [" + error + "], Command = "
+ commandObject);
}
CommandResult commandResult = command.getOutputType() == MapReduceCommand.OutputType.INLINE ? executeCommand(
commandObject, getDb().getOptions()) : executeCommand(commandObject);
handleCommandError(commandResult, commandObject);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("MapReduce command result = [" + commandResult + "]");
LOGGER.debug(String.format("MapReduce command result = [%s]", serializeToJsonSafely(commandObject)));
}
MapReduceOutput mapReduceOutput = new MapReduceOutput(inputCollection, commandObject, commandResult);
List<T> mappedResults = new ArrayList<T>();
DbObjectCallback<T> callback = new ReadDbObjectCallback<T>(mongoConverter, entityClass);
@@ -1040,7 +1035,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
if (criteria == null) {
dbo.put("cond", null);
} else {
dbo.put("cond", criteria.getCriteriaObject());
dbo.put("cond", mapper.getMappedObject(criteria.getCriteriaObject(), null));
}
// If initial document was a JavaScript string, potentially loaded by Spring's Resource abstraction, load it and
// convert to DBObject
@@ -1066,21 +1061,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
DBObject commandObject = new BasicDBObject("group", dbo);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Executing Group with DBObject [" + commandObject.toString() + "]");
}
CommandResult commandResult = null;
try {
commandResult = executeCommand(commandObject, getDb().getOptions());
commandResult.throwOnError();
} catch (RuntimeException ex) {
this.potentiallyConvertRuntimeException(ex);
}
String error = commandResult.getErrorMessage();
if (error != null) {
throw new InvalidDataAccessApiUsageException("Command execution failed: Error [" + error + "], Command = "
+ commandObject);
LOGGER.debug(String.format("Executing Group with DBObject [%s]", serializeToJsonSafely(commandObject)));
}
CommandResult commandResult = executeCommand(commandObject, getDb().getOptions());
handleCommandError(commandResult, commandObject);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Group command result = [" + commandResult + "]");
}
@@ -1251,11 +1237,14 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
protected <S, T> List<T> doFind(String collectionName, DBObject query, DBObject fields, Class<S> entityClass,
CursorPreparer preparer, DbObjectCallback<T> objectCallback) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("find using query: " + query + " fields: " + fields + " for class: " + entityClass
+ " in collection: " + collectionName);
LOGGER.debug(String.format("find using query: %s fields: %s for class: %s in collection: %s",
serializeToJsonSafely(query), fields, entityClass, collectionName));
}
return executeFindMultiInternal(new FindCallback(mapper.getMappedObject(query, entity), fields), preparer,
objectCallback, collectionName);
}
@@ -1337,13 +1326,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
updateObj.put(key, mongoConverter.convertToMongoType(updateObj.get(key)));
}
DBObject mappedQuery = mapper.getMappedObject(query, entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("findAndModify using query: " + query + " fields: " + fields + " sort: " + sort + " for class: "
+ entityClass + " and update: " + updateObj + " in collection: " + collectionName);
LOGGER.debug("findAndModify using query: " + mappedQuery + " fields: " + fields + " sort: " + sort
+ " for class: " + entityClass + " and update: " + updateObj + " in collection: " + collectionName);
}
return executeFindOneInternal(new FindAndModifyCallback(mapper.getMappedObject(query, entity), fields, sort,
updateObj, options), new ReadDbObjectCallback<T>(readerToUse, entityClass), collectionName);
return executeFindOneInternal(new FindAndModifyCallback(mappedQuery, fields, sort, updateObj, options),
new ReadDbObjectCallback<T>(readerToUse, entityClass), collectionName);
}
/**
@@ -1364,9 +1355,19 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return;
}
ConversionService conversionService = mongoConverter.getConversionService();
BeanWrapper<PersistentEntity<Object, ?>, Object> wrapper = BeanWrapper.create(savedObject, conversionService);
try {
BeanWrapper.create(savedObject, mongoConverter.getConversionService()).setProperty(idProp, id);
Object idValue = wrapper.getProperty(idProp, idProp.getType(), true);
if (idValue != null) {
return;
}
wrapper.setProperty(idProp, id);
} catch (IllegalAccessException e) {
throw new MappingException(e.getMessage(), e);
} catch (InvocationTargetException e) {
@@ -1514,9 +1515,16 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
String error = wr.getError();
if (error != null) {
String message = String.format("Execution of %s%s failed: %s", operation, query == null ? "" : "' using '"
+ query.toString() + "' query", error);
String message;
if (operation.equals("insert") || operation.equals("save")) {
// assuming the insert operations will begin with insert string
message = String.format("Insert/Save for %s failed: %s", query, error);
} else if (operation.equals("insert_list")) {
message = String.format("Insert list failed: %s", error);
} else {
message = String.format("Execution of %s%s failed: %s", operation,
query == null ? "" : "' using '" + query.toString() + "' query", error);
}
if (WriteResultChecking.EXCEPTION == this.writeResultChecking) {
throw new DataIntegrityViolationException(message);
@@ -1539,6 +1547,27 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return resolved == null ? ex : resolved;
}
/**
* Inspects the given {@link CommandResult} for errors and potentially throws an
* {@link InvalidDataAccessApiUsageException} for that error.
*
* @param result must not be {@literal null}.
* @param source must not be {@literal null}.
*/
private void handleCommandError(CommandResult result, DBObject source) {
try {
result.throwOnError();
} catch (MongoException ex) {
String error = result.getErrorMessage();
error = error == null ? "NO MESSAGE" : error;
throw new InvalidDataAccessApiUsageException("Command execution failed: Error [" + error + "], Command = "
+ source, ex);
}
}
private static final MongoConverter getDefaultMongoConverter(MongoDbFactory factory) {
MappingMongoConverter converter = new MappingMongoConverter(factory, new MongoMappingContext());
converter.afterPropertiesSet();

View File

@@ -30,6 +30,7 @@ import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.util.Assert;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
@@ -41,6 +42,9 @@ import com.mongodb.DBObject;
*/
public class QueryMapper {
private static final List<String> DEFAULT_ID_NAMES = Arrays.asList("id", "_id");
private static final String N_OR_PATTERN = "\\$.*or";
private final ConversionService conversionService;
private final MongoConverter converter;
@@ -59,8 +63,8 @@ public class QueryMapper {
* Replaces the property keys used in the given {@link DBObject} with the appropriate keys by using the
* {@link PersistentEntity} metadata.
*
* @param query
* @param entity
* @param query must not be {@literal null}.
* @param entity can be {@literal null}.
* @return
*/
public DBObject getMappedObject(DBObject query, MongoPersistentEntity<?> entity) {
@@ -68,8 +72,10 @@ public class QueryMapper {
DBObject newDbo = new BasicDBObject();
for (String key : query.keySet()) {
String newKey = key;
Object value = query.get(key);
if (isIdKey(key, entity)) {
if (value instanceof DBObject) {
DBObject valueDbo = (DBObject) value;
@@ -80,35 +86,52 @@ public class QueryMapper {
ids.add(convertId(id));
}
valueDbo.put(inKey, ids.toArray(new Object[ids.size()]));
} else if (valueDbo.containsField("$ne")) {
valueDbo.put("$ne", convertId(valueDbo.get("$ne")));
} else {
value = getMappedObject((DBObject) value, entity);
value = getMappedObject((DBObject) value, null);
}
} else {
value = convertId(value);
}
newKey = "_id";
} else if (key.startsWith("$") && key.endsWith("or")) {
} else if (key.matches(N_OR_PATTERN)) {
// $or/$nor
BasicBSONList conditions = (BasicBSONList) value;
Iterable<?> conditions = (Iterable<?>) value;
BasicBSONList newConditions = new BasicBSONList();
Iterator<Object> iter = conditions.iterator();
Iterator<?> iter = conditions.iterator();
while (iter.hasNext()) {
newConditions.add(getMappedObject((DBObject) iter.next(), entity));
}
value = newConditions;
} else if (key.equals("$ne")) {
value = convertId(value);
} else if (value instanceof DBObject) {
newDbo.put(newKey, getMappedObject((DBObject) value, entity));
return newDbo;
}
newDbo.put(newKey, converter.convertToMongoType(value));
newDbo.put(newKey, convertSimpleOrDBObject(value, null));
}
return newDbo;
}
/**
* Retriggers mapping if the given source is a {@link DBObject} (except for {@link BasicDBList}s); simply invokes the
* converter to create a Mongo type otherwise.
*
* @param source
* @param entity
*/
private Object convertSimpleOrDBObject(Object source, MongoPersistentEntity<?> entity) {
if (source instanceof BasicDBList) {
return converter.convertToMongoType(source);
}
if (source instanceof DBObject) {
return getMappedObject((DBObject) source, entity);
}
return converter.convertToMongoType(source);
}
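Editor's note: the N_OR_PATTERN branch above now hands the entity metadata into nested $or/$nor conditions, so id criteria inside them are renamed and converted just like top-level ones. An illustrative query (the id value is a made-up 24-character hex String, Person is a hypothetical domain class):

Query query = new Query(new Criteria().orOperator(
    Criteria.where("id").is("4f8a9c7d3b2e1a0d4c5b6a79"),
    Criteria.where("lastname").is("Matthews")));

// Mapped roughly as:
//   { "$or" : [ { "_id" : ObjectId("4f8a9c7d3b2e1a0d4c5b6a79") }, { "lastname" : "Matthews" } ] }
List<Person> result = template.find(query, Person.class);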
/**
* Returns whether the given key will be considered an id key.
*
@@ -118,12 +141,16 @@ public class QueryMapper {
*/
private boolean isIdKey(String key, MongoPersistentEntity<?> entity) {
if (null != entity && entity.getIdProperty() != null) {
if (entity == null) {
return false;
}
if (entity.getIdProperty() != null) {
MongoPersistentProperty idProperty = entity.getIdProperty();
return idProperty.getName().equals(key) || idProperty.getFieldName().equals(key);
}
return Arrays.asList("id", "_id").contains(key);
return DEFAULT_ID_NAMES.contains(key);
}
/**

View File

@@ -17,8 +17,6 @@ package org.springframework.data.mongodb.core;
import java.net.UnknownHostException;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.beans.factory.DisposableBean;
import org.springframework.dao.DataAccessException;
import org.springframework.data.authentication.UserCredentials;
@@ -39,12 +37,10 @@ import com.mongodb.WriteConcern;
*/
public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
protected final Log logger = LogFactory.getLog(getClass());
private final Mongo mongo;
private final String databaseName;
private String username;
private String password;
private final boolean mongoInstanceCreated;
private final UserCredentials credentials;
private WriteConcern writeConcern;
/**
@@ -54,11 +50,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* @param databaseName database name, not be {@literal null}.
*/
public SimpleMongoDbFactory(Mongo mongo, String databaseName) {
Assert.notNull(mongo, "Mongo must not be null");
Assert.hasText(databaseName, "Database name must not be empty");
Assert.isTrue(databaseName.matches("[\\w-]+"), "Database name must only contain letters, numbers, underscores and dashes!");
this.mongo = mongo;
this.databaseName = databaseName;
this(mongo, databaseName, new UserCredentials(), false);
}
/**
@@ -66,12 +58,10 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
*
* @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName Database name, must not be {@literal null}.
* @param userCredentials username and password must not be {@literal null}.
* @param credentials username and password.
*/
public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials userCredentials) {
this(mongo, databaseName);
this.username = userCredentials.getUsername();
this.password = userCredentials.getPassword();
public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials) {
this(mongo, databaseName, credentials, false);
}
/**
@@ -83,7 +73,21 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* @see MongoURI
*/
public SimpleMongoDbFactory(MongoURI uri) throws MongoException, UnknownHostException {
this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), parseChars(uri.getPassword())));
this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), parseChars(uri.getPassword())), true);
}
private SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials,
boolean mongoInstanceCreated) {
Assert.notNull(mongo, "Mongo must not be null");
Assert.hasText(databaseName, "Database name must not be empty");
Assert.isTrue(databaseName.matches("[\\w-]+"),
"Database name must only contain letters, numbers, underscores and dashes!");
this.mongo = mongo;
this.databaseName = databaseName;
this.mongoInstanceCreated = mongoInstanceCreated;
this.credentials = credentials == null ? new UserCredentials() : credentials;
}
/**
@@ -95,10 +99,6 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
this.writeConcern = writeConcern;
}
public WriteConcern getWriteConcern() {
return writeConcern;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getDb()
@@ -115,6 +115,9 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
Assert.hasText(dbName, "Database name must not be empty.");
String username = credentials.getUsername();
String password = credentials.getPassword();
DB db = MongoDbUtils.getDB(mongo, dbName, username, password == null ? null : password.toCharArray());
if (writeConcern != null) {
@@ -125,17 +128,17 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
}
/**
* Clean up the Mongo instance.
* Clean up the Mongo instance if it was created by the factory itself.
*
* @see DisposableBean#destroy()
*/
public void destroy() throws Exception {
if (mongoInstanceCreated) {
mongo.close();
}
}
public static String parseChars(char[] chars) {
if (chars == null) {
return null;
} else {
return String.valueOf(chars);
}
private static String parseChars(char[] chars) {
return chars == null ? null : String.valueOf(chars);
}
}
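Editor's note: the new mongoInstanceCreated flag means destroy() only closes Mongo instances the factory created itself, which currently is the MongoURI-based constructor only. A hedged sketch (URI and database name are illustrative):

static void lifecycleSketch(Mongo sharedMongo) throws Exception {

  SimpleMongoDbFactory uriFactory = new SimpleMongoDbFactory(new MongoURI("mongodb://localhost/database"));
  DB db = uriFactory.getDb();
  uriFactory.destroy();                  // closes the Mongo created from the URI

  SimpleMongoDbFactory sharedFactory = new SimpleMongoDbFactory(sharedMongo, "database");
  sharedFactory.destroy();               // leaves the externally provided Mongo untouched
}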

View File

@@ -30,7 +30,9 @@ public interface WriteConcernResolver {
/**
* Resolve the WriteConcern given the MongoAction
* @param action describes the context of the Mongo action. Contains a default WriteConcern to use if one should not be resolved.
*
* @param action describes the context of the Mongo action. Contains a default WriteConcern to use if one should not
* be resolved.
* @return a WriteConcern based on the passed-in MongoAction value, may be {@literal null}
*/
WriteConcern resolve(MongoAction action);
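Editor's note: a sketch of a custom resolver, assuming a hypothetical "audit" collection that requires acknowledged writes; everything else falls back to the action's default:

public class AuditAwareWriteConcernResolver implements WriteConcernResolver {

  public WriteConcern resolve(MongoAction action) {

    if ("audit".equals(action.getCollectionName())) {
      return WriteConcern.SAFE;
    }

    return action.getDefaultWriteConcern();
  }
}

// Registered on the template via MongoTemplate#setWriteConcernResolver(…).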

View File

@@ -38,6 +38,8 @@ import org.springframework.data.mongodb.core.convert.MongoConverters.BigDecimalT
import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigDecimalConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigIntegerConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToURLConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.URLToStringConverter;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.util.Assert;
@@ -89,6 +91,8 @@ public class CustomConversions {
this.converters.add(StringToBigDecimalConverter.INSTANCE);
this.converters.add(BigIntegerToStringConverter.INSTANCE);
this.converters.add(StringToBigIntegerConverter.INSTANCE);
this.converters.add(URLToStringConverter.INSTANCE);
this.converters.add(StringToURLConverter.INSTANCE);
this.converters.addAll(converters);
for (Object c : this.converters) {

View File

@@ -0,0 +1,55 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import java.util.Map;
import org.springframework.context.expression.MapAccessor;
import org.springframework.expression.EvaluationContext;
import org.springframework.expression.PropertyAccessor;
import org.springframework.expression.TypedValue;
import com.mongodb.DBObject;
/**
* {@link PropertyAccessor} to allow entity based field access to {@link DBObject}s.
*
* @author Oliver Gierke
*/
class DBObjectPropertyAccessor extends MapAccessor {
static MapAccessor INSTANCE = new DBObjectPropertyAccessor();
@Override
public Class<?>[] getSpecificTargetClasses() {
return new Class[] { DBObject.class };
}
@Override
public boolean canRead(EvaluationContext context, Object target, String name) {
return true;
}
@Override
@SuppressWarnings("unchecked")
public TypedValue read(EvaluationContext context, Object target, String name) {
Map<String, Object> source = (Map<String, Object>) target;
Object value = source.get(name);
return value == null ? TypedValue.NULL : new TypedValue(value);
}
}
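Editor's note: the accessor is used internally by MappingMongoConverter to let SpEL expressions on constructor parameters navigate the raw document. A rough illustration of the effect (INSTANCE is package-private, so this is conceptual rather than public API):

DBObject source = new BasicDBObject("firstname", "Dave");

StandardEvaluationContext context = new StandardEvaluationContext(source);
context.addPropertyAccessor(DBObjectPropertyAccessor.INSTANCE);

Object firstname = new SpelExpressionParser().parseExpression("firstname").getValue(context);
// firstname == "Dave"; absent keys resolve to null instead of failing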

View File

@@ -66,7 +66,6 @@ public class DefaultMongoTypeMapper extends DefaultTypeMapper<DBObject> implemen
this.typeKey = typeKey;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.MongoTypeMapper#isTypeKey(java.lang.String)
@@ -75,7 +74,6 @@ public class DefaultMongoTypeMapper extends DefaultTypeMapper<DBObject> implemen
return typeKey == null ? false : typeKey.equals(key);
}
/* (non-Javadoc)
* @see org.springframework.data.convert.DefaultTypeMapper#getFallbackTypeFor(java.lang.Object)
*/

View File

@@ -0,0 +1,180 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import java.util.HashSet;
import java.util.Set;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PersistentProperty;
import org.springframework.data.mapping.PreferredConstructor;
import org.springframework.data.mapping.PreferredConstructor.Parameter;
import org.springframework.data.mapping.PropertyPath;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.PersistentPropertyPath;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
/**
* Abstraction for a {@link PreferredConstructor} alongside mapping information.
*
* @author Oliver Gierke
*/
class MappedConstructor {
private static final String REJECT_CONSTRUCTOR = String.format("Entity doesn't have a usable constructor, either "
+ "provide a custom converter or annotate a constructor with @%s!", PersistenceConstructor.class.getSimpleName());
private final Set<MappedConstructor.MappedParameter> parameters;
/**
* Creates a new {@link MappedConstructor} from the given {@link MongoPersistentEntity} and {@link MappingContext}.
*
* @param entity must not be {@literal null}.
* @param context must not be {@literal null}.
* @throws MappingException in case the {@link MongoPersistentEntity} handed in does not have a
* {@link PreferredConstructor}.
*/
public MappedConstructor(MongoPersistentEntity<?> entity,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context) throws MappingException {
Assert.notNull(entity);
Assert.notNull(context);
if (entity.getPreferredConstructor() == null) {
throw new MappingException(REJECT_CONSTRUCTOR);
}
this.parameters = new HashSet<MappedConstructor.MappedParameter>();
for (Parameter<?> parameter : entity.getPreferredConstructor().getParameters()) {
parameters.add(new MappedParameter(parameter, entity, context));
}
}
/**
* Returns whether the given {@link PersistentProperty} is referenced in a constructor argument of the
* {@link PersistentEntity} backing this {@link MappedConstructor}.
*
* @param property must not be {@literal null}.
* @return
*/
public boolean isConstructorParameter(PersistentProperty<?> property) {
Assert.notNull(property);
for (MappedConstructor.MappedParameter parameter : parameters) {
if (parameter.maps(property)) {
return true;
}
}
return false;
}
/**
* Returns the {@link MappedParameter} for the given {@link Parameter}.
*
* @param parameter must not be {@literal null}.
* @return
* @throws MappingException in case no {@link MappedParameter} can be found for the given {@link Parameter}.
*/
public MappedParameter getFor(Parameter<?> parameter) {
for (MappedParameter mappedParameter : parameters) {
if (mappedParameter.parameter.equals(parameter)) {
return mappedParameter;
}
}
throw new MappingException(String.format("Didn't find a MappedParameter for %s!", parameter.toString()));
}
/**
* Abstraction of a {@link Parameter} alongside mapping information.
*
* @author Oliver Gierke
*/
static class MappedParameter {
private final MongoPersistentProperty property;
private final Parameter<?> parameter;
/**
* Creates a new {@link MappedParameter} for the given {@link Parameter}, {@link MongoPersistentProperty} and
* {@link MappingContext}.
*
* @param parameter must not be {@literal null}.
* @param entity must not be {@literal null}.
* @param context must not be {@literal null}.
*/
public MappedParameter(Parameter<?> parameter, MongoPersistentEntity<?> entity,
MappingContext<? extends MongoPersistentEntity<?>, ? extends MongoPersistentProperty> context) {
Assert.notNull(parameter);
Assert.notNull(entity);
Assert.notNull(context);
this.parameter = parameter;
PropertyPath propertyPath = PropertyPath.from(parameter.getName(), entity.getType());
PersistentPropertyPath<? extends MongoPersistentProperty> path = context.getPersistentPropertyPath(propertyPath);
this.property = path == null ? null : path.getLeafProperty();
}
/**
* Returns whether there is a SpEL expression configured for this parameter.
*
* @return
*/
public boolean hasSpELExpression() {
return parameter.getKey() != null;
}
/**
* Returns the field name to be used to look up the value which in turn shall be converted into the constructor
* parameter.
*
* @return
*/
public String getFieldName() {
return property.getFieldName();
}
/**
* Returns the type of the property backing the {@link Parameter}.
*
* @return
*/
public TypeInformation<?> getPropertyTypeInformation() {
return property.getTypeInformation();
}
/**
* Returns whether the given {@link PersistentProperty} is mapped by the parameter.
*
* @param property
* @return
*/
public boolean maps(PersistentProperty<?> property) {
return this.property.equals(property);
}
}
}
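Editor's note: MappedConstructor resolves every constructor parameter to the persistent property of the same name, so the value can be looked up under that property's field name (or via SpEL if an expression is present). A domain type it would work against might look like this (names are illustrative):

class Person {

  private final String firstname;
  private final String lastname;

  @PersistenceConstructor              // marks the constructor the converter shall use
  public Person(String firstname, String lastname) {
    this.firstname = firstname;
    this.lastname = lastname;
  }
}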

View File

@@ -37,7 +37,7 @@ import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.data.convert.TypeMapper;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.AssociationHandler;
import org.springframework.data.mapping.PreferredConstructor;
import org.springframework.data.mapping.PreferredConstructor.Parameter;
import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.BeanWrapper;
@@ -81,6 +81,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
protected ApplicationContext applicationContext;
protected boolean useFieldAccessOnly = true;
protected MongoTypeMapper typeMapper;
protected String mapKeyDotReplacement = null;
/**
* Creates a new {@link MappingMongoConverter} given the new {@link MongoDbFactory} and {@link MappingContext}.
@@ -116,6 +117,18 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
mappingContext) : typeMapper;
}
/**
* Configure the characters that dots potentially contained in {@link Map} keys shall be replaced with. By default no
* translation is applied; a {@link Map} with keys containing dots is rejected, causing the conversion of the entire
* object to fail. If further customization of the translation is needed, have a look at
* {@link #potentiallyEscapeMapKey(String)} as well as {@link #potentiallyUnescapeMapKey(String)}.
*
* @param mapKeyDotReplacement the mapKeyDotReplacement to set
*/
public void setMapKeyDotReplacement(String mapKeyDotReplacement) {
this.mapKeyDotReplacement = mapKeyDotReplacement;
}
/*
* (non-Javadoc)
* @see org.springframework.data.convert.EntityConverter#getMappingContext()
@@ -189,52 +202,17 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
private <S extends Object> S read(final MongoPersistentEntity<S> entity, final DBObject dbo) {
final StandardEvaluationContext spelCtx = new StandardEvaluationContext();
if (null != applicationContext) {
final StandardEvaluationContext spelCtx = new StandardEvaluationContext(dbo);
spelCtx.addPropertyAccessor(DBObjectPropertyAccessor.INSTANCE);
if (applicationContext != null) {
spelCtx.setBeanResolver(new BeanFactoryResolver(applicationContext));
}
if (!(dbo instanceof BasicDBList)) {
String[] keySet = dbo.keySet().toArray(new String[dbo.keySet().size()]);
for (String key : keySet) {
spelCtx.setVariable(key, dbo.get(key));
}
}
final List<String> ctorParamNames = new ArrayList<String>();
final MongoPersistentProperty idProperty = entity.getIdProperty();
final MappedConstructor constructor = new MappedConstructor(entity, mappingContext);
ParameterValueProvider provider = new SpELAwareParameterValueProvider(spelExpressionParser, spelCtx) {
@Override
@SuppressWarnings("unchecked")
public <T> T getParameterValue(PreferredConstructor.Parameter<T> parameter) {
if (parameter.getKey() != null) {
return super.getParameterValue(parameter);
}
String name = parameter.getName();
TypeInformation<T> type = parameter.getType();
Class<T> rawType = parameter.getRawType();
String key = idProperty == null ? name : idProperty.getName().equals(name) ? idProperty.getFieldName() : name;
Object obj = dbo.get(key);
ctorParamNames.add(name);
if (obj instanceof DBRef) {
return read(type, ((DBRef) obj).fetch());
} else if (obj instanceof BasicDBList) {
BasicDBList objAsDbList = (BasicDBList) obj;
return conversionService.convert(readCollectionOrArray(type, objAsDbList), rawType);
} else if (obj instanceof DBObject) {
return read(type, ((DBObject) obj));
} else if (null != obj && obj.getClass().isAssignableFrom(rawType)) {
return (T) obj;
} else if (null != obj) {
return conversionService.convert(obj, rawType);
}
return null;
}
};
SpELAwareParameterValueProvider delegate = new SpELAwareParameterValueProvider(spelExpressionParser, spelCtx);
ParameterValueProvider provider = new DelegatingParameterValueProvider(constructor, dbo, delegate);
final BeanWrapper<MongoPersistentEntity<S>, S> wrapper = BeanWrapper.create(entity, provider, conversionService);
@@ -242,7 +220,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
public void doWithPersistentProperty(MongoPersistentProperty prop) {
boolean isConstructorProperty = ctorParamNames.contains(prop.getName());
boolean isConstructorProperty = constructor.isConstructorParameter(prop);
boolean hasValueForProperty = dbo.containsField(prop.getFieldName());
if (!hasValueForProperty || isConstructorProperty) {
@@ -461,7 +439,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*
* @param collection must not be {@literal null}.
* @param property must not be {@literal null}.
*
* @return
*/
protected DBObject createCollection(Collection<?> collection, MongoPersistentProperty property) {
@@ -499,13 +476,9 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
for (Object element : source) {
if (element == null) {
continue;
}
Class<?> elementType = element == null ? null : element.getClass();
Class<?> elementType = element.getClass();
if (conversions.isSimpleType(elementType)) {
if (elementType == null || conversions.isSimpleType(elementType)) {
sink.add(getPotentiallyConvertedSimpleWrite(element));
} else if (element instanceof Collection || elementType.isArray()) {
sink.add(writeCollectionInternal(asCollection(element), componentType, new BasicDBList()));
@@ -535,7 +508,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (conversions.isSimpleType(key.getClass())) {
// Don't use conversion service here as removal of ObjectToString converter results in some primitive types not
// being convertable
String simpleKey = key.toString();
String simpleKey = potentiallyEscapeMapKey(key.toString());
if (val == null || conversions.isSimpleType(val.getClass())) {
writeSimpleInternal(val, dbo, simpleKey);
} else if (val instanceof Collection || val.getClass().isArray()) {
@@ -543,7 +516,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
writeCollectionInternal(asCollection(val), propertyType.getMapValueType(), new BasicDBList()));
} else {
DBObject newDbo = new BasicDBObject();
TypeInformation<?> valueTypeInfo = propertyType.isMap() ? propertyType.getMapValueType() : ClassTypeInformation.OBJECT;
TypeInformation<?> valueTypeInfo = propertyType.isMap() ? propertyType.getMapValueType()
: ClassTypeInformation.OBJECT;
writeInternal(val, newDbo, valueTypeInfo);
dbo.put(simpleKey, newDbo);
}
@@ -555,6 +529,39 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return dbo;
}
/**
* Potentially replaces dots in the given map key with the configured map key replacement, or aborts the conversion
* if the key contains a dot but no replacement is configured.
*
* @see #setMapKeyDotReplacement(String)
* @param source
* @return
*/
protected String potentiallyEscapeMapKey(String source) {
if (!source.contains(".")) {
return source;
}
if (mapKeyDotReplacement == null) {
throw new MappingException(String.format("Map key %s contains dots but no replacement was configured! Make "
+ "sure map keys don't contain dots in the first place or configure an appropriate replacement!", source));
}
return source.replaceAll("\\.", mapKeyDotReplacement);
}
/**
* Translates the configured map key replacement in the given key just read back into a dot, in case a map key
* replacement has been configured.
*
* @param source
* @return
*/
protected String potentiallyUnescapeMapKey(String source) {
return mapKeyDotReplacement == null ? source : source.replaceAll(mapKeyDotReplacement, "\\.");
}
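Editor's note: together with setMapKeyDotReplacement(…) further up, the escape/unescape pair above allows maps with dotted keys to be persisted at all. A sketch using an arbitrary "~" replacement (factory and key names are illustrative):

MappingMongoConverter converter = new MappingMongoConverter(mongoDbFactory, new MongoMappingContext());
converter.setMapKeyDotReplacement("~");    // "~" is an arbitrary illustrative replacement
converter.afterPropertiesSet();

Map<String, String> settings = new HashMap<String, String>();
settings.put("com.acme.timeout", "5000");  // key contains dots

// Written as { "com~acme~timeout" : "5000" }; reading translates the key back to "com.acme.timeout".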
/**
* Adds custom type information to the given {@link DBObject} if necessary. That is if the value is not the same as
* the one given. This is usually the case if you store a subtype of the actual declared type of the property.
@@ -696,7 +703,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
(BasicDBList) sourceValue);
}
TypeInformation<?> toType = typeMapper.readType((DBObject) sourceValue);
TypeInformation<?> toType = typeMapper.readType((DBObject) sourceValue, prop.getTypeInformation());
// It's a complex object, have to read it in
if (toType != null) {
@@ -721,12 +728,15 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @return the converted {@link Collections}, will never be {@literal null}.
*/
@SuppressWarnings("unchecked")
private Collection<?> readCollectionOrArray(TypeInformation<?> targetType, BasicDBList sourceValue) {
private Object readCollectionOrArray(TypeInformation<?> targetType, BasicDBList sourceValue) {
Assert.notNull(targetType);
Class<?> collectionType = targetType.getType();
collectionType = Collection.class.isAssignableFrom(collectionType) ? collectionType : List.class;
Collection<Object> items = targetType.getType().isArray() ? new ArrayList<Object>() : CollectionFactory
.createCollection(targetType.getType(), sourceValue.size());
.createCollection(collectionType, sourceValue.size());
for (int i = 0; i < sourceValue.size(); i++) {
Object dbObjItem = sourceValue.get(i);
@@ -735,11 +745,12 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
} else if (dbObjItem instanceof DBObject) {
items.add(read(targetType.getComponentType(), (DBObject) dbObjItem));
} else {
items.add(getPotentiallyConvertedSimpleRead(dbObjItem, targetType.getComponentType().getType()));
TypeInformation<?> componentType = targetType.getComponentType();
items.add(getPotentiallyConvertedSimpleRead(dbObjItem, componentType == null ? null : componentType.getType()));
}
}
return items;
return getPotentiallyConvertedSimpleRead(items, targetType.getType());
}
/**
@@ -763,12 +774,12 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
continue;
}
Object key = entry.getKey();
Object key = potentiallyUnescapeMapKey(entry.getKey());
TypeInformation<?> keyTypeInformation = type.getComponentType();
if (keyTypeInformation != null) {
Class<?> keyType = keyTypeInformation.getType();
key = conversionService.convert(entry.getKey(), keyType);
key = conversionService.convert(key, keyType);
}
Object value = entry.getValue();
@@ -807,7 +818,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return null;
}
Class<?> target = conversions.getCustomWriteTarget(getClass());
Class<?> target = conversions.getCustomWriteTarget(obj.getClass());
if (target != null) {
return conversionService.convert(obj, target);
}
@@ -838,14 +849,14 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return result;
}
if (obj instanceof List) {
return maybeConvertList((List<?>) obj);
}
if (obj.getClass().isArray()) {
return maybeConvertList(Arrays.asList((Object[]) obj));
}
if (obj instanceof Collection) {
return maybeConvertList((Collection<?>) obj);
}
DBObject newDbo = new BasicDBObject();
this.write(obj, newDbo);
return removeTypeInfoRecursively(newDbo);
@@ -895,4 +906,55 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return dbObject;
}
private class DelegatingParameterValueProvider implements ParameterValueProvider {
private final DBObject source;
private final ParameterValueProvider delegate;
private final MappedConstructor constructor;
/**
* Creates a new {@link DelegatingParameterValueProvider} that delegates parameter lookup to the given
* {@link SpELAwareParameterValueProvider} in case a {@link MappedConstructor.MappedParameter} carries a SpEL
* expression and reads the value from the given source {@link DBObject} otherwise.
*
* @param constructor must not be {@literal null}.
* @param source must not be {@literal null}.
* @param delegate must not be {@literal null}.
*/
public DelegatingParameterValueProvider(MappedConstructor constructor, DBObject source,
SpELAwareParameterValueProvider delegate) {
Assert.notNull(constructor);
Assert.notNull(source);
Assert.notNull(delegate);
this.constructor = constructor;
this.source = source;
this.delegate = delegate;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.model.ParameterValueProvider#getParameterValue(org.springframework.data.mapping.PreferredConstructor.Parameter)
*/
@SuppressWarnings("unchecked")
public <T> T getParameterValue(Parameter<T> parameter) {
MappedConstructor.MappedParameter mappedParameter = constructor.getFor(parameter);
Object value = mappedParameter.hasSpELExpression() ? delegate.getParameterValue(parameter) : source
.get(mappedParameter.getFieldName());
TypeInformation<?> type = mappedParameter.getPropertyTypeInformation();
if (value instanceof DBRef) {
return (T) read(type, ((DBRef) value).fetch());
} else if (value instanceof BasicDBList) {
return (T) getPotentiallyConvertedSimpleRead(readCollectionOrArray(type, (BasicDBList) value), type.getType());
} else if (value instanceof DBObject) {
return (T) read(type, (DBObject) value);
} else {
return (T) getPotentiallyConvertedSimpleRead(value, type.getType());
}
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,8 +17,12 @@ package org.springframework.data.mongodb.core.convert;
import java.math.BigDecimal;
import java.math.BigInteger;
import java.net.MalformedURLException;
import java.net.URL;
import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionFailedException;
import org.springframework.core.convert.TypeDescriptor;
import org.springframework.core.convert.converter.Converter;
import org.springframework.util.StringUtils;
@@ -119,4 +123,28 @@ abstract class MongoConverters {
return StringUtils.hasText(source) ? new BigInteger(source) : null;
}
}
public static enum URLToStringConverter implements Converter<URL, String> {
INSTANCE;
public String convert(URL source) {
return source == null ? null : source.toString();
}
}
public static enum StringToURLConverter implements Converter<String, URL> {
INSTANCE;
private static final TypeDescriptor SOURCE = TypeDescriptor.valueOf(String.class);
private static final TypeDescriptor TARGET = TypeDescriptor.valueOf(URL.class);
public URL convert(String source) {
try {
return source == null ? null : new URL(source);
} catch (MalformedURLException e) {
throw new ConversionFailedException(SOURCE, TARGET, source, e);
}
}
}
}
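Editor's note: with URLToStringConverter and StringToURLConverter registered by default in CustomConversions (see the change further up), java.net.URL properties round-trip as plain Strings. A sketch with a hypothetical domain type:

class Website {
  String id;
  URL url;
}

static void roundTrip(MongoTemplate template) throws Exception {
  Website site = new Website();
  site.url = new URL("http://www.springsource.org");

  template.save(site);                                          // url stored as "http://www.springsource.org"
  Website reloaded = template.findById(site.id, Website.class); // url read back as a java.net.URL
}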

View File

@@ -1,5 +1,5 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -13,9 +13,9 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
@@ -24,14 +24,29 @@ import java.lang.annotation.Target;
/**
* Mark a class to use compound indexes.
*
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Jon Brisbin
* @author Oliver Gierke
*/
@Target({ ElementType.TYPE })
@Documented
@Retention(RetentionPolicy.RUNTIME)
public @interface CompoundIndex {
/**
* The actual index definition in JSON format. The keys of the JSON document are the fields to be indexed, the values
* define the index direction (1 for ascending, -1 for descending).
*
* @return
*/
String def();
/**
* It does not make sense to use this attribute, as the direction has to be defined in the {@link #def()}
* attribute anyway.
*
* @return
*/
@Deprecated
IndexDirection direction() default IndexDirection.ASCENDING;
boolean unique() default false;
@@ -40,8 +55,18 @@ public @interface CompoundIndex {
boolean dropDups() default false;
/**
* The name of the index to be created.
*
* @return
*/
String name() default "";
/**
* The collection the index will be created in. Will default to the collection the annotated domain class will be
* stored in.
*
* @return
*/
String collection() default "";
}
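Editor's note: a usage sketch of the annotation with the attributes documented above (class, index name and fields are made up); the direction lives in the def() JSON, which is why direction() is deprecated:

@Document
@CompoundIndexes({
    @CompoundIndex(name = "age_lastname_idx", def = "{ 'age' : 1, 'lastname' : -1 }")
})
class Person {
  String lastname;
  int age;
}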

View File

@@ -0,0 +1,132 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* Value object for an index field.
*
* @author Oliver Gierke
*/
public final class IndexField {
private final String key;
private final Order order;
private final boolean isGeo;
private IndexField(String key, Order order, boolean isGeo) {
Assert.hasText(key);
Assert.isTrue(order != null ^ isGeo);
this.key = key;
this.order = order;
this.isGeo = isGeo;
}
/**
* Creates a default {@link IndexField} with the given key and {@link Order}.
*
* @param key must not be {@literal null} or empty.
* @param order must not be {@literal null}.
* @return
*/
public static IndexField create(String key, Order order) {
Assert.notNull(order);
return new IndexField(key, order, false);
}
/**
* Creates a geo {@link IndexField} for the given key.
*
* @param key must not be {@literal null} or empty.
* @return
*/
public static IndexField geo(String key) {
return new IndexField(key, null, true);
}
/**
* @return the key
*/
public String getKey() {
return key;
}
/**
* Returns the order of the {@link IndexField} or {@literal null} in case we have a geo index field.
*
* @return the order
*/
public Order getOrder() {
return order;
}
/**
* Returns whether the {@link IndexField} is a geo index field.
*
* @return the isGeo
*/
public boolean isGeo() {
return isGeo;
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (!(obj instanceof IndexField)) {
return false;
}
IndexField that = (IndexField) obj;
return this.key.equals(that.key) && ObjectUtils.nullSafeEquals(this.order, that.order) && this.isGeo == that.isGeo;
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += 31 * ObjectUtils.nullSafeHashCode(key);
result += 31 * ObjectUtils.nullSafeHashCode(order);
result += 31 * ObjectUtils.nullSafeHashCode(isGeo);
return result;
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return String.format("IndexField [ key: %s, order: %s, isGeo: %s]", key, order, isGeo);
}
}
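Editor's note: a quick illustration of the two factory methods; the constructor's XOR assertion guarantees a field is either ordered or geo, never both (key names are illustrative):

IndexField age = IndexField.create("age", Order.DESCENDING);   // age.isGeo() == false
IndexField location = IndexField.geo("location");              // location.getOrder() == null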

View File

@@ -15,36 +15,52 @@
*/
package org.springframework.data.mongodb.core.index;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.util.ObjectUtils;
public class IndexInfo {
private final Map<String, Order> fieldSpec;
private final List<IndexField> indexFields;
private String name;
private final String name;
private final boolean unique;
private final boolean dropDuplicates;
private final boolean sparse;
private boolean unique = false;
public IndexInfo(Map<String, Order> fieldSpec, List<IndexField> indexFields, String name, boolean unique,
boolean dropDuplicates, boolean sparse) {
private boolean dropDuplicates = false;
private boolean sparse = false;
public IndexInfo(Map<String, Order> fieldSpec, String name, boolean unique, boolean dropDuplicates, boolean sparse) {
super();
this.fieldSpec = fieldSpec;
this.indexFields = Collections.unmodifiableList(indexFields);
this.name = name;
this.unique = unique;
this.dropDuplicates = dropDuplicates;
this.sparse = sparse;
}
/**
* @deprecated use {@link #getIndexFields()} instead as this {@link Map} does not contain geo indexes.
* @return
*/
@Deprecated
public Map<String, Order> getFieldSpec() {
return fieldSpec;
}
/**
* Returns the individual index fields of the index.
*
* @return
*/
public List<IndexField> getIndexFields() {
return this.indexFields;
}
public String getName() {
return name;
}
@@ -63,7 +79,7 @@ public class IndexInfo {
@Override
public String toString() {
return "IndexInfo [fieldSpec=" + fieldSpec + ", name=" + name + ", unique=" + unique + ", dropDuplicates="
return "IndexInfo [indexFields=" + indexFields + ", name=" + name + ", unique=" + unique + ", dropDuplicates="
+ dropDuplicates + ", sparse=" + sparse + "]";
}
@@ -72,7 +88,7 @@ public class IndexInfo {
final int prime = 31;
int result = 1;
result = prime * result + (dropDuplicates ? 1231 : 1237);
result = prime * result + ((fieldSpec == null) ? 0 : fieldSpec.hashCode());
result = prime * result + ObjectUtils.nullSafeHashCode(indexFields);
result = prime * result + ((name == null) ? 0 : name.hashCode());
result = prime * result + (sparse ? 1231 : 1237);
result = prime * result + (unique ? 1231 : 1237);
@@ -81,35 +97,39 @@ public class IndexInfo {
@Override
public boolean equals(Object obj) {
if (this == obj)
if (this == obj) {
return true;
if (obj == null)
}
if (obj == null) {
return false;
if (getClass() != obj.getClass())
}
if (getClass() != obj.getClass()) {
return false;
}
IndexInfo other = (IndexInfo) obj;
if (dropDuplicates != other.dropDuplicates)
if (dropDuplicates != other.dropDuplicates) {
return false;
if (fieldSpec == null) {
if (other.fieldSpec != null)
}
if (indexFields == null) {
if (other.indexFields != null) {
return false;
} else if (!fieldSpec.equals(other.fieldSpec))
}
} else if (!indexFields.equals(other.indexFields)) {
return false;
}
if (name == null) {
if (other.name != null)
if (other.name != null) {
return false;
} else if (!name.equals(other.name))
}
} else if (!name.equals(other.name)) {
return false;
if (sparse != other.sparse)
}
if (sparse != other.sparse) {
return false;
if (unique != other.unique)
}
if (unique != other.unique) {
return false;
}
return true;
}
/**
* [{ "v" : 1 , "key" : { "_id" : 1} , "ns" : "database.person" , "name" : "_id_"},
{ "v" : 1 , "key" : { "age" : -1} , "ns" : "database.person" , "name" : "age_-1" , "unique" : true , "dropDups" : true}]
*/
}

View File

@@ -23,8 +23,8 @@ import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
/**
* An implementation of ApplicationEventPublisher that will only fire MappingContextEvents for use by the index creator when
* MongoTemplate is used 'stand-alone', that is not declared inside a Spring ApplicationContext.
* An implementation of ApplicationEventPublisher that will only fire MappingContextEvents for use by the index creator
* when MongoTemplate is used 'stand-alone', that is not declared inside a Spring ApplicationContext.
*
* Declare MongoTemplate inside an ApplicationContext to enable the publishing of all persistence events such as
* {@link AfterLoadEvent}, {@link AfterSaveEvent}, etc.

View File

@@ -1,5 +1,5 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -13,7 +13,6 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import java.lang.reflect.Field;
@@ -23,10 +22,10 @@ import java.util.concurrent.ConcurrentHashMap;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.context.ApplicationListener;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mapping.event.MappingContextEvent;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.mapping.BasicMongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
@@ -38,10 +37,10 @@ import com.mongodb.DBObject;
import com.mongodb.util.JSON;
/**
* Component that inspects {@link BasicMongoPersistentEntity} instances contained in the given
* {@link MongoMappingContext} for indexing metadata and ensures the indexes to be available.
* Component that inspects {@link MongoPersistentEntity} instances contained in the given {@link MongoMappingContext}
* for indexing metadata and ensures the indexes are available.
*
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Jon Brisbin
* @author Oliver Gierke
*/
public class MongoPersistentEntityIndexCreator implements
@@ -76,8 +75,14 @@ public class MongoPersistentEntityIndexCreator implements
*/
public void onApplicationEvent(
MappingContextEvent<MongoPersistentEntity<MongoPersistentProperty>, MongoPersistentProperty> event) {
PersistentEntity<?, ?> entity = event.getPersistentEntity();
// Double check type as Spring infrastructure does not consider nested generics
if (entity instanceof MongoPersistentEntity) {
checkForIndexes(event.getPersistentEntity());
}
}
protected void checkForIndexes(final MongoPersistentEntity<?> entity) {
final Class<?> type = entity.getType();
@@ -90,12 +95,12 @@ public class MongoPersistentEntityIndexCreator implements
if (type.isAnnotationPresent(CompoundIndexes.class)) {
CompoundIndexes indexes = type.getAnnotation(CompoundIndexes.class);
for (CompoundIndex index : indexes.value()) {
String indexColl = index.collection();
if ("".equals(indexColl)) {
indexColl = entity.getCollection();
}
ensureIndex(indexColl, index.name(), index.def(), index.direction(), index.unique(), index.dropDups(),
index.sparse());
String indexColl = StringUtils.hasText(index.collection()) ? index.collection() : entity.getCollection();
DBObject definition = (DBObject) JSON.parse(index.def());
ensureIndex(indexColl, index.name(), definition, index.unique(), index.dropDups(), index.sparse());
if (log.isDebugEnabled()) {
log.debug("Created compound index " + index);
}
@@ -104,10 +109,14 @@ public class MongoPersistentEntityIndexCreator implements
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
public void doWithPersistentProperty(MongoPersistentProperty persistentProperty) {
Field field = persistentProperty.getField();
if (field.isAnnotationPresent(Indexed.class)) {
Indexed index = field.getAnnotation(Indexed.class);
String name = index.name();
if (!StringUtils.hasText(name)) {
name = persistentProperty.getFieldName();
} else {
@@ -119,11 +128,17 @@ public class MongoPersistentEntityIndexCreator implements
}
}
}
String collection = StringUtils.hasText(index.collection()) ? index.collection() : entity.getCollection();
ensureIndex(collection, name, null, index.direction(), index.unique(), index.dropDups(), index.sparse());
int direction = index.direction() == IndexDirection.ASCENDING ? 1 : -1;
DBObject definition = new BasicDBObject(persistentProperty.getFieldName(), direction);
ensureIndex(collection, name, definition, index.unique(), index.dropDups(), index.sparse());
if (log.isDebugEnabled()) {
log.debug("Created property index " + index);
}
} else if (field.isAnnotationPresent(GeoSpatialIndexed.class)) {
GeoSpatialIndexed index = field.getAnnotation(GeoSpatialIndexed.class);
@@ -148,21 +163,15 @@ public class MongoPersistentEntityIndexCreator implements
}
}
protected void ensureIndex(String collection, final String name, final String def, final IndexDirection direction,
final boolean unique, final boolean dropDups, final boolean sparse) {
DBObject defObj;
if (null != def) {
defObj = (DBObject) JSON.parse(def);
} else {
defObj = new BasicDBObject();
defObj.put(name, (direction == IndexDirection.ASCENDING ? 1 : -1));
}
protected void ensureIndex(String collection, String name, DBObject indexDefinition, boolean unique,
boolean dropDups, boolean sparse) {
DBObject opts = new BasicDBObject();
opts.put("name", name);
opts.put("dropDups", dropDups);
opts.put("sparse", sparse);
opts.put("unique", unique);
mongoDbFactory.getDb().getCollection(collection).ensureIndex(defObj, opts);
}
mongoDbFactory.getDb().getCollection(collection).ensureIndex(indexDefinition, opts);
}
}
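
A minimal sketch of the indexing metadata the creator inspects, assuming a hypothetical Person entity with made-up field and collection names: the compound definition is parsed from the def JSON, while a property index is now built as a BasicDBObject keyed by the field name with direction 1 or -1.

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.index.CompoundIndex;
import org.springframework.data.mongodb.core.index.CompoundIndexes;
import org.springframework.data.mongodb.core.index.IndexDirection;
import org.springframework.data.mongodb.core.index.Indexed;
import org.springframework.data.mongodb.core.mapping.Document;

// Hypothetical entity; names are illustrative only.
@Document(collection = "person")
@CompoundIndexes({
		// def is parsed via JSON.parse(...) into the index definition
		@CompoundIndex(name = "age_idx", def = "{ 'age' : -1, 'lastName' : 1 }", unique = true) })
public class Person {

	@Id
	private String id;

	// Yields the definition { 'age' : -1 }; the collection defaults to the entity's collection
	@Indexed(direction = IndexDirection.DESCENDING)
	private int age;

	// No name configured, so the property's field name is used as the index name
	@Indexed
	private String lastName;
}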

View File

@@ -30,7 +30,8 @@ import org.springframework.data.util.TypeInformation;
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Oliver Gierke ogierke@vmware.com
*/
public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersistentEntity<?>, MongoPersistentProperty> implements ApplicationContextAware {
public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersistentEntity<?>, MongoPersistentProperty>
implements ApplicationContextAware {
private ApplicationContext context;

View File

@@ -19,8 +19,10 @@ import java.math.BigInteger;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import java.util.UUID;
import java.util.regex.Pattern;
import org.bson.types.Binary;
import org.bson.types.CodeWScope;
import org.bson.types.ObjectId;
import org.springframework.data.mapping.model.SimpleTypeHolder;
@@ -50,6 +52,8 @@ public abstract class MongoSimpleTypes {
simpleTypes.add(CodeWScope.class);
simpleTypes.add(DBObject.class);
simpleTypes.add(Pattern.class);
simpleTypes.add(Binary.class);
simpleTypes.add(UUID.class);
MONGO_SIMPLE_TYPES = Collections.unmodifiableSet(simpleTypes);
}
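
Adding Binary and UUID to the simple type set means the mapping layer treats such fields as atomic values instead of introspecting them as nested entities. A minimal, hypothetical sketch:

import java.util.UUID;

import org.bson.types.Binary;

// Hypothetical document class; both field types are now covered by MONGO_SIMPLE_TYPES
// and are therefore persisted as plain values rather than nested documents.
public class Attachment {

	UUID externalId;
	Binary content;
}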

View File

@@ -19,24 +19,19 @@ import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Collects the parameters required to perform a group operation on a collection. The query condition and the input collection are specified on the group method as method arguments
* to be consistent with other operations, e.g. map-reduce.
* Collects the parameters required to perform a group operation on a collection. The query condition and the input
* collection are specified on the group method as method arguments to be consistent with other operations, e.g.
* map-reduce.
*
* @author Mark Pollack
*
*/
public class GroupBy {
private DBObject dboKeys;
private String keyFunction;
private String initial;
private DBObject initialDbObject;
private String reduce;
private String finalize;
public GroupBy(String... keys) {
@@ -87,8 +82,6 @@ public class GroupBy {
return this;
}
public DBObject getGroupByObject() {
// return new GroupCommand(dbCollection, dboKeys, condition, initial, reduce, finalize);
BasicDBObject dbo = new BasicDBObject();
@@ -111,8 +104,4 @@ public class GroupBy {
return dbo;
}
}
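
As the reworked Javadoc notes, the query condition and the input collection are arguments of the group method itself, leaving GroupBy with keys, the initial document and the reduce/finalize functions. A hedged usage sketch, assuming the group(…) overload on MongoOperations and a made-up AgeCount result type:

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
import org.springframework.data.mongodb.core.mapreduce.GroupByResults;
import org.springframework.data.mongodb.core.query.Criteria;

public class GroupBySample {

	// Hypothetical value object the raw results are mapped onto
	static class AgeCount {
		int age;
		double count;
	}

	static GroupByResults<AgeCount> countByAge(MongoOperations operations) {
		// Condition and input collection are passed to group(...), mirroring the map-reduce API
		return operations.group(Criteria.where("age").gt(0), "person",
				GroupBy.key("age").initialDocument("{ count: 0 }")
						.reduceFunction("function(doc, prev) { prev.count += 1 }"), AgeCount.class);
	}
}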

View File

@@ -26,19 +26,15 @@ import com.mongodb.DBObject;
* Collects the results of executing a group operation.
*
* @author Mark Pollack
*
* @param <T> The class the results are mapped onto, accessible via an iterator.
*/
public class GroupByResults<T> implements Iterable<T> {
private final List<T> mappedResults;
private DBObject rawResults;
private final DBObject rawResults;
private double count;
private int keys;
private String serverUsed;
public GroupByResults(List<T> mappedResults, DBObject rawResults) {

View File

@@ -15,13 +15,14 @@
*/
package org.springframework.data.mongodb.core.mapreduce;
/**
* @author Mark Pollack
*/
public class MapReduceCounts {
private int inputCount;
private int emitCount;
private int outputCount;
private final int inputCount;
private final int emitCount;
private final int outputCount;
public MapReduceCounts(int inputCount, int emitCount, int outputCount) {
super();
@@ -42,12 +43,20 @@ public class MapReduceCounts {
return outputCount;
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return "MapReduceCounts [inputCount=" + inputCount + ", emitCount=" + emitCount + ", outputCount=" + outputCount
+ "]";
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
final int prime = 31;
@@ -58,24 +67,31 @@ public class MapReduceCounts {
return result;
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj)
if (this == obj) {
return true;
if (obj == null)
}
if (obj == null) {
return false;
if (getClass() != obj.getClass())
}
if (getClass() != obj.getClass()) {
return false;
}
MapReduceCounts other = (MapReduceCounts) obj;
if (emitCount != other.emitCount)
if (emitCount != other.emitCount) {
return false;
if (inputCount != other.inputCount)
}
if (inputCount != other.inputCount) {
return false;
if (outputCount != other.outputCount)
}
if (outputCount != other.outputCount) {
return false;
}
return true;
}
}

View File

@@ -42,7 +42,6 @@ public class MapReduceOptions {
private Map<String, Object> extraOptions = new HashMap<String, Object>();
/**
* Static factory method to create a MapReduceOptions instance
*
@@ -191,10 +190,10 @@ public class MapReduceOptions {
}
/**
* Add additional extra options that may not have a method on this class. This method will help if you use a
* version of this client library with a server version that has added additional map-reduce options that do not
* yet have an method for use in setting them.
* options
* Add additional extra options that may not have a method on this class. This method will help if you use a version
* of this client library with a server version that has added additional map-reduce options that do not yet have a
* method for use in setting them.
*
* @param key The key option
* @param value The value of the option
* @return MapReduceOptions so that methods can be chained in a fluent API style
@@ -236,7 +235,6 @@ public class MapReduceOptions {
return this.scopeVariables;
}
public DBObject getOptionsObject() {
BasicDBObject cmd = new BasicDBObject();

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,34 +24,41 @@ import com.mongodb.DBObject;
/**
* Collects the results of performing a MapReduce operations.
* @author Mark Pollack
*
* @param <T> The class in which the results are mapped onto, accessible via an interator.
* @author Mark Pollack
* @author Oliver Gierke
* @param <T> The class in which the results are mapped onto, accessible via an iterator.
*/
public class MapReduceResults<T> implements Iterable<T> {
private final List<T> mappedResults;
private final DBObject rawResults;
private final String outputCollection;
private final MapReduceTiming mapReduceTiming;
private final MapReduceCounts mapReduceCounts;
private DBObject rawResults;
private MapReduceTiming mapReduceTiming;
private MapReduceCounts mapReduceCounts;
private String outputCollection;
/**
* Creates a new {@link MapReduceResults} from the given mapped results and the raw one.
*
* @param mappedResults must not be {@literal null}.
* @param rawResults must not be {@literal null}.
*/
public MapReduceResults(List<T> mappedResults, DBObject rawResults) {
Assert.notNull(mappedResults);
Assert.notNull(rawResults);
this.mappedResults = mappedResults;
this.rawResults = rawResults;
parseTiming(rawResults);
parseCounts(rawResults);
if (rawResults.get("result") != null) {
this.outputCollection = (String) rawResults.get("result");
}
this.mapReduceTiming = parseTiming(rawResults);
this.mapReduceCounts = parseCounts(rawResults);
this.outputCollection = parseOutputCollection(rawResults);
}
/*
* (non-Javadoc)
* @see java.lang.Iterable#iterator()
*/
public Iterator<T> iterator() {
return mappedResults.iterator();
}
@@ -72,29 +79,71 @@ public class MapReduceResults<T> implements Iterable<T> {
return rawResults;
}
protected void parseTiming(DBObject rawResults) {
private MapReduceTiming parseTiming(DBObject rawResults) {
DBObject timing = (DBObject) rawResults.get("timing");
if (timing != null) {
if (timing == null) {
return new MapReduceTiming(-1, -1, -1);
}
if (timing.get("mapTime") != null && timing.get("emitLoop") != null && timing.get("total") != null) {
mapReduceTiming = new MapReduceTiming( (Long)timing.get("mapTime"),
(Integer)timing.get("emitLoop"),
(Integer)timing.get("total"));
}
} else {
mapReduceTiming = new MapReduceTiming(-1,-1,-1);
}
return new MapReduceTiming(getAsLong(timing, "mapTime"), getAsLong(timing, "emitLoop"),
getAsLong(timing, "total"));
}
return new MapReduceTiming(-1, -1, -1);
}
/**
* Returns the value of the source's field with the given key as {@link Long}.
*
* @param source
* @param key
* @return
*/
private Long getAsLong(DBObject source, String key) {
Object raw = source.get(key);
return raw instanceof Long ? (Long) raw : (Integer) raw;
}
/**
* Parses the raw {@link DBObject} result into a {@link MapReduceCounts} value object.
*
* @param rawResults
* @return
*/
private MapReduceCounts parseCounts(DBObject rawResults) {
protected void parseCounts(DBObject rawResults) {
DBObject counts = (DBObject) rawResults.get("counts");
if (counts != null) {
if (counts.get("input") != null && counts.get("emit") != null && counts.get("output") != null) {
mapReduceCounts = new MapReduceCounts( (Integer)counts.get("input"), (Integer)counts.get("emit"), (Integer)counts.get("output"));
}
} else {
mapReduceCounts = new MapReduceCounts(-1,-1,-1);
}
if (counts == null) {
return new MapReduceCounts(-1, -1, -1);
}
if (counts.get("input") != null && counts.get("emit") != null && counts.get("output") != null) {
return new MapReduceCounts((Integer) counts.get("input"), (Integer) counts.get("emit"),
(Integer) counts.get("output"));
}
return new MapReduceCounts(-1, -1, -1);
}
/**
* Parses the output collection from the raw {@link DBObject} result.
*
* @param rawResults
* @return
*/
private String parseOutputCollection(DBObject rawResults) {
Object resultField = rawResults.get("result");
if (resultField == null) {
return null;
}
return resultField instanceof DBObject ? ((DBObject) resultField).get("collection").toString() : resultField
.toString();
}
}
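
A hedged sketch of the two shapes the raw 'result' field can take, assuming the class exposes the parsed name via its getOutputCollection() accessor: a plain collection name, or a document when the output was written to another database. Timing values may arrive as Integer or Long and are normalized by getAsLong(…).

import java.util.Collections;
import java.util.List;

import org.springframework.data.mongodb.core.mapreduce.MapReduceResults;

import com.mongodb.DBObject;
import com.mongodb.util.JSON;

public class MapReduceResultsSample {

	public static void main(String[] args) {
		List<Object> mapped = Collections.emptyList();

		// "result" as a plain collection name; the timing values 10/5/20 come back as Integers here
		DBObject plain = (DBObject) JSON.parse(
				"{ 'result' : 'people_out', 'timing' : { 'mapTime' : 10, 'emitLoop' : 5, 'total' : 20 } }");

		// "result" as a document, e.g. when the output went to another database
		DBObject nested = (DBObject) JSON.parse("{ 'result' : { 'db' : 'other', 'collection' : 'people_out' } }");

		System.out.println(new MapReduceResults<Object>(mapped, plain).getOutputCollection());
		System.out.println(new MapReduceResults<Object>(mapped, nested).getOutputCollection());
	}
}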

View File

@@ -74,7 +74,4 @@ public class MapReduceTiming {
return true;
}
}

View File

@@ -15,19 +15,23 @@
*/
package org.springframework.data.mongodb.core.query;
import static org.springframework.util.ObjectUtils.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.regex.Pattern;
import org.bson.BSON;
import org.bson.types.BasicBSONList;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
import org.springframework.data.mongodb.core.geo.Circle;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.geo.Shape;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import org.springframework.util.ObjectUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
@@ -97,13 +101,17 @@ public class Criteria implements CriteriaDefinition {
throw new InvalidMongoDbApiUsageException(
"Multiple 'is' values declared. You need to use 'and' with multiple criteria");
}
if (this.criteria.size() > 0 && "$not".equals(this.criteria.keySet().toArray()[this.criteria.size() - 1])) {
if (lastOperatorWasNot()) {
throw new InvalidMongoDbApiUsageException("Invalid query: 'not' can't be used with 'is' - use 'ne' instead.");
}
this.isValue = o;
return this;
}
private boolean lastOperatorWasNot() {
return this.criteria.size() > 0 && "$not".equals(this.criteria.keySet().toArray()[this.criteria.size() - 1]);
}
/**
* Creates a criterion using the $ne operator
*
@@ -269,7 +277,11 @@ public class Criteria implements CriteriaDefinition {
* @return
*/
public Criteria not() {
criteria.put("$not", null);
return not(null);
}
private Criteria not(Object value) {
criteria.put("$not", value);
return this;
}
@@ -280,8 +292,7 @@ public class Criteria implements CriteriaDefinition {
* @return
*/
public Criteria regex(String re) {
criteria.put("$regex", re);
return this;
return regex(re, null);
}
/**
@@ -292,13 +303,32 @@ public class Criteria implements CriteriaDefinition {
* @return
*/
public Criteria regex(String re, String options) {
criteria.put("$regex", re);
if (StringUtils.hasText(options)) {
criteria.put("$options", options);
return regex(toPattern(re, options));
}
/**
* Syntactical sugar for {@link #is(Object)} making it obvious that we create a regex predicate.
*
* @param pattern
* @return
*/
public Criteria regex(Pattern pattern) {
Assert.notNull(pattern);
if (lastOperatorWasNot()) {
return not(pattern);
}
this.isValue = pattern;
return this;
}
private Pattern toPattern(String regex, String options) {
Assert.notNull(regex);
return Pattern.compile(regex, options == null ? 0 : BSON.regexFlags(options));
}
/**
* Creates a geospatial criterion using a $within $center operation. This is only available for Mongo 1.7 and higher.
*
@@ -397,16 +427,13 @@ public class Criteria implements CriteriaDefinition {
return this;
}
public String getKey() {
return this.key;
}
/*
* (non-Javadoc)
*
* @see org.springframework.datastore.document.mongodb.query.Criteria#
* getCriteriaObject(java.lang.String)
* @see org.springframework.data.mongodb.core.query.CriteriaDefinition#getCriteriaObject()
*/
public DBObject getCriteriaObject() {
if (this.criteriaChain.size() == 1) {
@@ -427,16 +454,17 @@ public class Criteria implements CriteriaDefinition {
DBObject dbo = new BasicDBObject();
boolean not = false;
for (String k : this.criteria.keySet()) {
Object value = this.criteria.get(k);
if (not) {
DBObject notDbo = new BasicDBObject();
notDbo.put(k, this.criteria.get(k));
notDbo.put(k, value);
dbo.put("$not", notDbo);
not = false;
} else {
if ("$not".equals(k)) {
if ("$not".equals(k) && value == null) {
not = true;
} else {
dbo.put(k, this.criteria.get(k));
dbo.put(k, value);
}
}
}
@@ -462,12 +490,89 @@ public class Criteria implements CriteriaDefinition {
Object existing = dbo.get(key);
if (existing == null) {
dbo.put(key, value);
}
else {
throw new InvalidMongoDbApiUsageException("Due to limitations of the com.mongodb.BasicDBObject, " +
"you can't add a second '" + key + "' expression specified as '" + key + " : " + value + "'. " +
"Criteria already contains '" + key + " : " + existing + "'.");
} else {
throw new InvalidMongoDbApiUsageException("Due to limitations of the com.mongodb.BasicDBObject, "
+ "you can't add a second '" + key + "' expression specified as '" + key + " : " + value + "'. "
+ "Criteria already contains '" + key + " : " + existing + "'.");
}
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || !getClass().equals(obj.getClass())) {
return false;
}
Criteria that = (Criteria) obj;
if (this.criteriaChain.size() != that.criteriaChain.size()) {
return false;
}
for (int i = 0; i < this.criteriaChain.size(); i++) {
Criteria left = this.criteriaChain.get(i);
Criteria right = that.criteriaChain.get(i);
if (!simpleCriteriaEquals(left, right)) {
return false;
}
}
return true;
}
private boolean simpleCriteriaEquals(Criteria left, Criteria right) {
boolean keyEqual = left.key == null ? right.key == null : left.key.equals(right.key);
boolean criteriaEqual = left.criteria.equals(right.criteria);
boolean valueEqual = isEqual(left.isValue, right.isValue);
return keyEqual && criteriaEqual && valueEqual;
}
/**
* Checks the given objects for equality. Handles {@link Pattern} and arrays correctly.
*
* @param left
* @param right
* @return
*/
private boolean isEqual(Object left, Object right) {
if (left == null) {
return right == null;
}
if (left instanceof Pattern) {
return right instanceof Pattern ? ((Pattern) left).pattern().equals(((Pattern) right).pattern()) : false;
}
return ObjectUtils.nullSafeEquals(left, right);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += nullSafeHashCode(key);
result += criteria.hashCode();
result += nullSafeHashCode(isValue);
return result;
}
}
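
A short usage sketch of the reworked regex handling: regex(String, String) now compiles a Pattern (options translated via BSON.regexFlags), and calling regex(…) after not() attaches the pattern to $not rather than $regex. Field name and values below are illustrative only.

import static org.springframework.data.mongodb.core.query.Criteria.where;

import java.util.regex.Pattern;

import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

public class CriteriaRegexSample {

	public static void main(String[] args) {
		// Compiled into a case-insensitive Pattern and handled like an is(...) value
		Criteria caseInsensitive = where("lastname").regex("^gier", "i");

		// The Pattern ends up under $not instead of $regex
		Criteria negated = where("lastname").not().regex(Pattern.compile("^gier"));

		System.out.println(new Query(caseInsensitive).getQueryObject());
		System.out.println(new Query(negated).getQueryObject());
	}
}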

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,6 +18,12 @@ package org.springframework.data.mongodb.core.query;
import java.util.ArrayList;
import java.util.List;
/**
* @deprecated use {@link Criteria#orOperator(Criteria...)} instead.
* @author Thomas Risberg
* @author Oliver Gierke
*/
@Deprecated
public class OrQuery extends Query {
public OrQuery(Query... q) {
@@ -31,5 +37,4 @@ public class OrQuery extends Query {
}
return new Criteria(criteriaList, "$or");
}
}

View File

@@ -15,6 +15,9 @@
*/
package org.springframework.data.mongodb.core.query;
import static org.springframework.data.mongodb.core.query.SerializationUtils.*;
import static org.springframework.util.ObjectUtils.*;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
@@ -56,11 +59,10 @@ public class Query {
String key = criteria.getKey();
if (existing == null) {
this.criteria.put(key, criteria);
}
else {
throw new InvalidMongoDbApiUsageException("Due to limitations of the com.mongodb.BasicDBObject, " +
"you can't add a second '" + key + "' criteria. " +
"Query already contains '" + existing.getCriteriaObject() + "'.");
} else {
throw new InvalidMongoDbApiUsageException("Due to limitations of the com.mongodb.BasicDBObject, "
+ "you can't add a second '" + key + "' criteria. " + "Query already contains '"
+ existing.getCriteriaObject() + "'.");
}
return this;
}
@@ -141,4 +143,60 @@ public class Query {
protected List<Criteria> getCriteria() {
return new ArrayList<Criteria>(this.criteria.values());
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return String.format("Query: %s, Fields: %s, Sort: %s", serializeToJsonSafely(getQueryObject()),
serializeToJsonSafely(getFieldsObject()), serializeToJsonSafely(getSortObject()));
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || !getClass().equals(obj.getClass())) {
return false;
}
Query that = (Query) obj;
boolean criteriaEqual = this.criteria.equals(that.criteria);
boolean fieldsEqual = this.fieldSpec == null ? that.fieldSpec == null : this.fieldSpec.equals(that.fieldSpec);
boolean sortEqual = this.sort == null ? that.sort == null : this.sort.equals(that.sort);
boolean hintEqual = this.hint == null ? that.hint == null : this.hint.equals(that.hint);
boolean skipEqual = this.skip == that.skip;
boolean limitEqual = this.limit == that.limit;
return criteriaEqual && fieldsEqual && sortEqual && hintEqual && skipEqual && limitEqual;
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += 31 * criteria.hashCode();
result += 31 * nullSafeHashCode(fieldSpec);
result += 31 * nullSafeHashCode(sort);
result += 31 * nullSafeHashCode(hint);
result += 31 * skip;
result += 31 * limit;
return result;
}
}
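
With equals(…), hashCode() and toString() in place, structurally identical queries compare equal and can be logged safely. A minimal sketch; the field name and values are illustrative.

import static org.springframework.data.mongodb.core.query.Criteria.where;

import org.springframework.data.mongodb.core.query.Query;

public class QueryEqualitySample {

	public static void main(String[] args) {
		Query first = new Query(where("age").gte(18)).limit(10);
		Query second = new Query(where("age").gte(18)).limit(10);

		// Compares criteria, field spec, sort, hint, skip and limit
		System.out.println(first.equals(second)); // expected: true

		// Renders query, fields and sort via serializeToJsonSafely(...)
		System.out.println(first);
	}
}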

View File

@@ -0,0 +1,110 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.query;
import java.util.Collection;
import java.util.Iterator;
import java.util.Map;
import java.util.Map.Entry;
import org.springframework.core.convert.converter.Converter;
import com.mongodb.DBObject;
import com.mongodb.util.JSON;
/**
* Utility methods for JSON serialization.
*
* @author Oliver Gierke
*/
public abstract class SerializationUtils {
private SerializationUtils() {
}
/**
* Serializes the given object into pseudo-JSON, i.e. it tries to create a JSON representation as far as possible but
* falls back to the given object's {@link Object#toString()} method if a value is not serializable. Useful for
* printing raw {@link DBObject}s containing complex values before actually converting them into Mongo native types.
*
* @param value
* @return
*/
public static String serializeToJsonSafely(Object value) {
if (value == null) {
return null;
}
try {
return JSON.serialize(value);
} catch (Exception e) {
if (value instanceof Collection) {
return toString((Collection<?>) value);
} else if (value instanceof Map) {
return toString((Map<?, ?>) value);
} else if (value instanceof DBObject) {
return toString(((DBObject) value).toMap());
} else {
return String.format("{ $java : %s }", value.toString());
}
}
}
private static String toString(Map<?, ?> source) {
return iterableToDelimitedString(source.entrySet(), "{ ", " }", new Converter<Entry<?, ?>, Object>() {
public Object convert(Entry<?, ?> source) {
return String.format("\"%s\" : %s", source.getKey(), serializeToJsonSafely(source.getValue()));
}
});
}
private static String toString(Collection<?> source) {
return iterableToDelimitedString(source, "[ ", " ]", new Converter<Object, Object>() {
public Object convert(Object source) {
return serializeToJsonSafely(source);
}
});
}
/**
* Creates a string representation from the given {@link Iterable} prepending the prefix, applying the given
* {@link Converter} to each element before adding it to the result {@link String}, separating the elements with
* {@literal ,} and appending the postfix.
*
* @param source
* @param prefix
* @param postfix
* @param transformer
* @return
*/
private static <T> String iterableToDelimitedString(Iterable<T> source, String prefix, String postfix,
Converter<? super T, Object> transformer) {
StringBuilder builder = new StringBuilder(prefix);
Iterator<T> iterator = source.iterator();
while (iterator.hasNext()) {
builder.append(transformer.convert(iterator.next()));
if (iterator.hasNext()) {
builder.append(", ");
}
}
return builder.append(postfix).toString();
}
}
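
A sketch of the fallback behaviour: if the driver's JSON serializer cannot handle a value, serializeToJsonSafely(…) degrades to the value's toString(), wrapped as { $java : … }, instead of throwing while a query is being logged. The field name below is made up.

import org.springframework.data.mongodb.core.query.SerializationUtils;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

public class SerializationUtilsSample {

	public static void main(String[] args) {
		// A DBObject holding a value JSON.serialize(...) cannot handle (an arbitrary Java object)
		DBObject dbo = new BasicDBObject("lastModified", new Object());

		// Prints something like { "lastModified" : { $java : java.lang.Object@... } }
		System.out.println(SerializationUtils.serializeToJsonSafely(dbo));
	}
}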

View File

@@ -48,7 +48,7 @@ public class MongoRepositoryConfigParser extends
protected void postProcessBeanDefinition(MongoRepositoryConfiguration context, BeanDefinitionBuilder builder,
BeanDefinitionRegistry registry, Object beanSource) {
builder.addPropertyReference("template", context.getMongoTemplateRef());
builder.addPropertyReference("mongoOperations", context.getMongoTemplateRef());
builder.addPropertyValue("createIndexesForQueryMethods", context.getCreateQueryIndexes());
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2002-2011 the original author or authors.
* Copyright 2010-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,15 +15,15 @@
*/
package org.springframework.data.mongodb.repository.query;
import static org.springframework.data.mongodb.repository.query.QueryUtils.applyPagination;
import static org.springframework.data.mongodb.repository.query.QueryUtils.*;
import java.util.List;
import org.springframework.data.domain.PageImpl;
import org.springframework.data.domain.Pageable;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.GeoPage;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.geo.Point;
@@ -34,10 +34,6 @@ import org.springframework.data.repository.query.RepositoryQuery;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
import com.mongodb.DBCollection;
import com.mongodb.DBCursor;
import com.mongodb.DBObject;
/**
* Base class for {@link RepositoryQuery} implementations for Mongo.
*
@@ -46,45 +42,50 @@ import com.mongodb.DBObject;
public abstract class AbstractMongoQuery implements RepositoryQuery {
private final MongoQueryMethod method;
private final MongoOperations mongoOperations;
private final MongoOperations operations;
/**
* Creates a new {@link AbstractMongoQuery} from the given {@link MongoQueryMethod} and {@link MongoOperations}.
*
* @param method
* @param template
* @param method must not be {@literal null}.
* @param operations must not be {@literal null}.
*/
public AbstractMongoQuery(MongoQueryMethod method, MongoOperations template) {
public AbstractMongoQuery(MongoQueryMethod method, MongoOperations operations) {
Assert.notNull(template);
Assert.notNull(operations);
Assert.notNull(method);
this.method = method;
this.mongoOperations = template;
this.operations = operations;
}
/* (non-Javadoc)
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.RepositoryQuery#getQueryMethod()
*/
public MongoQueryMethod getQueryMethod() {
return method;
}
/*
* (non-Javadoc)
*
* @see org.springframework.data.repository.query.RepositoryQuery#execute(java.lang.Object[])
*/
public Object execute(Object[] parameters) {
MongoParameterAccessor accessor = new MongoParametersParameterAccessor(method, parameters);
Query query = createQuery(new ConvertingParameterAccessor(mongoOperations.getConverter(), accessor));
Query query = createQuery(new ConvertingParameterAccessor(operations.getConverter(), accessor));
if (method.isGeoNearQuery()) {
if (method.isGeoNearQuery() && method.isPageQuery()) {
MongoParameterAccessor countAccessor = new MongoParametersParameterAccessor(method, parameters);
Query countQuery = createCountQuery(new ConvertingParameterAccessor(operations.getConverter(), countAccessor));
return new GeoNearExecution(accessor).execute(query, countQuery);
} else if (method.isGeoNearQuery()) {
return new GeoNearExecution(accessor).execute(query);
} else if (method.isCollectionQuery()) {
return new CollectionExecution().execute(query);
return new CollectionExecution(accessor.getPageable()).execute(query);
} else if (method.isPageQuery()) {
return new PagedExecution(accessor.getPageable()).execute(query);
} else {
@@ -93,14 +94,25 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
}
/**
* Create a {@link Query} instance using the given {@link ParameterAccessor}
* Creates a {@link Query} instance using the given {@link ParameterAccessor}
*
* @param accessor
* @param converter
* @param accessor must not be {@literal null}.
* @return
*/
protected abstract Query createQuery(ConvertingParameterAccessor accessor);
/**
* Creates a {@link Query} instance using the given {@link ConvertingParameterAccessor}. Will delegate to
* {@link #createQuery(ConvertingParameterAccessor)} by default but allows customization of the count query to be
* triggered.
*
* @param accessor must not be {@literal null}.
* @return
*/
protected Query createCountQuery(ConvertingParameterAccessor accessor) {
return createQuery(accessor);
}
private abstract class Execution {
abstract Object execute(Query query);
@@ -110,7 +122,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
MongoEntityInformation<?, ?> metadata = method.getEntityInformation();
String collectionName = metadata.getCollectionName();
return mongoOperations.find(query, metadata.getJavaType(), collectionName);
return operations.find(query, metadata.getJavaType(), collectionName);
}
}
@@ -121,14 +133,23 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
*/
class CollectionExecution extends Execution {
private final Pageable pageable;
CollectionExecution(Pageable pageable) {
this.pageable = pageable;
}
/*
* (non-Javadoc)
*
* @see org.springframework.data.mongodb.repository.MongoQuery.Execution #execute(com.mongodb.DBObject)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery.Execution#execute(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public Object execute(Query query) {
if (pageable != null) {
query = applyPagination(query, pageable);
}
return readCollection(query);
}
}
@@ -162,24 +183,13 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
Object execute(Query query) {
MongoEntityInformation<?, ?> metadata = method.getEntityInformation();
int count = getCollectionCursor(metadata.getCollectionName(), query.getQueryObject()).count();
long count = operations.count(query, metadata.getCollectionName());
List<?> result = mongoOperations.find(applyPagination(query, pageable), metadata.getJavaType(),
List<?> result = operations.find(applyPagination(query, pageable), metadata.getJavaType(),
metadata.getCollectionName());
return new PageImpl(result, pageable, count);
}
private DBCursor getCollectionCursor(String collectionName, final DBObject query) {
return mongoOperations.execute(collectionName, new CollectionCallback<DBCursor>() {
public DBCursor doInCollection(DBCollection collection) {
return collection.find(query);
}
});
}
}
/**
@@ -197,7 +207,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
Object execute(Query query) {
MongoEntityInformation<?, ?> entityInformation = method.getEntityInformation();
return mongoOperations.findOne(query, entityInformation.getJavaType());
return operations.findOne(query, entityInformation.getJavaType());
}
}
@@ -221,6 +231,28 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
@Override
Object execute(Query query) {
GeoResults<?> results = doExecuteQuery(query);
return isListOfGeoResult() ? results.getContent() : results;
}
/**
* Executes the given {@link Query} to return a page.
*
* @param query must not be {@literal null}.
* @param countQuery must not be {@literal null}.
* @return
*/
Object execute(Query query, Query countQuery) {
MongoEntityInformation<?, ?> information = method.getEntityInformation();
long count = operations.count(countQuery, information.getCollectionName());
return new GeoPage<Object>(doExecuteQuery(query), accessor.getPageable(), count);
}
@SuppressWarnings("unchecked")
private GeoResults<Object> doExecuteQuery(Query query) {
Point nearLocation = accessor.getGeoNearLocation();
NearQuery nearQuery = NearQuery.near(nearLocation);
@@ -234,9 +266,8 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
}
MongoEntityInformation<?, ?> entityInformation = method.getEntityInformation();
GeoResults<?> results = mongoOperations.geoNear(nearQuery, entityInformation.getJavaType(), entityInformation.getCollectionName());
return isListOfGeoResult() ? results.getContent() : results;
return (GeoResults<Object>) operations.geoNear(nearQuery, entityInformation.getJavaType(),
entityInformation.getCollectionName());
}
private boolean isListOfGeoResult() {

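The isGeoNearQuery()/isPageQuery() combination above is what a repository method returning a GeoPage triggers: the near query serves the requested page and a separate count query supplies the page metadata. A hypothetical repository sketch; Person is an assumed domain type.

import org.springframework.data.domain.Pageable;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.GeoPage;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.repository.MongoRepository;

// Hypothetical repository; Person is an assumed domain type with a String id.
interface PersonRepository extends MongoRepository<Person, String> {

	// Both a geo-near query and a page query, so the extra count query is issued
	GeoPage<Person> findByLocationNear(Point location, Distance distance, Pageable pageable);
}
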
View File

@@ -17,7 +17,6 @@ package org.springframework.data.mongodb.repository.query;
import java.io.Serializable;
/**
* Interface for components being able to provide {@link EntityInformationCreator} for a given {@link Class}.
*

View File

@@ -89,7 +89,8 @@ public class MongoParameters extends Parameters {
if (this.nearIndex == null && mongoParameter.isManuallyAnnotatedNearParameter()) {
this.nearIndex = mongoParameter.getIndex();
} else if (mongoParameter.isManuallyAnnotatedNearParameter()) {
throw new IllegalStateException(String.format("Found multiple @Near annotations ond method %s! Only one allowed!", parameter.getMethod().toString()));
throw new IllegalStateException(String.format(
"Found multiple @Near annotations on method %s! Only one allowed!", parameter.getMethod().toString()));
}
return mongoParameter;
@@ -142,8 +143,7 @@ public class MongoParameters extends Parameters {
*/
@Override
public boolean isSpecialParameter() {
return super.isSpecialParameter() || getType().equals(Distance.class)
|| isNearParameter();
return super.isSpecialParameter() || getType().equals(Distance.class) || isNearParameter();
}
private boolean isNearParameter() {

View File

@@ -31,7 +31,6 @@ import org.springframework.data.mongodb.core.geo.Shape;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.data.mongodb.core.query.OrQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.repository.query.ConvertingParameterAccessor.PotentiallyConvertingIterator;
import org.springframework.data.repository.query.parser.AbstractQueryCreator;
@@ -45,7 +44,7 @@ import org.springframework.util.Assert;
*
* @author Oliver Gierke
*/
class MongoQueryCreator extends AbstractQueryCreator<Query, Query> {
class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
private static final Log LOG = LogFactory.getLog(MongoQueryCreator.class);
private final MongoParameterAccessor accessor;
@@ -92,7 +91,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Query> {
* @see org.springframework.data.repository.query.parser.AbstractQueryCreator#create(org.springframework.data.repository.query.parser.Part, java.util.Iterator)
*/
@Override
protected Query create(Part part, Iterator<Object> iterator) {
protected Criteria create(Part part, Iterator<Object> iterator) {
if (isGeoNearQuery && part.getType().equals(Type.NEAR)) {
return null;
@@ -103,7 +102,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Query> {
where(path.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE)),
(PotentiallyConvertingIterator) iterator);
return new Query(criteria);
return criteria;
}
/*
@@ -111,18 +110,18 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Query> {
* @see org.springframework.data.repository.query.parser.AbstractQueryCreator#and(org.springframework.data.repository.query.parser.Part, java.lang.Object, java.util.Iterator)
*/
@Override
protected Query and(Part part, Query base, Iterator<Object> iterator) {
protected Criteria and(Part part, Criteria base, Iterator<Object> iterator) {
if (base == null) {
return create(part, iterator);
}
PersistentPropertyPath<MongoPersistentProperty> path2 = context.getPersistentPropertyPath(part.getProperty());
PersistentPropertyPath<MongoPersistentProperty> path = context.getPersistentPropertyPath(part.getProperty());
Criteria criteria = from(part.getType(),
where(path2.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE)),
(PotentiallyConvertingIterator) iterator);
return base.addCriteria(criteria);
return new Criteria().andOperator(
base,
from(part.getType(), where(path.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE)),
(PotentiallyConvertingIterator) iterator));
}
/*
@@ -133,8 +132,10 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Query> {
* #or(java.lang.Object, java.lang.Object)
*/
@Override
protected Query or(Query base, Query query) {
return new OrQuery(new Query[] { base, query });
protected Criteria or(Criteria base, Criteria criteria) {
Criteria result = new Criteria();
return result.orOperator(base, criteria);
}
/*
@@ -145,16 +146,17 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Query> {
* #complete(java.lang.Object, org.springframework.data.domain.Sort)
*/
@Override
protected Query complete(Query query, Sort sort) {
protected Query complete(Criteria criteria, Sort sort) {
if (query == null) {
if (criteria == null) {
return null;
}
Query query = new Query(criteria);
QueryUtils.applySorting(query, sort);
if (LOG.isDebugEnabled()) {
LOG.debug("Created query " + query.getQueryObject());
LOG.debug("Created query " + query);
}
return query;
@@ -203,7 +205,8 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Query> {
case NEAR:
Distance distance = accessor.getMaxDistance();
Point point = nextAs(parameters, Point.class);
Point point = accessor.getGeoNearLocation();
point = point == null ? nextAs(parameters, Point.class) : point;
if (distance == null) {
return criteria.near(point);

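The creator now assembles Criteria trees via andOperator(…)/orOperator(…) instead of merging Query instances. A hedged sketch of the resulting query shape for derived AND/OR methods; property names and values are illustrative.

import static org.springframework.data.mongodb.core.query.Criteria.where;

import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

public class DerivedCriteriaSample {

	public static void main(String[] args) {
		// Roughly what ...ByLastnameAndFirstname(...) now yields: { "$and" : [ ..., ... ] }
		Criteria and = new Criteria().andOperator(where("lastname").is("Doe"), where("firstname").is("Jane"));

		// Roughly what ...ByLastnameOrFirstname(...) yields: { "$or" : [ ..., ... ] }
		Criteria or = new Criteria().orOperator(where("lastname").is("Doe"), where("firstname").is("Jane"));

		System.out.println(new Query(and).getQueryObject());
		System.out.println(new Query(or).getQueryObject());
	}
}
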
View File

@@ -38,8 +38,8 @@ public class PartTreeMongoQuery extends AbstractMongoQuery {
/**
* Creates a new {@link PartTreeMongoQuery} from the given {@link QueryMethod} and {@link MongoTemplate}.
*
* @param method
* @param template
* @param method must not be {@literal null}.
* @param template must not be {@literal null}.
*/
public PartTreeMongoQuery(MongoQueryMethod method, MongoOperations mongoOperations) {
@@ -50,6 +50,8 @@ public class PartTreeMongoQuery extends AbstractMongoQuery {
}
/**
* Return the {@link PartTree} backing the query.
*
* @return the tree
*/
public PartTree getTree() {
@@ -58,10 +60,7 @@ public class PartTreeMongoQuery extends AbstractMongoQuery {
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.mongodb.repository.AbstractMongoQuery#createQuery(org.springframework.data.
* document.mongodb.repository.ConvertingParameterAccessor)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#createQuery(org.springframework.data.mongodb.repository.query.ConvertingParameterAccessor, boolean)
*/
@Override
protected Query createQuery(ConvertingParameterAccessor accessor) {
@@ -69,4 +68,13 @@ public class PartTreeMongoQuery extends AbstractMongoQuery {
MongoQueryCreator creator = new MongoQueryCreator(tree, accessor, context, isGeoNearQuery);
return creator.createQuery();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#createCountQuery(org.springframework.data.mongodb.repository.query.ConvertingParameterAccessor)
*/
@Override
protected Query createCountQuery(ConvertingParameterAccessor accessor) {
return new MongoQueryCreator(tree, accessor, context, false).createQuery();
}
}

View File

@@ -20,11 +20,12 @@ import java.util.regex.Pattern;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.bson.types.ObjectId;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Query;
import com.mongodb.util.JSON;
/**
* Query to use a plain JSON String to create the {@link Query} to actually execute.
*
@@ -56,10 +57,7 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.mongodb.repository.AbstractMongoQuery#createQuery(org.springframework.data.
* repository.query.SimpleParameterAccessor, org.springframework.data.mongodb.core.core.support.convert.MongoConverter)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#createQuery(org.springframework.data.mongodb.repository.query.ConvertingParameterAccessor)
*/
@Override
protected Query createQuery(ConvertingParameterAccessor accessor) {
@@ -99,13 +97,6 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
}
private String getParameterWithIndex(ConvertingParameterAccessor accessor, int index) {
Object parameter = accessor.getBindableValue(index);
if (parameter instanceof String || parameter.getClass().isEnum()) {
return String.format("\"%s\"", parameter);
} else if (parameter instanceof ObjectId) {
return String.format("{ '$oid' : '%s' }", parameter);
}
return parameter.toString();
return JSON.serialize(accessor.getBindableValue(index));
}
}
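
Parameter binding now runs through JSON.serialize(…), which quotes Strings and renders ObjectIds in their { "$oid" : … } form without the hand-rolled checks removed above. A hypothetical String-based query for illustration; Person is an assumed domain type.

import java.util.List;

import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.Query;

// Hypothetical repository; the ?0 placeholder is replaced with the JSON-serialized argument.
interface PersonLastnameRepository extends MongoRepository<Person, String> {

	@Query("{ 'lastname' : ?0 }")
	List<Person> findByLastname(String lastname);
}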

View File

@@ -25,7 +25,8 @@ import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
import org.springframework.util.Assert;
/**
* Simple {@link EntityInformationCreator} to to create {@link MongoEntityInformation} instances based on a {@link MappingContext}.
* Simple {@link EntityInformationCreator} to create {@link MongoEntityInformation} instances based on a
* {@link MappingContext}.
*
* @author Oliver Gierke
*/

View File

@@ -17,7 +17,7 @@ package org.springframework.data.mongodb.repository.support;
import java.io.Serializable;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.repository.Repository;
import org.springframework.data.repository.core.support.RepositoryFactoryBeanSupport;
@@ -32,17 +32,17 @@ import org.springframework.util.Assert;
public class MongoRepositoryFactoryBean<T extends Repository<S, ID>, S, ID extends Serializable> extends
RepositoryFactoryBeanSupport<T, S, ID> {
private MongoTemplate template;
private MongoOperations operations;
private boolean createIndexesForQueryMethods = false;
/**
* Configures the {@link MongoTemplate} to be used.
* Configures the {@link MongoOperations} to be used.
*
* @param template the template to set
* @param operations the operations to set
*/
public void setTemplate(MongoTemplate template) {
public void setMongoOperations(MongoOperations operations) {
this.template = template;
this.operations = operations;
}
/**
@@ -64,10 +64,10 @@ public class MongoRepositoryFactoryBean<T extends Repository<S, ID>, S, ID exten
@Override
protected final RepositoryFactorySupport createRepositoryFactory() {
RepositoryFactorySupport factory = getFactoryInstance(template);
RepositoryFactorySupport factory = getFactoryInstance(operations);
if (createIndexesForQueryMethods) {
factory.addQueryCreationListener(new IndexEnsuringQueryCreationListener(template));
factory.addQueryCreationListener(new IndexEnsuringQueryCreationListener(operations));
}
return factory;
@@ -76,11 +76,11 @@ public class MongoRepositoryFactoryBean<T extends Repository<S, ID>, S, ID exten
/**
* Creates and initializes a {@link RepositoryFactorySupport} instance.
*
* @param template
* @param operations
* @return
*/
protected RepositoryFactorySupport getFactoryInstance(MongoTemplate template) {
return new MongoRepositoryFactory(template);
protected RepositoryFactorySupport getFactoryInstance(MongoOperations operations) {
return new MongoRepositoryFactory(operations);
}
/*
@@ -94,6 +94,6 @@ public class MongoRepositoryFactoryBean<T extends Repository<S, ID>, S, ID exten
public void afterPropertiesSet() {
super.afterPropertiesSet();
Assert.notNull(template, "MongoTemplate must not be null!");
Assert.notNull(operations, "MongoTemplate must not be null!");
}
}

View File

@@ -28,6 +28,7 @@ import org.springframework.data.domain.Sort.Order;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.QueryMapper;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
@@ -235,6 +236,7 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
private final MongoConverter converter;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final QueryMapper mapper;
/**
* Creates a new {@link SpringDataMongodbSerializer} for the given {@link MappingContext}.
@@ -244,6 +246,7 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
public SpringDataMongodbSerializer(MongoConverter converter) {
this.mappingContext = converter.getMappingContext();
this.converter = converter;
this.mapper = new QueryMapper(converter);
}
@Override
@@ -252,12 +255,16 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
Path<?> parent = metadata.getParent();
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(parent.getType());
MongoPersistentProperty property = entity.getPersistentProperty(metadata.getExpression().toString());
return property.getFieldName();
return property == null ? super.getKeyForPath(expr, metadata) : property.getFieldName();
}
@Override
protected DBObject asDBObject(String key, Object value) {
if ("_id".equals(key)) {
return super.asDBObject(key, mapper.convertId(value));
}
return super.asDBObject(key, value instanceof Pattern ? value : converter.convertToMongoType(value));
}
}

View File

@@ -61,9 +61,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.Repository#save(java.lang.Object)
* @see org.springframework.data.repository.CrudRepository#save(java.lang.Object)
*/
public T save(T entity) {
@@ -75,9 +73,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.Repository#save(java.lang.Iterable)
* @see org.springframework.data.repository.CrudRepository#save(java.lang.Iterable)
*/
public List<T> save(Iterable<? extends T> entities) {
@@ -95,10 +91,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.Repository#findById(java.io.Serializable
* )
* @see org.springframework.data.repository.CrudRepository#findOne(java.io.Serializable)
*/
public T findOne(ID id) {
Assert.notNull(id, "The given id must not be null!");
@@ -115,10 +108,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.Repository#exists(java.io.Serializable
* )
* @see org.springframework.data.repository.CrudRepository#exists(java.io.Serializable)
*/
public boolean exists(ID id) {
@@ -129,8 +119,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
/*
* (non-Javadoc)
*
* @see org.springframework.data.repository.Repository#count()
* @see org.springframework.data.repository.CrudRepository#count()
*/
public long count() {
@@ -139,7 +128,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
/*
* (non-Javadoc)
* @see org.springframework.data.repository.Repository#delete(java.io.Serializable)
* @see org.springframework.data.repository.CrudRepository#delete(java.io.Serializable)
*/
public void delete(ID id) {
Assert.notNull(id, "The given id must not be null!");
@@ -148,9 +137,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.Repository#delete(java.lang.Object)
* @see org.springframework.data.repository.CrudRepository#delete(java.lang.Object)
*/
public void delete(T entity) {
Assert.notNull(entity, "The given entity must not be null!");
@@ -159,9 +146,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.Repository#delete(java.lang.Iterable)
* @see org.springframework.data.repository.CrudRepository#delete(java.lang.Iterable)
*/
public void delete(Iterable<? extends T> entities) {
@@ -174,8 +159,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
/*
* (non-Javadoc)
*
* @see org.springframework.data.repository.Repository#deleteAll()
* @see org.springframework.data.repository.CrudRepository#deleteAll()
*/
public void deleteAll() {
@@ -184,8 +168,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
/*
* (non-Javadoc)
*
* @see org.springframework.data.repository.Repository#findAll()
* @see org.springframework.data.repository.CrudRepository#findAll()
*/
public List<T> findAll() {
@@ -194,10 +177,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.PagingAndSortingRepository#findAll
* (org.springframework.data.domain.Pageable)
* @see org.springframework.data.repository.PagingAndSortingRepository#findAll(org.springframework.data.domain.Pageable)
*/
public Page<T> findAll(final Pageable pageable) {
@@ -209,43 +189,13 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.PagingAndSortingRepository#findAll
* (org.springframework.data.domain.Sort)
* @see org.springframework.data.repository.PagingAndSortingRepository#findAll(org.springframework.data.domain.Sort)
*/
public List<T> findAll(final Sort sort) {
return findAll(QueryUtils.applySorting(new Query(), sort));
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.Repository#findAll(java.lang.Iterable
* )
*/
public List<T> findAll(Iterable<ID> ids) {
Query query = null;
//TODO: verify intent
// for (ID id : ids) {
// if (query == null) {
// query = getIdQuery(id);
// } else {
// query = new Query().or(getIdQuery(id));
// }
// }
List<ID> idList = new ArrayList<ID>();
for (ID id : ids) {
idList.add(id);
}
query = new Query(Criteria.where(entityInformation.getIdAttribute()).in(idList));
return findAll(query);
}
private List<T> findAll(Query query) {
if (query == null) {

View File

@@ -7,16 +7,14 @@
xmlns:context="http://www.springframework.org/schema/context"
xmlns:repository="http://www.springframework.org/schema/data/repository"
targetNamespace="http://www.springframework.org/schema/data/mongo"
elementFormDefault="qualified" attributeFormDefault="unqualified"
xsi:schemaLocation="http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-2.5.xsd">
elementFormDefault="qualified" attributeFormDefault="unqualified">
<xsd:import namespace="http://www.springframework.org/schema/beans" />
<xsd:import namespace="http://www.springframework.org/schema/tool" />
<xsd:import namespace="http://www.springframework.org/schema/context"
schemaLocation="http://www.springframework.org/schema/context/spring-context.xsd" />
<xsd:import namespace="http://www.springframework.org/schema/data/repository"
schemaLocation="http://www.springframework.org/schema/data/repository/spring-repository.xsd"/>
schemaLocation="http://www.springframework.org/schema/data/repository/spring-repository-1.0.xsd" />
<xsd:element name="mongo" type="mongoType">
<xsd:annotation>
@@ -46,8 +44,9 @@ The name of the mongo definition (by default "mongoDbFactory").]]></xsd:document
</xsd:attribute>
<xsd:attribute name="mongo-ref" type="mongoRef" use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a Mongo. Will default to 'mongo'.
<xsd:documentation><![CDATA[
The reference to a Mongo instance. If not configured, a default com.mongodb.Mongo instance will be created.
]]>
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>

View File

@@ -45,7 +45,6 @@ public class MappingMongoConverterParserIntegrationTests {
DefaultListableBeanFactory factory;
@Before
public void setUp() {
factory = new DefaultListableBeanFactory();

View File

@@ -32,6 +32,7 @@ import org.springframework.context.support.ClassPathXmlApplicationContext;
import org.springframework.core.io.ClassPathResource;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.DB;
import com.mongodb.Mongo;
@@ -59,7 +60,7 @@ public class MongoDbFactoryParserIntegrationTests {
SimpleMongoDbFactory dbFactory = new SimpleMongoDbFactory(new Mongo("localhost"), "database");
dbFactory.setWriteConcern(WriteConcern.SAFE);
dbFactory.getDb();
assertThat(WriteConcern.SAFE, is(dbFactory.getWriteConcern()));
assertThat(ReflectionTestUtils.getField(dbFactory, "writeConcern"), is((Object) WriteConcern.SAFE));
}
@Test
@@ -70,7 +71,8 @@ public class MongoDbFactoryParserIntegrationTests {
@Test
public void parsesCustomWriteConcern() {
ClassPathXmlApplicationContext ctx = new ClassPathXmlApplicationContext("namespace/db-factory-bean-custom-write-concern.xml");
ClassPathXmlApplicationContext ctx = new ClassPathXmlApplicationContext(
"namespace/db-factory-bean-custom-write-concern.xml");
assertWriteConcern(ctx, new WriteConcern("rack1"));
}
@@ -90,16 +92,17 @@ public class MongoDbFactoryParserIntegrationTests {
private void assertWriteConcern(ClassPathXmlApplicationContext ctx, WriteConcern expectedWriteConcern) {
SimpleMongoDbFactory dbFactory = ctx.getBean("first", SimpleMongoDbFactory.class);
DB db = dbFactory.getDb();
assertThat("db", is(db.getName()));
assertThat(db.getName(), is("db"));
MyWriteConcern myDbFactoryWriteConcern = new MyWriteConcern(dbFactory.getWriteConcern());
WriteConcern configuredConcern = (WriteConcern) ReflectionTestUtils.getField(dbFactory, "writeConcern");
MyWriteConcern myDbFactoryWriteConcern = new MyWriteConcern(configuredConcern);
MyWriteConcern myDbWriteConcern = new MyWriteConcern(db.getWriteConcern());
MyWriteConcern myExpectedWriteConcern = new MyWriteConcern(expectedWriteConcern);
assertThat(myDbFactoryWriteConcern, equalTo(myExpectedWriteConcern));
assertThat(myDbWriteConcern, equalTo(myExpectedWriteConcern));
assertThat(myDbWriteConcern, equalTo(myDbFactoryWriteConcern));
assertThat(myDbFactoryWriteConcern, is(myExpectedWriteConcern));
assertThat(myDbWriteConcern, is(myExpectedWriteConcern));
assertThat(myDbWriteConcern, is(myDbFactoryWriteConcern));
}
// This test will fail since equals in WriteConcern uses == for _w and not .equals
@@ -108,11 +111,9 @@ public class MongoDbFactoryParserIntegrationTests {
String s2 = new String("rack1");
WriteConcern wc1 = new WriteConcern(s1);
WriteConcern wc2 = new WriteConcern(s2);
assertThat(wc1, equalTo(wc2));
assertThat(wc1, is(wc2));
}
@Test
public void createsDbFactoryBean() {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.config;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.util.List;
@@ -29,6 +30,7 @@ import org.springframework.data.mongodb.core.MongoFactoryBean;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.CommandResult;
import com.mongodb.Mongo;
@@ -36,47 +38,45 @@ import com.mongodb.ServerAddress;
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class MongoNamespaceReplicaSetTests extends NamespaceTestSupport {
public class MongoNamespaceReplicaSetTests {
@Autowired
private ApplicationContext ctx;
@Test
@SuppressWarnings("unchecked")
public void testParsingMongoWithReplicaSets() throws Exception {
assertTrue(ctx.containsBean("replicaSetMongo"));
MongoFactoryBean mfb = (MongoFactoryBean) ctx.getBean("&replicaSetMongo");
List<ServerAddress> replicaSetSeeds = readField("replicaSetSeeds", mfb);
assertNotNull(replicaSetSeeds);
assertEquals("127.0.0.1", replicaSetSeeds.get(0).getHost());
assertEquals(10001, replicaSetSeeds.get(0).getPort());
assertEquals("localhost", replicaSetSeeds.get(1).getHost());
assertEquals(10002, replicaSetSeeds.get(1).getPort());
List<ServerAddress> replicaSetSeeds = (List<ServerAddress>) ReflectionTestUtils.getField(mfb, "replicaSetSeeds");
assertThat(replicaSetSeeds, is(notNullValue()));
assertThat(replicaSetSeeds, hasItems(new ServerAddress("127.0.0.1", 10001), new ServerAddress("localhost", 10002)));
}
@Test
@SuppressWarnings("unchecked")
public void testParsingWithPropertyPlaceHolder() throws Exception {
assertTrue(ctx.containsBean("manyReplicaSetMongo"));
MongoFactoryBean mfb = (MongoFactoryBean) ctx.getBean("&manyReplicaSetMongo");
List<ServerAddress> replicaSetSeeds = readField("replicaSetSeeds", mfb);
assertNotNull(replicaSetSeeds);
assertEquals("192.168.174.130", replicaSetSeeds.get(0).getHost());
assertEquals(27017, replicaSetSeeds.get(0).getPort());
assertEquals("192.168.174.130", replicaSetSeeds.get(1).getHost());
assertEquals(27018, replicaSetSeeds.get(1).getPort());
assertEquals("192.168.174.130", replicaSetSeeds.get(2).getHost());
assertEquals(27019, replicaSetSeeds.get(2).getPort());
List<ServerAddress> replicaSetSeeds = (List<ServerAddress>) ReflectionTestUtils.getField(mfb, "replicaSetSeeds");
assertThat(replicaSetSeeds, is(notNullValue()));
assertThat(replicaSetSeeds, hasSize(3));
assertThat(
replicaSetSeeds,
hasItems(new ServerAddress("192.168.174.130", 27017), new ServerAddress("192.168.174.130", 27018),
new ServerAddress("192.168.174.130", 27019)));
}
@Test
@Ignore("CI infrastructure does not yet support replica sets")
public void testMongoWithReplicaSets() {
Mongo mongo = ctx.getBean(Mongo.class);
assertEquals(2, mongo.getAllAddress().size());
List<ServerAddress> servers = mongo.getAllAddress();
@@ -88,6 +88,5 @@ public class MongoNamespaceReplicaSetTests extends NamespaceTestSupport {
MongoTemplate template = new MongoTemplate(mongo, "admin");
CommandResult result = template.executeCommand("{replSetGetStatus : 1}");
assertEquals("blort", result.getString("set"));
}
}

View File

@@ -16,17 +16,19 @@
package org.springframework.data.mongodb.config;
import static org.springframework.test.util.ReflectionTestUtils.*;
import static org.junit.Assert.*;
import static org.springframework.test.util.ReflectionTestUtils.*;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationContext;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoFactoryBean;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import com.mongodb.Mongo;
import com.mongodb.MongoOptions;
@@ -62,8 +64,7 @@ public class MongoNamespaceTests {
Mongo mongo = (Mongo) getField(dbf, "mongo");
assertEquals("localhost", mongo.getAddress().getHost());
assertEquals(27017, mongo.getAddress().getPort());
assertEquals("joe", getField(dbf, "username"));
assertEquals("secret", getField(dbf, "password"));
assertEquals(new UserCredentials("joe", "secret"), getField(dbf, "credentials"));
assertEquals("database", getField(dbf, "databaseName"));
}

View File

@@ -17,6 +17,7 @@ public class MyWriteConcern {
boolean _fsync = false;
boolean _j = false;
boolean _continueOnErrorForInsert = false;
@Override
public int hashCode() {
final int prime = 31;
@@ -28,6 +29,7 @@ public class MyWriteConcern {
result = prime * result + _wtimeout;
return result;
}
@Override
public boolean equals(Object obj) {
if (this == obj)

View File

@@ -1,43 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.lang.reflect.Field;
public class NamespaceTestSupport {
@SuppressWarnings({ "unchecked" })
public static <T> T readField(String name, Object target) throws Exception {
Field field = null;
Class<?> clazz = target.getClass();
do {
try {
field = clazz.getDeclaredField(name);
} catch (Exception ex) {
}
clazz = clazz.getSuperclass();
} while (field == null && !clazz.equals(Object.class));
if (field == null)
throw new IllegalArgumentException("Cannot find field '" + name + "' in the class hierarchy of "
+ target.getClass());
field.setAccessible(true);
return (T) field.get(target);
}
}
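The hand-rolled readField(…) helper above is removed; the updated tests in this change rely on Spring's ReflectionTestUtils instead. A minimal sketch of the equivalent call, assuming a target bean with a private replicaSetSeeds field (the sketch class name is made up):
import java.util.List;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.ServerAddress;
class ReadFieldMigrationSketch {
	// Equivalent of the removed readField("replicaSetSeeds", mfb) call: ReflectionTestUtils
	// walks the class hierarchy and handles accessibility, so no custom helper is needed.
	@SuppressWarnings("unchecked")
	static List<ServerAddress> replicaSetSeedsOf(Object mongoFactoryBean) {
		return (List<ServerAddress>) ReflectionTestUtils.getField(mongoFactoryBean, "replicaSetSeeds");
	}
}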

View File

@@ -0,0 +1,80 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.net.UnknownHostException;
import java.util.Arrays;
import java.util.Collection;
import org.junit.Before;
import org.junit.Test;
import com.mongodb.ServerAddress;
/**
* Unit tests for {@link ServerAddressPropertyEditor}.
*
* @author Oliver Gierke
*/
public class ServerAddressPropertyEditorUnitTests {
ServerAddressPropertyEditor editor;
@Before
public void setUp() {
editor = new ServerAddressPropertyEditor();
}
/**
* @see DATAMONGO-454
*/
@Test(expected = IllegalArgumentException.class)
public void rejectsAddressConfigWithoutASingleParsableServerAddress() {
editor.setAsText("foo, bar");
}
/**
* @see DATAMONGO-454
*/
@Test
public void skipsUnparsableAddressIfAtLeastOneIsParsable() throws UnknownHostException {
editor.setAsText("foo, localhost");
assertSingleAddressOfLocalhost(editor.getValue());
}
/**
* @see DATAMONGO-454
*/
@Test
public void handlesEmptyAddressAsParseError() throws UnknownHostException {
editor.setAsText(", localhost");
assertSingleAddressOfLocalhost(editor.getValue());
}
private static void assertSingleAddressOfLocalhost(Object result) throws UnknownHostException {
assertThat(result, is(instanceOf(ServerAddress[].class)));
Collection<ServerAddress> addresses = Arrays.asList((ServerAddress[]) result);
assertThat(addresses, hasSize(1));
assertThat(addresses, hasItem(new ServerAddress("localhost")));
}
}
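A brief usage sketch of the editor under test, following the same setAsText(…)/getValue() sequence the tests above exercise (the host names and helper class are illustrative only):
import com.mongodb.ServerAddress;
class ServerAddressEditorUsageSketch {
	// Parses a comma-separated replica-set string; entries that cannot be resolved are
	// skipped as long as at least one address parses, otherwise an IllegalArgumentException
	// is thrown (see the tests above).
	static ServerAddress[] parseSeeds(String replicaSetString) {
		ServerAddressPropertyEditor editor = new ServerAddressPropertyEditor();
		editor.setAsText(replicaSetString); // e.g. "127.0.0.1:27017, localhost:27018"
		return (ServerAddress[]) editor.getValue();
	}
}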

View File

@@ -0,0 +1,43 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.junit.Test;
import com.mongodb.WriteConcern;
/**
* Unit tests for {@link StringToWriteConcernConverter}.
*
* @author Oliver Gierke
*/
public class StringToWriteConcernConverterUnitTest {
StringToWriteConcernConverter converter = new StringToWriteConcernConverter();
@Test
public void createsWellKnownConstantsCorrectly() {
assertThat(converter.convert("SAFE"), is(WriteConcern.SAFE));
}
@Test
public void createsWriteConcernForUnknownValue() {
assertThat(converter.convert("-1"), is(new WriteConcern("-1")));
}
}

View File

@@ -0,0 +1,53 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.junit.Before;
import org.junit.Test;
import com.mongodb.WriteConcern;
/**
* Unit tests for {@link WriteConcernPropertyEditor}.
*
* @author Oliver Gierke
*/
public class WriteConcernPropertyEditorUnitTests {
WriteConcernPropertyEditor editor;
@Before
public void setUp() {
editor = new WriteConcernPropertyEditor();
}
@Test
public void createsWriteConcernForWellKnownConstants() {
editor.setAsText("SAFE");
assertThat(editor.getValue(), is((Object) WriteConcern.SAFE));
}
@Test
public void createsWriteConcernForUnknownConstants() {
editor.setAsText("-1");
assertThat(editor.getValue(), is((Object) new WriteConcern("-1")));
}
}

View File

@@ -0,0 +1,81 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import com.mongodb.BasicDBList;
import com.mongodb.DBObject;
/**
* Helper class to ease assertions on {@link DBObject}s.
*
* @author Oliver Gierke
*/
public abstract class DBObjectUtils {
private DBObjectUtils() {
}
/**
* Expects the field with the given key to be a non-{@literal null} {@link DBObject} and returns it.
*
* @param source the {@link DBObject} to look up the nested one in
* @param key the key of the field to look up the nested {@link DBObject} under
* @return
*/
public static DBObject getAsDBObject(DBObject source, String key) {
return getTypedValue(source, key, DBObject.class);
}
/**
* Expects the field with the given key to be a non-{@literal null} {@link BasicDBList} and returns it.
*
* @param source the {@link DBObject} to look up the {@link BasicDBList} in
* @param key the key of the field to find the {@link BasicDBList} in
* @return
*/
public static BasicDBList getAsDBList(DBObject source, String key) {
return getTypedValue(source, key, BasicDBList.class);
}
/**
* Expects the list element with the given index to be a non-{@literal null} {@link DBObject} and returns it.
*
* @param source the {@link BasicDBList} to look up the {@link DBObject} element in
* @param index the index of the element expected to contain a {@link DBObject}
* @return
*/
public static DBObject getAsDBObject(BasicDBList source, int index) {
assertThat(source.size(), greaterThanOrEqualTo(index + 1));
Object value = source.get(index);
assertThat(value, is(instanceOf(DBObject.class)));
return (DBObject) value;
}
@SuppressWarnings("unchecked")
private static <T> T getTypedValue(DBObject source, String key, Class<T> type) {
Object value = source.get(key);
assertThat(value, is(notNullValue()));
assertThat(value, is(instanceOf(type)));
return (T) value;
}
}
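A short usage sketch of the helpers above in a test (the query document and sketch class are made up): each call asserts presence and type before casting, so a wrong document shape fails with a readable Hamcrest message rather than a ClassCastException.
import static org.springframework.data.mongodb.core.DBObjectUtils.*;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
class DBObjectUtilsUsageSketch {
	// Asserts on a document of the shape { "$or" : [ { "foo" : "bar" } ] }.
	static Object firstOrClauseValue() {
		BasicDBList clauses = new BasicDBList();
		clauses.add(new BasicDBObject("foo", "bar"));
		DBObject query = new BasicDBObject("$or", clauses);
		BasicDBList or = getAsDBList(query, "$or");   // asserts non-null and type
		DBObject firstClause = getAsDBObject(or, 0);  // asserts size and element type
		return firstClause.get("foo");                // "bar"
	}
}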

View File

@@ -15,7 +15,6 @@
*/
package org.springframework.data.mongodb.core;
public class Friend {
private String id;

View File

@@ -0,0 +1,65 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import com.mongodb.DB;
import com.mongodb.Mongo;
/**
* Unit tests for {@link MongoDbUtils}.
*
* @author Oliver Gierke
*/
public class MongoDbUtilsUnitTests {
Mongo mongo;
@Before
public void setUp() throws Exception {
this.mongo = new Mongo();
TransactionSynchronizationManager.initSynchronization();
}
@After
public void tearDown() {
for (Object key : TransactionSynchronizationManager.getResourceMap().keySet()) {
TransactionSynchronizationManager.unbindResource(key);
}
TransactionSynchronizationManager.clearSynchronization();
}
@Test
public void returnsNewInstanceForDifferentDatabaseName() {
DB first = MongoDbUtils.getDB(mongo, "first");
assertThat(first, is(notNullValue()));
assertThat(MongoDbUtils.getDB(mongo, "first"), is(first));
DB second = MongoDbUtils.getDB(mongo, "second");
assertThat(second, is(not(first)));
assertThat(MongoDbUtils.getDB(mongo, "second"), is(second));
}
}

View File

@@ -0,0 +1,52 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.junit.Test;
import org.springframework.beans.factory.support.DefaultListableBeanFactory;
import org.springframework.beans.factory.support.RootBeanDefinition;
import org.springframework.data.mongodb.config.WriteConcernPropertyEditor;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.WriteConcern;
/**
* Integration tests for {@link MongoFactoryBean}.
*
* @author Oliver Gierke
*/
public class MongoFactoryBeanIntegrationTest {
/**
* @see DATAMONGO-408
*/
@Test
public void convertsWriteConcernCorrectly() {
RootBeanDefinition definition = new RootBeanDefinition(MongoFactoryBean.class);
definition.getPropertyValues().addPropertyValue("writeConcern", "SAFE");
DefaultListableBeanFactory factory = new DefaultListableBeanFactory();
factory.registerCustomEditor(WriteConcern.class, WriteConcernPropertyEditor.class);
factory.registerBeanDefinition("factory", definition);
MongoFactoryBean bean = factory.getBean("&factory", MongoFactoryBean.class);
assertThat(ReflectionTestUtils.getField(bean, "writeConcern"), is((Object) WriteConcern.SAFE));
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -39,6 +39,7 @@ import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.convert.converter.Converter;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
@@ -71,6 +72,7 @@ import com.mongodb.WriteResult;
*
* @author Oliver Gierke
* @author Thomas Risberg
* @author Amol Nayak
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:infrastructure.xml")
@@ -130,6 +132,7 @@ public class MongoTemplateTests {
template.dropCollection(template.getCollectionName(PersonWithIdPropertyOfTypeLong.class));
template.dropCollection(template.getCollectionName(PersonWithIdPropertyOfPrimitiveLong.class));
template.dropCollection(template.getCollectionName(TestClass.class));
template.dropCollection(Sample.class);
}
@Test
@@ -159,6 +162,112 @@ public class MongoTemplateTests {
mongoTemplate.updateFirst(q, u, Person.class);
}
/**
* @see DATAMONGO-480
*/
@Test
public void throwsExceptionForDuplicateIds() {
MongoTemplate template = new MongoTemplate(factory);
template.setWriteResultChecking(WriteResultChecking.EXCEPTION);
Person person = new Person(new ObjectId(), "Amol");
person.setAge(28);
template.insert(person);
try {
template.insert(person);
fail("Expected DataIntegrityViolationException!");
} catch (DataIntegrityViolationException e) {
assertThat(e.getMessage(), containsString("E11000 duplicate key error index: database.person.$_id_ dup key:"));
}
}
/**
* @see DATAMONGO-480
*/
@Test
public void throwsExceptionForUpdateWithInvalidPushOperator() {
MongoTemplate template = new MongoTemplate(factory);
template.setWriteResultChecking(WriteResultChecking.EXCEPTION);
ObjectId id = new ObjectId();
Person person = new Person(id, "Amol");
person.setAge(28);
template.insert(person);
try {
Query query = new Query(Criteria.where("firstName").is("Amol"));
Update upd = new Update().push("age", 29);
template.updateFirst(query, upd, Person.class);
fail("Expected DataIntegrityViolationException!");
} catch (DataIntegrityViolationException e) {
assertThat(e.getMessage(),
is("Execution of update with '{ \"$push\" : { \"age\" : 29}}'' using '{ \"firstName\" : \"Amol\"}' "
+ "query failed: Cannot apply $push/$pushAll modifier to non-array"));
}
}
/**
* @see DATAMONGO-480
*/
@Test
public void throwsExceptionForIndexViolationIfConfigured() {
MongoTemplate template = new MongoTemplate(factory);
template.setWriteResultChecking(WriteResultChecking.EXCEPTION);
template.indexOps(Person.class).ensureIndex(new Index().on("firstName", Order.DESCENDING).unique());
Person person = new Person(new ObjectId(), "Amol");
person.setAge(28);
template.save(person);
person = new Person(new ObjectId(), "Amol");
person.setAge(28);
try {
template.save(person);
fail("Expected DataIntegrityViolationException!");
} catch (DataIntegrityViolationException e) {
assertThat(e.getMessage(),
containsString("E11000 duplicate key error index: database.person.$firstName_-1 dup key:"));
}
}
/**
* @see DATAMONGO-480
*/
@Test
public void rejectsDuplicateIdInInsertAll() {
MongoTemplate template = new MongoTemplate(factory);
template.setWriteResultChecking(WriteResultChecking.EXCEPTION);
ObjectId id = new ObjectId();
Person person = new Person(id, "Amol");
person.setAge(28);
List<Person> records = new ArrayList<Person>();
records.add(person);
records.add(person);
try {
template.insertAll(records);
fail("Expected DataIntegrityViolationException!");
} catch (DataIntegrityViolationException e) {
assertThat(
e.getMessage(),
startsWith("Insert list failed: E11000 duplicate key error index: database.person.$_id_ dup key: { : ObjectId"));
}
}
@Test
public void testEnsureIndex() throws Exception {
@@ -189,7 +298,7 @@ public class MongoTemplateTests {
assertThat(dropDupes, is(true));
List<IndexInfo> indexInfoList = template.indexOps(Person.class).getIndexInfo();
System.out.println(indexInfoList);
assertThat(indexInfoList.size(), is(2));
IndexInfo ii = indexInfoList.get(1);
assertThat(ii.isUnique(), is(true));
@@ -906,8 +1015,6 @@ public class MongoTemplateTests {
assertThat(lastMongoAction.getEntityClass().toString(), is(PersonWithIdPropertyOfTypeObjectId.class.toString()));
assertThat(lastMongoAction.getMongoActionOperation(), is(MongoActionOperation.UPDATE));
assertThat(lastMongoAction.getQuery(), equalTo(q.getQueryObject()));
assertThat(lastMongoAction.getDocument(), equalTo(u.getUpdateObject()));
}
private class FsyncSafeWriteConcernResolver implements WriteConcernResolver {
@@ -933,7 +1040,7 @@ public class MongoTemplateTests {
DBRef first = new DBRef(factory.getDb(), "foo", new ObjectId());
DBRef second = new DBRef(factory.getDb(), "bar", new ObjectId());
template.updateFirst(null, Update.update("dbRefs", Arrays.asList(first, second)), ClassWithDBRefs.class);
template.updateFirst(null, update("dbRefs", Arrays.asList(first, second)), ClassWithDBRefs.class);
}
class ClassWithDBRefs {
@@ -1041,7 +1148,7 @@ public class MongoTemplateTests {
List<TestClass> testClassList = mappingTemplate.find(new Query(Criteria.where("myDate").is(dateTime.toDate())),
TestClass.class);
assertThat(testClassList.size(), is(1));
assertThat(testClassList.get(0).getMyDate(), is(testClass.getMyDate()));
assertThat(testClassList.get(0).myDate, is(testClass.myDate));
}
/**
@@ -1080,24 +1187,96 @@ public class MongoTemplateTests {
assertThat(template.findOne(query(where("id").is(id)), Sample.class), is(nullValue()));
}
public class Sample {
/**
* @see DATAMONGO-423
*/
@Test
public void executesQueryWithNegatedRegexCorrectly() {
Sample first = new Sample();
first.field = "Matthews";
Sample second = new Sample();
second.field = "Beauford";
template.save(first);
template.save(second);
Query query = query(where("field").not().regex("Matthews"));
List<Sample> result = template.find(query, Sample.class);
assertThat(result.size(), is(1));
assertThat(result.get(0).field, is("Beauford"));
}
/**
* @see DATAMONGO-447
*/
@Test
public void storesAndRemovesTypeWithComplexId() {
MyId id = new MyId();
id.first = "foo";
id.second = "bar";
TypeWithMyId source = new TypeWithMyId();
source.id = id;
template.save(source);
template.remove(query(where("id").is(id)), TypeWithMyId.class);
}
/**
* @see DATAMONGO-539
*/
@Test
public void removesObjectFromExplicitCollection() {
String collectionName = "explicit";
template.remove(new Query(), collectionName);
PersonWithConvertedId person = new PersonWithConvertedId();
person.name = "Dave";
template.save(person, collectionName);
assertThat(template.findAll(PersonWithConvertedId.class, collectionName).isEmpty(), is(false));
template.remove(person, collectionName);
assertThat(template.findAll(PersonWithConvertedId.class, collectionName).isEmpty(), is(true));
}
static class MyId {
String first;
String second;
}
static class TypeWithMyId {
@Id
MyId id;
}
public static class Sample {
@Id
String id;
String field;
}
public class TestClass {
static class TestClass {
private DateTime myDate;
DateTime myDate;
@PersistenceConstructor
public TestClass(DateTime date) {
this.myDate = date;
TestClass(DateTime myDate) {
this.myDate = myDate;
}
}
public DateTime getMyDate() {
return myDate;
}
static class PersonWithConvertedId {
String id;
String name;
}
static enum DateTimeToDateConverter implements Converter<DateTime, Date> {

View File

@@ -20,6 +20,7 @@ import static org.junit.Assert.*;
import static org.mockito.Mockito.*;
import java.math.BigInteger;
import java.util.regex.Pattern;
import org.bson.types.ObjectId;
import org.junit.Before;
@@ -136,6 +137,31 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
assertThat(entity.id, is(notNullValue()));
}
/**
* @see DATAMONGO-474
*/
@Test
public void setsUnpopulatedIdField() {
NotAutogenerateableId entity = new NotAutogenerateableId();
template.populateIdIfNecessary(entity, 5);
assertThat(entity.id, is(5));
}
/**
* @see DATAMONGO-474
*/
@Test
public void doesNotSetAlreadyPopulatedId() {
NotAutogenerateableId entity = new NotAutogenerateableId();
entity.id = 5;
template.populateIdIfNecessary(entity, 7);
assertThat(entity.id, is(5));
}
class AutogenerateableId {
@Id
@@ -146,6 +172,10 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
@Id
Integer id;
public Pattern getId() {
return Pattern.compile(".");
}
}
/**
@@ -161,8 +191,9 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
return template;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoOperationsUnitTests#getOperations()
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperationsUnitTests#getOperationsForExceptionHandling()
*/
@Override
protected MongoOperations getOperationsForExceptionHandling() {
@@ -171,8 +202,9 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
return template;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoOperationsUnitTests#getOperations()
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperationsUnitTests#getOperations()
*/
@Override
protected MongoOperations getOperations() {

View File

@@ -0,0 +1,67 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import static org.springframework.data.mongodb.core.query.SerializationUtils.*;
import java.util.Arrays;
import org.hamcrest.Matcher;
import org.junit.Test;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Unit tests for {@link SerializationUtils}.
*
* @author Oliver Gierke
*/
public class SerializationUtilsUnitTests {
@Test
public void writesSimpleDBObject() {
DBObject dbObject = new BasicDBObject("foo", "bar");
assertThat(serializeToJsonSafely(dbObject), is("{ \"foo\" : \"bar\"}"));
}
@Test
public void writesComplexObjectAsPlainToString() {
DBObject dbObject = new BasicDBObject("foo", new Complex());
assertThat(serializeToJsonSafely(dbObject),
startsWith("{ \"foo\" : { $java : org.springframework.data.mongodb.core.SerializationUtilsUnitTests$Complex"));
}
@Test
public void writesCollection() {
DBObject dbObject = new BasicDBObject("foo", Arrays.asList("bar", new Complex()));
Matcher<String> expectedOutput = allOf(
startsWith("{ \"foo\" : [ \"bar\", { $java : org.springframework.data.mongodb.core.SerializationUtilsUnitTests$Complex"),
endsWith(" } ] }"));
assertThat(serializeToJsonSafely(dbObject), is(expectedOutput));
}
static class Complex {
}
}
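A brief sketch of the intended use of serializeToJsonSafely(…): building debug or exception messages from arbitrary driver objects without risking a serialization failure (the helper below is illustrative only).
import static org.springframework.data.mongodb.core.query.SerializationUtils.*;
import com.mongodb.DBObject;
class SafeQueryLoggingSketch {
	// Values the JSON serializer cannot handle fall back to a { $java : ... } style
	// toString() rendering (see writesComplexObjectAsPlainToString above), so assembling
	// the message never throws.
	static String describeFind(DBObject query, DBObject fields, String collection) {
		return "find using query: " + serializeToJsonSafely(query) + " fields: "
				+ serializeToJsonSafely(fields) + " in collection: " + collection;
	}
}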

View File

@@ -15,8 +15,8 @@
*/
package org.springframework.data.mongodb.core;
import static org.junit.Assert.*;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import java.net.UnknownHostException;
@@ -24,6 +24,7 @@ import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.test.util.ReflectionTestUtils;
@@ -70,8 +71,8 @@ public class SimpleMongoDbFactoryUnitTests {
MongoURI mongoURI = new MongoURI("mongodb://myUsername:myPassword@localhost/myDatabase.myCollection");
MongoDbFactory mongoDbFactory = new SimpleMongoDbFactory(mongoURI);
assertThat(ReflectionTestUtils.getField(mongoDbFactory, "username").toString(), is("myUsername"));
assertThat(ReflectionTestUtils.getField(mongoDbFactory, "password").toString(), is("myPassword"));
assertThat(ReflectionTestUtils.getField(mongoDbFactory, "credentials"), is((Object) new UserCredentials(
"myUsername", "myPassword")));
assertThat(ReflectionTestUtils.getField(mongoDbFactory, "databaseName").toString(), is("myDatabase"));
assertThat(ReflectionTestUtils.getField(mongoDbFactory, "databaseName").toString(), is("myDatabase"));
}

View File

@@ -35,5 +35,4 @@ public class TestMongoConfiguration extends AbstractMongoConfiguration {
converter.setCustomConversions(new CustomConversions(converters));
}
}

View File

@@ -3,14 +3,18 @@ package org.springframework.data.mongodb.core.convert;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.net.URL;
import java.text.DateFormat;
import java.text.Format;
import java.util.Arrays;
import java.util.Locale;
import java.util.UUID;
import org.bson.types.Binary;
import org.bson.types.ObjectId;
import org.junit.Test;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigIntegerConverter;
@@ -27,13 +31,13 @@ public class CustomConversionsUnitTests {
@SuppressWarnings("unchecked")
public void findsBasicReadAndWriteConversions() {
CustomConversions conversions = new CustomConversions(Arrays.asList(UuidToStringConverter.INSTANCE,
StringToUUIDConverter.INSTANCE));
CustomConversions conversions = new CustomConversions(Arrays.asList(FormatToStringConverter.INSTANCE,
StringToFormatConverter.INSTANCE));
assertThat(conversions.getCustomWriteTarget(UUID.class, null), is(typeCompatibleWith(String.class)));
assertThat(conversions.getCustomWriteTarget(Format.class, null), is(typeCompatibleWith(String.class)));
assertThat(conversions.getCustomWriteTarget(String.class, null), is(nullValue()));
assertThat(conversions.hasCustomReadTarget(String.class, UUID.class), is(true));
assertThat(conversions.hasCustomReadTarget(String.class, Format.class), is(true));
assertThat(conversions.hasCustomReadTarget(String.class, Locale.class), is(false));
}
@@ -51,7 +55,7 @@ public class CustomConversionsUnitTests {
@Test
public void considersTypesWeRegisteredConvertersForAsSimple() {
CustomConversions conversions = new CustomConversions(Arrays.asList(UuidToStringConverter.INSTANCE));
CustomConversions conversions = new CustomConversions(Arrays.asList(FormatToStringConverter.INSTANCE));
assertThat(conversions.isSimpleType(UUID.class), is(true));
}
@@ -95,14 +99,13 @@ public class CustomConversionsUnitTests {
@Test
public void populatesConversionServiceCorrectly() {
@SuppressWarnings("deprecation")
GenericConversionService conversionService = ConversionServiceFactory.createDefaultConversionService();
GenericConversionService conversionService = new DefaultConversionService();
assertThat(conversionService.canConvert(String.class, UUID.class), is(false));
CustomConversions conversions = new CustomConversions(Arrays.asList(StringToUUIDConverter.INSTANCE));
CustomConversions conversions = new CustomConversions(Arrays.asList(StringToFormatConverter.INSTANCE));
conversions.registerConvertersIn(conversionService);
assertThat(conversionService.canConvert(String.class, UUID.class), is(true));
assertThat(conversionService.canConvert(String.class, Format.class), is(true));
}
/**
@@ -110,8 +113,8 @@ public class CustomConversionsUnitTests {
*/
@Test
public void doesNotConsiderTypeSimpleIfOnlyReadConverterIsRegistered() {
CustomConversions conversions = new CustomConversions(Arrays.asList(StringToUUIDConverter.INSTANCE));
assertThat(conversions.isSimpleType(UUID.class), is(false));
CustomConversions conversions = new CustomConversions(Arrays.asList(StringToFormatConverter.INSTANCE));
assertThat(conversions.isSimpleType(Format.class), is(false));
}
/**
@@ -140,18 +143,47 @@ public class CustomConversionsUnitTests {
assertThat(conversions.getCustomWriteTarget(String.class), is(nullValue()));
}
enum UuidToStringConverter implements Converter<UUID, String> {
/**
* @see DATAMONGO-390
*/
@Test
public void considersBinaryASimpleType() {
CustomConversions conversions = new CustomConversions();
assertThat(conversions.isSimpleType(Binary.class), is(true));
}
/**
* @see DATAMONGO-462
*/
@Test
public void hasWriteConverterForURL() {
CustomConversions conversions = new CustomConversions();
assertThat(conversions.hasCustomWriteTarget(URL.class), is(true));
}
/**
* @see DATAMONGO-462
*/
@Test
public void readTargetForURL() {
CustomConversions conversions = new CustomConversions();
assertThat(conversions.hasCustomReadTarget(String.class, URL.class), is(true));
}
enum FormatToStringConverter implements Converter<Format, String> {
INSTANCE;
public String convert(UUID source) {
public String convert(Format source) {
return source.toString();
}
}
enum StringToUUIDConverter implements Converter<String, UUID> {
enum StringToFormatConverter implements Converter<String, Format> {
INSTANCE;
public UUID convert(String source) {
return UUID.fromString(source);
public Format convert(String source) {
return DateFormat.getInstance();
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -33,11 +33,12 @@ import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Unit test to reproduce DATADOC-273.
* Unit test to reproduce DATAMONGO-273.
*
* @author Harlan Iverson
* @author Oliver Gierke
*/
public class DataDoc273Test {
public class DataMongo273Tests {
MappingMongoConverter converter;
@@ -54,7 +55,7 @@ public class DataDoc273Test {
}
/**
* @see DATADOC-273
* @see DATAMONGO-273
*/
@Test
public void convertMapOfThings() {
@@ -80,7 +81,7 @@ public class DataDoc273Test {
}
/**
* @see DATADOC-294
* @see DATAMONGO-294
*/
@Test
@SuppressWarnings({ "rawtypes", "unchecked" })
@@ -96,7 +97,6 @@ public class DataDoc273Test {
DBObject result = new BasicDBList();
converter.write(listOfThings, result);
System.out.println(result.toString());
List listOfThings2 = converter.read(List.class, result);
@@ -106,7 +106,7 @@ public class DataDoc273Test {
}
/**
* @see DATADOC-294
* @see DATAMONGO-294
*/
@Test
@SuppressWarnings({ "rawtypes", "unchecked" })
@@ -121,7 +121,7 @@ public class DataDoc273Test {
listOfThings.add(train);
listOfThings.add(automobile);
Map box = new HashMap();
Map<String, Object> box = new HashMap<String, Object>();
box.put("one", listOfThings);
Shipment shipment = new Shipment(box);
@@ -138,7 +138,7 @@ public class DataDoc273Test {
assertTrue(listOfThings2.get(2) instanceof Automobile);
}
class Plane {
static class Plane {
String maker;
int numberOfPropellers;
@@ -149,7 +149,7 @@ public class DataDoc273Test {
}
}
class Train {
static class Train {
String railLine;
int numberOfCars;
@@ -160,7 +160,7 @@ public class DataDoc273Test {
}
}
class Automobile {
static class Automobile {
String make;
String model;
@@ -174,11 +174,11 @@ public class DataDoc273Test {
}
@SuppressWarnings("rawtypes")
public class Shipment {
static class Shipment {
Map boxes = new HashMap();
Map<String, Object> boxes;
public Shipment(Map boxes) {
public Shipment(Map<String, Object> boxes) {
this.boxes = boxes;
}

View File

@@ -39,17 +39,17 @@ public class DefaultMongoTypeMapperUnitTests {
ConfigurableTypeInformationMapper configurableTypeInformationMapper;
SimpleTypeInformationMapper simpleTypeInformationMapper;
DefaultMongoTypeMapper typeMapper;
@Before
public void setUp() {
configurableTypeInformationMapper = new ConfigurableTypeInformationMapper(Collections.singletonMap(String.class, "1"));
configurableTypeInformationMapper = new ConfigurableTypeInformationMapper(Collections.singletonMap(String.class,
"1"));
simpleTypeInformationMapper = SimpleTypeInformationMapper.INSTANCE;
typeMapper = new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, Arrays.asList(
configurableTypeInformationMapper));
typeMapper = new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY,
Arrays.asList(configurableTypeInformationMapper));
}
@Test

View File

@@ -0,0 +1,44 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
/**
* Unit tests for {@link MappedConstructor}.
*
* @author Oliver Gierke
*/
@RunWith(MockitoJUnitRunner.class)
public class MappedConstructorUnitTests {
@Mock
MongoPersistentEntity<?> entity;
@Mock
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
@Test(expected = MappingException.class)
public void rejectsEntityWithoutPersistenceConstructor() {
new MappedConstructor(entity, mappingContext);
}
}

View File

@@ -21,8 +21,10 @@ import static org.junit.Assert.*;
import java.math.BigDecimal;
import java.math.BigInteger;
import java.net.URL;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.Date;
import java.util.HashMap;
@@ -40,12 +42,13 @@ import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mapping.model.MappingInstantiationException;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.PersonPojoStringId;
@@ -112,12 +115,30 @@ public class MappingMongoConverterUnitTests {
DBObject dbObject = new BasicDBObject();
converter.write(person, dbObject);
assertThat(dbObject.get("birthDate"), is(Date.class));
assertThat(dbObject.get("birthDate"), is(instanceOf(Date.class)));
Person result = converter.read(Person.class, dbObject);
assertThat(result.birthDate, is(notNullValue()));
}
@Test
public void convertsCustomTypeOnConvertToMongoType() {
List<Converter<?, ?>> converters = new ArrayList<Converter<?, ?>>();
converters.add(new LocalDateToDateConverter());
converters.add(new DateToLocalDateConverter());
CustomConversions conversions = new CustomConversions(converters);
mappingContext.setSimpleTypeHolder(conversions.getSimpleTypeHolder());
converter = new MappingMongoConverter(factory, mappingContext);
converter.setCustomConversions(conversions);
converter.afterPropertiesSet();
LocalDate date = new LocalDate();
converter.convertToMongoType(date);
}
/**
* @see DATAMONGO-130
*/
@@ -155,7 +176,7 @@ public class MappingMongoConverterUnitTests {
dbObject.put("birthDate", new LocalDate());
dbObject.put(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, Person.class.getName());
assertThat(converter.read(Contact.class, dbObject), is(Person.class));
assertThat(converter.read(Contact.class, dbObject), is(instanceOf(Person.class)));
}
/**
@@ -168,7 +189,7 @@ public class MappingMongoConverterUnitTests {
dbObject.put("birthDate", new LocalDate());
dbObject.put(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, Person.class.getName());
assertThat(converter.read(BirthDateContainer.class, dbObject), is(BirthDateContainer.class));
assertThat(converter.read(BirthDateContainer.class, dbObject), is(instanceOf(BirthDateContainer.class)));
}
@Test
@@ -196,7 +217,7 @@ public class MappingMongoConverterUnitTests {
DBObject result = new BasicDBObject();
converter.write(value, result);
assertThat(result.get("sampleEnum"), is(String.class));
assertThat(result.get("sampleEnum"), is(instanceOf(String.class)));
assertThat(result.get("sampleEnum").toString(), is("FIRST"));
}
@@ -212,7 +233,7 @@ public class MappingMongoConverterUnitTests {
DBObject result = new BasicDBObject();
converter.write(value, result);
assertThat(result.get("enums"), is(BasicDBList.class));
assertThat(result.get("enums"), is(instanceOf(BasicDBList.class)));
BasicDBList enums = (BasicDBList) result.get("enums");
assertThat(enums.size(), is(1));
@@ -242,7 +263,7 @@ public class MappingMongoConverterUnitTests {
ClassWithEnumProperty result = converter.read(ClassWithEnumProperty.class, dbObject);
assertThat(result.enums, is(List.class));
assertThat(result.enums, is(instanceOf(List.class)));
assertThat(result.enums.size(), is(1));
assertThat(result.enums, hasItem(SampleEnum.FIRST));
}
@@ -275,6 +296,22 @@ public class MappingMongoConverterUnitTests {
assertThat(result.firstname, is("Oliver"));
}
@Test
public void resolvesNestedComplexTypeForConstructorCorrectly() {
DBObject address = new BasicDBObject("street", "110 Southwark Street");
address.put("city", "London");
BasicDBList addresses = new BasicDBList();
addresses.add(address);
DBObject person = new BasicDBObject("firstname", "Oliver");
person.put("addresses", addresses);
Person result = converter.read(Person.class, person);
assertThat(result.addresses, is(notNullValue()));
}
/**
* @see DATAMONGO-145
*/
@@ -291,7 +328,7 @@ public class MappingMongoConverterUnitTests {
converter.write(wrapper, dbObject);
Object result = dbObject.get("contacts");
assertThat(result, is(BasicDBList.class));
assertThat(result, is(instanceOf(BasicDBList.class)));
BasicDBList contacts = (BasicDBList) result;
DBObject personDbObject = (DBObject) contacts.get(0);
assertThat(personDbObject.get("foo").toString(), is("Oliver"));
@@ -314,7 +351,7 @@ public class MappingMongoConverterUnitTests {
assertThat(result.contacts, is(notNullValue()));
assertThat(result.contacts.size(), is(1));
Contact contact = result.contacts.get(0);
assertThat(contact, is(Person.class));
assertThat(contact, is(instanceOf(Person.class)));
assertThat(((Person) contact).firstname, is("Oliver"));
}
@@ -328,7 +365,7 @@ public class MappingMongoConverterUnitTests {
converter.write(wrapper, dbObject);
Object localeField = dbObject.get("locale");
assertThat(localeField, is(String.class));
assertThat(localeField, is(instanceOf(String.class)));
assertThat((String) localeField, is("en_US"));
LocaleWrapper read = converter.read(LocaleWrapper.class, dbObject);
@@ -436,13 +473,13 @@ public class MappingMongoConverterUnitTests {
DBObject dbo1 = new BasicDBObject();
converter.write(p1, dbo1);
assertThat(dbo1.get("_id"), is(String.class));
assertThat(dbo1.get("_id"), is(instanceOf(String.class)));
PersonPojoStringId p2 = new PersonPojoStringId(new ObjectId().toString(), "Text-1");
DBObject dbo2 = new BasicDBObject();
converter.write(p2, dbo2);
assertThat(dbo2.get("_id"), is(ObjectId.class));
assertThat(dbo2.get("_id"), is(instanceOf(ObjectId.class)));
}
/**
@@ -456,8 +493,8 @@ public class MappingMongoConverterUnitTests {
ClassWithSortedMap result = converter.read(ClassWithSortedMap.class, wrapper);
assertThat(result, is(ClassWithSortedMap.class));
assertThat(result.map, is(SortedMap.class));
assertThat(result, is(instanceOf(ClassWithSortedMap.class)));
assertThat(result.map, is(instanceOf(SortedMap.class)));
}
/**
@@ -624,7 +661,6 @@ public class MappingMongoConverterUnitTests {
assertThat((String) ((Map<?, ?>) firstObjectInFoo).get("Hello"), is(equalTo("World")));
}
/**
* @see DATAMONGO-245
*/
@@ -724,7 +760,7 @@ public class MappingMongoConverterUnitTests {
assertThat(result.containsField("Foo"), is(true));
assertThat(result.get("Foo"), is(notNullValue()));
assertThat(result.get("Foo"), is(BasicDBList.class));
assertThat(result.get("Foo"), is(instanceOf(BasicDBList.class)));
BasicDBList list = (BasicDBList) result.get("Foo");
@@ -775,11 +811,11 @@ public class MappingMongoConverterUnitTests {
converter.write(wrapper, result);
Object mapObject = result.get("mapOfObjects");
assertThat(mapObject, is(BasicDBObject.class));
assertThat(mapObject, is(instanceOf(BasicDBObject.class)));
DBObject map = (DBObject) mapObject;
Object valueObject = map.get("foo");
assertThat(valueObject, is(BasicDBList.class));
assertThat(valueObject, is(instanceOf(BasicDBList.class)));
List<Object> list = (List<Object>) valueObject;
assertThat(list.size(), is(1));
@@ -856,17 +892,241 @@ public class MappingMongoConverterUnitTests {
assertThat(result.get("_id"), is((Object) 5));
}
class GenericType<T> {
/**
* @see DATAMONGO-368
*/
@Test
@SuppressWarnings("unchecked")
public void writesNullValuesForCollection() {
CollectionWrapper wrapper = new CollectionWrapper();
wrapper.contacts = Arrays.<Contact> asList(new Person(), null);
DBObject result = new BasicDBObject();
converter.write(wrapper, result);
Object contacts = result.get("contacts");
assertThat(contacts, is(instanceOf(Collection.class)));
assertThat(((Collection<?>) contacts).size(), is(2));
assertThat(((Collection<Object>) contacts), hasItem(nullValue()));
}
/**
* @see DATAMONGO-379
*/
@Test
public void considersDefaultingExpressionsAtConstructorArguments() {
DBObject dbObject = new BasicDBObject("foo", "bar");
dbObject.put("foobar", 2.5);
DefaultedConstructorArgument result = converter.read(DefaultedConstructorArgument.class, dbObject);
assertThat(result.bar, is(-1));
}
/**
* @see DATAMONGO-379
*/
@Test
public void usesDocumentFieldIfReferencedInAtValue() {
DBObject dbObject = new BasicDBObject("foo", "bar");
dbObject.put("something", 37);
dbObject.put("foobar", 2.5);
DefaultedConstructorArgument result = converter.read(DefaultedConstructorArgument.class, dbObject);
assertThat(result.bar, is(37));
}
/**
* @see DATAMONGO-379
*/
@Test(expected = MappingInstantiationException.class)
public void rejectsNotFoundConstructorParameterForPrimitiveType() {
DBObject dbObject = new BasicDBObject("foo", "bar");
converter.read(DefaultedConstructorArgument.class, dbObject);
}
/**
* @see DATAMONGO-358
*/
@Test
public void writesListForObjectPropertyCorrectly() {
Attribute attribute = new Attribute();
attribute.key = "key";
attribute.value = Arrays.asList("1", "2");
Item item = new Item();
item.attributes = Arrays.asList(attribute);
DBObject result = new BasicDBObject();
converter.write(item, result);
Item read = converter.read(Item.class, result);
assertThat(read.attributes.size(), is(1));
assertThat(read.attributes.get(0).key, is(attribute.key));
assertThat(read.attributes.get(0).value, is(instanceOf(Collection.class)));
@SuppressWarnings("unchecked")
Collection<String> values = (Collection<String>) read.attributes.get(0).value;
assertThat(values.size(), is(2));
assertThat(values, hasItems("1", "2"));
}
/**
* @see DATAMONGO-380
*/
@Test(expected = MappingException.class)
public void rejectsMapWithKeyContainingDotsByDefault() {
converter.write(Collections.singletonMap("foo.bar", "foobar"), new BasicDBObject());
}
/**
* @see DATAMONGO-380
*/
@Test
public void escapesDotInMapKeysIfReplacementConfigured() {
converter.setMapKeyDotReplacement("~");
DBObject dbObject = new BasicDBObject();
converter.write(Collections.singletonMap("foo.bar", "foobar"), dbObject);
assertThat((String) dbObject.get("foo~bar"), is("foobar"));
assertThat(dbObject.containsField("foo.bar"), is(false));
}
/**
* @see DATAMONGO-380
*/
@Test
@SuppressWarnings("unchecked")
public void unescapesDotInMapKeysIfReplacementConfigured() {
converter.setMapKeyDotReplacement("~");
DBObject dbObject = new BasicDBObject("foo~bar", "foobar");
Map<String, String> result = converter.read(Map.class, dbObject);
assertThat(result.get("foo.bar"), is("foobar"));
assertThat(result.containsKey("foobar"), is(false));
}
/**
* @see DATAMONGO-382
*/
@Test
public void convertsSetToBasicDBList() {
Address address = new Address();
address.city = "London";
address.street = "Foo";
Object result = converter.convertToMongoType(Collections.singleton(address));
assertThat(result, is(instanceOf(BasicDBList.class)));
Set<?> readResult = converter.read(Set.class, (BasicDBList) result);
assertThat(readResult.size(), is(1));
assertThat(readResult.iterator().next(), is(instanceOf(Map.class)));
}
/**
* @see DATAMONGO-462
*/
@Test
public void readsURLsAsStringsByDefault() throws Exception {
DBObject dbObject = new BasicDBObject("url", new URL("http://springsource.org"));
URLWrapper result = converter.read(URLWrapper.class, dbObject);
assertThat(result.url, is(new URL("http://springsource.org")));
}
/**
* @see DATAMONGO-462
*/
@Test
public void writesURLsAsStringsByDefault() throws Exception {
URLWrapper wrapper = new URLWrapper();
wrapper.url = new URL("http://springsource.org");
DBObject sink = new BasicDBObject();
converter.write(wrapper, sink);
assertThat(sink.get("url"), is((Object) "http://springsource.org"));
}
/**
* @see DATAMONGO-485
*/
@Test
public void writesComplexIdCorrectly() {
ComplexId id = new ComplexId();
id.innerId = 4711L;
ClassWithComplexId entity = new ClassWithComplexId();
entity.complexId = id;
DBObject dbObject = new BasicDBObject();
converter.write(entity, dbObject);
Object idField = dbObject.get("_id");
assertThat(idField, is(notNullValue()));
assertThat(idField, is(instanceOf(DBObject.class)));
assertThat(((DBObject) idField).get("innerId"), is((Object) 4711L));
}
/**
* @see DATAMONGO-485
*/
@Test
public void readsComplexIdCorrectly() {
DBObject innerId = new BasicDBObject("innerId", 4711L);
DBObject entity = new BasicDBObject("_id", innerId);
ClassWithComplexId result = converter.read(ClassWithComplexId.class, entity);
assertThat(result.complexId, is(notNullValue()));
assertThat(result.complexId.innerId, is(4711L));
}
/**
* @see DATAMONGO-489
*/
@Test
public void readsArraysAsMapValuesCorrectly() {
BasicDBList list = new BasicDBList();
list.add("Foo");
list.add("Bar");
DBObject map = new BasicDBObject("key", list);
DBObject wrapper = new BasicDBObject("mapOfStrings", map);
ClassWithMapProperty result = converter.read(ClassWithMapProperty.class, wrapper);
assertThat(result.mapOfStrings, is(notNullValue()));
String[] values = result.mapOfStrings.get("key");
assertThat(values, is(notNullValue()));
assertThat(values, is(arrayWithSize(2)));
}
static class GenericType<T> {
T content;
}
class ClassWithEnumProperty {
static class ClassWithEnumProperty {
SampleEnum sampleEnum;
List<SampleEnum> enums;
}
enum SampleEnum {
static enum SampleEnum {
FIRST {
@Override
void method() {
@@ -882,17 +1142,16 @@ public class MappingMongoConverterUnitTests {
abstract void method();
}
class Address {
static class Address {
String street;
String city;
}
interface Contact {
}
public static class Person implements Contact {
static class Person implements Contact {
LocalDate birthDate;
@Field("foo")
@@ -910,46 +1169,47 @@ public class MappingMongoConverterUnitTests {
}
}
class ClassWithSortedMap {
static class ClassWithSortedMap {
SortedMap<String, String> map;
}
class ClassWithMapProperty {
static class ClassWithMapProperty {
Map<Locale, String> map;
Map<String, List<String>> mapOfLists;
Map<String, Object> mapOfObjects;
Map<String, String[]> mapOfStrings;
}
class ClassWithNestedMaps {
static class ClassWithNestedMaps {
Map<String, Map<String, Map<String, String>>> nestedMaps;
}
class BirthDateContainer {
static class BirthDateContainer {
LocalDate birthDate;
}
class BigDecimalContainer {
static class BigDecimalContainer {
BigDecimal value;
Map<String, BigDecimal> map;
List<BigDecimal> collection;
}
class CollectionWrapper {
static class CollectionWrapper {
List<Contact> contacts;
List<List<String>> strings;
List<Map<String, Locale>> listOfMaps;
}
class LocaleWrapper {
static class LocaleWrapper {
Locale locale;
}
class ClassWithBigIntegerId {
static class ClassWithBigIntegerId {
@Id
BigInteger id;
}
class A<T> {
static class A<T> {
String valueType;
T value;
@@ -960,12 +1220,48 @@ public class MappingMongoConverterUnitTests {
}
}
class ClassWithIntId {
static class ClassWithIntId {
@Id
int id;
}
static class DefaultedConstructorArgument {
String foo;
int bar;
double foobar;
DefaultedConstructorArgument(String foo, @Value("#root.something ?: -1") int bar, double foobar) {
this.foo = foo;
this.bar = bar;
this.foobar = foobar;
}
}
static class Item {
List<Attribute> attributes;
}
static class Attribute {
String key;
Object value;
}
static class URLWrapper {
URL url;
}
static class ClassWithComplexId {
@Id
ComplexId complexId;
}
static class ComplexId {
Long innerId;
}
private class LocalDateToDateConverter implements Converter<LocalDate, Date> {
public Date convert(LocalDate source) {

View File

@@ -0,0 +1,70 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import java.util.UUID;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
/**
* Integration tests for {@link MongoConverters}.
*
* @author Oliver Gierke
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:infrastructure.xml")
public class MongoConvertersIntegrationTests {
static final String COLLECTION = "_sample";
@Autowired
MongoOperations template;
@Before
public void setUp() {
template.dropCollection(COLLECTION);
}
@Test
public void writesUUIDBinaryCorrectly() {
Wrapper wrapper = new Wrapper();
wrapper.uuid = UUID.randomUUID();
template.save(wrapper);
assertThat(wrapper.id, is(notNullValue()));
Wrapper result = template.findOne(Query.query(Criteria.where("id").is(wrapper.id)), Wrapper.class);
assertThat(result.uuid, is(wrapper.uuid));
}
static class Wrapper {
String id;
UUID uuid;
}
}
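The integration test above persists a java.util.UUID and reads the document back by its id. As a minimal sketch (an illustration, not part of this diff), the same round trip can be keyed on the UUID value itself; it reuses only the Wrapper class and the MongoOperations, Query and Criteria calls already shown, and the "uuid" field name is simply the one from that class.
// Sketch only, not part of the diff: look a Wrapper up by its UUID value,
// assuming the same injected MongoOperations ("template") and Wrapper type as above.
static Wrapper findByUuid(MongoOperations template, UUID uuid) {
	return template.findOne(Query.query(Criteria.where("uuid").is(uuid)), Wrapper.class);
}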

View File

@@ -21,7 +21,6 @@ import static org.junit.Assert.*;
import java.math.BigDecimal;
import org.junit.Test;
import org.springframework.data.mongodb.core.convert.MongoConverters;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigDecimalToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigDecimalConverter;

View File

@@ -34,10 +34,14 @@ import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.IndexOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.Venue;
import org.springframework.data.mongodb.core.index.GeospatialIndex;
import org.springframework.data.mongodb.core.index.IndexField;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.monitor.ServerInfo;
import org.springframework.expression.ExpressionParser;
@@ -54,7 +58,6 @@ import com.mongodb.WriteConcern;
* Modified from https://github.com/deftlabs/mongo-java-geospatial-example
*
* @author Mark Pollack
*
*/
public class GeoSpatialTests {
@@ -193,6 +196,26 @@ public class GeoSpatialTests {
assertThat(indexInfo.get(1).get("ns").toString(), is("database.newyork"));
}
/**
* @see DATAMONGO-360
*/
@Test
public void indexInfoIsCorrect() {
IndexOperations operations = template.indexOps(Venue.class);
List<IndexInfo> indexInfo = operations.getIndexInfo();
assertThat(indexInfo.size(), is(2));
List<IndexField> fields = indexInfo.get(0).getIndexFields();
assertThat(fields.size(), is(1));
assertThat(fields, hasItem(IndexField.create("_id", Order.ASCENDING)));
fields = indexInfo.get(1).getIndexFields();
assertThat(fields.size(), is(1));
assertThat(fields, hasItem(IndexField.geo("location")));
}
// TODO move to MongoAdmin
public List<DBObject> getIndexInfo(Class<?> clazz) {
return template.execute(clazz, new CollectionCallback<List<DBObject>>() {

View File

@@ -0,0 +1,70 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.junit.Test;
import org.springframework.data.mongodb.core.query.Order;
/**
* Unit tests for {@link IndexField}.
*
* @author Oliver Gierke
*/
public class IndexFieldUnitTests {
@Test
public void createsPlainIndexFieldCorrectly() {
IndexField field = IndexField.create("foo", Order.ASCENDING);
assertThat(field.getKey(), is("foo"));
assertThat(field.getOrder(), is(Order.ASCENDING));
assertThat(field.isGeo(), is(false));
}
@Test
public void createsGeoIndexFieldCorrectly() {
IndexField field = IndexField.geo("foo");
assertThat(field.getKey(), is("foo"));
assertThat(field.getOrder(), is(nullValue()));
assertThat(field.isGeo(), is(true));
}
@Test
public void correctEqualsForPlainFields() {
IndexField first = IndexField.create("foo", Order.ASCENDING);
IndexField second = IndexField.create("foo", Order.ASCENDING);
assertThat(first, is(second));
assertThat(second, is(first));
}
@Test
public void correctEqualsForGeoFields() {
IndexField first = IndexField.geo("bar");
IndexField second = IndexField.geo("bar");
assertThat(first, is(second));
assertThat(second, is(first));
}
}
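IndexField is also the type handed back by IndexInfo.getIndexFields() in the GeoSpatialTests hunk further up. As a rough illustration (not part of this diff), telling geo fields apart from plain ordered ones needs only the three accessors exercised by these unit tests:
// Sketch only: walk a list of IndexField values (e.g. from IndexInfo.getIndexFields())
// and describe each one; getKey(), getOrder() and isGeo() are the accessors tested above.
static void describe(List<IndexField> fields) {
	for (IndexField field : fields) {
		if (field.isGeo()) {
			System.out.println(field.getKey() + ": geospatial");
		} else {
			System.out.println(field.getKey() + ": " + field.getOrder());
		}
	}
}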

View File

@@ -51,7 +51,6 @@ public class IndexingIntegrationTests {
operations.dropCollection(IndexedPerson.class);
}
/**
* @see DATADOC-237
*/

View File

@@ -0,0 +1,82 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import java.util.Collections;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import com.mongodb.DBObject;
/**
* Unit tests for {@link MongoPersistentEntityIndexCreator}.
*
* @author Oliver Gierke
*/
@RunWith(MockitoJUnitRunner.class)
public class MongoPersistentEntityIndexCreatorUnitTests {
@Mock
MongoDbFactory factory;
@Test
public void buildsIndexDefinitionUsingFieldName() {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(Collections.singleton(Person.class));
mappingContext.afterPropertiesSet();
DummyMongoPersistentEntityIndexCreator creator = new DummyMongoPersistentEntityIndexCreator(mappingContext, factory);
assertThat(creator.indexDefinition, is(notNullValue()));
assertThat(creator.indexDefinition.keySet(), hasItem("fieldname"));
assertThat(creator.name, is("indexName"));
}
static class Person {
@Indexed(name = "indexName")
@Field("fieldname")
String field;
}
static class DummyMongoPersistentEntityIndexCreator extends MongoPersistentEntityIndexCreator {
DBObject indexDefinition;
String name;
public DummyMongoPersistentEntityIndexCreator(MongoMappingContext mappingContext, MongoDbFactory mongoDbFactory) {
super(mappingContext, mongoDbFactory);
}
@Override
protected void ensureIndex(String collection, String name, DBObject indexDefinition, boolean unique,
boolean dropDups, boolean sparse) {
this.name = name;
this.indexDefinition = indexDefinition;
}
}
}

View File

@@ -48,7 +48,8 @@ public class BasicMongoPersistentEntityUnitTests {
@Test
public void evaluatesSpELExpression() {
MongoPersistentEntity<Company> entity = new BasicMongoPersistentEntity<Company>(ClassTypeInformation.from(Company.class));
MongoPersistentEntity<Company> entity = new BasicMongoPersistentEntity<Company>(
ClassTypeInformation.from(Company.class));
assertThat(entity.getCollection(), is("35"));
}
@@ -61,7 +62,8 @@ public class BasicMongoPersistentEntityUnitTests {
when(context.getBean("myBean")).thenReturn(provider);
when(context.containsBean("myBean")).thenReturn(true);
BasicMongoPersistentEntity<DynamicallyMapped> entity = new BasicMongoPersistentEntity<DynamicallyMapped>(ClassTypeInformation.from(DynamicallyMapped.class));
BasicMongoPersistentEntity<DynamicallyMapped> entity = new BasicMongoPersistentEntity<DynamicallyMapped>(
ClassTypeInformation.from(DynamicallyMapped.class));
entity.setApplicationContext(context);
assertThat(entity.getCollection(), is("reference"));

View File

@@ -20,8 +20,7 @@ import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import java.util.Collections;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
@@ -30,7 +29,9 @@ import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
 * Unit tests verifying that mapping works with generic types.
@@ -85,15 +86,15 @@ public class GenericMappingTests {
assertThat(result.container.content, is("Foo!"));
}
public class StringWrapper extends Wrapper<String> {
static class StringWrapper extends Wrapper<String> {
}
public class Wrapper<S> {
static class Wrapper<S> {
Container<S> container;
}
public class Container<T> {
static class Container<T> {
T content;
}
}

View File

@@ -28,11 +28,6 @@ import java.util.HashMap;
import java.util.List;
import java.util.Map;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.Mongo;
import com.mongodb.MongoException;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.bson.types.ObjectId;
@@ -46,12 +41,17 @@ import org.springframework.data.mongodb.MongoCollectionUtils;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoDbUtils;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.Mongo;
import com.mongodb.MongoException;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
*/
@@ -72,8 +72,8 @@ public class MappingTests {
MongoCollectionUtils.getPreferredCollectionName(PersonWithLongDBRef.class),
MongoCollectionUtils.getPreferredCollectionName(PersonNullProperties.class),
MongoCollectionUtils.getPreferredCollectionName(Account.class),
MongoCollectionUtils.getPreferredCollectionName(PrimitiveId.class),
"foobar", "geolocation", "person1", "person2", "account"};
MongoCollectionUtils.getPreferredCollectionName(PrimitiveId.class), "foobar", "geolocation", "person1",
"person2", "account" };
ApplicationContext applicationContext;
Mongo mongo;
@@ -110,7 +110,8 @@ public class MappingTests {
LOGGER.info("done inserting");
assertNotNull(p.getId());
List<PersonWithObjectId> result = template.find(new Query(Criteria.where("ssn").is(12345)), PersonWithObjectId.class);
List<PersonWithObjectId> result = template.find(new Query(Criteria.where("ssn").is(12345)),
PersonWithObjectId.class);
assertThat(result.size(), is(1));
assertThat(result.get(0).getSsn(), is(12345));
}
@@ -229,10 +230,10 @@ public class MappingTests {
persons.add(new PersonCustomCollection2(66666, "Person", "Two"));
template.insertAll(persons);
List<PersonCustomCollection1> p1Results = template.find(new Query(Criteria.where("ssn").is(55555)), PersonCustomCollection1.class,
"person1");
List<PersonCustomCollection2> p2Results = template.find(new Query(Criteria.where("ssn").is(66666)), PersonCustomCollection2.class,
"person2");
List<PersonCustomCollection1> p1Results = template.find(new Query(Criteria.where("ssn").is(55555)),
PersonCustomCollection1.class, "person1");
List<PersonCustomCollection2> p2Results = template.find(new Query(Criteria.where("ssn").is(66666)),
PersonCustomCollection2.class, "person2");
assertThat(p1Results.size(), is(1));
assertThat(p2Results.size(), is(1));
}
@@ -255,7 +256,8 @@ public class MappingTests {
public Boolean doInCollection(DBCollection collection) throws MongoException, DataAccessException {
List<DBObject> indexes = collection.getIndexInfo();
for (DBObject dbo : indexes) {
if (dbo.get("name") != null && dbo.get("name") instanceof String && ((String)dbo.get("name")).startsWith("name")) {
if (dbo.get("name") != null && dbo.get("name") instanceof String
&& ((String) dbo.get("name")).startsWith("name")) {
return true;
}
}
@@ -271,7 +273,8 @@ public class MappingTests {
public Boolean doInCollection(DBCollection collection) throws MongoException, DataAccessException {
List<DBObject> indexes = collection.getIndexInfo();
for (DBObject dbo : indexes) {
if (dbo.get("name") != null && dbo.get("name") instanceof String && ((String)dbo.get("name")).startsWith("name")) {
if (dbo.get("name") != null && dbo.get("name") instanceof String
&& ((String) dbo.get("name")).startsWith("name")) {
return true;
}
}
@@ -367,13 +370,15 @@ public class MappingTests {
Person p2 = template.findOne(query(where("ssn").is(1111)), Person.class);
assertNull(p2);
template.upsert(query(where("ssn").is(1111).and("firstName").is("Query").and("lastName").is("Update")), update("address", addr), Person.class);
template.upsert(query(where("ssn").is(1111).and("firstName").is("Query").and("lastName").is("Update")),
update("address", addr), Person.class);
p2 = template.findOne(query(where("ssn").is(1111)), Person.class);
assertThat(p2.getAddress().getCity(), is("Anytown"));
template.dropCollection(Person.class);
template.upsert(query(where("ssn").is(1111).and("firstName").is("Query").and("lastName").is("Update")), update("address", addr), "person");
template.upsert(query(where("ssn").is(1111).and("firstName").is("Query").and("lastName").is("Update")),
update("address", addr), "person");
p2 = template.findOne(query(where("ssn").is(1111)), Person.class);
assertThat(p2.getAddress().getCity(), is("Anytown"));
@@ -386,8 +391,8 @@ public class MappingTests {
PersonWithObjectId p2 = new PersonWithObjectId(2, "second", "");
template.save(p2);
List<PersonWithObjectId> results = template.find(new Query(
new Criteria().orOperator(where("ssn").is(1), where("ssn").is(2))), PersonWithObjectId.class);
List<PersonWithObjectId> results = template.find(
new Query(new Criteria().orOperator(where("ssn").is(1), where("ssn").is(2))), PersonWithObjectId.class);
assertNotNull(results);
assertThat(results.size(), is(2));
@@ -426,18 +431,15 @@ public class MappingTests {
public void testNoMappingAnnotationsUsingLongAsId() {
PersonPojoLongId p = new PersonPojoLongId(1, "Text");
template.insert(p);
template.updateFirst(query(where("id").is(1)), update("text", "New Text"),
PersonPojoLongId.class);
template.updateFirst(query(where("id").is(1)), update("text", "New Text"), PersonPojoLongId.class);
PersonPojoLongId p2 = template.findOne(query(where("id").is(1)),
PersonPojoLongId.class);
PersonPojoLongId p2 = template.findOne(query(where("id").is(1)), PersonPojoLongId.class);
assertEquals("New Text", p2.getText());
p.setText("Different Text");
template.save(p);
PersonPojoLongId p3 = template.findOne(query(where("id").is(1)),
PersonPojoLongId.class);
PersonPojoLongId p3 = template.findOne(query(where("id").is(1)), PersonPojoLongId.class);
assertEquals("Different Text", p3.getText());
}
@@ -447,21 +449,17 @@ public class MappingTests {
// Assign the String Id in code
PersonPojoStringId p = new PersonPojoStringId("1", "Text");
template.insert(p);
template.updateFirst(query(where("id").is("1")), update("text", "New Text"),
PersonPojoStringId.class);
template.updateFirst(query(where("id").is("1")), update("text", "New Text"), PersonPojoStringId.class);
PersonPojoStringId p2 = template.findOne(query(where("id").is("1")),
PersonPojoStringId.class);
PersonPojoStringId p2 = template.findOne(query(where("id").is("1")), PersonPojoStringId.class);
assertEquals("New Text", p2.getText());
p.setText("Different Text");
template.save(p);
PersonPojoStringId p3 = template.findOne(query(where("id").is("1")),
PersonPojoStringId.class);
PersonPojoStringId p3 = template.findOne(query(where("id").is("1")), PersonPojoStringId.class);
assertEquals("Different Text", p3.getText());
PersonPojoStringId p4 = new PersonPojoStringId("2", "Text-2");
template.insert(p4);
@@ -513,8 +511,7 @@ public class MappingTests {
assertThat(result.items.get(0).id, is(items.id));
}
class Container {
static class Container {
@Id
final String id;
@@ -529,7 +526,7 @@ public class MappingTests {
List<Item> items;
}
class Item {
static class Item {
@Id
final String id;

View File

@@ -71,7 +71,8 @@ public class ApplicationContextEventTests {
public void beforeSaveEvent() {
PersonBeforeSaveListener personBeforeSaveListener = applicationContext.getBean(PersonBeforeSaveListener.class);
AfterSaveListener afterSaveListener = applicationContext.getBean(AfterSaveListener.class);
SimpleMappingEventListener simpleMappingEventListener = applicationContext.getBean(SimpleMappingEventListener.class);
SimpleMappingEventListener simpleMappingEventListener = applicationContext
.getBean(SimpleMappingEventListener.class);
assertEquals(0, personBeforeSaveListener.seenEvents.size());
assertEquals(0, afterSaveListener.seenEvents.size());
@@ -79,7 +80,6 @@ public class ApplicationContextEventTests {
assertEquals(0, simpleMappingEventListener.onBeforeSaveEvents.size());
assertEquals(0, simpleMappingEventListener.onAfterSaveEvents.size());
PersonPojoStringId p = new PersonPojoStringId("1", "Text");
template.insert(p);
@@ -92,7 +92,8 @@ public class ApplicationContextEventTests {
Assert.assertTrue(personBeforeSaveListener.seenEvents.get(0) instanceof BeforeSaveEvent<?>);
Assert.assertTrue(afterSaveListener.seenEvents.get(0) instanceof AfterSaveEvent<?>);
BeforeSaveEvent<PersonPojoStringId> beforeSaveEvent = (BeforeSaveEvent<PersonPojoStringId>)personBeforeSaveListener.seenEvents.get(0);
BeforeSaveEvent<PersonPojoStringId> beforeSaveEvent = (BeforeSaveEvent<PersonPojoStringId>) personBeforeSaveListener.seenEvents
.get(0);
PersonPojoStringId p2 = beforeSaveEvent.getSource();
DBObject dbo = beforeSaveEvent.getDBObject();

View File

@@ -19,7 +19,6 @@ import java.util.ArrayList;
import com.mongodb.DBObject;
public class SimpleMappingEventListener extends AbstractMongoEventListener<Object> {
public final ArrayList<BeforeConvertEvent<Object>> onBeforeConvertEvents = new ArrayList<BeforeConvertEvent<Object>>();

View File

@@ -38,7 +38,6 @@ public class ContentAndVersion {
this.document_id = documentId;
}
public String getId() {
return id;
}
@@ -69,5 +68,4 @@ public class ContentAndVersion {
+ author + ", version=" + version + ", value=" + value + "]";
}
}

View File

@@ -53,14 +53,12 @@ public class GroupByTests {
// @Autowired
// MongoTemplate mongoTemplate;
MongoTemplate mongoTemplate;
@Autowired
@SuppressWarnings("unchecked")
public void setMongo(Mongo mongo) throws Exception {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(new HashSet<Class<?>>(Arrays.asList(XObject.class)));
mappingContext.afterPropertiesSet();
@@ -72,7 +70,6 @@ public class GroupByTests {
}
@Before
public void setUp() {
cleanDb();
@@ -91,7 +88,8 @@ public class GroupByTests {
@Test
public void singleKeyCreation() {
DBObject gc = new GroupBy("a").getGroupByObject();
//String expected = "{ \"group\" : { \"ns\" : \"test\" , \"key\" : { \"a\" : 1} , \"cond\" : null , \"$reduce\" : null , \"initial\" : null }}";
// String expected =
// "{ \"group\" : { \"ns\" : \"test\" , \"key\" : { \"a\" : 1} , \"cond\" : null , \"$reduce\" : null , \"initial\" : null }}";
String expected = "{ \"key\" : { \"a\" : 1} , \"$reduce\" : null , \"initial\" : null }";
Assert.assertEquals(expected, gc.toString());
}
@@ -99,7 +97,8 @@ public class GroupByTests {
@Test
public void multipleKeyCreation() {
DBObject gc = GroupBy.key("a", "b").getGroupByObject();
//String expected = "{ \"group\" : { \"ns\" : \"test\" , \"key\" : { \"a\" : 1 , \"b\" : 1} , \"cond\" : null , \"$reduce\" : null , \"initial\" : null }}";
// String expected =
// "{ \"group\" : { \"ns\" : \"test\" , \"key\" : { \"a\" : 1 , \"b\" : 1} , \"cond\" : null , \"$reduce\" : null , \"initial\" : null }}";
String expected = "{ \"key\" : { \"a\" : 1 , \"b\" : 1} , \"$reduce\" : null , \"initial\" : null }";
Assert.assertEquals(expected, gc.toString());
}
@@ -116,8 +115,10 @@ public class GroupByTests {
createGroupByData();
GroupByResults<XObject> results;
results = mongoTemplate.group("group_test_collection",
GroupBy.key("x").initialDocument(new BasicDBObject("count", 0)).reduceFunction("function(doc, prev) { prev.count += 1 }"), XObject.class);
results = mongoTemplate.group(
"group_test_collection",
GroupBy.key("x").initialDocument(new BasicDBObject("count", 0))
.reduceFunction("function(doc, prev) { prev.count += 1 }"), XObject.class);
assertMapReduceResults(results);
@@ -128,8 +129,10 @@ public class GroupByTests {
createGroupByData();
GroupByResults<XObject> results;
results = mongoTemplate.group("group_test_collection",
GroupBy.keyFunction("function(doc) { return { x : doc.x }; }").initialDocument("{ count: 0 }").reduceFunction("function(doc, prev) { prev.count += 1 }"), XObject.class);
results = mongoTemplate.group(
"group_test_collection",
GroupBy.keyFunction("function(doc) { return { x : doc.x }; }").initialDocument("{ count: 0 }")
.reduceFunction("function(doc, prev) { prev.count += 1 }"), XObject.class);
assertMapReduceResults(results);
}
@@ -139,26 +142,23 @@ public class GroupByTests {
createGroupByData();
GroupByResults<XObject> results;
results = mongoTemplate.group("group_test_collection",
GroupBy.keyFunction("classpath:keyFunction.js").initialDocument("{ count: 0 }").reduceFunction("classpath:groupReduce.js"), XObject.class);
results = mongoTemplate.group("group_test_collection", GroupBy.keyFunction("classpath:keyFunction.js")
.initialDocument("{ count: 0 }").reduceFunction("classpath:groupReduce.js"), XObject.class);
assertMapReduceResults(results);
}
@Test
public void SimpleGroupWithQueryAndFunctionsAsResources() {
createGroupByData();
GroupByResults<XObject> results;
results = mongoTemplate.group(where("x").gt(0),
"group_test_collection",
keyFunction("classpath:keyFunction.js").initialDocument("{ count: 0 }").reduceFunction("classpath:groupReduce.js"), XObject.class);
results = mongoTemplate.group(where("x").gt(0), "group_test_collection", keyFunction("classpath:keyFunction.js")
.initialDocument("{ count: 0 }").reduceFunction("classpath:groupReduce.js"), XObject.class);
assertMapReduceResults(results);
}
private void assertMapReduceResults(GroupByResults<XObject> results) {
DBObject dboRawResults = results.getRawResults();
String expected = "{ \"serverUsed\" : \"127.0.0.1:27017\" , \"retval\" : [ { \"x\" : 1.0 , \"count\" : 2.0} , { \"x\" : 2.0 , \"count\" : 1.0} , { \"x\" : 3.0 , \"count\" : 3.0}] , \"count\" : 6.0 , \"keys\" : 3 , \"ok\" : 1.0}";
@@ -182,7 +182,6 @@ public class GroupByTests {
Assert.assertEquals(3, results.getKeys());
}
private void createGroupByData() {
DBCollection c = mongoTemplate.getDb().getCollection("group_test_collection");
c.save(new BasicDBObject("x", 1));

View File

@@ -19,7 +19,6 @@ import org.junit.Test;
public class MapReduceOptionsTests {
@Test
public void testFinalize() {
new MapReduceOptions().finalizeFunction("code");

View File

@@ -0,0 +1,72 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapreduce;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import java.util.Collections;
import org.junit.Test;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Unit tests for {@link MapReduceResults}.
*
* @author Oliver Gierke
*/
public class MapReduceResultsUnitTests {
/**
* @see DATAMONGO-428
*/
@Test
public void resolvesOutputCollectionForPlainResult() {
DBObject rawResult = new BasicDBObject("result", "FOO");
MapReduceResults<Object> results = new MapReduceResults<Object>(Collections.emptyList(), rawResult);
assertThat(results.getOutputCollection(), is("FOO"));
}
/**
* @see DATAMONGO-428
*/
@Test
public void resolvesOutputCollectionForDBObjectResult() {
DBObject rawResult = new BasicDBObject("result", new BasicDBObject("collection", "FOO"));
MapReduceResults<Object> results = new MapReduceResults<Object>(Collections.emptyList(), rawResult);
assertThat(results.getOutputCollection(), is("FOO"));
}
/**
* @see DATAMONGO-378
*/
@Test
public void handlesLongTotalInResult() {
DBObject inner = new BasicDBObject("total", 1L);
inner.put("mapTime", 1L);
inner.put("emitLoop", 1);
DBObject source = new BasicDBObject("timing", inner);
new MapReduceResults<Object>(Collections.emptyList(), source);
}
}
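The two resolution tests above reflect the two shapes the map-reduce command can report for its output collection: a plain string under "result", or a nested document carrying a "collection" entry. A compact restatement (illustrative only, using just the constructor and accessor exercised above; the collection name is made up):
// Sketch only: both raw result shapes resolve to the same output collection name.
DBObject plain = new BasicDBObject("result", "jmr2_out");
DBObject nested = new BasicDBObject("result", new BasicDBObject("collection", "jmr2_out"));
String fromPlain = new MapReduceResults<Object>(Collections.emptyList(), plain).getOutputCollection();   // "jmr2_out"
String fromNested = new MapReduceResults<Object>(Collections.emptyList(), nested).getOutputCollection(); // "jmr2_out"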

View File

@@ -134,8 +134,8 @@ public class MapReduceTests {
createNumberAndVersionData();
String map = "function () { emit(this.number, this.version); }";
String reduce = "function (key, values) { return Math.max.apply(Math, values); }";
MapReduceResults<NumberAndVersion> results =
mongoTemplate.mapReduce("jmr2", map, reduce, new MapReduceOptions().outputCollection("jmr2_out"), NumberAndVersion.class);
MapReduceResults<NumberAndVersion> results = mongoTemplate.mapReduce("jmr2", map, reduce,
new MapReduceOptions().outputCollection("jmr2_out"), NumberAndVersion.class);
int size = 0;
for (NumberAndVersion nv : results) {
if (nv.getId().equals("1")) {

View File

@@ -10,31 +10,38 @@ public class NumberAndVersion {
public Long getValue() {
return value;
}
public void setValue(Long value) {
this.value = value;
}
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public Long getNumber() {
return number;
}
public void setNumber(Long number) {
this.number = number;
}
public Long getVersion() {
return version;
}
public void setVersion(Long version) {
this.version = version;
}
@Override
public String toString() {
return "NumberAndVersion [id=" + id + ", number=" + number + ", version=" + version + ", value=" + value + "]";
}
}

View File

@@ -6,27 +6,22 @@ public class XObject {
private float count;
public float getX() {
return x;
}
public void setX(float x) {
this.x = x;
}
public float getCount() {
return count;
}
public void setCount(float count) {
this.count = count;
}
@Override
public String toString() {
return "XObject [x=" + x + " count = " + count + "]";

View File

@@ -15,10 +15,10 @@
*/
package org.springframework.data.mongodb.core.query;
import static org.junit.Assert.*;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.junit.Test;
import org.springframework.data.mongodb.core.query.Criteria;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
@@ -51,4 +51,14 @@ public class CriteriaTests {
Criteria c = new Criteria("name").is("Bubba").and("age").lt(21);
assertEquals("{ \"name\" : \"Bubba\" , \"age\" : { \"$lt\" : 21}}", c.getCriteriaObject().toString());
}
@Test
public void equalIfCriteriaMatches() {
Criteria left = new Criteria("name").is("Foo").and("lastname").is("Bar");
Criteria right = new Criteria("name").is("Bar").and("lastname").is("Bar");
assertThat(left, is(not(right)));
assertThat(right, is(not(left)));
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,8 +19,12 @@ import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import static org.springframework.data.mongodb.core.DBObjectUtils.*;
import java.math.BigInteger;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.bson.types.ObjectId;
import org.junit.Before;
@@ -30,6 +34,7 @@ import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.Person;
import org.springframework.data.mongodb.core.QueryMapper;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
@@ -38,6 +43,7 @@ import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mongodb.QueryBuilder;
/**
* Unit tests for {@link QueryMapper}.
@@ -45,7 +51,6 @@ import com.mongodb.DBObject;
* @author Oliver Gierke
*/
@RunWith(MockitoJUnitRunner.class)
@SuppressWarnings("unused")
public class QueryMapperUnitTests {
QueryMapper mapper;
@@ -80,15 +85,15 @@ public class QueryMapperUnitTests {
public void convertsStringIntoObjectId() {
DBObject query = new BasicDBObject("_id", new ObjectId().toString());
DBObject result = mapper.getMappedObject(query, null);
assertThat(result.get("_id"), is(ObjectId.class));
DBObject result = mapper.getMappedObject(query, context.getPersistentEntity(IdWrapper.class));
assertThat(result.get("_id"), is(instanceOf(ObjectId.class)));
}
@Test
public void handlesBigIntegerIdsCorrectly() {
DBObject dbObject = new BasicDBObject("id", new BigInteger("1"));
DBObject result = mapper.getMappedObject(dbObject, null);
DBObject result = mapper.getMappedObject(dbObject, context.getPersistentEntity(IdWrapper.class));
assertThat(result.get("_id"), is((Object) "1"));
}
@@ -97,7 +102,7 @@ public class QueryMapperUnitTests {
ObjectId id = new ObjectId();
DBObject dbObject = new BasicDBObject("id", new BigInteger(id.toString(), 16));
DBObject result = mapper.getMappedObject(dbObject, null);
DBObject result = mapper.getMappedObject(dbObject, context.getPersistentEntity(IdWrapper.class));
assertThat(result.get("_id"), is((Object) id));
}
@@ -111,9 +116,9 @@ public class QueryMapperUnitTests {
DBObject result = mapper.getMappedObject(criteria.getCriteriaObject(), context.getPersistentEntity(Sample.class));
Object object = result.get("_id");
assertThat(object, is(DBObject.class));
assertThat(object, is(instanceOf(DBObject.class)));
DBObject dbObject = (DBObject) object;
assertThat(dbObject.get("$ne"), is(ObjectId.class));
assertThat(dbObject.get("$ne"), is(instanceOf(ObjectId.class)));
}
/**
@@ -125,20 +130,19 @@ public class QueryMapperUnitTests {
DBObject result = mapper.getMappedObject(query.getQueryObject(), null);
Object object = result.get("foo");
assertThat(object, is(String.class));
assertThat(object, is(instanceOf(String.class)));
}
@Test
public void handlesEnumsInNotEqualCorrectly() {
Query query = query(where("foo").ne(Enum.INSTANCE));
DBObject result = mapper.getMappedObject(query.getQueryObject(), null);
Object object = result.get("foo");
assertThat(object, is(DBObject.class));
assertThat(object, is(instanceOf(DBObject.class)));
Object ne = ((DBObject) object).get("$ne");
assertThat(ne, is(String.class));
assertThat(ne, is(instanceOf(String.class)));
assertThat(ne.toString(), is(Enum.INSTANCE.name()));
}
@@ -149,17 +153,115 @@ public class QueryMapperUnitTests {
DBObject result = mapper.getMappedObject(query.getQueryObject(), null);
Object object = result.get("foo");
assertThat(object, is(DBObject.class));
assertThat(object, is(instanceOf(DBObject.class)));
Object in = ((DBObject) object).get("$in");
assertThat(in, is(BasicDBList.class));
assertThat(in, is(instanceOf(BasicDBList.class)));
BasicDBList list = (BasicDBList) in;
assertThat(list.size(), is(1));
assertThat(list.get(0), is(String.class));
assertThat(list.get(0), is(instanceOf(String.class)));
assertThat(list.get(0).toString(), is(Enum.INSTANCE.name()));
}
/**
* @see DATAMONGO-373
*/
@Test
public void handlesNativelyBuiltQueryCorrectly() {
DBObject query = new QueryBuilder().or(new BasicDBObject("foo", "bar")).get();
mapper.getMappedObject(query, null);
}
/**
* @see DATAMONGO-369
*/
@Test
public void handlesAllPropertiesIfDBObject() {
DBObject query = new BasicDBObject();
query.put("foo", new BasicDBObject("$in", Arrays.asList(1, 2)));
query.put("bar", new Person());
DBObject result = mapper.getMappedObject(query, null);
assertThat(result.get("bar"), is(notNullValue()));
}
/**
* @see DATAMONGO-429
*/
@Test
public void transformsArraysCorrectly() {
Query query = new BasicQuery("{ 'tags' : { '$all' : [ 'green', 'orange']}}");
DBObject result = mapper.getMappedObject(query.getQueryObject(), null);
assertThat(result, is(query.getQueryObject()));
}
@Test
public void doesNotHandleNestedFieldsWithDefaultIdNames() {
BasicDBObject dbObject = new BasicDBObject("id", new ObjectId().toString());
dbObject.put("nested", new BasicDBObject("id", new ObjectId().toString()));
MongoPersistentEntity<?> entity = context.getPersistentEntity(ClassWithDefaultId.class);
DBObject result = mapper.getMappedObject(dbObject, entity);
assertThat(result.get("_id"), is(instanceOf(ObjectId.class)));
assertThat(((DBObject) result.get("nested")).get("id"), is(instanceOf(String.class)));
}
/**
* @see DATAMONGO-493
*/
@Test
public void doesNotTranslateNonIdPropertiesFor$NeCriteria() {
ObjectId accidentallyAnObjectId = new ObjectId();
Query query = Query.query(Criteria.where("id").is("id_value").and("publishers")
.ne(accidentallyAnObjectId.toString()));
DBObject dbObject = mapper.getMappedObject(query.getQueryObject(), context.getPersistentEntity(UserEntity.class));
assertThat(dbObject.get("publishers"), is(instanceOf(DBObject.class)));
DBObject publishers = (DBObject) dbObject.get("publishers");
assertThat(publishers.containsField("$ne"), is(true));
assertThat(publishers.get("$ne"), is(instanceOf(String.class)));
}
/**
* @see DATAMONGO-494
*/
@Test
public void usesEntityMetadataInOr() {
Query query = query(new Criteria().orOperator(where("foo").is("bar")));
DBObject result = mapper.getMappedObject(query.getQueryObject(), context.getPersistentEntity(Sample.class));
assertThat(result.keySet(), hasSize(1));
assertThat(result.keySet(), hasItem("$or"));
BasicDBList ors = getAsDBList(result, "$or");
assertThat(ors, hasSize(1));
DBObject criterias = getAsDBObject(ors, 0);
assertThat(criterias.keySet(), hasSize(1));
assertThat(criterias.get("_id"), is(notNullValue()));
assertThat(criterias.get("foo"), is(nullValue()));
}
class IdWrapper {
Object id;
}
class ClassWithDefaultId {
String id;
ClassWithDefaultId nested;
}
class Sample {
@Id
@@ -175,4 +277,9 @@ public class QueryMapperUnitTests {
enum Enum {
INSTANCE;
}
class UserEntity {
String id;
List<String> publishers = new ArrayList<String>();
}
}

View File

@@ -48,7 +48,8 @@ public class QueryTests {
@Test
public void testOrQuery() {
Query q = new Query(new Criteria().orOperator(where("name").is("Sven").and("age").lt(50), where("age").lt(50), where("name").is("Thomas")));
Query q = new Query(new Criteria().orOperator(where("name").is("Sven").and("age").lt(50), where("age").lt(50),
where("name").is("Thomas")));
String expected = "{ \"$or\" : [ { \"name\" : \"Sven\" , \"age\" : { \"$lt\" : 50}} , { \"age\" : { \"$lt\" : 50}} , { \"name\" : \"Thomas\"}]}";
Assert.assertEquals(expected, q.getQueryObject().toString());
}
@@ -62,7 +63,8 @@ public class QueryTests {
@Test
public void testNorQuery() {
Query q = new Query(new Criteria().norOperator(where("name").is("Sven"), where("age").lt(50), where("name").is("Thomas")));
Query q = new Query(new Criteria().norOperator(where("name").is("Sven"), where("age").lt(50),
where("name").is("Thomas")));
String expected = "{ \"$nor\" : [ { \"name\" : \"Sven\"} , { \"age\" : { \"$lt\" : 50}} , { \"name\" : \"Thomas\"}]}";
Assert.assertEquals(expected, q.getQueryObject().toString());
}
@@ -98,7 +100,7 @@ public class QueryTests {
public void testComplexQueryWithMultipleChainedCriteria() {
Query q = new Query(where("name").regex("^T.*").and("age").gt(20).lt(80).and("city")
.in("Stockholm", "London", "New York"));
String expected = "{ \"name\" : { \"$regex\" : \"^T.*\"} , \"age\" : { \"$gt\" : 20 , \"$lt\" : 80} , "
String expected = "{ \"name\" : { \"$regex\" : \"^T.*\" , \"$options\" : \"\"} , \"age\" : { \"$gt\" : 20 , \"$lt\" : 80} , "
+ "\"city\" : { \"$in\" : [ \"Stockholm\" , \"London\" , \"New York\"]}}";
Assert.assertEquals(expected, q.getQueryObject().toString());
}
@@ -132,7 +134,7 @@ public class QueryTests {
@Test
public void testQueryWithRegex() {
Query q = new Query(where("name").regex("b.*"));
String expected = "{ \"name\" : { \"$regex\" : \"b.*\"}}";
String expected = "{ \"name\" : { \"$regex\" : \"b.*\" , \"$options\" : \"\"}}";
Assert.assertEquals(expected, q.getQueryObject().toString());
}

View File

@@ -34,7 +34,8 @@ public class UpdateTests {
@Test
public void testSetSet() {
Update u = new Update().set("directory", "/Users/Test/Desktop").set("size", 0);
Assert.assertEquals("{ \"$set\" : { \"directory\" : \"/Users/Test/Desktop\" , \"size\" : 0}}", u.getUpdateObject().toString());
Assert.assertEquals("{ \"$set\" : { \"directory\" : \"/Users/Test/Desktop\" , \"size\" : 0}}", u.getUpdateObject()
.toString());
}
@Test

Some files were not shown because too many files have changed in this diff.