Compare commits

...

58 Commits
3.1.2 ... 1.2.x

Author SHA1 Message Date
Spring Buildmaster
1202d2cbc2 DATAMONGO-755 - Prepare next development iteration. 2013-09-30 05:01:30 -07:00
Spring Buildmaster
0edf6c06ed DATAMONGO-755 - Release version 1.2.4.RELEASE. 2013-09-30 05:01:28 -07:00
Oliver Gierke
0afd137e7d DATAMONGO-755 - Prepare 1.2.4.RELEASE.
Upgraded to Spring Data Core 1.5.3. Updated changelog, readme etc.
2013-09-30 13:40:36 +02:00
Thomas Darimont
bc3f44197b DATAMONGO-445 - Allow to skip unnecessary elements in NearQuery.
Added support for skipping elements for NearQuery in MongoTemplate. As MongoDB currently (2.4.4) doesn't support the skipping of elements in geoNear queries, we skip the unnecessary elements ourselves. We use the limit & skip information from the given query or an explicitly passed Pageable.

Original pull request: #64.
2013-08-22 18:22:00 +02:00
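A minimal sketch of how the added skipping could be used, assuming a configured MongoTemplate named template and an illustrative Venue document class (neither is part of the commit):
```java
// The limit/skip information comes from the Pageable; the elements are skipped client-side,
// since MongoDB 2.4.4 does not skip within geoNear itself.
Point location = new Point(-73.99, 40.73);

NearQuery nearQuery = NearQuery.near(location)
        .maxDistance(new Distance(5, Metrics.KILOMETERS))
        .with(new PageRequest(1, 10)); // page 1, size 10: the first 10 matches are skipped

GeoResults<Venue> results = template.geoNear(nearQuery, Venue.class);
```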
Thomas Darimont
28bd631579 DATAMONGO-742 - Document CDI integration in reference documentation.
Added chapter for CDI Integration under the new chapter Miscellaneous.

Original pull request: #63.
2013-08-13 12:25:59 +02:00
Randy Watler
5396df9af4 DATAMONGO-737 - Register TransactionSynchronization holder once per Mongo instance.
Original pull request: #62.
2013-08-12 17:39:42 +02:00
Thomas Darimont
2b864e9744 DATAMONGO-507 - Reject incorrect usage of Criteria#not().
Added a guard to Criteria#(and|or|nor)Operator to prevent wrapping $and, $or or $nor expressions in a $not expression, as MongoDB currently doesn't support this. Added a test case to CriteriaTests to verify that not() works as specified.

Original pull request: #60.
2013-08-09 12:24:40 +02:00
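A hedged illustration of the combination the new guard rejects; the field names and values are made up:
```java
// Negating a single field criterion still works and renders roughly as
// { "age" : { "$not" : { "$gt" : 18 } } }.
Criteria supported = Criteria.where("age").not().gt(18);

// Wrapping an $or (or $and/$nor) expression in $not is not supported by MongoDB,
// so this combination is now rejected up front instead of producing a broken query.
Criteria rejected = new Criteria().not()
        .orOperator(Criteria.where("age").lt(18), Criteria.where("age").gt(65));
```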
Spring Buildmaster
5accbbdac5 DATAMONGO-729 - Prepare next development iteration. 2013-07-24 06:32:26 -07:00
Spring Buildmaster
11d9f04fd1 DATAMONGO-729 - Release version 1.2.3.RELEASE. 2013-07-24 06:32:22 -07:00
Oliver Gierke
67b91e446e DATAMONGO-729 - Prepare 1.2.3 release.
Upgraded to Spring Data Build 1.0.4.RELEASE, Spring Data Commons 1.5.2.RELEASE. Updated changelog, notice and readmes. Removed Snapshot repository in favor of the release one.
2013-07-24 12:40:47 +02:00
Thomas Darimont
1124841e17 DATAMONGO-728 - Added missing package-info files. 2013-07-23 16:33:39 +02:00
Spring Buildmaster
10ccbf131d DATAMONGO-727 - Prepare next development iteration. 2013-07-19 06:37:58 -07:00
Spring Buildmaster
b29930b512 DATAMONGO-727 - Release version 1.2.2.RELEASE. 2013-07-19 06:37:54 -07:00
Oliver Gierke
d671fb13ae DATAMONGO-727 - Prepare 1.2.2.RELEASE. 2013-07-19 15:21:57 +02:00
Oliver Gierke
b0a10d19c3 DATAMONGO-723 - Cleaned up a few misnamed test cases.
Reactivated test cases that were accidentally named Test instead of Tests.
2013-07-12 18:44:28 +02:00
Oliver Gierke
d8ef7e1472 DATAMONGO-717 - Partial back-port of change in AbstractMongoConfiguration.
Backported a part of DATAMONGO-569 to avoid lifecycle issues on MongoMappingContext. ….initialize() was called too early, so that other lifecycle callbacks (e.g. setApplicationContext(…)) were applied too late.
2013-07-10 23:00:36 +02:00
Thomas Darimont
b9a25eabae DATAMONGO-685 - ServerInfo should return the hostname in use as reported by MongoDB.
Added test case getHostNameShouldReturnServerNameReportedByMongo() to MongoMonitorIntegrationTests. Modified MongoMonitorIntegrationTests to use common mongo-infrastructure configuration. ServerInfo.getHostName() is now derived from serverStatus.serverUsed.

Original pull request: #51.
2013-07-10 14:30:40 +02:00
Oliver Gierke
e92e5c737f DATAMONGO-540 - Fixed test case.
Apparently the test case was not working on MongoDB instances in version 2.2.
2013-07-08 14:07:24 +02:00
Thomas Darimont
6e46fb12cb DATAMONGO-671 - Improve test case to not fail on fast machines. 2013-07-05 12:00:27 +02:00
Thomas Darimont
031d446a1c DATAMONGO-540 - Added test case to show findOne(…) works after upsert. 2013-07-05 12:00:18 +02:00
Oliver Gierke
e6bab1ce60 DATAMONGO-671 - Added integration tests to show lookups by date are working. 2013-07-05 12:00:04 +02:00
Oliver Gierke
2fffe0a5c4 DATAMONGO-714 - Updated formatter to latest changes. 2013-07-05 11:51:35 +02:00
Thomas Darimont
2493de5f91 DATAMONGO-693 - More robust handling of host and replica set config in MongoFactoryBean.
MongoFactoryBean now considers empty strings for the replicaPair property as not set at all. The ServerAddressPropertyEditor also returns null as the value for empty text strings. Deprecated the setter for the replica pair on MongoFactoryBean.
2013-07-02 16:37:54 +02:00
Oliver Gierke
bd11bab076 DATAMONGO-675 - MongoTemplate now maps Update in findAndModify(…).
The Update object handed to ….findAndModify(…) is now run through the QueryMapper before being executed.
2013-07-01 08:27:29 +02:00
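A hedged example of the call path this change affects; Person, the field names and the template instance are illustrative:
```java
// The Update handed to findAndModify(…) now goes through the QueryMapper as well,
// so mapped field names (e.g. properties renamed via @Field) are translated before execution.
Query query = new Query(Criteria.where("lastname").is("Gierke"));
Update update = new Update().set("firstname", "Oliver");

Person modified = template.findAndModify(query, update, Person.class);
```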
Oliver Gierke
b667984563 DATAMONGO-706 - Fixed DBRef conversion for nested keywords. 2013-06-29 13:07:52 +02:00
Oliver Gierke
7ef167ed96 DATAMONGO-705 - Fixed handling of DBRefs in QueryMapper.
QueryMapper now converts values to become DBRefs correctly in getMappedKeyword(…). Added an exclusion path for the value handling in case we have an $exists keyword.
2013-06-25 19:29:11 +02:00
Andrew Duncan
303a057d86 DATAMONGO-701 - Improve performance of starts-with and ends-with queries.
This changes the starts-with regex to the prefixed form using ^ to better make use of any index on the queried field. Also changes ending-with queries to use the $ anchor.
2013-06-25 18:03:09 +02:00
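A sketch of the kind of derived repository queries that benefit; the interface and property names are illustrative:
```java
public interface PersonRepository extends MongoRepository<Person, Long> {

    // now issued roughly as { "lastname" : { "$regex" : "^<prefix>" } }, so an index on "lastname" can be used
    List<Person> findByLastnameStartingWith(String prefix);

    // now issued roughly as { "lastname" : { "$regex" : "<suffix>$" } }
    List<Person> findByLastnameEndingWith(String suffix);
}
```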
Ivan Sopov
607072c0d3 DATAMONGO-704 - Removed references to SimpleMongoConverter from JavaDoc.
Removed SimpleMongoConverter references from the JavaDoc. In commit 2832b524d3, MappingMongoConverter was made the default instead of SimpleMongoConverter. SimpleMongoConverter was then completely removed between the 1.0.0.M3 and 1.0.0.M4 releases. This updates the JavaDoc that still referenced SimpleMongoConverter as the default MongoConverter.
2013-06-25 17:32:05 +02:00
Oliver Gierke
11e9c562b3 DATAMONGO-683 - QueryMapper now handles default ID names if mapping metadata is missing.
Changed the implementation so that _id is considered an id field if no metadata is present. Heavily refactored QueryMapper internals so that the conversion code is more readable.
2013-05-24 09:18:57 +02:00
Oliver Gierke
220b211faa DATAMONGO-679 - Fixed saving plain JSON strings in MongoTemplate.
MongoTemplate.doSave(…) didn't handle plain JSON strings correctly. It now correctly passes the marshaled object to the underlying method.
2013-05-23 20:46:08 +02:00
Patryk Wąsik
c0c51fcc29 DATAMONGO-677 - QueryMapper now handles DBRefs in Maps correctly.
QueryMapper now handles Maps with DBRef values, which is needed to process an update in MongoTemplate.doUpdate(…) so that versioned documents are saved correctly.
2013-05-23 20:10:29 +02:00
Oliver Gierke
ed9eddf10e DATAMONGO-682 - Performance improvement for mapping hotspots.
This commit includes the MongoDB specific parts for the mapping subsystem performance improvements. Reworked PerformanceTest to output more reasonable numbers.

Heavily inspired by Patryk Wasik's contribution at https://github.com/SpringSource/spring-data-mongodb/pull/37.

GitHub PR: #37
Conflicts:
	spring-data-mongodb/src/test/java/org/springframework/data/mongodb/core/mapping/MappingTests.java
2013-05-23 19:52:34 +02:00
Oliver Gierke
e23d73d55e DATACMNS-379 - Adapt to changes in Spring Data Commons. 2013-05-12 23:50:42 +02:00
Philipp Schneider
9627fbaebf DATAMONGO-663 - Added equals(…) and hashCode() to Field. 2013-04-26 15:37:38 +02:00
Spring Buildmaster
ef93d4db0b DATAMONGO-654 - Prepare next development iteration. 2013-04-17 02:50:13 -07:00
Spring Buildmaster
928b5a7742 DATAMONGO-654 - Release 1.2.1.RELEASE. 2013-04-17 02:50:08 -07:00
Oliver Gierke
118a52a8d6 DATAMONGO-654 - Prepare 1.2.1.RELEASE.
Upgraded to Spring Data Build 1.0.3.RELEASE. Upgraded to Spring Data Commons 1.5.1.RELEASE. Updated changelog, notice and readme. Polished readme.md.
2013-04-17 11:28:46 +02:00
Oliver Gierke
b47e8ca3da DATAMONGO-656 - Fixed potential NPE in MongoTemplate. 2013-04-17 10:24:37 +02:00
Oliver Gierke
8527d6eb43 DATAMONGO-651 - MongoTemplate now throws Mongo-specific exception with WriteResult.
If the WriteResultChecking is set to EXCEPTION on a MongoTemplate, we now throw a Mongo-specific exception that captures both the WriteResult and MongoActionOperation for further evaluation.
2013-04-15 17:05:56 +02:00
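A hedged usage sketch; the template and person variables are illustrative, while the accessors are the ones introduced by the commit (see the MongoDataIntegrityViolationException source further down in this diff):
```java
// With write result checking set to EXCEPTION, failed writes surface as a
// MongoDataIntegrityViolationException carrying the raw WriteResult and the operation type.
template.setWriteResultChecking(WriteResultChecking.EXCEPTION);

try {
    template.save(person);
} catch (MongoDataIntegrityViolationException e) {
    WriteResult writeResult = e.getWriteResult();             // raw driver result, e.g. writeResult.getError()
    MongoActionOperation operation = e.getActionOperation();  // e.g. INSERT, SAVE, UPDATE
}
```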
Oliver Gierke
d645c778c3 DATAMONGO-571 - Fixed setting null values during update of versioned entities.
When updating a versioned object, the Update object is now constructed from plain key-value pairs instead of using $set. This correctly sets the null values in the updated document.
2013-04-11 12:01:53 +02:00
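A hedged sketch of the effect, assuming an illustrative Person document with an @Version property and a nullable nickname field:
```java
// Saving a versioned entity with a property set to null now clears the value in the
// stored document instead of silently keeping the previous one.
Query query = new Query(Criteria.where("lastname").is("Gierke"));
Person person = template.findOne(query, Person.class);

person.setNickname(null);
template.save(person);
```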
Oliver Gierke
5c47f1ae9e DATAMONGO-650 - Added snapshot repository to resolve Spring Data Commons. 2013-04-11 11:08:02 +02:00
Oliver Gierke
9f324bac19 DATAMONGO-648 - Replaced XSD ids with XSD strings.
This change allows Spring Data MongoDB XML namespace elements to be used together with a <bean /> element inside a profile. That scenario can lead to e.g. two <mongo:db-factory /> declarations in the same XML file.
2013-04-10 20:34:29 +02:00
Oliver Gierke
186caba1ac DATAMONGO-642 - MongoChangeSetPersister now considers mapped collection.
So far the change set persister has used the plain domain type name to persist data. We now consider the collection name defined by the object mapping (through @Document(collection = "…")).
2013-04-02 11:42:50 +02:00
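A hedged sketch of the mapping that is now honored; the type and collection name are illustrative:
```java
// Cross-store data for this entity now ends up in the mapped "people" collection
// instead of a collection named after the fully-qualified class name.
@Document(collection = "people")
public class Person {
    // fields, getters and setters
}
```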
Oliver Gierke
2ebb7e801d DATAMONGO-641 - Fixed potential NullPointerException in MongoLog4jAppender.
The Log4j appender now only closes the Mongo instance if one is available.
2013-04-02 10:59:13 +02:00
Oliver Gierke
a932f3474e DATAMONGO-641 - Reformatting to prepare fix. 2013-04-02 10:59:05 +02:00
Oliver Gierke
e992456532 DATAMONGO-638 - MappingContext does not create PersistentEntities for AbstractMaps. 2013-03-28 16:29:55 +01:00
Oliver Gierke
7b34c5cac4 DATAMONGO-629 - Fixed QueryMapper not to massage queries without type information.
So far the QueryMapper applied the id massaging (especially interpreting the default id keys) even if there was no persistence metadata available to do so. This caused e.g. queries handed into MongoTemplate.count(Query, String) to get keys of "id" massaged into "_id", which shouldn't be the case as we cannot assume anything about the documents and the keys contained in them.

So we now only apply the defaults if there is at least some persistence metadata present. This means that for methods on MongoOperations that don't take type information of any kind, the queries have to be defined in terms of the document, not the object model, as we cannot refer to it.
2013-03-27 12:04:06 +01:00
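A hedged example of the consequence for type-less operations; the collection name and id value are illustrative:
```java
// Without a domain type there is no mapping metadata, so the query has to use the
// document key "_id" directly; a key of "id" is no longer rewritten.
long count = template.count(new Query(Criteria.where("_id").is(personId)), "person");
```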
Andrey Bloschetsov
16baf00f5e DATAMONGO-637 - Fixed typo in Query.query(…).
Corrected the misspelled "critera". Polished the JavaDoc a little.
2013-03-27 11:28:22 +01:00
Oliver Gierke
61a2c56a27 DATAMONGO-632 - Removed schemaLocation attribute from namespace references. 2013-03-25 18:46:34 +01:00
Oliver Gierke
e62437b64a DATAMONGO-633 - Back down to bug fix versions of SD Parent and Core.
Avoid version bump to Querydsl 3.0.
2013-03-25 18:44:18 +01:00
Oliver Gierke
e9a7e887be DATAMONGO-635 - Fixed some Sonar warnings. 2013-03-25 18:13:21 +01:00
Oliver Gierke
b91a66f6f9 DATAMONGO-622 - Saving unversioned object now uses doInsert(…).
We now use doInsert(…) when a versioned object is saved for the first time, to prevent accidental updates that don't bump the version number.
2013-02-26 11:39:36 +01:00
Oliver Gierke
a047e54e5a DATAMONGO-621 - Initializing version property now uses ConversionService.
The BeanWrapper used in MongoTemplate.initializeVersionProperty(…) now uses the ConversionService held in the MongoConverter.
2013-02-26 11:39:30 +01:00
Oliver Gierke
8197ff57c8 DATAMONGO-620 - Updating versioned object uses explicit collection name.
We're now handing the collection name given to doSaveVersioned(…) to the update method.
2013-02-26 11:39:22 +01:00
Oliver Gierke
a2136719e1 DATAMONGO-617 - Fixed potential NullPointerException in MongoTemplate.insert(…).
If MongoTemplate.insert(…) was called with a Mongo-simple type (such as a raw DBObject), it caused a NullPointerException during the lookup of a version property. This is now fixed by correcting the guard.
2013-02-20 12:07:32 +01:00
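A hedged sketch of the previously failing call; the collection name is illustrative:
```java
// Inserting a Mongo "simple type" such as a raw DBObject no longer triggers a
// NullPointerException from the version-property lookup.
DBObject dbo = new BasicDBObject("firstname", "Oliver");
template.insert(dbo, "person");
```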
Oliver Gierke
7bcf142c8d DATAMONGO-613 - Re-added JConsole image and upgraded to Spring Data Build 1.1. 2013-02-11 12:41:12 +01:00
Oliver Gierke
d50d03a80e DATAMONGO-612 - Fixed reconfiguration of dist.id. 2013-02-11 12:39:15 +01:00
Spring Buildmaster
ba894a4511 DATAMONGO-609 - Prepare next development iteration. 2013-02-08 13:22:05 +01:00
82 changed files with 2578 additions and 788 deletions

133
README.md
View File

@@ -29,19 +29,13 @@ For those in a hurry:
* Download the jar through Maven:
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.2.0.BUILD-SNAPSHOT</version>
</dependency>
<repository>
<id>spring-maven-snapshot</id>
<snapshots><enabled>true</enabled></snapshots>
<name>Springframework Maven SNAPSHOT Repository</name>
<url>http://maven.springframework.org/snapshot</url>
</repository>
```xml
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.2.3.RELEASE</version>
</dependency>
```
### MongoTemplate
MongoTemplate is the central support class for Mongo database operations. It provides
@@ -54,93 +48,98 @@ Future plans are to support optional logging and/or exception throwing based on
### Easy Data Repository generation
To simplify the creation of Data Repositories a generic Repository interface and default implementation is provided. Furthermore, Spring will automatically create a Repository implementation for you that adds implementations of finder methods you specify on an interface.
To simplify the creation of data repositories a generic `Repository` interface and default implementation is provided. Furthermore, Spring will automatically create a Repository implementation for you that adds implementations of finder methods you specify on an interface.
The Repository interface is
public interface Repository<T, ID extends Serializable> {
```java
public interface Repository<T, ID extends Serializable> {
T save(T entity);
T save(T entity);
List<T> save(Iterable<? extends T> entities);
List<T> save(Iterable<? extends T> entities);
T findById(ID id);
T findById(ID id);
boolean exists(ID id);
boolean exists(ID id);
List<T> findAll();
List<T> findAll();
Long count();
Long count();
void delete(T entity);
void delete(T entity);
void delete(Iterable<? extends T> entities);
void delete(Iterable<? extends T> entities);
void deleteAll();
}
void deleteAll();
}
```
The MongoRepository extends Repository and will in future add more Mongo specific methods.
The `MongoRepository` extends `Repository` and will in future add more Mongo specific methods.
public interface MongoRepository<T, ID extends Serializable> extends
Repository<T, ID> {
}
```java
public interface MongoRepository<T, ID extends Serializable> extends Repository<T, ID> {
}
```
SimpleMongoRepository is the out of the box implementation of the MongoRepository you can use for basid CRUD operations.
`SimpleMongoRepository` is the out of the box implementation of the `MongoRepository` you can use for basid CRUD operations.
To go beyond basic CRUD, extend the MongoRepository interface and supply your own finder methods that follow simple naming conventions such that they can be easily converted into queries.
To go beyond basic CRUD, extend the `MongoRepository` interface and supply your own finder methods that follow simple naming conventions such that they can be easily converted into queries.
For example, given a Person class with first and last name properties, a PersonRepository interface that can query for Person by last name and when the first name matches a regular expression is shown below
For example, given a `Person` class with first and last name properties, a `PersonRepository` interface that can query for `Person` by last name and when the first name matches a regular expression is shown below
public interface PersonRepository extends MongoRepository<Person, Long> {
```java
public interface PersonRepository extends MongoRepository<Person, Long> {
List<Person> findByLastname(String lastname);
List<Person> findByLastname(String lastname);
List<Person> findByFirstnameLike(String firstname);
}
List<Person> findByFirstnameLike(String firstname);
}
```
You can have Spring automatically generate the implemention as shown below
You can have Spring automatically create a proxy for the interface as shown below:
<bean id="template" class="org.springframework.data.document.mongodb.MongoTemplate">
<constructor-arg>
<bean class="com.mongodb.Mongo">
<constructor-arg value="localhost" />
<constructor-arg value="27017" />
</bean>
</constructor-arg>
<constructor-arg value="database" />
<property name="defaultCollectionName" value="springdata" />
</bean>
```xml
<bean id="template" class="org.springframework.data.document.mongodb.MongoTemplate">
<constructor-arg>
<bean class="com.mongodb.Mongo">
<constructor-arg value="localhost" />
<constructor-arg value="27017" />
</bean>
</constructor-arg>
<constructor-arg value="database" />
<property name="defaultCollectionName" value="springdata" />
</bean>
<bean class="org.springframework.data.document.mongodb.repository.MongoRepositoryFactoryBean">
<property name="template" ref="template" />
<property name="repositoryInterface" value="org.springframework.data.document.mongodb.repository.PersonRepository" />
</bean>
<mongo:repositories base-package="com.acme.repository" />
```
This will register an object in the container named PersonRepository. You can use it as shown below
This will find the repository interface and register a proxy object in the container. You can use it as shown below:
@Service
public class MyService {
``java
@Service
public class MyService {
@Autowired
PersonRepository repository;
@Autowired
private final PersonRepository repository;
public void doWork() {
public void doWork() {
repository.deleteAll();
repository.deleteAll();
Person person = new Person();
person.setFirstname("Oliver");
person.setLastname("Gierke");
person = repository.save(person);
Person person = new Person();
person.setFirstname("Oliver");
person.setLastname("Gierke");
person = repository.save(person);
List<Person> lastNameResults = repository.findByLastname("Gierke");
List<Person> lastNameResults = repository.findByLastname("Gierke");
List<Person> firstNameResults = repository.findByFirstnameLike("Oli*");
List<Person> firstNameResults = repository.findByFirstnameLike("Oli*");
}
}
}
}
```
Contributing to Spring Data

View File

@@ -73,7 +73,7 @@
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_enum_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_angle_bracket_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.clear_blank_lines_in_block_comment" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_type_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.lineSplit" value="120"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_if" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_brackets_in_array_type_reference" value="do not insert"/>
@@ -102,7 +102,7 @@
<setting id="org.eclipse.jdt.core.formatter.alignment_for_parameters_in_constructor_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_labeled_statement" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_annotation_type_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_method_body" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_method_body" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_method_declaration" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_method_invocation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_try" value="do not insert"/>
@@ -124,7 +124,7 @@
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_parenthesized_expression_in_throw" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.compiler.problem.enumIdentifier" value="error"/>
<setting id="org.eclipse.jdt.core.formatter.indent_switchstatements_compare_to_switch" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.indent_switchstatements_compare_to_switch" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_ellipsis" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_block" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_for_inits" value="do not insert"/>
@@ -135,9 +135,9 @@
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_for_increments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.format_line_comment_starting_on_first_column" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_bracket_in_array_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_field" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_enum_constant" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_field" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.indent_root_tags" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_enum_constant" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_enum_declarations" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_union_type_in_multicatch" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_explicitconstructorcall_arguments" value="insert"/>
@@ -150,12 +150,12 @@
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_opening_brace_in_array_initializer" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_brace_in_block" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_bracket_in_array_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_constant" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_angle_bracket_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_constructor_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_constructor_declaration_throws" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_if" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.clear_blank_lines_in_javadoc_comment" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.comment.clear_blank_lines_in_javadoc_comment" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_throws_clause_in_constructor_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_assignment_operator" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_assignment_operator" value="insert"/>
@@ -182,11 +182,11 @@
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_angle_bracket_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_anonymous_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.keep_empty_array_initializer_on_one_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.keep_imple_if_on_one_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_constructor_declaration_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_angle_bracket_in_type_parameters" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_at_end_of_file_if_missing" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_at_end_of_file_if_missing" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_for" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_labeled_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_parameterized_type_reference" value="do not insert"/>
@@ -237,7 +237,7 @@
<setting id="org.eclipse.jdt.core.formatter.alignment_for_throws_clause_in_method_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_angle_bracket_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_bracket_in_array_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_anonymous_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_anonymous_type_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_conditional" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_angle_bracket_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_for" value="do not insert"/>
@@ -251,12 +251,12 @@
<setting id="org.eclipse.jdt.core.compiler.codegen.targetPlatform" value="1.7"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_resources_in_try" value="80"/>
<setting id="org.eclipse.jdt.core.formatter.use_tabs_only_for_leading_indentations" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_annotation" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_annotation" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_header" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_block_comments" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_enum_constants" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_block" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_block" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_annotation_declaration_header" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_parenthesized_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_parenthesized_expression" value="do not insert"/>

16
pom.xml
View File

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.2.0.RELEASE</version>
<version>1.2.5.BUILD-SNAPSHOT</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>1.0.0.RELEASE</version>
<version>1.0.5.RELEASE</version>
<relativePath>../spring-data-build/parent/pom.xml</relativePath>
</parent>
@@ -26,11 +26,10 @@
<module>spring-data-mongodb-distribution</module>
</modules>
<properties>
<project.type>multi</project.type>
<dist.name>spring-data-mongodb</dist.name>
<springdata.commons>1.5.0.RELEASE</springdata.commons>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.5.3.RELEASE</springdata.commons>
<mongo>2.10.1</mongo>
</properties>
@@ -90,4 +89,11 @@
</dependency>
</dependencies>
<repositories>
<repository>
<id>spring-libs-release</id>
<url>http://repo.springsource.org/libs-release</url>
</repository>
</repositories>
</project>

View File

@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.2.0.RELEASE</version>
<version>1.2.5.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -52,7 +52,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.2.0.RELEASE</version>
<version>1.2.5.BUILD-SNAPSHOT</version>
</dependency>
<dependency>

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -34,6 +34,10 @@ import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoException;
/**
* @author Thomas Risberg
* @author Oliver Gierke
*/
public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_CLASS = "_entity_class";
@@ -58,6 +62,10 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
this.entityManagerFactory = entityManagerFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentState(java.lang.Class, java.lang.Object, org.springframework.data.crossstore.ChangeSet)
*/
public void getPersistentState(Class<? extends ChangeSetBacked> entityClass, Object id, final ChangeSet changeSet)
throws DataAccessException, NotFoundException {
@@ -100,6 +108,10 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentId(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
log.debug("getPersistentId called on " + entity);
if (entityManagerFactory == null) {
@@ -109,6 +121,10 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
return o;
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#persistState(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object persistState(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (cs == null) {
log.debug("Flush: changeset was null, nothing to flush.");
@@ -169,8 +185,13 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
return 0L;
}
/**
* Returns the collection the given entity type shall be persisted to.
*
* @param entityClass must not be {@literal null}.
* @return
*/
private String getCollectionNameForEntity(Class<? extends ChangeSetBacked> entityClass) {
return ClassUtils.getQualifiedName(entityClass);
return mongoTemplate.getCollectionName(entityClass);
}
}

View File

@@ -0,0 +1,5 @@
/**
* Infrastructure for Spring Data's MongoDB cross store support.
*/
package org.springframework.data.mongodb.crossstore;

View File

@@ -36,9 +36,14 @@ import org.springframework.transaction.annotation.Transactional;
import org.springframework.transaction.support.TransactionCallback;
import org.springframework.transaction.support.TransactionTemplate;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
/**
* Integration tests for MongoDB cross-store persistence (mainly {@link MongoChangeSetPersister}).
*
* @author Thomas Risberg
* @author Oliver Gierke
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:/META-INF/spring/applicationContext.xml")
public class CrossStoreMongoTests {
@@ -58,7 +63,7 @@ public class CrossStoreMongoTests {
txTemplate = new TransactionTemplate(transactionManager);
clearData(Person.class.getName());
clearData(Person.class);
Address address = new Address(12, "MAin St.", "Boston", "MA", "02101");
@@ -91,11 +96,10 @@ public class CrossStoreMongoTests {
});
}
private void clearData(String collectionName) {
DBCollection col = this.mongoTemplate.getCollection(collectionName);
if (col != null) {
this.mongoTemplate.dropCollection(collectionName);
}
private void clearData(Class<?> domainType) {
String collectionName = mongoTemplate.getCollectionName(domainType);
mongoTemplate.dropCollection(collectionName);
}
@Test
@@ -183,7 +187,7 @@ public class CrossStoreMongoTests {
boolean weFound3 = false;
for (DBObject dbo : this.mongoTemplate.getCollection(Person.class.getName()).find()) {
for (DBObject dbo : this.mongoTemplate.getCollection(mongoTemplate.getCollectionName(Person.class)).find()) {
Assert.assertTrue(!dbo.get("_entity_id").equals(2L));
if (dbo.get("_entity_id").equals(3L)) {
weFound3 = true;

View File

@@ -13,13 +13,12 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.2.0.RELEASE</version>
<version>1.2.5.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<project.root>${basedir}/..</project.root>
<dist.id>spring-data-mongodb</dist.id>
<dist.key>SDMONGO</dist.key>
</properties>

View File

@@ -32,7 +32,7 @@ An example log entry might look like:
{
"_id" : ObjectId("4d89341a8ef397e06940d5cd"),
"applicationId" : "my.application",
"name" : "org.springframework.data.mongodb.log4j.AppenderTest",
"name" : "org.springframework.data.mongodb.log4j.MongoLog4jAppenderIntegrationTests",
"level" : "DEBUG",
"timestamp" : ISODate("2011-03-23T16:53:46.778Z"),
"properties" : {

View File

@@ -5,7 +5,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.2.0.RELEASE</version>
<version>1.2.5.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,7 +13,6 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.log4j;
import java.net.UnknownHostException;
@@ -21,188 +20,207 @@ import java.util.Arrays;
import java.util.Calendar;
import java.util.Map;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.WriteConcern;
import org.apache.log4j.AppenderSkeleton;
import org.apache.log4j.Level;
import org.apache.log4j.MDC;
import org.apache.log4j.PatternLayout;
import org.apache.log4j.spi.LoggingEvent;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.WriteConcern;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
* Log4j appender writing log entries into a MongoDB instance.
*
* @author Jon Brisbin
* @author Oliver Gierke
*/
public class MongoLog4jAppender extends AppenderSkeleton {
public static final String LEVEL = "level";
public static final String NAME = "name";
public static final String APP_ID = "applicationId";
public static final String TIMESTAMP = "timestamp";
public static final String PROPERTIES = "properties";
public static final String TRACEBACK = "traceback";
public static final String MESSAGE = "message";
public static final String YEAR = "year";
public static final String MONTH = "month";
public static final String DAY = "day";
public static final String HOUR = "hour";
public static final String LEVEL = "level";
public static final String NAME = "name";
public static final String APP_ID = "applicationId";
public static final String TIMESTAMP = "timestamp";
public static final String PROPERTIES = "properties";
public static final String TRACEBACK = "traceback";
public static final String MESSAGE = "message";
public static final String YEAR = "year";
public static final String MONTH = "month";
public static final String DAY = "day";
public static final String HOUR = "hour";
protected String host = "localhost";
protected int port = 27017;
protected String database = "logs";
protected String collectionPattern = "%c";
protected PatternLayout collectionLayout = new PatternLayout(collectionPattern);
protected String applicationId = System.getProperty("APPLICATION_ID", null);
protected WriteConcern warnOrHigherWriteConcern = WriteConcern.SAFE;
protected WriteConcern infoOrLowerWriteConcern = WriteConcern.NORMAL;
protected Mongo mongo;
protected DB db;
protected String host = "localhost";
protected int port = 27017;
protected String database = "logs";
protected String collectionPattern = "%c";
protected PatternLayout collectionLayout = new PatternLayout(collectionPattern);
protected String applicationId = System.getProperty("APPLICATION_ID", null);
protected WriteConcern warnOrHigherWriteConcern = WriteConcern.SAFE;
protected WriteConcern infoOrLowerWriteConcern = WriteConcern.NORMAL;
protected Mongo mongo;
protected DB db;
public MongoLog4jAppender() {
}
public MongoLog4jAppender() {
}
public MongoLog4jAppender(boolean isActive) {
super(isActive);
}
public MongoLog4jAppender(boolean isActive) {
super(isActive);
}
public String getHost() {
return host;
}
public String getHost() {
return host;
}
public void setHost(String host) {
this.host = host;
}
public void setHost(String host) {
this.host = host;
}
public int getPort() {
return port;
}
public int getPort() {
return port;
}
public void setPort(int port) {
this.port = port;
}
public void setPort(int port) {
this.port = port;
}
public String getDatabase() {
return database;
}
public String getDatabase() {
return database;
}
public void setDatabase(String database) {
this.database = database;
}
public void setDatabase(String database) {
this.database = database;
}
public String getCollectionPattern() {
return collectionPattern;
}
public String getCollectionPattern() {
return collectionPattern;
}
public void setCollectionPattern(String collectionPattern) {
this.collectionPattern = collectionPattern;
this.collectionLayout = new PatternLayout(collectionPattern);
}
public void setCollectionPattern(String collectionPattern) {
this.collectionPattern = collectionPattern;
this.collectionLayout = new PatternLayout(collectionPattern);
}
public String getApplicationId() {
return applicationId;
}
public String getApplicationId() {
return applicationId;
}
public void setApplicationId(String applicationId) {
this.applicationId = applicationId;
}
public void setApplicationId(String applicationId) {
this.applicationId = applicationId;
}
public void setWarnOrHigherWriteConcern(String wc) {
this.warnOrHigherWriteConcern = WriteConcern.valueOf(wc);
}
public void setWarnOrHigherWriteConcern(String wc) {
this.warnOrHigherWriteConcern = WriteConcern.valueOf(wc);
}
public String getWarnOrHigherWriteConcern() {
return warnOrHigherWriteConcern.toString();
}
public String getWarnOrHigherWriteConcern() {
return warnOrHigherWriteConcern.toString();
}
public String getInfoOrLowerWriteConcern() {
return infoOrLowerWriteConcern.toString();
}
public String getInfoOrLowerWriteConcern() {
return infoOrLowerWriteConcern.toString();
}
public void setInfoOrLowerWriteConcern(String wc) {
this.infoOrLowerWriteConcern = WriteConcern.valueOf(wc);
}
public void setInfoOrLowerWriteConcern(String wc) {
this.infoOrLowerWriteConcern = WriteConcern.valueOf(wc);
}
protected void connectToMongo() throws UnknownHostException {
this.mongo = new Mongo(host, port);
this.db = mongo.getDB(database);
}
protected void connectToMongo() throws UnknownHostException {
this.mongo = new Mongo(host, port);
this.db = mongo.getDB(database);
}
@SuppressWarnings({"unchecked"})
@Override
protected void append(final LoggingEvent event) {
if (null == db) {
try {
connectToMongo();
} catch (UnknownHostException e) {
throw new RuntimeException(e.getMessage(), e);
}
}
/*
* (non-Javadoc)
* @see org.apache.log4j.AppenderSkeleton#append(org.apache.log4j.spi.LoggingEvent)
*/
@Override
@SuppressWarnings({ "unchecked" })
protected void append(final LoggingEvent event) {
if (null == db) {
try {
connectToMongo();
} catch (UnknownHostException e) {
throw new RuntimeException(e.getMessage(), e);
}
}
BasicDBObject dbo = new BasicDBObject();
if (null != applicationId) {
dbo.put(APP_ID, applicationId);
MDC.put(APP_ID, applicationId);
}
dbo.put(NAME, event.getLogger().getName());
dbo.put(LEVEL, event.getLevel().toString());
Calendar tstamp = Calendar.getInstance();
tstamp.setTimeInMillis(event.getTimeStamp());
dbo.put(TIMESTAMP, tstamp.getTime());
BasicDBObject dbo = new BasicDBObject();
if (null != applicationId) {
dbo.put(APP_ID, applicationId);
MDC.put(APP_ID, applicationId);
}
dbo.put(NAME, event.getLogger().getName());
dbo.put(LEVEL, event.getLevel().toString());
Calendar tstamp = Calendar.getInstance();
tstamp.setTimeInMillis(event.getTimeStamp());
dbo.put(TIMESTAMP, tstamp.getTime());
// Copy properties into document
Map<Object, Object> props = event.getProperties();
if (null != props && props.size() > 0) {
BasicDBObject propsDbo = new BasicDBObject();
for (Map.Entry<Object, Object> entry : props.entrySet()) {
propsDbo.put(entry.getKey().toString(), entry.getValue().toString());
}
dbo.put(PROPERTIES, propsDbo);
}
// Copy properties into document
Map<Object, Object> props = event.getProperties();
if (null != props && props.size() > 0) {
BasicDBObject propsDbo = new BasicDBObject();
for (Map.Entry<Object, Object> entry : props.entrySet()) {
propsDbo.put(entry.getKey().toString(), entry.getValue().toString());
}
dbo.put(PROPERTIES, propsDbo);
}
// Copy traceback info (if there is any) into the document
String[] traceback = event.getThrowableStrRep();
if (null != traceback && traceback.length > 0) {
BasicDBList tbDbo = new BasicDBList();
tbDbo.addAll(Arrays.asList(traceback));
dbo.put(TRACEBACK, tbDbo);
}
// Copy traceback info (if there is any) into the document
String[] traceback = event.getThrowableStrRep();
if (null != traceback && traceback.length > 0) {
BasicDBList tbDbo = new BasicDBList();
tbDbo.addAll(Arrays.asList(traceback));
dbo.put(TRACEBACK, tbDbo);
}
// Put the rendered message into the document
dbo.put(MESSAGE, event.getRenderedMessage());
// Put the rendered message into the document
dbo.put(MESSAGE, event.getRenderedMessage());
// Insert the document
Calendar now = Calendar.getInstance();
MDC.put(YEAR, now.get(Calendar.YEAR));
MDC.put(MONTH, String.format("%1$02d", now.get(Calendar.MONTH) + 1));
MDC.put(DAY, String.format("%1$02d", now.get(Calendar.DAY_OF_MONTH)));
MDC.put(HOUR, String.format("%1$02d", now.get(Calendar.HOUR_OF_DAY)));
// Insert the document
Calendar now = Calendar.getInstance();
MDC.put(YEAR, now.get(Calendar.YEAR));
MDC.put(MONTH, String.format("%1$02d", now.get(Calendar.MONTH) + 1));
MDC.put(DAY, String.format("%1$02d", now.get(Calendar.DAY_OF_MONTH)));
MDC.put(HOUR, String.format("%1$02d", now.get(Calendar.HOUR_OF_DAY)));
String coll = collectionLayout.format(event);
String coll = collectionLayout.format(event);
MDC.remove(YEAR);
MDC.remove(MONTH);
MDC.remove(DAY);
MDC.remove(HOUR);
if (null != applicationId) {
MDC.remove(APP_ID);
}
MDC.remove(YEAR);
MDC.remove(MONTH);
MDC.remove(DAY);
MDC.remove(HOUR);
if (null != applicationId) {
MDC.remove(APP_ID);
}
WriteConcern wc;
if (event.getLevel().isGreaterOrEqual(Level.WARN)) {
wc = warnOrHigherWriteConcern;
} else {
wc = infoOrLowerWriteConcern;
}
db.getCollection(coll).insert(dbo, wc);
}
WriteConcern wc;
if (event.getLevel().isGreaterOrEqual(Level.WARN)) {
wc = warnOrHigherWriteConcern;
} else {
wc = infoOrLowerWriteConcern;
}
db.getCollection(coll).insert(dbo, wc);
}
public void close() {
mongo.close();
}
/*
* (non-Javadoc)
* @see org.apache.log4j.AppenderSkeleton#close()
*/
public void close() {
public boolean requiresLayout() {
return true;
}
if (mongo != null) {
mongo.close();
}
}
/*
* (non-Javadoc)
* @see org.apache.log4j.AppenderSkeleton#requiresLayout()
*/
public boolean requiresLayout() {
return true;
}
}

View File

@@ -0,0 +1,5 @@
/**
* Infrastructure for to use MongoDB as a logging sink.
*/
package org.springframework.data.mongodb.log4j;

View File

@@ -1,75 +0,0 @@
/*
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.log4j;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.net.UnknownHostException;
import java.util.Calendar;
import com.mongodb.DB;
import com.mongodb.DBCursor;
import com.mongodb.Mongo;
import org.apache.log4j.Logger;
import org.apache.log4j.MDC;
import org.junit.Before;
import org.junit.Test;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
*/
public class AppenderTest {
private static final String NAME = AppenderTest.class.getName();
private Logger log = Logger.getLogger(NAME);
private Mongo mongo;
private DB db;
private String collection;
@Before
public void setup() {
try {
mongo = new Mongo("localhost", 27017);
db = mongo.getDB("logs");
Calendar now = Calendar.getInstance();
collection = String.valueOf(now.get(Calendar.YEAR)) + String.format("%1$02d", now.get(Calendar.MONTH) + 1);
db.getCollection(collection).drop();
} catch (UnknownHostException e) {
throw new RuntimeException(e.getMessage(), e);
}
}
@Test
public void testLogging() {
log.debug("DEBUG message");
log.info("INFO message");
log.warn("WARN message");
log.error("ERROR message");
DBCursor msgs = db.getCollection(collection).find();
assertThat(msgs.count(), is(4));
}
@Test
public void testProperties() {
MDC.put("property", "one");
log.debug("DEBUG message");
}
}

View File

@@ -0,0 +1,77 @@
/*
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.log4j;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.util.Calendar;
import org.apache.log4j.Logger;
import org.apache.log4j.MDC;
import org.junit.Before;
import org.junit.Test;
import com.mongodb.DB;
import com.mongodb.DBCursor;
import com.mongodb.Mongo;
/**
* Integration tests for {@link MongoLog4jAppender}.
*
* @author Jon Brisbin
* @author Oliver Gierke
*/
public class MongoLog4jAppenderIntegrationTests {
static final String NAME = MongoLog4jAppenderIntegrationTests.class.getName();
Logger log = Logger.getLogger(NAME);
Mongo mongo;
DB db;
String collection;
@Before
public void setUp() throws Exception {
mongo = new Mongo("localhost", 27017);
db = mongo.getDB("logs");
Calendar now = Calendar.getInstance();
collection = String.valueOf(now.get(Calendar.YEAR)) + String.format("%1$02d", now.get(Calendar.MONTH) + 1);
db.getCollection(collection).drop();
}
@Test
public void testLogging() {
log.debug("DEBUG message");
log.info("INFO message");
log.warn("WARN message");
log.error("ERROR message");
DBCursor msgs = db.getCollection(collection).find();
assertThat(msgs.count(), is(4));
}
@Test
public void testProperties() {
MDC.put("property", "one");
log.debug("DEBUG message");
}
}

View File

@@ -0,0 +1,34 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.log4j;
import org.junit.Test;
/**
* Unit tests for {@link MongoLog4jAppender}.
*
* @author Oliver Gierke
*/
public class MongoLog4jAppenderUnitTests {
/**
* @see DATAMONGO-641
*/
@Test
public void closesWithoutMongoInstancePresent() {
new MongoLog4jAppender().close();
}
}

View File

@@ -10,11 +10,4 @@ log4j.appender.stdout.collectionPattern = %X{year}%X{month}
log4j.appender.stdout.applicationId = my.application
log4j.appender.stdout.warnOrHigherWriteConcern = FSYNC_SAFE
log4j.category.org.apache.activemq=ERROR
log4j.category.org.springframework.batch=DEBUG
log4j.category.org.springframework.data.document.mongodb=DEBUG
log4j.category.org.springframework.transaction=INFO
log4j.category.org.hibernate.SQL=DEBUG
# for debugging datasource initialization
# log4j.category.test.jdbc=DEBUG
log4j.category.org.springframework.data.mongodb=DEBUG

View File

@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.2.0.RELEASE</version>
<version>1.2.5.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -0,0 +1,72 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.mongodb.core.MongoActionOperation;
import org.springframework.util.Assert;
import com.mongodb.WriteResult;
/**
* Mongo-specific {@link DataIntegrityViolationException}.
*
* @author Oliver Gierke
*/
public class MongoDataIntegrityViolationException extends DataIntegrityViolationException {
private static final long serialVersionUID = -186980521176764046L;
private final WriteResult writeResult;
private final MongoActionOperation actionOperation;
/**
* Creates a new {@link MongoDataIntegrityViolationException} using the given message and {@link WriteResult}.
*
* @param message the exception message
* @param writeResult the {@link WriteResult} that causes the exception, must not be {@literal null}.
* @param actionOperation the {@link MongoActionOperation} that caused the exception, must not be {@literal null}.
*/
public MongoDataIntegrityViolationException(String message, WriteResult writeResult,
MongoActionOperation actionOperation) {
super(message);
Assert.notNull(writeResult, "WriteResult must not be null!");
Assert.notNull(actionOperation, "MongoActionOperation must not be null!");
this.writeResult = writeResult;
this.actionOperation = actionOperation;
}
/**
* Returns the {@link WriteResult} that caused the exception.
*
* @return the writeResult
*/
public WriteResult getWriteResult() {
return writeResult;
}
/**
* Returns the {@link MongoActionOperation} in which the current exception occured.
*
* @return the actionOperation
*/
public MongoActionOperation getActionOperation() {
return actionOperation;
}
}

View File

@@ -134,7 +134,6 @@ public abstract class AbstractMongoConfiguration {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(getInitialEntitySet());
mappingContext.setSimpleTypeHolder(customConversions().getSimpleTypeHolder());
mappingContext.initialize();
return mappingContext;
}

View File

@@ -118,6 +118,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
.genericBeanDefinition(MongoPersistentEntityIndexCreator.class);
indexHelperBuilder.addConstructorArgReference(ctxRef);
indexHelperBuilder.addConstructorArgReference(dbFactoryRef);
indexHelperBuilder.addDependsOn(ctxRef);
parserContext.registerBeanComponent(new BeanComponentDefinition(indexHelperBuilder.getBeanDefinition(),
INDEX_HELPER));

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -31,6 +31,7 @@ import com.mongodb.ServerAddress;
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class ServerAddressPropertyEditor extends PropertyEditorSupport {
@@ -43,6 +44,11 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
@Override
public void setAsText(String replicaSetString) {
if (!StringUtils.hasText(replicaSetString)) {
setValue(null);
return;
}
String[] replicaSetStringArray = StringUtils.commaDelimitedListToStringArray(replicaSetString);
Set<ServerAddress> serverAddresses = new HashSet<ServerAddress>(replicaSetStringArray.length);

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,14 +26,13 @@ import com.mongodb.DB;
import com.mongodb.Mongo;
/**
* Helper class featuring helper methods for internal MongoDb classes.
* <p/>
* <p>
* Mainly intended for internal use within the framework.
* Helper class featuring helper methods for internal MongoDb classes. Mainly intended for internal use within the
* framework.
*
* @author Thomas Risberg
* @author Graeme Rocher
* @author Oliver Gierke
* @author Randy Watler
* @since 1.0
*/
public abstract class MongoDbUtils {
@@ -131,8 +130,11 @@ public abstract class MongoDbUtils {
holderToUse.addDB(databaseName, db);
}
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(holderToUse, mongo));
holderToUse.setSynchronizedWithTransaction(true);
// synchronize holder only if not yet synchronized
if (!holderToUse.isSynchronizedWithTransaction()) {
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(holderToUse, mongo));
holderToUse.setSynchronizedWithTransaction(true);
}
if (holderToUse != dbHolder) {
TransactionSynchronizationManager.bindResource(mongo, holderToUse);

View File

@@ -15,7 +15,9 @@
*/
package org.springframework.data.mongodb.core;
import java.util.Arrays;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
import org.springframework.beans.factory.DisposableBean;
@@ -24,6 +26,7 @@ import org.springframework.beans.factory.InitializingBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoOptions;
@@ -36,6 +39,7 @@ import com.mongodb.WriteConcern;
* @author Thomas Risberg
* @author Graeme Rocher
* @author Oliver Gierke
* @author Thomas Darimont
* @since 1.0
*/
public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, DisposableBean,
@@ -57,11 +61,38 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
}
public void setReplicaSetSeeds(ServerAddress[] replicaSetSeeds) {
this.replicaSetSeeds = Arrays.asList(replicaSetSeeds);
this.replicaSetSeeds = filterNonNullElementsAsList(replicaSetSeeds);
}
/**
* @deprecated use {@link #setReplicaSetSeeds(ServerAddress[])} instead
*
* @param replicaPair
*/
@Deprecated
public void setReplicaPair(ServerAddress[] replicaPair) {
this.replicaPair = Arrays.asList(replicaPair);
this.replicaPair = filterNonNullElementsAsList(replicaPair);
}
/**
* @param elements the elements to filter, may be {@literal null}
* @return a new unmodifiable {@link List} containing the given elements without {@literal null} values
*/
private <T> List<T> filterNonNullElementsAsList(T[] elements) {
if (elements == null) {
return Collections.emptyList();
}
List<T> candidateElements = new ArrayList<T>();
for (T element : elements) {
if (element != null) {
candidateElements.add(element);
}
}
return Collections.unmodifiableList(candidateElements);
}
public void setHost(String host) {
@@ -126,15 +157,15 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
mongoOptions = new MongoOptions();
}
if (replicaPair != null) {
if (!isNullOrEmpty(replicaPair)) {
if (replicaPair.size() < 2) {
throw new CannotGetMongoDbConnectionException("A replica pair must have two server entries");
}
mongo = new Mongo(replicaPair.get(0), replicaPair.get(1), mongoOptions);
} else if (replicaSetSeeds != null) {
} else if (!isNullOrEmpty(replicaSetSeeds)) {
mongo = new Mongo(replicaSetSeeds, mongoOptions);
} else {
String mongoHost = host != null ? host : defaultOptions.getHost();
String mongoHost = StringUtils.hasText(host) ? host : defaultOptions.getHost();
mongo = port != null ? new Mongo(new ServerAddress(mongoHost, port), mongoOptions) : new Mongo(mongoHost,
mongoOptions);
}
@@ -146,6 +177,10 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
this.mongo = mongo;
}
private boolean isNullOrEmpty(Collection<?> elements) {
return elements == null || elements.isEmpty();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.DisposableBean#destroy()
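A rough usage sketch of the more lenient seed handling, assuming the 2.x Java driver (which does not require a reachable server merely to construct the client); per the change above, the null entry is filtered by the setter instead of being handed to the driver:

import org.springframework.data.mongodb.core.MongoFactoryBean;

import com.mongodb.Mongo;
import com.mongodb.ServerAddress;

public class MongoFactoryBeanSketch {

    public static void main(String[] args) throws Exception {

        MongoFactoryBean factory = new MongoFactoryBean();

        // the null element (e.g. from a conditionally built seed list) is silently dropped
        factory.setReplicaSetSeeds(new ServerAddress[] { new ServerAddress("localhost", 27017), null });

        factory.afterPropertiesSet();     // builds the Mongo instance from the non-null seeds only
        Mongo mongo = factory.getObject();
        System.out.println(mongo.getAllAddress());

        factory.destroy();                // closes the underlying Mongo instance
    }
}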

View File

@@ -247,7 +247,7 @@ public interface MongoOperations {
* Query for a list of objects of type T from the collection used by the entity class.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
@@ -261,7 +261,7 @@ public interface MongoOperations {
* Query for a list of objects of type T from the specified collection.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
@@ -382,7 +382,7 @@ public interface MongoOperations {
* specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
@@ -399,7 +399,7 @@ public interface MongoOperations {
* type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
@@ -417,7 +417,7 @@ public interface MongoOperations {
* Map the results of an ad-hoc query on the collection for the entity class to a List of the specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
@@ -433,7 +433,7 @@ public interface MongoOperations {
* Map the results of an ad-hoc query on the specified collection to a List of the specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
@@ -501,7 +501,7 @@ public interface MongoOperations {
* type. The first document that matches the query is returned and also removed from the collection in the database.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
@@ -555,7 +555,7 @@ public interface MongoOperations {
* Insert the object into the specified collection.
* <p/>
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* Insert is used to initially store the object into the database. To update an existing object use the save method.
*
@@ -593,7 +593,7 @@ public interface MongoOperations {
* object is not already present, that is an 'upsert'.
* <p/>
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* If your object has an 'Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
@@ -610,7 +610,7 @@ public interface MongoOperations {
* is an 'upsert'.
* <p/>
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* If your object has an 'Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
@@ -734,4 +734,4 @@ public interface MongoOperations {
* @return
*/
MongoConverter getConverter();
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2012 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -44,7 +44,6 @@ import org.springframework.core.convert.ConversionService;
import org.springframework.core.io.Resource;
import org.springframework.core.io.ResourceLoader;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.OptimisticLockingFailureException;
import org.springframework.data.authentication.UserCredentials;
@@ -53,6 +52,7 @@ import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.BeanWrapper;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.MongoDataIntegrityViolationException;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
@@ -534,8 +534,26 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
mongoConverter, entityClass), near.getMetric());
List<GeoResult<T>> result = new ArrayList<GeoResult<T>>(results.size());
int index = 0;
int elementsToSkip = near.getSkip() != null ? near.getSkip() : 0;
for (Object element : results) {
result.add(callback.doWith((DBObject) element));
/*
* As MongoDB currently (2.4.4) doesn't support the skipping of elements in near queries
* we skip the elements ourselves to at least avoid the document-to-object mapping overhead.
*
* @see https://jira.mongodb.org/browse/SERVER-3925
*/
if (index >= elementsToSkip) {
result.add(callback.doWith((DBObject) element));
}
index++;
}
if (elementsToSkip > 0) {
// as we skipped some elements we have to calculate the averageDistance ourselves:
return new GeoResults<T>(result, near.getMetric());
}
DBObject stats = (DBObject) commandResult.get("stats");
@@ -666,8 +684,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoPersistentEntity<?> mongoPersistentEntity = getPersistentEntity(entity.getClass());
if (mongoPersistentEntity == null || mongoPersistentEntity.hasVersionProperty()) {
BeanWrapper<PersistentEntity<Object, ?>, Object> wrapper = BeanWrapper.create(entity, null);
if (mongoPersistentEntity != null && mongoPersistentEntity.hasVersionProperty()) {
BeanWrapper<PersistentEntity<Object, ?>, Object> wrapper = BeanWrapper.create(entity,
this.mongoConverter.getConversionService());
wrapper.setProperty(mongoPersistentEntity.getVersionProperty(), 0);
}
}
@@ -770,8 +789,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
// Fresh instance -> initialize version property
if (version == null) {
beanWrapper.setProperty(versionProperty, 0);
doSave(collectionName, objectToSave, this.mongoConverter);
doInsert(collectionName, objectToSave, this.mongoConverter);
} else {
assertUpdateableIdIfNotSet(objectToSave);
@@ -792,12 +810,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
maybeEmitEvent(new BeforeSaveEvent<T>(objectToSave, dbObject));
Update update = Update.fromDBObject(dbObject, ID_FIELD);
updateFirst(query, update, objectToSave.getClass());
doUpdate(collectionName, query, update, objectToSave.getClass(), false, false);
maybeEmitEvent(new AfterSaveEvent<T>(objectToSave, dbObject));
}
}
@SuppressWarnings("unchecked")
protected <T> void doSave(String collectionName, T objectToSave, MongoWriter<T> writer) {
assertUpdateableIdIfNotSet(objectToSave);
@@ -810,7 +827,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
writer.write(objectToSave, dbDoc);
} else {
try {
objectToSave = (T) JSON.parse((String) objectToSave);
dbDoc = (DBObject) JSON.parse((String) objectToSave);
} catch (JSONParseException e) {
throw new MappingException("Could not parse given String to save into a JSON document!", e);
}
@@ -941,7 +958,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
if (entity != null && entity.hasVersionProperty() && !multi) {
if (writeResult.getN() == 0) {
throw new OptimisticLockingFailureException("Optimistic lock exception on saving entity: "
+ updateObj.toMap().toString());
+ updateObj.toMap().toString() + " to collection " + collectionName);
}
}
@@ -981,11 +998,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
Assert.notNull(object);
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(object.getClass());
Class<?> objectType = object.getClass();
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(objectType);
MongoPersistentProperty idProp = entity == null ? null : entity.getIdProperty();
if (idProp == null) {
throw new MappingException("No id property found for object of type " + entity.getType().getName());
throw new MappingException("No id property found for object of type " + objectType);
}
ConversionService service = mongoConverter.getConversionService();
@@ -1306,8 +1324,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Map the results of an ad-hoc query on the default MongoDB collection to a List of the specified type. The object is
* converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless configured
* otherwise, an instance of SimpleMongoConverter will be used. The query document is specified as a standard DBObject
* and so is the fields specification. Can be overridden by subclasses.
* otherwise, an instance of MappingMongoConverter will be used. The query document is specified as a standard
* DBObject and so is the fields specification. Can be overridden by subclasses.
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query document that specifies the criteria used to find a record
@@ -1408,19 +1426,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
DBObject updateObj = update.getUpdateObject();
for (String key : updateObj.keySet()) {
updateObj.put(key, mongoConverter.convertToMongoType(updateObj.get(key)));
}
DBObject mappedUpdate = mapper.getMappedObject(update.getUpdateObject(), entity);
DBObject mappedQuery = mapper.getMappedObject(query, entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("findAndModify using query: " + mappedQuery + " fields: " + fields + " sort: " + sort
+ " for class: " + entityClass + " and update: " + updateObj + " in collection: " + collectionName);
+ " for class: " + entityClass + " and update: " + mappedUpdate + " in collection: " + collectionName);
}
return executeFindOneInternal(new FindAndModifyCallback(mappedQuery, fields, sort, updateObj, options),
return executeFindOneInternal(new FindAndModifyCallback(mappedQuery, fields, sort, mappedUpdate, options),
new ReadDbObjectCallback<T>(readerToUse, entityClass), collectionName);
}
@@ -1647,7 +1661,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
if (writeResultChecking == WriteResultChecking.EXCEPTION) {
throw new DataIntegrityViolationException(message);
throw new MongoDataIntegrityViolationException(message, writeResult, operation);
} else {
LOGGER.error(message);
return;
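The client-side skipping can be shown in isolation with the framework's geo types; the skip(...) helper below is illustrative and mirrors the loop above, it is not MongoTemplate's actual method:

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.geo.Metrics;

public class GeoNearSkipSketch {

    // because geoNear has no server-side skip (SERVER-3925), the first 'elementsToSkip' hits are
    // dropped on the client and the average distance is recomputed over the remaining results
    static <T> GeoResults<T> skip(List<GeoResult<T>> hits, int elementsToSkip) {

        List<GeoResult<T>> remaining = new ArrayList<GeoResult<T>>();

        int index = 0;
        for (GeoResult<T> hit : hits) {
            if (index++ >= elementsToSkip) {
                remaining.add(hit);
            }
        }

        // this constructor derives the average distance from the given results
        return new GeoResults<T>(remaining, Metrics.KILOMETERS);
    }

    public static void main(String[] args) {

        List<GeoResult<String>> hits = Arrays.asList(
                new GeoResult<String>("a", new Distance(1, Metrics.KILOMETERS)),
                new GeoResult<String>("b", new Distance(2, Metrics.KILOMETERS)),
                new GeoResult<String>("c", new Distance(3, Metrics.KILOMETERS)));

        System.out.println(skip(hits, 1).getAverageDistance()); // average of 2 and 3
    }
}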

View File

@@ -17,9 +17,11 @@ package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Locale;
import java.util.Map;
import java.util.Set;
import org.slf4j.Logger;
@@ -63,6 +65,7 @@ public class CustomConversions {
private final Set<ConvertiblePair> writingPairs;
private final Set<Class<?>> customSimpleTypes;
private final SimpleTypeHolder simpleTypeHolder;
private final Map<Class<?>, HashMap<Class<?>, CacheValue>> cache;
private final List<Object> converters;
@@ -85,6 +88,7 @@ public class CustomConversions {
this.readingPairs = new HashSet<ConvertiblePair>();
this.writingPairs = new HashSet<ConvertiblePair>();
this.customSimpleTypes = new HashSet<Class<?>>();
this.cache = new HashMap<Class<?>, HashMap<Class<?>, CacheValue>>();
this.converters = new ArrayList<Object>();
this.converters.add(CustomToStringConverter.INSTANCE);
@@ -268,9 +272,11 @@ public class CustomConversions {
* @return
*/
public boolean hasCustomReadTarget(Class<?> source, Class<?> expectedTargetType) {
Assert.notNull(source);
Assert.notNull(expectedTargetType);
return getCustomTarget(source, expectedTargetType, readingPairs) != null;
return getCustomReadTarget(source, expectedTargetType) != null;
}
/**
@@ -299,8 +305,32 @@ public class CustomConversions {
return null;
}
private Class<?> getCustomReadTarget(Class<?> source, Class<?> expectedTargetType) {
Class<?> type = expectedTargetType == null ? PlaceholderType.class : expectedTargetType;
Map<Class<?>, CacheValue> map;
CacheValue toReturn;
if ((map = cache.get(source)) == null || (toReturn = map.get(type)) == null) {
Class<?> target = getCustomTarget(source, type, readingPairs);
if (cache.get(source) == null) {
cache.put(source, new HashMap<Class<?>, CacheValue>());
}
Map<Class<?>, CacheValue> value = cache.get(source);
toReturn = target == null ? CacheValue.NULL : new CacheValue(target);
value.put(type, toReturn);
}
return toReturn.clazz;
}
@WritingConverter
private enum CustomToStringConverter implements GenericConverter {
INSTANCE;
public Set<ConvertiblePair> getConvertibleTypes() {
@@ -313,4 +343,30 @@ public class CustomConversions {
return source.toString();
}
}
/**
* Placeholder type to allow registering not-found values in the converter cache.
*
* @author Patryk Wasik
* @author Oliver Gierke
*/
private static class PlaceholderType {
}
/**
* Wrapper to safely store {@literal null} values in the type cache.
*
* @author Patryk Wasik
* @author Oliver Gierke
*/
private static class CacheValue {
public static final CacheValue NULL = new CacheValue(null);
private final Class<?> clazz;
public CacheValue(Class<?> clazz) {
this.clazz = clazz;
}
}
}
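The caching idea generalizes beyond CustomConversions; the sketch below shows the same two-level map with a NULL sentinel so that "nothing found" is only computed once, using made-up names (lookupCached, expensiveLookup) instead of the framework's private API:

import java.util.HashMap;
import java.util.Map;

public class NullSentinelCacheSketch {

    // wrapper that allows a legitimate null result to be cached
    static final class CacheValue {
        static final CacheValue NULL = new CacheValue(null);
        final Class<?> clazz;
        CacheValue(Class<?> clazz) { this.clazz = clazz; }
    }

    private final Map<Class<?>, Map<Class<?>, CacheValue>> cache =
            new HashMap<Class<?>, Map<Class<?>, CacheValue>>();

    Class<?> lookupCached(Class<?> source, Class<?> target) {

        Map<Class<?>, CacheValue> perSource = cache.get(source);
        if (perSource == null) {
            perSource = new HashMap<Class<?>, CacheValue>();
            cache.put(source, perSource);
        }

        CacheValue value = perSource.get(target);
        if (value == null) {
            Class<?> resolved = expensiveLookup(source, target); // may legitimately return null
            value = resolved == null ? CacheValue.NULL : new CacheValue(resolved);
            perSource.put(target, value);
        }

        return value.clazz; // null for cached misses, without re-running the lookup
    }

    Class<?> expensiveLookup(Class<?> source, Class<?> target) {
        return null; // placeholder for scanning the registered converter pairs
    }

    public static void main(String[] args) {
        NullSentinelCacheSketch cache = new NullSentinelCacheSketch();
        System.out.println(cache.lookupCached(String.class, Long.class)); // computed once
        System.out.println(cache.lookupCached(String.class, Long.class)); // served from the cache
    }
}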

View File

@@ -238,10 +238,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
public void doWithPersistentProperty(MongoPersistentProperty prop) {
boolean isConstructorProperty = entity.isConstructorArgument(prop);
boolean hasValueForProperty = dbo.containsField(prop.getFieldName());
if (!hasValueForProperty || isConstructorProperty) {
if (!dbo.containsField(prop.getFieldName()) || entity.isConstructorArgument(prop)) {
return;
}
@@ -832,7 +829,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return conversionService.convert(obj, target);
}
if (null != obj && conversions.isSimpleType(obj.getClass())) {
if (conversions.isSimpleType(obj.getClass())) {
// Doesn't need conversion
return getPotentiallyConvertedSimpleWrite(obj);
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Set;
import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionException;
@@ -41,6 +42,7 @@ import com.mongodb.DBRef;
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Patryk Wasik
*/
public class QueryMapper {
@@ -83,11 +85,22 @@ public class QueryMapper {
for (String key : query.keySet()) {
MongoPersistentProperty targetProperty = getTargetProperty(key, entity);
String newKey = determineKey(key, entity);
Object value = query.get(key);
if (Keyword.isKeyword(key)) {
result.putAll(getMappedKeyword(new Keyword(query, key), entity));
continue;
}
result.put(newKey, getMappedValue(value, targetProperty, newKey));
Field field = entity == null ? new Field(key) : new MetadataBackedField(key, entity, mappingContext);
Object rawValue = query.get(key);
String newKey = field.getMappedKey();
if (Keyword.isKeyword(rawValue) && !field.isIdField()) {
Keyword keyword = new Keyword((DBObject) rawValue);
result.put(newKey, getMappedKeyword(field, keyword));
} else {
result.put(newKey, getMappedValue(field, query.get(key)));
}
}
return result;
@@ -103,13 +116,14 @@ public class QueryMapper {
private DBObject getMappedKeyword(Keyword query, MongoPersistentEntity<?> entity) {
// $or/$nor
if (query.key.matches(N_OR_PATTERN)) {
if (query.key.matches(N_OR_PATTERN) || query.value instanceof Iterable) {
Iterable<?> conditions = (Iterable<?>) query.value;
BasicDBList newConditions = new BasicDBList();
for (Object condition : conditions) {
newConditions.add(getMappedObject((DBObject) condition, entity));
newConditions.add(condition instanceof DBObject ? getMappedObject((DBObject) condition, entity)
: convertSimpleOrDBObject(condition, entity));
}
return new BasicDBObject(query.key, newConditions);
@@ -121,38 +135,34 @@ public class QueryMapper {
/**
* Returns the mapped keyword, which is considered to define a criterion for the given property.
*
* @param keyword
* @param property
* @param keyword
* @return
*/
public DBObject getMappedKeyword(Keyword keyword, MongoPersistentProperty property) {
private DBObject getMappedKeyword(Field property, Keyword keyword) {
if (property.isAssociation()) {
convertAssociation(keyword.value, property);
}
boolean needsAssociationConversion = property.isAssociation() && !keyword.isExists();
Object value = needsAssociationConversion ? convertAssociation(keyword.value, property.getProperty())
: getMappedValue(property.with(keyword.key), keyword.value);
return new BasicDBObject(keyword.key, getMappedValue(keyword.value, property, keyword.key));
return new BasicDBObject(keyword.key, value);
}
/**
* Returns the mapped value for the given source object assuming it's a value for the given
* {@link MongoPersistentProperty}.
*
* @param source the source object to be mapped
* @param value the source object to be mapped
* @param property the property the value is a value for
* @param newKey the key the value will be bound to eventually
* @return
*/
private Object getMappedValue(Object source, MongoPersistentProperty property, String newKey) {
private Object getMappedValue(Field documentField, Object value) {
if (property == null) {
return convertSimpleOrDBObject(source, null);
}
if (documentField.isIdField()) {
if (property.isIdProperty() || "_id".equals(newKey)) {
if (source instanceof DBObject) {
DBObject valueDbo = (DBObject) source;
if (value instanceof DBObject) {
DBObject valueDbo = (DBObject) value;
if (valueDbo.containsField("$in") || valueDbo.containsField("$nin")) {
String inKey = valueDbo.containsField("$in") ? "$in" : "$nin";
List<Object> ids = new ArrayList<Object>();
@@ -163,63 +173,25 @@ public class QueryMapper {
} else if (valueDbo.containsField("$ne")) {
valueDbo.put("$ne", convertId(valueDbo.get("$ne")));
} else {
return getMappedObject((DBObject) source, null);
return getMappedObject((DBObject) value, null);
}
return valueDbo;
} else {
return convertId(source);
return convertId(value);
}
}
if (property.isAssociation()) {
return Keyword.isKeyword(source) ? getMappedKeyword(new Keyword(source), property) : convertAssociation(source,
property);
if (Keyword.isKeyword(value)) {
return getMappedKeyword(new Keyword((DBObject) value), null);
}
return convertSimpleOrDBObject(source, mappingContext.getPersistentEntity(property));
}
private MongoPersistentProperty getTargetProperty(String key, MongoPersistentEntity<?> entity) {
if (isIdKey(key, entity)) {
return entity.getIdProperty();
if (documentField.isAssociation()) {
return convertAssociation(value, documentField.getProperty());
}
PersistentPropertyPath<MongoPersistentProperty> path = getPath(key, entity);
return path == null ? null : path.getLeafProperty();
}
private PersistentPropertyPath<MongoPersistentProperty> getPath(String key, MongoPersistentEntity<?> entity) {
if (entity == null) {
return null;
}
try {
PropertyPath path = PropertyPath.from(key, entity.getTypeInformation());
return mappingContext.getPersistentPropertyPath(path);
} catch (PropertyReferenceException e) {
return null;
}
}
/**
* Returns the translated key assuming the given one is a property (path) reference.
*
* @param key the source key
* @param entity the base entity
* @return the translated key
*/
private String determineKey(String key, MongoPersistentEntity<?> entity) {
if (entity == null && DEFAULT_ID_NAMES.contains(key)) {
return "_id";
}
PersistentPropertyPath<MongoPersistentProperty> path = getPath(key, entity);
return path == null ? key : path.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE);
return convertSimpleOrDBObject(value, documentField.getPropertyEntity());
}
/**
@@ -243,7 +215,7 @@ public class QueryMapper {
}
/**
* Converts the given source assuming it's actually an association to anoter object.
* Converts the given source assuming it's actually an association to another object.
*
* @param source
* @param property
@@ -263,31 +235,19 @@ public class QueryMapper {
return result;
}
if (property.isMap()) {
BasicDBObject result = new BasicDBObject();
DBObject dbObject = (DBObject) source;
for (String key : dbObject.keySet()) {
Object o = dbObject.get(key);
result.put(key, o instanceof DBRef ? o : converter.toDBRef(o, property));
}
return result;
}
return source == null || source instanceof DBRef ? source : converter.toDBRef(source, property);
}
/**
* Returns whether the given key will be considered an id key.
*
* @param key
* @param entity
* @return
*/
private boolean isIdKey(String key, MongoPersistentEntity<?> entity) {
if (entity == null) {
return false;
}
MongoPersistentProperty idProperty = entity.getIdProperty();
if (idProperty != null) {
return idProperty.getName().equals(key) || idProperty.getFieldName().equals(key);
}
return DEFAULT_ID_NAMES.contains(key);
}
/**
* Converts the given raw id value into either {@link ObjectId} or {@link String}.
*
@@ -315,16 +275,27 @@ public class QueryMapper {
String key;
Object value;
Keyword(Object source) {
public Keyword(DBObject source, String key) {
this.key = key;
this.value = source.get(key);
}
Assert.isInstanceOf(DBObject.class, source);
public Keyword(DBObject dbObject) {
DBObject value = (DBObject) source;
Set<String> keys = dbObject.keySet();
Assert.isTrue(keys.size() == 1, "Can only use a single value DBObject!");
Assert.isTrue(value.keySet().size() == 1, "Keyword must have a single key only!");
this.key = keys.iterator().next();
this.value = dbObject.get(key);
}
this.key = value.keySet().iterator().next();
this.value = value.get(key);
/**
* Returns whether the current keyword is the {@code $exists} keyword.
*
* @return
*/
public boolean isExists() {
return "$exists".equalsIgnoreCase(key);
}
/**
@@ -334,7 +305,11 @@ public class QueryMapper {
* @param value
* @return
*/
static boolean isKeyword(Object value) {
public static boolean isKeyword(Object value) {
if (value instanceof String) {
return ((String) value).startsWith("$");
}
if (!(value instanceof DBObject)) {
return false;
@@ -344,4 +319,192 @@ public class QueryMapper {
return dbObject.keySet().size() == 1 && dbObject.keySet().iterator().next().startsWith("$");
}
}
/**
* Value object to represent a field and its meta-information.
*
* @author Oliver Gierke
*/
private static class Field {
private static final String ID_KEY = "_id";
protected final String name;
/**
* Creates a new {@link Field} without meta-information but the given name.
*
* @param name must not be {@literal null} or empty.
*/
public Field(String name) {
Assert.hasText(name, "Name must not be null!");
this.name = name;
}
/**
* Returns a new {@link Field} with the given name.
*
* @param name must not be {@literal null} or empty.
* @return
*/
public Field with(String name) {
return new Field(name);
}
/**
* Returns whether the current field is the id field.
*
* @return
*/
public boolean isIdField() {
return ID_KEY.equals(name);
}
/**
* Returns the underlying {@link MongoPersistentProperty} backing the field.
*
* @return
*/
public MongoPersistentProperty getProperty() {
return null;
}
/**
* Returns the {@link MongoPersistentEntity} the field is contained in.
*
* @return
*/
public MongoPersistentEntity<?> getPropertyEntity() {
return null;
}
/**
* Returns whether the field represents an association.
*
* @return
*/
public boolean isAssociation() {
return false;
}
/**
* Returns the key to be used in the mapped document eventually.
*
* @return
*/
public String getMappedKey() {
return isIdField() ? ID_KEY : name;
}
}
/**
* Extension of {@link Field} to be backed with mapping metadata.
*
* @author Oliver Gierke
*/
private static class MetadataBackedField extends Field {
private final MongoPersistentEntity<?> entity;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final MongoPersistentProperty property;
/**
* Creates a new {@link MetadataBackedField} with the given name, {@link MongoPersistentEntity} and
* {@link MappingContext}.
*
* @param name must not be {@literal null} or empty.
* @param entity must not be {@literal null}.
* @param context must not be {@literal null}.
*/
public MetadataBackedField(String name, MongoPersistentEntity<?> entity,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context) {
super(name);
Assert.notNull(entity, "MongoPersistentEntity must not be null!");
this.entity = entity;
this.mappingContext = context;
PersistentPropertyPath<MongoPersistentProperty> path = getPath(name);
this.property = path == null ? null : path.getLeafProperty();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.Field#with(java.lang.String)
*/
@Override
public MetadataBackedField with(String name) {
return new MetadataBackedField(name, entity, mappingContext);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.Field#isIdKey()
*/
@Override
public boolean isIdField() {
MongoPersistentProperty idProperty = entity.getIdProperty();
if (idProperty != null) {
return idProperty.getName().equals(name) || idProperty.getFieldName().equals(name);
}
return DEFAULT_ID_NAMES.contains(name);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.Field#getProperty()
*/
@Override
public MongoPersistentProperty getProperty() {
return property;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.Field#getEntity()
*/
@Override
public MongoPersistentEntity<?> getPropertyEntity() {
MongoPersistentProperty property = getProperty();
return property == null ? null : mappingContext.getPersistentEntity(property);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.Field#isAssociation()
*/
@Override
public boolean isAssociation() {
MongoPersistentProperty property = getProperty();
return property == null ? false : property.isAssociation();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.Field#getTargetKey()
*/
@Override
public String getMappedKey() {
PersistentPropertyPath<MongoPersistentProperty> path = getPath(name);
return path == null ? name : path.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE);
}
private PersistentPropertyPath<MongoPersistentProperty> getPath(String name) {
try {
PropertyPath path = PropertyPath.from(name, entity.getTypeInformation());
return mappingContext.getPersistentPropertyPath(path);
} catch (PropertyReferenceException e) {
return null;
}
}
}
}
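As a small illustration of the keyword handling the mapper relies on, the helper below mirrors the isKeyword(...) rule shown above outside the framework: a value counts as a keyword if it is a String starting with '$' or a single-key DBObject whose key starts with '$'.

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

public class KeywordSketch {

    static boolean isKeyword(Object value) {
        if (value instanceof String) {
            return ((String) value).startsWith("$");
        }
        if (!(value instanceof DBObject)) {
            return false;
        }
        DBObject dbObject = (DBObject) value;
        return dbObject.keySet().size() == 1 && dbObject.keySet().iterator().next().startsWith("$");
    }

    public static void main(String[] args) {
        System.out.println(isKeyword("$exists"));                        // true
        System.out.println(isKeyword(new BasicDBObject("$ne", "foo")));  // true
        System.out.println(isKeyword(new BasicDBObject("name", "foo"))); // false
    }
}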

View File

@@ -0,0 +1,5 @@
/**
* Spring Data MongoDB specific converter infrastructure.
*/
package org.springframework.data.mongodb.core.convert;

View File

@@ -131,7 +131,7 @@ public class GeoResults<T> implements Iterable<GeoResult<T>> {
private static Distance calculateAverageDistance(List<? extends GeoResult<?>> results, Metric metric) {
if (results.isEmpty()) {
return new Distance(0, null);
return new Distance(0, metric);
}
double averageDistance = 0;

View File

@@ -0,0 +1,5 @@
/**
* Support for MongoDB geo-spatial queries.
*/
package org.springframework.data.mongodb.core.geo;

View File

@@ -0,0 +1,5 @@
/**
* Support for MongoDB document indexing.
*/
package org.springframework.data.mongodb.core.index;

View File

@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.core.mapping;
import java.beans.PropertyDescriptor;
import java.lang.reflect.Field;
import java.util.AbstractMap;
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
@@ -51,7 +52,7 @@ public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersis
*/
@Override
protected boolean shouldCreatePersistentEntityFor(TypeInformation<?> type) {
return !MongoSimpleTypes.HOLDER.isSimpleType(type.getType());
return !MongoSimpleTypes.HOLDER.isSimpleType(type.getType()) && !AbstractMap.class.isAssignableFrom(type.getType());
}
/*

View File

@@ -0,0 +1,5 @@
/**
* Mapping event callback infrastructure for the MongoDB document-to-object mapping subsystem.
*/
package org.springframework.data.mongodb.core.mapping.event;

View File

@@ -0,0 +1,5 @@
/**
* Infrastructure for the MongoDB document-to-object mapping subsystem.
*/
package org.springframework.data.mongodb.core.mapping;

View File

@@ -0,0 +1,5 @@
/**
* Support for MongoDB map-reduce operations.
*/
package org.springframework.data.mongodb.core.mapreduce;

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -39,6 +39,10 @@ import com.mongodb.DBObject;
/**
* Central class for creating queries. It follows a fluent API style so that you can easily chain together multiple
* criteria. Static import of the 'Criteria.where' method will improve readability.
*
* @author Thomas Risberg
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class Criteria implements CriteriaDefinition {
@@ -396,34 +400,54 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates an 'or' criteria using the $or operator for all of the provided criteria
* <p>
* Note that MongoDB does not support wrapping an $or operator in a $not operator.
* <p>
*
* @throws IllegalArgumentException if {@link #orOperator(Criteria...)} follows a not() call directly.
* @param criteria
*/
public Criteria orOperator(Criteria... criteria) {
BasicDBList bsonList = createCriteriaList(criteria);
criteriaChain.add(new Criteria("$or").is(bsonList));
return this;
return registerCriteriaChainElement(new Criteria("$or").is(bsonList));
}
/**
* Creates a 'nor' criteria using the $nor operator for all of the provided criteria
* Creates a 'nor' criteria using the $nor operator for all of the provided criteria.
* <p>
* Note that MongoDB does not support wrapping a $nor operator in a $not operator.
* <p>
*
* @throws IllegalArgumentException if {@link #norOperator(Criteria...)} follows a not() call directly.
* @param criteria
*/
public Criteria norOperator(Criteria... criteria) {
BasicDBList bsonList = createCriteriaList(criteria);
criteriaChain.add(new Criteria("$nor").is(bsonList));
return this;
return registerCriteriaChainElement(new Criteria("$nor").is(bsonList));
}
/**
* Creates an 'and' criteria using the $and operator for all of the provided criteria
* Creates an 'and' criteria using the $and operator for all of the provided criteria.
* <p>
* Note that MongoDB does not support wrapping an $and operator in a $not operator.
* <p>
*
* @throws IllegalArgumentException if {@link #andOperator(Criteria...)} follows a not() call directly.
* @param criteria
*/
public Criteria andOperator(Criteria... criteria) {
BasicDBList bsonList = createCriteriaList(criteria);
criteriaChain.add(new Criteria("$and").is(bsonList));
return registerCriteriaChainElement(new Criteria("$and").is(bsonList));
}
private Criteria registerCriteriaChainElement(Criteria criteria) {
if (lastOperatorWasNot()) {
throw new IllegalArgumentException("operator $not is not allowed around criteria chain element: "
+ criteria.getCriteriaObject());
} else {
criteriaChain.add(criteria);
}
return this;
}
@@ -468,6 +492,7 @@ public class Criteria implements CriteriaDefinition {
}
}
}
DBObject queryCriteria = new BasicDBObject();
if (isValue != NOT_SET) {
queryCriteria.put(this.key, this.isValue);
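A usage sketch of the new guard: the first expression is still legal, while chaining orOperator(...) directly after not() is now rejected with an IllegalArgumentException.

import static org.springframework.data.mongodb.core.query.Criteria.where;

import org.springframework.data.mongodb.core.query.Criteria;

public class CriteriaNotGuardSketch {

    public static void main(String[] args) {

        // fine: $not applied to a plain operator
        Criteria valid = where("age").not().gt(18);
        System.out.println(valid.getCriteriaObject());

        try {
            // rejected since this change: $or directly under $not
            where("price").not().orOperator(where("price").gt(100), where("price").lt(10));
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}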

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,14 +18,15 @@ package org.springframework.data.mongodb.core.query;
import java.util.HashMap;
import java.util.Map;
import org.springframework.util.ObjectUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
public class Field {
private Map<String, Integer> criteria = new HashMap<String, Integer>();
private Map<String, Object> slices = new HashMap<String, Object>();
private final Map<String, Integer> criteria = new HashMap<String, Integer>();
private final Map<String, Object> slices = new HashMap<String, Object>();
public Field include(String key) {
criteria.put(key, Integer.valueOf(1));
@@ -50,11 +51,54 @@ public class Field {
public DBObject getFieldsObject() {
DBObject dbo = new BasicDBObject();
for (String k : criteria.keySet()) {
dbo.put(k, (criteria.get(k)));
dbo.put(k, criteria.get(k));
}
for (String k : slices.keySet()) {
dbo.put(k, new BasicDBObject("$slice", (slices.get(k))));
dbo.put(k, new BasicDBObject("$slice", slices.get(k)));
}
return dbo;
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object object) {
if (this == object) {
return true;
}
if (!(object instanceof Field)) {
return false;
}
Field that = (Field) object;
if (!this.criteria.equals(that.criteria)) {
return false;
}
if (!this.slices.equals(that.slices)) {
return false;
}
return true;
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += 31 * ObjectUtils.nullSafeHashCode(this.criteria);
result += 31 * ObjectUtils.nullSafeHashCode(this.slices);
return result;
}
}

View File

@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.core.query;
import org.springframework.data.domain.Pageable;
import org.springframework.data.mongodb.core.geo.CustomMetric;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Metric;
@@ -29,6 +30,7 @@ import com.mongodb.DBObject;
* Builder class to build near-queries.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class NearQuery {
@@ -38,6 +40,7 @@ public class NearQuery {
private Metric metric;
private boolean spherical;
private Integer num;
private Integer skip;
/**
* Creates a new {@link NearQuery}.
@@ -116,7 +119,7 @@ public class NearQuery {
}
/**
* Configures the number of results to return.
* Configures the maximum number of results to return.
*
* @param num
* @return
@@ -126,6 +129,29 @@ public class NearQuery {
return this;
}
/**
* Configures the number of results to skip.
*
* @param skip
* @return
*/
public NearQuery skip(int skip) {
this.skip = skip;
return this;
}
/**
* Configures the {@link Pageable} to use.
*
* @param pageable
* @return
*/
public NearQuery with(Pageable pageable) {
this.num = pageable.getOffset() + pageable.getPageSize();
this.skip = pageable.getOffset();
return this;
}
/**
* Sets the max distance results shall have from the configured origin. If a {@link Metric} was set before, the given
* value will be interpreted as being a value in that metric, e.g.
@@ -290,9 +316,18 @@ public class NearQuery {
*/
public NearQuery query(Query query) {
this.query = query;
this.skip = query.getSkip();
this.num = query.getLimit();
return this;
}
/**
* @return the number of elements to skip.
*/
public Integer getSkip() {
return skip;
}
/**
* Returns the {@link DBObject} built by the {@link NearQuery}.
*
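A usage sketch of the new pagination support, assuming the 1.2.x geo types and Spring Data Commons' PageRequest; page 1 with a page size of 10 translates into skip = 10 and num = 20, so the template can drop the first 10 hits client-side.

import org.springframework.data.domain.PageRequest;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Metrics;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.query.NearQuery;

public class NearQueryPagingSketch {

    public static void main(String[] args) {

        NearQuery query = NearQuery.near(new Point(48.13, 11.57))
                .maxDistance(new Distance(5, Metrics.KILOMETERS))
                .with(new PageRequest(1, 10)); // derives num and skip from the Pageable

        System.out.println(query.getSkip()); // 10
    }
}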

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2012 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -46,22 +46,33 @@ public class Query {
private String hint;
/**
* Static factory method to create a Query using the provided criteria
* Static factory method to create a {@link Query} using the provided {@link Criteria}.
*
* @param critera
* @param criteria must not be {@literal null}.
* @return
*/
public static Query query(Criteria critera) {
return new Query(critera);
public static Query query(Criteria criteria) {
return new Query(criteria);
}
public Query() {
}
/**
* Creates a new {@link Query} using the given {@link Criteria}.
*
* @param criteria must not be {@literal null}.
*/
public Query(Criteria criteria) {
addCriteria(criteria);
}
/**
* Adds the given {@link Criteria} to the current {@link Query}.
*
* @param criteria must not be {@literal null}.
* @return
*/
public Query addCriteria(Criteria criteria) {
CriteriaDefinition existing = this.criteria.get(criteria.getKey());
String key = criteria.getKey();
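For completeness, a minimal sketch of the two equivalent ways to build a query with an initial Criteria after this change:

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.query.Query;

public class QuerySketch {

    public static void main(String[] args) {

        Query viaFactory = query(where("name").is("Oliver"));
        Query viaConstructor = new Query(where("name").is("Oliver"));

        System.out.println(viaFactory.getQueryObject());     // { "name" : "Oliver" }
        System.out.println(viaConstructor.getQueryObject()); // same result
    }
}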

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2012 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -52,7 +52,10 @@ public class Update {
/**
* Creates an {@link Update} instance from the given {@link DBObject}. Allows one to explicitly exclude fields from making
* it into the created {@link Update} object.
* it into the created {@link Update} object. Note that this will set attributes directly and <em>not</em> use
* {@literal $set}. This means fields not given in the {@link DBObject} will be nulled when executing the update. To
* create an only-updating {@link Update} instance of a {@link DBObject}, call {@link #set(String, Object)} for each
* value in it.
*
* @param object the source {@link DBObject} to create the update from.
* @param exclude the fields to exclude.
@@ -69,7 +72,7 @@ public class Update {
continue;
}
update.set(key, object.get(key));
update.modifierOps.put(key, object.get(key));
}
return update;
@@ -160,7 +163,7 @@ public class Update {
* @return
*/
public Update pop(String key, Position pos) {
addMultiFieldOperation("$pop", key, (pos == Position.FIRST ? -1 : 1));
addMultiFieldOperation("$pop", key, pos == Position.FIRST ? -1 : 1);
return this;
}
@@ -214,22 +217,23 @@ public class Update {
return dbo;
}
@SuppressWarnings("unchecked")
protected void addMultiFieldOperation(String operator, String key, Object value) {
Object existingValue = this.modifierOps.get(operator);
LinkedHashMap<String, Object> keyValueMap;
DBObject keyValueMap;
if (existingValue == null) {
keyValueMap = new LinkedHashMap<String, Object>();
keyValueMap = new BasicDBObject();
this.modifierOps.put(operator, keyValueMap);
} else {
if (existingValue instanceof LinkedHashMap) {
keyValueMap = (LinkedHashMap<String, Object>) existingValue;
if (existingValue instanceof BasicDBObject) {
keyValueMap = (BasicDBObject) existingValue;
} else {
throw new InvalidDataAccessApiUsageException("Modifier Operations should be a LinkedHashMap but was "
+ existingValue.getClass());
}
}
keyValueMap.put(key, value);
}
}
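A short sketch of the semantic difference the clarified Javadoc describes; the field names are illustrative:

import org.springframework.data.mongodb.core.query.Update;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

public class UpdateSemanticsSketch {

    public static void main(String[] args) {

        DBObject source = new BasicDBObject("firstname", "Dave").append("lastname", "Matthews");

        // replacement semantics: fields missing from 'source' are nulled when this update runs
        Update replacement = Update.fromDBObject(source);
        System.out.println(replacement.getUpdateObject()); // { "firstname" : ..., "lastname" : ... }

        // partial-update semantics: only the listed fields are touched
        Update partial = new Update().set("firstname", "Dave").set("lastname", "Matthews");
        System.out.println(partial.getUpdateObject());      // { "$set" : { "firstname" : ..., "lastname" : ... }}
    }
}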

View File

@@ -0,0 +1,5 @@
/**
* Support for MongoDB GridFS feature.
*/
package org.springframework.data.mongodb.gridfs;

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2002-2011 the original author or authors.
* Copyright 2002-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,19 +15,20 @@
*/
package org.springframework.data.mongodb.monitor;
import java.net.InetAddress;
import java.net.UnknownHostException;
import com.mongodb.Mongo;
import org.springframework.jmx.export.annotation.ManagedMetric;
import org.springframework.jmx.export.annotation.ManagedOperation;
import org.springframework.jmx.export.annotation.ManagedResource;
import org.springframework.jmx.support.MetricType;
import com.mongodb.Mongo;
/**
* Expose basic server information via JMX
*
* @author Mark Pollack
* @author Thomas Darimont
*/
@ManagedResource(description = "Server Information")
public class ServerInfo extends AbstractMonitor {
@@ -36,9 +37,20 @@ public class ServerInfo extends AbstractMonitor {
this.mongo = mongo;
}
/**
* Returns the hostname of the server in use, as reported by MongoDB.
*
* @return the reported hostname; may also be an IP address.
* @throws UnknownHostException
*/
@ManagedOperation(description = "Server host name")
public String getHostName() throws UnknownHostException {
return InetAddress.getLocalHost().getHostName();
/*
* UnknownHostException is no longer necessary, but clients may already be
* calling this method inside a try..catch(UnknownHostException) block
*/
return getServerStatus().getServerUsed().getHost();
}
@ManagedMetric(displayName = "Uptime Estimate")

View File

@@ -0,0 +1,5 @@
/**
* Spring Data's MongoDB abstraction.
*/
package org.springframework.data.mongodb;

View File

@@ -0,0 +1,5 @@
/**
* CDI support for MongoDB specific repository implementation.
*/
package org.springframework.data.mongodb.repository.cdi;

View File

@@ -0,0 +1,5 @@
/**
* Support infrastructure for the configuration of MongoDB specific repositories.
*/
package org.springframework.data.mongodb.repository.config;

View File

@@ -36,6 +36,7 @@ import org.springframework.util.Assert;
* Base class for {@link RepositoryQuery} implementations for Mongo.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public abstract class AbstractMongoQuery implements RepositoryQuery {
@@ -257,6 +258,11 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
nearQuery.maxDistance(maxDistance).in(maxDistance.getMetric());
}
Pageable pageable = accessor.getPageable();
if (pageable != null) {
nearQuery.with(pageable);
}
MongoEntityMetadata<?> metadata = method.getEntityInformation();
return (GeoResults<Object>) operations.geoNear(nearQuery, metadata.getJavaType(), metadata.getCollectionName());
}
@@ -264,7 +270,13 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
private boolean isListOfGeoResult() {
TypeInformation<?> returnType = method.getReturnType();
return returnType.getType().equals(List.class) && GeoResult.class.equals(returnType.getComponentType());
if (!returnType.getType().equals(List.class)) {
return false;
}
TypeInformation<?> componentType = returnType.getComponentType();
return componentType == null ? false : GeoResult.class.equals(componentType.getType());
}
}
}
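A hypothetical repository showing what the added Pageable handling enables for derived geo-near queries; Venue and the method name are made up for illustration, not taken from the codebase:

import java.util.List;

import org.springframework.data.domain.Pageable;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.repository.MongoRepository;

// the Pageable argument is now pushed into the NearQuery (num/skip) instead of being ignored
interface VenueRepository extends MongoRepository<Venue, String> {

    List<GeoResult<Venue>> findByLocationNear(Point location, Distance distance, Pageable pageable);
}

// illustrative domain type with the 'location' property the derived query refers to
class Venue {
    String id;
    Point location;
}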

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2012 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -268,15 +268,16 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
private String toLikeRegex(String source, Type type) {
switch (type) {
case STARTING_WITH:
source = source + "*";
break;
case ENDING_WITH:
source = "*" + source;
break;
case CONTAINING:
source = "*" + source + "*";
break;
case STARTING_WITH:
source = "^" + source;
break;
case ENDING_WITH:
source = source + "$";
break;
case CONTAINING:
source = "*" + source + "*";
break;
default:
}
return source.replaceAll("\\*", ".*");
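A stand-alone rendition of the corrected regex derivation shown above: StartingWith and EndingWith are now anchored with ^ and $ instead of being turned into leading or trailing wildcards, while Containing keeps the wildcard form. The Type enum below stands in for the framework's Part.Type values.

public class LikeRegexSketch {

    enum Type { STARTING_WITH, ENDING_WITH, CONTAINING }

    static String toLikeRegex(String source, Type type) {
        switch (type) {
            case STARTING_WITH:
                source = "^" + source;
                break;
            case ENDING_WITH:
                source = source + "$";
                break;
            case CONTAINING:
                source = "*" + source + "*";
                break;
        }
        return source.replaceAll("\\*", ".*");
    }

    public static void main(String[] args) {
        System.out.println(toLikeRegex("Oli", Type.STARTING_WITH)); // ^Oli
        System.out.println(toLikeRegex("ver", Type.ENDING_WITH));   // ver$
        System.out.println(toLikeRegex("live", Type.CONTAINING));   // .*live.*
    }
}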

View File

@@ -0,0 +1,5 @@
/**
* Query derivation mechanism for MongoDB specific repositories.
*/
package org.springframework.data.mongodb.repository.query;

View File

@@ -0,0 +1,5 @@
/**
* Support infrastructure for query derivation of MongoDB specific repositories.
*/
package org.springframework.data.mongodb.repository.support;

View File

@@ -11,8 +11,7 @@
<xsd:import namespace="http://www.springframework.org/schema/beans" />
<xsd:import namespace="http://www.springframework.org/schema/tool" />
<xsd:import namespace="http://www.springframework.org/schema/context"
schemaLocation="http://www.springframework.org/schema/context/spring-context.xsd" />
<xsd:import namespace="http://www.springframework.org/schema/context" />
<xsd:import namespace="http://www.springframework.org/schema/data/repository"
schemaLocation="http://www.springframework.org/schema/data/repository/spring-repository-1.0.xsd" />

View File

@@ -11,8 +11,7 @@
<xsd:import namespace="http://www.springframework.org/schema/beans" />
<xsd:import namespace="http://www.springframework.org/schema/tool" />
<xsd:import namespace="http://www.springframework.org/schema/context"
schemaLocation="http://www.springframework.org/schema/context/spring-context.xsd" />
<xsd:import namespace="http://www.springframework.org/schema/context" />
<xsd:import namespace="http://www.springframework.org/schema/data/repository"
schemaLocation="http://www.springframework.org/schema/data/repository/spring-repository.xsd" />

View File

@@ -11,8 +11,7 @@
<xsd:import namespace="http://www.springframework.org/schema/beans" />
<xsd:import namespace="http://www.springframework.org/schema/tool" />
<xsd:import namespace="http://www.springframework.org/schema/context"
schemaLocation="http://www.springframework.org/schema/context/spring-context.xsd" />
<xsd:import namespace="http://www.springframework.org/schema/context" />
<xsd:import namespace="http://www.springframework.org/schema/data/repository"
schemaLocation="http://www.springframework.org/schema/data/repository/spring-repository.xsd" />
@@ -36,7 +35,7 @@ Defines a MongoDbFactory for connecting to a specific database
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="id" type="xsd:ID" use="optional">
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the mongo definition (by default "mongoDbFactory").]]></xsd:documentation>
@@ -157,7 +156,7 @@ The Mongo URI string.]]></xsd:documentation>
</xsd:complexType>
</xsd:element>
</xsd:sequence>
<xsd:attribute name="id" type="xsd:ID" use="optional">
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the MappingMongoConverter instance (by default "mappingConverter").]]></xsd:documentation>
@@ -336,7 +335,7 @@ The Mongo driver options
<!-- MLP
<xsd:attributeGroup ref="writeConcern" />
-->
<xsd:attribute name="id" type="xsd:ID" use="optional">
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the mongo definition (by default "mongo").]]></xsd:documentation>

View File

@@ -19,8 +19,13 @@ import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import org.junit.Test;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.data.mongodb.core.mapping.BasicMongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.expression.spel.support.StandardEvaluationContext;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.Mongo;
@@ -63,6 +68,17 @@ public class AbstractMongoConfigurationUnitTests {
assertScanningDisabled(" ");
}
@Test
public void lifecycleCallbacksAreInvokedInAppropriateOrder() {
AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext(SampleMongoConfiguration.class);
MongoMappingContext mappingContext = context.getBean(MongoMappingContext.class);
BasicMongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(Entity.class);
StandardEvaluationContext spElContext = (StandardEvaluationContext) ReflectionTestUtils.getField(entity, "context");
assertThat(spElContext.getBeanResolver(), is(notNullValue()));
}
private static void assertScanningDisabled(final String value) throws ClassNotFoundException {
AbstractMongoConfiguration configuration = new SampleMongoConfiguration() {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012 the original author or authors.
* Copyright 2012-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -31,6 +31,7 @@ import com.mongodb.ServerAddress;
* Unit tests for {@link ServerAddressPropertyEditor}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class ServerAddressPropertyEditorUnitTests {
@@ -70,6 +71,16 @@ public class ServerAddressPropertyEditorUnitTests {
assertSingleAddressOfLocalhost(editor.getValue());
}
/**
* @see DATAMONGO-693
*/
@Test
public void interpretEmptyStringAsNull() {
editor.setAsText("");
assertNull(editor.getValue());
}
private static void assertSingleAddressOfLocalhost(Object result) throws UnknownHostException {
assertThat(result, is(instanceOf(ServerAddress[].class)));

View File

@@ -27,7 +27,7 @@ import com.mongodb.WriteConcern;
*
* @author Oliver Gierke
*/
public class StringToWriteConcernConverterUnitTest {
public class StringToWriteConcernConverterUnitTests {
StringToWriteConcernConverter converter = new StringToWriteConcernConverter();

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012 the original author or authors.
* Copyright 2012-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,6 +20,8 @@ import static org.junit.Assert.*;
import static org.mockito.Matchers.*;
import static org.mockito.Mockito.*;
import java.util.List;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
@@ -28,7 +30,9 @@ import org.mockito.Mock;
import org.mockito.invocation.InvocationOnMock;
import org.mockito.runners.MockitoJUnitRunner;
import org.mockito.stubbing.Answer;
import org.springframework.transaction.support.TransactionSynchronization;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import org.springframework.transaction.support.TransactionSynchronizationUtils;
import com.mongodb.DB;
import com.mongodb.Mongo;
@@ -37,12 +41,12 @@ import com.mongodb.Mongo;
* Unit tests for {@link MongoDbUtils}.
*
* @author Oliver Gierke
* @author Randy Watler
*/
@RunWith(MockitoJUnitRunner.class)
public class MongoDbUtilsUnitTests {
@Mock
Mongo mongo;
@Mock Mongo mongo;
@Before
public void setUp() throws Exception {
@@ -81,4 +85,94 @@ public class MongoDbUtilsUnitTests {
assertThat(first, is(notNullValue()));
assertThat(MongoDbUtils.getDB(mongo, "first"), is(sameInstance(first)));
}
/**
* @see DATAMONGO-737
*/
@Test
public void handlesTransactionSynchronizationLifecycle() {
// ensure transaction synchronization manager has no registered
// transaction synchronizations or bound resources at start of test
assertThat(TransactionSynchronizationManager.getSynchronizations().isEmpty(), is(true));
assertThat(TransactionSynchronizationManager.getResourceMap().isEmpty(), is(true));
// access database for one mongo instance, (registers transaction
// synchronization and binds transaction resource)
MongoDbUtils.getDB(mongo, "first");
// ensure transaction synchronization manager has registered
// transaction synchronizations and bound resources
assertThat(TransactionSynchronizationManager.getSynchronizations().isEmpty(), is(false));
assertThat(TransactionSynchronizationManager.getResourceMap().isEmpty(), is(false));
// simulate transaction completion, (unbinds transaction resource)
try {
simulateTransactionCompletion();
} catch (Exception e) {
fail("Unexpected exception thrown during transaction completion: " + e);
}
// ensure transaction synchronization manager has no bound resources
// at end of test
assertThat(TransactionSynchronizationManager.getResourceMap().isEmpty(), is(true));
}
/**
* @see DATAMONGO-737
*/
@Test
public void handlesTransactionSynchronizationsLifecycle() {
// ensure transaction synchronization manager has no registered
// transaction synchronizations or bound resources at start of test
assertThat(TransactionSynchronizationManager.getSynchronizations().isEmpty(), is(true));
assertThat(TransactionSynchronizationManager.getResourceMap().isEmpty(), is(true));
// access multiple databases for one mongo instance, (registers
// transaction synchronizations and binds transaction resources)
MongoDbUtils.getDB(mongo, "first");
MongoDbUtils.getDB(mongo, "second");
// ensure transaction synchronization manager has registered
// transaction synchronizations and bound resources
assertThat(TransactionSynchronizationManager.getSynchronizations().isEmpty(), is(false));
assertThat(TransactionSynchronizationManager.getResourceMap().isEmpty(), is(false));
// simulate transaction completion, (unbinds transaction resources)
try {
simulateTransactionCompletion();
} catch (Exception e) {
fail("Unexpected exception thrown during transaction completion: " + e);
}
// ensure transaction synchronization manager has no bound
// transaction resources at end of test
assertThat(TransactionSynchronizationManager.getResourceMap().isEmpty(), is(true));
}
/**
* Simulates the transaction rollback/commit completion protocol on the managed transaction synchronizations, which
* unbinds the managed transaction resources. Exceptions are deliberately not swallowed so that failures surface in tests.
*
* @see TransactionSynchronizationUtils#triggerBeforeCompletion()
* @see TransactionSynchronizationUtils#triggerAfterCompletion(int)
*/
private void simulateTransactionCompletion() {
// triggerBeforeCompletion() implementation without swallowed exceptions
List<TransactionSynchronization> synchronizations = TransactionSynchronizationManager.getSynchronizations();
for (TransactionSynchronization synchronization : synchronizations) {
synchronization.beforeCompletion();
}
// triggerAfterCompletion() implementation without swallowed exceptions
List<TransactionSynchronization> remainingSynchronizations = TransactionSynchronizationManager
.getSynchronizations();
if (remainingSynchronizations != null) {
for (TransactionSynchronization remainingSynchronization : remainingSynchronizations) {
remainingSynchronization.afterCompletion(TransactionSynchronization.STATUS_ROLLED_BACK);
}
}
}
}
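For context: the transaction-synchronization tests above assume Spring's thread-bound TransactionSynchronizationManager state is active before MongoDbUtils.getDB(…) runs and is cleaned up afterwards. A minimal sketch of such setup/teardown, assuming a helper class and an illustrative resourceKey parameter (neither is part of this change set):

import org.springframework.transaction.support.TransactionSynchronizationManager;

public class TransactionSynchronizationTestSupport {

	// Activate thread-bound synchronization so that resources and synchronizations can be registered.
	public static void activateSynchronization() {
		if (!TransactionSynchronizationManager.isSynchronizationActive()) {
			TransactionSynchronizationManager.initSynchronization();
		}
	}

	// Unbind whatever a test left behind and reset the thread-local synchronization state.
	public static void cleanUp(Object resourceKey) {
		TransactionSynchronizationManager.unbindResourceIfPossible(resourceKey);
		TransactionSynchronizationManager.clearSynchronization();
	}
}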

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012 the original author or authors.
* Copyright 2012-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,17 +21,21 @@ import static org.junit.Assert.*;
import org.junit.Test;
import org.springframework.beans.factory.support.DefaultListableBeanFactory;
import org.springframework.beans.factory.support.RootBeanDefinition;
import org.springframework.data.mongodb.config.ServerAddressPropertyEditor;
import org.springframework.data.mongodb.config.WriteConcernPropertyEditor;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.Mongo;
import com.mongodb.ServerAddress;
import com.mongodb.WriteConcern;
/**
* Integration tests for {@link MongoFactoryBean}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class MongoFactoryBeanIntegrationTest {
public class MongoFactoryBeanIntegrationTests {
/**
* @see DATAMONGO-408
@@ -49,4 +53,22 @@ public class MongoFactoryBeanIntegrationTest {
MongoFactoryBean bean = factory.getBean("&factory", MongoFactoryBean.class);
assertThat(ReflectionTestUtils.getField(bean, "writeConcern"), is((Object) WriteConcern.SAFE));
}
/**
* @see DATAMONGO-693
*/
@Test
public void createMongoInstanceWithHostAndEmptyReplicaSets() {
RootBeanDefinition definition = new RootBeanDefinition(MongoFactoryBean.class);
definition.getPropertyValues().addPropertyValue("host", "localhost");
definition.getPropertyValues().addPropertyValue("replicaPair", "");
DefaultListableBeanFactory factory = new DefaultListableBeanFactory();
factory.registerCustomEditor(ServerAddress.class, ServerAddressPropertyEditor.class);
factory.registerBeanDefinition("factory", definition);
Mongo mongo = factory.getBean(Mongo.class);
assertNotNull(mongo);
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.core;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import static org.mockito.Mockito.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import static org.springframework.data.mongodb.core.query.Update.*;
@@ -30,6 +31,7 @@ import java.util.List;
import java.util.Map;
import org.bson.types.ObjectId;
import org.hamcrest.CoreMatchers;
import org.joda.time.DateTime;
import org.junit.After;
import org.junit.Assert;
@@ -42,12 +44,15 @@ import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.convert.converter.Converter;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.dao.DuplicateKeyException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.OptimisticLockingFailureException;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.annotation.Version;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
import org.springframework.data.mongodb.MongoDataIntegrityViolationException;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
@@ -55,6 +60,7 @@ import org.springframework.data.mongodb.core.index.Index;
import org.springframework.data.mongodb.core.index.Index.Duplicates;
import org.springframework.data.mongodb.core.index.IndexField;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Criteria;
@@ -82,20 +88,18 @@ import com.mongodb.WriteResult;
* @author Thomas Risberg
* @author Amol Nayak
* @author Patryk Wasik
* @author Thomas Darimont
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:infrastructure.xml")
public class MongoTemplateTests {
@Autowired
MongoTemplate template;
@Autowired
MongoDbFactory factory;
@Autowired MongoTemplate template;
@Autowired MongoDbFactory factory;
MongoTemplate mappingTemplate;
@Rule
public ExpectedException thrown = ExpectedException.none();
@Rule public ExpectedException thrown = ExpectedException.none();
@Autowired
@SuppressWarnings("unchecked")
@@ -145,7 +149,10 @@ public class MongoTemplateTests {
template.dropCollection(TestClass.class);
template.dropCollection(Sample.class);
template.dropCollection(MyPerson.class);
template.dropCollection(TypeWithFieldAnnotation.class);
template.dropCollection(TypeWithDate.class);
template.dropCollection("collection");
template.dropCollection("personX");
}
@Test
@@ -272,7 +279,8 @@ public class MongoTemplateTests {
} catch (DataIntegrityViolationException e) {
assertThat(
e.getMessage(),
startsWith("Insert list failed: E11000 duplicate key error index: database.person.$_id_ dup key: { : ObjectId"));
CoreMatchers
.startsWith("Insert list failed: E11000 duplicate key error index: database.person.$_id_ dup key: { : ObjectId"));
}
}
@@ -765,8 +773,7 @@ public class MongoTemplateTests {
Query q3 = new Query(Criteria.where("age").in(l1, l2));
template.find(q3, PersonWithIdPropertyOfTypeObjectId.class);
Assert.fail("Should have trown an InvalidDocumentStoreApiUsageException");
} catch (InvalidMongoDbApiUsageException e) {
}
} catch (InvalidMongoDbApiUsageException e) {}
}
@Test
@@ -949,6 +956,7 @@ public class MongoTemplateTests {
}
@Test
@SuppressWarnings("deprecation")
public void testUsingReadPreference() throws Exception {
this.template.execute("readPref", new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
@@ -1318,6 +1326,18 @@ public class MongoTemplateTests {
template.save(person);
}
/**
* @see DATAMONGO-617
*/
@Test
public void doesNotFailOnVersionInitForUnversionedEntity() {
DBObject dbObject = new BasicDBObject();
dbObject.put("firstName", "Oliver");
template.insert(dbObject, template.determineCollectionName(PersonWithVersionPropertyOfTypeInteger.class));
}
/**
* @see DATAMONGO-539
*/
@@ -1455,6 +1475,199 @@ public class MongoTemplateTests {
assertThat(template.find(null, PersonWithIdPropertyOfTypeObjectId.class), is(result));
}
/**
* @see DATAMONGO-620
*/
@Test
public void versionsObjectIntoDedicatedCollection() {
PersonWithVersionPropertyOfTypeInteger person = new PersonWithVersionPropertyOfTypeInteger();
person.firstName = "Dave";
template.save(person, "personX");
assertThat(person.version, is(0));
template.save(person, "personX");
assertThat(person.version, is(1));
}
/**
* @see DATAMONGO-621
*/
@Test
public void correctlySetsLongVersionProperty() {
PersonWithVersionPropertyOfTypeLong person = new PersonWithVersionPropertyOfTypeLong();
person.firstName = "Dave";
template.save(person);
assertThat(person.version, is(0L));
}
/**
* @see DATAMONGO-622
*/
@Test(expected = DuplicateKeyException.class)
public void preventsDuplicateInsert() {
template.setWriteConcern(WriteConcern.SAFE);
PersonWithVersionPropertyOfTypeInteger person = new PersonWithVersionPropertyOfTypeInteger();
person.firstName = "Dave";
template.save(person);
assertThat(person.version, is(0));
person.version = null;
template.save(person);
}
/**
* @see DATAMONGO-629
*/
@Test
public void countAndFindWithoutTypeInformation() {
Person person = new Person();
template.save(person);
Query query = query(where("_id").is(person.getId()));
String collectionName = template.getCollectionName(Person.class);
assertThat(template.find(query, HashMap.class, collectionName), hasSize(1));
assertThat(template.count(query, collectionName), is(1L));
}
/**
* @see DATAMONGO-571
*/
@Test
public void nullsPropertiesForVersionObjectUpdates() {
VersionedPerson person = new VersionedPerson();
person.firstname = "Dave";
person.lastname = "Matthews";
template.save(person);
assertThat(person.id, is(notNullValue()));
person.lastname = null;
template.save(person);
person = template.findOne(query(where("id").is(person.id)), VersionedPerson.class);
assertThat(person.lastname, is(nullValue()));
}
/**
* @see DATAMONGO-571
*/
@Test
public void nullsValuesForUpdatesOfUnversionedEntity() {
Person person = new Person("Dave");
template.save(person);
person.setFirstName(null);
template.save(person);
person = template.findOne(query(where("id").is(person.getId())), Person.class);
assertThat(person.getFirstName(), is(nullValue()));
}
/**
* @see DATAMONGO-651
*/
@Test
public void throwsMongoSpecificExceptionForDataIntegrityViolations() {
WriteResult result = mock(WriteResult.class);
when(result.getError()).thenReturn("ERROR");
MongoActionOperation operation = MongoActionOperation.INSERT;
MongoTemplate mongoTemplate = new MongoTemplate(factory);
mongoTemplate.setWriteResultChecking(WriteResultChecking.EXCEPTION);
try {
mongoTemplate.handleAnyWriteResultErrors(result, null, operation);
fail("Expected MonogoDataIntegrityViolationException!");
} catch (MongoDataIntegrityViolationException o_O) {
assertThat(o_O.getActionOperation(), is(operation));
assertThat(o_O.getWriteResult(), is(result));
}
}
/**
* @see DATAMONGO-679
*/
@Test
public void savesJsonStringCorrectly() {
DBObject dbObject = new BasicDBObject().append("first", "first").append("second", "second");
template.save(dbObject.toString(), "collection");
List<DBObject> result = template.findAll(DBObject.class, "collection");
assertThat(result.size(), is(1));
assertThat(result.get(0).containsField("first"), is(true));
}
/**
* @see DATAMONGO-675
*/
@Test
public void updateConsidersMappingAnnotations() {
TypeWithFieldAnnotation entity = new TypeWithFieldAnnotation();
entity.emailAddress = "old";
template.save(entity);
Query query = query(where("_id").is(entity.id));
Update update = Update.update("emailAddress", "new");
FindAndModifyOptions options = new FindAndModifyOptions().returnNew(true);
TypeWithFieldAnnotation result = template.findAndModify(query, update, options, TypeWithFieldAnnotation.class);
assertThat(result.emailAddress, is("new"));
}
/**
* @see DATAMONGO-671
*/
@Test
public void findsEntityByDateReference() {
TypeWithDate entity = new TypeWithDate();
entity.date = new Date(System.currentTimeMillis() - 10);
template.save(entity);
Query query = query(where("date").lt(new Date()));
List<TypeWithDate> result = template.find(query, TypeWithDate.class);
assertThat(result, hasSize(1));
assertThat(result.get(0).date, is(notNullValue()));
}
/**
* @see DATAMONGO-540
*/
@Test
public void findOneAfterUpsertForNonExistingObjectReturnsTheInsertedObject() {
String idValue = "4711";
Query query = new Query(Criteria.where("id").is(idValue));
String fieldValue = "bubu";
Update update = Update.update("field", fieldValue);
template.upsert(query, update, Sample.class);
Sample result = template.findOne(query, Sample.class);
assertThat(result, is(notNullValue()));
assertThat(result.field, is(fieldValue));
assertThat(result.id, is(idValue));
}
static class MyId {
String first;
@@ -1463,14 +1676,12 @@ public class MongoTemplateTests {
static class TypeWithMyId {
@Id
MyId id;
@Id MyId id;
}
public static class Sample {
@Id
String id;
@Id String id;
String field;
}
@@ -1524,4 +1735,22 @@ public class MongoTemplateTests {
String state;
String city;
}
static class VersionedPerson {
@Version Long version;
String id, firstname, lastname;
}
static class TypeWithFieldAnnotation {
@Id ObjectId id;
@Field("email") String emailAddress;
}
static class TypeWithDate {
@Id String id;
Date date;
}
}
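The versioning tests above (DATAMONGO-620, -621, -622) pin down the optimistic-locking contract of MongoTemplate.save(…): the first save initializes the @Version property to 0, every further save increments it, and re-inserting with the version reset collides with the already stored document. A compact, illustrative recap under exactly those assumptions (the demonstrate method and class are hypothetical, not part of this change set):

import org.springframework.dao.DuplicateKeyException;
import org.springframework.data.mongodb.core.MongoTemplate;

import com.mongodb.WriteConcern;

public class VersioningContractSketch {

	static void demonstrate(MongoTemplate template) {

		template.setWriteConcern(WriteConcern.SAFE); // so the duplicate-key error is reported, as in the test above

		PersonWithVersionPropertyOfTypeInteger person = new PersonWithVersionPropertyOfTypeInteger();
		person.firstName = "Dave";

		template.save(person); // first save initializes the @Version property to 0
		template.save(person); // every further save increments it, now 1

		person.version = null; // pretend the entity were new again
		try {
			template.save(person); // collides with the document already stored under the same id
		} catch (DuplicateKeyException expected) {
			// rejected instead of silently overwriting the existing document
		}
	}
}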

View File

@@ -15,7 +15,7 @@
*/
package org.springframework.data.mongodb.core;
import org.springframework.data.mongodb.core.mapping.Version;
import org.springframework.data.annotation.Version;
public class PersonWithVersionPropertyOfTypeInteger {

View File

@@ -0,0 +1,34 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.data.annotation.Version;
public class PersonWithVersionPropertyOfTypeLong {
String id;
String firstName;
int age;
@Version
Long version;
@Override
public String toString() {
return "PersonWithVersionPropertyOfTypeInteger [id=" + id + ", firstName=" + firstName + ", age=" + age
+ ", version=" + version + "]";
}
}

View File

@@ -1,64 +0,0 @@
package org.springframework.data.mongodb.core;
public class SomeEnumTest {
public enum StringEnum {
ONE, TWO, FIVE;
}
public enum NumberEnum {
ONE(1), TWO(2), FIVE(5);
private int value;
public int value() {
return value;
}
NumberEnum(int value) {
this.value = value;
}
}
private StringEnum stringEnum;
private NumberEnum numberEnum;
private String id;
private String name;
public StringEnum getStringEnum() {
return stringEnum;
}
public void setStringEnum(StringEnum stringEnum) {
this.stringEnum = stringEnum;
}
public NumberEnum getNumberEnum() {
return numberEnum;
}
public void setNumberEnum(NumberEnum numberEnum) {
this.numberEnum = numberEnum;
}
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
}

View File

@@ -45,7 +45,6 @@ import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.ApplicationContext;
import org.springframework.context.event.ContextRefreshedEvent;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.PersistenceConstructor;
@@ -88,7 +87,7 @@ public class MappingMongoConverterUnitTests {
mappingContext = new MongoMappingContext();
mappingContext.setApplicationContext(context);
mappingContext.onApplicationEvent(new ContextRefreshedEvent(context));
mappingContext.afterPropertiesSet();
converter = new MappingMongoConverter(factory, mappingContext);
converter.afterPropertiesSet();

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -25,6 +25,7 @@ import java.math.BigInteger;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import org.bson.types.ObjectId;
import org.junit.Before;
@@ -36,6 +37,7 @@ import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.DBObjectUtils;
import org.springframework.data.mongodb.core.Person;
import org.springframework.data.mongodb.core.mapping.BasicMongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
@@ -53,6 +55,7 @@ import com.mongodb.QueryBuilder;
* Unit tests for {@link QueryMapper}.
*
* @author Oliver Gierke
* @author Patryk Wasik
*/
@RunWith(MockitoJUnitRunner.class)
public class QueryMapperUnitTests {
@@ -61,8 +64,7 @@ public class QueryMapperUnitTests {
MongoMappingContext context;
MappingMongoConverter converter;
@Mock
MongoDbFactory factory;
@Mock MongoDbFactory factory;
@Before
public void setUp() {
@@ -332,7 +334,11 @@ public class QueryMapperUnitTests {
DBObject result = mapper.getMappedObject(query.getQueryObject(), context.getPersistentEntity(WithDBRef.class));
DBObject reference = DBObjectUtils.getAsDBObject(result, "reference");
assertThat(reference.containsField("$in"), is(true));
BasicDBList inClause = getAsDBList(reference, "$in");
assertThat(inClause, hasSize(2));
assertThat(inClause.get(0), is(instanceOf(com.mongodb.DBRef.class)));
assertThat(inClause.get(1), is(instanceOf(com.mongodb.DBRef.class)));
}
/**
@@ -347,6 +353,91 @@ public class QueryMapperUnitTests {
assertThat(object.get("reference"), is(nullValue()));
}
/**
* @see DATAMONGO-629
*/
@Test
public void doesNotMapIdIfNoEntityMetadataAvailable() {
String id = new ObjectId().toString();
Query query = query(where("id").is(id));
DBObject object = mapper.getMappedObject(query.getQueryObject(), null);
assertThat(object.containsField("id"), is(true));
assertThat(object.get("id"), is((Object) id));
assertThat(object.containsField("_id"), is(false));
}
/**
* @see DATAMONGO-677
*/
@Test
public void handleMapWithDBRefCorrectly() {
DBObject mapDbObject = new BasicDBObject();
mapDbObject.put("test", new com.mongodb.DBRef(null, "test", "test"));
DBObject dbObject = new BasicDBObject();
dbObject.put("mapWithDBRef", mapDbObject);
DBObject mapped = mapper.getMappedObject(dbObject, context.getPersistentEntity(WithMapDBRef.class));
assertThat(mapped.containsField("mapWithDBRef"), is(true));
assertThat(mapped.get("mapWithDBRef"), instanceOf(BasicDBObject.class));
assertThat(((BasicDBObject) mapped.get("mapWithDBRef")).containsField("test"), is(true));
assertThat(((BasicDBObject) mapped.get("mapWithDBRef")).get("test"), instanceOf(com.mongodb.DBRef.class));
}
@Test
public void convertsUnderscoreIdValueWithoutMetadata() {
DBObject dbObject = new BasicDBObject().append("_id", new ObjectId().toString());
DBObject mapped = mapper.getMappedObject(dbObject, null);
assertThat(mapped.containsField("_id"), is(true));
assertThat(mapped.get("_id"), is(instanceOf(ObjectId.class)));
}
/**
* @see DATAMONGO-705
*/
@Test
public void convertsDBRefWithExistsQuery() {
Query query = query(where("reference").exists(false));
BasicMongoPersistentEntity<?> entity = context.getPersistentEntity(WithDBRef.class);
DBObject mappedObject = mapper.getMappedObject(query.getQueryObject(), entity);
DBObject reference = getAsDBObject(mappedObject, "reference");
assertThat(reference.containsField("$exists"), is(true));
assertThat(reference.get("$exists"), is((Object) false));
}
/**
* @see DATAMONGO-706
*/
@Test
public void convertsNestedDBRefsCorrectly() {
Reference reference = new Reference();
reference.id = 5L;
Query query = query(where("someString").is("foo").andOperator(where("reference").in(reference)));
BasicMongoPersistentEntity<?> entity = context.getPersistentEntity(WithDBRef.class);
DBObject mappedObject = mapper.getMappedObject(query.getQueryObject(), entity);
assertThat(mappedObject.get("someString"), is((Object) "foo"));
BasicDBList andClause = getAsDBList(mappedObject, "$and");
assertThat(andClause, hasSize(1));
BasicDBList inClause = getAsDBList(getAsDBObject(getAsDBObject(andClause, 0), "reference"), "$in");
assertThat(inClause, hasSize(1));
assertThat(inClause.get(0), is(instanceOf(com.mongodb.DBRef.class)));
}
class IdWrapper {
Object id;
}
@@ -359,14 +450,12 @@ public class QueryMapperUnitTests {
class Sample {
@Id
private String foo;
@Id private String foo;
}
class BigIntegerId {
@Id
private BigInteger id;
@Id private BigInteger id;
}
enum Enum {
@@ -380,14 +469,13 @@ public class QueryMapperUnitTests {
class CustomizedField {
@Field("foo")
CustomizedField field;
@Field("foo") CustomizedField field;
}
class WithDBRef {
@DBRef
Reference reference;
String someString;
@DBRef Reference reference;
}
class Reference {
@@ -399,4 +487,9 @@ public class QueryMapperUnitTests {
WithDBRef withDbRef;
}
class WithMapDBRef {
@DBRef Map<String, Sample> mapWithDBRef;
}
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,7 +13,6 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
import static org.hamcrest.Matchers.*;
@@ -37,14 +36,15 @@ import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;
import org.springframework.dao.DataAccessException;
import org.springframework.data.annotation.Id;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.MongoCollectionUtils;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoDbUtils;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.DB;
import com.mongodb.DBCollection;
@@ -53,7 +53,8 @@ import com.mongodb.Mongo;
import com.mongodb.MongoException;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Jon Brisbin
* @author Oliver Gierke
*/
public class MappingTests {
@@ -78,7 +79,7 @@ public class MappingTests {
ApplicationContext applicationContext;
Mongo mongo;
MongoTemplate template;
MongoMappingContext mappingContext;
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
@Before
public void setUp() throws Exception {
@@ -89,7 +90,7 @@ public class MappingTests {
}
applicationContext = new ClassPathXmlApplicationContext("/mapping.xml");
template = applicationContext.getBean(MongoTemplate.class);
mappingContext = (MongoMappingContext) ReflectionTestUtils.getField(template, "mappingContext");
mappingContext = template.getConverter().getMappingContext();
}
@Test
@@ -464,7 +465,7 @@ public class MappingTests {
template.insert(p4);
Query q = query(where("id").in("1", "2"));
q.sort().on("id", Order.ASCENDING);
q.with(new Sort(Direction.ASC, "id"));
List<PersonPojoStringId> people = template.find(q, PersonPojoStringId.class);
assertEquals(2, people.size());

View File

@@ -1,5 +1,5 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2013 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -13,13 +13,12 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import java.lang.reflect.Field;
import java.util.AbstractMap;
import java.util.Collections;
import java.util.Map;
@@ -29,9 +28,7 @@ import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.context.ApplicationContext;
import org.springframework.data.annotation.Id;
import org.springframework.data.mapping.context.AbstractMappingContext;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.util.ReflectionUtils;
import com.mongodb.DBRef;
@@ -69,15 +66,14 @@ public class MongoMappingContextUnitTests {
assertThat(context.getPersistentEntity(DBRef.class), is(nullValue()));
}
/**
* @see DATAMONGO-638
*/
@Test
public void populatesAbstractMappingContextsApplicationCorrectly() {
public void doesNotCreatePersistentEntityForAbstractMap() {
MongoMappingContext context = new MongoMappingContext();
context.setApplicationContext(applicationContext);
Field field = ReflectionUtils.findField(AbstractMappingContext.class, "applicationContext");
ReflectionUtils.makeAccessible(field);
assertThat(ReflectionUtils.getField(field, context), is(notNullValue()));
assertThat(context.getPersistentEntity(AbstractMap.class), is(nullValue()));
}
class ClassWithMultipleIdProperties {

View File

@@ -36,7 +36,7 @@ import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class ValidatingMongoEventListenerTest {
public class ValidatingMongoEventListenerTests {
@Autowired
MongoTemplate mongoTemplate;

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,6 +24,10 @@ import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class CriteriaTests {
@Test
@@ -68,4 +72,50 @@ public class CriteriaTests {
assertThat(left, is(not(right)));
assertThat(right, is(not(left)));
}
/**
* @see DATAMONGO-507
*/
@Test(expected = IllegalArgumentException.class)
public void shouldThrowExceptionWhenTryingToNegateAndOperation() {
new Criteria() //
.not() //
.andOperator(Criteria.where("delete").is(true).and("_id").is(42)); //
}
/**
* @see DATAMONGO-507
*/
@Test(expected = IllegalArgumentException.class)
public void shouldThrowExceptionWhenTryingToNegateOrOperation() {
new Criteria() //
.not() //
.orOperator(Criteria.where("delete").is(true).and("_id").is(42)); //
}
/**
* @see DATAMONGO-507
*/
@Test(expected = IllegalArgumentException.class)
public void shouldThrowExceptionWhenTryingToNegateNorOperation() {
new Criteria() //
.not() //
.norOperator(Criteria.where("delete").is(true).and("_id").is(42)); //
}
/**
* @see DATAMONGO-507
*/
@Test
public void shouldNegateFollowingSimpleExpression() {
Criteria c = Criteria.where("age").not().gt(18).and("status").is("student");
DBObject co = c.getCriteriaObject();
assertThat(co, is(notNullValue()));
assertThat(co.toString(), is("{ \"age\" : { \"$not\" : { \"$gt\" : 18}} , \"status\" : \"student\"}"));
}
}
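The DATAMONGO-507 tests above fix the usage contract of Criteria.not(): negating the immediately following simple expression is supported, while wrapping an $and/$or/$nor expression in $not is rejected because MongoDB does not support it. A short illustrative snippet of both sides of that contract (standalone sketch, not part of this change set):

import org.springframework.data.mongodb.core.query.Criteria;

public class CriteriaNotUsageSketch {

	public static void main(String[] args) {

		// Supported: negate the following simple expression,
		// yielding { "age" : { "$not" : { "$gt" : 18}} , "status" : "student"}
		Criteria supported = Criteria.where("age").not().gt(18).and("status").is("student");
		System.out.println(supported.getCriteriaObject());

		// Rejected: wrapping an $and (or $or/$nor) expression in $not
		try {
			new Criteria().not().andOperator(Criteria.where("delete").is(true).and("_id").is(42));
		} catch (IllegalArgumentException expected) {
			System.out.println("rejected as asserted by the tests above: " + expected.getMessage());
		}
	}
}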

View File

@@ -0,0 +1,49 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.query;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.junit.Test;
/**
* Unit tests for {@link Field}.
*
* @author Oliver Gierke
*/
public class FieldUnitTests {
@Test
public void sameObjectSetupCreatesEqualField() {
Field left = new Field().include("foo");
Field right = new Field().include("foo");
assertThat(left, is(right));
assertThat(right, is(left));
}
@Test
public void differentObjectSetupCreatesEqualField() {
Field left = new Field().include("foo");
Field right = new Field().include("bar");
assertThat(left, is(not(right)));
assertThat(right, is(not(left)));
}
}

View File

@@ -19,14 +19,18 @@ import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.junit.Test;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Metric;
import org.springframework.data.mongodb.core.geo.Metrics;
import org.springframework.data.mongodb.core.geo.Point;
/**
* Unit tests for {@link NearQuery}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class NearQueryUnitTests {
@@ -75,4 +79,48 @@ public class NearQueryUnitTests {
query = query.maxDistance(new Distance(200, Metrics.KILOMETERS));
assertThat(query.getMetric(), is((Metric) Metrics.MILES));
}
/**
* @see DATAMONGO-445
*/
@Test
public void shouldTakeSkipAndLimitSettingsFromGivenPageable() {
Pageable pageable = new PageRequest(3, 5);
NearQuery query = NearQuery.near(new Point(1, 1)).with(pageable);
assertThat(query.getSkip(), is(pageable.getPageNumber() * pageable.getPageSize()));
assertThat((Integer) query.toDBObject().get("num"), is((pageable.getPageNumber() + 1) * pageable.getPageSize()));
}
/**
* @see DATAMONGO-445
*/
@Test
public void shouldTakeSkipAndLimitSettingsFromGivenQuery() {
int limit = 10;
int skip = 5;
NearQuery query = NearQuery.near(new Point(1, 1)).query(
Query.query(Criteria.where("foo").is("bar")).limit(limit).skip(skip));
assertThat(query.getSkip(), is(skip));
assertThat((Integer) query.toDBObject().get("num"), is(limit));
}
/**
* @see DATAMONGO-445
*/
@Test
public void shouldTakeSkipAndLimitSettingsFromPageableEvenIfItWasSpecifiedOnQuery() {
int limit = 10;
int skip = 5;
Pageable pageable = new PageRequest(3, 5);
NearQuery query = NearQuery.near(new Point(1, 1))
.query(Query.query(Criteria.where("foo").is("bar")).limit(limit).skip(skip)).with(pageable);
assertThat(query.getSkip(), is(pageable.getPageNumber() * pageable.getPageSize()));
assertThat((Integer) query.toDBObject().get("num"), is((pageable.getPageNumber() + 1) * pageable.getPageSize()));
}
}
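The DATAMONGO-445 tests above pin down how NearQuery.with(Pageable) derives its settings: skip becomes pageNumber * pageSize and the "num" entry of the resulting DBObject becomes (pageNumber + 1) * pageSize. A quick worked example of that arithmetic (standalone sketch, assuming only what the assertions above state):

import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;

public class NearQueryPagingArithmetic {

	public static void main(String[] args) {

		Pageable pageable = new PageRequest(3, 5);

		int skip = pageable.getPageNumber() * pageable.getPageSize();      // 3 * 5 = 15
		int num = (pageable.getPageNumber() + 1) * pageable.getPageSize(); // (3 + 1) * 5 = 20

		System.out.println("skip=" + skip + ", num=" + num);               // skip=15, num=20
	}
}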

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2002-2011 the original author or authors.
* Copyright 2002-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,28 +15,32 @@
*/
package org.springframework.data.mongodb.monitor;
import com.mongodb.Mongo;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.net.UnknownHostException;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.mongodb.monitor.OperationCounters;
import org.springframework.data.mongodb.monitor.ServerInfo;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
/**
* This test class assumes that you are already running the MongoDB server.
*
* @author Mark Pollack
* @author Thomas Darimont
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
@ContextConfiguration("classpath:infrastructure.xml")
public class MongoMonitorIntegrationTests {
@Autowired
Mongo mongo;
@Autowired Mongo mongo;
@Test
public void serverInfo() {
@@ -45,9 +49,29 @@ public class MongoMonitorIntegrationTests {
Assert.isTrue(StringUtils.hasText("1."));
}
/**
* @throws UnknownHostException
* @see DATAMONGO-685
*/
@Test
public void getHostNameShouldReturnServerNameReportedByMongo() throws UnknownHostException {
ServerInfo serverInfo = new ServerInfo(mongo);
String hostName = serverInfo.getHostName();
assertThat(hostName, is(notNullValue()));
assertThat(hostName, is("127.0.0.1"));
}
@Test
public void operationCounters() {
OperationCounters operationCounters = new OperationCounters(mongo);
operationCounters.getInsertCount();
}
}

View File

@@ -18,12 +18,16 @@ package org.springframework.data.mongodb.performance;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import java.text.DecimalFormat;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.Date;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Random;
import java.util.Set;
import java.util.regex.Pattern;
@@ -35,12 +39,15 @@ import org.springframework.core.Constants;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.core.index.Indexed;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.support.MongoRepositoryFactoryBean;
import org.springframework.util.Assert;
import org.springframework.util.StopWatch;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
@@ -60,29 +67,40 @@ import com.mongodb.WriteConcern;
public class PerformanceTests {
private static final String DATABASE_NAME = "performance";
private static final int NUMBER_OF_PERSONS = 30000;
private static final int NUMBER_OF_PERSONS = 300;
private static final int ITERATIONS = 50;
private static final StopWatch watch = new StopWatch();
private static final Collection<String> IGNORED_WRITE_CONCERNS = Arrays.asList("MAJORITY", "REPLICAS_SAFE",
"FSYNC_SAFE", "JOURNAL_SAFE");
"FSYNC_SAFE", "FSYNCED", "JOURNAL_SAFE", "JOURNALED", "REPLICA_ACKNOWLEDGED");
private static final int COLLECTION_SIZE = 1024 * 1024 * 256; // 256 MB
private static final Collection<String> COLLECTION_NAMES = Arrays.asList("template", "driver", "person");
Mongo mongo;
MongoTemplate operations;
PersonRepository repository;
MongoConverter converter;
@Before
public void setUp() throws Exception {
this.mongo = new Mongo();
this.operations = new MongoTemplate(new SimpleMongoDbFactory(this.mongo, DATABASE_NAME));
SimpleMongoDbFactory mongoDbFactory = new SimpleMongoDbFactory(this.mongo, DATABASE_NAME);
MongoMappingContext context = new MongoMappingContext();
context.setInitialEntitySet(Collections.singleton(Person.class));
context.afterPropertiesSet();
this.converter = new MappingMongoConverter(mongoDbFactory, context);
this.operations = new MongoTemplate(new SimpleMongoDbFactory(this.mongo, DATABASE_NAME), converter);
MongoRepositoryFactoryBean<PersonRepository, Person, ObjectId> factory = new MongoRepositoryFactoryBean<PersonRepository, Person, ObjectId>();
factory.setMongoOperations(operations);
factory.setRepositoryInterface(PersonRepository.class);
factory.afterPropertiesSet();
repository = factory.getObject();
this.repository = factory.getObject();
}
@Test
@@ -90,69 +108,137 @@ public class PerformanceTests {
executeWithWriteConcerns(new WriteConcernCallback() {
public void doWithWriteConcern(String constantName, WriteConcern concern) {
writeHeadline("WriteConcern: " + constantName);
writingObjectsUsingPlainDriver("Writing %s objects using plain driver");
writingObjectsUsingMongoTemplate("Writing %s objects using template");
writingObjectsUsingRepositories("Writing %s objects using repository");
System.out.println(String.format("Writing %s objects using plain driver took %sms", NUMBER_OF_PERSONS,
writingObjectsUsingPlainDriver(NUMBER_OF_PERSONS)));
System.out.println(String.format("Writing %s objects using template took %sms", NUMBER_OF_PERSONS,
writingObjectsUsingMongoTemplate(NUMBER_OF_PERSONS)));
System.out.println(String.format("Writing %s objects using repository took %sms", NUMBER_OF_PERSONS,
writingObjectsUsingRepositories(NUMBER_OF_PERSONS)));
writeFooter();
}
});
}
@Test
public void writeAndRead() {
public void plainConversion() throws InterruptedException {
Statistics statistics = new Statistics("Plain conversion of " + NUMBER_OF_PERSONS * 100
+ " persons - After %s iterations");
List<DBObject> dbObjects = getPersonDBObjects(NUMBER_OF_PERSONS * 100);
for (int i = 0; i < ITERATIONS; i++) {
statistics.registerTime(Api.DIRECT, Mode.READ, convertDirectly(dbObjects));
statistics.registerTime(Api.CONVERTER, Mode.READ, convertUsingConverter(dbObjects));
}
statistics.printResults(ITERATIONS);
}
private long convertDirectly(final List<DBObject> dbObjects) {
executeWatched(new WatchCallback<List<Person>>() {
@Override
public List<Person> doInWatch() {
List<Person> persons = new ArrayList<PerformanceTests.Person>();
for (DBObject dbObject : dbObjects) {
persons.add(Person.from(dbObject));
}
return persons;
}
});
return watch.getLastTaskTimeMillis();
}
private long convertUsingConverter(final List<DBObject> dbObjects) {
executeWatched(new WatchCallback<List<Person>>() {
@Override
public List<Person> doInWatch() {
List<Person> persons = new ArrayList<PerformanceTests.Person>();
for (DBObject dbObject : dbObjects) {
persons.add(converter.read(Person.class, dbObject));
}
return persons;
}
});
return watch.getLastTaskTimeMillis();
}
@Test
public void writeAndRead() throws Exception {
mongo.setWriteConcern(WriteConcern.SAFE);
for (int i = 3; i > 0; i--) {
readsAndWrites(NUMBER_OF_PERSONS, ITERATIONS);
}
private void readsAndWrites(int numberOfPersons, int iterations) {
Statistics statistics = new Statistics("Reading " + numberOfPersons + " - After %s iterations");
for (int i = 0; i < iterations; i++) {
setupCollections();
writeHeadline("Plain driver");
writingObjectsUsingPlainDriver("Writing %s objects using plain driver");
readingUsingPlainDriver("Reading all objects using plain driver");
queryUsingPlainDriver("Executing query using plain driver");
writeFooter();
statistics.registerTime(Api.DRIVER, Mode.WRITE, writingObjectsUsingPlainDriver(numberOfPersons));
statistics.registerTime(Api.TEMPLATE, Mode.WRITE, writingObjectsUsingMongoTemplate(numberOfPersons));
statistics.registerTime(Api.REPOSITORY, Mode.WRITE, writingObjectsUsingRepositories(numberOfPersons));
writeHeadline("Template");
writingObjectsUsingMongoTemplate("Writing %s objects using template");
readingUsingTemplate("Reading all objects using template");
queryUsingTemplate("Executing query using template");
writeFooter();
statistics.registerTime(Api.DRIVER, Mode.READ, readingUsingPlainDriver());
statistics.registerTime(Api.TEMPLATE, Mode.READ, readingUsingTemplate());
statistics.registerTime(Api.REPOSITORY, Mode.READ, readingUsingRepository());
writeHeadline("Repositories");
writingObjectsUsingRepositories("Writing %s objects using repository");
readingUsingRepository("Reading all objects using repository");
queryUsingRepository("Executing query using repository");
writeFooter();
statistics.registerTime(Api.DRIVER, Mode.QUERY, queryUsingPlainDriver());
statistics.registerTime(Api.TEMPLATE, Mode.QUERY, queryUsingTemplate());
statistics.registerTime(Api.REPOSITORY, Mode.QUERY, queryUsingRepository());
writeFooter();
if (i > 0 && i % (iterations / 10) == 0) {
statistics.printResults(i);
}
}
statistics.printResults(iterations);
}
private void writeHeadline(String headline) {
System.out.println(headline);
System.out.println("---------------------------------".substring(0, headline.length()));
System.out.println(createUnderline(headline));
}
private void writeFooter() {
System.out.println();
}
private void queryUsingTemplate(String template) {
executeWatchedWithTimeAndResultSize(template, new WatchCallback<List<Person>>() {
private long queryUsingTemplate() {
executeWatched(new WatchCallback<List<Person>>() {
public List<Person> doInWatch() {
Query query = query(where("addresses.zipCode").regex(".*1.*"));
return operations.find(query, Person.class, "template");
}
});
return watch.getLastTaskTimeMillis();
}
private void queryUsingRepository(String template) {
executeWatchedWithTimeAndResultSize(template, new WatchCallback<List<Person>>() {
private long queryUsingRepository() {
executeWatched(new WatchCallback<List<Person>>() {
public List<Person> doInWatch() {
return repository.findByAddressesZipCodeContaining("1");
}
});
return watch.getLastTaskTimeMillis();
}
private void executeWithWriteConcerns(WriteConcernCallback callback) {
@@ -181,7 +267,7 @@ public class PerformanceTests {
for (String collectionName : COLLECTION_NAMES) {
DBCollection collection = db.getCollection(collectionName);
collection.drop();
db.command(getCreateCollectionCommand(collectionName));
collection.getDB().command(getCreateCollectionCommand(collectionName));
collection.ensureIndex(new BasicDBObject("firstname", -1));
collection.ensureIndex(new BasicDBObject("lastname", -1));
}
@@ -195,38 +281,42 @@ public class PerformanceTests {
return dbObject;
}
private void writingObjectsUsingPlainDriver(String template) {
private long writingObjectsUsingPlainDriver(int numberOfPersons) {
final DBCollection collection = mongo.getDB(DATABASE_NAME).getCollection("driver");
final List<DBObject> persons = getPersonDBObjects();
final List<Person> persons = getPersonObjects(numberOfPersons);
executeWatchedWithTime(template, new WatchCallback<Void>() {
executeWatched(new WatchCallback<Void>() {
public Void doInWatch() {
for (DBObject person : persons) {
collection.save(person);
for (Person person : persons) {
collection.save(person.toDBObject());
}
return null;
}
});
return watch.getLastTaskTimeMillis();
}
private void writingObjectsUsingRepositories(String template) {
private long writingObjectsUsingRepositories(int numberOfPersons) {
final List<Person> persons = getPersonObjects();
final List<Person> persons = getPersonObjects(numberOfPersons);
executeWatchedWithTime(template, new WatchCallback<Void>() {
executeWatched(new WatchCallback<Void>() {
public Void doInWatch() {
repository.save(persons);
return null;
}
});
return watch.getLastTaskTimeMillis();
}
private void writingObjectsUsingMongoTemplate(String template) {
private long writingObjectsUsingMongoTemplate(int numberOfPersons) {
final List<Person> persons = getPersonObjects();
final List<Person> persons = getPersonObjects(numberOfPersons);
executeWatchedWithTime(template, new WatchCallback<Void>() {
executeWatched(new WatchCallback<Void>() {
public Void doInWatch() {
for (Person person : persons) {
operations.save(person, "template");
@@ -234,82 +324,95 @@ public class PerformanceTests {
return null;
}
});
return watch.getLastTaskTimeMillis();
}
private void readingUsingPlainDriver(String template) {
private long readingUsingPlainDriver() {
final DBCollection collection = mongo.getDB(DATABASE_NAME).getCollection("driver");
executeWatchedWithTimeAndResultSize(String.format(template, NUMBER_OF_PERSONS), new WatchCallback<List<Person>>() {
executeWatched(new WatchCallback<List<Person>>() {
public List<Person> doInWatch() {
return toPersons(collection.find());
return toPersons(mongo.getDB(DATABASE_NAME).getCollection("driver").find());
}
});
return watch.getLastTaskTimeMillis();
}
private void readingUsingTemplate(String template) {
executeWatchedWithTimeAndResultSize(String.format(template, NUMBER_OF_PERSONS), new WatchCallback<List<Person>>() {
private long readingUsingTemplate() {
executeWatched(new WatchCallback<List<Person>>() {
public List<Person> doInWatch() {
return operations.findAll(Person.class, "template");
}
});
return watch.getLastTaskTimeMillis();
}
private void readingUsingRepository(String template) {
executeWatchedWithTimeAndResultSize(String.format(template, NUMBER_OF_PERSONS), new WatchCallback<List<Person>>() {
private long readingUsingRepository() {
executeWatched(new WatchCallback<List<Person>>() {
public List<Person> doInWatch() {
return repository.findAll();
}
});
return watch.getLastTaskTimeMillis();
}
private void queryUsingPlainDriver(String template) {
private long queryUsingPlainDriver() {
final DBCollection collection = mongo.getDB(DATABASE_NAME).getCollection("driver");
executeWatchedWithTimeAndResultSize(template, new WatchCallback<List<Person>>() {
executeWatched(new WatchCallback<List<Person>>() {
public List<Person> doInWatch() {
DBCollection collection = mongo.getDB(DATABASE_NAME).getCollection("driver");
DBObject regex = new BasicDBObject("$regex", Pattern.compile(".*1.*"));
DBObject query = new BasicDBObject("addresses.zipCode", regex);
return toPersons(collection.find(query));
}
});
return watch.getLastTaskTimeMillis();
}
private List<DBObject> getPersonDBObjects() {
private List<Person> getPersonObjects(int numberOfPersons) {
List<DBObject> result = new ArrayList<DBObject>(NUMBER_OF_PERSONS);
List<Person> result = new ArrayList<Person>();
for (Person person : getPersonObjects()) {
result.add(person.toDBObject());
}
for (int i = 0; i < numberOfPersons; i++) {
return result;
}
List<Address> addresses = new ArrayList<Address>();
private List<Person> getPersonObjects() {
for (int a = 0; a < 5; a++) {
addresses.add(new Address("zip" + a, "city" + a));
}
List<Person> result = new ArrayList<Person>(NUMBER_OF_PERSONS);
Person person = new Person("Firstname" + i, "Lastname" + i, addresses);
watch.start("Created " + NUMBER_OF_PERSONS + " Persons");
for (int o = 0; o < 10; o++) {
person.orders.add(new Order(LineItem.generate()));
}
for (int i = 0; i < NUMBER_OF_PERSONS; i++) {
Address address = new Address("zip" + i, "city" + i);
Person person = new Person("Firstname" + i, "Lastname" + i, Arrays.asList(address));
person.orders.add(new Order(LineItem.generate()));
person.orders.add(new Order(LineItem.generate()));
result.add(person);
}
watch.stop();
return result;
}
private <T> T executeWatched(String template, WatchCallback<T> callback) {
private List<DBObject> getPersonDBObjects(int numberOfPersons) {
watch.start(String.format(template, NUMBER_OF_PERSONS));
List<DBObject> dbObjects = new ArrayList<DBObject>(numberOfPersons);
for (Person person : getPersonObjects(numberOfPersons)) {
dbObjects.add(person.toDBObject());
}
return dbObjects;
}
private <T> T executeWatched(WatchCallback<T> callback) {
watch.start();
try {
return callback.doInWatch();
@@ -318,28 +421,6 @@ public class PerformanceTests {
}
}
private <T> void executeWatchedWithTime(String template, WatchCallback<?> callback) {
executeWatched(template, callback);
printStatistics(null);
}
private <T> void executeWatchedWithTimeAndResultSize(String template, WatchCallback<List<T>> callback) {
printStatistics(executeWatched(template, callback));
}
private void printStatistics(Collection<?> result) {
long time = watch.getLastTaskTimeMillis();
StringBuilder builder = new StringBuilder(watch.getLastTaskName());
if (result != null) {
builder.append(" returned ").append(result.size()).append(" results and");
}
builder.append(" took ").append(time).append(" milliseconds");
System.out.println(builder);
}
private static List<Person> toPersons(DBCursor cursor) {
List<Person> persons = new ArrayList<Person>();
@@ -354,10 +435,9 @@ public class PerformanceTests {
static class Person {
ObjectId id;
@Indexed
final String firstname, lastname;
final List<Address> addresses;
final Set<Order> orders;
String firstname, lastname;
List<Address> addresses;
Set<Order> orders;
public Person(String firstname, String lastname, List<Address> addresses) {
this.firstname = firstname;
@@ -579,11 +659,253 @@ public class PerformanceTests {
DBObject toDBObject();
}
private static List<DBObject> writeAll(Collection<? extends Convertible> convertibles) {
List<DBObject> result = new ArrayList<DBObject>();
private static BasicDBList writeAll(Collection<? extends Convertible> convertibles) {
BasicDBList result = new BasicDBList();
for (Convertible convertible : convertibles) {
result.add(convertible.toDBObject());
}
return result;
}
static enum Api {
DRIVER, TEMPLATE, REPOSITORY, DIRECT, CONVERTER;
}
static enum Mode {
WRITE, READ, QUERY;
}
private static class Statistics {
private final String headline;
private final Map<Mode, ModeTimes> times;
public Statistics(String headline) {
this.headline = headline;
this.times = new HashMap<Mode, ModeTimes>();
for (Mode mode : Mode.values()) {
times.put(mode, new ModeTimes(mode));
}
}
public void registerTime(Api api, Mode mode, double time) {
times.get(mode).add(api, time);
}
public void printResults(int iterations) {
String title = String.format(headline, iterations);
System.out.println(title);
System.out.println(createUnderline(title));
StringBuilder builder = new StringBuilder();
for (Mode mode : Mode.values()) {
String print = times.get(mode).print();
if (!print.isEmpty()) {
builder.append(print).append('\n');
}
}
System.out.println(builder.toString());
}
@Override
public String toString() {
StringBuilder builder = new StringBuilder(times.size());
for (ModeTimes times : this.times.values()) {
builder.append(times.toString());
}
return builder.toString();
}
}
private static String createUnderline(String input) {
StringBuilder builder = new StringBuilder(input.length());
for (int i = 0; i < input.length(); i++) {
builder.append("-");
}
return builder.toString();
}
static class ApiTimes {
private static final String TIME_TEMPLATE = "%s %s time -\tAverage: %sms%s,%sMedian: %sms%s";
private static final DecimalFormat TIME_FORMAT;
private static final DecimalFormat DEVIATION_FORMAT;
static {
TIME_FORMAT = new DecimalFormat("0.00");
DEVIATION_FORMAT = new DecimalFormat("0.00");
DEVIATION_FORMAT.setPositivePrefix("+");
}
private final Api api;
private final Mode mode;
private final List<Double> times;
public ApiTimes(Api api, Mode mode) {
this.api = api;
this.mode = mode;
this.times = new ArrayList<Double>();
}
public void add(double time) {
this.times.add(time);
}
public boolean hasTimes() {
return !times.isEmpty();
}
public double getAverage() {
double result = 0;
for (Double time : times) {
result += time;
}
return result == 0.0 ? 0.0 : result / times.size();
}
public double getMedian() {
if (times.isEmpty()) {
return 0.0;
}
ArrayList<Double> list = new ArrayList<Double>(times);
Collections.sort(list);
int size = list.size();
if (size % 2 == 0) {
return (list.get(size / 2 - 1) + list.get(size / 2)) / 2;
} else {
return list.get(size / 2);
}
}
private double getDeviationFrom(double otherAverage) {
double average = getAverage();
return average * 100 / otherAverage - 100;
}
private double getMedianDeviationFrom(double otherMedian) {
double median = getMedian();
return median * 100 / otherMedian - 100;
}
public String print() {
if (times.isEmpty()) {
return "";
}
return basicPrint("", "\t\t", "") + '\n';
}
private String basicPrint(String extension, String middle, String foo) {
return String.format(TIME_TEMPLATE, api, mode, TIME_FORMAT.format(getAverage()), extension, middle,
TIME_FORMAT.format(getMedian()), foo);
}
public String print(double referenceAverage, double referenceMedian) {
if (times.isEmpty()) {
return "";
}
return basicPrint(String.format(" %s%%", DEVIATION_FORMAT.format(getDeviationFrom(referenceAverage))), "\t",
String.format(" %s%%", DEVIATION_FORMAT.format(getMediaDeviationFrom(referenceMedian)))) + '\n';
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return times.isEmpty() ? "" : String.format("%s, %s: %s", api, mode,
StringUtils.collectionToCommaDelimitedString(times)) + '\n';
}
}
static class ModeTimes {
private final Map<Api, ApiTimes> times;
public ModeTimes(Mode mode) {
this.times = new HashMap<Api, ApiTimes>();
for (Api api : Api.values()) {
this.times.put(api, new ApiTimes(api, mode));
}
}
public void add(Api api, double time) {
times.get(api).add(time);
}
@SuppressWarnings("null")
public String print() {
if (times.isEmpty()) {
return "";
}
Double previousTime = null;
Double previousMedian = null;
StringBuilder builder = new StringBuilder();
for (Api api : Api.values()) {
ApiTimes apiTimes = times.get(api);
if (!apiTimes.hasTimes()) {
continue;
}
if (previousTime == null) {
builder.append(apiTimes.print());
previousTime = apiTimes.getAverage();
previousMedian = apiTimes.getMedian();
} else {
builder.append(apiTimes.print(previousTime, previousMedian));
}
}
return builder.toString();
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
StringBuilder builder = new StringBuilder(times.size());
for (ApiTimes times : this.times.values()) {
builder.append(times.toString());
}
return builder.toString();
}
}
}
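To make the reporting code above easier to follow: ApiTimes reports the average and the median of the recorded samples, and the percentage printed next to the non-baseline APIs is value * 100 / reference - 100, i.e. the relative deviation from the first API that produced timings. A small self-contained sketch of that arithmetic (illustrative values only, not part of this change set):

import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class StatisticsArithmeticSketch {

	public static void main(String[] args) {

		List<Double> templateTimes = Arrays.asList(12.0, 14.0, 11.0, 13.0);
		double driverAverage = 10.0; // assumed baseline average

		double average = average(templateTimes);                // 12.5
		double median = median(templateTimes);                  // (12 + 13) / 2 = 12.5
		double deviation = average * 100 / driverAverage - 100; // +25 percent relative to the baseline

		System.out.println("average=" + average + ", median=" + median + ", deviation=" + deviation + "%");
	}

	static double average(List<Double> times) {
		double sum = 0;
		for (double time : times) {
			sum += time;
		}
		return sum / times.size();
	}

	static double median(List<Double> times) {
		List<Double> sorted = new ArrayList<Double>(times);
		Collections.sort(sorted);
		int size = sorted.size();
		return size % 2 == 0 ? (sorted.get(size / 2 - 1) + sorted.get(size / 2)) / 2 : sorted.get(size / 2);
	}
}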

View File

@@ -53,11 +53,9 @@ import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
@RunWith(SpringJUnit4ClassRunner.class)
public abstract class AbstractPersonRepositoryIntegrationTests {
@Autowired
protected PersonRepository repository;
@Autowired protected PersonRepository repository;
@Autowired
MongoOperations operations;
@Autowired MongoOperations operations;
Person dave, oliver, carter, boyd, stefan, leroi, alicia;
QPerson person;
@@ -546,4 +544,116 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
assertThat(result, hasSize(1));
assertThat(result, hasItem(dave));
}
/**
* @see DATAMONGO-701
*/
@Test
public void executesDerivedStartsWithQueryCorrectly() {
List<Person> result = repository.findByLastnameStartsWith("Matt");
assertThat(result, hasSize(2));
assertThat(result, hasItems(dave, oliver));
}
/**
* @see DATAMONGO-701
*/
@Test
public void executesDerivedEndsWithQueryCorrectly() {
List<Person> result = repository.findByLastnameEndsWith("thews");
assertThat(result, hasSize(2));
assertThat(result, hasItems(dave, oliver));
}
/**
* @see DATAMONGO-445
*/
@Test
public void executesGeoPageQueryForWithPageRequestForPageInBetween() {
Point farAway = new Point(-73.9, 40.7);
Point here = new Point(-73.99, 40.73);
dave.setLocation(farAway);
oliver.setLocation(here);
carter.setLocation(here);
boyd.setLocation(here);
leroi.setLocation(here);
repository.save(Arrays.asList(dave, oliver, carter, boyd, leroi));
GeoPage<Person> results = repository.findByLocationNear(new Point(-73.99, 40.73), new Distance(2000,
Metrics.KILOMETERS), new PageRequest(1, 2));
assertThat(results.getContent().isEmpty(), is(false));
assertThat(results.getNumberOfElements(), is(2));
assertThat(results.isFirstPage(), is(false));
assertThat(results.isLastPage(), is(false));
assertThat(results.getAverageDistance().getMetric(), is((Metric) Metrics.KILOMETERS));
assertThat(results.getAverageDistance().getNormalizedValue(), is(0.0));
}
/**
* @see DATAMONGO-445
*/
@Test
public void executesGeoPageQueryForWithPageRequestForPageAtTheEnd() {
Point point = new Point(-73.99171, 40.738868);
dave.setLocation(point);
oliver.setLocation(point);
carter.setLocation(point);
repository.save(Arrays.asList(dave, oliver, carter));
GeoPage<Person> results = repository.findByLocationNear(new Point(-73.99, 40.73), new Distance(2000,
Metrics.KILOMETERS), new PageRequest(1, 2));
assertThat(results.getContent().isEmpty(), is(false));
assertThat(results.getNumberOfElements(), is(1));
assertThat(results.isFirstPage(), is(false));
assertThat(results.isLastPage(), is(true));
assertThat(results.getAverageDistance().getMetric(), is((Metric) Metrics.KILOMETERS));
}
/**
* @see DATAMONGO-445
*/
@Test
public void executesGeoPageQueryForWithPageRequestForJustOneElement() {
Point point = new Point(-73.99171, 40.738868);
dave.setLocation(point);
repository.save(dave);
GeoPage<Person> results = repository.findByLocationNear(new Point(-73.99, 40.73), new Distance(2000,
Metrics.KILOMETERS), new PageRequest(0, 2));
assertThat(results.getContent().isEmpty(), is(false));
assertThat(results.getNumberOfElements(), is(1));
assertThat(results.isFirstPage(), is(true));
assertThat(results.isLastPage(), is(true));
assertThat(results.getAverageDistance().getMetric(), is((Metric) Metrics.KILOMETERS));
}
/**
* @see DATAMONGO-445
*/
@Test
public void executesGeoPageQueryForWithPageRequestForJustOneElementEmptyPage() {
dave.setLocation(new Point(-73.99171, 40.738868));
repository.save(dave);
GeoPage<Person> results = repository.findByLocationNear(new Point(-73.99, 40.73), new Distance(2000,
Metrics.KILOMETERS), new PageRequest(1, 2));
assertThat(results.getContent().isEmpty(), is(true));
assertThat(results.getNumberOfElements(), is(0));
assertThat(results.isFirstPage(), is(false));
assertThat(results.isLastPage(), is(true));
assertThat(results.getAverageDistance().getMetric(), is((Metric) Metrics.KILOMETERS));
}
}
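
The page-boundary assertions above follow directly from Spring Data's PageRequest arithmetic: a zero-based page index and a page size determine how many matches are skipped before the page is filled. A small standalone sketch, not part of the test class:

import org.springframework.data.domain.PageRequest;

class GeoPagingArithmetic {

	public static void main(String[] args) {
		PageRequest secondPage = new PageRequest(1, 2); // zero-based page 1, page size 2
		int skipped = secondPage.getPageNumber() * secondPage.getPageSize(); // 1 * 2 = 2 matches skipped
		// With three matching people, skipping two leaves exactly one element on page 1,
		// which is therefore not the first page but is the last one.
		System.out.println("matches skipped before the page: " + skipped);
	}
}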

View File

@@ -47,6 +47,10 @@ public interface PersonRepository extends MongoRepository<Person, String>, Query
*/
List<Person> findByLastname(String lastname);
List<Person> findByLastnameStartsWith(String prefix);
List<Person> findByLastnameEndsWith(String postfix);
/**
* Returns all {@link Person}s with the given lastname ordered by their firstname.
*

View File

@@ -263,7 +263,7 @@ public class MongoQueryCreatorUnitTests {
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, "Matt"), context);
Query query = creator.createQuery();
assertThat(query, is(query(where("foo").regex("Matt.*"))));
assertThat(query, is(query(where("foo").regex("^Matt"))));
}
/**
@@ -276,7 +276,7 @@ public class MongoQueryCreatorUnitTests {
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, "ews"), context);
Query query = creator.createQuery();
assertThat(query, is(query(where("foo").regex(".*ews"))));
assertThat(query, is(query(where("foo").regex("ews$"))));
}
/**
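
The two assertion changes above reflect the anchored-regex behavior (cf. DATAMONGO-701 in the changelog below): derived StartsWith and EndsWith queries now emit anchored regular expressions. A prefix regex anchored with ^ lets MongoDB derive index bounds instead of scanning every value, and the anchors also stop the pattern from matching in the middle of a value. A minimal sketch using the Criteria API; the field name "lastname" is only an example:

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.query.Query;

class AnchoredRegexExample {

	Query startsWith() {
		// Anchored at the start: an index on "lastname" can be used as a bounded range scan,
		// and values merely containing "Matt" no longer match.
		return query(where("lastname").regex("^Matt"));
	}

	Query endsWith() {
		// Anchored at the end: still evaluated against every value, but it no longer
		// matches "ews" occurring in the middle of a value as ".*ews" effectively did.
		return query(where("lastname").regex("ews$"));
	}
}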

View File

@@ -7,7 +7,7 @@
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context.xsd">
<mongo:db-factory />
<mongo:db-factory dbname="validation" />
<mongo:mapping-converter base-package="org.springframework.data.mongodb.core" />

View File

@@ -52,7 +52,7 @@
<xi:include href="introduction/why-sd-doc.xml"/>
<xi:include href="introduction/requirements.xml"/>
<xi:include href="introduction/getting-started.xml"/>
<xi:include href="https://github.com/SpringSource/spring-data-commons/raw/1.4.0.RELEASE/src/docbkx/repositories.xml">
<xi:include href="https://raw.github.com/SpringSource/spring-data-commons/1.5.3.RELEASE/src/docbkx/repositories.xml">
<xi:fallback href="../../../spring-data-commons/src/docbkx/repositories.xml" />
</xi:include>
</part>
@@ -72,10 +72,10 @@
<part id="appendix">
<title>Appendix</title>
<xi:include href="https://raw.github.com/SpringSource/spring-data-commons/1.4.0.RELEASE/src/docbkx/repository-namespace-reference.xml">
<xi:include href="https://raw.github.com/SpringSource/spring-data-commons/1.5.3.RELEASE/src/docbkx/repository-namespace-reference.xml">
<xi:fallback href="../../../spring-data-commons/src/docbkx/repository-namespace-reference.xml" />
</xi:include>
<xi:include href="https://raw.github.com/SpringSource/spring-data-commons/1.4.0.RELEASE/src/docbkx/repository-query-keywords-reference.xml">
<xi:include href="https://raw.github.com/SpringSource/spring-data-commons/1.5.3.RELEASE/src/docbkx/repository-query-keywords-reference.xml">
<xi:fallback href="../../../spring-data-commons/src/docbkx/repository-query-keywords-reference.xml" />
</xi:include>
</part>

Binary file not shown.

View File

@@ -542,4 +542,52 @@ Page&lt;Person&gt; page = repository.findAll(person.lastname.contains("a"),
MongoDB queries.</para>
</section>
</section>
<section>
<title>Miscellaneous</title>
<para/>
<section>
<title>CDI Integration</title>
<para>Instances of the repository interfaces are usually created by a
container, and Spring is the most natural choice when working with
Spring Data. As of version 1.3.0 Spring Data MongoDB ships with a custom
CDI extension that allows using the repository abstraction in CDI
environments. The extension is part of the JAR, so all you need to do to
activate it is to drop the Spring Data MongoDB JAR into your classpath.
You can now set up the infrastructure by implementing a CDI Producer for
the <classname>MongoTemplate</classname>:</para>
<programlisting language="java">class MongoTemplateProducer {
@Produces
@ApplicationScoped
public MongoOperations createMongoTemplate() throws UnknownHostException, MongoException {
MongoDbFactory factory = new SimpleMongoDbFactory(new Mongo(), "database");
return new MongoTemplate(factory);
}
}</programlisting>
<para>The Spring Data MongoDB CDI extension will pick up the
<classname>MongoTemplate</classname> available as a CDI bean and create a
proxy for a Spring Data repository whenever a bean of a repository type
is requested by the container. Thus, obtaining an instance of a Spring
Data repository is a matter of declaring an <code>@Inject</code>-ed
property:</para>
<programlisting language="java">class RepositoryClient {
@Inject
PersonRepository repository;
public void businessMethod() {
List&lt;Person&gt; people = repository.findAll();
}
}</programlisting>
</section>
</section>
</chapter>

View File

@@ -1,6 +1,77 @@
Spring Data MongoDB Changelog
=============================
Changes in version 1.2.4.GA (2013-09-30)
----------------------------------------
** Bug
* [DATAMONGO-445] - GeoNear Query Doesn't Work with Pageable
* [DATAMONGO-602] - Querying with $in operator on the id field of type BigInteger returns zero results
* [DATAMONGO-737] - Extra MongoSynchronizations cause TransactionSynchronizationManager to throw IllegalStateException on transaction complete
** Task
* [DATAMONGO-742] - Document CDI integration in reference documentation
* [DATAMONGO-755] - Release 1.2.4
Changes in version 1.2.3.GA (2013-07-24)
----------------------------------------
** Task
* [DATAMONGO-728] - Add missing package-info.java files
* [DATAMONGO-729] - Release 1.2.3.
Changes in version 1.2.2.GA (2013-07-19)
----------------------------------------
** Bug
* [DATAMONGO-663] - org.springframework.data.mongodb.core.query.Field needs an equals method
* [DATAMONGO-677] - QueryMapper does not correctly handle a Map with a DBRef value
* [DATAMONGO-679] - MongoTemplate.doSave(…) does not save a JSON String passed to it
* [DATAMONGO-683] - QueryMapper does not handle default _id when no MappingMetadata is present
* [DATAMONGO-685] - JMX ServerInfo bean may return wrong info
* [DATAMONGO-693] - MongoFactoryBean should create a mongo instance with host/port if replicaset is null or empty
* [DATAMONGO-704] - Remove references to SimpleMongoConverter from JavaDoc.
* [DATAMONGO-705] - QueryMapper doesn't handle an exists query with a DBRef field
* [DATAMONGO-706] - QueryMapper does not transform DBRefs in nested keywords correctly
* [DATAMONGO-717] - Application context is not properly distributed to persistent entities
** Improvement
* [DATAMONGO-682] - Remove performance hotspots
* [DATAMONGO-701] - Improve performance of indexed starts-with queries
** Task
* [DATAMONGO-658] - Minor formatting changes to README.md
* [DATAMONGO-678] - Performance improvements in CustomConversions
* [DATAMONGO-714] - Add latest formatter to project sources
* [DATAMONGO-723] - Clean up test cases
* [DATAMONGO-727] - Release 1.2.2
Changes in version 1.2.1.GA (2013-04-17)
----------------------------------------
** Bug
* [DATAMONGO-571] - Spring Data for MongoDb doesn't save null values when @Version is added to domain class
* [DATAMONGO-612] - Fix PDF reference documentation name
* [DATAMONGO-613] - Images missing from reference documentation
* [DATAMONGO-617] - NullPointerException in MongoTemplate.initializeVersionProperty(…)
* [DATAMONGO-620] - MongoTemplate.doSaveVersioned(…) does not consider collection handed into the method
* [DATAMONGO-621] - MongoTemplate.initializeVersionProperty(…) does not use ConversionService
* [DATAMONGO-622] - An unversioned object should be created using insert(…) instead of save.
* [DATAMONGO-629] - Different results when using count and find with the same criteria with 'id' field
* [DATAMONGO-638] - MappingContext should not create PersistentEntity instances for native maps
* [DATAMONGO-640] - MongoLog4jAppender suffers from potential NullPointerException when closing Mongo instance
* [DATAMONGO-641] - MongoLog4jAppender suffers from potential NullPointerException when closing Mongo instance
* [DATAMONGO-642] - MongoChangeSetPersister does not use mapped collection name
* [DATAMONGO-646] - Can't insert DBObjects through MongoTemplate
* [DATAMONGO-648] - ID attributes in namespace shouldn't be XSD IDs
** Improvement
* [DATAMONGO-594] - cross-store=> Define document name using annotation
* [DATAMONGO-632] - Polish namespace XSD to avoid errors in STS
* [DATAMONGO-635] - Fix some Sonar warnings
* [DATAMONGO-637] - Typo in Query.query(…)
* [DATAMONGO-651] - WriteResult not available from thrown Exception
* [DATAMONGO-656] - Potential NullPointerException when debugging in MongoTemplate
** Task
* [DATAMONGO-597] - Website is severely out-of-date
* [DATAMONGO-654] - Release 1.2.1
Changes in version 1.2.0.GA (2013-02-08)
----------------------------------------
** Bug

View File

@@ -1,5 +1,5 @@
Spring Data Document 1.1.2
Copyright (c) [2010-2013] SpringSource, a division of VMware, Inc.
Spring Data Document 1.2.4
Copyright (c) [2010-2013] Pivotal, Inc.
This product is licensed to you under the Apache License, Version 2.0 (the "License").
You may not use this product except in compliance with the License.

View File

@@ -1,4 +1,4 @@
SPRING DATA MongoDB 1.0.0.M5
SPRING DATA MongoDB 1.2.3.GA
----------------------------
Spring Data MongoDB is released under the terms of the Apache Software License Version 2.0 (see license.txt).