Compare commits


58 Commits

Author SHA1 Message Date
Spring Buildmaster
1202d2cbc2 DATAMONGO-755 - Prepare next development iteration. 2013-09-30 05:01:30 -07:00
Spring Buildmaster
0edf6c06ed DATAMONGO-755 - Release version 1.2.4.RELEASE. 2013-09-30 05:01:28 -07:00
Oliver Gierke
0afd137e7d DATAMONGO-755 - Prepare 1.2.4.RELEASE.
Upgraded to Spring Data Core 1.5.3. Updated changelog, readme etc.
2013-09-30 13:40:36 +02:00
Thomas Darimont
bc3f44197b DATAMONGO-445 - Allow to skip unnecessary elements in NearQuery.
Added support for skipping elements for NearQuery in MongoTemplate. As MongoDB currently (2.4.4) doesn't support the skipping of elements in geoNear queries, we skip the unnecessary elements ourselves. We use the limit and skip information from the given query or an explicitly passed Pageable.

Original pull request: #64.
2013-08-22 18:22:00 +02:00
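
For illustration, a minimal sketch of how the skip support described above could be used — `template` (a `MongoOperations` instance) and the `Venue` domain type are assumptions, not part of the commit:

```java
import org.springframework.data.domain.PageRequest;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.query.NearQuery;

// Ask for the second page of 10 results around a point. Since MongoDB's
// geoNear command cannot skip, MongoTemplate drops the first 10 elements
// on the client before returning the GeoResults.
NearQuery nearQuery = NearQuery.near(new Point(-73.99171, 40.738868))
        .with(new PageRequest(1, 10));

GeoResults<Venue> results = template.geoNear(nearQuery, Venue.class);
```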
Thomas Darimont
28bd631579 DATAMONGO-742 - Document CDI integration in reference documentation.
Added chapter for CDI Integration under the new chapter Miscellaneous.

Original pull request: #63.
2013-08-13 12:25:59 +02:00
Randy Watler
5396df9af4 DATAMONGO-737 - Register TransactionSynchronization holder once per Mongo instance.
Original pull request: #62.
2013-08-12 17:39:42 +02:00
Thomas Darimont
2b864e9744 DATAMONGO-507 - Reject incorrect usage of Criteria#not().
Added a guard to Criteria#(and|or|nor)Operator to prevent wrapping $and, $or or $nor expressions in a $not expression, as MongoDB currently doesn't support this. Added a test case to CriteriaTests to verify that not() works as specified.

Original pull request: #60.
2013-08-09 12:24:40 +02:00
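
A short, hypothetical sketch of the behaviour the guard enforces (names and values are made up):

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;

import org.springframework.data.mongodb.core.query.Criteria;

// Negating a plain criterion is still fine: { age: { $not: { $gt: 30 } } }
Criteria valid = where("age").not().gt(30);

// MongoDB cannot wrap $or/$and/$nor in $not, so combining not() with
// orOperator(…) is now rejected with an exception instead of silently
// producing a query the server cannot execute.
Criteria invalid = where("age").not()
        .orOperator(where("age").lt(18), where("age").gt(65));
```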
Spring Buildmaster
5accbbdac5 DATAMONGO-729 - Prepare next development iteration. 2013-07-24 06:32:26 -07:00
Spring Buildmaster
11d9f04fd1 DATAMONGO-729 - Release version 1.2.3.RELEASE. 2013-07-24 06:32:22 -07:00
Oliver Gierke
67b91e446e DATAMONGO-729 - Prepare 1.2.3 release.
Upgraded to Spring Data Build 1.0.4.RELEASE, Spring Data Commons 1.5.2.RELEASE. Updated changelog, notice and readmes. Removed Snapshot repository in favor of the release one.
2013-07-24 12:40:47 +02:00
Thomas Darimont
1124841e17 DATAMONGO-728 - Added missing package-info files. 2013-07-23 16:33:39 +02:00
Spring Buildmaster
10ccbf131d DATAMONGO-727 - Prepare next development iteration. 2013-07-19 06:37:58 -07:00
Spring Buildmaster
b29930b512 DATAMONGO-727 - Release version 1.2.2.RELEASE. 2013-07-19 06:37:54 -07:00
Oliver Gierke
d671fb13ae DATAMONGO-727 - Prepare 1.2.2.RELEASE. 2013-07-19 15:21:57 +02:00
Oliver Gierke
b0a10d19c3 DATAMONGO-723 - Cleaned up a few misnamed test cases.
Reactivated test cases that were named Test instead of Tests by accident.
2013-07-12 18:44:28 +02:00
Oliver Gierke
d8ef7e1472 DATAMONGO-717 - Partial back-port of change in AbstractMongoConfiguration.
Backported a part of DATAMONGO-569 to avoid lifecycle issues on MongoMappingContext. ….initialize() was called too early, so that other lifecycle callbacks (e.g. setApplicationContext(…)) were applied too late.
2013-07-10 23:00:36 +02:00
Thomas Darimont
b9a25eabae DATAMONGO-685 - ServerInfo should return used hostname reported by MongoDB.
Added test case getHostNameShouldReturnServerNameReportedByMongo() to MongoMonitorIntegrationTests. Modified MongoMonitorIntegrationTests to use common mongo-infrastructure configuration. ServerInfo.getHostName() is now derived from serverStatus.serverUsed.

Original pull request: #51.
2013-07-10 14:30:40 +02:00
Oliver Gierke
e92e5c737f DATAMONGO-540 - Fixed test case.
Apparently the test case was not working on MongoDB instances in version 2.2.
2013-07-08 14:07:24 +02:00
Thomas Darimont
6e46fb12cb DATAMONGO-671 - Improve test case to not fail on fast machines. 2013-07-05 12:00:27 +02:00
Thomas Darimont
031d446a1c DATAMONGO-540 - Added test case to show findOne(…) works after upsert. 2013-07-05 12:00:18 +02:00
Oliver Gierke
e6bab1ce60 DATAMONGO-671 - Added integration tests to show lookups by date are working. 2013-07-05 12:00:04 +02:00
Oliver Gierke
2fffe0a5c4 DATAMONGO-714 - Updated formatter to latest changes. 2013-07-05 11:51:35 +02:00
Thomas Darimont
2493de5f91 DATAMONGO-693 - More robust handling of host and replica set config in MongoFactoryBean.
MongoFactoryBean now considers empty strings for the replicaPair property as not set at all. The ServerAddressPropertyEditor also returns null as value for empty text strings. Deprecated the setter for replica pairs on MongoFactoryBean.
2013-07-02 16:37:54 +02:00
Oliver Gierke
bd11bab076 DATAMONGO-675 - MongoTemplate now maps Update in findAndModify(…).
The Update object handed to ….findAndModify(…) is now passed through the QueryMapper before being executed.
2013-07-01 08:27:29 +02:00
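
For illustration, a minimal findAndModify(…) call that now benefits from the mapping step; `template`, `Person` and the field names are assumptions:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;
import static org.springframework.data.mongodb.core.query.Update.update;

// The Update is now run through the QueryMapper as well, so mapped field
// names (e.g. declared via @Field) and registered converters are applied
// to the update document before it is sent to the server.
Person modified = template.findAndModify(
        query(where("lastname").is("Gierke")),
        update("firstname", "Oliver"),
        Person.class);
```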
Oliver Gierke
b667984563 DATAMONGO-706 - Fixed DBRef conversion for nested keywords. 2013-06-29 13:07:52 +02:00
Oliver Gierke
7ef167ed96 DATAMONGO-705 - Fixed handling of DBRefs in QueryMapper.
QueryMapper now converts values to become DBRefs correctly in getMappedKeyword(…). Added an exclusion path for the value handling in case we have an $exists keyword.
2013-06-25 19:29:11 +02:00
Andrew Duncan
303a057d86 DATAMONGO-701 - Improve performance of starts-with and ends-with queries.
This changes the starts-with regex to the prefixed form using ^ to better make use of any index on the queried field. Also changes ending-with queries to use the $ anchor.
2013-06-25 18:03:09 +02:00
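
As a hypothetical example, repository methods like the following now derive anchored regular expressions, which lets MongoDB use an index on the queried field for the prefix case (interface and property names are made up):

```java
import java.util.List;

import org.springframework.data.mongodb.repository.MongoRepository;

public interface PersonRepository extends MongoRepository<Person, String> {

    // e.g. findByLastnameStartingWith("Gie") -> { lastname: { $regex: "^Gie" } }
    List<Person> findByLastnameStartingWith(String prefix);

    // e.g. findByLastnameEndingWith("ke") -> { lastname: { $regex: "ke$" } }
    List<Person> findByLastnameEndingWith(String suffix);
}
```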
Ivan Sopov
607072c0d3 DATAMONGO-704 - Removed references to SimpleMongoConverter from JavaDoc.
Removed SimpleMongoConverter references from the JavaDoc. In commit 2832b524d3, MappingMongoConverter was made the default instead of SimpleMongoConverter, and SimpleMongoConverter was removed entirely between the 1.0.0.M3 and 1.0.0.M4 releases. This updates the JavaDoc that still referenced SimpleMongoConverter as the default MongoConverter.
2013-06-25 17:32:05 +02:00
Oliver Gierke
11e9c562b3 DATAMONGO-683 - QueryMapper now handles default ID names if mapping metadata is missing.
Changed the implementation so that _id is considered an id field if no metadata is present. Heavily refactored QueryMapper internals so that the conversion code is more readable.
2013-05-24 09:18:57 +02:00
Oliver Gierke
220b211faa DATAMONGO-679 - Fixed saving plain JSON strings in MongoTemplate.
MongoTemplate.doSave(…) didn't handle plain JSON strings correctly. It now correctly passes the marshaled object to the underlying method.
2013-05-23 20:46:08 +02:00
Patryk Wąsik
c0c51fcc29 DATAMONGO-677 - QueryMapper now handles DBRefs in Maps correctly.
QueryMapper now handles Maps with DBRef values, which is needed when processing an update in MongoTemplate.doUpdate(…) to save versioned documents correctly.
2013-05-23 20:10:29 +02:00
Oliver Gierke
ed9eddf10e DATAMONGO-682 - Performance improvement for mapping hotspots.
This commit includes the MongoDB specific parts for the mapping subsystem performance improvements. Reworked PerformanceTest to output more reasonable numbers.

Heavily inspired by Patryk Wasik's contribution at https://github.com/SpringSource/spring-data-mongodb/pull/37.

GitHub PR: #37
Conflicts:
	spring-data-mongodb/src/test/java/org/springframework/data/mongodb/core/mapping/MappingTests.java
2013-05-23 19:52:34 +02:00
Oliver Gierke
e23d73d55e DATACMNS-379 - Adapt to changes in Spring Data Commons. 2013-05-12 23:50:42 +02:00
Philipp Schneider
9627fbaebf DATAMONGO-663 - Added equals(…) and hashCode() to Field. 2013-04-26 15:37:38 +02:00
Spring Buildmaster
ef93d4db0b DATAMONGO-654 - Prepare next development iteration. 2013-04-17 02:50:13 -07:00
Spring Buildmaster
928b5a7742 DATAMONGO-654 - Release 1.2.1.RELEASE. 2013-04-17 02:50:08 -07:00
Oliver Gierke
118a52a8d6 DATAMONGO-654 - Prepare 1.2.1.RELEASE.
Upgraded to Spring Data Build 1.0.3.RELEASE. Upgraded to Spring Data Commons 1.5.1.RELEASE. Updated changelog, notice and readme. Polished readme.md.
2013-04-17 11:28:46 +02:00
Oliver Gierke
b47e8ca3da DATAMONGO-656 - Fixed potential NPE in MongoTemplate. 2013-04-17 10:24:37 +02:00
Oliver Gierke
8527d6eb43 DATAMONGO-651 - MongoTemplate now throws Mongo-specific exception with WriteResult.
If the WriteResultChecking is set to EXCEPTION on a MongoTemplate, we now throw a Mongo-specific exception that captures both the WriteResult and MongoActionOperation for further evaluation.
2013-04-15 17:05:56 +02:00
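
A minimal sketch of enabling the strict checking; `mongoDbFactory` is an assumed, pre-configured MongoDbFactory:

```java
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.WriteResultChecking;

MongoTemplate template = new MongoTemplate(mongoDbFactory);

// With EXCEPTION checking, a failed write now surfaces as a Mongo-specific
// data access exception that carries the WriteResult and the
// MongoActionOperation for further evaluation.
template.setWriteResultChecking(WriteResultChecking.EXCEPTION);
```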
Oliver Gierke
d645c778c3 DATAMONGO-571 - Fixed setting null values during update of versioned entities.
In case of updating a versioned object, the Update object is now constructed from plain key-value pairs, not using $set anymore. This correctly sets the null values in the updated document.
2013-04-11 12:01:53 +02:00
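
To illustrate the effect (entity and property names are assumptions, not part of the commit):

```java
// For a versioned entity the update document is now built from plain
// key/value pairs instead of a $set expression, so an explicit null
// actually ends up as null in the stored document.
person.setMiddlename(null);
template.save(person);
```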
Oliver Gierke
5c47f1ae9e DATAMONGO-650 - Added snapshot repository to resolve Spring Data Commons. 2013-04-11 11:08:02 +02:00
Oliver Gierke
9f324bac19 DATAMONGO-648 - Replaced XSD ids with XSD strings.
This change allows using Spring Data MongoDB XML namespace elements alongside <bean /> elements inside a profile. That scenario can lead to e.g. two <mongo:db-factory /> declarations in the same XML file.
2013-04-10 20:34:29 +02:00
Oliver Gierke
186caba1ac DATAMONGO-642 - MongoChangeSetPersister now considers mapped collection.
So far the change set persister has used the plain domain type name to persist data. We now consider the collection name defined by the object mapping (through @Document(collection = "…")).
2013-04-02 11:42:50 +02:00
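
For example, with a mapping like the following (hypothetical class), cross-store change sets now end up in the mapped collection rather than one named after the plain type:

```java
import org.springframework.data.mongodb.core.mapping.Document;

// The collection name declared here ("people") is now honoured by the
// change set persister instead of the plain domain type name.
@Document(collection = "people")
public class Person {

    String firstname;
    String lastname;
}
```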
Oliver Gierke
2ebb7e801d DATAMONGO-641 - Fixed potential NullPointerException in MongoLog4jAppender.
Log4j appender now only closes Mongo instance if available.
2013-04-02 10:59:13 +02:00
Oliver Gierke
a932f3474e DATAMONGO-641 - Reformatting to prepare fix. 2013-04-02 10:59:05 +02:00
Oliver Gierke
e992456532 DATAMONGO-638 - MappingContext does not create PersistentEntities for AbstractMaps. 2013-03-28 16:29:55 +01:00
Oliver Gierke
7b34c5cac4 DATAMONGO-629 - Fixed QueryMapper not to massage queries without type information.
So far the QueryMapper applied the id massaging (especially interpreting the default id keys) even if there was no persistence metadata available to do so. This caused e.g. queries handed into MongoTemplate.count(Query, String) to get keys of "id" massaged into "_id" which shouldn't be the case as we cannot assume anything about the documents and the keys contained in them.

So we now only apply the defaults if there is at least persistence metadata present. This means that for methods on MongoOperations that don't take type information of any kind the queries have to be defined in terms of the document, not the object model as we cannot refer to it.
2013-03-27 12:04:06 +01:00
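
A small sketch of the resulting difference; `template`, the collection name and the id value are assumptions:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

// With type information the "id" key is still translated to "_id" …
long byType = template.count(query(where("id").is("4711")), Person.class);

// … but without it the query is taken as-is, so the document key must be used.
long byCollection = template.count(query(where("_id").is("4711")), "people");
```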
Andrey Bloschetsov
16baf00f5e DATAMONGO-637 - Fixed typo in Query.query(…).
Corrected the misspelled "critera". Polished the JavaDoc a little.
2013-03-27 11:28:22 +01:00
Oliver Gierke
61a2c56a27 DATAMONGO-632 - Removed schemaLocation attribute from namespace references. 2013-03-25 18:46:34 +01:00
Oliver Gierke
e62437b64a DATAMONGO-633 - Back down to bug fix versions of SD Parent and Core.
Avoid version bump to Querydsl 3.0.
2013-03-25 18:44:18 +01:00
Oliver Gierke
e9a7e887be DATAMONGO-635 - Fixed some Sonar warnings. 2013-03-25 18:13:21 +01:00
Oliver Gierke
b91a66f6f9 DATAMONGO-622 - Saving unversioned object now uses doInsert(…).
We now use doInsert(…) when a versioned object is saved for the first time to prevent accidental updates that would not bump the version number.
2013-02-26 11:39:36 +01:00
Oliver Gierke
a047e54e5a DATAMONGO-621 - Initializing version property now uses ConversionService.
The BeanWrapper used in MongoTemplate.initializeVersionProperty(…) now uses the ConversionService held in the MongoConverter.
2013-02-26 11:39:30 +01:00
Oliver Gierke
8197ff57c8 DATAMONGO-620 - Updating versioned object uses explicit collection name.
We're now handing the collection name given to doSaveVersioned(…) to the update method.
2013-02-26 11:39:22 +01:00
Oliver Gierke
a2136719e1 DATAMONGO-617 - Fixed potential NullPointerException in MongoTemplate.insert(…).
If MongoTemplate.insert(…) was called with a Mongo-simple type (such as a raw DBObject) it caused a NullPointerException during the lookup of a version property. This is now fixed by correcting the guard.
2013-02-20 12:07:32 +01:00
Oliver Gierke
7bcf142c8d DATAMONGO-613 - Re-added JConsole image and upgraded to Spring Data Build 1.1. 2013-02-11 12:41:12 +01:00
Oliver Gierke
d50d03a80e DATAMONGO-612 - Fixed reconfiguration of dist.id. 2013-02-11 12:39:15 +01:00
Spring Buildmaster
ba894a4511 DATAMONGO-609 - Prepare next development iteration. 2013-02-08 13:22:05 +01:00
126 changed files with 1708 additions and 2765 deletions

README.md

@@ -1,68 +1,96 @@
# Spring Data MongoDB
Spring Data MongoDB
======================
The primary goal of the [Spring Data](http://www.springsource.org/spring-data) project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
The Spring Data MongoDB project aims to provide a familiar and consistent Spring-based programming model for new datastores while retaining store-specific features and capabilities. The Spring Data MongoDB project provides integration with the MongoDB document database. Key functional areas of Spring Data MongoDB are a POJO centric model for interacting with a MongoDB DBCollection and easily writing a repository style data access layer.
The Spring Data MongoDB aims to provide a familiar and consistent Spring-based programming model for new datastores while retaining store-specific features and capabilities. The Spring Data MongoDB project provides integration with the MongoDB document database. Key functional areas of Spring Data MongoDB are a POJO centric model for interacting with a MongoDB DBCollection and easily writing a Repository style data access layer.
## Getting Help
Getting Help
------------
For a comprehensive treatment of all the Spring Data MongoDB features, please refer to:
For a comprehensive treatment of all the Spring Data MongoDB features, please refer to the [User Guide](http://static.springsource.org/spring-data/data-mongodb/docs/current/reference/html/)
* the [User Guide](http://static.springsource.org/spring-data/data-mongodb/docs/current/reference/html/)
* the [JavaDocs](http://static.springsource.org/spring-data/data-mongodb/docs/current/api/) have extensive comments in them as well.
* the home page of [Spring Data MongoDB](http://www.springsource.org/spring-data/mongodb) contains links to articles and other resources.
* for more detailed questions, use the [forum](http://forum.springsource.org/forumdisplay.php?f=80).
The [JavaDocs](http://static.springsource.org/spring-data/data-mongodb/docs/current/api/) have extensive comments in them as well.
The home page of [Spring Data MongoDB](http://www.springsource.org/spring-data/mongodb) contains links to articles and other resources.
For more detailed questions, use the [forum](http://forum.springsource.org/forumdisplay.php?f=80).
If you are new to Spring as well as to Spring Data, look for information about [Spring projects](http://www.springsource.org/projects).
## Quick Start
Quick Start
-----------
### Maven configuration
## MongoDB
Add the Maven dependency:
For those in a hurry:
* Download the jar through Maven:
```xml
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.2.1.RELEASE</version>
<version>1.2.3.RELEASE</version>
</dependency>
```
If you'd rather like the latest snapshots of the upcoming major version, use our Maven snapshot repository and declare the appropriate dependency version.
```xml
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.2.1.RELEASE</version>
</dependency>
<repository>
<id>spring-libs-snapshot</id>
<name>Spring Snapshot Repository</name>
<url>http://repo.springsource.org/libs-snapshot</url>
</repository>
```
### MongoTemplate
MongoTemplate is the central support class for Mongo database operations. It provides:
MongoTemplate is the central support class for Mongo database operations. It provides
* Basic POJO mapping support to and from BSON
* Convenience methods to interact with the store (insert object, update objects) and MongoDB specific ones (geo-spatial operations, upserts, map-reduce etc.)
* Connection affinity callback
* Connection Affinity callback
* Exception translation into Spring's [technology agnostic DAO exception hierarchy](http://static.springsource.org/spring/docs/3.0.x/spring-framework-reference/html/dao.html#dao-exceptions).
### Spring Data repositories
Future plans are to support optional logging and/or exception throwing based on WriteResult return value, common map-reduce operations, GridFS operations. A simple API for partial document updates is also planned.
To simplify the creation of data repositories Spring Data MongoDB provides a generic repository programming model. It will automatically create a repository proxy for you that adds implementations of finder methods you specify on an interface.
### Easy Data Repository generation
For example, given a `Person` class with first and last name properties, a `PersonRepository` interface that can query for `Person` by last name and when the first name matches a like expression is shown below:
To simplify the creation of data repositories a generic `Repository` interface and default implementation is provided. Furthermore, Spring will automatically create a Repository implementation for you that adds implementations of finder methods you specify on an interface.
The Repository interface is
```java
public interface PersonRepository extends CrudRepository<Person, Long> {
public interface Repository<T, ID extends Serializable> {
T save(T entity);
List<T> save(Iterable<? extends T> entities);
T findById(ID id);
boolean exists(ID id);
List<T> findAll();
Long count();
void delete(T entity);
void delete(Iterable<? extends T> entities);
void deleteAll();
}
```
The `MongoRepository` extends `Repository` and will in future add more Mongo specific methods.
```java
public interface MongoRepository<T, ID extends Serializable> extends Repository<T, ID> {
}
```
`SimpleMongoRepository` is the out-of-the-box implementation of `MongoRepository` that you can use for basic CRUD operations.
To go beyond basic CRUD, extend the `MongoRepository` interface and supply your own finder methods that follow simple naming conventions such that they can be easily converted into queries.
For example, given a `Person` class with first and last name properties, a `PersonRepository` interface that can query for `Person` by last name and when the first name matches a regular expression is shown below
```java
public interface PersonRepository extends MongoRepository<Person, Long> {
List<Person> findByLastname(String lastname);
@@ -70,55 +98,31 @@ public interface PersonRepository extends CrudRepository<Person, Long> {
}
```
The queries issued on execution will be derived from the method name. Extending `CrudRepository` causes CRUD methods to be pulled into the interface so that you can easily save and find single entities and collections of them.
You can have Spring automatically create a proxy for the interface by using the following JavaConfig:
```java
@Configuration
@EnableMongoRepositories
class ApplicationConfig extends AbstractMongoConfiguration {
@Override
public Mongo mongo() throws Exception {
return new Mongo();
}
@Override
protected String getDatabaseName() {
return "springdata";
}
}
```
This sets up a connection to a local MongoDB instance and enables the detection of Spring Data repositories (through `@EnableMongoRepositories`). The same configuration would look like this in XML:
You can have Spring automatically create a proxy for the interface as shown below:
```xml
<bean id="template" class="org.springframework.data.document.mongodb.MongoTemplate">
<constructor-arg>
<bean class="com.mongodb.Mongo">
<constructor-arg value="localhost" />
<constructor-arg value="27017" />
</bean>
</constructor-arg>
<constructor-arg value="database" />
<constructor-arg>
<bean class="com.mongodb.Mongo">
<constructor-arg value="localhost" />
<constructor-arg value="27017" />
</bean>
</constructor-arg>
<constructor-arg value="database" />
<property name="defaultCollectionName" value="springdata" />
</bean>
<mongo:repositories base-package="com.acme.repository" />
```
This will find the repository interface and register a proxy object in the container. You can use it as shown below:
```java
@Service
public class MyService {
private final PersonRepository repository;
@Autowired
public MyService(PersonRepository repository) {
this.repository = repository;
}
private final PersonRepository repository;
public void doWork() {
@@ -130,12 +134,16 @@ public class MyService {
person = repository.save(person);
List<Person> lastNameResults = repository.findByLastname("Gierke");
List<Person> firstNameResults = repository.findByFirstnameLike("Oli*");
}
}
```
## Contributing to Spring Data
Contributing to Spring Data
---------------------------
Here are some ways for you to get involved in the community:


@@ -73,7 +73,7 @@
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_enum_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_angle_bracket_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.clear_blank_lines_in_block_comment" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_type_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.lineSplit" value="120"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_if" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_brackets_in_array_type_reference" value="do not insert"/>
@@ -102,7 +102,7 @@
<setting id="org.eclipse.jdt.core.formatter.alignment_for_parameters_in_constructor_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_labeled_statement" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_annotation_type_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_method_body" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_method_body" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_method_declaration" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_method_invocation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_try" value="do not insert"/>
@@ -124,7 +124,7 @@
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_parenthesized_expression_in_throw" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.compiler.problem.enumIdentifier" value="error"/>
<setting id="org.eclipse.jdt.core.formatter.indent_switchstatements_compare_to_switch" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.indent_switchstatements_compare_to_switch" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_ellipsis" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_block" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_for_inits" value="do not insert"/>
@@ -135,9 +135,9 @@
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_for_increments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.format_line_comment_starting_on_first_column" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_bracket_in_array_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_field" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_enum_constant" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_field" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.indent_root_tags" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_enum_constant" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_enum_declarations" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_union_type_in_multicatch" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_explicitconstructorcall_arguments" value="insert"/>
@@ -150,12 +150,12 @@
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_opening_brace_in_array_initializer" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_brace_in_block" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_bracket_in_array_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_constant" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_angle_bracket_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_constructor_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_constructor_declaration_throws" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_if" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.clear_blank_lines_in_javadoc_comment" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.comment.clear_blank_lines_in_javadoc_comment" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_throws_clause_in_constructor_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_assignment_operator" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_assignment_operator" value="insert"/>
@@ -182,11 +182,11 @@
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_angle_bracket_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_anonymous_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.keep_empty_array_initializer_on_one_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.keep_imple_if_on_one_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_constructor_declaration_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_angle_bracket_in_type_parameters" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_at_end_of_file_if_missing" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_at_end_of_file_if_missing" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_for" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_labeled_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_parameterized_type_reference" value="do not insert"/>
@@ -237,7 +237,7 @@
<setting id="org.eclipse.jdt.core.formatter.alignment_for_throws_clause_in_method_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_angle_bracket_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_bracket_in_array_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_anonymous_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_anonymous_type_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_conditional" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_angle_bracket_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_for" value="do not insert"/>
@@ -251,12 +251,12 @@
<setting id="org.eclipse.jdt.core.compiler.codegen.targetPlatform" value="1.7"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_resources_in_try" value="80"/>
<setting id="org.eclipse.jdt.core.formatter.use_tabs_only_for_leading_indentations" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_annotation" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_annotation" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_header" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_block_comments" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_enum_constants" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_block" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_block" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_annotation_declaration_header" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_parenthesized_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_parenthesized_expression" value="do not insert"/>

pom.xml

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.3.0.M1</version>
<version>1.2.5.BUILD-SNAPSHOT</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>1.1.0.RELEASE</version>
<version>1.0.5.RELEASE</version>
<relativePath>../spring-data-build/parent/pom.xml</relativePath>
</parent>
@@ -29,7 +29,7 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.6.0.M1</springdata.commons>
<springdata.commons>1.5.3.RELEASE</springdata.commons>
<mongo>2.10.1</mongo>
</properties>
@@ -88,11 +88,11 @@
<version>${mongo}</version>
</dependency>
</dependencies>
<repositories>
<repository>
<id>spring-lib-milestone</id>
<url>http://repo.springsource.org/libs-milestone-local</url>
<id>spring-libs-release</id>
<url>http://repo.springsource.org/libs-release</url>
</repository>
</repositories>


@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.3.0.M1</version>
<version>1.2.5.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -52,7 +52,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.3.0.M1</version>
<version>1.2.5.BUILD-SNAPSHOT</version>
</dependency>
<dependency>


@@ -0,0 +1,5 @@
/**
* Infrastructure for Spring Data's MongoDB cross store support.
*/
package org.springframework.data.mongodb.crossstore;


@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.3.0.M1</version>
<version>1.2.5.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -32,7 +32,7 @@ An example log entry might look like:
{
"_id" : ObjectId("4d89341a8ef397e06940d5cd"),
"applicationId" : "my.application",
"name" : "org.springframework.data.mongodb.log4j.AppenderTest",
"name" : "org.springframework.data.mongodb.log4j.MongoLog4jAppenderIntegrationTests",
"level" : "DEBUG",
"timestamp" : ISODate("2011-03-23T16:53:46.778Z"),
"properties" : {


@@ -5,7 +5,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.3.0.M1</version>
<version>1.2.5.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -0,0 +1,5 @@
/**
* Infrastructure for using MongoDB as a logging sink.
*/
package org.springframework.data.mongodb.log4j;


@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,63 +13,65 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.log4j;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.net.UnknownHostException;
import java.util.Calendar;
import com.mongodb.DB;
import com.mongodb.DBCursor;
import com.mongodb.Mongo;
import org.apache.log4j.Logger;
import org.apache.log4j.MDC;
import org.junit.Before;
import org.junit.Test;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
*/
public class AppenderTest {
import com.mongodb.DB;
import com.mongodb.DBCursor;
import com.mongodb.Mongo;
private static final String NAME = AppenderTest.class.getName();
private Logger log = Logger.getLogger(NAME);
private Mongo mongo;
private DB db;
private String collection;
/**
* Integration tests for {@link MongoLog4jAppender}.
*
* @author Jon Brisbin
* @author Oliver Gierke
*/
public class MongoLog4jAppenderIntegrationTests {
static final String NAME = MongoLog4jAppenderIntegrationTests.class.getName();
Logger log = Logger.getLogger(NAME);
Mongo mongo;
DB db;
String collection;
@Before
public void setup() {
try {
mongo = new Mongo("localhost", 27017);
db = mongo.getDB("logs");
Calendar now = Calendar.getInstance();
collection = String.valueOf(now.get(Calendar.YEAR)) + String.format("%1$02d", now.get(Calendar.MONTH) + 1);
db.getCollection(collection).drop();
} catch (UnknownHostException e) {
throw new RuntimeException(e.getMessage(), e);
}
public void setUp() throws Exception {
mongo = new Mongo("localhost", 27017);
db = mongo.getDB("logs");
Calendar now = Calendar.getInstance();
collection = String.valueOf(now.get(Calendar.YEAR)) + String.format("%1$02d", now.get(Calendar.MONTH) + 1);
db.getCollection(collection).drop();
}
@Test
public void testLogging() {
log.debug("DEBUG message");
log.info("INFO message");
log.warn("WARN message");
log.error("ERROR message");
DBCursor msgs = db.getCollection(collection).find();
assertThat(msgs.count(), is(4));
}
@Test
public void testProperties() {
MDC.put("property", "one");
log.debug("DEBUG message");
}
}


@@ -10,11 +10,4 @@ log4j.appender.stdout.collectionPattern = %X{year}%X{month}
log4j.appender.stdout.applicationId = my.application
log4j.appender.stdout.warnOrHigherWriteConcern = FSYNC_SAFE
log4j.category.org.apache.activemq=ERROR
log4j.category.org.springframework.batch=DEBUG
log4j.category.org.springframework.data.document.mongodb=DEBUG
log4j.category.org.springframework.transaction=INFO
log4j.category.org.hibernate.SQL=DEBUG
# for debugging datasource initialization
# log4j.category.test.jdbc=DEBUG
log4j.category.org.springframework.data.mongodb=DEBUG


@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<context version="7.1.7.187">
<context version="7.0.3.1152">
<scope name="spring-data-mongodb" type="Project">
<element name="Filter" type="TypeFilterReferenceOverridden">
<element name="org.springframework.data.mongodb.**" type="IncludeTypePattern"/>
@@ -10,7 +10,6 @@
<element name="**.config.**" type="WeakTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|GridFS"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Monitoring"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories"/>
</element>
@@ -94,12 +93,6 @@
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query"/>
</element>
</element>
<element name="API" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="org.springframework.data.mongodb.*" type="IncludeTypePattern"/>
</element>
<stereotype name="Public"/>
</element>
</architecture>
<workspace>
<element name="src/main/java" type="JavaRootDirectory">


@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.3.0.M1</version>
<version>1.2.5.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -141,8 +141,8 @@
<plugin>
<groupId>com.mysema.maven</groupId>
<artifactId>apt-maven-plugin</artifactId>
<version>1.0.8</version>
<artifactId>maven-apt-plugin</artifactId>
<version>1.0.4</version>
<dependencies>
<dependency>
<groupId>com.mysema.querydsl</groupId>


@@ -13,9 +13,10 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
package org.springframework.data.mongodb;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.mongodb.core.MongoActionOperation;
import org.springframework.util.Assert;
import com.mongodb.WriteResult;


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -32,7 +32,6 @@ import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.support.CachingIsNewStrategyFactory;
@@ -59,12 +58,12 @@ public abstract class AbstractMongoConfiguration {
protected abstract String getDatabaseName();
/**
* Return the {@link Mongo} instance to connect to. Annotate with {@link Bean} in case you want to expose a
* {@link Mongo} instance to the {@link org.springframework.context.ApplicationContext}.
* Return the {@link Mongo} instance to connect to.
*
* @return
* @throws Exception
*/
@Bean
public abstract Mongo mongo() throws Exception;
/**
@@ -136,10 +135,6 @@ public abstract class AbstractMongoConfiguration {
mappingContext.setInitialEntitySet(getInitialEntitySet());
mappingContext.setSimpleTypeHolder(customConversions().getSimpleTypeHolder());
if (abbreviateFieldNames()) {
mappingContext.setFieldNamingStrategy(new CamelCaseAbbreviatingFieldNamingStrategy());
}
return mappingContext;
}
@@ -209,15 +204,4 @@ public abstract class AbstractMongoConfiguration {
return initialEntitySet;
}
/**
* Configures whether to abbreviate field names for domain objects by configuring a
* {@link CamelCaseAbbreviatingFieldNamingStrategy} on the {@link MongoMappingContext} instance created. For advanced
* customization needs, consider overriding {@link #mappingMongoConverter()}.
*
* @return
*/
protected boolean abbreviateFieldNames() {
return false;
}
}


@@ -1,11 +1,11 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,14 +13,11 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
/**
* Constants to declare bean names used by the namespace configuration.
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Martin Baumgartner
* @author Jon Brisbin <jbrisbin@vmware.com>
*/
public abstract class BeanNames {
@@ -31,6 +28,4 @@ public abstract class BeanNames {
static final String VALIDATING_EVENT_LISTENER = "validatingMongoEventListener";
static final String IS_NEW_STRATEGY_FACTORY = "isNewStrategyFactory";
static final String DEFAULT_CONVERTER_BEAN_NAME = "mappingConverter";
static final String MONGO_TEMPLATE = "mongoTemplate";
static final String GRID_FS_TEMPLATE = "gridFsTemplate";
}


@@ -1,78 +0,0 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.mongodb.gridfs.GridFsTemplate;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* {@link BeanDefinitionParser} to parse {@code gridFsTemplate} elements into {@link BeanDefinition}s.
*
* @author Martin Baumgartner
*/
class GridFsTemplateParser extends AbstractBeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
String id = super.resolveId(element, definition, parserContext);
return StringUtils.hasText(id) ? id : BeanNames.GRID_FS_TEMPLATE;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
String converterRef = element.getAttribute("converter-ref");
String dbFactoryRef = element.getAttribute("db-factory-ref");
BeanDefinitionBuilder gridFsTemplateBuilder = BeanDefinitionBuilder.genericBeanDefinition(GridFsTemplate.class);
if (StringUtils.hasText(dbFactoryRef)) {
gridFsTemplateBuilder.addConstructorArgReference(dbFactoryRef);
} else {
gridFsTemplateBuilder.addConstructorArgReference(BeanNames.DB_FACTORY);
}
if (StringUtils.hasText(converterRef)) {
gridFsTemplateBuilder.addConstructorArgReference(converterRef);
} else {
gridFsTemplateBuilder.addConstructorArgReference(BeanNames.DEFAULT_CONVERTER_BEAN_NAME);
}
return (AbstractBeanDefinition) helper.getComponentIdButFallback(gridFsTemplateBuilder, BeanNames.GRID_FS_TEMPLATE)
.getBeanDefinition();
}
}


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -54,7 +54,6 @@ import org.springframework.data.mapping.context.MappingContextIsNewStrategyFacto
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.event.ValidatingMongoEventListener;
@@ -202,12 +201,6 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
mappingContextBuilder.addPropertyValue("simpleTypeHolder", simpleTypesDefinition);
}
String abbreviateFieldNames = element.getAttribute("abbreviate-field-names");
if ("true".equals(abbreviateFieldNames)) {
mappingContextBuilder.addPropertyValue("fieldNamingStrategy", new RootBeanDefinition(
CamelCaseAbbreviatingFieldNamingStrategy.class));
}
ctxRef = converterId + "." + MAPPING_CONTEXT;
parserContext.registerBeanComponent(componentDefinitionBuilder.getComponent(mappingContextBuilder, ctxRef));


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,7 +24,6 @@ import org.springframework.data.repository.config.RepositoryConfigurationExtensi
* {@link org.springframework.beans.factory.xml.NamespaceHandler} for Mongo DB configuration.
*
* @author Oliver Gierke
* @author Martin Baumgartner
*/
public class MongoNamespaceHandler extends NamespaceHandlerSupport {
@@ -43,7 +42,5 @@ public class MongoNamespaceHandler extends NamespaceHandlerSupport {
registerBeanDefinitionParser("db-factory", new MongoDbFactoryParser());
registerBeanDefinitionParser("jmx", new MongoJmxParser());
registerBeanDefinitionParser("auditing", new MongoAuditingBeanDefinitionParser());
registerBeanDefinitionParser("template", new MongoTemplateParser());
registerBeanDefinitionParser("gridFsTemplate", new GridFsTemplateParser());
}
}


@@ -1,86 +0,0 @@
/*
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import static org.springframework.data.config.ParsingUtils.*;
import static org.springframework.data.mongodb.config.MongoParsingUtils.*;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* {@link BeanDefinitionParser} to parse {@code template} elements into {@link BeanDefinition}s.
*
* @author Martin Baumgartner
*/
class MongoTemplateParser extends AbstractBeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
String id = super.resolveId(element, definition, parserContext);
return StringUtils.hasText(id) ? id : BeanNames.MONGO_TEMPLATE;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
String converterRef = element.getAttribute("converter-ref");
String dbFactoryRef = element.getAttribute("db-factory-ref");
BeanDefinitionBuilder mongoTemplateBuilder = BeanDefinitionBuilder.genericBeanDefinition(MongoTemplate.class);
setPropertyValue(mongoTemplateBuilder, element, "write-concern", "writeConcern");
if (StringUtils.hasText(dbFactoryRef)) {
mongoTemplateBuilder.addConstructorArgReference(dbFactoryRef);
} else {
mongoTemplateBuilder.addConstructorArgReference(BeanNames.DB_FACTORY);
}
if (StringUtils.hasText(converterRef)) {
mongoTemplateBuilder.addConstructorArgReference(converterRef);
}
BeanDefinitionBuilder writeConcernPropertyEditorBuilder = getWriteConcernPropertyEditorBuilder();
BeanComponentDefinition component = helper.getComponent(writeConcernPropertyEditorBuilder);
parserContext.registerBeanComponent(component);
return (AbstractBeanDefinition) helper.getComponentIdButFallback(mongoTemplateBuilder, BeanNames.MONGO_TEMPLATE)
.getBeanDefinition();
}
}


@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -31,6 +31,7 @@ import com.mongodb.ServerAddress;
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class ServerAddressPropertyEditor extends PropertyEditorSupport {
@@ -43,6 +44,11 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
@Override
public void setAsText(String replicaSetString) {
if (!StringUtils.hasText(replicaSetString)) {
setValue(null);
return;
}
String[] replicaSetStringArray = StringUtils.commaDelimitedListToStringArray(replicaSetString);
Set<ServerAddress> serverAddresses = new HashSet<ServerAddress>(replicaSetStringArray.length);


@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,14 +26,13 @@ import com.mongodb.DB;
import com.mongodb.Mongo;
/**
* Helper class featuring helper methods for internal MongoDb classes.
* <p/>
* <p>
* Mainly intended for internal use within the framework.
* Helper class featuring helper methods for internal MongoDb classes. Mainly intended for internal use within the
* framework.
*
* @author Thomas Risberg
* @author Graeme Rocher
* @author Oliver Gierke
* @author Randy Watler
* @since 1.0
*/
public abstract class MongoDbUtils {
@@ -131,8 +130,11 @@ public abstract class MongoDbUtils {
holderToUse.addDB(databaseName, db);
}
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(holderToUse, mongo));
holderToUse.setSynchronizedWithTransaction(true);
// synchronize holder only if not yet synchronized
if (!holderToUse.isSynchronizedWithTransaction()) {
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(holderToUse, mongo));
holderToUse.setSynchronizedWithTransaction(true);
}
if (holderToUse != dbHolder) {
TransactionSynchronizationManager.bindResource(mongo, holderToUse);


@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,7 +15,9 @@
*/
package org.springframework.data.mongodb.core;
import java.util.Arrays;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
import org.springframework.beans.factory.DisposableBean;
@@ -24,6 +26,7 @@ import org.springframework.beans.factory.InitializingBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoOptions;
@@ -36,6 +39,7 @@ import com.mongodb.WriteConcern;
* @author Thomas Risberg
* @author Graeme Rocher
* @author Oliver Gierke
* @author Thomas Darimont
* @since 1.0
*/
public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, DisposableBean,
@@ -57,11 +61,38 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
}
public void setReplicaSetSeeds(ServerAddress[] replicaSetSeeds) {
this.replicaSetSeeds = Arrays.asList(replicaSetSeeds);
this.replicaSetSeeds = filterNonNullElementsAsList(replicaSetSeeds);
}
/**
* @deprecated use {@link #setReplicaSetSeeds(ServerAddress[])} instead
*
* @param replicaPair
*/
@Deprecated
public void setReplicaPair(ServerAddress[] replicaPair) {
this.replicaPair = Arrays.asList(replicaPair);
this.replicaPair = filterNonNullElementsAsList(replicaPair);
}
/**
* @param elements the elements to filter <T>
* @return a new unmodifiable {@link List#} from the given elements without nulls
*/
private <T> List<T> filterNonNullElementsAsList(T[] elements) {
if (elements == null) {
return Collections.emptyList();
}
List<T> candidateElements = new ArrayList<T>();
for (T element : elements) {
if (element != null) {
candidateElements.add(element);
}
}
return Collections.unmodifiableList(candidateElements);
}
public void setHost(String host) {
@@ -117,7 +148,6 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
* (non-Javadoc)
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
*/
@SuppressWarnings("deprecation")
public void afterPropertiesSet() throws Exception {
Mongo mongo;
@@ -127,15 +157,15 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
mongoOptions = new MongoOptions();
}
if (replicaPair != null) {
if (!isNullOrEmpty(replicaPair)) {
if (replicaPair.size() < 2) {
throw new CannotGetMongoDbConnectionException("A replica pair must have two server entries");
}
mongo = new Mongo(replicaPair.get(0), replicaPair.get(1), mongoOptions);
} else if (replicaSetSeeds != null) {
} else if (!isNullOrEmpty(replicaSetSeeds)) {
mongo = new Mongo(replicaSetSeeds, mongoOptions);
} else {
String mongoHost = host != null ? host : defaultOptions.getHost();
String mongoHost = StringUtils.hasText(host) ? host : defaultOptions.getHost();
mongo = port != null ? new Mongo(new ServerAddress(mongoHost, port), mongoOptions) : new Mongo(mongoHost,
mongoOptions);
}
@@ -147,6 +177,10 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
this.mongo = mongo;
}
private boolean isNullOrEmpty(Collection<?> elements) {
return elements == null || elements.isEmpty();
}
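A hedged configuration sketch of the factory bean behaviour above: null seed entries are filtered out and a blank host falls back to the driver default. The seed addresses are examples, and afterPropertiesSet() declares throws Exception, so call it from a context that handles that.

MongoFactoryBean factory = new MongoFactoryBean();
factory.setReplicaSetSeeds(new ServerAddress[] {
    new ServerAddress("localhost", 27017), null }); // the null entry is dropped by filterNonNullElementsAsList(...)
factory.afterPropertiesSet();                       // builds the Mongo instance from the non-empty seed list
Mongo mongo = factory.getObject();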
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.DisposableBean#destroy()

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -247,7 +247,7 @@ public interface MongoOperations {
* Query for a list of objects of type T from the collection used by the entity class.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
@@ -261,7 +261,7 @@ public interface MongoOperations {
* Query for a list of objects of type T from the specified collection.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
@@ -382,7 +382,7 @@ public interface MongoOperations {
* specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
@@ -399,7 +399,7 @@ public interface MongoOperations {
* type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
@@ -408,21 +408,16 @@ public interface MongoOperations {
* specification
* @param entityClass the parameterized type of the returned list.
* @param collectionName name of the collection to retrieve the objects from
*
* @return the converted object
*/
<T> T findOne(Query query, Class<T> entityClass, String collectionName);
boolean exists(Query query, String collectionName);
boolean exists(Query query, Class<?> entityClass);
boolean exists(Query query, Class<?> entityClass, String collectionName);
/**
* Map the results of an ad-hoc query on the collection for the entity class to a List of the specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
@@ -438,7 +433,7 @@ public interface MongoOperations {
* Map the results of an ad-hoc query on the specified collection to a List of the specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
@@ -447,6 +442,7 @@ public interface MongoOperations {
* specification
* @param entityClass the parameterized type of the returned list.
* @param collectionName name of the collection to retrieve the objects from
*
* @return the List of converted objects
*/
<T> List<T> find(Query query, Class<T> entityClass, String collectionName);
@@ -468,6 +464,7 @@ public interface MongoOperations {
* @param id the id of the document to return
* @param entityClass the type to convert the document to
* @param collectionName the collection to query for the document
*
* @param <T>
* @return
*/
@@ -504,7 +501,7 @@ public interface MongoOperations {
* type. The first document that matches the query is returned and also removed from the collection in the database.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
@@ -513,6 +510,7 @@ public interface MongoOperations {
* specification
* @param entityClass the parameterized type of the returned list.
* @param collectionName name of the collection to retrieve the objects from
*
* @return the converted object
*/
<T> T findAndRemove(Query query, Class<T> entityClass, String collectionName);
@@ -557,7 +555,7 @@ public interface MongoOperations {
* Insert the object into the specified collection.
* <p/>
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* Insert is used to initially store the object into the database. To update an existing object use the save method.
*
@@ -595,7 +593,7 @@ public interface MongoOperations {
* object is not already present, that is an 'upsert'.
* <p/>
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* If your object has an 'Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
@@ -612,7 +610,7 @@ public interface MongoOperations {
* is an 'upsert'.
* <p/>
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* If your object has an 'Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
@@ -715,12 +713,11 @@ public interface MongoOperations {
* Remove all documents that match the provided query document criteria from the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
*
* @param <T>
* @param query
* @param entityClass
*/
void remove(Query query, Class<?> entityClass);
void remove(Query query, Class<?> entityClass, String collectionName);
<T> void remove(Query query, Class<T> entityClass);
/**
* Remove all documents from the specified collection that match the provided query document criteria. There is no

View File

@@ -52,6 +52,7 @@ import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.BeanWrapper;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.MongoDataIntegrityViolationException;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
@@ -68,11 +69,9 @@ import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.data.mongodb.core.mapping.event.AfterConvertEvent;
import org.springframework.data.mongodb.core.mapping.event.AfterDeleteEvent;
import org.springframework.data.mongodb.core.mapping.event.AfterLoadEvent;
import org.springframework.data.mongodb.core.mapping.event.AfterSaveEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeDeleteEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeSaveEvent;
import org.springframework.data.mongodb.core.mapping.event.MongoMappingEvent;
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
@@ -481,24 +480,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
}
public boolean exists(Query query, Class<?> entityClass) {
return exists(query, entityClass, determineCollectionName(entityClass));
}
public boolean exists(Query query, String collectionName) {
return exists(query, null, collectionName);
}
public boolean exists(Query query, Class<?> entityClass, String collectionName) {
if (query == null) {
throw new InvalidDataAccessApiUsageException("Query passed in to exist can't be null");
}
DBObject mappedQuery = mapper.getMappedObject(query.getQueryObject(), getPersistentEntity(entityClass));
return execute(collectionName, new FindCallback(mappedQuery)).hasNext();
}
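A brief usage sketch of the exists(…) operations in the hunk above; template stands for a configured MongoOperations instance, Person, the "person" collection and the criteria are examples, and query(...)/where(...) are statically imported from Query and Criteria.

boolean hasDave = template.exists(query(where("firstName").is("Dave")), Person.class);
boolean hasDaveInCollection = template.exists(query(where("firstName").is("Dave")), "person");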
// Find methods that take a Query to express the query and that return a List of objects.
public <T> List<T> find(Query query, Class<T> entityClass) {
@@ -553,8 +534,26 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
mongoConverter, entityClass), near.getMetric());
List<GeoResult<T>> result = new ArrayList<GeoResult<T>>(results.size());
int index = 0;
int elementsToSkip = near.getSkip() != null ? near.getSkip() : 0;
for (Object element : results) {
result.add(callback.doWith((DBObject) element));
/*
* As MongoDB currently (2.4.4) doesn't support skipping elements in near queries,
* we skip the elements ourselves to at least avoid the document-to-object mapping overhead.
*
* @see https://jira.mongodb.org/browse/SERVER-3925
*/
if (index >= elementsToSkip) {
result.add(callback.doWith((DBObject) element));
}
index++;
}
if (elementsToSkip > 0) {
// as we skipped some elements we have to calculate the averageDistance ourselves:
return new GeoResults<T>(result, near.getMetric());
}
DBObject stats = (DBObject) commandResult.get("stats");
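An illustrative caller's view of the client-side skipping above, assuming a Venue domain type and a configured template: the skip offset is derived from the Pageable (or the wrapped query), and the skipped documents are simply never mapped.

Point location = new Point(-73.99, 40.73);
NearQuery nearQuery = NearQuery.near(location)
    .maxDistance(new Distance(10, Metrics.KILOMETERS))
    .with(new PageRequest(1, 20)); // page 1 with size 20 => the first 20 results are skipped in memory
GeoResults<Venue> results = template.geoNear(nearQuery, Venue.class);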
@@ -1033,34 +1032,23 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
}
public void remove(Query query, String collectionName) {
remove(query, null, collectionName);
}
public void remove(Query query, Class<?> entityClass) {
remove(query, entityClass, determineCollectionName(entityClass));
}
public void remove(Query query, Class<?> entityClass, String collectionName) {
doRemove(collectionName, query, entityClass);
public <T> void remove(Query query, Class<T> entityClass) {
Assert.notNull(query);
doRemove(determineCollectionName(entityClass), query, entityClass);
}
protected <T> void doRemove(final String collectionName, final Query query, final Class<T> entityClass) {
if (query == null) {
throw new InvalidDataAccessApiUsageException("Query passed in to remove can't be null!");
throw new InvalidDataAccessApiUsageException("Query passed in to remove can't be null");
}
Assert.hasText(collectionName, "Collection name must not be null or empty!");
final DBObject queryObject = query.getQueryObject();
final MongoPersistentEntity<?> entity = getPersistentEntity(entityClass);
execute(collectionName, new CollectionCallback<Void>() {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
maybeEmitEvent(new BeforeDeleteEvent<T>(queryObject, entityClass));
DBObject dboq = mapper.getMappedObject(queryObject, entity);
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.REMOVE, collectionName,
@@ -1074,14 +1062,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
WriteResult wr = writeConcernToUse == null ? collection.remove(dboq) : collection.remove(dboq,
writeConcernToUse);
handleAnyWriteResultErrors(wr, dboq, MongoActionOperation.REMOVE);
maybeEmitEvent(new AfterDeleteEvent<T>(queryObject, entityClass));
return null;
}
});
}
public void remove(final Query query, String collectionName) {
doRemove(collectionName, query, null);
}
public <T> List<T> findAll(Class<T> entityClass) {
return executeFindMultiInternal(new FindCallback(null), null, new ReadDbObjectCallback<T>(mongoConverter,
entityClass), determineCollectionName(entityClass));
@@ -1335,8 +1324,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Map the results of an ad-hoc query on the default MongoDB collection to a List of the specified type. The object is
* converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless configured
* otherwise, an instance of SimpleMongoConverter will be used. The query document is specified as a standard DBObject
* and so is the fields specification. Can be overridden by subclasses.
* otherwise, an instance of MappingMongoConverter will be used. The query document is specified as a standard
* DBObject and so is the fields specification. Can be overridden by subclasses.
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query document that specifies the criteria used to find a record
@@ -1437,19 +1426,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
DBObject updateObj = update.getUpdateObject();
for (String key : updateObj.keySet()) {
updateObj.put(key, mongoConverter.convertToMongoType(updateObj.get(key)));
}
DBObject mappedUpdate = mapper.getMappedObject(update.getUpdateObject(), entity);
DBObject mappedQuery = mapper.getMappedObject(query, entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("findAndModify using query: " + mappedQuery + " fields: " + fields + " sort: " + sort
+ " for class: " + entityClass + " and update: " + updateObj + " in collection: " + collectionName);
+ " for class: " + entityClass + " and update: " + mappedUpdate + " in collection: " + collectionName);
}
return executeFindOneInternal(new FindAndModifyCallback(mappedQuery, fields, sort, updateObj, options),
return executeFindOneInternal(new FindAndModifyCallback(mappedQuery, fields, sort, mappedUpdate, options),
new ReadDbObjectCallback<T>(readerToUse, entityClass), collectionName);
}
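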

View File

@@ -71,7 +71,6 @@ import com.mongodb.DBRef;
*
* @author Oliver Gierke
* @author Jon Brisbin
* @author Patrik Wasik
*/
public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware {
@@ -412,7 +411,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
if (valueType.isMap()) {
DBObject mapDbObj = createMap((Map<Object, Object>) obj, prop);
BasicDBObject mapDbObj = new BasicDBObject();
writeMapInternal((Map<Object, Object>) obj, mapDbObj, type);
dbo.put(name, mapDbObj);
return;
}
@@ -492,42 +492,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return dbList;
}
/**
* Writes the given {@link Map} using the given {@link MongoPersistentProperty} information.
*
* @param map must not be {@literal null}.
* @param property must not be {@literal null}.
* @return
*/
protected DBObject createMap(Map<Object, Object> map, MongoPersistentProperty property) {
Assert.notNull(map, "Given map must not be null!");
Assert.notNull(property, "PersistentProperty must not be null!");
if (!property.isDbReference()) {
return writeMapInternal(map, new BasicDBObject(), property.getTypeInformation());
}
BasicDBObject dbObject = new BasicDBObject();
for (Map.Entry<Object, Object> entry : map.entrySet()) {
Object key = entry.getKey();
Object value = entry.getValue();
if (conversions.isSimpleType(key.getClass())) {
String simpleKey = potentiallyEscapeMapKey(key.toString());
dbObject.put(simpleKey, value != null ? createDBRef(value, property.getDBRef()) : null);
} else {
throw new MappingException("Cannot use a complex object as a key value.");
}
}
return dbObject;
}
/**
* Populates the given {@link BasicDBList} with values from the given {@link Collection}.
*

View File

@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Set;
import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionException;
@@ -84,8 +85,22 @@ public class QueryMapper {
for (String key : query.keySet()) {
if (Keyword.isKeyword(key)) {
result.putAll(getMappedKeyword(new Keyword(query, key), entity));
continue;
}
Field field = entity == null ? new Field(key) : new MetadataBackedField(key, entity, mappingContext);
result.put(field.getMappedKey(), getMappedValue(query.get(key), field));
Object rawValue = query.get(key);
String newKey = field.getMappedKey();
if (Keyword.isKeyword(rawValue) && !field.isIdField()) {
Keyword keyword = new Keyword((DBObject) rawValue);
result.put(newKey, getMappedKeyword(field, keyword));
} else {
result.put(newKey, getMappedValue(field, query.get(key)));
}
}
return result;
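To make the branching above concrete, a hypothetical before/after of getMappedObject(...); queryMapper and personEntity are assumed local variables, and the id conversion only applies where the id property's type allows it.

DBObject query = new BasicDBObject("id", new BasicDBObject("$ne", "5a2b1c3d4e5f60718293a4b5"));
DBObject mapped = queryMapper.getMappedObject(query, personEntity);
// roughly: { "_id" : { "$ne" : ObjectId("5a2b1c3d4e5f60718293a4b5") } } -- the key is mapped to _id
// and the nested $ne keyword is converted instead of being treated as a plain value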
@@ -101,13 +116,14 @@ public class QueryMapper {
private DBObject getMappedKeyword(Keyword query, MongoPersistentEntity<?> entity) {
// $or/$nor
if (query.key.matches(N_OR_PATTERN)) {
if (query.key.matches(N_OR_PATTERN) || query.value instanceof Iterable) {
Iterable<?> conditions = (Iterable<?>) query.value;
BasicDBList newConditions = new BasicDBList();
for (Object condition : conditions) {
newConditions.add(getMappedObject((DBObject) condition, entity));
newConditions.add(condition instanceof DBObject ? getMappedObject((DBObject) condition, entity)
: convertSimpleOrDBObject(condition, entity));
}
return new BasicDBObject(query.key, newConditions);
@@ -119,34 +135,34 @@ public class QueryMapper {
/**
* Returns the mapped keyword considered defining a criteria for the given property.
*
* @param keyword
* @param property
* @param keyword
* @return
*/
public DBObject getMappedKeyword(Keyword keyword, Field property) {
private DBObject getMappedKeyword(Field property, Keyword keyword) {
if (property.isAssociation()) {
convertAssociation(keyword.value, property.getProperty());
}
boolean needsAssociationConversion = property.isAssociation() && !keyword.isExists();
Object value = needsAssociationConversion ? convertAssociation(keyword.value, property.getProperty())
: getMappedValue(property.with(keyword.key), keyword.value);
return new BasicDBObject(keyword.key, getMappedValue(keyword.value, property.with(keyword.key)));
return new BasicDBObject(keyword.key, value);
}
/**
* Returns the mapped value for the given source object assuming it's a value for the given
* {@link MongoPersistentProperty}.
*
* @param source the source object to be mapped
* @param value the source object to be mapped
* @param property the property the value is a value for
* @param newKey the key the value will be bound to eventually
* @return
*/
private Object getMappedValue(Object source, Field key) {
private Object getMappedValue(Field documentField, Object value) {
if (key.isIdField()) {
if (documentField.isIdField()) {
if (source instanceof DBObject) {
DBObject valueDbo = (DBObject) source;
if (value instanceof DBObject) {
DBObject valueDbo = (DBObject) value;
if (valueDbo.containsField("$in") || valueDbo.containsField("$nin")) {
String inKey = valueDbo.containsField("$in") ? "$in" : "$nin";
List<Object> ids = new ArrayList<Object>();
@@ -157,22 +173,25 @@ public class QueryMapper {
} else if (valueDbo.containsField("$ne")) {
valueDbo.put("$ne", convertId(valueDbo.get("$ne")));
} else {
return getMappedObject((DBObject) source, null);
return getMappedObject((DBObject) value, null);
}
return valueDbo;
} else {
return convertId(source);
return convertId(value);
}
}
if (key.isAssociation()) {
return Keyword.isKeyword(source) ? getMappedKeyword(new Keyword(source), key) : convertAssociation(source,
key.getProperty());
if (Keyword.isKeyword(value)) {
return getMappedKeyword(new Keyword((DBObject) value), null);
}
return convertSimpleOrDBObject(source, key.getPropertyEntity());
if (documentField.isAssociation()) {
return convertAssociation(value, documentField.getProperty());
}
return convertSimpleOrDBObject(value, documentField.getPropertyEntity());
}
/**
@@ -196,7 +215,7 @@ public class QueryMapper {
}
/**
* Converts the given source assuming it's actually an association to anoter object.
* Converts the given source assuming it's actually an association to another object.
*
* @param source
* @param property
@@ -256,16 +275,27 @@ public class QueryMapper {
String key;
Object value;
Keyword(Object source) {
public Keyword(DBObject source, String key) {
this.key = key;
this.value = source.get(key);
}
Assert.isInstanceOf(DBObject.class, source);
public Keyword(DBObject dbObject) {
DBObject value = (DBObject) source;
Set<String> keys = dbObject.keySet();
Assert.isTrue(keys.size() == 1, "Can only use a single value DBObject!");
Assert.isTrue(value.keySet().size() == 1, "Keyword must have a single key only!");
this.key = keys.iterator().next();
this.value = dbObject.get(key);
}
this.key = value.keySet().iterator().next();
this.value = value.get(key);
/**
* Returns whether the current keyword is the {@code $exists} keyword.
*
* @return
*/
public boolean isExists() {
return "$exists".equalsIgnoreCase(key);
}
/**
@@ -275,7 +305,11 @@ public class QueryMapper {
* @param value
* @return
*/
static boolean isKeyword(Object value) {
public static boolean isKeyword(Object value) {
if (value instanceof String) {
return ((String) value).startsWith("$");
}
if (!(value instanceof DBObject)) {
return false;

View File

@@ -0,0 +1,5 @@
/**
* Spring Data MongoDB specific converter infrastructure.
*/
package org.springframework.data.mongodb.core.convert;

View File

@@ -131,7 +131,7 @@ public class GeoResults<T> implements Iterable<GeoResult<T>> {
private static Distance calculateAverageDistance(List<? extends GeoResult<?>> results, Metric metric) {
if (results.isEmpty()) {
return new Distance(0, null);
return new Distance(0, metric);
}
double averageDistance = 0;

View File

@@ -0,0 +1,5 @@
/**
* Support for MongoDB geo-spatial queries.
*/
package org.springframework.data.mongodb.core.geo;

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,7 +26,6 @@ import java.lang.annotation.Target;
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Philipp Schneider
*/
@Target({ ElementType.TYPE })
@Documented
@@ -70,12 +69,4 @@ public @interface CompoundIndex {
* @return
*/
String collection() default "";
/**
* If {@literal true} the index will be created in the background.
*
* @see http://docs.mongodb.org/manual/core/indexes/#background-construction
* @return
*/
boolean background() default false;
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,20 +18,18 @@ package org.springframework.data.mongodb.core.index;
import java.util.LinkedHashMap;
import java.util.Map;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.query.Order;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
@SuppressWarnings("deprecation")
public class Index implements IndexDefinition {
public enum Duplicates {
RETAIN, DROP
}
private final Map<String, Direction> fieldSpec = new LinkedHashMap<String, Direction>();
private final Map<String, Order> fieldSpec = new LinkedHashMap<String, Order>();
private String name;
@@ -44,37 +42,12 @@ public class Index implements IndexDefinition {
public Index() {
}
public Index(String key, Direction direction) {
fieldSpec.put(key, direction);
}
/**
* Creates a new {@link Indexed} on the given key and {@link Order}.
*
* @deprecated use {@link #Index(String, Direction)} instead.
* @param key must not be {@literal null} or empty.
* @param order must not be {@literal null}.
*/
@Deprecated
public Index(String key, Order order) {
this(key, order.toDirection());
fieldSpec.put(key, order);
}
/**
* Adds the given field to the index.
*
* @deprecated use {@link #on(String, Direction)} instead.
* @param key must not be {@literal null} or empty.
* @param order must not be {@literal null}.
* @return
*/
@Deprecated
public Index on(String key, Order order) {
return on(key, order.toDirection());
}
public Index on(String key, Direction direction) {
fieldSpec.put(key, direction);
fieldSpec.put(key, order);
return this;
}
@@ -103,7 +76,7 @@ public class Index implements IndexDefinition {
public DBObject getIndexKeys() {
DBObject dbo = new BasicDBObject();
for (String k : fieldSpec.keySet()) {
dbo.put(k, fieldSpec.get(k).equals(Direction.ASC) ? 1 : -1);
dbo.put(k, (fieldSpec.get(k).equals(Order.ASCENDING) ? 1 : -1));
}
return dbo;
}
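For illustration, how the key mapping above plays out for a compound definition (the field names are examples):

Index index = new Index().on("lastname", Order.ASCENDING).on("age", Order.DESCENDING);
DBObject keys = index.getIndexKeys(); // { "lastname" : 1 , "age" : -1 }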

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012-2013 the original author or authors.
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,7 +15,6 @@
*/
package org.springframework.data.mongodb.core.index;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
@@ -25,38 +24,30 @@ import org.springframework.util.ObjectUtils;
*
* @author Oliver Gierke
*/
@SuppressWarnings("deprecation")
public final class IndexField {
private final String key;
private final Direction direction;
private final Order order;
private final boolean isGeo;
private IndexField(String key, Direction direction, boolean isGeo) {
private IndexField(String key, Order order, boolean isGeo) {
Assert.hasText(key);
Assert.isTrue(direction != null ^ isGeo);
Assert.isTrue(order != null ^ isGeo);
this.key = key;
this.direction = direction;
this.order = order;
this.isGeo = isGeo;
}
/**
* Creates a default {@link IndexField} with the given key and {@link Order}.
*
* @deprecated use {@link #create(String, Direction)}.
* @param key must not be {@literal null} or empty.
* @param direction must not be {@literal null}.
* @param order must not be {@literal null}.
* @return
*/
@Deprecated
public static IndexField create(String key, Order order) {
Assert.notNull(order);
return new IndexField(key, order.toDirection(), false);
}
public static IndexField create(String key, Direction order) {
Assert.notNull(order);
return new IndexField(key, order, false);
}
@@ -79,23 +70,12 @@ public final class IndexField {
}
/**
* Returns the direction of the {@link IndexField} or {@literal null} in case we have a geo index field.
* Returns the order of the {@link IndexField} or {@literal null} in case we have a geo index field.
*
* @deprecated use {@link #getDirection()} instead.
* @return the direction
* @return the order
*/
@Deprecated
public Order getOrder() {
return Direction.ASC.equals(direction) ? Order.ASCENDING : Order.DESCENDING;
}
/**
* Returns the direction of the {@link IndexField} or {@literal null} in case we have a geo index field.
*
* @return the direction
*/
public Direction getDirection() {
return direction;
return order;
}
/**
@@ -124,8 +104,7 @@ public final class IndexField {
IndexField that = (IndexField) obj;
return this.key.equals(that.key) && ObjectUtils.nullSafeEquals(this.direction, that.direction)
&& this.isGeo == that.isGeo;
return this.key.equals(that.key) && ObjectUtils.nullSafeEquals(this.order, that.order) && this.isGeo == that.isGeo;
}
/*
@@ -137,7 +116,7 @@ public final class IndexField {
int result = 17;
result += 31 * ObjectUtils.nullSafeHashCode(key);
result += 31 * ObjectUtils.nullSafeHashCode(direction);
result += 31 * ObjectUtils.nullSafeHashCode(order);
result += 31 * ObjectUtils.nullSafeHashCode(isGeo);
return result;
}
@@ -148,6 +127,6 @@ public final class IndexField {
*/
@Override
public String toString() {
return String.format("IndexField [ key: %s, direction: %s, isGeo: %s]", key, direction, isGeo);
return String.format("IndexField [ key: %s, order: %s, isGeo: %s]", key, order, isGeo);
}
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,6 +13,7 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import java.lang.annotation.ElementType;
@@ -23,9 +24,7 @@ import java.lang.annotation.Target;
/**
* Mark a field to be indexed using MongoDB's indexing feature.
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Philipp Schneider
* @author Jon Brisbin <jbrisbin@vmware.com>
*/
@Target(ElementType.FIELD)
@Retention(RetentionPolicy.RUNTIME)
@@ -42,12 +41,4 @@ public @interface Indexed {
String name() default "";
String collection() default "";
/**
* If {@literal true} the index will be created in the background.
*
* @see http://docs.mongodb.org/manual/core/indexes/#background-construction
* @return
*/
boolean background() default false;
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -43,7 +43,6 @@ import com.mongodb.util.JSON;
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Philipp Schneider
*/
public class MongoPersistentEntityIndexCreator implements
ApplicationListener<MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty>> {
@@ -107,8 +106,7 @@ public class MongoPersistentEntityIndexCreator implements
String indexColl = StringUtils.hasText(index.collection()) ? index.collection() : entity.getCollection();
DBObject definition = (DBObject) JSON.parse(index.def());
ensureIndex(indexColl, index.name(), definition, index.unique(), index.dropDups(), index.sparse(),
index.background());
ensureIndex(indexColl, index.name(), definition, index.unique(), index.dropDups(), index.sparse());
if (log.isDebugEnabled()) {
log.debug("Created compound index " + index);
@@ -142,8 +140,7 @@ public class MongoPersistentEntityIndexCreator implements
int direction = index.direction() == IndexDirection.ASCENDING ? 1 : -1;
DBObject definition = new BasicDBObject(persistentProperty.getFieldName(), direction);
ensureIndex(collection, name, definition, index.unique(), index.dropDups(), index.sparse(),
index.background());
ensureIndex(collection, name, definition, index.unique(), index.dropDups(), index.sparse());
if (log.isDebugEnabled()) {
log.debug("Created property index " + index);
@@ -194,14 +191,13 @@ public class MongoPersistentEntityIndexCreator implements
* @param sparse sparse or not
*/
protected void ensureIndex(String collection, String name, DBObject indexDefinition, boolean unique,
boolean dropDups, boolean sparse, boolean background) {
boolean dropDups, boolean sparse) {
DBObject opts = new BasicDBObject();
opts.put("name", name);
opts.put("dropDups", dropDups);
opts.put("sparse", sparse);
opts.put("unique", unique);
opts.put("background", background);
mongoDbFactory.getDb().getCollection(collection).ensureIndex(indexDefinition, opts);
}

View File

@@ -0,0 +1,5 @@
/**
* Support for MongoDB document indexing.
*/
package org.springframework.data.mongodb.core.index;

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,19 +16,13 @@
package org.springframework.data.mongodb.core.mapping;
import java.util.Comparator;
import java.util.HashMap;
import java.util.Map;
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.context.expression.BeanFactoryAccessor;
import org.springframework.context.expression.BeanFactoryResolver;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.AssociationHandler;
import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mapping.model.BasicPersistentEntity;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.MongoCollectionUtils;
import org.springframework.data.util.TypeInformation;
import org.springframework.expression.Expression;
@@ -47,7 +41,6 @@ import org.springframework.util.StringUtils;
public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, MongoPersistentProperty> implements
MongoPersistentEntity<T>, ApplicationContextAware {
private static final String AMBIGUOUS_FIELD_MAPPING = "Ambiguous field mapping detected! Both %s and %s map to the same field name %s! Disambiguate using @Field annotation!";
private final String collection;
private final SpelExpressionParser parser;
private final StandardEvaluationContext context;
@@ -96,19 +89,6 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
return expression.getValue(context, String.class);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.model.BasicPersistentEntity#verify()
*/
@Override
public void verify() {
AssertFieldNameUniquenessHandler handler = new AssertFieldNameUniquenessHandler();
doWithProperties(handler);
doWithAssociations(handler);
}
/**
* {@link Comparator} implementation inspecting the {@link MongoPersistentProperty}'s order.
*
@@ -135,37 +115,4 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
return o1.getFieldOrder() - o2.getFieldOrder();
}
}
/**
* Handler to collect {@link MongoPersistentProperty} instances and check that each of them is mapped to a distinct
* field name.
*
* @author Oliver Gierke
*/
private static class AssertFieldNameUniquenessHandler implements PropertyHandler<MongoPersistentProperty>,
AssociationHandler<MongoPersistentProperty> {
private final Map<String, MongoPersistentProperty> properties = new HashMap<String, MongoPersistentProperty>();
public void doWithPersistentProperty(MongoPersistentProperty persistentProperty) {
assertUniqueness(persistentProperty);
}
public void doWithAssociation(Association<MongoPersistentProperty> association) {
assertUniqueness(association.getInverse());
}
private void assertUniqueness(MongoPersistentProperty property) {
String fieldName = property.getFieldName();
MongoPersistentProperty existingProperty = properties.get(fieldName);
if (existingProperty != null) {
throw new MappingException(String.format(AMBIGUOUS_FIELD_MAPPING, property.toString(),
existingProperty.toString(), fieldName));
}
properties.put(fieldName, property);
}
}
}
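A hypothetical mapping the uniqueness handler above rejects: both properties end up on the same MongoDB field, so verify() raises a MappingException with the AMBIGUOUS_FIELD_MAPPING message.

class Contact {
    @Field("fn") String firstname; // explicitly mapped to "fn"
    String fn;                     // implicitly mapped to "fn" as well -> ambiguous
}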

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,7 +26,6 @@ import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.model.AnnotationBasedPersistentProperty;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.util.ReflectionUtils;
import org.springframework.util.StringUtils;
@@ -61,8 +60,6 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
CAUSE_FIELD = ReflectionUtils.findField(Throwable.class, "cause");
}
private final FieldNamingStrategy fieldNamingStrategy;
/**
* Creates a new {@link BasicMongoPersistentProperty}.
*
@@ -70,14 +67,10 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
* @param propertyDescriptor
* @param owner
* @param simpleTypeHolder
* @param fieldNamingStrategy
*/
public BasicMongoPersistentProperty(Field field, PropertyDescriptor propertyDescriptor,
MongoPersistentEntity<?> owner, SimpleTypeHolder simpleTypeHolder, FieldNamingStrategy fieldNamingStrategy) {
MongoPersistentEntity<?> owner, SimpleTypeHolder simpleTypeHolder) {
super(field, propertyDescriptor, owner, simpleTypeHolder);
this.fieldNamingStrategy = fieldNamingStrategy == null ? PropertyNameFieldNamingStrategy.INSTANCE
: fieldNamingStrategy;
if (isIdProperty() && getFieldName() != ID_FIELD_NAME) {
LOG.warn("Customizing field name for id property not allowed! Custom name will not be considered!");
@@ -120,20 +113,9 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
return ID_FIELD_NAME;
}
org.springframework.data.mongodb.core.mapping.Field annotation = findAnnotation(org.springframework.data.mongodb.core.mapping.Field.class);
if (annotation != null && StringUtils.hasText(annotation.value())) {
return annotation.value();
}
String fieldName = fieldNamingStrategy.getFieldName(this);
if (!StringUtils.hasText(fieldName)) {
throw new MappingException(String.format("Invalid (null or empty) field name returned for property %s by %s!",
this, fieldNamingStrategy.getClass()));
}
return fieldName;
org.springframework.data.mongodb.core.mapping.Field annotation = getField().getAnnotation(
org.springframework.data.mongodb.core.mapping.Field.class);
return annotation != null && StringUtils.hasText(annotation.value()) ? annotation.value() : field.getName();
}
/*

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -38,11 +38,10 @@ public class CachingMongoPersistentProperty extends BasicMongoPersistentProperty
* @param propertyDescriptor
* @param owner
* @param simpleTypeHolder
* @param fieldNamingStrategy
*/
public CachingMongoPersistentProperty(Field field, PropertyDescriptor propertyDescriptor,
MongoPersistentEntity<?> owner, SimpleTypeHolder simpleTypeHolder, FieldNamingStrategy fieldNamingStrategy) {
super(field, propertyDescriptor, owner, simpleTypeHolder, fieldNamingStrategy);
MongoPersistentEntity<?> owner, SimpleTypeHolder simpleTypeHolder) {
super(field, propertyDescriptor, owner, simpleTypeHolder);
}
/*

View File

@@ -1,46 +0,0 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
import java.util.Locale;
/**
* {@link FieldNamingStrategy} that abbreviates field names by using the very first letter of the camel case parts of
* the {@link MongoPersistentProperty}'s name.
*
* @since 1.3
* @author Oliver Gierke
*/
public class CamelCaseAbbreviatingFieldNamingStrategy implements FieldNamingStrategy {
private static final String CAMEL_CASE_PATTERN = "(?<!(^|[A-Z]))(?=[A-Z])|(?<!^)(?=[A-Z][a-z])";
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.FieldNamingStrategy#getFieldName(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty)
*/
public String getFieldName(MongoPersistentProperty property) {
String[] parts = property.getName().split(CAMEL_CASE_PATTERN);
StringBuilder builder = new StringBuilder();
for (String part : parts) {
builder.append(part.substring(0, 1).toLowerCase(Locale.US));
}
return builder.toString();
}
}
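A hedged sketch of the abbreviation strategy above wired into the mapping context via the setter that appears further down in this compare; the property names are examples.

MongoMappingContext context = new MongoMappingContext();
context.setFieldNamingStrategy(new CamelCaseAbbreviatingFieldNamingStrategy());
// with this strategy a property "firstName" is stored as "fn", "emailAddress" as "ea"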

View File

@@ -1,36 +0,0 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
/**
* SPI interface to determine how to name document fields in cases the field name is not manually defined.
*
* @see Field
* @see PropertyNameFieldNamingStrategy
* @see CamelCaseAbbreviatingFieldNamingStrategy
* @since 1.3
* @author Oliver Gierke
*/
public interface FieldNamingStrategy {
/**
* Returns the field name to be used for the given {@link MongoPersistentProperty}.
*
* @param property must not be {@literal null} or empty.
* @return
*/
String getFieldName(MongoPersistentProperty property);
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -37,9 +37,6 @@ import org.springframework.data.util.TypeInformation;
public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersistentEntity<?>, MongoPersistentProperty>
implements ApplicationContextAware {
private static final FieldNamingStrategy DEFAULT_NAMING_STRATEGY = PropertyNameFieldNamingStrategy.INSTANCE;
private FieldNamingStrategy fieldNamingStrategy = DEFAULT_NAMING_STRATEGY;
private ApplicationContext context;
/**
@@ -49,17 +46,6 @@ public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersis
setSimpleTypeHolder(MongoSimpleTypes.HOLDER);
}
/**
* Configures the {@link FieldNamingStrategy} to be used to determine the field name if no manual mapping is applied.
* Defaults to a strategy using the plain property name.
*
* @param fieldNamingStrategy the {@link FieldNamingStrategy} to be used to determine the field name if no manual
* mapping is applied.
*/
public void setFieldNamingStrategy(FieldNamingStrategy fieldNamingStrategy) {
this.fieldNamingStrategy = fieldNamingStrategy == null ? DEFAULT_NAMING_STRATEGY : fieldNamingStrategy;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.context.AbstractMappingContext#shouldCreatePersistentEntityFor(org.springframework.data.util.TypeInformation)
@@ -76,7 +62,7 @@ public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersis
@Override
public MongoPersistentProperty createPersistentProperty(Field field, PropertyDescriptor descriptor,
BasicMongoPersistentEntity<?> owner, SimpleTypeHolder simpleTypeHolder) {
return new CachingMongoPersistentProperty(field, descriptor, owner, simpleTypeHolder, fieldNamingStrategy);
return new CachingMongoPersistentProperty(field, descriptor, owner, simpleTypeHolder);
}
/*
@@ -101,6 +87,8 @@ public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersis
*/
@Override
public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
this.context = applicationContext;
super.setApplicationContext(applicationContext);
}
}

View File

@@ -1,35 +0,0 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
/**
* {@link FieldNamingStrategy} simply using the {@link MongoPersistentProperty}'s name.
*
* @since 1.3
* @author Oliver Gierke
*/
public enum PropertyNameFieldNamingStrategy implements FieldNamingStrategy {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.FieldNamingStrategy#getFieldName(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty)
*/
public String getFieldName(MongoPersistentProperty property) {
return property.getName();
}
}

View File

@@ -0,0 +1,167 @@
/*
* Copyright 2012-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
import java.beans.PropertyDescriptor;
import java.lang.reflect.Field;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.context.AbstractMappingContext;
import org.springframework.data.mapping.model.AbstractPersistentProperty;
import org.springframework.data.mapping.model.BasicPersistentEntity;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.data.mongodb.MongoCollectionUtils;
import org.springframework.data.util.TypeInformation;
/**
* @deprecated use {@link MongoMappingContext} instead.
* @author Oliver Gierke
*/
@Deprecated
public class SimpleMongoMappingContext extends
AbstractMappingContext<SimpleMongoMappingContext.SimpleMongoPersistentEntity<?>, MongoPersistentProperty> {
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.context.AbstractMappingContext#createPersistentEntity(org.springframework.data.util.TypeInformation)
*/
@Override
protected <T> SimpleMongoPersistentEntity<T> createPersistentEntity(TypeInformation<T> typeInformation) {
return new SimpleMongoPersistentEntity<T>(typeInformation);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.context.AbstractMappingContext#createPersistentProperty(java.lang.reflect.Field, java.beans.PropertyDescriptor, org.springframework.data.mapping.model.MutablePersistentEntity, org.springframework.data.mapping.model.SimpleTypeHolder)
*/
@Override
protected SimplePersistentProperty createPersistentProperty(Field field, PropertyDescriptor descriptor,
SimpleMongoPersistentEntity<?> owner, SimpleTypeHolder simpleTypeHolder) {
return new SimplePersistentProperty(field, descriptor, owner, simpleTypeHolder);
}
static class SimplePersistentProperty extends AbstractPersistentProperty<MongoPersistentProperty> implements
MongoPersistentProperty {
private static final List<String> ID_FIELD_NAMES = Arrays.asList("id", "_id");
/**
* Creates a new {@link SimplePersistentProperty}.
*
* @param field
* @param propertyDescriptor
* @param information
*/
public SimplePersistentProperty(Field field, PropertyDescriptor propertyDescriptor, MongoPersistentEntity<?> owner,
SimpleTypeHolder simpleTypeHolder) {
super(field, propertyDescriptor, owner, simpleTypeHolder);
}
/* (non-Javadoc)
* @see org.springframework.data.mapping.BasicPersistentProperty#isIdProperty()
*/
public boolean isIdProperty() {
return ID_FIELD_NAMES.contains(field.getName());
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentProperty#getKey()
*/
public String getFieldName() {
return isIdProperty() ? "_id" : getName();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentProperty#getFieldOrder()
*/
public int getFieldOrder() {
return Integer.MAX_VALUE;
}
/* (non-Javadoc)
* @see org.springframework.data.mapping.AbstractPersistentProperty#createAssociation()
*/
@Override
protected Association<MongoPersistentProperty> createAssociation() {
return new Association<MongoPersistentProperty>(this, null);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentProperty#isDbReference()
*/
public boolean isDbReference() {
return false;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentProperty#getDBRef()
*/
public DBRef getDBRef() {
return null;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentProperty#isVersion()
*/
public boolean isVersionProperty() {
return false;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#usePropertyAccess()
*/
public boolean usePropertyAccess() {
return false;
}
}
static class SimpleMongoPersistentEntity<T> extends BasicPersistentEntity<T, MongoPersistentProperty> implements
MongoPersistentEntity<T> {
/**
* @param information
*/
public SimpleMongoPersistentEntity(TypeInformation<T> information) {
super(information);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentEntity#getCollection()
*/
public String getCollection() {
return MongoCollectionUtils.getPreferredCollectionName(getType());
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentEntity#getVersionProperty()
*/
public MongoPersistentProperty getVersionProperty() {
return null;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentEntity#hasVersionProperty()
*/
public boolean hasVersionProperty() {
return false;
}
}
}

View File

@@ -1,50 +0,0 @@
/*
* Copyright 2013 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping.event;
import com.mongodb.DBObject;
/**
* Base class for delete events.
*
* @author Martin Baumgartner
*/
public abstract class AbstractDeleteEvent<T> extends MongoMappingEvent<DBObject> {
private static final long serialVersionUID = 1L;
private final Class<T> type;
/**
* Creates a new {@link AbstractDeleteEvent} for the given {@link DBObject} and type.
*
* @param dbo must not be {@literal null}.
* @param type can be {@literal null}.
*/
public AbstractDeleteEvent(DBObject dbo, Class<T> type) {
super(dbo, dbo);
this.type = type;
}
/**
* Returns the type for which the {@link AbstractDeleteEvent} shall be invoked for.
*
* @return
*/
public Class<T> getType() {
return type;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 by the original author(s).
* Copyright 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,7 +27,6 @@ import com.mongodb.DBObject;
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Martin Baumgartner
*/
public abstract class AbstractMongoEventListener<E> implements ApplicationListener<MongoMappingEvent<?>> {
@@ -46,7 +45,6 @@ public abstract class AbstractMongoEventListener<E> implements ApplicationListen
* (non-Javadoc)
* @see org.springframework.context.ApplicationListener#onApplicationEvent(org.springframework.context.ApplicationEvent)
*/
@SuppressWarnings("rawtypes")
public void onApplicationEvent(MongoMappingEvent<?> event) {
if (event instanceof AfterLoadEvent) {
@@ -59,22 +57,6 @@ public abstract class AbstractMongoEventListener<E> implements ApplicationListen
return;
}
if (event instanceof AbstractDeleteEvent) {
Class<?> eventDomainType = ((AbstractDeleteEvent) event).getType();
if (eventDomainType != null && domainClass.isAssignableFrom(eventDomainType)) {
if (event instanceof BeforeDeleteEvent) {
onBeforeDelete(event.getDBObject());
}
if (event instanceof AfterDeleteEvent) {
onAfterDelete(event.getDBObject());
}
}
return;
}
@SuppressWarnings("unchecked")
E source = (E) event.getSource();
@@ -96,43 +78,31 @@ public abstract class AbstractMongoEventListener<E> implements ApplicationListen
public void onBeforeConvert(E source) {
if (LOG.isDebugEnabled()) {
LOG.debug("onBeforeConvert({})", source);
LOG.debug("onBeforeConvert(" + source + ")");
}
}
public void onBeforeSave(E source, DBObject dbo) {
if (LOG.isDebugEnabled()) {
LOG.debug("onBeforeSave({}, {})", source, dbo);
LOG.debug("onBeforeSave(" + source + ", " + dbo + ")");
}
}
public void onAfterSave(E source, DBObject dbo) {
if (LOG.isDebugEnabled()) {
LOG.debug("onAfterSave({}, {})", source, dbo);
LOG.debug("onAfterSave(" + source + ", " + dbo + ")");
}
}
public void onAfterLoad(DBObject dbo) {
if (LOG.isDebugEnabled()) {
LOG.debug("onAfterLoad({})", dbo);
LOG.debug("onAfterLoad(" + dbo + ")");
}
}
public void onAfterConvert(DBObject dbo, E source) {
if (LOG.isDebugEnabled()) {
LOG.debug("onAfterConvert({}, {})", dbo, source);
}
}
public void onAfterDelete(DBObject dbo) {
if (LOG.isDebugEnabled()) {
LOG.debug("onAfterConvert({})", dbo);
}
}
public void onBeforeDelete(DBObject dbo) {
if (LOG.isDebugEnabled()) {
LOG.debug("onAfterConvert({})", dbo);
LOG.debug("onAfterConvert(" + dbo + "," + source + ")");
}
}
}
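For illustration, a minimal listener overriding the delete callbacks that appear in the hunk above; Person is an example domain type.

public class PersonDeleteListener extends AbstractMongoEventListener<Person> {

    @Override
    public void onBeforeDelete(DBObject dbo) {
        // dbo is the query document the remove operation will use
    }

    @Override
    public void onAfterDelete(DBObject dbo) {
        // called once the matching documents have been removed
    }
}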

View File

@@ -1,39 +0,0 @@
/*
* Copyright 2013 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping.event;
import com.mongodb.DBObject;
/**
* Event being thrown after a single document or a set of documents has been deleted. The {@link DBObject} held in the
* event will be the query document <em>after</em> it has been mapped onto the domain type handled.
*
* @author Martin Baumgartner
*/
public class AfterDeleteEvent<T> extends AbstractDeleteEvent<T> {
private static final long serialVersionUID = 1L;
/**
* Creates a new {@link AfterDeleteEvent} for the given {@link DBObject} and type.
*
* @param dbo must not be {@literal null}.
* @param type can be {@literal null}.
*/
public AfterDeleteEvent(DBObject dbo, Class<T> type) {
super(dbo, type);
}
}

View File

@@ -1,39 +0,0 @@
/*
* Copyright 2013 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping.event;
import com.mongodb.DBObject;
/**
* Event being thrown before a document is deleted. The {@link DBObject} held in the event will represent the query
* document <em>before</em> being mapped based on the domain class handled.
*
* @author Martin Baumgartner
*/
public class BeforeDeleteEvent<T> extends AbstractDeleteEvent<T> {
private static final long serialVersionUID = -2627547705679734497L;
/**
* Creates a new {@link BeforeDeleteEvent} for the given {@link DBObject} and type.
*
* @param dbo must not be {@literal null}.
* @param type can be {@literal null}.
*/
public BeforeDeleteEvent(DBObject dbo, Class<T> type) {
super(dbo, type);
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,19 +17,15 @@ package org.springframework.data.mongodb.core.mapping.event;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.context.ApplicationListener;
import com.mongodb.DBObject;
/**
* {@link ApplicationListener} for Mongo mapping events logging the events.
*
* @author Jon Brisbin
* @author Martin Baumgartner
* @author Jon Brisbin <jbrisbin@vmware.com>
*/
public class LoggingEventListener extends AbstractMongoEventListener<Object> {
private static final Logger LOGGER = LoggerFactory.getLogger(LoggingEventListener.class);
private static final Logger log = LoggerFactory.getLogger(LoggingEventListener.class);
/*
* (non-Javadoc)
@@ -37,7 +33,7 @@ public class LoggingEventListener extends AbstractMongoEventListener<Object> {
*/
@Override
public void onBeforeConvert(Object source) {
LOGGER.info("onBeforeConvert: {}", source);
log.info("onBeforeConvert: " + source);
}
/*
@@ -46,7 +42,10 @@ public class LoggingEventListener extends AbstractMongoEventListener<Object> {
*/
@Override
public void onBeforeSave(Object source, DBObject dbo) {
LOGGER.info("onBeforeSave: {}, {}", source, dbo);
try {
log.info("onBeforeSave: " + source + ", " + dbo);
} catch (Throwable ignored) {
}
}
/*
@@ -55,7 +54,7 @@ public class LoggingEventListener extends AbstractMongoEventListener<Object> {
*/
@Override
public void onAfterSave(Object source, DBObject dbo) {
LOGGER.info("onAfterSave: {}, {}", source, dbo);
log.info("onAfterSave: " + source + ", " + dbo);
}
/*
@@ -64,7 +63,7 @@ public class LoggingEventListener extends AbstractMongoEventListener<Object> {
*/
@Override
public void onAfterLoad(DBObject dbo) {
LOGGER.info("onAfterLoad: {}", dbo);
log.info("onAfterLoad: " + dbo);
}
/*
@@ -73,24 +72,6 @@ public class LoggingEventListener extends AbstractMongoEventListener<Object> {
*/
@Override
public void onAfterConvert(DBObject dbo, Object source) {
LOGGER.info("onAfterConvert: {}, {}", dbo, source);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener#onAfterDelete(com.mongodb.DBObject)
*/
@Override
public void onAfterDelete(DBObject dbo) {
LOGGER.info("onAfterDelete: {}", dbo);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener#onBeforeDelete(com.mongodb.DBObject)
*/
@Override
public void onBeforeDelete(DBObject dbo) {
LOGGER.info("onBeforeDelete: {}", dbo);
log.info("onAfterConvert: " + dbo + ", " + source);
}
}

View File

@@ -0,0 +1,5 @@
/**
* Mapping event callback infrastructure for the MongoDB document-to-object mapping subsystem.
*/
package org.springframework.data.mongodb.core.mapping.event;

View File

@@ -0,0 +1,5 @@
/**
* Infrastructure for the MongoDB document-to-object mapping subsystem.
*/
package org.springframework.data.mongodb.core.mapping;

View File

@@ -0,0 +1,5 @@
/**
* Support for MongoDB map-reduce operations.
*/
package org.springframework.data.mongodb.core.mapreduce;

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -39,6 +39,10 @@ import com.mongodb.DBObject;
/**
* Central class for creating queries. It follows a fluent API style so that you can easily chain together multiple
* criteria. Static import of the 'Criteria.where' method will improve readability.
*
* @author Thomas Risberg
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class Criteria implements CriteriaDefinition {
@@ -396,34 +400,54 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates an 'or' criteria using the $or operator for all of the provided criteria
* <p>
* Note that MongoDB doesn't support wrapping an $or operator in a $not operator.
* <p>
*
* @throws IllegalArgumentException if {@link #orOperator(Criteria...)} follows a not() call directly.
* @param criteria
*/
public Criteria orOperator(Criteria... criteria) {
BasicDBList bsonList = createCriteriaList(criteria);
criteriaChain.add(new Criteria("$or").is(bsonList));
return this;
return registerCriteriaChainElement(new Criteria("$or").is(bsonList));
}
/**
* Creates a 'nor' criteria using the $nor operator for all of the provided criteria
* Creates a 'nor' criteria using the $nor operator for all of the provided criteria.
* <p>
* Note that MongoDB doesn't support wrapping a $nor operator in a $not operator.
* <p>
*
* @throws IllegalArgumentException if {@link #norOperator(Criteria...)} follows a not() call directly.
* @param criteria
*/
public Criteria norOperator(Criteria... criteria) {
BasicDBList bsonList = createCriteriaList(criteria);
criteriaChain.add(new Criteria("$nor").is(bsonList));
return this;
return registerCriteriaChainElement(new Criteria("$nor").is(bsonList));
}
/**
* Creates an 'and' criteria using the $and operator for all of the provided criteria
* Creates an 'and' criteria using the $and operator for all of the provided criteria.
* <p>
* Note that MongoDB doesn't support wrapping an $and operator in a $not operator.
* <p>
*
* @throws IllegalArgumentException if {@link #andOperator(Criteria...)} follows a not() call directly.
* @param criteria
*/
public Criteria andOperator(Criteria... criteria) {
BasicDBList bsonList = createCriteriaList(criteria);
criteriaChain.add(new Criteria("$and").is(bsonList));
return registerCriteriaChainElement(new Criteria("$and").is(bsonList));
}
private Criteria registerCriteriaChainElement(Criteria criteria) {
if (lastOperatorWasNot()) {
throw new IllegalArgumentException("operator $not is not allowed around criteria chain element: "
+ criteria.getCriteriaObject());
} else {
criteriaChain.add(criteria);
}
return this;
}
@@ -468,6 +492,7 @@ public class Criteria implements CriteriaDefinition {
}
}
}
DBObject queryCriteria = new BasicDBObject();
if (isValue != NOT_SET) {
queryCriteria.put(this.key, this.isValue);

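A short usage sketch of the new guard (field names and values are made up): a $not around a simple operator still works, but chaining one of the $and/$or/$nor methods directly after not() is rejected with an IllegalArgumentException, because MongoDB cannot wrap those operators in $not:

Criteria valid = Criteria.where("age").not().gt(18);            // $not around a simple operator is still allowed

Criteria invalid = Criteria.where("age").not()
		.orOperator(Criteria.where("status").is("active"));     // throws IllegalArgumentException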
View File

@@ -17,26 +17,16 @@ package org.springframework.data.mongodb.core.query;
import java.util.HashMap;
import java.util.Map;
import java.util.Map.Entry;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* @author Thomas Risberg
* @author Oliver Gierke
* @author Patryk Wasik
*/
public class Field {
private final Map<String, Integer> criteria = new HashMap<String, Integer>();
private final Map<String, Object> slices = new HashMap<String, Object>();
private final Map<String, Criteria> elemMatchs = new HashMap<String, Criteria>();
private String postionKey;
private int positionValue;
public Field include(String key) {
criteria.put(key, Integer.valueOf(1));
@@ -58,50 +48,14 @@ public class Field {
return this;
}
public Field elemMatch(String key, Criteria elemMatchCriteria) {
elemMatchs.put(key, elemMatchCriteria);
return this;
}
/**
* The array field must appear in the query. Only one positional {@code $} operator can appear in the projection and
* only one array field can appear in the query.
*
* @param field query array field, must not be {@literal null} or empty.
* @param value
* @return
*/
public Field position(String field, int value) {
Assert.hasText(field, "Field must not be null or empty!");
postionKey = field;
positionValue = value;
return this;
}
public DBObject getFieldsObject() {
DBObject dbo = new BasicDBObject();
for (String k : criteria.keySet()) {
dbo.put(k, criteria.get(k));
}
for (String k : slices.keySet()) {
dbo.put(k, new BasicDBObject("$slice", slices.get(k)));
}
for (Entry<String, Criteria> entry : elemMatchs.entrySet()) {
DBObject dbObject = new BasicDBObject("$elemMatch", entry.getValue().getCriteriaObject());
dbo.put(entry.getKey(), dbObject);
}
if (postionKey != null) {
dbo.put(postionKey + ".$", positionValue);
}
return dbo;
}
@@ -130,15 +84,7 @@ public class Field {
return false;
}
if (!this.elemMatchs.equals(that.elemMatchs)) {
return false;
}
boolean samePositionKey = this.postionKey == null ? that.postionKey == null : this.postionKey
.equals(that.postionKey);
boolean samePositionValue = this.positionValue == that.positionValue;
return samePositionKey && samePositionValue;
return true;
}
/*
@@ -151,10 +97,7 @@ public class Field {
int result = 17;
result += 31 * ObjectUtils.nullSafeHashCode(this.criteria);
result += 31 * ObjectUtils.nullSafeHashCode(this.elemMatchs);
result += 31 * ObjectUtils.nullSafeHashCode(this.slices);
result += 31 * ObjectUtils.nullSafeHashCode(this.postionKey);
result += 31 * ObjectUtils.nullSafeHashCode(this.positionValue);
return result;
}

View File

@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.core.query;
import org.springframework.data.domain.Pageable;
import org.springframework.data.mongodb.core.geo.CustomMetric;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Metric;
@@ -29,6 +30,7 @@ import com.mongodb.DBObject;
* Builder class to build near-queries.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class NearQuery {
@@ -38,6 +40,7 @@ public class NearQuery {
private Metric metric;
private boolean spherical;
private Integer num;
private Integer skip;
/**
* Creates a new {@link NearQuery}.
@@ -116,7 +119,7 @@ public class NearQuery {
}
/**
* Configures the number of results to return.
* Configures the maximum number of results to return.
*
* @param num
* @return
@@ -126,6 +129,29 @@ public class NearQuery {
return this;
}
/**
* Configures the number of results to skip.
*
* @param skip
* @return
*/
public NearQuery skip(int skip) {
this.skip = skip;
return this;
}
/**
* Configures the {@link Pageable} to use.
*
* @param pageable
* @return
*/
public NearQuery with(Pageable pageable) {
this.num = pageable.getOffset() + pageable.getPageSize();
this.skip = pageable.getOffset();
return this;
}
/**
* Sets the max distance results shall have from the configured origin. If a {@link Metric} was set before, the given
* value will be interpreted as being a value in that metric. E.g.
@@ -290,9 +316,18 @@ public class NearQuery {
*/
public NearQuery query(Query query) {
this.query = query;
this.skip = query.getSkip();
this.num = query.getLimit();
return this;
}
/**
* @return the number of elements to skip.
*/
public Integer getSkip() {
return skip;
}
/**
* Returns the {@link DBObject} built by the {@link NearQuery}.
*

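A sketch of the paging support added here (coordinates and page values are illustrative); per the with(Pageable) implementation shown above, the offset becomes skip and the offset plus page size becomes num:

import org.springframework.data.domain.PageRequest;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Metrics;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.query.NearQuery;

NearQuery nearQuery = NearQuery.near(new Point(-73.99, 40.73))
		.maxDistance(new Distance(5, Metrics.KILOMETERS))
		.with(new PageRequest(1, 10));   // skip = 10, num = 20 (offset + page size)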
View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,31 +15,11 @@
*/
package org.springframework.data.mongodb.core.query;
import org.springframework.data.domain.Sort.Direction;
/**
* An enum that specifies the ordering for sort or index specifications
*
* @deprecated prefer {@link Direction}
* @author Thomas Risberg
* @author Oliver Gierke
* @author trisberg
*/
@Deprecated
public enum Order {
ASCENDING {
@Override
public Direction toDirection() {
return Direction.ASC;
}
},
DESCENDING {
@Override
public Direction toDirection() {
return Direction.DESC;
}
};
public abstract Direction toDirection();
ASCENDING, DESCENDING
}

View File

@@ -24,7 +24,6 @@ import java.util.List;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Order;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
import org.springframework.util.Assert;
@@ -39,7 +38,9 @@ public class Query {
private LinkedHashMap<String, Criteria> criteria = new LinkedHashMap<String, Criteria>();
private Field fieldSpec;
private Sort sort;
private Sort coreSort;
@SuppressWarnings("deprecation")
private org.springframework.data.mongodb.core.query.Sort sort;
private int skip;
private int limit;
private String hint;
@@ -114,6 +115,21 @@ public class Query {
return this;
}
/**
* Returns a {@link org.springframework.data.mongodb.core.query.Sort} instance to define ordering properties.
*
* @deprecated use {@link #with(Sort)} instead
* @return
*/
@Deprecated
public org.springframework.data.mongodb.core.query.Sort sort() {
if (this.sort == null) {
this.sort = new org.springframework.data.mongodb.core.query.Sort();
}
return this.sort;
}
/**
* Sets the given pagination information on the {@link Query} instance. Will transparently set {@code skip} and
* {@code limit} as well as applying the {@link Sort} instance defined with the {@link Pageable}.
@@ -145,17 +161,10 @@ public class Query {
return this;
}
for (Order order : sort) {
if (order.isIgnoreCase()) {
throw new IllegalArgumentException(String.format("Given sort contained an Order for %s with ignore case! "
+ "MongoDB does not support sorting ignoring case currently!", order.getProperty()));
}
}
if (this.sort == null) {
this.sort = sort;
if (this.coreSort == null) {
this.coreSort = sort;
} else {
this.sort = this.sort.and(sort);
this.coreSort = this.coreSort.and(sort);
}
return this;
@@ -178,20 +187,25 @@ public class Query {
return fieldSpec.getFieldsObject();
}
@SuppressWarnings("deprecation")
public DBObject getSortObject() {
if (this.sort == null && this.sort == null) {
if (this.coreSort == null && this.sort == null) {
return null;
}
DBObject dbo = new BasicDBObject();
if (this.sort != null) {
for (org.springframework.data.domain.Sort.Order order : this.sort) {
if (this.coreSort != null) {
for (org.springframework.data.domain.Sort.Order order : this.coreSort) {
dbo.put(order.getProperty(), order.isAscending() ? 1 : -1);
}
}
if (this.sort != null) {
dbo.putAll(this.sort.getSortObject());
}
return dbo;
}

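A small sketch contrasting the two sorting styles visible in the diff above (property names are illustrative); both end up in the document returned by getSortObject():

import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

Query query = new Query(Criteria.where("lastname").is("Matthews"));

// preferred: the Spring Data core Sort abstraction
query.with(new Sort(Direction.ASC, "firstname"));

// deprecated: the module's own Sort, obtained via the deprecated sort() method
query.sort().on("firstname", org.springframework.data.mongodb.core.query.Order.ASCENDING);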
View File

@@ -0,0 +1,56 @@
/*
* Copyright 2010-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.query;
import java.util.LinkedHashMap;
import java.util.Map;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Helper class to define sorting criteria for a Query instance.
*
* @author Thomas Risberg
* @author Oliver Gierke
* @deprecated use {@link org.springframework.data.domain.Sort} instead. See
* {@link Query#with(org.springframework.data.domain.Sort)}.
*/
@Deprecated
public class Sort {
private Map<String, Order> fieldSpec = new LinkedHashMap<String, Order>();
public Sort() {
}
public Sort(String key, Order order) {
fieldSpec.put(key, order);
}
public Sort on(String key, Order order) {
fieldSpec.put(key, order);
return this;
}
public DBObject getSortObject() {
DBObject dbo = new BasicDBObject();
for (String k : fieldSpec.keySet()) {
dbo.put(k, fieldSpec.get(k).equals(Order.ASCENDING) ? 1 : -1);
}
return dbo;
}
}

View File

@@ -217,22 +217,23 @@ public class Update {
return dbo;
}
@SuppressWarnings("unchecked")
protected void addMultiFieldOperation(String operator, String key, Object value) {
Object existingValue = this.modifierOps.get(operator);
LinkedHashMap<String, Object> keyValueMap;
DBObject keyValueMap;
if (existingValue == null) {
keyValueMap = new LinkedHashMap<String, Object>();
keyValueMap = new BasicDBObject();
this.modifierOps.put(operator, keyValueMap);
} else {
if (existingValue instanceof LinkedHashMap) {
keyValueMap = (LinkedHashMap<String, Object>) existingValue;
if (existingValue instanceof BasicDBObject) {
keyValueMap = (BasicDBObject) existingValue;
} else {
throw new InvalidDataAccessApiUsageException("Modifier Operations should be a LinkedHashMap but was "
+ existingValue.getClass());
}
}
keyValueMap.put(key, value);
}
}

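Since addMultiFieldOperation(...) is protected, a hypothetical subclass (purely illustrative, not part of the module) can show the merging behaviour: two calls with the same operator end up in a single BasicDBObject registered under that operator, as the method body above implements:

import org.springframework.data.mongodb.core.query.Update;

class MultiSetUpdate extends Update {

	MultiSetUpdate setBoth(String key1, Object value1, String key2, Object value2) {
		addMultiFieldOperation("$set", key1, value1);
		addMultiFieldOperation("$set", key2, value2); // merged into the same "$set" document
		return this;
	}
}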
View File

@@ -0,0 +1,5 @@
/**
* Support for MongoDB GridFS feature.
*/
package org.springframework.data.mongodb.gridfs;

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2002-2011 the original author or authors.
* Copyright 2002-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,19 +15,20 @@
*/
package org.springframework.data.mongodb.monitor;
import java.net.InetAddress;
import java.net.UnknownHostException;
import com.mongodb.Mongo;
import org.springframework.jmx.export.annotation.ManagedMetric;
import org.springframework.jmx.export.annotation.ManagedOperation;
import org.springframework.jmx.export.annotation.ManagedResource;
import org.springframework.jmx.support.MetricType;
import com.mongodb.Mongo;
/**
* Expose basic server information via JMX
*
* @author Mark Pollack
* @author Thomas Darimont
*/
@ManagedResource(description = "Server Information")
public class ServerInfo extends AbstractMonitor {
@@ -36,9 +37,20 @@ public class ServerInfo extends AbstractMonitor {
this.mongo = mongo;
}
/**
* Returns the hostname of the server in use as reported by MongoDB.
*
* @return the reported hostname, which can also be an IP address.
* @throws UnknownHostException
*/
@ManagedOperation(description = "Server host name")
public String getHostName() throws UnknownHostException {
return InetAddress.getLocalHost().getHostName();
/*
* UnknownHostException is not necessary anymore, but clients could have
* called this method in a try..catch(UnknownHostException) already
*/
return getServerStatus().getServerUsed().getHost();
}
@ManagedMetric(displayName = "Uptime Estimate")

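The two implementations shown above differ: one returns the local machine's hostname, the other the host reported by the server's serverStatus. A minimal sketch of reading the value directly (connection details are illustrative):

Mongo mongo = new Mongo("localhost", 27017);
ServerInfo serverInfo = new ServerInfo(mongo);
String host = serverInfo.getHostName(); // hostname or IP as reported by the MongoDB server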
View File

@@ -0,0 +1,5 @@
/**
* Spring Data's MongoDB abstraction.
*/
package org.springframework.data.mongodb;

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,11 +15,7 @@
*/
package org.springframework.data.mongodb.repository;
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.annotation.*;
/**
* Annotation to declare finder queries directly on repository methods. Both attributes allow using a placeholder
@@ -47,12 +43,4 @@ public @interface Query {
* @return
*/
String fields() default "";
/**
* Returns whether the query defined should be executed as count projection.
*
* @since 1.3
* @return
*/
boolean count() default false;
}

View File

@@ -53,6 +53,14 @@ public class MongoRepositoryBean<T> extends CdiRepositoryBean<T> {
this.operations = operations;
}
/*
* (non-Javadoc)
* @see javax.enterprise.inject.spi.Bean#getScope()
*/
public Class<? extends Annotation> getScope() {
return operations.getScope();
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.cdi.CdiRepositoryBean#create(javax.enterprise.context.spi.CreationalContext, java.lang.Class)

View File

@@ -0,0 +1,5 @@
/**
* CDI support for MongoDB specific repository implementation.
*/
package org.springframework.data.mongodb.repository.cdi;

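The CDI support picks up MongoDB repository interfaces once a MongoOperations instance is exposed as a CDI bean; a minimal producer sketch (database name and connection are illustrative assumptions):

import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Produces;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;

import com.mongodb.Mongo;

class MongoTemplateProducer {

	@Produces
	@ApplicationScoped
	MongoOperations createMongoTemplate() throws Exception {
		return new MongoTemplate(new Mongo(), "database");
	}
}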
View File

@@ -0,0 +1,5 @@
/**
* Support infrastructure for the configuration of MongoDB specific repositories.
*/
package org.springframework.data.mongodb.repository.config;

View File

@@ -17,8 +17,6 @@ package org.springframework.data.mongodb.repository.query;
import java.util.List;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.data.domain.PageImpl;
import org.springframework.data.domain.Pageable;
import org.springframework.data.mongodb.core.MongoOperations;
@@ -38,11 +36,10 @@ import org.springframework.util.Assert;
* Base class for {@link RepositoryQuery} implementations for Mongo.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public abstract class AbstractMongoQuery implements RepositoryQuery {
private static final ConversionService CONVERSION_SERVICE = new DefaultConversionService();
private final MongoQueryMethod method;
private final MongoOperations operations;
@@ -90,23 +87,19 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
return new CollectionExecution(accessor.getPageable()).execute(query);
} else if (method.isPageQuery()) {
return new PagedExecution(accessor.getPageable()).execute(query);
} else {
return new SingleEntityExecution().execute(query);
}
Object result = new SingleEntityExecution(isCountQuery()).execute(query);
if (result == null) {
return result;
}
Class<?> expectedReturnType = method.getReturnType().getType();
if (expectedReturnType.isAssignableFrom(result.getClass())) {
return result;
}
return CONVERSION_SERVICE.convert(result, expectedReturnType);
}
/**
* Creates a {@link Query} instance using the given {@link ParameterAccessor}
*
* @param accessor must not be {@literal null}.
* @return
*/
protected abstract Query createQuery(ConvertingParameterAccessor accessor);
/**
* Creates a {@link Query} instance using the given {@link ConvertingParameterAccessor}. Will delegate to
* {@link #createQuery(ConvertingParameterAccessor)} by default but allows customization of the count query to be
@@ -119,21 +112,6 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
return createQuery(accessor);
}
/**
* Creates a {@link Query} instance using the given {@link ParameterAccessor}
*
* @param accessor must not be {@literal null}.
* @return
*/
protected abstract Query createQuery(ConvertingParameterAccessor accessor);
/**
* Returns whether the query should get a count projection applied.
*
* @return
*/
protected abstract boolean isCountQuery();
private abstract class Execution {
abstract Object execute(Query query);
@@ -214,12 +192,6 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
*/
class SingleEntityExecution extends Execution {
private final boolean countProjection;
private SingleEntityExecution(boolean countProjection) {
this.countProjection = countProjection;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.AbstractMongoQuery.Execution#execute(org.springframework.data.mongodb.core.core.query.Query)
@@ -228,8 +200,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
Object execute(Query query) {
MongoEntityMetadata<?> metadata = method.getEntityInformation();
return countProjection ? operations.count(query, metadata.getJavaType()) : operations.findOne(query,
metadata.getJavaType());
return operations.findOne(query, metadata.getJavaType());
}
}
@@ -287,6 +258,11 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
nearQuery.maxDistance(maxDistance).in(maxDistance.getMetric());
}
Pageable pageable = accessor.getPageable();
if (pageable != null) {
nearQuery.with(pageable);
}
MongoEntityMetadata<?> metadata = method.getEntityInformation();
return (GeoResults<Object>) operations.geoNear(nearQuery, metadata.getJavaType(), metadata.getCollectionName());
}

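With the Pageable handed to the NearQuery as shown in the execution above, a derived geo-near repository method can be paged; a hypothetical repository sketch (Venue is an assumed domain type):

import org.springframework.data.domain.Pageable;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.repository.Repository;

interface VenueRepository extends Repository<Venue, String> {

	// limit and offset of the Pageable are applied to the underlying NearQuery
	GeoResults<Venue> findByLocationNear(Point location, Distance maxDistance, Pageable pageable);
}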
View File

@@ -169,68 +169,68 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
PotentiallyConvertingIterator parameters) {
switch (type) {
case AFTER:
case GREATER_THAN:
return criteria.gt(parameters.nextConverted(property));
case GREATER_THAN_EQUAL:
return criteria.gte(parameters.nextConverted(property));
case BEFORE:
case LESS_THAN:
return criteria.lt(parameters.nextConverted(property));
case LESS_THAN_EQUAL:
return criteria.lte(parameters.nextConverted(property));
case BETWEEN:
return criteria.gt(parameters.nextConverted(property)).lt(parameters.nextConverted(property));
case IS_NOT_NULL:
return criteria.ne(null);
case IS_NULL:
return criteria.is(null);
case NOT_IN:
return criteria.nin(nextAsArray(parameters, property));
case IN:
return criteria.in(nextAsArray(parameters, property));
case LIKE:
case STARTING_WITH:
case ENDING_WITH:
case CONTAINING:
String value = parameters.next().toString();
return criteria.regex(toLikeRegex(value, type));
case REGEX:
return criteria.regex(parameters.next().toString());
case EXISTS:
return criteria.exists((Boolean) parameters.next());
case TRUE:
return criteria.is(true);
case FALSE:
return criteria.is(false);
case NEAR:
case AFTER:
case GREATER_THAN:
return criteria.gt(parameters.nextConverted(property));
case GREATER_THAN_EQUAL:
return criteria.gte(parameters.nextConverted(property));
case BEFORE:
case LESS_THAN:
return criteria.lt(parameters.nextConverted(property));
case LESS_THAN_EQUAL:
return criteria.lte(parameters.nextConverted(property));
case BETWEEN:
return criteria.gt(parameters.nextConverted(property)).lt(parameters.nextConverted(property));
case IS_NOT_NULL:
return criteria.ne(null);
case IS_NULL:
return criteria.is(null);
case NOT_IN:
return criteria.nin(nextAsArray(parameters, property));
case IN:
return criteria.in(nextAsArray(parameters, property));
case LIKE:
case STARTING_WITH:
case ENDING_WITH:
case CONTAINING:
String value = parameters.next().toString();
return criteria.regex(toLikeRegex(value, type));
case REGEX:
return criteria.regex(parameters.next().toString());
case EXISTS:
return criteria.exists((Boolean) parameters.next());
case TRUE:
return criteria.is(true);
case FALSE:
return criteria.is(false);
case NEAR:
Distance distance = accessor.getMaxDistance();
Point point = accessor.getGeoNearLocation();
point = point == null ? nextAs(parameters, Point.class) : point;
Distance distance = accessor.getMaxDistance();
Point point = accessor.getGeoNearLocation();
point = point == null ? nextAs(parameters, Point.class) : point;
if (distance == null) {
return criteria.near(point);
if (distance == null) {
return criteria.near(point);
} else {
if (distance.getMetric() != null) {
criteria.nearSphere(point);
} else {
if (distance.getMetric() != null) {
criteria.nearSphere(point);
} else {
criteria.near(point);
}
criteria.maxDistance(distance.getNormalizedValue());
criteria.near(point);
}
return criteria;
criteria.maxDistance(distance.getNormalizedValue());
}
return criteria;
case WITHIN:
Object parameter = parameters.next();
return criteria.within((Shape) parameter);
case SIMPLE_PROPERTY:
return criteria.is(parameters.nextConverted(property));
case NEGATING_SIMPLE_PROPERTY:
return criteria.ne(parameters.nextConverted(property));
default:
throw new IllegalArgumentException("Unsupported keyword!");
case WITHIN:
Object parameter = parameters.next();
return criteria.within((Shape) parameter);
case SIMPLE_PROPERTY:
return criteria.is(parameters.nextConverted(property));
case NEGATING_SIMPLE_PROPERTY:
return criteria.ne(parameters.nextConverted(property));
}
throw new IllegalArgumentException("Unsupported keyword!");
}
/**
@@ -269,10 +269,10 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
switch (type) {
case STARTING_WITH:
source = source + "*";
source = "^" + source;
break;
case ENDING_WITH:
source = "*" + source;
source = source + "$";
break;
case CONTAINING:
source = "*" + source + "*";

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -171,7 +171,7 @@ public class MongoQueryMethod extends QueryMethod {
*
* @return
*/
Query getQueryAnnotation() {
private Query getQueryAnnotation() {
return method.getAnnotation(Query.class);
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2002-2013 the original author or authors.
* Copyright 2002-2010 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -77,13 +77,4 @@ public class PartTreeMongoQuery extends AbstractMongoQuery {
protected Query createCountQuery(ConvertingParameterAccessor accessor) {
return new MongoQueryCreator(tree, accessor, context, false).createQuery();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#isCountQuery()
*/
@Override
protected boolean isCountQuery() {
return tree.isCountProjection();
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,7 +15,10 @@
*/
package org.springframework.data.mongodb.repository.query;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Order;
import org.springframework.data.mongodb.core.query.Query;
import com.mongodb.DBCursor;
@@ -24,7 +27,6 @@ import com.mongodb.DBCursor;
*
* @author Oliver Gierke
*/
@Deprecated
public abstract class QueryUtils {
private QueryUtils() {
@@ -32,13 +34,51 @@ public abstract class QueryUtils {
}
/**
* Turns an {@link Order} into an {@link org.springframework.data.mongodb.core.query.Order}.
* Applies the given {@link Pageable} to the given {@link Query}. Will do nothing if {@link Pageable} is
* {@literal null}.
*
* @deprecated use {@link Order} directly.
* @param order
* @deprecated use {@link Query#with(Pageable)}.
* @param query must not be {@literal null}.
* @param pageable
* @return
*/
@Deprecated
public static Query applyPagination(Query query, Pageable pageable) {
if (pageable == null) {
return query;
}
query.limit(pageable.getPageSize());
query.skip(pageable.getOffset());
return query.with(pageable.getSort());
}
/**
* Applies the given {@link Sort} to the {@link Query}. Will do nothing if {@link Sort} is {@literal null}.
*
* @deprecated use {@link Query#with(Pageable)}.
* @param query must not be {@literal null}.
* @param sort
* @return
*/
@Deprecated
public static Query applySorting(Query query, Sort sort) {
if (sort == null) {
return query;
}
org.springframework.data.mongodb.core.query.Sort bSort = query.sort();
for (Order order : sort) {
bSort.on(order.getProperty(), toOrder(order));
}
return query;
}
public static org.springframework.data.mongodb.core.query.Order toOrder(Order order) {
return order.isAscending() ? org.springframework.data.mongodb.core.query.Order.ASCENDING
: org.springframework.data.mongodb.core.query.Order.DESCENDING;

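A short sketch of the helpers shown above in use (query and paging values are illustrative); their Javadoc points new code to Query#with(Pageable) instead:

import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.repository.query.QueryUtils;

Query query = new Query(Criteria.where("lastname").is("Matthews"));

query = QueryUtils.applyPagination(query, new PageRequest(0, 20));
query = QueryUtils.applySorting(query, new Sort(Direction.ASC, "firstname"));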
View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -38,21 +38,17 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
private final String query;
private final String fieldSpec;
private final boolean isCountQuery;
/**
* Creates a new {@link StringBasedMongoQuery}.
*
* @param method must not be {@literal null}.
* @param template must not be {@literal null}.
* @param method
* @param template
*/
public StringBasedMongoQuery(String query, MongoQueryMethod method, MongoOperations mongoOperations) {
super(method, mongoOperations);
this.query = query;
this.fieldSpec = method.getFieldSpecification();
this.isCountQuery = method.hasAnnotatedQuery() ? method.getQueryAnnotation().count() : false;
}
public StringBasedMongoQuery(MongoQueryMethod method, MongoOperations mongoOperations) {
@@ -86,15 +82,6 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
return query;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#isCountQuery()
*/
@Override
protected boolean isCountQuery() {
return isCountQuery;
}
private String replacePlaceholders(String input, ConvertingParameterAccessor accessor) {
Matcher matcher = PLACEHOLDER.matcher(input);

View File

@@ -0,0 +1,5 @@
/**
* Query derivation mechanism for MongoDB specific repositories.
*/
package org.springframework.data.mongodb.repository.query;

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,11 +22,12 @@ import java.util.Set;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.index.Index;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.data.mongodb.repository.query.MongoEntityMetadata;
import org.springframework.data.mongodb.repository.query.PartTreeMongoQuery;
import org.springframework.data.mongodb.repository.query.QueryUtils;
import org.springframework.data.repository.core.support.QueryCreationListener;
import org.springframework.data.repository.query.parser.Part;
import org.springframework.data.repository.query.parser.Part.Type;
@@ -73,14 +74,14 @@ class IndexEnsuringQueryCreationListener implements QueryCreationListener<PartTr
return;
}
String property = part.getProperty().toDotPath();
Direction order = toDirection(sort, property);
Order order = toOrder(sort, property);
index.on(property, order);
}
// Add fixed sorting criteria to index
if (sort != null) {
for (Sort.Order order : sort) {
index.on(order.getProperty(), order.getDirection());
index.on(order.getProperty(), QueryUtils.toOrder(order));
}
}
@@ -89,13 +90,13 @@ class IndexEnsuringQueryCreationListener implements QueryCreationListener<PartTr
LOG.debug(String.format("Created %s!", index));
}
private static Direction toDirection(Sort sort, String property) {
private static Order toOrder(Sort sort, String property) {
if (sort == null) {
return Direction.DESC;
return Order.DESCENDING;
}
org.springframework.data.domain.Sort.Order order = sort.getOrderFor(property);
return order == null ? Direction.DESC : order.isAscending() ? Direction.ASC : Direction.DESC;
return order == null ? Order.DESCENDING : order.isAscending() ? Order.ASCENDING : Order.DESCENDING;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -39,6 +39,7 @@ import com.mysema.query.apt.DefaultConfiguration;
*
* @author Oliver Gierke
*/
@SuppressWarnings("restriction")
@SupportedAnnotationTypes({ "com.mysema.query.annotations.*", "org.springframework.data.mongodb.core.mapping.*" })
@SupportedSourceVersion(SourceVersion.RELEASE_6)
public class MongoAnnotationProcessor extends AbstractQuerydslProcessor {

View File

@@ -49,14 +49,13 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
/**
* Creates a new {@link SimpleMongoRepository} for the given {@link MongoEntityInformation} and {@link MongoTemplate}.
*
* @param metadata must not be {@literal null}.
* @param template must not be {@literal null}.
* @param metadata
* @param template
*/
public SimpleMongoRepository(MongoEntityInformation<T, ID> metadata, MongoOperations mongoOperations) {
Assert.notNull(mongoOperations);
Assert.notNull(metadata);
this.entityInformation = metadata;
this.mongoOperations = mongoOperations;
}
@@ -97,7 +96,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
*/
public T findOne(ID id) {
Assert.notNull(id, "The given id must not be null!");
return mongoOperations.findById(id, entityInformation.getJavaType(), entityInformation.getCollectionName());
return mongoOperations.findById(id, entityInformation.getJavaType());
}
private Query getIdQuery(Object id) {
@@ -115,8 +114,11 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
public boolean exists(ID id) {
Assert.notNull(id, "The given id must not be null!");
return mongoOperations.exists(getIdQuery(id), entityInformation.getJavaType(),
entityInformation.getCollectionName());
final Query idQuery = getIdQuery(id);
idQuery.fields();
return mongoOperations.findOne(idQuery, entityInformation.getJavaType(), entityInformation.getCollectionName()) != null;
}
/*
@@ -124,6 +126,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
* @see org.springframework.data.repository.CrudRepository#count()
*/
public long count() {
return mongoOperations.getCollection(entityInformation.getCollectionName()).count();
}
@@ -133,7 +136,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
*/
public void delete(ID id) {
Assert.notNull(id, "The given id must not be null!");
mongoOperations.remove(getIdQuery(id), entityInformation.getJavaType(), entityInformation.getCollectionName());
mongoOperations.remove(getIdQuery(id), entityInformation.getJavaType());
}
/*
@@ -163,6 +166,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
* @see org.springframework.data.repository.CrudRepository#deleteAll()
*/
public void deleteAll() {
mongoOperations.remove(new Query(), entityInformation.getCollectionName());
}
@@ -223,6 +227,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
* @return
*/
protected MongoOperations getMongoOperations() {
return this.mongoOperations;
}
@@ -230,6 +235,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
* @return the entityInformation
*/
protected MongoEntityInformation<T, ID> getEntityInformation() {
return entityInformation;
}
}

View File

@@ -63,8 +63,7 @@ class SpringDataMongodbSerializer extends MongodbSerializer {
Path<?> parent = metadata.getParent();
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(parent.getType());
MongoPersistentProperty property = entity.getPersistentProperty(metadata.getName());
MongoPersistentProperty property = entity.getPersistentProperty(metadata.getExpression().toString());
return property == null ? super.getKeyForPath(expr, metadata) : property.getFieldName();
}

View File

@@ -0,0 +1,5 @@
/**
* Support infrastructure for query derivation of MongoDB specific repositories.
*/
package org.springframework.data.mongodb.repository.support;

View File

@@ -1,5 +1,4 @@
http\://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd=org/springframework/data/mongodb/config/spring-mongo-1.0.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo-1.1.xsd=org/springframework/data/mongodb/config/spring-mongo-1.1.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo-1.2.xsd=org/springframework/data/mongodb/config/spring-mongo-1.2.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo-1.3.xsd=org/springframework/data/mongodb/config/spring-mongo-1.3.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo.xsd=org/springframework/data/mongodb/config/spring-mongo-1.3.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo.xsd=org/springframework/data/mongodb/config/spring-mongo-1.2.xsd

View File

@@ -1,597 +0,0 @@
<?xml version="1.0" encoding="UTF-8" ?>
<xsd:schema xmlns="http://www.springframework.org/schema/data/mongo"
xmlns:xsd="http://www.w3.org/2001/XMLSchema"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:beans="http://www.springframework.org/schema/beans"
xmlns:tool="http://www.springframework.org/schema/tool"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:repository="http://www.springframework.org/schema/data/repository"
targetNamespace="http://www.springframework.org/schema/data/mongo"
elementFormDefault="qualified" attributeFormDefault="unqualified">
<xsd:import namespace="http://www.springframework.org/schema/beans" />
<xsd:import namespace="http://www.springframework.org/schema/tool" />
<xsd:import namespace="http://www.springframework.org/schema/context" />
<xsd:import namespace="http://www.springframework.org/schema/data/repository"
schemaLocation="http://www.springframework.org/schema/data/repository/spring-repository.xsd" />
<xsd:element name="mongo" type="mongoType">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.core.MongoFactoryBean"><![CDATA[
Defines a Mongo instance used for accessing MongoDB.
]]></xsd:documentation>
<xsd:appinfo>
<tool:annotation>
<tool:exports type="com.mongodb.Mongo"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:element>
<xsd:element name="db-factory">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines a MongoDbFactory for connecting to a specific database
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the mongo definition (by default "mongoDbFactory").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="mongo-ref" type="mongoRef" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The reference to a Mongo instance. If not configured a default com.mongodb.Mongo instance will be created.
]]>
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="dbname" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the database to connect to. Default is 'db'.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="port" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The port to connect to MongoDB server. Default is 27017
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="host" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The host to connect to a MongoDB server. Default is localhost
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="username" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The username to use when connecting to a MongoDB server.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="password" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The password to use when connecting to a MongoDB server.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="uri" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The Mongo URI string.]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-concern">
<xsd:annotation>
<xsd:documentation>
The WriteConcern that will be the default value used when asking the MongoDbFactory for a DB object
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="writeConcernEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
<xsd:attributeGroup name="mongo-repository-attributes">
<xsd:attribute name="mongo-template-ref" type="mongoTemplateRef" default="mongoTemplate">
<xsd:annotation>
<xsd:documentation>
The reference to a MongoTemplate. Will default to 'mongoTemplate'.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="create-query-indexes" type="xsd:boolean" default="false">
<xsd:annotation>
<xsd:documentation>
Enables creation of indexes for queries that get derived from the method name
and thus reference domain class properties. Defaults to false.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:attributeGroup>
<xsd:element name="repositories">
<xsd:complexType>
<xsd:complexContent>
<xsd:extension base="repository:repositories">
<xsd:attributeGroup ref="mongo-repository-attributes"/>
<xsd:attributeGroup ref="repository:repository-attributes"/>
</xsd:extension>
</xsd:complexContent>
</xsd:complexType>
</xsd:element>
<xsd:element name="mapping-converter">
<xsd:annotation>
<xsd:documentation><![CDATA[Defines a MongoConverter for getting rich mapping functionality.]]></xsd:documentation>
<xsd:appinfo>
<tool:exports type="org.springframework.data.mongodb.core.convert.MappingMongoConverter" />
</xsd:appinfo>
</xsd:annotation>
<xsd:complexType>
<xsd:sequence>
<xsd:element name="custom-converters" minOccurs="0">
<xsd:annotation>
<xsd:documentation><![CDATA[
Top-level element that contains one or more custom converters to be used for mapping
domain objects to and from Mongo's DBObject]]>
</xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:sequence>
<xsd:element name="converter" type="customConverterType" minOccurs="0" maxOccurs="unbounded"/>
</xsd:sequence>
<xsd:attribute name="base-package" type="xsd:string" />
</xsd:complexType>
</xsd:element>
</xsd:sequence>
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the MappingMongoConverter instance (by default "mappingConverter").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="base-package" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The base package in which to scan for entities annotated with @Document
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="db-factory-ref" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a DbFactory.
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.MongoDbFactory" />
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="mongo-ref" type="mongoRef" use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a Mongo. Will default to 'mongo'.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="mapping-context-ref" type="mappingContextRef" use="optional">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mapping.model.MappingContext">
The reference to a MappingContext. Will default to 'mappingContext'.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="mongo-template-ref" type="mongoTemplateRef" use="optional">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.core.MongoTemplate">
The reference to a MongoTemplate. Will default to 'mongoTemplate'.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="disable-validation" use="optional">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.mapping.event.ValidatingMongoEventListener">
Disables JSR-303 validation on MongoDB documents before they are saved. By default it is set to false.
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="xsd:boolean xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
<xsd:attribute name="abbreviate-field-names" use="optional" default="false">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy">
Enables abbreviating the field names for domain class properties to the
first character of their camel case names, e.g. fooBar -> fb.
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="xsd:boolean xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
<xsd:element name="jmx">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines JMX Model MBeans for monitoring a MongoDB server.
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="mongo-ref" type="mongoRef" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the Mongo object that determines what server to monitor. (by default "mongo").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
<xsd:element name="auditing">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation>
<tool:exports type="org.springframework.data.mongodb.core.mapping.event.AuditingEventListener" />
<tool:exports type="org.springframework.data.auditing.IsNewAwareAuditingHandler" />
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:complexType>
<xsd:attributeGroup ref="repository:auditing-attributes" />
<xsd:attribute name="mapping-context-ref" type="mappingContextRef" />
</xsd:complexType>
</xsd:element>
<xsd:simpleType name="mappingContextRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mapping.model.MappingContext"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="mongoTemplateRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.core.MongoTemplate"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="mongoRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.core.MongoFactoryBean"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="writeConcernEnumeration">
<xsd:restriction base="xsd:token">
<xsd:enumeration value="NONE" />
<xsd:enumeration value="NORMAL" />
<xsd:enumeration value="SAFE" />
<xsd:enumeration value="FSYNC_SAFE" />
<xsd:enumeration value="REPLICAS_SAFE" />
<xsd:enumeration value="JOURNAL_SAFE" />
<xsd:enumeration value="MAJORITY" />
</xsd:restriction>
</xsd:simpleType>
<!-- MLP
<xsd:attributeGroup name="writeConcern">
<xsd:attribute name="write-concern">
<xsd:simpleType>
<xsd:restriction base="xsd:string">
<xsd:enumeration value="NONE" />
<xsd:enumeration value="NORMAL" />
<xsd:enumeration value="SAFE" />
<xsd:enumeration value="FSYNC_SAFE" />
<xsd:enumeration value="REPLICA_SAFE" />
<xsd:enumeration value="JOURNAL_SAFE" />
<xsd:enumeration value="MAJORITY" />
</xsd:restriction>
</xsd:simpleType>
</xsd:attribute>
</xsd:attributeGroup>
-->
<xsd:complexType name="mongoType">
<xsd:sequence minOccurs="0" maxOccurs="1">
<xsd:element name="options" type="optionsType">
<xsd:annotation>
<xsd:documentation><![CDATA[
The Mongo driver options
]]></xsd:documentation>
<xsd:appinfo>
<tool:annotation>
<tool:exports type="com.mongodb.MongoOptions"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:element>
</xsd:sequence>
<xsd:attribute name="write-concern">
<xsd:annotation>
<xsd:documentation>
The WriteConcern that will be the default value used when asking the MongoDbFactory for a DB object
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="writeConcernEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
<!-- MLP
<xsd:attributeGroup ref="writeConcern" />
-->
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the mongo definition (by default "mongo").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="port" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The port to connect to MongoDB server. Default is 27017
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="host" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The host to connect to a MongoDB server. Default is localhost
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="replica-set" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The comma delimited list of host:port entries to use for replica set/pairs.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
<xsd:complexType name="optionsType">
<xsd:attribute name="connections-per-host" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The number of connections allowed per host. Will block if all are in use. Default is 10. The system property MONGO.POOLSIZE can override this.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="threads-allowed-to-block-for-connection-multiplier" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The multiplier for connectionsPerHost for the number of threads allowed to block waiting for a connection. Default is 5.
If connectionsPerHost is 10 and threadsAllowedToBlockForConnectionMultiplier is 5,
then up to 50 threads can block; more than that and an exception will be thrown.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="max-wait-time" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The maximum time in milliseconds a thread will block waiting for a connection. Default is 120000 ms (2 minutes).
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="connect-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The connect timeout in milliseconds. Default is 0 (infinite).
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="socket-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The socket timeout in milliseconds. Default is 0 (infinite).
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="socket-keep-alive" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The keep-alive flag; controls whether or not socket keep-alive is enabled. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="auto-connect-retry" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
This controls whether or not the system retries automatically on connect. Default is false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="max-auto-connect-retry-time" type="xsd:long">
<xsd:annotation>
<xsd:documentation><![CDATA[
The maximum amount of time in milliseconds to spend retrying to open a connection to the same server. Default is 0, which means to use the default of 15s if autoConnectRetry is on.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-number" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
Specifies the number of servers to wait for on a write operation and the exception-raising behavior; corresponds to the 'w' option of the getlasterror command. Defaults to 0.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
Controls the timeout for write operations in milliseconds; corresponds to the 'wtimeout' option of the getlasterror command. Defaults to 0 (wait indefinitely); a value greater than zero is the number of milliseconds to wait.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-fsync" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
Controls whether to fsync on writes; corresponds to the 'fsync' option of the getlasterror command. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="slave-ok" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
Controls whether the driver is allowed to read from secondaries or slaves. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
<xsd:group name="beanElementGroup">
<xsd:choice>
<xsd:element ref="beans:bean"/>
<xsd:element ref="beans:ref"/>
</xsd:choice>
</xsd:group>
<xsd:complexType name="customConverterType">
<xsd:annotation>
<xsd:documentation><![CDATA[
Element defining a custom converter.
]]></xsd:documentation>
</xsd:annotation>
<xsd:group ref="beanElementGroup" minOccurs="0" maxOccurs="1"/>
<xsd:attribute name="ref" type="xsd:string">
<xsd:annotation>
<xsd:documentation>
A reference to a custom converter.
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref"/>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
<xsd:simpleType name="converterRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.convert.MongoConverter"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:element name="template">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines a MongoTemplate for accessing a specific database.
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the MongoTemplate definition (by default "mongoTemplate").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="converter-ref" type="converterRef" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The reference to a MongoConverter instance.
]]>
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.convert.MongoConverter"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="db-factory-ref" type="xsd:string"
use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a MongoDbFactory.
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to
type="org.springframework.data.mongodb.MongoDbFactory" />
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-concern">
<xsd:annotation>
<xsd:documentation>
The default WriteConcern to use for write operations executed through the MongoTemplate.
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="writeConcernEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
<xsd:element name="gridFsTemplate">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines a GridFsTemplate based on the given MongoDbFactory and MongoConverter.
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the GridFsTemplate definition (by default "gridFsTemplate").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="converter-ref" type="converterRef" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The reference to a MongoConverter instance.
]]>
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.convert.MongoConverter"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="db-factory-ref" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a MongoDbFactory.
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.MongoDbFactory" />
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
</xsd:schema>
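For orientation, below is a minimal sketch of a Spring XML configuration exercising the elements and attributes documented by the schema above. The element and attribute names for mongo, options, template and gridFsTemplate come from the schema itself; the bean ids, host/port, option values, and the db-factory and mapping-converter elements (defined elsewhere in the namespace, not in this excerpt) are illustrative assumptions only.

<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:mongo="http://www.springframework.org/schema/data/mongo"
       xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
                           http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd">

  <!-- Mongo instance plus connection options; attributes correspond to optionsType above -->
  <mongo:mongo id="mongo" host="localhost" port="27017" write-concern="SAFE">
    <mongo:options connections-per-host="20"
                   threads-allowed-to-block-for-connection-multiplier="5"
                   max-wait-time="120000"
                   connect-timeout="10000"
                   socket-timeout="15000"/>
  </mongo:mongo>

  <!-- Assumed db-factory definition; not part of the schema excerpt above -->
  <mongo:db-factory id="mongoDbFactory" mongo-ref="mongo" dbname="database"/>

  <!-- Template and GridFS template wired against the factory; converter-ref and write-concern are optional -->
  <mongo:mapping-converter id="mappingConverter"/>
  <mongo:template id="mongoTemplate" db-factory-ref="mongoDbFactory" write-concern="SAFE"/>
  <mongo:gridFsTemplate id="gridFsTemplate" db-factory-ref="mongoDbFactory" converter-ref="mappingConverter"/>

</beans>

Definitions of this shape are what the namespace integration tests further down in this diff (for example the mongoTemplate and gridFsTemplate beans in MongoNamespaceTests) resolve and inspect.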

View File

@@ -18,15 +18,14 @@ package org.springframework.data.mongodb.config;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.ExpectedException;
import org.springframework.beans.factory.NoSuchBeanDefinitionException;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.data.mongodb.core.mapping.BasicMongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.expression.spel.support.StandardEvaluationContext;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.Mongo;
@@ -37,9 +36,6 @@ import com.mongodb.Mongo;
*/
public class AbstractMongoConfigurationUnitTests {
@Rule
public ExpectedException exception = ExpectedException.none();
/**
* @see DATAMONGO-496
*/
@@ -72,29 +68,15 @@ public class AbstractMongoConfigurationUnitTests {
assertScanningDisabled(" ");
}
/**
* @see DATAMONGO-569
*/
@Test
public void containsMongoDbFactoryButNoMongoBean() {
public void lifecycleCallbacksAreInvokedInAppropriateOrder() {
AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext(SampleMongoConfiguration.class);
MongoMappingContext mappingContext = context.getBean(MongoMappingContext.class);
BasicMongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(Entity.class);
StandardEvaluationContext spElContext = (StandardEvaluationContext) ReflectionTestUtils.getField(entity, "context");
assertThat(context.getBean(MongoDbFactory.class), is(notNullValue()));
exception.expect(NoSuchBeanDefinitionException.class);
context.getBean(Mongo.class);
}
@Test
public void returnsUninitializedMappingContext() throws Exception {
SampleMongoConfiguration configuration = new SampleMongoConfiguration();
MongoMappingContext context = configuration.mongoMappingContext();
assertThat(context.getPersistentEntities(), is(emptyIterable()));
context.initialize();
assertThat(context.getPersistentEntities(), is(not(emptyIterable())));
assertThat(spElContext.getBeanResolver(), is(notNullValue()));
}
private static void assertScanningDisabled(final String value) throws ClassNotFoundException {
@@ -110,7 +92,6 @@ public class AbstractMongoConfigurationUnitTests {
assertThat(configuration.getInitialEntitySet(), hasSize(0));
}
@Configuration
static class SampleMongoConfiguration extends AbstractMongoConfiguration {
@Override
@@ -118,6 +99,7 @@ public class AbstractMongoConfigurationUnitTests {
return "database";
}
@Bean
@Override
public Mongo mongo() throws Exception {
return new Mongo();

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,7 +15,7 @@
*/
package org.springframework.data.mongodb.config;
import static org.hamcrest.Matchers.*;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import java.util.Collections;
@@ -23,7 +23,6 @@ import java.util.Set;
import org.junit.Before;
import org.junit.Test;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.DefaultListableBeanFactory;
import org.springframework.beans.factory.xml.XmlBeanDefinitionReader;
import org.springframework.core.convert.TypeDescriptor;
@@ -32,7 +31,6 @@ import org.springframework.core.convert.converter.GenericConverter;
import org.springframework.core.io.ClassPathResource;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.mapping.Account;
import org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.repository.Person;
import org.springframework.stereotype.Component;
@@ -69,20 +67,6 @@ public class MappingMongoConverterParserIntegrationTests {
assertThat(conversions.hasCustomWriteTarget(Account.class), is(true));
}
/**
* @see DATAMONGO-607
*/
@Test
public void activatesAbbreviatingPropertiesCorrectly() {
BeanDefinition definition = factory.getBeanDefinition("abbreviatingConverter.mappingContext");
Object value = definition.getPropertyValues().getPropertyValue("fieldNamingStrategy").getValue();
assertThat(value, is(instanceOf(BeanDefinition.class)));
BeanDefinition strategy = (BeanDefinition) value;
assertThat(strategy.getBeanClassName(), is(CamelCaseAbbreviatingFieldNamingStrategy.class.getName()));
}
@Component
public static class SampleConverter implements Converter<Person, DBObject> {
public DBObject convert(Person source) {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -13,6 +13,7 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import static org.junit.Assert.*;
@@ -25,23 +26,12 @@ import org.springframework.context.ApplicationContext;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoFactoryBean;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.gridfs.GridFsOperations;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import com.mongodb.Mongo;
import com.mongodb.MongoOptions;
import com.mongodb.WriteConcern;
/**
* Integration tests for the MongoDB namespace.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Martin Baumgartner
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class MongoNamespaceTests {
@@ -68,7 +58,7 @@ public class MongoNamespaceTests {
}
@Test
public void testSecondMongoDbFactory() {
public void testSecondMongoDbFactory() throws Exception {
assertTrue(ctx.containsBean("secondMongoDbFactory"));
MongoDbFactory dbf = (MongoDbFactory) ctx.getBean("secondMongoDbFactory");
Mongo mongo = (Mongo) getField(dbf, "mongo");
@@ -78,58 +68,6 @@ public class MongoNamespaceTests {
assertEquals("database", getField(dbf, "databaseName"));
}
/**
* @see DATAMONGO-140
*/
@Test
public void testMongoTemplateFactory() {
assertTrue(ctx.containsBean("mongoTemplate"));
MongoOperations operations = (MongoOperations) ctx.getBean("mongoTemplate");
MongoDbFactory dbf = (MongoDbFactory) getField(operations, "mongoDbFactory");
assertEquals("database", getField(dbf, "databaseName"));
MongoConverter converter = (MongoConverter) getField(operations, "mongoConverter");
assertNotNull(converter);
}
/**
* @see DATAMONGO-140
*/
@Test
public void testSecondMongoTemplateFactory() {
assertTrue(ctx.containsBean("anotherMongoTemplate"));
MongoOperations operations = (MongoOperations) ctx.getBean("anotherMongoTemplate");
MongoDbFactory dbf = (MongoDbFactory) getField(operations, "mongoDbFactory");
assertEquals("database", getField(dbf, "databaseName"));
WriteConcern writeConcern = (WriteConcern) getField(operations, "writeConcern");
assertEquals(WriteConcern.SAFE, writeConcern);
}
/**
* @see DATAMONGO-628
*/
@Test
public void testGridFsTemplateFactory() {
assertTrue(ctx.containsBean("gridFsTemplate"));
GridFsOperations operations = (GridFsOperations) ctx.getBean("gridFsTemplate");
MongoDbFactory dbf = (MongoDbFactory) getField(operations, "dbFactory");
assertEquals("database", getField(dbf, "databaseName"));
MongoConverter converter = (MongoConverter) getField(operations, "converter");
assertNotNull(converter);
}
/**
* @see DATAMONGO-628
*/
@Test
public void testSecondGridFsTemplateFactory() {
assertTrue(ctx.containsBean("antoherGridFsTemplate"));
GridFsOperations operations = (GridFsOperations) ctx.getBean("antoherGridFsTemplate");
MongoDbFactory dbf = (MongoDbFactory) getField(operations, "dbFactory");
assertEquals("database", getField(dbf, "databaseName"));
MongoConverter converter = (MongoConverter) getField(operations, "converter");
assertNotNull(converter);
}
@Test
@SuppressWarnings("deprecation")
public void testMongoSingletonWithPropertyPlaceHolders() throws Exception {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012 the original author or authors.
* Copyright 2012-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -31,6 +31,7 @@ import com.mongodb.ServerAddress;
* Unit tests for {@link ServerAddressPropertyEditor}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class ServerAddressPropertyEditorUnitTests {
@@ -70,6 +71,16 @@ public class ServerAddressPropertyEditorUnitTests {
assertSingleAddressOfLocalhost(editor.getValue());
}
/**
* @see DATAMONGO-693
*/
@Test
public void interpretEmptyStringAsNull() {
editor.setAsText("");
assertNull(editor.getValue());
}
private static void assertSingleAddressOfLocalhost(Object result) throws UnknownHostException {
assertThat(result, is(instanceOf(ServerAddress[].class)));

View File

@@ -27,7 +27,7 @@ import com.mongodb.WriteConcern;
*
* @author Oliver Gierke
*/
public class StringToWriteConcernConverterUnitTest {
public class StringToWriteConcernConverterUnitTests {
StringToWriteConcernConverter converter = new StringToWriteConcernConverter();

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012 the original author or authors.
* Copyright 2012-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,6 +20,8 @@ import static org.junit.Assert.*;
import static org.mockito.Matchers.*;
import static org.mockito.Mockito.*;
import java.util.List;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
@@ -28,7 +30,9 @@ import org.mockito.Mock;
import org.mockito.invocation.InvocationOnMock;
import org.mockito.runners.MockitoJUnitRunner;
import org.mockito.stubbing.Answer;
import org.springframework.transaction.support.TransactionSynchronization;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import org.springframework.transaction.support.TransactionSynchronizationUtils;
import com.mongodb.DB;
import com.mongodb.Mongo;
@@ -37,12 +41,12 @@ import com.mongodb.Mongo;
* Unit tests for {@link MongoDbUtils}.
*
* @author Oliver Gierke
* @author Randy Watler
*/
@RunWith(MockitoJUnitRunner.class)
public class MongoDbUtilsUnitTests {
@Mock
Mongo mongo;
@Mock Mongo mongo;
@Before
public void setUp() throws Exception {
@@ -81,4 +85,94 @@ public class MongoDbUtilsUnitTests {
assertThat(first, is(notNullValue()));
assertThat(MongoDbUtils.getDB(mongo, "first"), is(sameInstance(first)));
}
/**
* @see DATAMONGO-737
*/
@Test
public void handlesTransactionSynchronizationLifecycle() {
// ensure transaction synchronization manager has no registered
// transaction synchronizations or bound resources at start of test
assertThat(TransactionSynchronizationManager.getSynchronizations().isEmpty(), is(true));
assertThat(TransactionSynchronizationManager.getResourceMap().isEmpty(), is(true));
// access database for one mongo instance, (registers transaction
// synchronization and binds transaction resource)
MongoDbUtils.getDB(mongo, "first");
// ensure transaction synchronization manager has registered
// transaction synchronizations and bound resources
assertThat(TransactionSynchronizationManager.getSynchronizations().isEmpty(), is(false));
assertThat(TransactionSynchronizationManager.getResourceMap().isEmpty(), is(false));
// simulate transaction completion, (unbinds transaction resource)
try {
simulateTransactionCompletion();
} catch (Exception e) {
fail("Unexpected exception thrown during transaction completion: " + e);
}
// ensure transaction synchronization manager has no bound resources
// at end of test
assertThat(TransactionSynchronizationManager.getResourceMap().isEmpty(), is(true));
}
/**
* @see DATAMONGO-737
*/
@Test
public void handlesTransactionSynchronizationsLifecycle() {
// ensure transaction synchronization manager has no registered
// transaction synchronizations or bound resources at start of test
assertThat(TransactionSynchronizationManager.getSynchronizations().isEmpty(), is(true));
assertThat(TransactionSynchronizationManager.getResourceMap().isEmpty(), is(true));
// access multiple databases for one mongo instance, (registers
// transaction synchronizations and binds transaction resources)
MongoDbUtils.getDB(mongo, "first");
MongoDbUtils.getDB(mongo, "second");
// ensure transaction synchronization manager has registered
// transaction synchronizations and bound resources
assertThat(TransactionSynchronizationManager.getSynchronizations().isEmpty(), is(false));
assertThat(TransactionSynchronizationManager.getResourceMap().isEmpty(), is(false));
// simulate transaction completion, (unbinds transaction resources)
try {
simulateTransactionCompletion();
} catch (Exception e) {
fail("Unexpected exception thrown during transaction completion: " + e);
}
// ensure transaction synchronization manager has no bound
// transaction resources at end of test
assertThat(TransactionSynchronizationManager.getResourceMap().isEmpty(), is(true));
}
/**
* Simulate transaction rollback/commit completion protocol on managed transaction synchronizations which will unbind
* managed transaction resources. Does not swallow exceptions for testing purposes.
*
* @see TransactionSynchronizationUtils#triggerBeforeCompletion()
* @see TransactionSynchronizationUtils#triggerAfterCompletion(int)
*/
private void simulateTransactionCompletion() {
// triggerBeforeCompletion() implementation without swallowed exceptions
List<TransactionSynchronization> synchronizations = TransactionSynchronizationManager.getSynchronizations();
for (TransactionSynchronization synchronization : synchronizations) {
synchronization.beforeCompletion();
}
// triggerAfterCompletion() implementation without swallowed exceptions
List<TransactionSynchronization> remainingSynchronizations = TransactionSynchronizationManager
.getSynchronizations();
if (remainingSynchronizations != null) {
for (TransactionSynchronization remainingSynchronization : remainingSynchronizations) {
remainingSynchronization.afterCompletion(TransactionSynchronization.STATUS_ROLLED_BACK);
}
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012 the original author or authors.
* Copyright 2012-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,17 +21,21 @@ import static org.junit.Assert.*;
import org.junit.Test;
import org.springframework.beans.factory.support.DefaultListableBeanFactory;
import org.springframework.beans.factory.support.RootBeanDefinition;
import org.springframework.data.mongodb.config.ServerAddressPropertyEditor;
import org.springframework.data.mongodb.config.WriteConcernPropertyEditor;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.Mongo;
import com.mongodb.ServerAddress;
import com.mongodb.WriteConcern;
/**
* Integration tests for {@link MongoFactoryBean}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class MongoFactoryBeanIntegrationTest {
public class MongoFactoryBeanIntegrationTests {
/**
* @see DATAMONGO-408
@@ -49,4 +53,22 @@ public class MongoFactoryBeanIntegrationTest {
MongoFactoryBean bean = factory.getBean("&factory", MongoFactoryBean.class);
assertThat(ReflectionTestUtils.getField(bean, "writeConcern"), is((Object) WriteConcern.SAFE));
}
/**
* @see DATAMONGO-693
*/
@Test
public void createMongoInstanceWithHostAndEmptyReplicaSets() {
RootBeanDefinition definition = new RootBeanDefinition(MongoFactoryBean.class);
definition.getPropertyValues().addPropertyValue("host", "localhost");
definition.getPropertyValues().addPropertyValue("replicaPair", "");
DefaultListableBeanFactory factory = new DefaultListableBeanFactory();
factory.registerCustomEditor(ServerAddress.class, ServerAddressPropertyEditor.class);
factory.registerBeanDefinition("factory", definition);
Mongo mongo = factory.getBean(Mongo.class);
assertNotNull(mongo);
}
}

View File

@@ -50,10 +50,9 @@ import org.springframework.dao.OptimisticLockingFailureException;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.annotation.Version;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
import org.springframework.data.mongodb.MongoDataIntegrityViolationException;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
@@ -61,9 +60,11 @@ import org.springframework.data.mongodb.core.index.Index;
import org.springframework.data.mongodb.core.index.Index.Duplicates;
import org.springframework.data.mongodb.core.index.IndexField;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.test.context.ContextConfiguration;
@@ -87,20 +88,18 @@ import com.mongodb.WriteResult;
* @author Thomas Risberg
* @author Amol Nayak
* @author Patryk Wasik
* @author Thomas Darimont
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:infrastructure.xml")
public class MongoTemplateTests {
@Autowired
MongoTemplate template;
@Autowired
MongoDbFactory factory;
@Autowired MongoTemplate template;
@Autowired MongoDbFactory factory;
MongoTemplate mappingTemplate;
@Rule
public ExpectedException thrown = ExpectedException.none();
@Rule public ExpectedException thrown = ExpectedException.none();
@Autowired
@SuppressWarnings("unchecked")
@@ -150,6 +149,8 @@ public class MongoTemplateTests {
template.dropCollection(TestClass.class);
template.dropCollection(Sample.class);
template.dropCollection(MyPerson.class);
template.dropCollection(TypeWithFieldAnnotation.class);
template.dropCollection(TypeWithDate.class);
template.dropCollection("collection");
template.dropCollection("personX");
}
@@ -236,7 +237,7 @@ public class MongoTemplateTests {
MongoTemplate template = new MongoTemplate(factory);
template.setWriteResultChecking(WriteResultChecking.EXCEPTION);
template.indexOps(Person.class).ensureIndex(new Index().on("firstName", Direction.DESC).unique());
template.indexOps(Person.class).ensureIndex(new Index().on("firstName", Order.DESCENDING).unique());
Person person = new Person(new ObjectId(), "Amol");
person.setAge(28);
@@ -293,7 +294,7 @@ public class MongoTemplateTests {
p2.setAge(40);
template.insert(p2);
template.indexOps(Person.class).ensureIndex(new Index().on("age", Direction.DESC).unique(Duplicates.DROP));
template.indexOps(Person.class).ensureIndex(new Index().on("age", Order.DESCENDING).unique(Duplicates.DROP));
DBCollection coll = template.getCollection(template.getCollectionName(Person.class));
List<DBObject> indexInfo = coll.getIndexInfo();
@@ -323,7 +324,7 @@ public class MongoTemplateTests {
List<IndexField> indexFields = ii.getIndexFields();
IndexField field = indexFields.get(0);
assertThat(field, is(IndexField.create("age", Direction.DESC)));
assertThat(field, is(IndexField.create("age", Order.DESCENDING)));
}
@Test
@@ -772,8 +773,7 @@ public class MongoTemplateTests {
Query q3 = new Query(Criteria.where("age").in(l1, l2));
template.find(q3, PersonWithIdPropertyOfTypeObjectId.class);
Assert.fail("Should have trown an InvalidDocumentStoreApiUsageException");
} catch (InvalidMongoDbApiUsageException e) {
}
} catch (InvalidMongoDbApiUsageException e) {}
}
@Test
@@ -950,7 +950,7 @@ public class MongoTemplateTests {
// test query with a sort
Query q2 = new Query(Criteria.where("age").gt(10));
q2.with(new Sort(Direction.DESC, "age"));
q2.sort().on("age", Order.DESCENDING);
PersonWithAList p5 = template.findOne(q2, PersonWithAList.class);
assertThat(p5.getFirstName(), is("Mark"));
}
@@ -1612,17 +1612,60 @@ public class MongoTemplateTests {
assertThat(result.get(0).containsField("first"), is(true));
}
/**
* @see DATAMONGO-675
*/
@Test
public void executesExistsCorrectly() {
public void updateConsidersMappingAnnotations() {
Sample sample = new Sample();
template.save(sample);
TypeWithFieldAnnotation entity = new TypeWithFieldAnnotation();
entity.emailAddress = "old";
Query query = query(where("id").is(sample.id));
template.save(entity);
assertThat(template.exists(query, Sample.class), is(true));
assertThat(template.exists(query(where("_id").is(sample.id)), template.getCollectionName(Sample.class)), is(true));
assertThat(template.exists(query, Sample.class, template.getCollectionName(Sample.class)), is(true));
Query query = query(where("_id").is(entity.id));
Update update = Update.update("emailAddress", "new");
FindAndModifyOptions options = new FindAndModifyOptions().returnNew(true);
TypeWithFieldAnnotation result = template.findAndModify(query, update, options, TypeWithFieldAnnotation.class);
assertThat(result.emailAddress, is("new"));
}
/**
* @see DATAMONGO-671
*/
@Test
public void findsEntityByDateReference() {
TypeWithDate entity = new TypeWithDate();
entity.date = new Date(System.currentTimeMillis() - 10);
template.save(entity);
Query query = query(where("date").lt(new Date()));
List<TypeWithDate> result = template.find(query, TypeWithDate.class);
assertThat(result, hasSize(1));
assertThat(result.get(0).date, is(notNullValue()));
}
/**
* @see DATAMONGO-540
*/
@Test
public void findOneAfterUpsertForNonExistingObjectReturnsTheInsertedObject() {
String idValue = "4711";
Query query = new Query(Criteria.where("id").is(idValue));
String fieldValue = "bubu";
Update update = Update.update("field", fieldValue);
template.upsert(query, update, Sample.class);
Sample result = template.findOne(query, Sample.class);
assertThat(result, is(notNullValue()));
assertThat(result.field, is(fieldValue));
assertThat(result.id, is(idValue));
}
static class MyId {
@@ -1633,14 +1676,12 @@ public class MongoTemplateTests {
static class TypeWithMyId {
@Id
MyId id;
@Id MyId id;
}
public static class Sample {
@Id
String id;
@Id String id;
String field;
}
@@ -1697,8 +1738,19 @@ public class MongoTemplateTests {
static class VersionedPerson {
@Version
Long version;
@Version Long version;
String id, firstname, lastname;
}
static class TypeWithFieldAnnotation {
@Id ObjectId id;
@Field("email") String emailAddress;
}
static class TypeWithDate {
@Id String id;
Date date;
}
}

View File

@@ -1,64 +0,0 @@
package org.springframework.data.mongodb.core;
public class SomeEnumTest {
public enum StringEnum {
ONE, TWO, FIVE;
}
public enum NumberEnum {
ONE(1), TWO(2), FIVE(5);
private int value;
public int value() {
return value;
}
NumberEnum(int value) {
this.value = value;
}
}
private StringEnum stringEnum;
private NumberEnum numberEnum;
private String id;
private String name;
public StringEnum getStringEnum() {
return stringEnum;
}
public void setStringEnum(StringEnum stringEnum) {
this.stringEnum = stringEnum;
}
public NumberEnum getNumberEnum() {
return numberEnum;
}
public void setNumberEnum(NumberEnum numberEnum) {
this.numberEnum = numberEnum;
}
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
}

View File

@@ -45,6 +45,7 @@ import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.ApplicationContext;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.annotation.TypeAlias;
@@ -70,7 +71,6 @@ import com.mongodb.util.JSON;
* Unit tests for {@link MappingMongoConverter}.
*
* @author Oliver Gierke
* @author Patrik Wasik
*/
@RunWith(MockitoJUnitRunner.class)
public class MappingMongoConverterUnitTests {
@@ -1320,53 +1320,6 @@ public class MappingMongoConverterUnitTests {
converter.read(ObjectContainer.class, input);
}
/**
* @see DATAMONGO-657
*/
@Test
public void convertDocumentWithMapDBRef() {
MapDBRef mapDBRef = new MapDBRef();
MapDBRefVal val = new MapDBRefVal();
val.id = BigInteger.ONE;
Map<String, MapDBRefVal> mapVal = new HashMap<String, MapDBRefVal>();
mapVal.put("test", val);
mapDBRef.map = mapVal;
BasicDBObject dbObject = new BasicDBObject();
converter.write(mapDBRef, dbObject);
DBObject map = (DBObject) dbObject.get("map");
assertThat(map.get("test"), instanceOf(DBRef.class));
DBObject mapValDBObject = new BasicDBObject();
mapValDBObject.put("_id", BigInteger.ONE);
DBRef dbRef = mock(DBRef.class);
when(dbRef.fetch()).thenReturn(mapValDBObject);
((DBObject) dbObject.get("map")).put("test", dbRef);
MapDBRef read = converter.read(MapDBRef.class, dbObject);
assertThat(read.map.get("test").id, is(BigInteger.ONE));
}
@Document
class MapDBRef {
@org.springframework.data.mongodb.core.mapping.DBRef
Map<String, MapDBRefVal> map;
}
@Document
class MapDBRefVal {
BigInteger id;
}
static class GenericType<T> {
T content;
}
@@ -1591,4 +1544,18 @@ public class MappingMongoConverterUnitTests {
return m_property;
}
}
private class LocalDateToDateConverter implements Converter<LocalDate, Date> {
public Date convert(LocalDate source) {
return source.toDateMidnight().toDate();
}
}
private class DateToLocalDateConverter implements Converter<Date, LocalDate> {
public LocalDate convert(Date source) {
return new LocalDate(source.getTime());
}
}
}

View File

@@ -37,6 +37,7 @@ import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.DBObjectUtils;
import org.springframework.data.mongodb.core.Person;
import org.springframework.data.mongodb.core.mapping.BasicMongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
@@ -63,8 +64,7 @@ public class QueryMapperUnitTests {
MongoMappingContext context;
MappingMongoConverter converter;
@Mock
MongoDbFactory factory;
@Mock MongoDbFactory factory;
@Before
public void setUp() {
@@ -334,7 +334,11 @@ public class QueryMapperUnitTests {
DBObject result = mapper.getMappedObject(query.getQueryObject(), context.getPersistentEntity(WithDBRef.class));
DBObject reference = DBObjectUtils.getAsDBObject(result, "reference");
assertThat(reference.containsField("$in"), is(true));
BasicDBList inClause = getAsDBList(reference, "$in");
assertThat(inClause, hasSize(2));
assertThat(inClause.get(0), is(instanceOf(com.mongodb.DBRef.class)));
assertThat(inClause.get(1), is(instanceOf(com.mongodb.DBRef.class)));
}
/**
@@ -394,6 +398,46 @@ public class QueryMapperUnitTests {
assertThat(mapped.get("_id"), is(instanceOf(ObjectId.class)));
}
/**
* @see DATAMONGO-705
*/
@Test
public void convertsDBRefWithExistsQuery() {
Query query = query(where("reference").exists(false));
BasicMongoPersistentEntity<?> entity = context.getPersistentEntity(WithDBRef.class);
DBObject mappedObject = mapper.getMappedObject(query.getQueryObject(), entity);
DBObject reference = getAsDBObject(mappedObject, "reference");
assertThat(reference.containsField("$exists"), is(true));
assertThat(reference.get("$exists"), is((Object) false));
}
/**
* @see DATAMONGO-706
*/
@Test
public void convertsNestedDBRefsCorrectly() {
Reference reference = new Reference();
reference.id = 5L;
Query query = query(where("someString").is("foo").andOperator(where("reference").in(reference)));
BasicMongoPersistentEntity<?> entity = context.getPersistentEntity(WithDBRef.class);
DBObject mappedObject = mapper.getMappedObject(query.getQueryObject(), entity);
assertThat(mappedObject.get("someString"), is((Object) "foo"));
BasicDBList andClause = getAsDBList(mappedObject, "$and");
assertThat(andClause, hasSize(1));
BasicDBList inClause = getAsDBList(getAsDBObject(getAsDBObject(andClause, 0), "reference"), "$in");
assertThat(inClause, hasSize(1));
assertThat(inClause.get(0), is(instanceOf(com.mongodb.DBRef.class)));
}
class IdWrapper {
Object id;
}
@@ -406,14 +450,12 @@ public class QueryMapperUnitTests {
class Sample {
@Id
private String foo;
@Id private String foo;
}
class BigIntegerId {
@Id
private BigInteger id;
@Id private BigInteger id;
}
enum Enum {
@@ -427,14 +469,13 @@ public class QueryMapperUnitTests {
class CustomizedField {
@Field("foo")
CustomizedField field;
@Field("foo") CustomizedField field;
}
class WithDBRef {
@DBRef
Reference reference;
String someString;
@DBRef Reference reference;
}
class Reference {
@@ -449,7 +490,6 @@ public class QueryMapperUnitTests {
class WithMapDBRef {
@DBRef
Map<String, Sample> mapWithDBRef;
@DBRef Map<String, Sample> mapWithDBRef;
}
}

View File

@@ -33,7 +33,6 @@ import org.junit.Test;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.dao.DataAccessException;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.IndexOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
@@ -42,6 +41,7 @@ import org.springframework.data.mongodb.core.index.GeospatialIndex;
import org.springframework.data.mongodb.core.index.IndexField;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.monitor.ServerInfo;
import org.springframework.expression.ExpressionParser;
@@ -58,7 +58,6 @@ import com.mongodb.WriteConcern;
* Modified from https://github.com/deftlabs/mongo-java-geospatial-example
*
* @author Mark Pollack
* @author Oliver Gierke
*/
public class GeoSpatialTests {
@@ -212,7 +211,7 @@ public class GeoSpatialTests {
List<IndexField> fields = indexInfo.get(0).getIndexFields();
assertThat(fields.size(), is(1));
assertThat(fields, hasItem(IndexField.create("_id", Direction.ASC)));
assertThat(fields, hasItem(IndexField.create("_id", Order.ASCENDING)));
fields = indexInfo.get(1).getIndexFields();
assertThat(fields.size(), is(1));

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012-2013 the original author or authors.
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,7 +19,6 @@ import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.junit.Test;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.query.Order;
/**
@@ -27,16 +26,14 @@ import org.springframework.data.mongodb.core.query.Order;
*
* @author Oliver Gierke
*/
@SuppressWarnings("deprecation")
public class IndexFieldUnitTests {
@Test
public void createsPlainIndexFieldCorrectly() {
IndexField field = IndexField.create("foo", Direction.ASC);
IndexField field = IndexField.create("foo", Order.ASCENDING);
assertThat(field.getKey(), is("foo"));
assertThat(field.getDirection(), is(Direction.ASC));
assertThat(field.getOrder(), is(Order.ASCENDING));
assertThat(field.isGeo(), is(false));
}
@@ -47,15 +44,15 @@ public class IndexFieldUnitTests {
IndexField field = IndexField.geo("foo");
assertThat(field.getKey(), is("foo"));
assertThat(field.getDirection(), is(nullValue()));
assertThat(field.getOrder(), is(nullValue()));
assertThat(field.isGeo(), is(true));
}
@Test
public void correctEqualsForPlainFields() {
IndexField first = IndexField.create("foo", Direction.ASC);
IndexField second = IndexField.create("foo", Direction.ASC);
IndexField first = IndexField.create("foo", Order.ASCENDING);
IndexField second = IndexField.create("foo", Order.ASCENDING);
assertThat(first, is(second));
assertThat(second, is(first));

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012-2013 the original author or authors.
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,7 +21,7 @@ import static org.junit.Assert.*;
import java.util.Arrays;
import org.junit.Test;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.query.Order;
/**
* Unit tests for {@link IndexInfo}.
@@ -33,8 +33,8 @@ public class IndexInfoUnitTests {
@Test
public void isIndexForFieldsCorrectly() {
IndexField fooField = IndexField.create("foo", Direction.ASC);
IndexField barField = IndexField.create("bar", Direction.DESC);
IndexField fooField = IndexField.create("foo", Order.ASCENDING);
IndexField barField = IndexField.create("bar", Order.DESCENDING);
IndexInfo info = new IndexInfo(Arrays.asList(fooField, barField), "myIndex", false, false, false);
assertThat(info.isIndexForFields(Arrays.asList("foo", "bar")), is(true));

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012-2013 the original author or authors.
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,7 +15,7 @@
*/
package org.springframework.data.mongodb.core.index;
import static org.hamcrest.Matchers.*;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import java.util.Collections;
@@ -38,7 +38,6 @@ import com.mongodb.DBObject;
* Unit tests for {@link MongoPersistentEntityIndexCreator}.
*
* @author Oliver Gierke
* @author Philipp Schneider
*/
@RunWith(MockitoJUnitRunner.class)
public class MongoPersistentEntityIndexCreatorUnitTests {
@@ -51,20 +50,25 @@ public class MongoPersistentEntityIndexCreatorUnitTests {
@Test
public void buildsIndexDefinitionUsingFieldName() {
MongoMappingContext mappingContext = prepareMappingContext(Person.class);
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(Collections.singleton(Person.class));
mappingContext.initialize();
DummyMongoPersistentEntityIndexCreator creator = new DummyMongoPersistentEntityIndexCreator(mappingContext, factory);
assertThat(creator.indexDefinition, is(notNullValue()));
assertThat(creator.indexDefinition.keySet(), hasItem("fieldname"));
assertThat(creator.name, is("indexName"));
assertThat(creator.background, is(false));
}
@Test
public void doesNotCreateIndexForEntityComingFromDifferentMappingContext() {
MongoMappingContext mappingContext = new MongoMappingContext();
MongoMappingContext personMappingContext = prepareMappingContext(Person.class);
MongoMappingContext personMappingContext = new MongoMappingContext();
personMappingContext.setInitialEntitySet(Collections.singleton(Person.class));
personMappingContext.initialize();
DummyMongoPersistentEntityIndexCreator creator = new DummyMongoPersistentEntityIndexCreator(mappingContext, factory);
@@ -91,49 +95,17 @@ public class MongoPersistentEntityIndexCreatorUnitTests {
assertThat(creator.isIndexCreatorFor(new MongoMappingContext()), is(false));
}
/**
* @see DATAMONGO-554
*/
@Test
public void triggersBackgroundIndexingIfConfigured() {
MongoMappingContext mappingContext = prepareMappingContext(AnotherPerson.class);
DummyMongoPersistentEntityIndexCreator creator = new DummyMongoPersistentEntityIndexCreator(mappingContext, factory);
assertThat(creator.indexDefinition, is(notNullValue()));
assertThat(creator.indexDefinition.keySet(), hasItem("lastname"));
assertThat(creator.name, is("lastname"));
assertThat(creator.background, is(true));
}
private static MongoMappingContext prepareMappingContext(Class<?> type) {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(Collections.singleton(type));
mappingContext.initialize();
return mappingContext;
}
static class Person {
@Indexed(name = "indexName")
@Field("fieldname")
String field;
}
static class AnotherPerson {
@Indexed(background = true)
String lastname;
}
static class DummyMongoPersistentEntityIndexCreator extends MongoPersistentEntityIndexCreator {
DBObject indexDefinition;
String name;
boolean background;
public DummyMongoPersistentEntityIndexCreator(MongoMappingContext mappingContext, MongoDbFactory mongoDbFactory) {
super(mappingContext, mongoDbFactory);
@@ -141,11 +113,10 @@ public class MongoPersistentEntityIndexCreatorUnitTests {
@Override
protected void ensureIndex(String collection, String name, DBObject indexDefinition, boolean unique,
boolean dropDups, boolean sparse, boolean background) {
boolean dropDups, boolean sparse) {
this.name = name;
this.indexDefinition = indexDefinition;
this.background = background;
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 by the original author(s).
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,9 +15,9 @@
*/
package org.springframework.data.mongodb.core.mapping;
import static org.mockito.Mockito.*;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import static org.mockito.Mockito.*;
import org.junit.Test;
import org.junit.runner.RunWith;

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 by the original author(s).
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,14 +19,10 @@ import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import java.lang.reflect.Field;
import java.util.Locale;
import org.junit.Before;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.ExpectedException;
import org.springframework.data.annotation.Id;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.util.ReflectionUtils;
@@ -40,9 +36,6 @@ public class BasicMongoPersistentPropertyUnitTests {
MongoPersistentEntity<Person> entity;
@Rule
public ExpectedException exception = ExpectedException.none();
@Before
public void setup() {
entity = new BasicMongoPersistentEntity<Person>(ClassTypeInformation.from(Person.class));
@@ -85,45 +78,8 @@ public class BasicMongoPersistentPropertyUnitTests {
assertThat(property.usePropertyAccess(), is(true));
}
/**
* @see DATAMONGO-607
*/
@Test
public void usesCustomFieldNamingStrategyByDefault() throws Exception {
Field field = ReflectionUtils.findField(Person.class, "lastname");
MongoPersistentProperty property = new BasicMongoPersistentProperty(field, null, entity, new SimpleTypeHolder(),
UppercaseFieldNamingStrategy.INSTANCE);
assertThat(property.getFieldName(), is("LASTNAME"));
field = ReflectionUtils.findField(Person.class, "firstname");
property = new BasicMongoPersistentProperty(field, null, entity, new SimpleTypeHolder(),
UppercaseFieldNamingStrategy.INSTANCE);
assertThat(property.getFieldName(), is("foo"));
}
/**
* @see DATAMONGO-607
*/
@Test
public void rejectsInvalidValueReturnedByFieldNamingStrategy() {
Field field = ReflectionUtils.findField(Person.class, "lastname");
MongoPersistentProperty property = new BasicMongoPersistentProperty(field, null, entity, new SimpleTypeHolder(),
InvalidFieldNamingStrategy.INSTANCE);
exception.expect(MappingException.class);
exception.expectMessage(InvalidFieldNamingStrategy.class.getName());
exception.expectMessage(property.toString());
property.getFieldName();
}
private MongoPersistentProperty getPropertyFor(Field field) {
return new BasicMongoPersistentProperty(field, null, entity, new SimpleTypeHolder(),
PropertyNameFieldNamingStrategy.INSTANCE);
return new BasicMongoPersistentProperty(field, null, entity, new SimpleTypeHolder());
}
class Person {
@@ -138,22 +94,4 @@ public class BasicMongoPersistentPropertyUnitTests {
@org.springframework.data.mongodb.core.mapping.Field(order = -20)
String ssn;
}
enum UppercaseFieldNamingStrategy implements FieldNamingStrategy {
INSTANCE;
public String getFieldName(MongoPersistentProperty property) {
return property.getName().toUpperCase(Locale.US);
}
}
enum InvalidFieldNamingStrategy implements FieldNamingStrategy {
INSTANCE;
public String getFieldName(MongoPersistentProperty property) {
return null;
}
}
}

View File

@@ -1,51 +0,0 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import static org.mockito.Mockito.*;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
/**
* Unit tests for {@link CamelCaseAbbreviatingFieldNamingStrategy}.
*
* @author Oliver Gierke
*/
@RunWith(MockitoJUnitRunner.class)
public class CamelCaseAbbreviatingFieldNamingStrategyUnitTests {
FieldNamingStrategy strategy = new CamelCaseAbbreviatingFieldNamingStrategy();
@Mock
MongoPersistentProperty property;
@Test
public void foo() {
assertFieldNameForPropertyName("fooBar", "fb");
assertFieldNameForPropertyName("fooBARFooBar", "fbfb");
}
private void assertFieldNameForPropertyName(String propertyName, String fieldName) {
when(property.getName()).thenReturn(propertyName);
assertThat(strategy.getFieldName(property), is(fieldName));
}
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,18 +13,21 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.index.Indexed;
import org.springframework.data.mongodb.core.mapping.Document;
/**
* @author Jon Brisbin
* @author Jon Brisbin <jbrisbin@vmware.com>
*/
@Document(collection = "foobar")
public class CustomCollectionWithIndex {
@Id
@SuppressWarnings("unused")
private String id;
@Indexed
private String name;

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,18 +13,21 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.index.Indexed;
import org.springframework.data.mongodb.core.mapping.Document;
/**
* @author Jon Brisbin
* @author Jon Brisbin <jbrisbin@vmware.com>
*/
@Document
public class DetectedCollectionWithIndex {
@Id
@SuppressWarnings("unused")
private String id;
@Indexed
private String name;

Some files were not shown because too many files have changed in this diff.