Compare commits
58 Commits

| Author | SHA1 | Date |
|---|---|---|
| | 1202d2cbc2 | |
| | 0edf6c06ed | |
| | 0afd137e7d | |
| | bc3f44197b | |
| | 28bd631579 | |
| | 5396df9af4 | |
| | 2b864e9744 | |
| | 5accbbdac5 | |
| | 11d9f04fd1 | |
| | 67b91e446e | |
| | 1124841e17 | |
| | 10ccbf131d | |
| | b29930b512 | |
| | d671fb13ae | |
| | b0a10d19c3 | |
| | d8ef7e1472 | |
| | b9a25eabae | |
| | e92e5c737f | |
| | 6e46fb12cb | |
| | 031d446a1c | |
| | e6bab1ce60 | |
| | 2fffe0a5c4 | |
| | 2493de5f91 | |
| | bd11bab076 | |
| | b667984563 | |
| | 7ef167ed96 | |
| | 303a057d86 | |
| | 607072c0d3 | |
| | 11e9c562b3 | |
| | 220b211faa | |
| | c0c51fcc29 | |
| | ed9eddf10e | |
| | e23d73d55e | |
| | 9627fbaebf | |
| | ef93d4db0b | |
| | 928b5a7742 | |
| | 118a52a8d6 | |
| | b47e8ca3da | |
| | 8527d6eb43 | |
| | d645c778c3 | |
| | 5c47f1ae9e | |
| | 9f324bac19 | |
| | 186caba1ac | |
| | 2ebb7e801d | |
| | a932f3474e | |
| | e992456532 | |
| | 7b34c5cac4 | |
| | 16baf00f5e | |
| | 61a2c56a27 | |
| | e62437b64a | |
| | e9a7e887be | |
| | b91a66f6f9 | |
| | a047e54e5a | |
| | 8197ff57c8 | |
| | a2136719e1 | |
| | 7bcf142c8d | |
| | d50d03a80e | |
| | ba894a4511 | |

@@ -1 +0,0 @@
You find the contribution guidelines for Spring Data projects [here](https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.md).

README.md (152 changed lines)
@@ -1,68 +1,96 @@
# Spring Data MongoDB
Spring Data MongoDB
======================

The primary goal of the [Spring Data](http://projects.spring.io/spring-data) project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
The primary goal of the [Spring Data](http://www.springsource.org/spring-data) project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.

The Spring Data MongoDB project aims to provide a familiar and consistent Spring-based programming model for new datastores while retaining store-specific features and capabilities. The Spring Data MongoDB project provides integration with the MongoDB document database. Key functional areas of Spring Data MongoDB are a POJO centric model for interacting with a MongoDB DBCollection and easily writing a repository style data access layer.
The Spring Data MongoDB aims to provide a familiar and consistent Spring-based programming model for for new datastores while retaining store-specific features and capabilities. The Spring Data MongoDB project provides integration with the MongoDB document database. Key functional areas of Spring Data MongoDB are a POJO centric model for interacting with a MongoDB DBCollection and easily writing a Repository style data access layer

## Getting Help
Getting Help
------------

For a comprehensive treatment of all the Spring Data MongoDB features, please refer to:
For a comprehensive treatmet of all the Spring Data MongoDB features, please refer to the The [User Guide](http://static.springsource.org/spring-data/data-mongodb/docs/current/reference/html/)

* the [User Guide](http://docs.spring.io/spring-data/mongodb/docs/current/reference/html/)
* the [JavaDocs](http://docs.spring.io/spring-data/mongodb/docs/current/api/) have extensive comments in them as well.
* the home page of [Spring Data MongoDB](http://projects.spring.io/spring-data-mongodb) contains links to articles and other resources.
* for more detailed questions, use [Spring Data Mongodb on Stackoverflow](http://stackoverflow.com/questions/tagged/spring-data-mongodb).
The [JavaDocs](http://static.springsource.org/spring-data/data-mongodb/docs/current/api/) have extensive comments in them as well.

If you are new to Spring as well as to Spring Data, look for information about [Spring projects](http://projects.spring.io/).
The home page of [Spring Data MongoDB](http://www.springsource.org/spring-data/mongodb) contains links to articles and other resources.

For more detailed questions, use the [forum](http://forum.springsource.org/forumdisplay.php?f=80).

If you are new to Spring as well as to Spring Data, look for information about [Spring projects](http://www.springsource.org/projects).


## Quick Start
Quick Start
-----------

### Maven configuration
## MongoDB

Add the Maven dependency:
For those in a hurry:


* Download the jar through Maven:

```xml
<dependency>
  <groupId>org.springframework.data</groupId>
  <artifactId>spring-data-mongodb</artifactId>
  <version>1.5.0.RELEASE</version>
  <version>1.2.3.RELEASE</version>
</dependency>
```

If you'd rather like the latest snapshots of the upcoming major version, use our Maven snapshot repository and declare the appropriate dependency version.

```xml
<dependency>
  <groupId>org.springframework.data</groupId>
  <artifactId>spring-data-mongodb</artifactId>
  <version>1.6.0.BUILD-SNAPSHOT</version>
</dependency>

<repository>
  <id>spring-libs-snapshot</id>
  <name>Spring Snapshot Repository</name>
  <url>http://repo.spring.io/libs-snapshot</url>
</repository>
```

### MongoTemplate

MongoTemplate is the central support class for Mongo database operations. It provides:
MongoTemplate is the central support class for Mongo database operations. It provides

* Basic POJO mapping support to and from BSON
* Convenience methods to interact with the store (insert object, update objects) and MongoDB specific ones (geo-spatial operations, upserts, map-reduce etc.)
* Connection affinity callback
* Exception translation into Spring's [technology agnostic DAO exception hierarchy](http://docs.spring.io/spring/docs/current/spring-framework-reference/html/dao.html#dao-exceptions).
* Connection Affinity callback
* Exception translation into Spring's [technology agnostic DAO exception hierarchy](http://static.springsource.org/spring/docs/3.0.x/spring-framework-reference/html/dao.html#dao-exceptions).
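
To make the list above concrete, here is a minimal usage sketch (an editor's illustration, not part of either version of the compared README). It assumes the `Person` class with first and last name properties used in the repository example below, a constructor taking both names, and the local `springdata` database from the configuration examples:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import java.util.List;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

import com.mongodb.MongoClient;

public class MongoTemplateQuickStart {

  public static void main(String[] args) throws Exception {

    // Template bound to a local MongoDB instance; "springdata" is the database
    // name used in the configuration examples of this README.
    MongoOperations operations =
        new MongoTemplate(new SimpleMongoDbFactory(new MongoClient(), "springdata"));

    // POJO mapping: the object is converted to BSON and stored.
    operations.insert(new Person("Oliver", "Gierke"));

    // Criteria API query, translated by the template into {"lastname" : "Gierke"}.
    List<Person> people = operations.find(query(where("lastname").is("Gierke")), Person.class);

    System.out.println(people);
  }
}
```

By default the template derives the collection name from the simple class name, so the document above lands in the `person` collection.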
### Spring Data repositories
Future plans are to support optional logging and/or exception throwing based on WriteResult return value, common map-reduce operations, GridFS operations. A simple API for partial document updates is also planned.

To simplify the creation of data repositories Spring Data MongoDB provides a generic repository programming model. It will automatically create a repository proxy for you that adds implementations of finder methods you specify on an interface.
### Easy Data Repository generation

For example, given a `Person` class with first and last name properties, a `PersonRepository` interface that can query for `Person` by last name and when the first name matches a like expression is shown below:
To simplify the creation of data repositories a generic `Repository` interface and default implementation is provided. Furthermore, Spring will automatically create a Repository implementation for you that adds implementations of finder methods you specify on an interface.

The Repository interface is

```java
public interface PersonRepository extends CrudRepository<Person, Long> {
public interface Repository<T, ID extends Serializable> {

  T save(T entity);

  List<T> save(Iterable<? extends T> entities);

  T findById(ID id);

  boolean exists(ID id);

  List<T> findAll();

  Long count();

  void delete(T entity);

  void delete(Iterable<? extends T> entities);

  void deleteAll();
}
```


The `MongoRepository` extends `Repository` and will in future add more Mongo specific methods.

```java
public interface MongoRepository<T, ID extends Serializable> extends Repository<T, ID> {
}
```

`SimpleMongoRepository` is the out of the box implementation of the `MongoRepository` you can use for basid CRUD operations.

To go beyond basic CRUD, extend the `MongoRepository` interface and supply your own finder methods that follow simple naming conventions such that they can be easily converted into queries.

For example, given a `Person` class with first and last name properties, a `PersonRepository` interface that can query for `Person` by last name and when the first name matches a regular expression is shown below

```java
public interface PersonRepository extends MongoRepository<Person, Long> {

  List<Person> findByLastname(String lastname);

@@ -70,38 +98,18 @@ public interface PersonRepository extends CrudRepository<Person, Long> {
}
```

The queries issued on execution will be derived from the method name. Extending `CrudRepository` causes CRUD methods being pulled into the interface so that you can easily save and find single entities and collections of them.
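
As a rough illustration of that derivation (again an editor's sketch, not part of the compared README; `Person` is the example class above), the finder methods of `PersonRepository` correspond approximately to these `MongoOperations` calls:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import java.util.List;

import org.springframework.data.mongodb.core.MongoOperations;

class DerivedQueryIllustration {

  // findByLastname("Gierke") is derived into a query on the "lastname" field,
  // roughly {"lastname" : "Gierke"}.
  static List<Person> findByLastname(MongoOperations operations, String lastname) {
    return operations.find(query(where("lastname").is(lastname)), Person.class);
  }

  // findByFirstnameLike("Oli*") becomes a regular-expression criterion on "firstname";
  // the wildcard translation shown here is only an approximation of the keyword's behavior.
  static List<Person> findByFirstnameLike(MongoOperations operations, String pattern) {
    return operations.find(query(where("firstname").regex(pattern.replace("*", ".*"))), Person.class);
  }
}
```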
You can have Spring automatically create a proxy for the interface by using the following JavaConfig:

```java
@Configuration
@EnableMongoRepositories
class ApplicationConfig extends AbstractMongoConfiguration {

  @Override
  public Mongo mongo() throws Exception {
    return new MongoClient();
  }

  @Override
  protected String getDatabaseName() {
    return "springdata";
  }
}
```

This sets up a connection to a local MongoDB instance and enables the detection of Spring Data repositories (through `@EnableMongoRepositories`). The same configuration would look like this in XML:
You can have Spring automatically create a proxy for the interface as shown below:

```xml
<bean id="template" class="org.springframework.data.mongodb.core.MongoTemplate">
<bean id="template" class="org.springframework.data.document.mongodb.MongoTemplate">
  <constructor-arg>
    <bean class="com.mongodb.MongoClient">
    <bean class="com.mongodb.Mongo">
      <constructor-arg value="localhost" />
      <constructor-arg value="27017" />
    </bean>
  </constructor-arg>
  <constructor-arg value="database" />
  <property name="defaultCollectionName" value="springdata" />
</bean>

<mongo:repositories base-package="com.acme.repository" />
@@ -109,16 +117,12 @@ This sets up a connection to a local MongoDB instance and enables the detection

This will find the repository interface and register a proxy object in the container. You can use it as shown below:

```java
``java
@Service
public class MyService {

  private final PersonRepository repository;

  @Autowired
  public MyService(PersonRepository repository) {
    this.repository = repository;
  }
  private final PersonRepository repository;

  public void doWork() {

@@ -130,18 +134,22 @@ public class MyService {
    person = repository.save(person);

    List<Person> lastNameResults = repository.findByLastname("Gierke");

    List<Person> firstNameResults = repository.findByFirstnameLike("Oli*");

  }
}
```

## Contributing to Spring Data

Contributing to Spring Data
---------------------------

Here are some ways for you to get involved in the community:

* Get involved with the Spring community on Stackoverflow and help out on the [spring-data-mongodb](http://stackoverflow.com/questions/tagged/spring-data-mongodb) tag by responding to questions and joining the debate.
* Get involved with the Spring community on the Spring Community Forums. Please help out on the [forum](http://forum.springsource.org/forumdisplay.php?f=80) by responding to questions and joining the debate.
* Create [JIRA](https://jira.springframework.org/browse/DATADOC) tickets for bugs and new features and comment and vote on the ones that you are interested in.
* Github is for social coding: if you want to write code, we encourage contributions through pull requests from [forks of this repository](http://help.github.com/forking/). If you want to contribute code this way, please reference a JIRA ticket as well covering the specific issue you are addressing.
* Watch for upcoming articles on Spring by [subscribing](http://spring.io/blog) to spring.io.
* Watch for upcoming articles on Spring by [subscribing](http://www.springsource.org/node/feed) to springframework.org

Before we accept a non-trivial patch or pull request we will need you to sign the [contributor's agreement](https://support.springsource.com/spring_committer_signup). Signing the contributor's agreement does not grant anyone commit rights to the main repository, but it does mean that we can accept your contributions, and you will get an author credit if we do. Active contributors might be asked to join the core team, and given the ability to merge pull requests.

pom.xml (102 changed lines)
@@ -5,17 +5,17 @@
|
||||
|
||||
<groupId>org.springframework.data</groupId>
|
||||
<artifactId>spring-data-mongodb-parent</artifactId>
|
||||
<version>1.6.0.M1</version>
|
||||
<version>1.2.5.BUILD-SNAPSHOT</version>
|
||||
<packaging>pom</packaging>
|
||||
|
||||
<name>Spring Data MongoDB</name>
|
||||
<description>MongoDB support for Spring Data</description>
|
||||
<url>http://projects.spring.io/spring-data-mongodb</url>
|
||||
<url>http://www.springsource.org/spring-data/mongodb</url>
|
||||
|
||||
<parent>
|
||||
<groupId>org.springframework.data.build</groupId>
|
||||
<artifactId>spring-data-parent</artifactId>
|
||||
<version>1.5.0.M1</version>
|
||||
<version>1.0.5.RELEASE</version>
|
||||
<relativePath>../spring-data-build/parent/pom.xml</relativePath>
|
||||
</parent>
|
||||
|
||||
@@ -29,20 +29,19 @@
|
||||
<properties>
|
||||
<project.type>multi</project.type>
|
||||
<dist.id>spring-data-mongodb</dist.id>
|
||||
<springdata.commons>1.9.0.M1</springdata.commons>
|
||||
<mongo>2.12.1</mongo>
|
||||
<mongo.osgi>2.12.1</mongo.osgi>
|
||||
<springdata.commons>1.5.3.RELEASE</springdata.commons>
|
||||
<mongo>2.10.1</mongo>
|
||||
</properties>
|
||||
|
||||
<developers>
|
||||
<developer>
|
||||
<id>ogierke</id>
|
||||
<name>Oliver Gierke</name>
|
||||
<email>ogierke at gopivotal.com</email>
|
||||
<organization>Pivotal</organization>
|
||||
<organizationUrl>http://www.gopivotal.com</organizationUrl>
|
||||
<email>ogierke at vmware.com</email>
|
||||
<organization>SpringSource</organization>
|
||||
<organizationUrl>http://www.springsource.com</organizationUrl>
|
||||
<roles>
|
||||
<role>Project Lead</role>
|
||||
<role>Project Lean</role>
|
||||
</roles>
|
||||
<timezone>+1</timezone>
|
||||
</developer>
|
||||
@@ -50,8 +49,8 @@
|
||||
<id>trisberg</id>
|
||||
<name>Thomas Risberg</name>
|
||||
<email>trisberg at vmware.com</email>
|
||||
<organization>Pivotal</organization>
|
||||
<organizationUrl>http://www.gopivotal.com</organizationUrl>
|
||||
<organization>SpringSource</organization>
|
||||
<organizationUrl>http://www.springsource.com</organizationUrl>
|
||||
<roles>
|
||||
<role>Developer</role>
|
||||
</roles>
|
||||
@@ -60,9 +59,9 @@
|
||||
<developer>
|
||||
<id>mpollack</id>
|
||||
<name>Mark Pollack</name>
|
||||
<email>mpollack at gopivotal.com</email>
|
||||
<organization>Pivotal</organization>
|
||||
<organizationUrl>http://www.gopivotal.com</organizationUrl>
|
||||
<email>mpollack at vmware.com</email>
|
||||
<organization>SpringSource</organization>
|
||||
<organizationUrl>http://www.springsource.com</organizationUrl>
|
||||
<roles>
|
||||
<role>Developer</role>
|
||||
</roles>
|
||||
@@ -71,72 +70,16 @@
|
||||
<developer>
|
||||
<id>jbrisbin</id>
|
||||
<name>Jon Brisbin</name>
|
||||
<email>jbrisbin at gopivotal.com</email>
|
||||
<organization>Pivotal</organization>
|
||||
<organizationUrl>http://www.gopivotal.com</organizationUrl>
|
||||
<email>jbrisbin at vmware.com</email>
|
||||
<organization>SpringSource</organization>
|
||||
<organizationUrl>http://www.springsource.com</organizationUrl>
|
||||
<roles>
|
||||
<role>Developer</role>
|
||||
</roles>
|
||||
<timezone>-6</timezone>
|
||||
</developer>
|
||||
<developer>
|
||||
<id>tdarimont</id>
|
||||
<name>Thomas Darimont</name>
|
||||
<email>tdarimont at gopivotal.com</email>
|
||||
<organization>Pivotal</organization>
|
||||
<organizationUrl>http://www.gopivotal.com</organizationUrl>
|
||||
<roles>
|
||||
<role>Developer</role>
|
||||
</roles>
|
||||
<timezone>+1</timezone>
|
||||
</developer>
|
||||
<developer>
|
||||
<id>cstrobl</id>
|
||||
<name>Christoph Strobl</name>
|
||||
<email>cstrobl at gopivotal.com</email>
|
||||
<organization>Pivotal</organization>
|
||||
<organizationUrl>http://www.gopivotal.com</organizationUrl>
|
||||
<roles>
|
||||
<role>Developer</role>
|
||||
</roles>
|
||||
<timezone>+1</timezone>
|
||||
</developer>
|
||||
</developers>
|
||||
|
||||
<profiles>
|
||||
<profile>
|
||||
|
||||
<id>mongo-next</id>
|
||||
<properties>
|
||||
<mongo>2.12.3-SNAPSHOT</mongo>
|
||||
</properties>
|
||||
|
||||
<repositories>
|
||||
<repository>
|
||||
<id>mongo-snapshots</id>
|
||||
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
|
||||
</repository>
|
||||
</repositories>
|
||||
|
||||
</profile>
|
||||
|
||||
<profile>
|
||||
|
||||
<id>mongo-3-next</id>
|
||||
<properties>
|
||||
<mongo>3.0.0-SNAPSHOT</mongo>
|
||||
</properties>
|
||||
|
||||
<repositories>
|
||||
<repository>
|
||||
<id>mongo-snapshots</id>
|
||||
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
|
||||
</repository>
|
||||
</repositories>
|
||||
|
||||
</profile>
|
||||
</profiles>
|
||||
|
||||
<dependencies>
|
||||
<!-- MongoDB -->
|
||||
<dependency>
|
||||
@@ -148,16 +91,9 @@
|
||||
|
||||
<repositories>
|
||||
<repository>
|
||||
<id>spring-libs-milestone</id>
|
||||
<url>http://repo.spring.io/libs-milestone</url>
|
||||
<id>spring-libs-release</id>
|
||||
<url>http://repo.springsource.org/libs-release</url>
|
||||
</repository>
|
||||
</repositories>
|
||||
|
||||
<pluginRepositories>
|
||||
<pluginRepository>
|
||||
<id>spring-plugins-release</id>
|
||||
<url>http://repo.spring.io/plugins-release</url>
|
||||
</pluginRepository>
|
||||
</pluginRepositories>
|
||||
|
||||
</project>
|
||||
|
||||
@@ -6,12 +6,12 @@
|
||||
<parent>
|
||||
<groupId>org.springframework.data</groupId>
|
||||
<artifactId>spring-data-mongodb-parent</artifactId>
|
||||
<version>1.6.0.M1</version>
|
||||
<version>1.2.5.BUILD-SNAPSHOT</version>
|
||||
<relativePath>../pom.xml</relativePath>
|
||||
</parent>
|
||||
|
||||
<artifactId>spring-data-mongodb-cross-store</artifactId>
|
||||
<name>Spring Data MongoDB - Cross-Store Support</name>
|
||||
<name>Spring Data MongoDB - Cross-Store Persistence Support</name>
|
||||
|
||||
<properties>
|
||||
<jpa>1.0.0.Final</jpa>
|
||||
@@ -24,6 +24,7 @@
|
||||
<dependency>
|
||||
<groupId>org.springframework</groupId>
|
||||
<artifactId>spring-beans</artifactId>
|
||||
<version>${spring}</version>
|
||||
<exclusions>
|
||||
<exclusion>
|
||||
<groupId>commons-logging</groupId>
|
||||
@@ -34,21 +35,24 @@
|
||||
<dependency>
|
||||
<groupId>org.springframework</groupId>
|
||||
<artifactId>spring-tx</artifactId>
|
||||
<version>${spring}</version>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.springframework</groupId>
|
||||
<artifactId>spring-aspects</artifactId>
|
||||
<version>${spring}</version>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.springframework</groupId>
|
||||
<artifactId>spring-orm</artifactId>
|
||||
<version>${spring}</version>
|
||||
</dependency>
|
||||
|
||||
<!-- Spring Data -->
|
||||
<dependency>
|
||||
<groupId>org.springframework.data</groupId>
|
||||
<artifactId>spring-data-mongodb</artifactId>
|
||||
<version>1.6.0.M1</version>
|
||||
<version>1.2.5.BUILD-SNAPSHOT</version>
|
||||
</dependency>
|
||||
|
||||
<dependency>
|
||||
@@ -56,13 +60,17 @@
|
||||
<artifactId>aspectjrt</artifactId>
|
||||
<version>${aspectj}</version>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>cglib</groupId>
|
||||
<artifactId>cglib</artifactId>
|
||||
<version>2.2</version>
|
||||
</dependency>
|
||||
|
||||
<!-- JPA -->
|
||||
<dependency>
|
||||
<groupId>org.hibernate.javax.persistence</groupId>
|
||||
<artifactId>hibernate-jpa-2.0-api</artifactId>
|
||||
<version>${jpa}</version>
|
||||
<optional>true</optional>
|
||||
</dependency>
|
||||
|
||||
<!-- For Tests -->
|
||||
@@ -98,7 +106,7 @@
|
||||
<plugin>
|
||||
<groupId>org.codehaus.mojo</groupId>
|
||||
<artifactId>aspectj-maven-plugin</artifactId>
|
||||
<version>1.6</version>
|
||||
<version>1.4</version>
|
||||
<dependencies>
|
||||
<dependency>
|
||||
<groupId>org.aspectj</groupId>
|
||||
@@ -127,7 +135,6 @@
|
||||
<artifactId>spring-aspects</artifactId>
|
||||
</aspectLibrary>
|
||||
</aspectLibraries>
|
||||
<complianceLevel>${source.level}</complianceLevel>
|
||||
<source>${source.level}</source>
|
||||
<target>${source.level}</target>
|
||||
</configuration>
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2014 the original author or authors.
|
||||
* Copyright 2011-2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -41,13 +41,17 @@ import com.mongodb.MongoException;
|
||||
public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
|
||||
|
||||
private static final String ENTITY_CLASS = "_entity_class";
|
||||
|
||||
private static final String ENTITY_ID = "_entity_id";
|
||||
|
||||
private static final String ENTITY_FIELD_NAME = "_entity_field_name";
|
||||
|
||||
private static final String ENTITY_FIELD_CLASS = "_entity_field_class";
|
||||
|
||||
protected final Logger log = LoggerFactory.getLogger(getClass());
|
||||
|
||||
private MongoTemplate mongoTemplate;
|
||||
|
||||
private EntityManagerFactory entityManagerFactory;
|
||||
|
||||
public void setMongoTemplate(MongoTemplate mongoTemplate) {
|
||||
@@ -109,14 +113,12 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
|
||||
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentId(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
|
||||
*/
|
||||
public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
|
||||
|
||||
log.debug("getPersistentId called on " + entity);
|
||||
|
||||
if (entityManagerFactory == null) {
|
||||
throw new DataAccessResourceFailureException("EntityManagerFactory cannot be null");
|
||||
}
|
||||
|
||||
return entityManagerFactory.getPersistenceUnitUtil().getIdentifier(entity);
|
||||
Object o = entityManagerFactory.getPersistenceUnitUtil().getIdentifier(entity);
|
||||
return o;
|
||||
}
|
||||
|
||||
/*
|
||||
|
||||
@@ -1,18 +1,18 @@
|
||||
Bundle-SymbolicName: org.springframework.data.mongodb.crossstore
|
||||
Bundle-Name: Spring Data MongoDB Cross Store Support
|
||||
Bundle-Vendor: Pivotal Software, Inc.
|
||||
Bundle-Vendor: SpringSource
|
||||
Bundle-ManifestVersion: 2
|
||||
Import-Package:
|
||||
sun.reflect;version="0";resolution:=optional
|
||||
Export-Template:
|
||||
org.springframework.data.mongodb.crossstore.*;version="${project.version}"
|
||||
Import-Template:
|
||||
com.mongodb.*;version="${mongo.osgi:[=.=.=,+1.0.0)}",
|
||||
com.mongodb.*;version="0",
|
||||
javax.persistence.*;version="${jpa:[=.=.=,+1.0.0)}",
|
||||
org.aspectj.*;version="${aspectj:[1.0.0, 2.0.0)}",
|
||||
org.bson.*;version="0",
|
||||
org.slf4j.*;version="${slf4j:[=.=.=,+1.0.0)}",
|
||||
org.springframework.*;version="${spring:[=.=.=.=,+1.0.0)}",
|
||||
org.springframework.*;version="${spring30:[=.=.=.=,+1.0.0)}",
|
||||
org.springframework.data.*;version="${springdata.commons:[=.=.=.=,+1.0.0)}",
|
||||
org.springframework.data.mongodb.*;version="${project.version:[=.=.=.=,+1.0.0)}",
|
||||
org.w3c.dom.*;version="0"
|
||||
|
||||
@@ -13,7 +13,7 @@
|
||||
<parent>
|
||||
<groupId>org.springframework.data</groupId>
|
||||
<artifactId>spring-data-mongodb-parent</artifactId>
|
||||
<version>1.6.0.M1</version>
|
||||
<version>1.2.5.BUILD-SNAPSHOT</version>
|
||||
<relativePath>../pom.xml</relativePath>
|
||||
</parent>
|
||||
|
||||
|
||||
@@ -5,7 +5,7 @@
|
||||
<parent>
|
||||
<groupId>org.springframework.data</groupId>
|
||||
<artifactId>spring-data-mongodb-parent</artifactId>
|
||||
<version>1.6.0.M1</version>
|
||||
<version>1.2.5.BUILD-SNAPSHOT</version>
|
||||
<relativePath>../pom.xml</relativePath>
|
||||
</parent>
|
||||
|
||||
|
||||
@@ -1,9 +1,9 @@
|
||||
Bundle-SymbolicName: org.springframework.data.mongodb.log4j
|
||||
Bundle-Name: Spring Data Mongo DB Log4J Appender
|
||||
Bundle-Vendor: Pivotal Software, Inc.
|
||||
Bundle-Vendor: SpringSource
|
||||
Bundle-ManifestVersion: 2
|
||||
Import-Package:
|
||||
sun.reflect;version="0";resolution:=optional
|
||||
Import-Template:
|
||||
com.mongodb.*;version="${mongo.osgi:[=.=.=,+1.0.0)}",
|
||||
com.mongodb.*;version="${mongo:[=.=,+1.0.0)}",
|
||||
org.apache.log4j.*;version="${log4j:[=.=.=,+1.0.0)}"
|
||||
|
||||
@@ -1,182 +1,146 @@
|
||||
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
|
||||
<context version="7.1.9.205">
|
||||
<scope type="Project" name="spring-data-mongodb">
|
||||
<element type="TypeFilterReferenceOverridden" name="Filter">
|
||||
<element type="IncludeTypePattern" name="org.springframework.data.mongodb.**"/>
|
||||
<context version="7.0.3.1152">
|
||||
<scope name="spring-data-mongodb" type="Project">
|
||||
<element name="Filter" type="TypeFilterReferenceOverridden">
|
||||
<element name="org.springframework.data.mongodb.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
<architecture>
|
||||
<element type="Layer" name="Repositories">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.repository.**"/>
|
||||
<element name="Config" type="Layer">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.config.**" type="WeakTypePattern"/>
|
||||
</element>
|
||||
<element type="Subsystem" name="API">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.repository.*"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Monitoring"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories"/>
|
||||
</element>
|
||||
<element name="Repositories" type="Layer">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.repository.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
<element name="API" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.repository.*" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
</element>
|
||||
<element type="Subsystem" name="Query">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.query.**"/>
|
||||
<element name="Query" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.query.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API" type="AllowedDependency"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API"/>
|
||||
</element>
|
||||
<element type="Subsystem" name="Implementation">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.support.**"/>
|
||||
<element name="Implementation" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.support.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Query" type="AllowedDependency"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Query"/>
|
||||
</element>
|
||||
<element type="Subsystem" name="Config">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.config.**"/>
|
||||
<element name="Config" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.config.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Implementation" type="AllowedDependency"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Implementation"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Config" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core" type="AllowedDependency"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core"/>
|
||||
</element>
|
||||
<element type="Layer" name="Config">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="WeakTypePattern" name="**.config.**"/>
|
||||
<element name="Monitoring" type="Layer">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.monitor.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|GridFS" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Monitoring" type="AllowedDependency"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core"/>
|
||||
</element>
|
||||
<element type="Layer" name="Monitoring">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.monitor.**"/>
|
||||
<element name="GridFS" type="Layer">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.gridfs.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core" type="AllowedDependency"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core"/>
|
||||
</element>
|
||||
<element type="Layer" name="GridFS">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.gridfs.**"/>
|
||||
<element name="Core" type="Layer">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.core.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core" type="AllowedDependency"/>
|
||||
</element>
|
||||
<element type="Layer" name="Core">
|
||||
<element type="TypeFilter" name="Assignment"/>
|
||||
<element type="Subsystem" name="Mapping">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.core.mapping.**"/>
|
||||
<element name="Mapping" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.mapping.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
</element>
|
||||
<element type="Subsystem" name="Geospatial">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.core.geo.**"/>
|
||||
<element name="Geospatial" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.geo.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping"/>
|
||||
</element>
|
||||
<element type="Subsystem" name="Query">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.core.query.**"/>
|
||||
<element name="Query" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.query.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial"/>
|
||||
</element>
|
||||
<element type="Subsystem" name="Conversion">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.core.convert.**"/>
|
||||
<element name="Index" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.index.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query"/>
|
||||
</element>
|
||||
<element type="Subsystem" name="SpEL">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.core.spel.**"/>
|
||||
<element name="Core" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.core.**" type="WeakTypePattern"/>
|
||||
</element>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Index"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query"/>
|
||||
</element>
|
||||
<element type="Subsystem" name="Aggregation">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.core.aggregation.**"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Conversion" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|SpEL" type="AllowedDependency"/>
|
||||
</element>
|
||||
<element type="Subsystem" name="Index">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.core.index.**"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
|
||||
</element>
|
||||
<element type="Subsystem" name="Core">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="WeakTypePattern" name="**.core.**"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Aggregation" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Conversion" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Index" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
|
||||
</element>
|
||||
<element type="Subsystem" name="Util">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.util.**"/>
|
||||
</element>
|
||||
<stereotype name="Unrestricted"/>
|
||||
<stereotype name="Public"/>
|
||||
</element>
|
||||
</element>
|
||||
<element type="Subsystem" name="API">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="org.springframework.data.mongodb.*"/>
|
||||
</element>
|
||||
<stereotype name="Public"/>
|
||||
</element>
|
||||
</architecture>
|
||||
<workspace>
|
||||
<element type="JavaRootDirectory" name="src/main/java">
|
||||
<element name="src/main/java" type="JavaRootDirectory">
|
||||
<reference name="Project|spring-data-mongodb::BuildUnit|spring-data-mongodb"/>
|
||||
</element>
|
||||
<element type="JavaRootDirectory" name="target/classes">
|
||||
<element name="target/classes" type="JavaRootDirectory">
|
||||
<reference name="Project|spring-data-mongodb::BuildUnit|spring-data-mongodb"/>
|
||||
</element>
|
||||
</workspace>
|
||||
<physical>
|
||||
<element type="BuildUnit" name="spring-data-mongodb"/>
|
||||
<element name="spring-data-mongodb" type="BuildUnit"/>
|
||||
</physical>
|
||||
</scope>
|
||||
<scope type="External" name="External">
|
||||
<element type="TypeFilter" name="Filter">
|
||||
<element type="IncludeTypePattern" name="**"/>
|
||||
<element type="ExcludeTypePattern" name="java.**"/>
|
||||
<element type="ExcludeTypePattern" name="javax.**"/>
|
||||
<scope name="External" type="External">
|
||||
<element name="Filter" type="TypeFilter">
|
||||
<element name="**" type="IncludeTypePattern"/>
|
||||
<element name="java.**" type="ExcludeTypePattern"/>
|
||||
<element name="javax.**" type="ExcludeTypePattern"/>
|
||||
</element>
|
||||
<architecture>
|
||||
<element type="Subsystem" name="Spring">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="org.springframework.**"/>
|
||||
<element type="ExcludeTypePattern" name="org.springframework.data.**"/>
|
||||
<element name="Spring" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="org.springframework.**" type="IncludeTypePattern"/>
|
||||
<element name="org.springframework.data.**" type="ExcludeTypePattern"/>
|
||||
</element>
|
||||
</element>
|
||||
<element type="Subsystem" name="Spring Data Core">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="org.springframework.data.**"/>
|
||||
<element name="Spring Data Core" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="org.springframework.data.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
</element>
|
||||
<element type="Subsystem" name="Mongo Java Driver">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="com.mongodb.**"/>
|
||||
<element type="IncludeTypePattern" name="org.bson.**"/>
|
||||
<element name="Mongo Java Driver" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="com.mongodb.**" type="IncludeTypePattern"/>
|
||||
<element name="org.bson.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
</element>
|
||||
<element type="Subsystem" name="Querydsl">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="com.mysema.query.**"/>
|
||||
<element name="Querydsl" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="com.mysema.query.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
</element>
|
||||
</architecture>
|
||||
</scope>
|
||||
<scope type="Global" name="Global">
|
||||
<element type="Configuration" name="Configuration"/>
|
||||
<element type="TypeFilter" name="Filter">
|
||||
<element type="IncludeTypePattern" name="**"/>
|
||||
<scope name="Global" type="Global">
|
||||
<element name="Configuration" type="Configuration"/>
|
||||
<element name="Filter" type="TypeFilter">
|
||||
<element name="**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
</scope>
|
||||
</context>
|
||||
|
||||
@@ -11,13 +11,12 @@
|
||||
<parent>
|
||||
<groupId>org.springframework.data</groupId>
|
||||
<artifactId>spring-data-mongodb-parent</artifactId>
|
||||
<version>1.6.0.M1</version>
|
||||
<version>1.2.5.BUILD-SNAPSHOT</version>
|
||||
<relativePath>../pom.xml</relativePath>
|
||||
</parent>
|
||||
|
||||
<properties>
|
||||
<validation>1.0.0.GA</validation>
|
||||
<objenesis>1.3</objenesis>
|
||||
</properties>
|
||||
|
||||
<dependencies>
|
||||
@@ -26,18 +25,22 @@
|
||||
<dependency>
|
||||
<groupId>org.springframework</groupId>
|
||||
<artifactId>spring-tx</artifactId>
|
||||
<version>${spring}</version>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.springframework</groupId>
|
||||
<artifactId>spring-context</artifactId>
|
||||
<version>${spring}</version>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.springframework</groupId>
|
||||
<artifactId>spring-beans</artifactId>
|
||||
<version>${spring}</version>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.springframework</groupId>
|
||||
<artifactId>spring-core</artifactId>
|
||||
<version>${spring}</version>
|
||||
<exclusions>
|
||||
<exclusion>
|
||||
<groupId>commons-logging</groupId>
|
||||
@@ -48,6 +51,7 @@
|
||||
<dependency>
|
||||
<groupId>org.springframework</groupId>
|
||||
<artifactId>spring-expression</artifactId>
|
||||
<version>${spring}</version>
|
||||
</dependency>
|
||||
|
||||
<!-- Spring Data -->
|
||||
@@ -116,13 +120,6 @@
|
||||
<optional>true</optional>
|
||||
</dependency>
|
||||
|
||||
<dependency>
|
||||
<groupId>org.objenesis</groupId>
|
||||
<artifactId>objenesis</artifactId>
|
||||
<version>${objenesis}</version>
|
||||
<optional>true</optional>
|
||||
</dependency>
|
||||
|
||||
<dependency>
|
||||
<groupId>org.hibernate</groupId>
|
||||
<artifactId>hibernate-validator</artifactId>
|
||||
@@ -137,13 +134,6 @@
|
||||
<scope>test</scope>
|
||||
</dependency>
|
||||
|
||||
<dependency>
|
||||
<groupId>org.slf4j</groupId>
|
||||
<artifactId>jul-to-slf4j</artifactId>
|
||||
<version>${slf4j}</version>
|
||||
<scope>test</scope>
|
||||
</dependency>
|
||||
|
||||
</dependencies>
|
||||
|
||||
<build>
|
||||
@@ -151,8 +141,8 @@
|
||||
|
||||
<plugin>
|
||||
<groupId>com.mysema.maven</groupId>
|
||||
<artifactId>apt-maven-plugin</artifactId>
|
||||
<version>${apt}</version>
|
||||
<artifactId>maven-apt-plugin</artifactId>
|
||||
<version>1.0.4</version>
|
||||
<dependencies>
|
||||
<dependency>
|
||||
<groupId>com.mysema.querydsl</groupId>
|
||||
|
||||
@@ -1,34 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb;
|
||||
|
||||
import org.springframework.dao.UncategorizedDataAccessException;
|
||||
|
||||
/**
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
public class LazyLoadingException extends UncategorizedDataAccessException {
|
||||
|
||||
private static final long serialVersionUID = -7089224903873220037L;
|
||||
|
||||
/**
|
||||
* @param msg
|
||||
* @param cause
|
||||
*/
|
||||
public LazyLoadingException(String msg, Throwable cause) {
|
||||
super(msg, cause);
|
||||
}
|
||||
}
|
||||
@@ -13,9 +13,10 @@
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core;
|
||||
package org.springframework.data.mongodb;
|
||||
|
||||
import org.springframework.dao.DataIntegrityViolationException;
|
||||
import org.springframework.data.mongodb.core.MongoActionOperation;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
import com.mongodb.WriteResult;
|
||||
@@ -1,23 +1,6 @@
|
||||
/*
|
||||
* Copyright 2011-2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb;
|
||||
|
||||
import org.springframework.dao.DataAccessException;
|
||||
import org.springframework.dao.support.PersistenceExceptionTranslator;
|
||||
import org.springframework.data.mongodb.core.MongoExceptionTranslator;
|
||||
|
||||
import com.mongodb.DB;
|
||||
|
||||
@@ -25,7 +8,6 @@ import com.mongodb.DB;
|
||||
* Interface for factories creating {@link DB} instances.
|
||||
*
|
||||
* @author Mark Pollack
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
public interface MongoDbFactory {
|
||||
|
||||
@@ -45,11 +27,4 @@ public interface MongoDbFactory {
|
||||
* @throws DataAccessException
|
||||
*/
|
||||
DB getDb(String dbName) throws DataAccessException;
|
||||
|
||||
/**
|
||||
* Exposes a shared {@link MongoExceptionTranslator}.
|
||||
*
|
||||
* @return will never be {@literal null}.
|
||||
*/
|
||||
PersistenceExceptionTranslator getExceptionTranslator();
|
||||
}
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2014 the original author or authors.
|
||||
* Copyright 2011-2012 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -28,15 +28,9 @@ import org.springframework.core.type.filter.AnnotationTypeFilter;
|
||||
import org.springframework.data.annotation.Persistent;
|
||||
import org.springframework.data.authentication.UserCredentials;
|
||||
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
|
||||
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
|
||||
import org.springframework.data.mapping.model.FieldNamingStrategy;
|
||||
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
|
||||
import org.springframework.data.mongodb.MongoDbFactory;
|
||||
import org.springframework.data.mongodb.core.MongoTemplate;
|
||||
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
|
||||
import org.springframework.data.mongodb.core.convert.CustomConversions;
|
||||
import org.springframework.data.mongodb.core.convert.DbRefResolver;
|
||||
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
|
||||
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
|
||||
import org.springframework.data.mongodb.core.mapping.Document;
|
||||
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
|
||||
@@ -52,8 +46,6 @@ import com.mongodb.Mongo;
|
||||
*
|
||||
* @author Mark Pollack
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
* @author Ryan Tenney
|
||||
*/
|
||||
@Configuration
|
||||
public abstract class AbstractMongoConfiguration {
|
||||
@@ -66,22 +58,12 @@ public abstract class AbstractMongoConfiguration {
|
||||
protected abstract String getDatabaseName();
|
||||
|
||||
/**
|
||||
* Return the name of the authentication database to use. Defaults to {@literal null} and will turn into the value
|
||||
* returned by {@link #getDatabaseName()} later on effectively.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
protected String getAuthenticationDatabaseName() {
|
||||
return null;
|
||||
}
|
||||
|
||||
/**
|
||||
* Return the {@link Mongo} instance to connect to. Annotate with {@link Bean} in case you want to expose a
|
||||
* {@link Mongo} instance to the {@link org.springframework.context.ApplicationContext}.
|
||||
* Return the {@link Mongo} instance to connect to.
|
||||
*
|
||||
* @return
|
||||
* @throws Exception
|
||||
*/
|
||||
@Bean
|
||||
public abstract Mongo mongo() throws Exception;
|
||||
|
||||
/**
|
||||
@@ -105,8 +87,15 @@ public abstract class AbstractMongoConfiguration {
|
||||
* @throws Exception
|
||||
*/
|
||||
@Bean
|
||||
public MongoDbFactory mongoDbFactory() throws Exception {
|
||||
return new SimpleMongoDbFactory(mongo(), getDatabaseName(), getUserCredentials(), getAuthenticationDatabaseName());
|
||||
public SimpleMongoDbFactory mongoDbFactory() throws Exception {
|
||||
|
||||
UserCredentials credentials = getUserCredentials();
|
||||
|
||||
if (credentials == null) {
|
||||
return new SimpleMongoDbFactory(mongo(), getDatabaseName());
|
||||
} else {
|
||||
return new SimpleMongoDbFactory(mongo(), getDatabaseName(), credentials);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -119,9 +108,7 @@ public abstract class AbstractMongoConfiguration {
|
||||
* entities.
|
||||
*/
|
||||
protected String getMappingBasePackage() {
|
||||
|
||||
Package mappingBasePackage = getClass().getPackage();
|
||||
return mappingBasePackage == null ? null : mappingBasePackage.getName();
|
||||
return getClass().getPackage().getName();
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -147,7 +134,6 @@ public abstract class AbstractMongoConfiguration {
|
||||
MongoMappingContext mappingContext = new MongoMappingContext();
|
||||
mappingContext.setInitialEntitySet(getInitialEntitySet());
|
||||
mappingContext.setSimpleTypeHolder(customConversions().getSimpleTypeHolder());
|
||||
mappingContext.setFieldNamingStrategy(fieldNamingStrategy());
|
||||
|
||||
return mappingContext;
|
||||
}
|
||||
@@ -187,11 +173,8 @@ public abstract class AbstractMongoConfiguration {
|
||||
*/
|
||||
@Bean
|
||||
public MappingMongoConverter mappingMongoConverter() throws Exception {
|
||||
|
||||
DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory());
|
||||
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mongoMappingContext());
|
||||
MappingMongoConverter converter = new MappingMongoConverter(mongoDbFactory(), mongoMappingContext());
|
||||
converter.setCustomConversions(customConversions());
|
||||
|
||||
return converter;
|
||||
}
|
||||
|
||||
@@ -221,26 +204,4 @@ public abstract class AbstractMongoConfiguration {
|
||||
|
||||
return initialEntitySet;
|
||||
}
|
||||
|
||||
/**
|
||||
* Configures whether to abbreviate field names for domain objects by configuring a
|
||||
* {@link CamelCaseAbbreviatingFieldNamingStrategy} on the {@link MongoMappingContext} instance created. For advanced
|
||||
* customization needs, consider overriding {@link #mappingMongoConverter()}.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
protected boolean abbreviateFieldNames() {
|
||||
return false;
|
||||
}
|
||||
|
||||
/**
|
||||
* Configures a {@link FieldNamingStrategy} on the {@link MongoMappingContext} instance created.
|
||||
*
|
||||
* @return
|
||||
* @since 1.5
|
||||
*/
|
||||
protected FieldNamingStrategy fieldNamingStrategy() {
|
||||
return abbreviateFieldNames() ? new CamelCaseAbbreviatingFieldNamingStrategy()
|
||||
: PropertyNameFieldNamingStrategy.INSTANCE;
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2014 the original author or authors.
|
||||
* Copyright (c) 2011 by the original author(s).
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -13,25 +13,19 @@
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
|
||||
package org.springframework.data.mongodb.config;
|
||||
|
||||
/**
|
||||
* Constants to declare bean names used by the namespace configuration.
|
||||
*
|
||||
* @author Jon Brisbin
|
||||
* @author Oliver Gierke
|
||||
* @author Martin Baumgartner
|
||||
* @author Jon Brisbin <jbrisbin@vmware.com>
|
||||
*/
|
||||
public abstract class BeanNames {
|
||||
|
||||
public static final String MAPPING_CONTEXT_BEAN_NAME = "mongoMappingContext";
|
||||
|
||||
static final String INDEX_HELPER_BEAN_NAME = "indexCreationHelper";
|
||||
static final String MONGO_BEAN_NAME = "mongo";
|
||||
static final String DB_FACTORY_BEAN_NAME = "mongoDbFactory";
|
||||
static final String VALIDATING_EVENT_LISTENER_BEAN_NAME = "validatingMongoEventListener";
|
||||
static final String IS_NEW_STRATEGY_FACTORY_BEAN_NAME = "isNewStrategyFactory";
|
||||
static final String MAPPING_CONTEXT = "mappingContext";
|
||||
static final String INDEX_HELPER = "indexCreationHelper";
|
||||
static final String MONGO = "mongo";
|
||||
static final String DB_FACTORY = "mongoDbFactory";
|
||||
static final String VALIDATING_EVENT_LISTENER = "validatingMongoEventListener";
|
||||
static final String IS_NEW_STRATEGY_FACTORY = "isNewStrategyFactory";
|
||||
static final String DEFAULT_CONVERTER_BEAN_NAME = "mappingConverter";
|
||||
static final String MONGO_TEMPLATE_BEAN_NAME = "mongoTemplate";
|
||||
static final String GRID_FS_TEMPLATE_BEAN_NAME = "gridFsTemplate";
|
||||
}
|
||||
|
||||
@@ -1,70 +0,0 @@
/*
 * Copyright 2013 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.config;

import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Inherited;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import org.springframework.context.annotation.Import;
import org.springframework.data.auditing.DateTimeProvider;
import org.springframework.data.domain.AuditorAware;

/**
 * Annotation to enable auditing in MongoDB via annotation configuration.
 *
 * @author Thomas Darimont
 * @author Oliver Gierke
 */
@Inherited
@Documented
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Import(MongoAuditingRegistrar.class)
public @interface EnableMongoAuditing {

	/**
	 * Configures the {@link AuditorAware} bean to be used to lookup the current principal.
	 *
	 * @return
	 */
	String auditorAwareRef() default "";

	/**
	 * Configures whether the creation and modification dates are set. Defaults to {@literal true}.
	 *
	 * @return
	 */
	boolean setDates() default true;

	/**
	 * Configures whether the entity shall be marked as modified on creation. Defaults to {@literal true}.
	 *
	 * @return
	 */
	boolean modifyOnCreate() default true;

	/**
	 * Configures a {@link DateTimeProvider} bean name that allows customizing the {@link org.joda.time.DateTime} to be
	 * used for setting creation and modification dates.
	 *
	 * @return
	 */
	String dateTimeProviderRef() default "";
}
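For context, a minimal sketch of how the removed annotation is typically activated via JavaConfig. The configuration class and the "auditorProvider" bean name are illustrative assumptions, and the anonymous `AuditorAware` assumes the Spring Data 1.x signature that returns the auditor directly:

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.domain.AuditorAware;
import org.springframework.data.mongodb.config.EnableMongoAuditing;

// Illustrative configuration; "auditorProvider" matches the auditorAwareRef attribute declared above.
@Configuration
@EnableMongoAuditing(auditorAwareRef = "auditorProvider", modifyOnCreate = true)
class AuditingConfig {

	@Bean
	AuditorAware<String> auditorProvider() {
		return new AuditorAware<String>() {
			public String getCurrentAuditor() {
				// A fixed principal keeps the sketch self-contained; real code would consult the security context.
				return "system";
			}
		};
	}
}
```

The `auditorAwareRef` and `modifyOnCreate` values simply mirror the attributes the annotation declares.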
@@ -1,83 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013-2014 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.config;
|
||||
|
||||
import org.springframework.beans.factory.BeanDefinitionStoreException;
|
||||
import org.springframework.beans.factory.config.BeanDefinition;
|
||||
import org.springframework.beans.factory.support.AbstractBeanDefinition;
|
||||
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
|
||||
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
|
||||
import org.springframework.beans.factory.xml.BeanDefinitionParser;
|
||||
import org.springframework.beans.factory.xml.ParserContext;
|
||||
import org.springframework.data.config.BeanComponentDefinitionBuilder;
|
||||
import org.springframework.data.mongodb.gridfs.GridFsTemplate;
|
||||
import org.springframework.util.StringUtils;
|
||||
import org.w3c.dom.Element;
|
||||
|
||||
/**
|
||||
* {@link BeanDefinitionParser} to parse {@code gridFsTemplate} elements into {@link BeanDefinition}s.
|
||||
*
|
||||
* @author Martin Baumgartner
|
||||
*/
|
||||
class GridFsTemplateParser extends AbstractBeanDefinitionParser {
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
|
||||
*/
|
||||
@Override
|
||||
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
|
||||
throws BeanDefinitionStoreException {
|
||||
|
||||
String id = super.resolveId(element, definition, parserContext);
|
||||
return StringUtils.hasText(id) ? id : BeanNames.GRID_FS_TEMPLATE_BEAN_NAME;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
|
||||
*/
|
||||
@Override
|
||||
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
|
||||
|
||||
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
|
||||
|
||||
String converterRef = element.getAttribute("converter-ref");
|
||||
String dbFactoryRef = element.getAttribute("db-factory-ref");
|
||||
String bucket = element.getAttribute("bucket");
|
||||
|
||||
BeanDefinitionBuilder gridFsTemplateBuilder = BeanDefinitionBuilder.genericBeanDefinition(GridFsTemplate.class);
|
||||
|
||||
if (StringUtils.hasText(dbFactoryRef)) {
|
||||
gridFsTemplateBuilder.addConstructorArgReference(dbFactoryRef);
|
||||
} else {
|
||||
gridFsTemplateBuilder.addConstructorArgReference(BeanNames.DB_FACTORY_BEAN_NAME);
|
||||
}
|
||||
|
||||
if (StringUtils.hasText(converterRef)) {
|
||||
gridFsTemplateBuilder.addConstructorArgReference(converterRef);
|
||||
} else {
|
||||
gridFsTemplateBuilder.addConstructorArgReference(BeanNames.DEFAULT_CONVERTER_BEAN_NAME);
|
||||
}
|
||||
|
||||
if (StringUtils.hasText(bucket)) {
|
||||
gridFsTemplateBuilder.addConstructorArgValue(bucket);
|
||||
}
|
||||
|
||||
return (AbstractBeanDefinition) helper.getComponentIdButFallback(gridFsTemplateBuilder, BeanNames.GRID_FS_TEMPLATE_BEAN_NAME)
|
||||
.getBeanDefinition();
|
||||
}
|
||||
}
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2014 the original author or authors.
|
||||
* Copyright 2011-2012 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -30,7 +30,6 @@ import org.springframework.beans.factory.config.BeanDefinitionHolder;
|
||||
import org.springframework.beans.factory.config.RuntimeBeanReference;
|
||||
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
|
||||
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
|
||||
import org.springframework.beans.factory.parsing.ReaderContext;
|
||||
import org.springframework.beans.factory.support.AbstractBeanDefinition;
|
||||
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
|
||||
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
|
||||
@@ -52,7 +51,6 @@ import org.springframework.core.type.filter.TypeFilter;
|
||||
import org.springframework.data.annotation.Persistent;
|
||||
import org.springframework.data.config.BeanComponentDefinitionBuilder;
|
||||
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
|
||||
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
|
||||
import org.springframework.data.mongodb.core.convert.CustomConversions;
|
||||
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
|
||||
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
|
||||
@@ -71,13 +69,11 @@ import org.w3c.dom.Element;
|
||||
* @author Jon Brisbin
|
||||
* @author Oliver Gierke
|
||||
* @author Maciej Walkowiak
|
||||
* @author Thomas Darimont
|
||||
* @author Christoph Strobl
|
||||
*/
|
||||
public class MappingMongoConverterParser implements BeanDefinitionParser {
|
||||
|
||||
private static final String BASE_PACKAGE = "base-package";
|
||||
private static final boolean JSR_303_PRESENT = ClassUtils.isPresent("javax.validation.Validator",
|
||||
private static final boolean jsr303Present = ClassUtils.isPresent("javax.validation.Validator",
|
||||
MappingMongoConverterParser.class.getClassLoader());
|
||||
|
||||
/* (non-Javadoc)
|
||||
@@ -85,13 +81,10 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
|
||||
*/
|
||||
public BeanDefinition parse(Element element, ParserContext parserContext) {
|
||||
|
||||
if (parserContext.isNested()) {
|
||||
parserContext.getReaderContext().error("Mongo Converter must not be defined as nested bean.", element);
|
||||
}
|
||||
|
||||
BeanDefinitionRegistry registry = parserContext.getRegistry();
|
||||
|
||||
String id = element.getAttribute(AbstractBeanDefinitionParser.ID_ATTRIBUTE);
|
||||
id = StringUtils.hasText(id) ? id : DEFAULT_CONVERTER_BEAN_NAME;
|
||||
id = StringUtils.hasText(id) ? id : "mappingConverter";
|
||||
|
||||
parserContext.pushContainingComponent(new CompositeComponentDefinition("Mapping Mongo Converter", element));
|
||||
|
||||
@@ -103,7 +96,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
|
||||
// Need a reference to a Mongo instance
|
||||
String dbFactoryRef = element.getAttribute("db-factory-ref");
|
||||
if (!StringUtils.hasText(dbFactoryRef)) {
|
||||
dbFactoryRef = DB_FACTORY_BEAN_NAME;
|
||||
dbFactoryRef = DB_FACTORY;
|
||||
}
|
||||
|
||||
// Converter
|
||||
@@ -111,20 +104,15 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
|
||||
converterBuilder.addConstructorArgReference(dbFactoryRef);
|
||||
converterBuilder.addConstructorArgReference(ctxRef);
|
||||
|
||||
String typeMapperRef = element.getAttribute("type-mapper-ref");
|
||||
if (StringUtils.hasText(typeMapperRef)) {
|
||||
converterBuilder.addPropertyReference("typeMapper", typeMapperRef);
|
||||
}
|
||||
|
||||
if (conversionsDefinition != null) {
|
||||
converterBuilder.addPropertyValue("customConversions", conversionsDefinition);
|
||||
}
|
||||
|
||||
try {
|
||||
registry.getBeanDefinition(INDEX_HELPER_BEAN_NAME);
|
||||
registry.getBeanDefinition(INDEX_HELPER);
|
||||
} catch (NoSuchBeanDefinitionException ignored) {
|
||||
if (!StringUtils.hasText(dbFactoryRef)) {
|
||||
dbFactoryRef = DB_FACTORY_BEAN_NAME;
|
||||
dbFactoryRef = DB_FACTORY;
|
||||
}
|
||||
BeanDefinitionBuilder indexHelperBuilder = BeanDefinitionBuilder
|
||||
.genericBeanDefinition(MongoPersistentEntityIndexCreator.class);
|
||||
@@ -133,14 +121,14 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
|
||||
indexHelperBuilder.addDependsOn(ctxRef);
|
||||
|
||||
parserContext.registerBeanComponent(new BeanComponentDefinition(indexHelperBuilder.getBeanDefinition(),
|
||||
INDEX_HELPER_BEAN_NAME));
|
||||
INDEX_HELPER));
|
||||
}
|
||||
|
||||
BeanDefinition validatingMongoEventListener = potentiallyCreateValidatingMongoEventListener(element, parserContext);
|
||||
|
||||
if (validatingMongoEventListener != null) {
|
||||
parserContext.registerBeanComponent(new BeanComponentDefinition(validatingMongoEventListener,
|
||||
VALIDATING_EVENT_LISTENER_BEAN_NAME));
|
||||
VALIDATING_EVENT_LISTENER));
|
||||
}
|
||||
|
||||
parserContext.registerBeanComponent(new BeanComponentDefinition(converterBuilder.getBeanDefinition(), id));
|
||||
@@ -171,7 +159,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
|
||||
|
||||
private RuntimeBeanReference getValidator(Object source, ParserContext parserContext) {
|
||||
|
||||
if (!JSR_303_PRESENT) {
|
||||
if (!jsr303Present) {
|
||||
return null;
|
||||
}
|
||||
|
||||
@@ -185,7 +173,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
|
||||
return new RuntimeBeanReference(validatorName);
|
||||
}
|
||||
|
||||
public static String potentiallyCreateMappingContext(Element element, ParserContext parserContext,
|
||||
static String potentiallyCreateMappingContext(Element element, ParserContext parserContext,
|
||||
BeanDefinition conversionsDefinition, String converterId) {
|
||||
|
||||
String ctxRef = element.getAttribute("mapping-context-ref");
|
||||
@@ -200,8 +188,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
|
||||
BeanDefinitionBuilder mappingContextBuilder = BeanDefinitionBuilder
|
||||
.genericBeanDefinition(MongoMappingContext.class);
|
||||
|
||||
Set<String> classesToAdd = getInititalEntityClasses(element);
|
||||
|
||||
Set<String> classesToAdd = getInititalEntityClasses(element, mappingContextBuilder);
|
||||
if (classesToAdd != null) {
|
||||
mappingContextBuilder.addPropertyValue("initialEntitySet", classesToAdd);
|
||||
}
|
||||
@@ -214,43 +201,12 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
|
||||
mappingContextBuilder.addPropertyValue("simpleTypeHolder", simpleTypesDefinition);
|
||||
}
|
||||
|
||||
parseFieldNamingStrategy(element, parserContext.getReaderContext(), mappingContextBuilder);
|
||||
|
||||
ctxRef = converterId == null || DEFAULT_CONVERTER_BEAN_NAME.equals(converterId) ? MAPPING_CONTEXT_BEAN_NAME
|
||||
: converterId + "." + MAPPING_CONTEXT_BEAN_NAME;
|
||||
ctxRef = converterId + "." + MAPPING_CONTEXT;
|
||||
|
||||
parserContext.registerBeanComponent(componentDefinitionBuilder.getComponent(mappingContextBuilder, ctxRef));
|
||||
return ctxRef;
|
||||
}
|
||||
|
||||
private static void parseFieldNamingStrategy(Element element, ReaderContext context, BeanDefinitionBuilder builder) {
|
||||
|
||||
String abbreviateFieldNames = element.getAttribute("abbreviate-field-names");
|
||||
String fieldNamingStrategy = element.getAttribute("field-naming-strategy-ref");
|
||||
|
||||
boolean fieldNamingStrategyReferenced = StringUtils.hasText(fieldNamingStrategy);
|
||||
boolean abbreviationActivated = StringUtils.hasText(abbreviateFieldNames)
|
||||
&& Boolean.parseBoolean(abbreviateFieldNames);
|
||||
|
||||
if (fieldNamingStrategyReferenced && abbreviationActivated) {
|
||||
context.error("Field name abbreviation cannot be activated if a field-naming-strategy-ref is configured!",
|
||||
element);
|
||||
return;
|
||||
}
|
||||
|
||||
Object value = null;
|
||||
|
||||
if ("true".equals(abbreviateFieldNames)) {
|
||||
value = new RootBeanDefinition(CamelCaseAbbreviatingFieldNamingStrategy.class);
|
||||
} else if (fieldNamingStrategyReferenced) {
|
||||
value = new RuntimeBeanReference(fieldNamingStrategy);
|
||||
}
|
||||
|
||||
if (value != null) {
|
||||
builder.addPropertyValue("fieldNamingStrategy", value);
|
||||
}
|
||||
}
|
||||
|
||||
private BeanDefinition getCustomConversions(Element element, ParserContext parserContext) {
|
||||
|
||||
List<Element> customConvertersElements = DomUtils.getChildElementsByTagName(element, "custom-converters");
|
||||
@@ -293,7 +249,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
|
||||
return null;
|
||||
}
|
||||
|
||||
private static Set<String> getInititalEntityClasses(Element element) {
|
||||
private static Set<String> getInititalEntityClasses(Element element, BeanDefinitionBuilder builder) {
|
||||
|
||||
String basePackage = element.getAttribute(BASE_PACKAGE);
|
||||
|
||||
@@ -340,10 +296,9 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
|
||||
mappingContextStrategyFactoryBuilder.addConstructorArgReference(mappingContextRef);
|
||||
|
||||
BeanComponentDefinitionBuilder builder = new BeanComponentDefinitionBuilder(element, context);
|
||||
context.registerBeanComponent(builder.getComponent(mappingContextStrategyFactoryBuilder,
|
||||
IS_NEW_STRATEGY_FACTORY_BEAN_NAME));
|
||||
context.registerBeanComponent(builder.getComponent(mappingContextStrategyFactoryBuilder, IS_NEW_STRATEGY_FACTORY));
|
||||
|
||||
return IS_NEW_STRATEGY_FACTORY_BEAN_NAME;
|
||||
return IS_NEW_STRATEGY_FACTORY;
|
||||
}
|
||||
|
||||
/**
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2012-2014 the original author or authors.
|
||||
* Copyright 2012 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -15,19 +15,14 @@
|
||||
*/
|
||||
package org.springframework.data.mongodb.config;
|
||||
|
||||
import static org.springframework.data.config.ParsingUtils.*;
|
||||
import static org.springframework.data.mongodb.config.BeanNames.*;
|
||||
|
||||
import org.springframework.beans.factory.config.BeanDefinition;
|
||||
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
|
||||
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
|
||||
import org.springframework.beans.factory.support.RootBeanDefinition;
|
||||
import org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser;
|
||||
import org.springframework.beans.factory.xml.BeanDefinitionParser;
|
||||
import org.springframework.beans.factory.xml.ParserContext;
|
||||
import org.springframework.data.auditing.config.IsNewAwareAuditingHandlerBeanDefinitionParser;
|
||||
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
|
||||
import org.springframework.data.config.IsNewAwareAuditingHandlerBeanDefinitionParser;
|
||||
import org.springframework.data.mongodb.core.mapping.event.AuditingEventListener;
|
||||
import org.springframework.util.StringUtils;
|
||||
import org.w3c.dom.Element;
|
||||
|
||||
/**
|
||||
@@ -63,24 +58,23 @@ public class MongoAuditingBeanDefinitionParser extends AbstractSingleBeanDefinit
|
||||
@Override
|
||||
protected void doParse(Element element, ParserContext parserContext, BeanDefinitionBuilder builder) {
|
||||
|
||||
String mappingContextRef = element.getAttribute("mapping-context-ref");
|
||||
|
||||
if (!StringUtils.hasText(mappingContextRef)) {
|
||||
|
||||
BeanDefinitionRegistry registry = parserContext.getRegistry();
|
||||
|
||||
if (!registry.containsBeanDefinition(MAPPING_CONTEXT_BEAN_NAME)) {
|
||||
registry.registerBeanDefinition(MAPPING_CONTEXT_BEAN_NAME, new RootBeanDefinition(MongoMappingContext.class));
|
||||
if (!registry.containsBeanDefinition(BeanNames.IS_NEW_STRATEGY_FACTORY)) {
|
||||
|
||||
String mappingContextName = BeanNames.MAPPING_CONTEXT;
|
||||
|
||||
if (!registry.containsBeanDefinition(BeanNames.MAPPING_CONTEXT)) {
|
||||
mappingContextName = MappingMongoConverterParser.potentiallyCreateMappingContext(element, parserContext, null,
|
||||
BeanNames.DEFAULT_CONVERTER_BEAN_NAME);
|
||||
}
|
||||
|
||||
mappingContextRef = MAPPING_CONTEXT_BEAN_NAME;
|
||||
MappingMongoConverterParser.createIsNewStrategyFactoryBeanDefinition(mappingContextName, parserContext, element);
|
||||
}
|
||||
|
||||
IsNewAwareAuditingHandlerBeanDefinitionParser parser = new IsNewAwareAuditingHandlerBeanDefinitionParser(
|
||||
mappingContextRef);
|
||||
parser.parse(element, parserContext);
|
||||
BeanDefinitionParser parser = new IsNewAwareAuditingHandlerBeanDefinitionParser(BeanNames.IS_NEW_STRATEGY_FACTORY);
|
||||
BeanDefinition handlerBeanDefinition = parser.parse(element, parserContext);
|
||||
|
||||
builder.addConstructorArgValue(getObjectFactoryBeanDefinition(parser.getResolvedBeanName(),
|
||||
parserContext.extractSource(element)));
|
||||
builder.addConstructorArgValue(handlerBeanDefinition);
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,130 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013-2014 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.config;
|
||||
|
||||
import static org.springframework.beans.factory.config.BeanDefinition.*;
|
||||
import static org.springframework.data.mongodb.config.BeanNames.*;
|
||||
|
||||
import java.lang.annotation.Annotation;
|
||||
|
||||
import org.springframework.beans.factory.config.BeanDefinition;
|
||||
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
|
||||
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
|
||||
import org.springframework.beans.factory.support.RootBeanDefinition;
|
||||
import org.springframework.context.annotation.ImportBeanDefinitionRegistrar;
|
||||
import org.springframework.core.type.AnnotationMetadata;
|
||||
import org.springframework.data.auditing.IsNewAwareAuditingHandler;
|
||||
import org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport;
|
||||
import org.springframework.data.auditing.config.AuditingConfiguration;
|
||||
import org.springframework.data.config.ParsingUtils;
|
||||
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
|
||||
import org.springframework.data.mongodb.core.mapping.event.AuditingEventListener;
|
||||
import org.springframework.data.support.IsNewStrategyFactory;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
/**
|
||||
* {@link ImportBeanDefinitionRegistrar} to enable {@link EnableMongoAuditing} annotation.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAnnotation()
|
||||
*/
|
||||
@Override
|
||||
protected Class<? extends Annotation> getAnnotation() {
|
||||
return EnableMongoAuditing.class;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditingHandlerBeanName()
|
||||
*/
|
||||
@Override
|
||||
protected String getAuditingHandlerBeanName() {
|
||||
return "mongoAuditingHandler";
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerBeanDefinitions(org.springframework.core.type.AnnotationMetadata, org.springframework.beans.factory.support.BeanDefinitionRegistry)
|
||||
*/
|
||||
@Override
|
||||
public void registerBeanDefinitions(AnnotationMetadata annotationMetadata, BeanDefinitionRegistry registry) {
|
||||
|
||||
Assert.notNull(annotationMetadata, "AnnotationMetadata must not be null!");
|
||||
Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
|
||||
|
||||
defaultDependenciesIfNecessary(registry, annotationMetadata);
|
||||
super.registerBeanDefinitions(annotationMetadata, registry);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditHandlerBeanDefinitionBuilder(org.springframework.data.auditing.config.AuditingConfiguration)
|
||||
*/
|
||||
@Override
|
||||
protected BeanDefinitionBuilder getAuditHandlerBeanDefinitionBuilder(AuditingConfiguration configuration) {
|
||||
|
||||
Assert.notNull(configuration, "AuditingConfiguration must not be null!");
|
||||
|
||||
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(IsNewAwareAuditingHandler.class);
|
||||
builder.addConstructorArgReference(MAPPING_CONTEXT_BEAN_NAME);
|
||||
return configureDefaultAuditHandlerAttributes(configuration, builder);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerAuditListener(org.springframework.beans.factory.config.BeanDefinition, org.springframework.beans.factory.support.BeanDefinitionRegistry)
|
||||
*/
|
||||
@Override
|
||||
protected void registerAuditListenerBeanDefinition(BeanDefinition auditingHandlerDefinition,
|
||||
BeanDefinitionRegistry registry) {
|
||||
|
||||
Assert.notNull(auditingHandlerDefinition, "BeanDefinition must not be null!");
|
||||
Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
|
||||
|
||||
BeanDefinitionBuilder listenerBeanDefinitionBuilder = BeanDefinitionBuilder
|
||||
.rootBeanDefinition(AuditingEventListener.class);
|
||||
listenerBeanDefinitionBuilder.addConstructorArgValue(ParsingUtils.getObjectFactoryBeanDefinition(
|
||||
getAuditingHandlerBeanName(), registry));
|
||||
|
||||
registerInfrastructureBeanWithId(listenerBeanDefinitionBuilder.getBeanDefinition(),
|
||||
AuditingEventListener.class.getName(), registry);
|
||||
}
|
||||
|
||||
/**
|
||||
* Register default bean definitions for a {@link MongoMappingContext} and an {@link IsNewStrategyFactory} in case we
|
||||
* don't find beans with the assumed names in the registry.
|
||||
*
|
||||
* @param registry the {@link BeanDefinitionRegistry} to use to register the components into.
|
||||
* @param source the source which the registered components shall be registered with
|
||||
*/
|
||||
private void defaultDependenciesIfNecessary(BeanDefinitionRegistry registry, Object source) {
|
||||
|
||||
if (!registry.containsBeanDefinition(MAPPING_CONTEXT_BEAN_NAME)) {
|
||||
|
||||
RootBeanDefinition definition = new RootBeanDefinition(MongoMappingContext.class);
|
||||
definition.setRole(ROLE_INFRASTRUCTURE);
|
||||
definition.setSource(source);
|
||||
|
||||
registry.registerBeanDefinition(MAPPING_CONTEXT_BEAN_NAME, definition);
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2014 by the original author(s).
|
||||
* Copyright 2011-2012 by the original author(s).
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -41,7 +41,6 @@ import com.mongodb.MongoURI;
|
||||
*
|
||||
* @author Jon Brisbin
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
|
||||
|
||||
@@ -54,7 +53,7 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
|
||||
throws BeanDefinitionStoreException {
|
||||
|
||||
String id = super.resolveId(element, definition, parserContext);
|
||||
return StringUtils.hasText(id) ? id : BeanNames.DB_FACTORY_BEAN_NAME;
|
||||
return StringUtils.hasText(id) ? id : BeanNames.DB_FACTORY;
|
||||
}
|
||||
|
||||
/*
|
||||
@@ -71,7 +70,6 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
|
||||
String uri = element.getAttribute("uri");
|
||||
String mongoRef = element.getAttribute("mongo-ref");
|
||||
String dbname = element.getAttribute("dbname");
|
||||
|
||||
BeanDefinition userCredentials = getUserCredentialsBeanDefinition(element, parserContext);
|
||||
|
||||
// Common setup
|
||||
@@ -94,16 +92,19 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
|
||||
dbFactoryBuilder.addConstructorArgValue(registerMongoBeanDefinition(element, parserContext));
|
||||
}
|
||||
|
||||
dbFactoryBuilder.addConstructorArgValue(StringUtils.hasText(dbname) ? dbname : "db");
|
||||
dbname = StringUtils.hasText(dbname) ? dbname : "db";
|
||||
dbFactoryBuilder.addConstructorArgValue(dbname);
|
||||
|
||||
if (userCredentials != null) {
|
||||
dbFactoryBuilder.addConstructorArgValue(userCredentials);
|
||||
dbFactoryBuilder.addConstructorArgValue(element.getAttribute("authentication-dbname"));
|
||||
}
|
||||
|
||||
BeanDefinitionBuilder writeConcernPropertyEditorBuilder = getWriteConcernPropertyEditorBuilder();
|
||||
|
||||
BeanComponentDefinition component = helper.getComponent(writeConcernPropertyEditorBuilder);
|
||||
parserContext.registerBeanComponent(component);
|
||||
|
||||
return (AbstractBeanDefinition) helper.getComponentIdButFallback(dbFactoryBuilder, BeanNames.DB_FACTORY_BEAN_NAME)
|
||||
return (AbstractBeanDefinition) helper.getComponentIdButFallback(dbFactoryBuilder, BeanNames.DB_FACTORY)
|
||||
.getBeanDefinition();
|
||||
}
|
||||
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2014 the original author or authors.
|
||||
* Copyright 2011-2012 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -16,12 +16,14 @@
|
||||
package org.springframework.data.mongodb.config;
|
||||
|
||||
import org.springframework.beans.factory.xml.NamespaceHandlerSupport;
|
||||
import org.springframework.data.mongodb.repository.config.MongoRepositoryConfigurationExtension;
|
||||
import org.springframework.data.repository.config.RepositoryBeanDefinitionParser;
|
||||
import org.springframework.data.repository.config.RepositoryConfigurationExtension;
|
||||
|
||||
/**
|
||||
* {@link org.springframework.beans.factory.xml.NamespaceHandler} for Mongo DB configuration.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
* @author Martin Baumgartner
|
||||
*/
|
||||
public class MongoNamespaceHandler extends NamespaceHandlerSupport {
|
||||
|
||||
@@ -31,12 +33,14 @@ public class MongoNamespaceHandler extends NamespaceHandlerSupport {
|
||||
*/
|
||||
public void init() {
|
||||
|
||||
RepositoryConfigurationExtension extension = new MongoRepositoryConfigurationExtension();
|
||||
RepositoryBeanDefinitionParser repositoryBeanDefinitionParser = new RepositoryBeanDefinitionParser(extension);
|
||||
|
||||
registerBeanDefinitionParser("repositories", repositoryBeanDefinitionParser);
|
||||
registerBeanDefinitionParser("mapping-converter", new MappingMongoConverterParser());
|
||||
registerBeanDefinitionParser("mongo", new MongoParser());
|
||||
registerBeanDefinitionParser("db-factory", new MongoDbFactoryParser());
|
||||
registerBeanDefinitionParser("jmx", new MongoJmxParser());
|
||||
registerBeanDefinitionParser("auditing", new MongoAuditingBeanDefinitionParser());
|
||||
registerBeanDefinitionParser("template", new MongoTemplateParser());
|
||||
registerBeanDefinitionParser("gridFsTemplate", new GridFsTemplateParser());
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2014 the original author or authors.
|
||||
* Copyright 2011-2012 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -58,7 +58,7 @@ public class MongoParser implements BeanDefinitionParser {
|
||||
MongoParsingUtils.parseMongoOptions(element, builder);
|
||||
MongoParsingUtils.parseReplicaSet(element, builder);
|
||||
|
||||
String defaultedId = StringUtils.hasText(id) ? id : BeanNames.MONGO_BEAN_NAME;
|
||||
String defaultedId = StringUtils.hasText(id) ? id : BeanNames.MONGO;
|
||||
|
||||
parserContext.pushContainingComponent(new CompositeComponentDefinition("Mongo", source));
|
||||
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2013 the original author or authors.
|
||||
* Copyright 2011-2012 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -33,7 +33,6 @@ import org.w3c.dom.Element;
|
||||
*
|
||||
* @author Mark Pollack
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
abstract class MongoParsingUtils {
|
||||
|
||||
@@ -80,8 +79,6 @@ abstract class MongoParsingUtils {
|
||||
setPropertyValue(optionsDefBuilder, optionsElement, "write-timeout", "writeTimeout");
|
||||
setPropertyValue(optionsDefBuilder, optionsElement, "write-fsync", "writeFsync");
|
||||
setPropertyValue(optionsDefBuilder, optionsElement, "slave-ok", "slaveOk");
|
||||
setPropertyValue(optionsDefBuilder, optionsElement, "ssl", "ssl");
|
||||
setPropertyReference(optionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
|
||||
|
||||
mongoBuilder.addPropertyValue("mongoOptions", optionsDefBuilder.getBeanDefinition());
|
||||
return true;
|
||||
|
||||
@@ -1,87 +0,0 @@
|
||||
/*
|
||||
* Copyright 2011-2014 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.config;
|
||||
|
||||
import static org.springframework.data.config.ParsingUtils.*;
|
||||
import static org.springframework.data.mongodb.config.MongoParsingUtils.*;
|
||||
|
||||
import org.springframework.beans.factory.BeanDefinitionStoreException;
|
||||
import org.springframework.beans.factory.config.BeanDefinition;
|
||||
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
|
||||
import org.springframework.beans.factory.support.AbstractBeanDefinition;
|
||||
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
|
||||
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
|
||||
import org.springframework.beans.factory.xml.BeanDefinitionParser;
|
||||
import org.springframework.beans.factory.xml.ParserContext;
|
||||
import org.springframework.data.config.BeanComponentDefinitionBuilder;
|
||||
import org.springframework.data.mongodb.core.MongoTemplate;
|
||||
import org.springframework.util.StringUtils;
|
||||
import org.w3c.dom.Element;
|
||||
|
||||
/**
|
||||
* {@link BeanDefinitionParser} to parse {@code template} elements into {@link BeanDefinition}s.
|
||||
*
|
||||
* @author Martin Baumgartner
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
class MongoTemplateParser extends AbstractBeanDefinitionParser {
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
|
||||
*/
|
||||
@Override
|
||||
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
|
||||
throws BeanDefinitionStoreException {
|
||||
|
||||
String id = super.resolveId(element, definition, parserContext);
|
||||
return StringUtils.hasText(id) ? id : BeanNames.MONGO_TEMPLATE_BEAN_NAME;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
|
||||
*/
|
||||
@Override
|
||||
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
|
||||
|
||||
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
|
||||
|
||||
String converterRef = element.getAttribute("converter-ref");
|
||||
String dbFactoryRef = element.getAttribute("db-factory-ref");
|
||||
|
||||
BeanDefinitionBuilder mongoTemplateBuilder = BeanDefinitionBuilder.genericBeanDefinition(MongoTemplate.class);
|
||||
setPropertyValue(mongoTemplateBuilder, element, "write-concern", "writeConcern");
|
||||
|
||||
if (StringUtils.hasText(dbFactoryRef)) {
|
||||
mongoTemplateBuilder.addConstructorArgReference(dbFactoryRef);
|
||||
} else {
|
||||
mongoTemplateBuilder.addConstructorArgReference(BeanNames.DB_FACTORY_BEAN_NAME);
|
||||
}
|
||||
|
||||
if (StringUtils.hasText(converterRef)) {
|
||||
mongoTemplateBuilder.addConstructorArgReference(converterRef);
|
||||
}
|
||||
|
||||
BeanDefinitionBuilder writeConcernPropertyEditorBuilder = getWriteConcernPropertyEditorBuilder();
|
||||
|
||||
BeanComponentDefinition component = helper.getComponent(writeConcernPropertyEditorBuilder);
|
||||
parserContext.registerBeanComponent(component);
|
||||
|
||||
return (AbstractBeanDefinition) helper.getComponentIdButFallback(mongoTemplateBuilder,
|
||||
BeanNames.MONGO_TEMPLATE_BEAN_NAME).getBeanDefinition();
|
||||
}
|
||||
}
|
||||
@@ -16,14 +16,12 @@
package org.springframework.data.mongodb.config;

import java.beans.PropertyEditorSupport;
import java.net.InetAddress;
import java.net.UnknownHostException;
import java.util.HashSet;
import java.util.Set;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;

import com.mongodb.ServerAddress;
@@ -37,11 +35,6 @@ import com.mongodb.ServerAddress;
 */
public class ServerAddressPropertyEditor extends PropertyEditorSupport {

	/**
	 * A port is a number without a leading 0 at the end of the address that is proceeded by just a single :.
	 */
	private static final String HOST_PORT_SPLIT_PATTERN = "(?<!:):(?=[123456789]\\d*$)";
	private static final String COULD_NOT_PARSE_ADDRESS_MESSAGE = "Could not parse address {} '{}'. Check your replica set configuration!";
	private static final Logger LOG = LoggerFactory.getLogger(ServerAddressPropertyEditor.class);

	/*
@@ -84,53 +77,22 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
	 */
	private ServerAddress parseServerAddress(String source) {

		if (!StringUtils.hasText(source)) {
			LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "source", source);
			return null;
		}
		String[] hostAndPort = StringUtils.delimitedListToStringArray(source.trim(), ":");

		String[] hostAndPort = extractHostAddressAndPort(source.trim());

		if (hostAndPort.length > 2) {
			LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "source", source);
		if (!StringUtils.hasText(source) || hostAndPort.length > 2) {
			LOG.warn("Could not parse address source '{}'. Check your replica set configuration!", source);
			return null;
		}

		try {
			InetAddress hostAddress = InetAddress.getByName(hostAndPort[0]);
			Integer port = hostAndPort.length == 1 ? null : Integer.parseInt(hostAndPort[1]);

			return port == null ? new ServerAddress(hostAddress) : new ServerAddress(hostAddress, port);
			return hostAndPort.length == 1 ? new ServerAddress(hostAndPort[0]) : new ServerAddress(hostAndPort[0],
					Integer.parseInt(hostAndPort[1]));
		} catch (UnknownHostException e) {
			LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "host", hostAndPort[0]);
			LOG.warn("Could not parse host '{}'. Check your replica set configuration!", hostAndPort[0]);
		} catch (NumberFormatException e) {
			LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "port", hostAndPort[1]);
			LOG.warn("Could not parse port '{}'. Check your replica set configuration!", hostAndPort[1]);
		}

		return null;
	}

	/**
	 * Extract the host and port from the given {@link String}.
	 *
	 * @param addressAndPortSource must not be {@literal null}.
	 * @return
	 */
	private String[] extractHostAddressAndPort(String addressAndPortSource) {

		Assert.notNull(addressAndPortSource, "Address and port source must not be null!");

		String[] hostAndPort = addressAndPortSource.split(HOST_PORT_SPLIT_PATTERN);
		String hostAddress = hostAndPort[0];

		if (isHostAddressInIPv6BracketNotation(hostAddress)) {
			hostAndPort[0] = hostAddress.substring(1, hostAddress.length() - 1);
		}

		return hostAndPort;
	}

	private boolean isHostAddressInIPv6BracketNotation(String hostAddress) {
		return hostAddress.startsWith("[") && hostAddress.endsWith("]");
	}
}
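A small self-contained sketch of what `HOST_PORT_SPLIT_PATTERN` above does for plain and bracket-notation IPv6 addresses. The demo class and sample addresses are illustrative; the pattern string is copied from the constant shown in the diff:

```java
import java.util.Arrays;

// Demonstrates the host/port split used above.
public class HostPortSplitDemo {

	private static final String HOST_PORT_SPLIT_PATTERN = "(?<!:):(?=[123456789]\\d*$)";

	public static void main(String[] args) {
		// Plain host and port: [localhost, 27017]
		System.out.println(Arrays.toString("localhost:27017".split(HOST_PORT_SPLIT_PATTERN)));
		// Bracketed IPv6 host and port: [[fe80::1], 27017]; the brackets are stripped afterwards.
		System.out.println(Arrays.toString("[fe80::1]:27017".split(HOST_PORT_SPLIT_PATTERN)));
		// No trailing port: the address is returned untouched as a single element.
		System.out.println(Arrays.toString("localhost".split(HOST_PORT_SPLIT_PATTERN)));
	}
}
```

The lookbehind keeps IPv6 separators intact and the lookahead only treats a trailing digit sequence as a port, which is why the bracketed form splits into exactly two parts.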
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2014 the original author or authors.
|
||||
* Copyright 2011 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -15,8 +15,6 @@
|
||||
*/
|
||||
package org.springframework.data.mongodb.core;
|
||||
|
||||
import static org.springframework.data.domain.Sort.Direction.*;
|
||||
|
||||
import java.util.ArrayList;
|
||||
import java.util.List;
|
||||
|
||||
@@ -24,6 +22,7 @@ import org.springframework.dao.DataAccessException;
|
||||
import org.springframework.data.mongodb.core.index.IndexDefinition;
|
||||
import org.springframework.data.mongodb.core.index.IndexField;
|
||||
import org.springframework.data.mongodb.core.index.IndexInfo;
|
||||
import org.springframework.data.mongodb.core.query.Order;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
import com.mongodb.DBCollection;
|
||||
@@ -35,14 +34,9 @@ import com.mongodb.MongoException;
|
||||
*
|
||||
* @author Mark Pollack
|
||||
* @author Oliver Gierke
|
||||
* @author Komi Innocent
|
||||
* @author Christoph Strobl
|
||||
*/
|
||||
public class DefaultIndexOperations implements IndexOperations {
|
||||
|
||||
private static final Double ONE = Double.valueOf(1);
|
||||
private static final Double MINUS_ONE = Double.valueOf(-1);
|
||||
|
||||
private final MongoOperations mongoOperations;
|
||||
private final String collectionName;
|
||||
|
||||
@@ -141,24 +135,12 @@ public class DefaultIndexOperations implements IndexOperations {
|
||||
|
||||
Object value = keyDbObject.get(key);
|
||||
|
||||
if ("2d".equals(value)) {
|
||||
if (Integer.valueOf(1).equals(value)) {
|
||||
indexFields.add(IndexField.create(key, Order.ASCENDING));
|
||||
} else if (Integer.valueOf(-1).equals(value)) {
|
||||
indexFields.add(IndexField.create(key, Order.DESCENDING));
|
||||
} else if ("2d".equals(value)) {
|
||||
indexFields.add(IndexField.geo(key));
|
||||
} else if ("text".equals(value)) {
|
||||
|
||||
DBObject weights = (DBObject) ix.get("weights");
|
||||
for (String fieldName : weights.keySet()) {
|
||||
indexFields.add(IndexField.text(fieldName, Float.valueOf(weights.get(fieldName).toString())));
|
||||
}
|
||||
|
||||
} else {
|
||||
|
||||
Double keyValue = new Double(value.toString());
|
||||
|
||||
if (ONE.equals(keyValue)) {
|
||||
indexFields.add(IndexField.create(key, ASC));
|
||||
} else if (MINUS_ONE.equals(keyValue)) {
|
||||
indexFields.add(IndexField.create(key, DESC));
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
@@ -167,8 +149,8 @@ public class DefaultIndexOperations implements IndexOperations {
|
||||
boolean unique = ix.containsField("unique") ? (Boolean) ix.get("unique") : false;
|
||||
boolean dropDuplicates = ix.containsField("dropDups") ? (Boolean) ix.get("dropDups") : false;
|
||||
boolean sparse = ix.containsField("sparse") ? (Boolean) ix.get("sparse") : false;
|
||||
String language = ix.containsField("default_language") ? (String) ix.get("default_language") : "";
|
||||
indexInfoList.add(new IndexInfo(indexFields, name, unique, dropDuplicates, sparse, language));
|
||||
|
||||
indexInfoList.add(new IndexInfo(indexFields, name, unique, dropDuplicates, sparse));
|
||||
}
|
||||
|
||||
return indexInfoList;
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2013 the original author or authors.
|
||||
* Copyright 2011-2012 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -27,7 +27,6 @@ import com.mongodb.Mongo;
|
||||
* Mongo server administration exposed via JMX annotations
|
||||
*
|
||||
* @author Mark Pollack
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
@ManagedResource(description = "Mongo Admin Operations")
|
||||
public class MongoAdmin implements MongoAdminOperations {
|
||||
@@ -35,7 +34,6 @@ public class MongoAdmin implements MongoAdminOperations {
|
||||
private final Mongo mongo;
|
||||
private String username;
|
||||
private String password;
|
||||
private String authenticationDatabaseName;
|
||||
|
||||
public MongoAdmin(Mongo mongo) {
|
||||
Assert.notNull(mongo);
|
||||
@@ -84,16 +82,7 @@ public class MongoAdmin implements MongoAdminOperations {
|
||||
this.password = password;
|
||||
}
|
||||
|
||||
/**
|
||||
* Sets the authenticationDatabaseName to use to authenticate with the Mongo database.
|
||||
*
|
||||
* @param authenticationDatabaseName The authenticationDatabaseName to use.
|
||||
*/
|
||||
public void setAuthenticationDatabaseName(String authenticationDatabaseName) {
|
||||
this.authenticationDatabaseName = authenticationDatabaseName;
|
||||
}
|
||||
|
||||
DB getDB(String databaseName) {
|
||||
return MongoDbUtils.getDB(mongo, databaseName, new UserCredentials(username, password), authenticationDatabaseName);
|
||||
return MongoDbUtils.getDB(mongo, databaseName, new UserCredentials(username, password));
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,34 +1,16 @@
|
||||
/*
|
||||
* Copyright 2011-2014 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core;
|
||||
|
||||
import org.springframework.jmx.export.annotation.ManagedOperation;
|
||||
|
||||
/**
|
||||
* @author Mark Pollack
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
public interface MongoAdminOperations {
|
||||
|
||||
@ManagedOperation
|
||||
void dropDatabase(String databaseName);
|
||||
public abstract void dropDatabase(String databaseName);
|
||||
|
||||
@ManagedOperation
|
||||
void createDatabase(String databaseName);
|
||||
public abstract void createDatabase(String databaseName);
|
||||
|
||||
@ManagedOperation
|
||||
String getDatabaseStats(String databaseName);
|
||||
public abstract String getDatabaseStats(String databaseName);
|
||||
|
||||
}
|
||||
@@ -33,7 +33,6 @@ import com.mongodb.Mongo;
|
||||
* @author Graeme Rocher
|
||||
* @author Oliver Gierke
|
||||
* @author Randy Watler
|
||||
* @author Thomas Darimont
|
||||
* @since 1.0
|
||||
*/
|
||||
public abstract class MongoDbUtils {
|
||||
@@ -55,7 +54,7 @@ public abstract class MongoDbUtils {
|
||||
* @return the {@link DB} connection
|
||||
*/
|
||||
public static DB getDB(Mongo mongo, String databaseName) {
|
||||
return doGetDB(mongo, databaseName, UserCredentials.NO_CREDENTIALS, true, databaseName);
|
||||
return doGetDB(mongo, databaseName, UserCredentials.NO_CREDENTIALS, true);
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -67,22 +66,15 @@ public abstract class MongoDbUtils {
|
||||
* @return the {@link DB} connection
|
||||
*/
|
||||
public static DB getDB(Mongo mongo, String databaseName, UserCredentials credentials) {
|
||||
return getDB(mongo, databaseName, credentials, databaseName);
|
||||
}
|
||||
|
||||
public static DB getDB(Mongo mongo, String databaseName, UserCredentials credentials,
|
||||
String authenticationDatabaseName) {
|
||||
|
||||
Assert.notNull(mongo, "No Mongo instance specified!");
|
||||
Assert.hasText(databaseName, "Database name must be given!");
|
||||
Assert.notNull(credentials, "Credentials must not be null, use UserCredentials.NO_CREDENTIALS!");
|
||||
Assert.hasText(authenticationDatabaseName, "Authentication database name must not be null or empty!");
|
||||
|
||||
return doGetDB(mongo, databaseName, credentials, true, authenticationDatabaseName);
|
||||
return doGetDB(mongo, databaseName, credentials, true);
|
||||
}
|
||||
|
||||
private static DB doGetDB(Mongo mongo, String databaseName, UserCredentials credentials, boolean allowCreate,
|
||||
String authenticationDatabaseName) {
|
||||
private static DB doGetDB(Mongo mongo, String databaseName, UserCredentials credentials, boolean allowCreate) {
|
||||
|
||||
DbHolder dbHolder = (DbHolder) TransactionSynchronizationManager.getResource(mongo);
|
||||
|
||||
@@ -111,16 +103,14 @@ public abstract class MongoDbUtils {
|
||||
DB db = mongo.getDB(databaseName);
|
||||
boolean credentialsGiven = credentials.hasUsername() && credentials.hasPassword();
|
||||
|
||||
DB authDb = databaseName.equals(authenticationDatabaseName) ? db : mongo.getDB(authenticationDatabaseName);
|
||||
synchronized (db) {
|
||||
|
||||
synchronized (authDb) {
|
||||
|
||||
if (credentialsGiven && !authDb.isAuthenticated()) {
|
||||
if (credentialsGiven && !db.isAuthenticated()) {
|
||||
|
||||
String username = credentials.getUsername();
|
||||
String password = credentials.hasPassword() ? credentials.getPassword() : null;
|
||||
|
||||
if (!authDb.authenticate(username, password == null ? null : password.toCharArray())) {
|
||||
if (!db.authenticate(username, password == null ? null : password.toCharArray())) {
|
||||
throw new CannotGetMongoDbConnectionException("Failed to authenticate to database [" + databaseName + "], "
|
||||
+ credentials.toString(), databaseName, credentials);
|
||||
}
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
* Copyright 2010-2011 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -13,8 +13,8 @@
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
/**
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
package org.springframework.data.mongodb.util;
|
||||
package org.springframework.data.mongodb.core;
|
||||
|
||||
public interface MongoDocumentWriter {
|
||||
|
||||
}
|
||||
@@ -23,15 +23,11 @@ import org.springframework.dao.InvalidDataAccessResourceUsageException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.UncategorizedMongoDbException;

import com.mongodb.MongoCursorNotFoundException;
import com.mongodb.MongoException;
import com.mongodb.MongoException.CursorNotFound;
import com.mongodb.MongoException.DuplicateKey;
import com.mongodb.MongoException.Network;
import com.mongodb.MongoInternalException;
import com.mongodb.MongoServerSelectionException;
import com.mongodb.MongoSocketException;
import com.mongodb.MongoTimeoutException;

/**
 * Simple {@link PersistenceExceptionTranslator} for Mongo. Convert the given runtime exception to an appropriate
@@ -51,23 +47,16 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator

		// Check for well-known MongoException subclasses.

		if (ex instanceof DuplicateKey || ex instanceof DuplicateKeyException) {
		// All other MongoExceptions
		if (ex instanceof DuplicateKey) {
			return new DuplicateKeyException(ex.getMessage(), ex);
		}

		if (ex instanceof Network || ex instanceof MongoSocketException) {
		if (ex instanceof Network) {
			return new DataAccessResourceFailureException(ex.getMessage(), ex);
		}

		if (ex instanceof CursorNotFound || ex instanceof MongoCursorNotFoundException) {
			return new DataAccessResourceFailureException(ex.getMessage(), ex);
		}

		if (ex instanceof MongoServerSelectionException) {
			return new DataAccessResourceFailureException(ex.getMessage(), ex);
		}

		if (ex instanceof MongoTimeoutException) {
		if (ex instanceof CursorNotFound) {
			return new DataAccessResourceFailureException(ex.getMessage(), ex);
		}

@@ -75,7 +64,6 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
			return new InvalidDataAccessResourceUsageException(ex.getMessage(), ex);
		}

		// All other MongoExceptions
		if (ex instanceof MongoException) {

			int code = ((MongoException) ex).getCode();
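As a usage sketch (not part of the change set), the translator is consulted like any other Spring PersistenceExceptionTranslator; the wrapper class below is an illustrative assumption:

```java
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.core.MongoExceptionTranslator;

import com.mongodb.MongoException;

// Illustrative caller; a null result means the exception was not recognized and is rethrown as-is.
public class TranslationSketch {

	private final PersistenceExceptionTranslator translator = new MongoExceptionTranslator();

	public RuntimeException translate(MongoException ex) {
		DataAccessException translated = translator.translateExceptionIfPossible(ex);
		return translated != null ? translated : ex;
	}
}
```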
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2010-2013 the original author or authors.
|
||||
* Copyright 2010-2012 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -148,7 +148,6 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
|
||||
*/
|
||||
@SuppressWarnings("deprecation")
|
||||
public void afterPropertiesSet() throws Exception {
|
||||
|
||||
Mongo mongo;
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2014 the original author or authors.
|
||||
* Copyright 2010-2011 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -19,10 +19,8 @@ import java.util.Collection;
|
||||
import java.util.List;
|
||||
import java.util.Set;
|
||||
|
||||
import org.springframework.data.mongodb.core.aggregation.Aggregation;
|
||||
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
|
||||
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
|
||||
import org.springframework.data.mongodb.core.convert.MongoConverter;
|
||||
import org.springframework.data.mongodb.core.geo.GeoResult;
|
||||
import org.springframework.data.mongodb.core.geo.GeoResults;
|
||||
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
|
||||
import org.springframework.data.mongodb.core.mapreduce.GroupByResults;
|
||||
@@ -47,12 +45,7 @@ import com.mongodb.WriteResult;
|
||||
* @author Thomas Risberg
|
||||
* @author Mark Pollack
|
||||
* @author Oliver Gierke
|
||||
* @author Tobias Trelle
|
||||
* @author Chuong Ngo
|
||||
* @author Christoph Strobl
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
@SuppressWarnings("deprecation")
|
||||
public interface MongoOperations {
|
||||
|
||||
/**
|
||||
@@ -308,57 +301,6 @@ public interface MongoOperations {
|
||||
*/
|
||||
<T> GroupByResults<T> group(Criteria criteria, String inputCollectionName, GroupBy groupBy, Class<T> entityClass);
|
||||
|
||||
/**
|
||||
* Execute an aggregation operation. The raw results will be mapped to the given entity class. The name of the
|
||||
* inputCollection is derived from the inputType of the aggregation.
|
||||
*
|
||||
* @param aggregation The {@link TypedAggregation} specification holding the aggregation operations, must not be
|
||||
* {@literal null}.
|
||||
* @param collectionName The name of the input collection to use for the aggreation.
|
||||
* @param outputType The parameterized type of the returned list, must not be {@literal null}.
|
||||
* @return The results of the aggregation operation.
|
||||
* @since 1.3
|
||||
*/
|
||||
<O> AggregationResults<O> aggregate(TypedAggregation<?> aggregation, String collectionName, Class<O> outputType);
|
||||
|
||||
/**
|
||||
* Execute an aggregation operation. The raw results will be mapped to the given entity class. The name of the
|
||||
* inputCollection is derived from the inputType of the aggregation.
|
||||
*
|
||||
* @param aggregation The {@link TypedAggregation} specification holding the aggregation operations, must not be
|
||||
* {@literal null}.
|
||||
* @param outputType The parameterized type of the returned list, must not be {@literal null}.
|
||||
* @return The results of the aggregation operation.
|
||||
* @since 1.3
|
||||
*/
|
||||
<O> AggregationResults<O> aggregate(TypedAggregation<?> aggregation, Class<O> outputType);
|
||||
|
||||
/**
|
||||
* Execute an aggregation operation. The raw results will be mapped to the given entity class.
|
||||
*
|
||||
* @param aggregation The {@link Aggregation} specification holding the aggregation operations, must not be
|
||||
* {@literal null}.
|
||||
* @param inputType the inputType where the aggregation operation will read from, must not be {@literal null} or
|
||||
* empty.
|
||||
* @param outputType The parameterized type of the returned list, must not be {@literal null}.
|
||||
* @return The results of the aggregation operation.
|
||||
* @since 1.3
|
||||
*/
|
||||
<O> AggregationResults<O> aggregate(Aggregation aggregation, Class<?> inputType, Class<O> outputType);
|
||||
|
||||
/**
|
||||
* Execute an aggregation operation. The raw results will be mapped to the given entity class.
|
||||
*
|
||||
* @param aggregation The {@link Aggregation} specification holding the aggregation operations, must not be
|
||||
* {@literal null}.
|
||||
* @param collectionName the collection where the aggregation operation will read from, must not be {@literal null} or
|
||||
* empty.
|
||||
* @param outputType The parameterized type of the returned list, must not be {@literal null}.
|
||||
* @return The results of the aggregation operation.
|
||||
* @since 1.3
|
||||
*/
|
||||
<O> AggregationResults<O> aggregate(Aggregation aggregation, String collectionName, Class<O> outputType);
|
||||
|
||||
/**
|
||||
* Execute a map-reduce operation. The map-reduce operation will be formed with an output type of INLINE
|
||||
*
|
||||
@@ -415,7 +357,7 @@ public interface MongoOperations {
|
||||
MapReduceOptions mapReduceOptions, Class<T> entityClass);
|
||||
|
||||
/**
|
||||
* Returns {@link GeoResults} for all entities matching the given {@link NearQuery}. Will consider entity mapping
|
||||
* Returns {@link GeoResult} for all entities matching the given {@link NearQuery}. Will consider entity mapping
|
||||
* information to determine the collection the query is run against.
|
||||
*
|
||||
* @param near must not be {@literal null}.
|
||||
@@ -425,7 +367,7 @@ public interface MongoOperations {
|
||||
<T> GeoResults<T> geoNear(NearQuery near, Class<T> entityClass);
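A brief geoNear sketch; the Restaurant type, field names and coordinates are assumptions, and the GeoResult/GeoResults types are imported from org.springframework.data.mongodb.core.geo to match the imports shown above (later versions move them to org.springframework.data.geo):

```java
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.query.NearQuery;

public class GeoNearExample {

  // Hypothetical entity stored in a collection carrying a geospatial index on "location".
  static class Restaurant {
    String name;
    double[] location;
  }

  public static void printNearbyRestaurants(MongoOperations operations) {

    // Find entities near the given coordinates, limited to a maximum distance.
    NearQuery nearQuery = NearQuery.near(-73.99, 40.73).maxDistance(5);
    GeoResults<Restaurant> results = operations.geoNear(nearQuery, Restaurant.class);

    for (GeoResult<Restaurant> result : results) {
      System.out.println(result.getContent().name + " is " + result.getDistance() + " away");
    }
  }
}
```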
|
||||
|
||||
/**
|
||||
* Returns {@link GeoResults} for all entities matching the given {@link NearQuery}.
|
||||
* Returns {@link GeoResult} for all entities matching the given {@link NearQuery}.
|
||||
*
|
||||
* @param near must not be {@literal null}.
|
||||
* @param entityClass must not be {@literal null}.
|
||||
@@ -466,38 +408,11 @@ public interface MongoOperations {
|
||||
* specification
|
||||
* @param entityClass the parameterized type of the returned list.
|
||||
* @param collectionName name of the collection to retrieve the objects from
|
||||
*
|
||||
* @return the converted object
|
||||
*/
|
||||
<T> T findOne(Query query, Class<T> entityClass, String collectionName);
|
||||
|
||||
/**
|
||||
* Determine whether the result of the given {@link Query} contains at least one element.
|
||||
*
|
||||
* @param query the {@link Query} class that specifies the criteria used to find a record.
|
||||
* @param collectionName name of the collection to check for objects.
|
||||
* @return {@literal true} if the query yields at least one document in the given collection.
|
||||
*/
|
||||
boolean exists(Query query, String collectionName);
|
||||
|
||||
/**
|
||||
* Determine whether the result of the given {@link Query} contains at least one element.
|
||||
*
|
||||
* @param query the {@link Query} class that specifies the criteria used to find a record.
|
||||
* @param entityClass the parameterized type.
|
||||
* @return {@literal true} if the query yields at least one document.
|
||||
*/
|
||||
boolean exists(Query query, Class<?> entityClass);
|
||||
|
||||
/**
|
||||
* Determine whether the result of the given {@link Query} contains at least one element.
|
||||
*
|
||||
* @param query the {@link Query} class that specifies the criteria used to find a record.
|
||||
* @param entityClass the parameterized type.
|
||||
* @param collectionName name of the collection to check for objects.
|
||||
* @return {@literal true} if the query yields at least one document in the given collection.
|
||||
*/
|
||||
boolean exists(Query query, Class<?> entityClass, String collectionName);
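For example, a minimal exists(...) check could look like the following; the User class, the "users" collection and the injected operations instance are assumptions for illustration:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.MongoOperations;

public class ExistsExample {

  // Hypothetical entity type; it is only used to help convert an id present in the query.
  static class User {
    String username;
    boolean active;
  }

  public static boolean hasActiveUsers(MongoOperations operations) {
    // true as soon as at least one document in "users" matches the criteria.
    return operations.exists(query(where("active").is(true)), User.class, "users");
  }
}
```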
|
||||
|
||||
/**
|
||||
* Map the results of an ad-hoc query on the collection for the entity class to a List of the specified type.
|
||||
* <p/>
|
||||
@@ -527,6 +442,7 @@ public interface MongoOperations {
|
||||
* specification
|
||||
* @param entityClass the parameterized type of the returned list.
|
||||
* @param collectionName name of the collection to retrieve the objects from
|
||||
*
|
||||
* @return the List of converted objects
|
||||
*/
|
||||
<T> List<T> find(Query query, Class<T> entityClass, String collectionName);
|
||||
@@ -548,63 +464,18 @@ public interface MongoOperations {
|
||||
* @param id the id of the document to return
|
||||
* @param entityClass the type to convert the document to
|
||||
* @param collectionName the collection to query for the document
|
||||
*
|
||||
* @param <T>
|
||||
* @return
|
||||
*/
|
||||
<T> T findById(Object id, Class<T> entityClass, String collectionName);
|
||||
|
||||
/**
|
||||
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify</a>
|
||||
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
|
||||
*
|
||||
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
|
||||
* fields specification.
|
||||
* @param update the {@link Update} to apply on matching documents.
|
||||
* @param entityClass the parameterized type.
|
||||
* @return
|
||||
*/
|
||||
<T> T findAndModify(Query query, Update update, Class<T> entityClass);
|
||||
|
||||
/**
|
||||
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify</a>
|
||||
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
|
||||
*
|
||||
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
|
||||
* fields specification.
|
||||
* @param update the {@link Update} to apply on matching documents.
|
||||
* @param entityClass the parameterized type.
|
||||
* @param collectionName the collection to query.
|
||||
* @return
|
||||
*/
|
||||
<T> T findAndModify(Query query, Update update, Class<T> entityClass, String collectionName);
|
||||
|
||||
/**
|
||||
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify</a>
|
||||
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
|
||||
* {@link FindAndModifyOptions} into account.
|
||||
*
|
||||
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
|
||||
* fields specification.
|
||||
* @param update the {@link Update} to apply on matching documents.
|
||||
* @param options the {@link FindAndModifyOptions} holding additional information.
|
||||
* @param entityClass the parameterized type.
|
||||
* @return
|
||||
*/
|
||||
<T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass);
|
||||
|
||||
/**
|
||||
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify</a>
|
||||
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
|
||||
* {@link FindAndModifyOptions} into account.
|
||||
*
|
||||
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
|
||||
* fields specification.
|
||||
* @param update the {@link Update} to apply on matching documents.
|
||||
* @param options the {@link FindAndModifyOptions} holding additional information.
|
||||
* @param entityClass the parameterized type.
|
||||
* @param collectionName the collection to query.
|
||||
* @return
|
||||
*/
|
||||
<T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass,
|
||||
String collectionName);
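A short findAndModify sketch using the FindAndModifyOptions variant declared above; the Account class, the field names and the "accounts" collection are illustrative assumptions:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.FindAndModifyOptions;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Update;

public class FindAndModifyExample {

  // Hypothetical entity.
  static class Account {
    String id;
    String owner;
    int loginCount;
  }

  public static Account recordLogin(MongoOperations operations, String owner) {
    // Atomically increments the counter and, thanks to returnNew(true), returns the modified document.
    return operations.findAndModify(query(where("owner").is(owner)), new Update().inc("loginCount", 1),
        FindAndModifyOptions.options().returnNew(true), Account.class, "accounts");
  }
}
```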
|
||||
|
||||
@@ -639,6 +510,7 @@ public interface MongoOperations {
|
||||
* specification
|
||||
* @param entityClass the parameterized type of the returned list.
|
||||
* @param collectionName name of the collection to retrieve the objects from
|
||||
*
|
||||
* @return the converted object
|
||||
*/
|
||||
<T> T findAndRemove(Query query, Class<T> entityClass, String collectionName);
|
||||
@@ -668,9 +540,9 @@ public interface MongoOperations {
|
||||
* <p/>
|
||||
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
|
||||
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
|
||||
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
|
||||
* href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert"
|
||||
* >Spring's Type Conversion"</a> for more details.
|
||||
* property type will be handled by Spring's BeanWrapper class that leverages Spring 3.0's new Type Conversion API.
|
||||
* See <a href="http://static.springsource.org/spring/docs/3.0.x/reference/validation.html#core-convert">Spring 3 Type
|
||||
* Conversion"</a> for more details.
|
||||
* <p/>
|
||||
* <p/>
|
||||
* Insert is used to initially store the object into the database. To update an existing object use the save method.
|
||||
@@ -725,9 +597,9 @@ public interface MongoOperations {
|
||||
* <p/>
|
||||
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
|
||||
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
|
||||
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
|
||||
* href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert"
|
||||
* >Spring's Type Conversion"</a> for more details.
|
||||
* property type will be handled by Spring's BeanWrapper class that leverages Spring 3.0's new Type Conversion API.
|
||||
* See <a href="http://static.springsource.org/spring/docs/3.0.x/reference/validation.html#core-convert">Spring 3 Type
|
||||
* Conversion"</a> for more details.
|
||||
*
|
||||
* @param objectToSave the object to store in the collection
|
||||
*/
|
||||
@@ -742,9 +614,9 @@ public interface MongoOperations {
|
||||
* <p/>
|
||||
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
|
||||
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
|
||||
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
|
||||
* href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert">Spring's
|
||||
* Type Conversion"</a> for more details.
|
||||
* property type will be handled by Spring's BeanWrapper class that leverages Spring 3.0's new Type Conversion API.
|
||||
* See <a href="http://static.springsource.org/spring/docs/3.0.x/reference/validation.html#core-convert">Spring 3 Type
|
||||
* Conversion"</a> for more details.
|
||||
*
|
||||
* @param objectToSave the object to store in the collection
|
||||
* @param collectionName name of the collection to store the object in
|
||||
@@ -774,18 +646,6 @@ public interface MongoOperations {
|
||||
*/
|
||||
WriteResult upsert(Query query, Update update, String collectionName);
|
||||
|
||||
/**
|
||||
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
|
||||
* combining the query document and the update document.
|
||||
*
|
||||
* @param query the query document that specifies the criteria used to select a record to be upserted
|
||||
* @param update the update document that contains the updated object or $ operators to manipulate the existing object
|
||||
* @param entityClass class of the pojo to be operated on
|
||||
* @param collectionName name of the collection to update the object in
|
||||
* @return the WriteResult which lets you access the results of the previous write.
|
||||
*/
|
||||
WriteResult upsert(Query query, Update update, Class<?> entityClass, String collectionName);
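Upserts are commonly used for counters and other create-or-accumulate documents; a minimal sketch against the overload taking only a collection name (the collection and field names are assumptions):

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Update;

import com.mongodb.WriteResult;

public class UpsertExample {

  public static WriteResult incrementPageViews(MongoOperations operations, String page) {
    // Creates the counter document on first use and increments it on every subsequent call.
    return operations.upsert(query(where("_id").is(page)), new Update().inc("views", 1), "pageCounters");
  }
}
```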
|
||||
|
||||
/**
|
||||
* Updates the first object that is found in the collection of the entity class that matches the query document with
|
||||
* the provided update document.
|
||||
@@ -810,19 +670,6 @@ public interface MongoOperations {
|
||||
*/
|
||||
WriteResult updateFirst(Query query, Update update, String collectionName);
|
||||
|
||||
/**
|
||||
* Updates the first object that is found in the specified collection that matches the query document criteria with
|
||||
* the provided updated document.
|
||||
*
|
||||
* @param query the query document that specifies the criteria used to select a record to be updated
|
||||
* @param update the update document that contains the updated object or $ operators to manipulate the existing
|
||||
* object.
|
||||
* @param entityClass class of the pojo to be operated on
|
||||
* @param collectionName name of the collection to update the object in
|
||||
* @return the WriteResult which lets you access the results of the previous write.
|
||||
*/
|
||||
WriteResult updateFirst(Query query, Update update, Class<?> entityClass, String collectionName);
|
||||
|
||||
/**
|
||||
* Updates all objects that are found in the collection for the entity class that matches the query document criteria
|
||||
* with the provided updated document.
|
||||
@@ -847,25 +694,12 @@ public interface MongoOperations {
|
||||
*/
|
||||
WriteResult updateMulti(Query query, Update update, String collectionName);
|
||||
|
||||
/**
|
||||
* Updates all objects that are found in the collection for the entity class that matches the query document criteria
|
||||
* with the provided updated document.
|
||||
*
|
||||
* @param query the query document that specifies the criteria used to select a record to be updated
|
||||
* @param update the update document that contains the updated object or $ operators to manipulate the existing
|
||||
* object.
|
||||
* @param entityClass class of the pojo to be operated on
|
||||
* @param collectionName name of the collection to update the object in
|
||||
* @return the WriteResult which lets you access the results of the previous write.
|
||||
*/
|
||||
WriteResult updateMulti(final Query query, final Update update, Class<?> entityClass, String collectionName);
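The updateFirst/updateMulti overloads differ only in whether a single matching document or all matching documents are modified; a rough sketch (the Order entity, field values and the "orders" collection are assumptions):

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;
import static org.springframework.data.mongodb.core.query.Update.update;

import org.springframework.data.mongodb.core.MongoOperations;

import com.mongodb.WriteResult;

public class UpdateMultiExample {

  // Hypothetical entity.
  static class Order {
    String id;
    String status;
  }

  public static int markProcessed(MongoOperations operations) {
    // Updates every order currently in state NEW; getN() reports how many documents were touched.
    WriteResult result = operations.updateMulti(query(where("status").is("NEW")),
        update("status", "PROCESSED"), Order.class, "orders");
    return result.getN();
  }
}
```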
|
||||
|
||||
/**
|
||||
* Remove the given object from the collection by id.
|
||||
*
|
||||
* @param object
|
||||
*/
|
||||
WriteResult remove(Object object);
|
||||
void remove(Object object);
|
||||
|
||||
/**
|
||||
* Removes the given object from the given collection.
|
||||
@@ -873,26 +707,17 @@ public interface MongoOperations {
|
||||
* @param object
|
||||
* @param collection must not be {@literal null} or empty.
|
||||
*/
|
||||
WriteResult remove(Object object, String collection);
|
||||
void remove(Object object, String collection);
|
||||
|
||||
/**
|
||||
* Remove all documents that match the provided query document criteria from the collection used to store the
|
||||
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
|
||||
*
|
||||
* @param <T>
|
||||
* @param query
|
||||
* @param entityClass
|
||||
*/
|
||||
WriteResult remove(Query query, Class<?> entityClass);
|
||||
|
||||
/**
|
||||
* Remove all documents that match the provided query document criteria from the collection used to store the
|
||||
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
|
||||
*
|
||||
* @param query
|
||||
* @param entityClass
|
||||
* @param collectionName
|
||||
*/
|
||||
WriteResult remove(Query query, Class<?> entityClass, String collectionName);
|
||||
<T> void remove(Query query, Class<T> entityClass);
|
||||
|
||||
/**
|
||||
* Remove all documents from the specified collection that match the provided query document criteria. There is no
|
||||
@@ -901,40 +726,7 @@ public interface MongoOperations {
|
||||
* @param query the query document that specifies the criteria used to remove a record
|
||||
* @param collectionName name of the collection where the objects will removed
|
||||
*/
|
||||
WriteResult remove(Query query, String collectionName);
|
||||
|
||||
/**
|
||||
* Returns and removes all documents from the specified collection that match the provided query.
|
||||
*
|
||||
* @param query
|
||||
* @param collectionName
|
||||
* @return
|
||||
* @since 1.5
|
||||
*/
|
||||
<T> List<T> findAllAndRemove(Query query, String collectionName);
|
||||
|
||||
/**
|
||||
* Returns and removes all documents matching the given query from the collection used to store the entityClass.
|
||||
*
|
||||
* @param query
|
||||
* @param entityClass
|
||||
* @return
|
||||
* @since 1.5
|
||||
*/
|
||||
<T> List<T> findAllAndRemove(Query query, Class<T> entityClass);
|
||||
|
||||
/**
|
||||
* Returns and removes all documents that match the provided query document criteria from the collection used to
|
||||
* store the entityClass. The Class parameter is also used to help convert the Id of the object if it is present in
|
||||
* the query.
|
||||
*
|
||||
* @param query
|
||||
* @param entityClass
|
||||
* @param collectionName
|
||||
* @return
|
||||
* @since 1.5
|
||||
*/
|
||||
<T> List<T> findAllAndRemove(Query query, Class<T> entityClass, String collectionName);
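The findAllAndRemove variants (since 1.5, per the Javadoc above) return the removed documents, which is convenient for archiving; a sketch with assumed types and collection name:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import java.util.List;

import org.springframework.data.mongodb.core.MongoOperations;

public class FindAllAndRemoveExample {

  // Hypothetical entity.
  static class Order {
    String id;
    String status;
  }

  public static List<Order> purgeCancelledOrders(MongoOperations operations) {
    // Removes every matching document from "orders" and returns them, mapped to Order, in one call.
    return operations.findAllAndRemove(query(where("status").is("CANCELLED")), Order.class, "orders");
  }
}
```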
|
||||
void remove(Query query, String collectionName);
|
||||
|
||||
/**
|
||||
* Returns the underlying {@link MongoConverter}.
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2010-2014 the original author or authors.
|
||||
* Copyright 2010-2011 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -15,92 +15,129 @@
|
||||
*/
|
||||
package org.springframework.data.mongodb.core;
|
||||
|
||||
import javax.net.ssl.SSLSocketFactory;
|
||||
import com.mongodb.MongoOptions;
|
||||
|
||||
import org.springframework.beans.factory.FactoryBean;
|
||||
import org.springframework.beans.factory.InitializingBean;
|
||||
|
||||
import com.mongodb.MongoOptions;
|
||||
|
||||
/**
|
||||
* A factory bean for construction of a {@link MongoOptions} instance.
|
||||
* A factory bean for construction of a MongoOptions instance
|
||||
*
|
||||
* @author Graeme Rocher
|
||||
* @author Mark Pollack
|
||||
* @author Mike Saavedra
|
||||
* @author Thomas Darimont
|
||||
* @author Mark Pollack
|
||||
*/
|
||||
@SuppressWarnings("deprecation")
|
||||
public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, InitializingBean {
|
||||
|
||||
private static final MongoOptions DEFAULT_MONGO_OPTIONS = new MongoOptions();
|
||||
|
||||
private int connectionsPerHost = DEFAULT_MONGO_OPTIONS.connectionsPerHost;
|
||||
private int threadsAllowedToBlockForConnectionMultiplier = DEFAULT_MONGO_OPTIONS.threadsAllowedToBlockForConnectionMultiplier;
|
||||
private int maxWaitTime = DEFAULT_MONGO_OPTIONS.maxWaitTime;
|
||||
private int connectTimeout = DEFAULT_MONGO_OPTIONS.connectTimeout;
|
||||
private int socketTimeout = DEFAULT_MONGO_OPTIONS.socketTimeout;
|
||||
private boolean socketKeepAlive = DEFAULT_MONGO_OPTIONS.socketKeepAlive;
|
||||
private boolean autoConnectRetry = DEFAULT_MONGO_OPTIONS.autoConnectRetry;
|
||||
private long maxAutoConnectRetryTime = DEFAULT_MONGO_OPTIONS.maxAutoConnectRetryTime;
|
||||
private int writeNumber = DEFAULT_MONGO_OPTIONS.w;
|
||||
private int writeTimeout = DEFAULT_MONGO_OPTIONS.wtimeout;
|
||||
private boolean writeFsync = DEFAULT_MONGO_OPTIONS.fsync;
|
||||
private boolean slaveOk = DEFAULT_MONGO_OPTIONS.slaveOk;
|
||||
private boolean ssl;
|
||||
private SSLSocketFactory sslSocketFactory;
|
||||
|
||||
private MongoOptions options;
|
||||
private static final MongoOptions MONGO_OPTIONS = new MongoOptions();
|
||||
/**
|
||||
* The number of connections allowed per host; further requests will block once they are exhausted.
|
||||
*/
|
||||
private int connectionsPerHost = MONGO_OPTIONS.connectionsPerHost;
|
||||
|
||||
/**
|
||||
* Configures the maximum number of connections allowed per host until we will block.
|
||||
* multiplier for connectionsPerHost for # of threads that can block if connectionsPerHost is 10, and
|
||||
* threadsAllowedToBlockForConnectionMultiplier is 5, then 50 threads can block more than that and an exception will
|
||||
* be thrown
|
||||
*/
|
||||
private int threadsAllowedToBlockForConnectionMultiplier = MONGO_OPTIONS.threadsAllowedToBlockForConnectionMultiplier;
|
||||
|
||||
/**
|
||||
* max wait time of a blocking thread for a connection
|
||||
*/
|
||||
private int maxWaitTime = MONGO_OPTIONS.maxWaitTime;
|
||||
|
||||
/**
|
||||
* connect timeout in milliseconds. 0 is default and infinite
|
||||
*/
|
||||
private int connectTimeout = MONGO_OPTIONS.connectTimeout;
|
||||
|
||||
/**
|
||||
* socket timeout. 0 is default and infinite
|
||||
*/
|
||||
private int socketTimeout = MONGO_OPTIONS.socketTimeout;
|
||||
|
||||
/**
|
||||
* This controls whether or not to have socket keep alive turned on (SO_KEEPALIVE).
|
||||
*
|
||||
* @param connectionsPerHost
|
||||
* defaults to false
|
||||
*/
|
||||
public boolean socketKeepAlive = MONGO_OPTIONS.socketKeepAlive;
|
||||
|
||||
/**
|
||||
* This controls whether the system retries automatically on a failed connect.
|
||||
*/
|
||||
private boolean autoConnectRetry = MONGO_OPTIONS.autoConnectRetry;
|
||||
|
||||
private long maxAutoConnectRetryTime = MONGO_OPTIONS.maxAutoConnectRetryTime;
|
||||
|
||||
/**
|
||||
* This specifies the number of servers to wait for on the write operation, and exception raising behavior.
|
||||
*
|
||||
* Defaults to 0.
|
||||
*/
|
||||
private int writeNumber;
|
||||
|
||||
/**
|
||||
* This controls timeout for write operations in milliseconds.
|
||||
*
|
||||
* Defaults to 0 (indefinite). Greater than zero is number of milliseconds to wait.
|
||||
*/
|
||||
private int writeTimeout;
|
||||
|
||||
/**
|
||||
* This controls whether or not to fsync.
|
||||
*
|
||||
* Defaults to false.
|
||||
*/
|
||||
private boolean writeFsync;
|
||||
|
||||
/**
|
||||
* Specifies if the driver is allowed to read from secondaries or slaves.
|
||||
*
|
||||
* Defaults to false
|
||||
*/
|
||||
@SuppressWarnings("deprecation")
|
||||
private boolean slaveOk = MONGO_OPTIONS.slaveOk;
|
||||
|
||||
/**
|
||||
* The number of connections allowed per host; further requests will block once they are exhausted.
|
||||
*/
|
||||
public void setConnectionsPerHost(int connectionsPerHost) {
|
||||
this.connectionsPerHost = connectionsPerHost;
|
||||
}
|
||||
|
||||
/**
|
||||
* A multiplier for connectionsPerHost for # of threads that can block a connection. If connectionsPerHost is 10, and
|
||||
* threadsAllowedToBlockForConnectionMultiplier is 5, then 50 threads can block. If more threads try to block an
|
||||
* exception will be thrown.
|
||||
*
|
||||
* @param threadsAllowedToBlockForConnectionMultiplier
|
||||
* multiplier for connectionsPerHost for # of threads that can block if connectionsPerHost is 10, and
|
||||
* threadsAllowedToBlockForConnectionMultiplier is 5, then 50 threads can block more than that and an exception will
|
||||
* be thrown
|
||||
*/
|
||||
public void setThreadsAllowedToBlockForConnectionMultiplier(int threadsAllowedToBlockForConnectionMultiplier) {
|
||||
this.threadsAllowedToBlockForConnectionMultiplier = threadsAllowedToBlockForConnectionMultiplier;
|
||||
}
|
||||
|
||||
/**
|
||||
* Max wait time of a blocking thread for a connection.
|
||||
*
|
||||
* @param maxWaitTime
|
||||
* max wait time of a blocking thread for a connection
|
||||
*/
|
||||
public void setMaxWaitTime(int maxWaitTime) {
|
||||
this.maxWaitTime = maxWaitTime;
|
||||
}
|
||||
|
||||
/**
|
||||
* Configures the connect timeout in milliseconds. Defaults to 0 (infinite time).
|
||||
*
|
||||
* @param connectTimeout
|
||||
* connect timeout in milliseconds. 0 is default and infinite
|
||||
*/
|
||||
public void setConnectTimeout(int connectTimeout) {
|
||||
this.connectTimeout = connectTimeout;
|
||||
}
|
||||
|
||||
/**
|
||||
* Configures the socket timeout. Defaults to 0 (infinite time).
|
||||
*
|
||||
* @param socketTimeout
|
||||
* socket timeout. 0 is default and infinite
|
||||
*/
|
||||
public void setSocketTimeout(int socketTimeout) {
|
||||
this.socketTimeout = socketTimeout;
|
||||
}
|
||||
|
||||
/**
|
||||
* Configures whether or not to have socket keep alive turned on (SO_KEEPALIVE). Defaults to {@literal false}.
|
||||
* This controls whether or not to have socket keep alive
|
||||
*
|
||||
* @param socketKeepAlive
|
||||
*/
|
||||
@@ -115,7 +152,7 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
|
||||
* <li>-1 = don't even report network errors</li>
|
||||
* <li>0 = default, don't call getLastError by default</li>
|
||||
* <li>1 = basic, call getLastError, but don't wait for slaves</li>
|
||||
* <li>2 += wait for slaves</li>
|
||||
* <li>2+= wait for slaves</li>
|
||||
* </ul>
|
||||
*
|
||||
* @param writeNumber the number of servers to wait for on the write operation, and exception raising behavior.
|
||||
@@ -125,33 +162,33 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
|
||||
}
|
||||
|
||||
/**
|
||||
* Configures the timeout for write operations in milliseconds. This defaults to {@literal 0} (indefinite).
|
||||
* This controls timeout for write operations in milliseconds. The 'wtimeout' option to the getlasterror command.
|
||||
*
|
||||
* @param writeTimeout
|
||||
* @param writeTimeout Defaults to 0 (indefinite). Greater than zero is number of milliseconds to wait.
|
||||
*/
|
||||
public void setWriteTimeout(int writeTimeout) {
|
||||
this.writeTimeout = writeTimeout;
|
||||
}
|
||||
|
||||
/**
|
||||
* Configures whether or not to fsync. The 'fsync' option to the getlasterror command. Defaults to {@literal false}.
|
||||
* This controls whether or not to fsync. The 'fsync' option to the getlasterror command. Defaults to false.
|
||||
*
|
||||
* @param writeFsync to fsync on <code>write (true)</code>, otherwise {@literal false}.
|
||||
* @param writeFsync to fsync on write (true), otherwise false.
|
||||
*/
|
||||
public void setWriteFsync(boolean writeFsync) {
|
||||
this.writeFsync = writeFsync;
|
||||
}
|
||||
|
||||
/**
|
||||
* Configures whether or not the system retries automatically on a failed connect. This defaults to {@literal false}.
|
||||
* This controls whether the system retries automatically on a failed connect.
|
||||
*/
|
||||
public void setAutoConnectRetry(boolean autoConnectRetry) {
|
||||
this.autoConnectRetry = autoConnectRetry;
|
||||
}
|
||||
|
||||
/**
|
||||
* Configures the maximum amount of time in milliseconds to spend retrying to open a connection to the same server. This
|
||||
* defaults to {@literal 0}, which means to use the default {@literal 15s} if {@link #autoConnectRetry} is on.
|
||||
* The maximum amount of time in milliseconds to spend retrying to open a connection to the same server. Default is 0,
|
||||
* which means to use the default 15s if autoConnectRetry is on.
|
||||
*
|
||||
* @param maxAutoConnectRetryTime the maxAutoConnectRetryTime to set
|
||||
*/
|
||||
@@ -160,7 +197,7 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
|
||||
}
|
||||
|
||||
/**
|
||||
* Specifies if the driver is allowed to read from secondaries or slaves. Defaults to {@literal false}.
|
||||
* Specifies if the driver is allowed to read from secondaries or slaves. Defaults to false.
|
||||
*
|
||||
* @param slaveOk true if the driver should read from secondaries or slaves.
|
||||
*/
|
||||
@@ -168,81 +205,32 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
|
||||
this.slaveOk = slaveOk;
|
||||
}
|
||||
|
||||
/**
|
||||
* Specifies if the driver should use an SSL connection to Mongo. This defaults to {@literal false}. By default
|
||||
* {@link SSLSocketFactory#getDefault()} will be used. See {@link #setSslSocketFactory(SSLSocketFactory)} if you want
|
||||
* to configure a custom factory.
|
||||
*
|
||||
* @param ssl true if the driver should use an SSL connection.
|
||||
* @see #setSslSocketFactory(SSLSocketFactory)
|
||||
*/
|
||||
public void setSsl(boolean ssl) {
|
||||
this.ssl = ssl;
|
||||
}
|
||||
|
||||
/**
|
||||
* Specifies the {@link SSLSocketFactory} to use for creating SSL connections to Mongo. Defaults to
|
||||
* {@link SSLSocketFactory#getDefault()}. Implicitly activates {@link #setSsl(boolean)} if a non-{@literal null} value
|
||||
* is given.
|
||||
*
|
||||
* @param sslSocketFactory the sslSocketFactory to use.
|
||||
* @see #setSsl(boolean)
|
||||
*/
|
||||
public void setSslSocketFactory(SSLSocketFactory sslSocketFactory) {
|
||||
|
||||
setSsl(sslSocketFactory != null);
|
||||
this.sslSocketFactory = sslSocketFactory;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
|
||||
*/
|
||||
@SuppressWarnings("deprecation")
|
||||
public void afterPropertiesSet() {
|
||||
|
||||
MongoOptions options = new MongoOptions();
|
||||
|
||||
options.connectionsPerHost = connectionsPerHost;
|
||||
options.threadsAllowedToBlockForConnectionMultiplier = threadsAllowedToBlockForConnectionMultiplier;
|
||||
options.maxWaitTime = maxWaitTime;
|
||||
options.connectTimeout = connectTimeout;
|
||||
options.socketTimeout = socketTimeout;
|
||||
options.socketKeepAlive = socketKeepAlive;
|
||||
options.autoConnectRetry = autoConnectRetry;
|
||||
options.maxAutoConnectRetryTime = maxAutoConnectRetryTime;
|
||||
options.slaveOk = slaveOk;
|
||||
options.w = writeNumber;
|
||||
options.wtimeout = writeTimeout;
|
||||
options.fsync = writeFsync;
|
||||
|
||||
if (ssl) {
|
||||
options.setSocketFactory(sslSocketFactory != null ? sslSocketFactory : SSLSocketFactory.getDefault());
|
||||
MONGO_OPTIONS.connectionsPerHost = connectionsPerHost;
|
||||
MONGO_OPTIONS.threadsAllowedToBlockForConnectionMultiplier = threadsAllowedToBlockForConnectionMultiplier;
|
||||
MONGO_OPTIONS.maxWaitTime = maxWaitTime;
|
||||
MONGO_OPTIONS.connectTimeout = connectTimeout;
|
||||
MONGO_OPTIONS.socketTimeout = socketTimeout;
|
||||
MONGO_OPTIONS.socketKeepAlive = socketKeepAlive;
|
||||
MONGO_OPTIONS.autoConnectRetry = autoConnectRetry;
|
||||
MONGO_OPTIONS.maxAutoConnectRetryTime = maxAutoConnectRetryTime;
|
||||
MONGO_OPTIONS.slaveOk = slaveOk;
|
||||
MONGO_OPTIONS.w = writeNumber;
|
||||
MONGO_OPTIONS.wtimeout = writeTimeout;
|
||||
MONGO_OPTIONS.fsync = writeFsync;
|
||||
}
|
||||
|
||||
this.options = options;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.beans.factory.FactoryBean#getObject()
|
||||
*/
|
||||
public MongoOptions getObject() {
|
||||
return this.options;
|
||||
return MONGO_OPTIONS;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
|
||||
*/
|
||||
public Class<?> getObjectType() {
|
||||
return MongoOptions.class;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.beans.factory.FactoryBean#isSingleton()
|
||||
*/
|
||||
public boolean isSingleton() {
|
||||
return true;
|
||||
}
|
||||
|
||||
}
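To illustrate how this factory bean is wired up programmatically, here is a rough sketch that only uses setters shown above (setSsl is only present in the variant of the class declaring the ssl field); the chosen values are assumptions, and in most applications the bean would be declared in XML or Java configuration and its MongoOptions handed to a Mongo instance or MongoFactoryBean:

```java
import org.springframework.data.mongodb.core.MongoOptionsFactoryBean;

import com.mongodb.MongoOptions;

public class MongoOptionsFactoryBeanExample {

  public static MongoOptions createOptions() {

    MongoOptionsFactoryBean factoryBean = new MongoOptionsFactoryBean();
    factoryBean.setConnectionsPerHost(100);
    factoryBean.setSocketTimeout(5000);
    factoryBean.setWriteNumber(1);    // wait for acknowledgement from one server
    factoryBean.setSsl(true);         // falls back to SSLSocketFactory.getDefault() unless a custom factory is set
    factoryBean.afterPropertiesSet(); // assembles the MongoOptions instance

    return factoryBean.getObject();
  }
}
```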
|
||||
|
||||
@@ -1,26 +1,8 @@
|
||||
/*
|
||||
* Copyright 2012-2014 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core;
|
||||
|
||||
import org.springframework.transaction.support.ResourceHolder;
|
||||
import org.springframework.transaction.support.ResourceHolderSynchronization;
|
||||
|
||||
/**
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
class MongoSynchronization extends ResourceHolderSynchronization<ResourceHolder, Object> {
|
||||
|
||||
public MongoSynchronization(ResourceHolder resourceHolder, Object resourceKey) {
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2010-2014 the original author or authors.
|
||||
* Copyright 2010-2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -27,7 +27,6 @@ import java.util.HashSet;
|
||||
import java.util.Iterator;
|
||||
import java.util.List;
|
||||
import java.util.Map;
|
||||
import java.util.Map.Entry;
|
||||
import java.util.Scanner;
|
||||
import java.util.Set;
|
||||
|
||||
@@ -47,31 +46,22 @@ import org.springframework.core.io.ResourceLoader;
|
||||
import org.springframework.dao.DataAccessException;
|
||||
import org.springframework.dao.InvalidDataAccessApiUsageException;
|
||||
import org.springframework.dao.OptimisticLockingFailureException;
|
||||
import org.springframework.dao.support.PersistenceExceptionTranslator;
|
||||
import org.springframework.data.annotation.Id;
|
||||
import org.springframework.data.authentication.UserCredentials;
|
||||
import org.springframework.data.convert.EntityReader;
|
||||
import org.springframework.data.geo.Distance;
|
||||
import org.springframework.data.geo.GeoResult;
|
||||
import org.springframework.data.geo.Metric;
|
||||
import org.springframework.data.mapping.PersistentEntity;
|
||||
import org.springframework.data.mapping.context.MappingContext;
|
||||
import org.springframework.data.mapping.model.BeanWrapper;
|
||||
import org.springframework.data.mapping.model.MappingException;
|
||||
import org.springframework.data.mongodb.MongoDataIntegrityViolationException;
|
||||
import org.springframework.data.mongodb.MongoDbFactory;
|
||||
import org.springframework.data.mongodb.core.aggregation.Aggregation;
|
||||
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
|
||||
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
|
||||
import org.springframework.data.mongodb.core.aggregation.Fields;
|
||||
import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOperationContext;
|
||||
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
|
||||
import org.springframework.data.mongodb.core.convert.DbRefResolver;
|
||||
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
|
||||
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
|
||||
import org.springframework.data.mongodb.core.convert.MongoConverter;
|
||||
import org.springframework.data.mongodb.core.convert.MongoWriter;
|
||||
import org.springframework.data.mongodb.core.convert.QueryMapper;
|
||||
import org.springframework.data.mongodb.core.convert.UpdateMapper;
|
||||
import org.springframework.data.mongodb.core.geo.Distance;
|
||||
import org.springframework.data.mongodb.core.geo.GeoResult;
|
||||
import org.springframework.data.mongodb.core.geo.GeoResults;
|
||||
import org.springframework.data.mongodb.core.geo.Metric;
|
||||
import org.springframework.data.mongodb.core.index.MongoMappingEventPublisher;
|
||||
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
|
||||
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
|
||||
@@ -79,11 +69,9 @@ import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
|
||||
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
|
||||
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
|
||||
import org.springframework.data.mongodb.core.mapping.event.AfterConvertEvent;
|
||||
import org.springframework.data.mongodb.core.mapping.event.AfterDeleteEvent;
|
||||
import org.springframework.data.mongodb.core.mapping.event.AfterLoadEvent;
|
||||
import org.springframework.data.mongodb.core.mapping.event.AfterSaveEvent;
|
||||
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertEvent;
|
||||
import org.springframework.data.mongodb.core.mapping.event.BeforeDeleteEvent;
|
||||
import org.springframework.data.mongodb.core.mapping.event.BeforeSaveEvent;
|
||||
import org.springframework.data.mongodb.core.mapping.event.MongoMappingEvent;
|
||||
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
|
||||
@@ -96,7 +84,6 @@ import org.springframework.data.mongodb.core.query.Query;
|
||||
import org.springframework.data.mongodb.core.query.Update;
|
||||
import org.springframework.jca.cci.core.ConnectionCallback;
|
||||
import org.springframework.util.Assert;
|
||||
import org.springframework.util.CollectionUtils;
|
||||
import org.springframework.util.ResourceUtils;
|
||||
import org.springframework.util.StringUtils;
|
||||
|
||||
@@ -125,13 +112,7 @@ import com.mongodb.util.JSONParseException;
|
||||
* @author Oliver Gierke
|
||||
* @author Amol Nayak
|
||||
* @author Patryk Wasik
|
||||
* @author Tobias Trelle
|
||||
* @author Sebastian Herold
|
||||
* @author Thomas Darimont
|
||||
* @author Chuong Ngo
|
||||
* @author Christoph Strobl
|
||||
*/
|
||||
@SuppressWarnings("deprecation")
|
||||
public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
|
||||
private static final Logger LOGGER = LoggerFactory.getLogger(MongoTemplate.class);
|
||||
@@ -152,9 +133,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
private final MongoConverter mongoConverter;
|
||||
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
|
||||
private final MongoDbFactory mongoDbFactory;
|
||||
private final PersistenceExceptionTranslator exceptionTranslator;
|
||||
private final QueryMapper queryMapper;
|
||||
private final UpdateMapper updateMapper;
|
||||
private final MongoExceptionTranslator exceptionTranslator = new MongoExceptionTranslator();
|
||||
private final QueryMapper mapper;
|
||||
|
||||
private WriteConcern writeConcern;
|
||||
private WriteConcernResolver writeConcernResolver = DefaultWriteConcernResolver.INSTANCE;
|
||||
@@ -206,10 +186,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
Assert.notNull(mongoDbFactory);
|
||||
|
||||
this.mongoDbFactory = mongoDbFactory;
|
||||
this.exceptionTranslator = mongoDbFactory.getExceptionTranslator();
|
||||
this.mongoConverter = mongoConverter == null ? getDefaultMongoConverter(mongoDbFactory) : mongoConverter;
|
||||
this.queryMapper = new QueryMapper(this.mongoConverter);
|
||||
this.updateMapper = new UpdateMapper(this.mongoConverter);
|
||||
this.mapper = new QueryMapper(this.mongoConverter);
|
||||
|
||||
// We always have a mapping context in the converter, whether it's a simple one or not
|
||||
mappingContext = this.mongoConverter.getMappingContext();
|
||||
@@ -355,7 +333,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
}
|
||||
|
||||
public void executeQuery(Query query, String collectionName, DocumentCallbackHandler dch) {
|
||||
executeQuery(query, collectionName, dch, new QueryCursorPreparer(query, null));
|
||||
executeQuery(query, collectionName, dch, new QueryCursorPreparer(query));
|
||||
}
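The DocumentCallbackHandler variant streams raw DBObjects to the caller without mapping them to a domain type; a minimal sketch (the "persons" collection and field names are assumptions):

```java
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.DocumentCallbackHandler;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

import com.mongodb.DBObject;
import com.mongodb.MongoException;

public class ExecuteQueryExample {

  public static void printAdultNames(MongoTemplate template) {

    // Each matching document is handed to the callback as a raw DBObject, one at a time.
    template.executeQuery(Query.query(Criteria.where("age").gte(18)), "persons", new DocumentCallbackHandler() {

      public void processDocument(DBObject dbObject) throws MongoException, DataAccessException {
        System.out.println(dbObject.get("name"));
      }
    });
  }
}
```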
|
||||
|
||||
/**
|
||||
@@ -373,12 +351,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
|
||||
Assert.notNull(query);
|
||||
|
||||
DBObject queryObject = queryMapper.getMappedObject(query.getQueryObject(), null);
|
||||
DBObject queryObject = query.getQueryObject();
|
||||
DBObject sortObject = query.getSortObject();
|
||||
DBObject fieldsObject = query.getFieldsObject();
|
||||
|
||||
if (LOGGER.isDebugEnabled()) {
|
||||
LOGGER.debug(String.format("Executing query: %s sort: %s fields: %s in collection: %s",
|
||||
LOGGER.debug(String.format("Executing query: %s sort: %s fields: %s in collection: $s",
|
||||
serializeToJsonSafely(queryObject), sortObject, fieldsObject, collectionName));
|
||||
}
|
||||
|
||||
@@ -502,24 +480,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
}
|
||||
}
|
||||
|
||||
public boolean exists(Query query, Class<?> entityClass) {
|
||||
return exists(query, entityClass, determineCollectionName(entityClass));
|
||||
}
|
||||
|
||||
public boolean exists(Query query, String collectionName) {
|
||||
return exists(query, null, collectionName);
|
||||
}
|
||||
|
||||
public boolean exists(Query query, Class<?> entityClass, String collectionName) {
|
||||
|
||||
if (query == null) {
|
||||
throw new InvalidDataAccessApiUsageException("Query passed in to exist can't be null");
|
||||
}
|
||||
|
||||
DBObject mappedQuery = queryMapper.getMappedObject(query.getQueryObject(), getPersistentEntity(entityClass));
|
||||
return execute(collectionName, new FindCallback(mappedQuery)).hasNext();
|
||||
}
|
||||
|
||||
// Find methods that take a Query to express the query and that return a List of objects.
|
||||
|
||||
public <T> List<T> find(Query query, Class<T> entityClass) {
|
||||
@@ -533,7 +493,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
}
|
||||
|
||||
return doFind(collectionName, query.getQueryObject(), query.getFieldsObject(), entityClass,
|
||||
new QueryCursorPreparer(query, entityClass));
|
||||
new QueryCursorPreparer(query));
|
||||
}
|
||||
|
||||
public <T> T findById(Object id, Class<T> entityClass) {
|
||||
@@ -615,8 +575,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
|
||||
public <T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass,
|
||||
String collectionName) {
|
||||
return doFindAndModify(collectionName, query.getQueryObject(), query.getFieldsObject(),
|
||||
getMappedSortObject(query, entityClass), entityClass, update, options);
|
||||
return doFindAndModify(collectionName, query.getQueryObject(), query.getFieldsObject(), query.getSortObject(),
|
||||
entityClass, update, options);
|
||||
}
|
||||
|
||||
// Find methods that take a Query to express the query and that return a single object that is also removed from the
|
||||
@@ -627,9 +587,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
}
|
||||
|
||||
public <T> T findAndRemove(Query query, Class<T> entityClass, String collectionName) {
|
||||
|
||||
return doFindAndRemove(collectionName, query.getQueryObject(), query.getFieldsObject(),
|
||||
getMappedSortObject(query, entityClass), entityClass);
|
||||
return doFindAndRemove(collectionName, query.getQueryObject(), query.getFieldsObject(), query.getSortObject(),
|
||||
entityClass);
|
||||
}
|
||||
|
||||
public long count(Query query, Class<?> entityClass) {
|
||||
@@ -644,7 +603,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
private long count(Query query, Class<?> entityClass, String collectionName) {
|
||||
|
||||
Assert.hasText(collectionName);
|
||||
final DBObject dbObject = query == null ? null : queryMapper.getMappedObject(query.getQueryObject(),
|
||||
final DBObject dbObject = query == null ? null : mapper.getMappedObject(query.getQueryObject(),
|
||||
entityClass == null ? null : mappingContext.getPersistentEntity(entityClass));
|
||||
|
||||
return execute(collectionName, new CollectionCallback<Long>() {
|
||||
@@ -709,9 +668,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
|
||||
initializeVersionProperty(objectToSave);
|
||||
|
||||
maybeEmitEvent(new BeforeConvertEvent<T>(objectToSave));
|
||||
BasicDBObject dbDoc = new BasicDBObject();
|
||||
|
||||
DBObject dbDoc = toDbObject(objectToSave, writer);
|
||||
maybeEmitEvent(new BeforeConvertEvent<T>(objectToSave));
|
||||
writer.write(objectToSave, dbDoc);
|
||||
|
||||
maybeEmitEvent(new BeforeSaveEvent<T>(objectToSave, dbDoc));
|
||||
Object id = insertDBObject(collectionName, dbDoc, objectToSave.getClass());
|
||||
@@ -720,32 +680,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
maybeEmitEvent(new AfterSaveEvent<T>(objectToSave, dbDoc));
|
||||
}
|
||||
|
||||
/**
|
||||
* @param objectToSave
|
||||
* @param writer
|
||||
* @return
|
||||
*/
|
||||
private <T> DBObject toDbObject(T objectToSave, MongoWriter<T> writer) {
|
||||
|
||||
if (!(objectToSave instanceof String)) {
|
||||
DBObject dbDoc = new BasicDBObject();
|
||||
writer.write(objectToSave, dbDoc);
|
||||
return dbDoc;
|
||||
} else {
|
||||
try {
|
||||
return (DBObject) JSON.parse((String) objectToSave);
|
||||
} catch (JSONParseException e) {
|
||||
throw new MappingException("Could not parse given String to save into a JSON document!", e);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
private void initializeVersionProperty(Object entity) {
|
||||
|
||||
MongoPersistentEntity<?> mongoPersistentEntity = getPersistentEntity(entity.getClass());
|
||||
|
||||
if (mongoPersistentEntity != null && mongoPersistentEntity.hasVersionProperty()) {
|
||||
BeanWrapper<Object> wrapper = BeanWrapper.create(entity, this.mongoConverter.getConversionService());
|
||||
BeanWrapper<PersistentEntity<Object, ?>, Object> wrapper = BeanWrapper.create(entity,
|
||||
this.mongoConverter.getConversionService());
|
||||
wrapper.setProperty(mongoPersistentEntity.getVersionProperty(), 0);
|
||||
}
|
||||
}
|
||||
@@ -839,11 +780,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
|
||||
private <T> void doSaveVersioned(T objectToSave, MongoPersistentEntity<?> entity, String collectionName) {
|
||||
|
||||
BeanWrapper<T> beanWrapper = BeanWrapper.create(objectToSave, this.mongoConverter.getConversionService());
|
||||
BeanWrapper<PersistentEntity<T, ?>, T> beanWrapper = BeanWrapper.create(objectToSave,
|
||||
this.mongoConverter.getConversionService());
|
||||
MongoPersistentProperty idProperty = entity.getIdProperty();
|
||||
MongoPersistentProperty versionProperty = entity.getVersionProperty();
|
||||
|
||||
Number version = beanWrapper.getProperty(versionProperty, Number.class);
|
||||
Number version = beanWrapper.getProperty(versionProperty, Number.class, !versionProperty.usePropertyAccess());
|
||||
|
||||
// Fresh instance -> initialize version property
|
||||
if (version == null) {
|
||||
@@ -857,7 +799,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
Query query = new Query(Criteria.where(idProperty.getName()).is(id).and(versionProperty.getName()).is(version));
|
||||
|
||||
// Bump version number
|
||||
Number number = beanWrapper.getProperty(versionProperty, Number.class);
|
||||
Number number = beanWrapper.getProperty(versionProperty, Number.class, false);
|
||||
beanWrapper.setProperty(versionProperty, number.longValue() + 1);
|
||||
|
||||
BasicDBObject dbObject = new BasicDBObject();
|
||||
@@ -877,9 +819,19 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
|
||||
assertUpdateableIdIfNotSet(objectToSave);
|
||||
|
||||
DBObject dbDoc = new BasicDBObject();
|
||||
|
||||
maybeEmitEvent(new BeforeConvertEvent<T>(objectToSave));
|
||||
|
||||
DBObject dbDoc = toDbObject(objectToSave, writer);
|
||||
if (!(objectToSave instanceof String)) {
|
||||
writer.write(objectToSave, dbDoc);
|
||||
} else {
|
||||
try {
|
||||
dbDoc = (DBObject) JSON.parse((String) objectToSave);
|
||||
} catch (JSONParseException e) {
|
||||
throw new MappingException("Could not parse given String to save into a JSON document!", e);
|
||||
}
|
||||
}
|
||||
|
||||
maybeEmitEvent(new BeforeSaveEvent<T>(objectToSave, dbDoc));
|
||||
Object id = saveDBObject(collectionName, dbDoc, objectToSave.getClass());
|
||||
@@ -963,10 +915,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
return doUpdate(collectionName, query, update, null, true, false);
|
||||
}
|
||||
|
||||
public WriteResult upsert(Query query, Update update, Class<?> entityClass, String collectionName) {
|
||||
return doUpdate(collectionName, query, update, entityClass, true, false);
|
||||
}
|
||||
|
||||
public WriteResult updateFirst(Query query, Update update, Class<?> entityClass) {
|
||||
return doUpdate(determineCollectionName(entityClass), query, update, entityClass, false, false);
|
||||
}
|
||||
@@ -975,10 +923,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
return doUpdate(collectionName, query, update, null, false, false);
|
||||
}
|
||||
|
||||
public WriteResult updateFirst(Query query, Update update, Class<?> entityClass, String collectionName) {
|
||||
return doUpdate(collectionName, query, update, entityClass, false, false);
|
||||
}
|
||||
|
||||
public WriteResult updateMulti(Query query, Update update, Class<?> entityClass) {
|
||||
return doUpdate(determineCollectionName(entityClass), query, update, entityClass, false, true);
|
||||
}
|
||||
@@ -987,10 +931,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
return doUpdate(collectionName, query, update, null, false, true);
|
||||
}
|
||||
|
||||
public WriteResult updateMulti(final Query query, final Update update, Class<?> entityClass, String collectionName) {
|
||||
return doUpdate(collectionName, query, update, entityClass, false, true);
|
||||
}
|
||||
|
||||
protected WriteResult doUpdate(final String collectionName, final Query query, final Update update,
|
||||
final Class<?> entityClass, final boolean upsert, final boolean multi) {
|
||||
|
||||
@@ -999,12 +939,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
|
||||
MongoPersistentEntity<?> entity = entityClass == null ? null : getPersistentEntity(entityClass);
|
||||
|
||||
increaseVersionForUpdateIfNecessary(entity, update);
|
||||
|
||||
DBObject queryObj = query == null ? new BasicDBObject() : queryMapper.getMappedObject(query.getQueryObject(),
|
||||
DBObject queryObj = query == null ? new BasicDBObject()
|
||||
: mapper.getMappedObject(query.getQueryObject(), entity);
|
||||
DBObject updateObj = update == null ? new BasicDBObject() : mapper.getMappedObject(update.getUpdateObject(),
|
||||
entity);
|
||||
DBObject updateObj = update == null ? new BasicDBObject() : updateMapper.getMappedObject(
|
||||
update.getUpdateObject(), entity);
|
||||
|
||||
if (LOGGER.isDebugEnabled()) {
|
||||
LOGGER.debug("Calling update using query: " + queryObj + " and update: " + updateObj + " in collection: "
|
||||
@@ -1018,7 +956,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
: collection.update(queryObj, updateObj, upsert, multi, writeConcernToUse);
|
||||
|
||||
if (entity != null && entity.hasVersionProperty() && !multi) {
|
||||
if (writeResult.getN() == 0 && dbObjectContainsVersionProperty(queryObj, entity)) {
|
||||
if (writeResult.getN() == 0) {
|
||||
throw new OptimisticLockingFailureException("Optimistic lock exception on saving entity: "
|
||||
+ updateObj.toMap().toString() + " to collection " + collectionName);
|
||||
}
|
||||
@@ -1030,72 +968,24 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
});
|
||||
}
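The version check above is what backs optimistic locking for entities that carry a version property; as a sketch, a domain class opting into this behaviour might look like the following (the Account class itself is a hypothetical example):

```java
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.Version;
import org.springframework.data.mongodb.core.mapping.Document;

// Hypothetical entity: the @Version property makes saves issue version-guarded updates, and an update
// that matches no document surfaces as an OptimisticLockingFailureException, as implemented above.
@Document
public class Account {

  @Id private String id;
  @Version private Long version;

  private double balance;
}
```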
|
||||
|
||||
private void increaseVersionForUpdateIfNecessary(MongoPersistentEntity<?> persistentEntity, Update update) {
|
||||
|
||||
if (persistentEntity != null && persistentEntity.hasVersionProperty()) {
|
||||
String versionFieldName = persistentEntity.getVersionProperty().getFieldName();
|
||||
if (!update.modifies(versionFieldName)) {
|
||||
update.inc(versionFieldName, 1L);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
private boolean dbObjectContainsVersionProperty(DBObject dbObject, MongoPersistentEntity<?> persistentEntity) {
|
||||
|
||||
if (persistentEntity == null || !persistentEntity.hasVersionProperty()) {
|
||||
return false;
|
||||
}
|
||||
|
||||
return dbObject.containsField(persistentEntity.getVersionProperty().getFieldName());
|
||||
}
|
||||
|
||||
public WriteResult remove(Object object) {
|
||||
public void remove(Object object) {
|
||||
|
||||
if (object == null) {
|
||||
return null;
|
||||
return;
|
||||
}
|
||||
|
||||
return remove(getIdQueryFor(object), object.getClass());
|
||||
remove(getIdQueryFor(object), object.getClass());
|
||||
}
|
||||
|
||||
public WriteResult remove(Object object, String collection) {
|
||||
public void remove(Object object, String collection) {
|
||||
|
||||
Assert.hasText(collection);
|
||||
|
||||
if (object == null) {
|
||||
return null;
|
||||
return;
|
||||
}
|
||||
|
||||
return doRemove(collection, getIdQueryFor(object), object.getClass());
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns {@link Entry} containing the field name of the id property as {@link Entry#getKey()} and the {@link Id}s
|
||||
* property value as its {@link Entry#getValue()}.
|
||||
*
|
||||
* @param object
|
||||
* @return
|
||||
*/
|
||||
private Entry<String, Object> extractIdPropertyAndValue(Object object) {
|
||||
|
||||
Assert.notNull(object, "Id cannot be extracted from 'null'.");
|
||||
|
||||
Class<?> objectType = object.getClass();
|
||||
|
||||
if (object instanceof DBObject) {
|
||||
return Collections.singletonMap(ID_FIELD, ((DBObject) object).get(ID_FIELD)).entrySet().iterator().next();
|
||||
}
|
||||
|
||||
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(objectType);
|
||||
MongoPersistentProperty idProp = entity == null ? null : entity.getIdProperty();
|
||||
|
||||
if (idProp == null) {
|
||||
throw new MappingException("No id property found for object of type " + objectType);
|
||||
}
|
||||
|
||||
Object idValue = BeanWrapper.create(object, mongoConverter.getConversionService())
|
||||
.getProperty(idProp, Object.class);
|
||||
return Collections.singletonMap(idProp.getFieldName(), idValue).entrySet().iterator().next();
|
||||
doRemove(collection, getIdQueryFor(object), object.getClass());
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -1106,31 +996,21 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
*/
|
||||
private Query getIdQueryFor(Object object) {
|
||||
|
||||
Entry<String, Object> id = extractIdPropertyAndValue(object);
|
||||
return new Query(where(id.getKey()).is(id.getValue()));
|
||||
Assert.notNull(object);
|
||||
|
||||
Class<?> objectType = object.getClass();
|
||||
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(objectType);
|
||||
MongoPersistentProperty idProp = entity == null ? null : entity.getIdProperty();
|
||||
|
||||
if (idProp == null) {
|
||||
throw new MappingException("No id property found for object of type " + objectType);
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns a {@link Query} for the given entities by their ids.
|
||||
*
|
||||
* @param objects must not be {@literal null} or {@literal empty}.
|
||||
* @return
|
||||
*/
|
||||
private Query getIdInQueryFor(Collection<?> objects) {
|
||||
ConversionService service = mongoConverter.getConversionService();
|
||||
Object idProperty = null;
|
||||
|
||||
Assert.notEmpty(objects, "Cannot create Query for empty collection.");
|
||||
|
||||
Iterator<?> it = objects.iterator();
|
||||
Entry<String, Object> firstEntry = extractIdPropertyAndValue(it.next());
|
||||
|
||||
ArrayList<Object> ids = new ArrayList<Object>(objects.size());
|
||||
ids.add(firstEntry.getValue());
|
||||
|
||||
while (it.hasNext()) {
|
||||
ids.add(extractIdPropertyAndValue(it.next()).getValue());
|
||||
}
|
||||
|
||||
return new Query(where(firstEntry.getKey()).in(ids));
|
||||
idProperty = BeanWrapper.create(object, service).getProperty(idProp, Object.class, true);
|
||||
return new Query(where(idProp.getFieldName()).is(idProperty));
|
||||
}
|
||||
|
||||
private void assertUpdateableIdIfNotSet(Object entity) {
|
||||
@@ -1143,7 +1023,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
}
|
||||
|
||||
ConversionService service = mongoConverter.getConversionService();
|
||||
Object idValue = BeanWrapper.create(entity, service).getProperty(idProperty, Object.class);
|
||||
Object idValue = BeanWrapper.create(entity, service).getProperty(idProperty, Object.class, true);
|
||||
|
||||
if (idValue == null && !MongoSimpleTypes.AUTOGENERATED_ID_TYPES.contains(idProperty.getType())) {
|
||||
throw new InvalidDataAccessApiUsageException(String.format(
|
||||
@@ -1152,35 +1032,24 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
}
|
||||
}
|
||||
|
||||
public WriteResult remove(Query query, String collectionName) {
|
||||
return remove(query, null, collectionName);
|
||||
public <T> void remove(Query query, Class<T> entityClass) {
|
||||
Assert.notNull(query);
|
||||
doRemove(determineCollectionName(entityClass), query, entityClass);
|
||||
}
|
||||
|
||||
public WriteResult remove(Query query, Class<?> entityClass) {
|
||||
return remove(query, entityClass, determineCollectionName(entityClass));
|
||||
}
|
||||
|
||||
public WriteResult remove(Query query, Class<?> entityClass, String collectionName) {
|
||||
return doRemove(collectionName, query, entityClass);
|
||||
}
|
||||
|
||||
protected <T> WriteResult doRemove(final String collectionName, final Query query, final Class<T> entityClass) {
|
||||
protected <T> void doRemove(final String collectionName, final Query query, final Class<T> entityClass) {
|
||||
|
||||
if (query == null) {
|
||||
throw new InvalidDataAccessApiUsageException("Query passed in to remove can't be null!");
|
||||
throw new InvalidDataAccessApiUsageException("Query passed in to remove can't be null");
|
||||
}
|
||||
|
||||
Assert.hasText(collectionName, "Collection name must not be null or empty!");
|
||||
|
||||
final DBObject queryObject = query.getQueryObject();
|
||||
final MongoPersistentEntity<?> entity = getPersistentEntity(entityClass);
|
||||
|
||||
return execute(collectionName, new CollectionCallback<WriteResult>() {
|
||||
public WriteResult doInCollection(DBCollection collection) throws MongoException, DataAccessException {
|
||||
execute(collectionName, new CollectionCallback<Void>() {
|
||||
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
|
||||
|
||||
maybeEmitEvent(new BeforeDeleteEvent<T>(queryObject, entityClass));
|
||||
|
||||
DBObject dboq = queryMapper.getMappedObject(queryObject, entity);
|
||||
DBObject dboq = mapper.getMappedObject(queryObject, entity);
|
||||
|
||||
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.REMOVE, collectionName,
|
||||
entityClass, null, queryObject);
|
||||
@@ -1192,16 +1061,16 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
|
||||
WriteResult wr = writeConcernToUse == null ? collection.remove(dboq) : collection.remove(dboq,
|
||||
writeConcernToUse);
|
||||
|
||||
handleAnyWriteResultErrors(wr, dboq, MongoActionOperation.REMOVE);
|
||||
|
||||
maybeEmitEvent(new AfterDeleteEvent<T>(queryObject, entityClass));
|
||||
|
||||
return wr;
|
||||
return null;
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
public void remove(final Query query, String collectionName) {
|
||||
doRemove(collectionName, query, null);
|
||||
}
|
||||
|
||||
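A minimal usage sketch for the `remove(...)` overloads above, assuming a mapped `Person` entity and a `people` collection (both placeholders); note that this revision range changes the return type from `void` to `WriteResult`:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Query;

class RemoveSample {

	// Placeholder mapped entity.
	static class Person {
		String id, lastname;
	}

	// Deletes all Person documents with the given last name from the entity's default collection.
	static void removeByLastname(MongoOperations operations, String lastname) {
		operations.remove(new Query(where("lastname").is(lastname)), Person.class);
	}

	// Deletes from an explicitly named collection instead of the one derived from the entity class.
	static void removeFromCollection(MongoOperations operations, String lastname) {
		operations.remove(new Query(where("lastname").is(lastname)), Person.class, "people");
	}
}
```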
public <T> List<T> findAll(Class<T> entityClass) {
|
||||
return executeFindMultiInternal(new FindCallback(null), null, new ReadDbObjectCallback<T>(mongoConverter,
|
||||
entityClass), determineCollectionName(entityClass));
|
||||
@@ -1231,7 +1100,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
|
||||
public <T> MapReduceResults<T> mapReduce(Query query, String inputCollectionName, String mapFunction,
|
||||
String reduceFunction, MapReduceOptions mapReduceOptions, Class<T> entityClass) {
|
||||
|
||||
String mapFunc = replaceWithResourceIfNecessary(mapFunction);
|
||||
String reduceFunc = replaceWithResourceIfNecessary(reduceFunction);
|
||||
DBCollection inputCollection = getCollection(inputCollectionName);
|
||||
@@ -1256,12 +1124,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
MapReduceOutput mapReduceOutput = new MapReduceOutput(inputCollection, commandObject, commandResult);
|
||||
List<T> mappedResults = new ArrayList<T>();
|
||||
DbObjectCallback<T> callback = new ReadDbObjectCallback<T>(mongoConverter, entityClass);
|
||||
|
||||
for (DBObject dbObject : mapReduceOutput.results()) {
|
||||
mappedResults.add(callback.doWith(dbObject));
|
||||
}
|
||||
|
||||
return new MapReduceResults<T>(mappedResults, commandResult);
|
||||
MapReduceResults<T> mapReduceResult = new MapReduceResults<T>(mappedResults, commandResult);
|
||||
return mapReduceResult;
|
||||
}
|
||||
|
||||
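A usage sketch for the `mapReduce(...)` method above, assuming a `jmr1` input collection, classpath map/reduce scripts and a `ValueObject` result type (all placeholders); the `classpath:` prefix is handled by `replaceWithResourceIfNecessary(...)`:

```java
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.mapreduce.MapReduceResults;
import org.springframework.data.mongodb.core.query.Query;

class MapReduceSample {

	// Placeholder result type the reduced documents are mapped onto.
	static class ValueObject {
		String id;
		float value;
	}

	static void runMapReduce(MongoOperations operations) {

		// Map and reduce functions are loaded from the classpath via Spring's Resource abstraction.
		MapReduceResults<ValueObject> results = operations.mapReduce(new Query(), "jmr1",
				"classpath:map.js", "classpath:reduce.js", MapReduceOptions.options(), ValueObject.class);

		for (ValueObject value : results) {
			// process each mapped result
		}
	}
}
```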
public <T> GroupByResults<T> group(String inputCollectionName, GroupBy groupBy, Class<T> entityClass) {
|
||||
@@ -1277,7 +1145,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
if (criteria == null) {
|
||||
dbo.put("cond", null);
|
||||
} else {
|
||||
dbo.put("cond", queryMapper.getMappedObject(criteria.getCriteriaObject(), null));
|
||||
dbo.put("cond", mapper.getMappedObject(criteria.getCriteriaObject(), null));
|
||||
}
|
||||
// If initial document was a JavaScript string, potentially loaded by Spring's Resource abstraction, load it and
|
||||
// convert to DBObject
|
||||
@@ -1315,135 +1183,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
|
||||
@SuppressWarnings("unchecked")
|
||||
Iterable<DBObject> resultSet = (Iterable<DBObject>) commandResult.get("retval");
|
||||
|
||||
List<T> mappedResults = new ArrayList<T>();
|
||||
DbObjectCallback<T> callback = new ReadDbObjectCallback<T>(mongoConverter, entityClass);
|
||||
|
||||
for (DBObject dbObject : resultSet) {
|
||||
mappedResults.add(callback.doWith(dbObject));
|
||||
}
|
||||
GroupByResults<T> groupByResult = new GroupByResults<T>(mappedResults, commandResult);
|
||||
return groupByResult;
|
||||
|
||||
return new GroupByResults<T>(mappedResults, commandResult);
|
||||
}
|
||||
|
||||
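A usage sketch for `group(...)`, assuming a `group_test_collection` collection and an `XObject` result type (placeholders); the `GroupBy` builder calls reflect the standard Spring Data MongoDB API and are not part of this diff:

```java
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
import org.springframework.data.mongodb.core.mapreduce.GroupByResults;

class GroupSample {

	// Placeholder result type for the grouped documents.
	static class XObject {
		float x;
		float count;
	}

	static void runGroup(MongoOperations operations) {

		// Groups the collection's documents by their x field and counts them per group.
		GroupByResults<XObject> results = operations.group("group_test_collection",
				GroupBy.key("x").initialDocument("{ count: 0 }")
						.reduceFunction("function(doc, prev) { prev.count += 1 }"),
				XObject.class);

		for (XObject item : results) {
			// each item carries the key value and the computed count
		}
	}
}
```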
@Override
|
||||
public <O> AggregationResults<O> aggregate(TypedAggregation<?> aggregation, Class<O> outputType) {
|
||||
return aggregate(aggregation, determineCollectionName(aggregation.getInputType()), outputType);
|
||||
}
|
||||
|
||||
@Override
|
||||
public <O> AggregationResults<O> aggregate(TypedAggregation<?> aggregation, String inputCollectionName,
|
||||
Class<O> outputType) {
|
||||
|
||||
Assert.notNull(aggregation, "Aggregation pipeline must not be null!");
|
||||
|
||||
AggregationOperationContext context = new TypeBasedAggregationOperationContext(aggregation.getInputType(),
|
||||
mappingContext, queryMapper);
|
||||
return aggregate(aggregation, inputCollectionName, outputType, context);
|
||||
}
|
||||
|
||||
@Override
|
||||
public <O> AggregationResults<O> aggregate(Aggregation aggregation, Class<?> inputType, Class<O> outputType) {
|
||||
|
||||
return aggregate(aggregation, determineCollectionName(inputType), outputType,
|
||||
new TypeBasedAggregationOperationContext(inputType, mappingContext, queryMapper));
|
||||
}
|
||||
|
||||
@Override
|
||||
public <O> AggregationResults<O> aggregate(Aggregation aggregation, String collectionName, Class<O> outputType) {
|
||||
return aggregate(aggregation, collectionName, outputType, null);
|
||||
}
|
||||
|
||||
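A sketch of the typed aggregation entry points above, assuming placeholder `Person` and `CityCount` types; the `count()`/`as()` group builder calls are standard Aggregation framework API rather than part of this diff:

```java
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
import static org.springframework.data.mongodb.core.query.Criteria.where;

import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;

class AggregateSample {

	// Placeholder input entity and output value type.
	static class Person {
		String id, city;
		int age;
	}

	static class CityCount {
		String id;
		long total;
	}

	static void runAggregation(MongoOperations operations) {

		// The TypedAggregation overloads derive the input collection from Person and map field names
		// through the TypeBasedAggregationOperationContext set up above.
		TypedAggregation<Person> aggregation = newAggregation(Person.class,
				match(where("age").gte(21)),
				group("city").count().as("total"),
				sort(Direction.DESC, "total"));

		AggregationResults<CityCount> results = operations.aggregate(aggregation, CityCount.class);
		results.getMappedResults(); // one CityCount per city
	}
}
```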
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.MongoOperations#findAllAndRemove(org.springframework.data.mongodb.core.query.Query, java.lang.String)
|
||||
*/
|
||||
@Override
|
||||
public <T> List<T> findAllAndRemove(Query query, String collectionName) {
|
||||
return findAndRemove(query, null, collectionName);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.MongoOperations#findAllAndRemove(org.springframework.data.mongodb.core.query.Query, java.lang.Class)
|
||||
*/
|
||||
@Override
|
||||
public <T> List<T> findAllAndRemove(Query query, Class<T> entityClass) {
|
||||
return findAllAndRemove(query, entityClass, determineCollectionName(entityClass));
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.MongoOperations#findAllAndRemove(org.springframework.data.mongodb.core.query.Query, java.lang.Class, java.lang.String)
|
||||
*/
|
||||
@Override
|
||||
public <T> List<T> findAllAndRemove(Query query, Class<T> entityClass, String collectionName) {
|
||||
return doFindAndDelete(collectionName, query, entityClass);
|
||||
}
|
||||
|
||||
/**
|
||||
* Retrieve and remove all documents matching the given {@code query} by calling {@link #find(Query, Class, String)}
|
||||
* and {@link #remove(Query, Class, String)}, where the {@link Query} for {@link #remove(Query, Class, String)} is
|
||||
* constructed out of the find result.
|
||||
*
|
||||
* @param collectionName
|
||||
* @param query
|
||||
* @param entityClass
|
||||
* @return
|
||||
*/
|
||||
protected <T> List<T> doFindAndDelete(String collectionName, Query query, Class<T> entityClass) {
|
||||
|
||||
List<T> result = find(query, entityClass, collectionName);
|
||||
|
||||
if (!CollectionUtils.isEmpty(result)) {
|
||||
remove(getIdInQueryFor(result), entityClass, collectionName);
|
||||
}
|
||||
|
||||
return result;
|
||||
}
|
||||
|
||||
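A usage sketch for `findAllAndRemove(...)`, assuming a placeholder `Person` entity; as described in `doFindAndDelete(...)` above, the matching documents are loaded first, then removed via an id-in query, and the loaded objects are returned:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;

import java.util.List;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Query;

class FindAllAndRemoveSample {

	// Placeholder mapped entity.
	static class Person {
		String id, lastname;
	}

	// Loads all matching Person documents, removes them by an id-in query and returns the loaded objects.
	static List<Person> purgeByLastname(MongoOperations operations, String lastname) {
		return operations.findAllAndRemove(new Query(where("lastname").is(lastname)), Person.class);
	}
}
```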
protected <O> AggregationResults<O> aggregate(Aggregation aggregation, String collectionName, Class<O> outputType,
|
||||
AggregationOperationContext context) {
|
||||
|
||||
Assert.hasText(collectionName, "Collection name must not be null or empty!");
|
||||
Assert.notNull(aggregation, "Aggregation pipeline must not be null!");
|
||||
Assert.notNull(outputType, "Output type must not be null!");
|
||||
|
||||
AggregationOperationContext rootContext = context == null ? Aggregation.DEFAULT_CONTEXT : context;
|
||||
DBObject command = aggregation.toDbObject(collectionName, rootContext);
|
||||
|
||||
if (LOGGER.isDebugEnabled()) {
|
||||
LOGGER.debug("Executing aggregation: {}", serializeToJsonSafely(command));
|
||||
}
|
||||
|
||||
CommandResult commandResult = executeCommand(command);
|
||||
handleCommandError(commandResult, command);
|
||||
|
||||
return new AggregationResults<O>(returnPotentiallyMappedResults(outputType, commandResult), commandResult);
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the potentially mapped results of the given {@code commandResult}, if it contained any.
|
||||
*
|
||||
* @param outputType
|
||||
* @param commandResult
|
||||
* @return
|
||||
*/
|
||||
private <O> List<O> returnPotentiallyMappedResults(Class<O> outputType, CommandResult commandResult) {
|
||||
|
||||
@SuppressWarnings("unchecked")
|
||||
Iterable<DBObject> resultSet = (Iterable<DBObject>) commandResult.get("result");
|
||||
if (resultSet == null) {
|
||||
return Collections.emptyList();
|
||||
}
|
||||
|
||||
DbObjectCallback<O> callback = new UnwrapAndReadDbObjectCallback<O>(mongoConverter, outputType);
|
||||
|
||||
List<O> mappedResults = new ArrayList<O>();
|
||||
for (DBObject dbObject : resultSet) {
|
||||
mappedResults.add(callback.doWith(dbObject));
|
||||
}
|
||||
|
||||
return mappedResults;
|
||||
}
|
||||
|
||||
protected String replaceWithResourceIfNecessary(String function) {
|
||||
@@ -1475,13 +1223,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
"Can not use skip or field specification with map reduce operations");
|
||||
}
|
||||
if (query.getQueryObject() != null) {
|
||||
copyMapReduceOptions.put("query", queryMapper.getMappedObject(query.getQueryObject(), null));
|
||||
copyMapReduceOptions.put("query", query.getQueryObject());
|
||||
}
|
||||
if (query.getLimit() > 0) {
|
||||
copyMapReduceOptions.put("limit", query.getLimit());
|
||||
}
|
||||
if (query.getSortObject() != null) {
|
||||
copyMapReduceOptions.put("sort", queryMapper.getMappedObject(query.getSortObject(), null));
|
||||
copyMapReduceOptions.put("sort", query.getSortObject());
|
||||
}
|
||||
}
|
||||
return copyMapReduceOptions;
|
||||
@@ -1554,28 +1302,57 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
}
|
||||
|
||||
/**
|
||||
* Map the results of an ad-hoc query on the default MongoDB collection to an object using the template's converter.
|
||||
* The query document is specified as a standard {@link DBObject} and so is the fields specification.
|
||||
* Map the results of an ad-hoc query on the default MongoDB collection to an object using the template's converter
|
||||
* <p/>
|
||||
* The query document is specified as a standard DBObject and so is the fields specification.
|
||||
*
|
||||
* @param collectionName name of the collection to retrieve the objects from.
|
||||
* @param query the query document that specifies the criteria used to find a record.
|
||||
* @param fields the document that specifies the fields to be returned.
|
||||
* @param collectionName name of the collection to retrieve the objects from
|
||||
* @param query the query document that specifies the criteria used to find a record
|
||||
* @param fields the document that specifies the fields to be returned
|
||||
* @param entityClass the parameterized type of the returned list.
|
||||
* @return the {@link List} of converted objects.
|
||||
* @return the List of converted objects.
|
||||
*/
|
||||
protected <T> T doFindOne(String collectionName, DBObject query, DBObject fields, Class<T> entityClass) {
|
||||
|
||||
EntityReader<? super T, DBObject> readerToUse = this.mongoConverter;
|
||||
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
|
||||
DBObject mappedQuery = queryMapper.getMappedObject(query, entity);
|
||||
DBObject mappedFields = fields == null ? null : queryMapper.getMappedObject(fields, entity);
|
||||
DBObject mappedQuery = mapper.getMappedObject(query, entity);
|
||||
|
||||
if (LOGGER.isDebugEnabled()) {
|
||||
LOGGER.debug(String.format("findOne using query: %s fields: %s for class: %s in collection: %s",
|
||||
serializeToJsonSafely(query), mappedFields, entityClass, collectionName));
|
||||
return executeFindOneInternal(new FindOneCallback(mappedQuery, fields), new ReadDbObjectCallback<T>(readerToUse,
|
||||
entityClass), collectionName);
|
||||
}
|
||||
|
||||
return executeFindOneInternal(new FindOneCallback(mappedQuery, mappedFields), new ReadDbObjectCallback<T>(
|
||||
this.mongoConverter, entityClass), collectionName);
|
||||
/**
|
||||
* Map the results of an ad-hoc query on the default MongoDB collection to a List of the specified type. The object is
|
||||
* converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless configured
|
||||
* otherwise, an instance of MappingMongoConverter will be used. The query document is specified as a standard
|
||||
* DBObject and so is the fields specification. Can be overridden by subclasses.
|
||||
*
|
||||
* @param collectionName name of the collection to retrieve the objects from
|
||||
* @param query the query document that specifies the criteria used to find a record
|
||||
* @param fields the document that specifies the fields to be returned
|
||||
* @param entityClass the parameterized type of the returned list.
|
||||
* @param preparer allows for customization of the DBCursor used when iterating over the result set, (apply limits,
|
||||
* skips and so on).
|
||||
* @return the List of converted objects.
|
||||
*/
|
||||
protected <T> List<T> doFind(String collectionName, DBObject query, DBObject fields, Class<T> entityClass,
|
||||
CursorPreparer preparer) {
|
||||
return doFind(collectionName, query, fields, entityClass, preparer, new ReadDbObjectCallback<T>(mongoConverter,
|
||||
entityClass));
|
||||
}
|
||||
|
||||
protected <S, T> List<T> doFind(String collectionName, DBObject query, DBObject fields, Class<S> entityClass,
|
||||
CursorPreparer preparer, DbObjectCallback<T> objectCallback) {
|
||||
|
||||
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
|
||||
|
||||
if (LOGGER.isDebugEnabled()) {
|
||||
LOGGER.debug(String.format("find using query: %s fields: %s for class: %s in collection: %s",
|
||||
serializeToJsonSafely(query), fields, entityClass, collectionName));
|
||||
}
|
||||
|
||||
return executeFindMultiInternal(new FindCallback(mapper.getMappedObject(query, entity), fields), preparer,
|
||||
objectCallback, collectionName);
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -1589,43 +1366,14 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
* @return the List of converted objects.
|
||||
*/
|
||||
protected <T> List<T> doFind(String collectionName, DBObject query, DBObject fields, Class<T> entityClass) {
|
||||
return doFind(collectionName, query, fields, entityClass, null, new ReadDbObjectCallback<T>(this.mongoConverter,
|
||||
entityClass));
|
||||
}
|
||||
|
||||
/**
|
||||
* Map the results of an ad-hoc query on the default MongoDB collection to a List of the specified type. The object is
|
||||
* converted from the MongoDB native representation using an instance of {@see MongoConverter}. The query document is
|
||||
* specified as a standard DBObject and so is the fields specification.
|
||||
*
|
||||
* @param collectionName name of the collection to retrieve the objects from.
|
||||
* @param query the query document that specifies the criteria used to find a record.
|
||||
* @param fields the document that specifies the fields to be returned.
|
||||
* @param entityClass the parameterized type of the returned list.
|
||||
* @param preparer allows for customization of the {@link DBCursor} used when iterating over the result set, (apply
|
||||
* limits, skips and so on).
|
||||
* @return the {@link List} of converted objects.
|
||||
*/
|
||||
protected <T> List<T> doFind(String collectionName, DBObject query, DBObject fields, Class<T> entityClass,
|
||||
CursorPreparer preparer) {
|
||||
return doFind(collectionName, query, fields, entityClass, preparer, new ReadDbObjectCallback<T>(mongoConverter,
|
||||
entityClass));
|
||||
}
|
||||
|
||||
protected <S, T> List<T> doFind(String collectionName, DBObject query, DBObject fields, Class<S> entityClass,
|
||||
CursorPreparer preparer, DbObjectCallback<T> objectCallback) {
|
||||
|
||||
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
|
||||
DBObject mappedFields = fields == null ? null : queryMapper.getMappedObject(fields, entity);
|
||||
DBObject mappedQuery = queryMapper.getMappedObject(query, entity);
|
||||
|
||||
if (LOGGER.isDebugEnabled()) {
|
||||
LOGGER.debug(String.format("find using query: %s fields: %s for class: %s in collection: %s",
|
||||
serializeToJsonSafely(query), mappedFields, entityClass, collectionName));
|
||||
LOGGER.debug("find using query: " + query + " fields: " + fields + " for class: " + entityClass
|
||||
+ " in collection: " + collectionName);
|
||||
}
|
||||
|
||||
return executeFindMultiInternal(new FindCallback(mappedQuery, mappedFields), preparer, objectCallback,
|
||||
collectionName);
|
||||
EntityReader<? super T, DBObject> readerToUse = this.mongoConverter;
|
||||
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
|
||||
return executeFindMultiInternal(new FindCallback(mapper.getMappedObject(query, entity), fields), null,
|
||||
new ReadDbObjectCallback<T>(readerToUse, entityClass), collectionName);
|
||||
}
|
||||
|
||||
protected DBObject convertToDbObject(CollectionOptions collectionOptions) {
|
||||
@@ -1663,7 +1411,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
+ entityClass + " in collection: " + collectionName);
|
||||
}
|
||||
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
|
||||
return executeFindOneInternal(new FindAndRemoveCallback(queryMapper.getMappedObject(query, entity), fields, sort),
|
||||
return executeFindOneInternal(new FindAndRemoveCallback(mapper.getMappedObject(query, entity), fields, sort),
|
||||
new ReadDbObjectCallback<T>(readerToUse, entityClass), collectionName);
|
||||
}
|
||||
|
||||
@@ -1678,10 +1426,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
|
||||
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
|
||||
|
||||
increaseVersionForUpdateIfNecessary(entity, update);
|
||||
|
||||
DBObject mappedQuery = queryMapper.getMappedObject(query, entity);
|
||||
DBObject mappedUpdate = updateMapper.getMappedObject(update.getUpdateObject(), entity);
|
||||
DBObject mappedUpdate = mapper.getMappedObject(update.getUpdateObject(), entity);
|
||||
DBObject mappedQuery = mapper.getMappedObject(query, entity);
|
||||
|
||||
if (LOGGER.isDebugEnabled()) {
|
||||
LOGGER.debug("findAndModify using query: " + mappedQuery + " fields: " + fields + " sort: " + sort
|
||||
@@ -1717,9 +1463,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
}
|
||||
|
||||
ConversionService conversionService = mongoConverter.getConversionService();
|
||||
BeanWrapper<Object> wrapper = BeanWrapper.create(savedObject, conversionService);
|
||||
BeanWrapper<PersistentEntity<Object, ?>, Object> wrapper = BeanWrapper.create(savedObject, conversionService);
|
||||
|
||||
Object idValue = wrapper.getProperty(idProp, idProp.getType());
|
||||
Object idValue = wrapper.getProperty(idProp, idProp.getType(), true);
|
||||
|
||||
if (idValue != null) {
|
||||
return;
|
||||
@@ -1956,23 +1702,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
}
|
||||
|
||||
private static final MongoConverter getDefaultMongoConverter(MongoDbFactory factory) {
|
||||
|
||||
DbRefResolver dbRefResolver = new DefaultDbRefResolver(factory);
|
||||
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, new MongoMappingContext());
|
||||
MappingMongoConverter converter = new MappingMongoConverter(factory, new MongoMappingContext());
|
||||
converter.afterPropertiesSet();
|
||||
return converter;
|
||||
}
|
||||
|
||||
private DBObject getMappedSortObject(Query query, Class<?> type) {
|
||||
|
||||
if (query == null || query.getSortObject() == null) {
|
||||
return null;
|
||||
}
|
||||
|
||||
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(type);
|
||||
return queryMapper.getMappedObject(query.getSortObject(), entity);
|
||||
}
|
||||
|
||||
// Callback implementations
|
||||
|
||||
/**
|
||||
@@ -2125,35 +1859,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
}
|
||||
}
|
||||
|
||||
class UnwrapAndReadDbObjectCallback<T> extends ReadDbObjectCallback<T> {
|
||||
|
||||
public UnwrapAndReadDbObjectCallback(EntityReader<? super T, DBObject> reader, Class<T> type) {
|
||||
super(reader, type);
|
||||
}
|
||||
|
||||
@Override
|
||||
public T doWith(DBObject object) {
|
||||
|
||||
Object idField = object.get(Fields.UNDERSCORE_ID);
|
||||
|
||||
if (!(idField instanceof DBObject)) {
|
||||
return super.doWith(object);
|
||||
}
|
||||
|
||||
DBObject toMap = new BasicDBObject();
|
||||
DBObject nested = (DBObject) idField;
|
||||
toMap.putAll(nested);
|
||||
|
||||
for (String key : object.keySet()) {
|
||||
if (!Fields.UNDERSCORE_ID.equals(key)) {
|
||||
toMap.put(key, object.get(key));
|
||||
}
|
||||
}
|
||||
|
||||
return super.doWith(toMap);
|
||||
}
|
||||
}
|
||||
|
||||
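To illustrate what `UnwrapAndReadDbObjectCallback` does before handing a document to the converter, a small sketch of the document shapes involved (the field names are placeholders):

```java
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

class UnwrapShapeSample {

	public static void main(String[] args) {

		// Shape of a $group result document: the grouping key lives in a nested _id sub-document.
		DBObject grouped = new BasicDBObject("_id", new BasicDBObject("city", "NY")).append("total", 5);

		// After UnwrapAndReadDbObjectCallback has flattened it, the converter effectively sees:
		DBObject flattened = new BasicDBObject("city", "NY").append("total", 5);

		System.out.println(grouped + " -> " + flattened);
	}
}
```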
private enum DefaultWriteConcernResolver implements WriteConcernResolver {
|
||||
|
||||
INSTANCE;
|
||||
@@ -2166,12 +1871,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
class QueryCursorPreparer implements CursorPreparer {
|
||||
|
||||
private final Query query;
|
||||
private final Class<?> type;
|
||||
|
||||
public QueryCursorPreparer(Query query, Class<?> type) {
|
||||
|
||||
public QueryCursorPreparer(Query query) {
|
||||
this.query = query;
|
||||
this.type = type;
|
||||
}
|
||||
|
||||
/*
|
||||
@@ -2199,8 +1901,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
cursorToUse = cursorToUse.limit(query.getLimit());
|
||||
}
|
||||
if (query.getSortObject() != null) {
|
||||
DBObject sortDbo = type != null ? getMappedSortObject(query, type) : query.getSortObject();
|
||||
cursorToUse = cursorToUse.sort(sortDbo);
|
||||
cursorToUse = cursorToUse.sort(query.getSortObject());
|
||||
}
|
||||
if (StringUtils.hasText(query.getHint())) {
|
||||
cursorToUse = cursorToUse.hint(query.getHint());
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2013 the original author or authors.
|
||||
* Copyright 2011-2012 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -19,11 +19,9 @@ import java.net.UnknownHostException;
|
||||
|
||||
import org.springframework.beans.factory.DisposableBean;
|
||||
import org.springframework.dao.DataAccessException;
|
||||
import org.springframework.dao.support.PersistenceExceptionTranslator;
|
||||
import org.springframework.data.authentication.UserCredentials;
|
||||
import org.springframework.data.mongodb.MongoDbFactory;
|
||||
import org.springframework.util.Assert;
|
||||
import org.springframework.util.StringUtils;
|
||||
|
||||
import com.mongodb.DB;
|
||||
import com.mongodb.Mongo;
|
||||
@@ -36,7 +34,6 @@ import com.mongodb.WriteConcern;
|
||||
*
|
||||
* @author Mark Pollack
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
|
||||
|
||||
@@ -44,9 +41,6 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
|
||||
private final String databaseName;
|
||||
private final boolean mongoInstanceCreated;
|
||||
private final UserCredentials credentials;
|
||||
private final PersistenceExceptionTranslator exceptionTranslator;
|
||||
private final String authenticationDatabaseName;
|
||||
|
||||
private WriteConcern writeConcern;
|
||||
|
||||
/**
|
||||
@@ -56,7 +50,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
|
||||
* @param databaseName database name, not be {@literal null} or empty.
|
||||
*/
|
||||
public SimpleMongoDbFactory(Mongo mongo, String databaseName) {
|
||||
this(mongo, databaseName, null);
|
||||
this(mongo, databaseName, UserCredentials.NO_CREDENTIALS, false);
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -67,20 +61,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
|
||||
* @param credentials username and password.
|
||||
*/
|
||||
public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials) {
|
||||
this(mongo, databaseName, credentials, false, null);
|
||||
}
|
||||
|
||||
/**
|
||||
* Create an instance of SimpleMongoDbFactory given the Mongo instance, database name, and username/password
|
||||
*
|
||||
* @param mongo Mongo instance, must not be {@literal null}.
|
||||
* @param databaseName Database name, must not be {@literal null} or empty.
|
||||
* @param credentials username and password.
|
||||
* @param authenticationDatabaseName the database name to use for authentication
|
||||
*/
|
||||
public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials,
|
||||
String authenticationDatabaseName) {
|
||||
this(mongo, databaseName, credentials, false, authenticationDatabaseName);
|
||||
this(mongo, databaseName, credentials, false);
|
||||
}
|
||||
|
||||
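A minimal wiring sketch for the factory above, assuming a locally running mongod and placeholder database name and credentials:

```java
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

import com.mongodb.Mongo;

class FactorySetupSample {

	static MongoTemplate template() throws Exception {

		// Connects to a locally running mongod and exposes the "database" database.
		MongoDbFactory factory = new SimpleMongoDbFactory(new Mongo("localhost"), "database");

		// With credentials (and, on the newer side of this diff, an optional authentication database).
		MongoDbFactory secured = new SimpleMongoDbFactory(new Mongo("localhost"), "database",
				new UserCredentials("user", "secret"));

		return new MongoTemplate(factory);
	}
}
```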
/**
|
||||
@@ -91,14 +72,12 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
|
||||
* @throws UnknownHostException
|
||||
* @see MongoURI
|
||||
*/
|
||||
@SuppressWarnings("deprecation")
|
||||
public SimpleMongoDbFactory(MongoURI uri) throws MongoException, UnknownHostException {
|
||||
this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), parseChars(uri.getPassword())),
|
||||
true, uri.getDatabase());
|
||||
this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), parseChars(uri.getPassword())), true);
|
||||
}
|
||||
|
||||
private SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials,
|
||||
boolean mongoInstanceCreated, String authenticationDatabaseName) {
|
||||
boolean mongoInstanceCreated) {
|
||||
|
||||
Assert.notNull(mongo, "Mongo must not be null");
|
||||
Assert.hasText(databaseName, "Database name must not be empty");
|
||||
@@ -109,12 +88,6 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
|
||||
this.databaseName = databaseName;
|
||||
this.mongoInstanceCreated = mongoInstanceCreated;
|
||||
this.credentials = credentials == null ? UserCredentials.NO_CREDENTIALS : credentials;
|
||||
this.exceptionTranslator = new MongoExceptionTranslator();
|
||||
this.authenticationDatabaseName = StringUtils.hasText(authenticationDatabaseName) ? authenticationDatabaseName
|
||||
: databaseName;
|
||||
|
||||
Assert.isTrue(this.authenticationDatabaseName.matches("[\\w-]+"),
|
||||
"Authentication database name must only contain letters, numbers, underscores and dashes!");
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -142,7 +115,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
|
||||
|
||||
Assert.hasText(dbName, "Database name must not be empty.");
|
||||
|
||||
DB db = MongoDbUtils.getDB(mongo, dbName, credentials, authenticationDatabaseName);
|
||||
DB db = MongoDbUtils.getDB(mongo, dbName, credentials);
|
||||
|
||||
if (writeConcern != null) {
|
||||
db.setWriteConcern(writeConcern);
|
||||
@@ -165,13 +138,4 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
|
||||
private static String parseChars(char[] chars) {
|
||||
return chars == null ? null : String.valueOf(chars);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.MongoDbFactory#getExceptionTranslator()
|
||||
*/
|
||||
@Override
|
||||
public PersistenceExceptionTranslator getExceptionTranslator() {
|
||||
return this.exceptionTranslator;
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,424 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013-2014 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import static org.springframework.data.mongodb.core.aggregation.Fields.*;
|
||||
|
||||
import java.util.ArrayList;
|
||||
import java.util.Arrays;
|
||||
import java.util.List;
|
||||
|
||||
import org.springframework.data.domain.Sort;
|
||||
import org.springframework.data.domain.Sort.Direction;
|
||||
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
|
||||
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
|
||||
import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
|
||||
import org.springframework.data.mongodb.core.query.Criteria;
|
||||
import org.springframework.data.mongodb.core.query.SerializationUtils;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
import com.mongodb.BasicDBObject;
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* An {@code Aggregation} is a representation of a list of aggregation steps to be performed by the MongoDB Aggregation
|
||||
* Framework.
|
||||
*
|
||||
* @author Tobias Trelle
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
* @since 1.3
|
||||
*/
|
||||
public class Aggregation {
|
||||
|
||||
/**
|
||||
* References the root document, i.e. the top-level document, currently being processed in the aggregation pipeline
|
||||
* stage.
|
||||
*/
|
||||
public static final String ROOT = SystemVariable.ROOT.toString();
|
||||
|
||||
/**
|
||||
* References the start of the field path being processed in the aggregation pipeline stage. Unless documented
|
||||
* otherwise, all stages start with CURRENT the same as ROOT.
|
||||
*/
|
||||
public static final String CURRENT = SystemVariable.CURRENT.toString();
|
||||
|
||||
public static final AggregationOperationContext DEFAULT_CONTEXT = new NoOpAggregationOperationContext();
|
||||
public static final AggregationOptions DEFAULT_OPTIONS = newAggregationOptions().build();
|
||||
|
||||
protected final List<AggregationOperation> operations;
|
||||
private final AggregationOptions options;
|
||||
|
||||
/**
|
||||
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
|
||||
*
|
||||
* @param operations must not be {@literal null} or empty.
|
||||
*/
|
||||
public static Aggregation newAggregation(List<? extends AggregationOperation> operations) {
|
||||
return newAggregation(operations.toArray(new AggregationOperation[operations.size()]));
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
|
||||
*
|
||||
* @param operations must not be {@literal null} or empty.
|
||||
*/
|
||||
public static Aggregation newAggregation(AggregationOperation... operations) {
|
||||
return new Aggregation(operations);
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns a copy of this {@link Aggregation} with the given {@link AggregationOptions} set. Note that options are
|
||||
* supported in MongoDB version 2.6+.
|
||||
*
|
||||
* @param options must not be {@literal null}.
|
||||
* @return
|
||||
* @since 1.6
|
||||
*/
|
||||
public Aggregation withOptions(AggregationOptions options) {
|
||||
|
||||
Assert.notNull(options, "AggregationOptions must not be null.");
|
||||
return new Aggregation(this.operations, options);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link TypedAggregation} for the given type and {@link AggregationOperation}s.
|
||||
*
|
||||
* @param type must not be {@literal null}.
|
||||
* @param operations must not be {@literal null} or empty.
|
||||
*/
|
||||
public static <T> TypedAggregation<T> newAggregation(Class<T> type, List<? extends AggregationOperation> operations) {
|
||||
return newAggregation(type, operations.toArray(new AggregationOperation[operations.size()]));
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link TypedAggregation} for the given type and {@link AggregationOperation}s.
|
||||
*
|
||||
* @param type must not be {@literal null}.
|
||||
* @param operations must not be {@literal null} or empty.
|
||||
*/
|
||||
public static <T> TypedAggregation<T> newAggregation(Class<T> type, AggregationOperation... operations) {
|
||||
return new TypedAggregation<T>(type, operations);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
|
||||
*
|
||||
* @param aggregationOperations must not be {@literal null} or empty.
|
||||
*/
|
||||
protected Aggregation(AggregationOperation... aggregationOperations) {
|
||||
this(asAggregationList(aggregationOperations));
|
||||
}
|
||||
|
||||
/**
|
||||
* @param aggregationOperations must not be {@literal null} or empty.
|
||||
* @return
|
||||
*/
|
||||
protected static List<AggregationOperation> asAggregationList(AggregationOperation... aggregationOperations) {
|
||||
|
||||
Assert.notEmpty(aggregationOperations, "AggregationOperations must not be null or empty!");
|
||||
|
||||
return Arrays.asList(aggregationOperations);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
|
||||
*
|
||||
* @param aggregationOperations must not be {@literal null} or empty.
|
||||
*/
|
||||
protected Aggregation(List<AggregationOperation> aggregationOperations) {
|
||||
this(aggregationOperations, DEFAULT_OPTIONS);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
|
||||
*
|
||||
* @param aggregationOperations must not be {@literal null} or empty.
|
||||
* @param options must not be {@literal null} or empty.
|
||||
*/
|
||||
protected Aggregation(List<AggregationOperation> aggregationOperations, AggregationOptions options) {
|
||||
|
||||
Assert.notNull(aggregationOperations, "AggregationOperations must not be null!");
|
||||
Assert.isTrue(aggregationOperations.size() > 0, "At least one AggregationOperation has to be provided");
|
||||
Assert.notNull(options, "AggregationOptions must not be null!");
|
||||
|
||||
this.operations = aggregationOperations;
|
||||
this.options = options;
|
||||
}
|
||||
|
||||
/**
|
||||
* A pointer to the previous {@link AggregationOperation}.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public static String previousOperation() {
|
||||
return "_id";
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link ProjectionOperation} including the given fields.
|
||||
*
|
||||
* @param fields must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public static ProjectionOperation project(String... fields) {
|
||||
return project(fields(fields));
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link ProjectionOperation} including the given {@link Fields}.
|
||||
*
|
||||
* @param fields must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public static ProjectionOperation project(Fields fields) {
|
||||
return new ProjectionOperation(fields);
|
||||
}
|
||||
|
||||
/**
|
||||
* Factory method to create a new {@link UnwindOperation} for the field with the given name.
|
||||
*
|
||||
* @param field must not be {@literal null} or empty.
|
||||
* @return
|
||||
*/
|
||||
public static UnwindOperation unwind(String field) {
|
||||
return new UnwindOperation(field(field));
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link GroupOperation} for the given fields.
|
||||
*
|
||||
* @param fields must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public static GroupOperation group(String... fields) {
|
||||
return group(fields(fields));
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link GroupOperation} for the given {@link Fields}.
|
||||
*
|
||||
* @param fields must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public static GroupOperation group(Fields fields) {
|
||||
return new GroupOperation(fields);
|
||||
}
|
||||
|
||||
/**
|
||||
* Factory method to create a new {@link SortOperation} for the given {@link Sort}.
|
||||
*
|
||||
* @param sort must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public static SortOperation sort(Sort sort) {
|
||||
return new SortOperation(sort);
|
||||
}
|
||||
|
||||
/**
|
||||
* Factory method to create a new {@link SortOperation} for the given sort {@link Direction} and {@code fields}.
|
||||
*
|
||||
* @param direction must not be {@literal null}.
|
||||
* @param fields must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public static SortOperation sort(Direction direction, String... fields) {
|
||||
return new SortOperation(new Sort(direction, fields));
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link SkipOperation} skipping the given number of elements.
|
||||
*
|
||||
* @param elementsToSkip must not be less than zero.
|
||||
* @return
|
||||
*/
|
||||
public static SkipOperation skip(int elementsToSkip) {
|
||||
return new SkipOperation(elementsToSkip);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link LimitOperation} limiting the result to the given number of elements.
|
||||
*
|
||||
* @param maxElements must not be less than zero.
|
||||
* @return
|
||||
*/
|
||||
public static LimitOperation limit(long maxElements) {
|
||||
return new LimitOperation(maxElements);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link MatchOperation} using the given {@link Criteria}.
|
||||
*
|
||||
* @param criteria must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public static MatchOperation match(Criteria criteria) {
|
||||
return new MatchOperation(criteria);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link Fields} instance for the given field names.
|
||||
*
|
||||
* @see Fields#fields(String...)
|
||||
* @param fields must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public static Fields fields(String... fields) {
|
||||
return Fields.fields(fields);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link Fields} instance from the given field name and target reference.
|
||||
*
|
||||
* @param name must not be {@literal null} or empty.
|
||||
* @param target must not be {@literal null} or empty.
|
||||
* @return
|
||||
*/
|
||||
public static Fields bind(String name, String target) {
|
||||
return Fields.from(field(name, target));
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns a new {@link AggregationOptions.Builder}.
|
||||
*
|
||||
* @return
|
||||
* @since 1.6
|
||||
*/
|
||||
public static AggregationOptions.Builder newAggregationOptions() {
|
||||
return new AggregationOptions.Builder();
|
||||
}
|
||||
|
||||
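A sketch of composing the static factory methods above into a pipeline and attaching MongoDB 2.6+ options to it (field names and the limit are placeholders; `count()`/`as()` come from the group builder API and are not shown in this diff):

```java
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.aggregation.Aggregation;

class PipelineSample {

	public static void main(String[] args) {

		// Compose a pipeline from the static factory methods and attach 2.6+ options to it.
		Aggregation aggregation = newAggregation(
				project("tags"),
				unwind("tags"),
				group("tags").count().as("n"),
				sort(Direction.DESC, "n"),
				limit(10))
				.withOptions(newAggregationOptions().allowDiskUse(true).build());

		// Renders the aggregate command produced by toDbObject(...) for a placeholder collection name.
		System.out.println(aggregation);
	}
}
```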
/**
|
||||
* Converts this {@link Aggregation} specification to a {@link DBObject}.
|
||||
*
|
||||
* @param inputCollectionName the name of the input collection
|
||||
* @return the {@code DBObject} representing this aggregation
|
||||
*/
|
||||
public DBObject toDbObject(String inputCollectionName, AggregationOperationContext rootContext) {
|
||||
|
||||
AggregationOperationContext context = rootContext;
|
||||
List<DBObject> operationDocuments = new ArrayList<DBObject>(operations.size());
|
||||
|
||||
for (AggregationOperation operation : operations) {
|
||||
|
||||
operationDocuments.add(operation.toDBObject(context));
|
||||
|
||||
if (operation instanceof FieldsExposingAggregationOperation) {
|
||||
FieldsExposingAggregationOperation exposedFieldsOperation = (FieldsExposingAggregationOperation) operation;
|
||||
context = new ExposedFieldsAggregationOperationContext(exposedFieldsOperation.getFields(), rootContext);
|
||||
}
|
||||
}
|
||||
|
||||
DBObject command = new BasicDBObject("aggregate", inputCollectionName);
|
||||
command.put("pipeline", operationDocuments);
|
||||
|
||||
command = options.applyAndReturnPotentiallyChangedCommand(command);
|
||||
|
||||
return command;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Object#toString()
|
||||
*/
|
||||
@Override
|
||||
public String toString() {
|
||||
return SerializationUtils
|
||||
.serializeToJsonSafely(toDbObject("__collection__", new NoOpAggregationOperationContext()));
|
||||
}
|
||||
|
||||
/**
|
||||
* Simple {@link AggregationOperationContext} that just returns {@link FieldReference}s as is.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
private static class NoOpAggregationOperationContext implements AggregationOperationContext {
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getMappedObject(com.mongodb.DBObject)
|
||||
*/
|
||||
@Override
|
||||
public DBObject getMappedObject(DBObject dbObject) {
|
||||
return dbObject;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.ExposedFields.AvailableField)
|
||||
*/
|
||||
@Override
|
||||
public FieldReference getReference(Field field) {
|
||||
return new FieldReference(new ExposedField(field, true));
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(java.lang.String)
|
||||
*/
|
||||
@Override
|
||||
public FieldReference getReference(String name) {
|
||||
return new FieldReference(new ExposedField(new AggregationField(name), true));
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Describes the system variables available in MongoDB aggregation framework pipeline expressions.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @see http://docs.mongodb.org/manual/reference/aggregation-variables
|
||||
*/
|
||||
enum SystemVariable {
|
||||
|
||||
ROOT, CURRENT;
|
||||
|
||||
private static final String PREFIX = "$$";
|
||||
|
||||
/**
|
||||
* Return {@literal true} if the given {@code fieldRef} denotes a well-known system variable, {@literal false}
|
||||
* otherwise.
|
||||
*
|
||||
* @param fieldRef may be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public static boolean isReferingToSystemVariable(String fieldRef) {
|
||||
|
||||
if (fieldRef == null || !fieldRef.startsWith(PREFIX) || fieldRef.length() <= 2) {
|
||||
return false;
|
||||
}
|
||||
|
||||
int indexOfFirstDot = fieldRef.indexOf('.');
|
||||
String candidate = fieldRef.substring(2, indexOfFirstDot == -1 ? fieldRef.length() : indexOfFirstDot);
|
||||
|
||||
for (SystemVariable value : values()) {
|
||||
if (value.name().equals(candidate)) {
|
||||
return true;
|
||||
}
|
||||
}
|
||||
|
||||
return false;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Enum#toString()
|
||||
*/
|
||||
@Override
|
||||
public String toString() {
|
||||
return PREFIX.concat(name());
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -1,82 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import org.springframework.data.mongodb.core.aggregation.AggregationExpressionTransformer.AggregationExpressionTransformationContext;
|
||||
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
|
||||
import org.springframework.data.mongodb.core.spel.ExpressionNode;
|
||||
import org.springframework.data.mongodb.core.spel.ExpressionTransformationContextSupport;
|
||||
import org.springframework.data.mongodb.core.spel.ExpressionTransformer;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* Interface to type an {@link ExpressionTransformer} to the contained
|
||||
* {@link AggregationExpressionTransformationContext}.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
interface AggregationExpressionTransformer extends
|
||||
ExpressionTransformer<AggregationExpressionTransformationContext<ExpressionNode>> {
|
||||
|
||||
/**
|
||||
* A special {@link ExpressionTransformationContextSupport} to be aware of the {@link AggregationOperationContext}.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
public static class AggregationExpressionTransformationContext<T extends ExpressionNode> extends
|
||||
ExpressionTransformationContextSupport<T> {
|
||||
|
||||
private final AggregationOperationContext aggregationContext;
|
||||
|
||||
/**
|
||||
* Creates an {@link AggregationExpressionTransformationContext}.
|
||||
*
|
||||
* @param currentNode must not be {@literal null}.
|
||||
* @param parentNode
|
||||
* @param previousOperationObject
|
||||
* @param aggregationContext must not be {@literal null}.
|
||||
*/
|
||||
public AggregationExpressionTransformationContext(T currentNode, ExpressionNode parentNode,
|
||||
DBObject previousOperationObject, AggregationOperationContext context) {
|
||||
|
||||
super(currentNode, parentNode, previousOperationObject);
|
||||
|
||||
Assert.notNull(context, "AggregationOperationContext must not be null!");
|
||||
this.aggregationContext = context;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the underlying {@link AggregationOperationContext}.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public AggregationOperationContext getAggregationContext() {
|
||||
return aggregationContext;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the {@link FieldReference} for the current {@link ExpressionNode}.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public FieldReference getFieldReference() {
|
||||
return aggregationContext.getReference(getCurrentNode().getName());
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -1,37 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* Represents one single operation in an aggregation pipeline.
|
||||
*
|
||||
* @author Sebastian Herold
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
* @since 1.3
|
||||
*/
|
||||
public interface AggregationOperation {
|
||||
|
||||
/**
|
||||
* Turns the {@link AggregationOperation} into a {@link DBObject} by using the given
|
||||
* {@link AggregationOperationContext}.
|
||||
*
|
||||
* @return the DBObject
|
||||
*/
|
||||
DBObject toDBObject(AggregationOperationContext context);
|
||||
}
|
||||
@@ -1,55 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
|
||||
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* The context for an {@link AggregationOperation}.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
* @since 1.3
|
||||
*/
|
||||
public interface AggregationOperationContext {
|
||||
|
||||
/**
|
||||
* Returns the mapped {@link DBObject}, potentially converting the source considering mapping metadata etc.
|
||||
*
|
||||
* @param dbObject will never be {@literal null}.
|
||||
* @return must not be {@literal null}.
|
||||
*/
|
||||
DBObject getMappedObject(DBObject dbObject);
|
||||
|
||||
/**
|
||||
* Returns a {@link FieldReference} for the given field or {@literal null} if the context does not expose the given
|
||||
* field.
|
||||
*
|
||||
* @param field must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
FieldReference getReference(Field field);
|
||||
|
||||
/**
|
||||
* Returns the {@link FieldReference} for the field with the given name or {@literal null} if the context does not
|
||||
* expose a field with the given name.
|
||||
*
|
||||
* @param name must not be {@literal null} or empty.
|
||||
* @return
|
||||
*/
|
||||
FieldReference getReference(String name);
|
||||
}
|
||||
@@ -1,189 +0,0 @@
|
||||
/*
|
||||
* Copyright 2014 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import com.mongodb.BasicDBObject;
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* Holds a set of configurable aggregation options that can be used within an aggregation pipeline. A list of supported
|
||||
* aggregation options can be found in the MongoDB reference documentation
|
||||
* http://docs.mongodb.org/manual/reference/command/aggregate/#aggregate
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
* @see Aggregation#withOptions(AggregationOptions)
|
||||
* @see TypedAggregation#withOptions(AggregationOptions)
|
||||
* @since 1.6
|
||||
*/
|
||||
public class AggregationOptions {
|
||||
|
||||
private static final String CURSOR = "cursor";
|
||||
private static final String EXPLAIN = "explain";
|
||||
private static final String ALLOW_DISK_USE = "allowDiskUse";
|
||||
|
||||
private final boolean allowDiskUse;
|
||||
private final boolean explain;
|
||||
private final DBObject cursor;
|
||||
|
||||
/**
|
||||
* Creates a new {@link AggregationOptions}.
|
||||
*
|
||||
* @param allowDiskUse whether to off-load intensive sort-operations to disk.
|
||||
* @param explain whether to get the execution plan for the aggregation instead of the actual results.
|
||||
	 * @param cursor can be {@literal null}, used to pass additional options to the aggregation.
	 */
	public AggregationOptions(boolean allowDiskUse, boolean explain, DBObject cursor) {

		this.allowDiskUse = allowDiskUse;
		this.explain = explain;
		this.cursor = cursor;
	}

	/**
	 * Enables writing to temporary files. When set to true, aggregation stages can write data to the _tmp subdirectory in
	 * the dbPath directory.
	 *
	 * @return
	 */
	public boolean isAllowDiskUse() {
		return allowDiskUse;
	}

	/**
	 * Specifies to return the information on the processing of the pipeline.
	 *
	 * @return
	 */
	public boolean isExplain() {
		return explain;
	}

	/**
	 * Specify a document that contains options that control the creation of the cursor object.
	 *
	 * @return
	 */
	public DBObject getCursor() {
		return cursor;
	}

	/**
	 * Returns a new potentially adjusted copy for the given {@code aggregationCommandObject} with the configuration
	 * applied.
	 *
	 * @param command the aggregation command.
	 * @return
	 */
	DBObject applyAndReturnPotentiallyChangedCommand(DBObject command) {

		DBObject result = new BasicDBObject(command.toMap());

		if (allowDiskUse && !result.containsField(ALLOW_DISK_USE)) {
			result.put(ALLOW_DISK_USE, allowDiskUse);
		}

		if (explain && !result.containsField(EXPLAIN)) {
			result.put(EXPLAIN, explain);
		}

		if (cursor != null && !result.containsField(CURSOR)) {
			result.put("cursor", cursor);
		}

		return result;
	}

	/**
	 * Returns a {@link DBObject} representation of this {@link AggregationOptions}.
	 *
	 * @return
	 */
	public DBObject toDbObject() {

		DBObject dbo = new BasicDBObject();
		dbo.put(ALLOW_DISK_USE, allowDiskUse);
		dbo.put(EXPLAIN, explain);
		dbo.put(CURSOR, cursor);

		return dbo;
	}

	/* (non-Javadoc)
	 * @see java.lang.Object#toString()
	 */
	@Override
	public String toString() {
		return toDbObject().toString();
	}

	/**
	 * A Builder for {@link AggregationOptions}.
	 *
	 * @author Thomas Darimont
	 */
	public static class Builder {

		private boolean allowDiskUse;
		private boolean explain;
		private DBObject cursor;

		/**
		 * Defines whether to off-load intensive sort-operations to disk.
		 *
		 * @param allowDiskUse
		 * @return
		 */
		public Builder allowDiskUse(boolean allowDiskUse) {

			this.allowDiskUse = allowDiskUse;
			return this;
		}

		/**
		 * Defines whether to get the execution plan for the aggregation instead of the actual results.
		 *
		 * @param explain
		 * @return
		 */
		public Builder explain(boolean explain) {

			this.explain = explain;
			return this;
		}

		/**
		 * Additional options to the aggregation.
		 *
		 * @param cursor
		 * @return
		 */
		public Builder cursor(DBObject cursor) {

			this.cursor = cursor;
			return this;
		}

		/**
		 * Returns a new {@link AggregationOptions} instance with the given configuration.
		 *
		 * @return
		 */
		public AggregationOptions build() {
			return new AggregationOptions(allowDiskUse, explain, cursor);
		}
	}
}
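A minimal usage sketch for the options above, assembled through the nested `Builder`; the `batchSize` cursor option is just an illustrative value, and attaching the options to an `Aggregation` (e.g. via `withOptions(...)`) is not part of this diff:

```java
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

class AggregationOptionsSketch {

	public static void main(String[] args) {

		AggregationOptions options = new AggregationOptions.Builder() //
				.allowDiskUse(true) //
				.explain(false) //
				.cursor(new BasicDBObject("batchSize", 100)) // hypothetical cursor option
				.build();

		// { "allowDiskUse" : true , "explain" : false , "cursor" : { "batchSize" : 100 }}
		DBObject dbo = options.toDbObject();
		System.out.println(dbo);
	}
}
```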
@@ -1,109 +0,0 @@
/*
 * Copyright 2013-2014 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core.aggregation;

import java.util.Collections;
import java.util.Iterator;
import java.util.List;

import org.springframework.util.Assert;

import com.mongodb.DBObject;

/**
 * Collects the results of executing an aggregation operation.
 *
 * @author Tobias Trelle
 * @author Oliver Gierke
 * @author Thomas Darimont
 * @param <T> The class onto which the results are mapped.
 * @since 1.3
 */
public class AggregationResults<T> implements Iterable<T> {

	private final List<T> mappedResults;
	private final DBObject rawResults;
	private final String serverUsed;

	/**
	 * Creates a new {@link AggregationResults} instance from the given mapped and raw results.
	 *
	 * @param mappedResults must not be {@literal null}.
	 * @param rawResults must not be {@literal null}.
	 */
	public AggregationResults(List<T> mappedResults, DBObject rawResults) {

		Assert.notNull(mappedResults);
		Assert.notNull(rawResults);

		this.mappedResults = Collections.unmodifiableList(mappedResults);
		this.rawResults = rawResults;
		this.serverUsed = parseServerUsed();
	}

	/**
	 * Returns the aggregation results.
	 *
	 * @return
	 */
	public List<T> getMappedResults() {
		return mappedResults;
	}

	/**
	 * Returns the unique mapped result. Assumes no result or exactly one.
	 *
	 * @return
	 * @throws IllegalArgumentException in case more than one result is available.
	 */
	public T getUniqueMappedResult() {
		Assert.isTrue(mappedResults.size() < 2, "Expected unique result or null, but got more than one!");
		return mappedResults.size() == 1 ? mappedResults.get(0) : null;
	}

	/*
	 * (non-Javadoc)
	 * @see java.lang.Iterable#iterator()
	 */
	public Iterator<T> iterator() {
		return mappedResults.iterator();
	}

	/**
	 * Returns the server that has been used to perform the aggregation.
	 *
	 * @return
	 */
	public String getServerUsed() {
		return serverUsed;
	}

	/**
	 * Returns the raw result that was returned by the server.
	 *
	 * @return
	 * @since 1.6
	 */
	public DBObject getRawResults() {
		return rawResults;
	}

	private String parseServerUsed() {

		Object object = rawResults.get("serverUsed");
		return object instanceof String ? (String) object : null;
	}
}
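A small sketch of how the mapped and raw results might be consumed; the raw result document is made up for the example:

```java
import java.util.Arrays;

import org.springframework.data.mongodb.core.aggregation.AggregationResults;

import com.mongodb.BasicDBObject;

class AggregationResultsSketch {

	public static void main(String[] args) {

		// Raw command result; the content is made up for the example.
		BasicDBObject rawResults = new BasicDBObject("serverUsed", "localhost:27017");

		AggregationResults<String> results = new AggregationResults<String>(Arrays.asList("a", "b"), rawResults);

		for (String value : results) { // Iterable<T>
			System.out.println(value);
		}

		System.out.println(results.getServerUsed()); // localhost:27017
		System.out.println(results.getRawResults()); // the document passed in above
	}
}
```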
@@ -1,413 +0,0 @@
/*
 * Copyright 2013-2014 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core.aggregation;

import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.Iterator;
import java.util.List;

import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.util.Assert;
import org.springframework.util.CompositeIterator;

/**
 * Value object to capture the fields exposed by an {@link AggregationOperation}.
 *
 * @author Oliver Gierke
 * @author Thomas Darimont
 * @since 1.3
 */
public final class ExposedFields implements Iterable<ExposedField> {

	private static final List<ExposedField> NO_FIELDS = Collections.emptyList();
	private static final ExposedFields EMPTY = new ExposedFields(NO_FIELDS, NO_FIELDS);

	private final List<ExposedField> originalFields;
	private final List<ExposedField> syntheticFields;

	/**
	 * Creates a new {@link ExposedFields} instance from the given {@link ExposedField}s.
	 *
	 * @param fields must not be {@literal null}.
	 * @return
	 */
	public static ExposedFields from(ExposedField... fields) {
		return from(Arrays.asList(fields));
	}

	/**
	 * Creates a new {@link ExposedFields} instance from the given {@link ExposedField}s.
	 *
	 * @param fields must not be {@literal null}.
	 * @return
	 */
	private static ExposedFields from(List<ExposedField> fields) {

		ExposedFields result = EMPTY;

		for (ExposedField field : fields) {
			result = result.and(field);
		}

		return result;
	}

	/**
	 * Creates synthetic {@link ExposedFields} from the given {@link Fields}.
	 *
	 * @param fields must not be {@literal null}.
	 * @return
	 */
	public static ExposedFields synthetic(Fields fields) {
		return createFields(fields, true);
	}

	/**
	 * Creates non-synthetic {@link ExposedFields} from the given {@link Fields}.
	 *
	 * @param fields must not be {@literal null}.
	 * @return
	 */
	public static ExposedFields nonSynthetic(Fields fields) {
		return createFields(fields, false);
	}

	/**
	 * Creates a new {@link ExposedFields} instance for the given fields in either a synthetic or non-synthetic way.
	 *
	 * @param fields must not be {@literal null}.
	 * @param synthetic
	 * @return
	 */
	private static ExposedFields createFields(Fields fields, boolean synthetic) {

		Assert.notNull(fields, "Fields must not be null!");
		List<ExposedField> result = new ArrayList<ExposedField>();

		for (Field field : fields) {
			result.add(new ExposedField(field, synthetic));
		}

		return ExposedFields.from(result);
	}

	/**
	 * Creates a new {@link ExposedFields} with the given originals and synthetics.
	 *
	 * @param originals must not be {@literal null}.
	 * @param synthetic must not be {@literal null}.
	 */
	private ExposedFields(List<ExposedField> originals, List<ExposedField> synthetic) {

		this.originalFields = originals;
		this.syntheticFields = synthetic;
	}

	/**
	 * Creates a new {@link ExposedFields} adding the given {@link ExposedField}.
	 *
	 * @param field must not be {@literal null}.
	 * @return
	 */
	public ExposedFields and(ExposedField field) {

		Assert.notNull(field, "Exposed field must not be null!");

		ArrayList<ExposedField> result = new ArrayList<ExposedField>();
		result.addAll(field.synthetic ? syntheticFields : originalFields);
		result.add(field);

		return new ExposedFields(field.synthetic ? originalFields : result, field.synthetic ? result : syntheticFields);
	}

	/**
	 * Returns the field with the given name or {@literal null} if no field with the given name is available.
	 *
	 * @param name
	 * @return
	 */
	public ExposedField getField(String name) {

		for (ExposedField field : this) {
			if (field.canBeReferredToBy(name)) {
				return field;
			}
		}

		return null;
	}

	/**
	 * Returns whether the {@link ExposedFields} exposes no non-synthetic fields at all.
	 *
	 * @return
	 */
	boolean exposesNoNonSyntheticFields() {
		return originalFields.isEmpty();
	}

	/**
	 * Returns whether the {@link ExposedFields} exposes a single non-synthetic field only.
	 *
	 * @return
	 */
	boolean exposesSingleNonSyntheticFieldOnly() {
		return originalFields.size() == 1;
	}

	/**
	 * Returns whether the {@link ExposedFields} exposes no fields at all.
	 *
	 * @return
	 */
	boolean exposesNoFields() {
		return exposedFieldsCount() == 0;
	}

	/**
	 * Returns whether the {@link ExposedFields} exposes a single field only.
	 *
	 * @return
	 */
	boolean exposesSingleFieldOnly() {
		return exposedFieldsCount() == 1;
	}

	/**
	 * @return
	 */
	private int exposedFieldsCount() {
		return originalFields.size() + syntheticFields.size();
	}

	/*
	 * (non-Javadoc)
	 * @see java.lang.Iterable#iterator()
	 */
	@Override
	public Iterator<ExposedField> iterator() {

		CompositeIterator<ExposedField> iterator = new CompositeIterator<ExposedField>();
		iterator.add(syntheticFields.iterator());
		iterator.add(originalFields.iterator());

		return iterator;
	}

	/**
	 * A single exposed field.
	 *
	 * @author Oliver Gierke
	 */
	static class ExposedField implements Field {

		private final boolean synthetic;
		private final Field field;

		/**
		 * Creates a new {@link ExposedField} with the given key.
		 *
		 * @param key must not be {@literal null} or empty.
		 * @param synthetic whether the exposed field is synthetic.
		 */
		public ExposedField(String key, boolean synthetic) {
			this(Fields.field(key), synthetic);
		}

		/**
		 * Creates a new {@link ExposedField} for the given {@link Field}.
		 *
		 * @param delegate must not be {@literal null}.
		 * @param synthetic whether the exposed field is synthetic.
		 */
		public ExposedField(Field delegate, boolean synthetic) {

			this.field = delegate;
			this.synthetic = synthetic;
		}

		/*
		 * (non-Javadoc)
		 * @see org.springframework.data.mongodb.core.aggregation.Field#getKey()
		 */
		@Override
		public String getName() {
			return field.getName();
		}

		/*
		 * (non-Javadoc)
		 * @see org.springframework.data.mongodb.core.aggregation.Field#getTarget()
		 */
		@Override
		public String getTarget() {
			return field.getTarget();
		}

		/*
		 * (non-Javadoc)
		 * @see org.springframework.data.mongodb.core.aggregation.Field#isAliased()
		 */
		@Override
		public boolean isAliased() {
			return field.isAliased();
		}

		/**
		 * @return the synthetic
		 */
		public boolean isSynthetic() {
			return synthetic;
		}

		/**
		 * Returns whether the field can be referred to using the given name.
		 *
		 * @param name
		 * @return
		 */
		public boolean canBeReferredToBy(String name) {
			return getName().equals(name) || getTarget().equals(name);
		}

		/*
		 * (non-Javadoc)
		 * @see java.lang.Object#toString()
		 */
		@Override
		public String toString() {
			return String.format("AggregationField: %s, synthetic: %s", field, synthetic);
		}

		/*
		 * (non-Javadoc)
		 * @see java.lang.Object#equals(java.lang.Object)
		 */
		@Override
		public boolean equals(Object obj) {

			if (this == obj) {
				return true;
			}

			if (!(obj instanceof ExposedField)) {
				return false;
			}

			ExposedField that = (ExposedField) obj;

			return this.field.equals(that.field) && this.synthetic == that.synthetic;
		}

		/*
		 * (non-Javadoc)
		 * @see java.lang.Object#hashCode()
		 */
		@Override
		public int hashCode() {

			int result = 17;

			result += 31 * field.hashCode();
			result += 31 * (synthetic ? 0 : 1);

			return result;
		}
	}

	/**
	 * A reference to an {@link ExposedField}.
	 *
	 * @author Oliver Gierke
	 */
	static class FieldReference {

		private final ExposedField field;

		/**
		 * Creates a new {@link FieldReference} for the given {@link ExposedField}.
		 *
		 * @param field must not be {@literal null}.
		 */
		public FieldReference(ExposedField field) {

			Assert.notNull(field, "ExposedField must not be null!");

			this.field = field;
		}

		/**
		 * Returns the raw, unqualified reference, i.e. the field reference without a {@literal $} prefix.
		 *
		 * @return
		 */
		public String getRaw() {

			String target = field.getTarget();
			return field.synthetic ? target : String.format("%s.%s", Fields.UNDERSCORE_ID, target);
		}

		/**
		 * Returns the reference value for the given field reference. Will return 1 for a synthetic, unaliased field or the
		 * raw rendering of the reference otherwise.
		 *
		 * @return
		 */
		public Object getReferenceValue() {
			return field.synthetic && !field.isAliased() ? 1 : toString();
		}

		/*
		 * (non-Javadoc)
		 * @see java.lang.Object#toString()
		 */
		@Override
		public String toString() {
			return String.format("$%s", getRaw());
		}

		/*
		 * (non-Javadoc)
		 * @see java.lang.Object#equals(java.lang.Object)
		 */
		@Override
		public boolean equals(Object obj) {

			if (this == obj) {
				return true;
			}

			if (!(obj instanceof FieldReference)) {
				return false;
			}

			FieldReference that = (FieldReference) obj;

			return this.field.equals(that.field);
		}

		/*
		 * (non-Javadoc)
		 * @see java.lang.Object#hashCode()
		 */
		@Override
		public int hashCode() {
			return field.hashCode();
		}
	}
}
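A sketch of how exposed fields and field references behave, assuming same-package access since `ExposedField` and `FieldReference` are package-visible:

```java
package org.springframework.data.mongodb.core.aggregation;

import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;

class ExposedFieldsSketch {

	public static void main(String[] args) {

		// Non-synthetic fields refer to properties of the original documents.
		ExposedFields exposed = ExposedFields.nonSynthetic(Fields.fields("foo", "bar.baz"));
		ExposedField foo = exposed.getField("foo");
		System.out.println(foo.isSynthetic()); // false

		// A reference to a synthetic, aliased field renders as $<target>.
		FieldReference reference = new FieldReference(new ExposedField(Fields.field("total", "sum"), true));
		System.out.println(reference); // $sum
		System.out.println(reference.getReferenceValue()); // $sum (only synthetic, unaliased fields render as 1)
	}
}
```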
@@ -1,117 +0,0 @@
/*
 * Copyright 2013-2014 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core.aggregation;

import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.util.Assert;

import com.mongodb.DBObject;

/**
 * {@link AggregationOperationContext} that combines the available field references from a given
 * {@code AggregationOperationContext} and a {@link FieldsExposingAggregationOperation}.
 *
 * @author Thomas Darimont
 * @author Oliver Gierke
 * @since 1.4
 */
class ExposedFieldsAggregationOperationContext implements AggregationOperationContext {

	private final ExposedFields exposedFields;
	private final AggregationOperationContext rootContext;

	/**
	 * Creates a new {@link ExposedFieldsAggregationOperationContext} from the given {@link ExposedFields}. Uses the given
	 * {@link AggregationOperationContext} to perform a mapping to mongo types if necessary.
	 *
	 * @param exposedFields must not be {@literal null}.
	 * @param rootContext must not be {@literal null}.
	 */
	public ExposedFieldsAggregationOperationContext(ExposedFields exposedFields, AggregationOperationContext rootContext) {

		Assert.notNull(exposedFields, "ExposedFields must not be null!");
		Assert.notNull(rootContext, "RootContext must not be null!");

		this.exposedFields = exposedFields;
		this.rootContext = rootContext;
	}

	/*
	 * (non-Javadoc)
	 * @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getMappedObject(com.mongodb.DBObject)
	 */
	@Override
	public DBObject getMappedObject(DBObject dbObject) {
		return rootContext.getMappedObject(dbObject);
	}

	/*
	 * (non-Javadoc)
	 * @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.ExposedFields.AvailableField)
	 */
	@Override
	public FieldReference getReference(Field field) {
		return getReference(field, field.getTarget());
	}

	/*
	 * (non-Javadoc)
	 * @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(java.lang.String)
	 */
	@Override
	public FieldReference getReference(String name) {
		return getReference(null, name);
	}

	/**
	 * Returns a {@link FieldReference} to the given {@link Field} with the given {@code name}.
	 *
	 * @param field may be {@literal null}
	 * @param name must not be {@literal null}
	 * @return
	 */
	private FieldReference getReference(Field field, String name) {

		Assert.notNull(name, "Name must not be null!");

		ExposedField exposedField = exposedFields.getField(name);

		if (exposedField != null) {

			if (field != null) {
				// we return a FieldReference to the given field directly to make sure that we reference the proper alias here.
				return new FieldReference(new ExposedField(field, exposedField.isSynthetic()));
			}

			return new FieldReference(exposedField);
		}

		if (name.contains(".")) {

			// for nested field references we only check that the root field exists.
			ExposedField rootField = exposedFields.getField(name.split("\\.")[0]);

			if (rootField != null) {

				// We have to set synthetic to true in order to render the field-name as is.
				return new FieldReference(new ExposedField(name, true));
			}
		}

		throw new IllegalArgumentException(String.format("Invalid reference '%s'!", name));
	}
}
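A sketch of resolving a reference through this context. The pass-through root context is only a stand-in for the real mapping context, and the `AggregationOperationContext` method set is assumed from the overrides above; the sketch also assumes same-package access:

```java
package org.springframework.data.mongodb.core.aggregation;

import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;

import com.mongodb.DBObject;

class ExposedFieldsContextSketch {

	public static void main(String[] args) {

		// Pass-through root context; the real root context maps types against the MongoDB mapping layer.
		AggregationOperationContext root = new AggregationOperationContext() {

			@Override
			public DBObject getMappedObject(DBObject dbObject) {
				return dbObject;
			}

			@Override
			public FieldReference getReference(Field field) {
				return getReference(field.getTarget());
			}

			@Override
			public FieldReference getReference(String name) {
				return new FieldReference(new ExposedFields.ExposedField(name, true));
			}
		};

		ExposedFields exposed = ExposedFields.synthetic(Fields.fields("total"));
		AggregationOperationContext context = new ExposedFieldsAggregationOperationContext(exposed, root);

		System.out.println(context.getReference("total")); // $total
	}
}
```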
@@ -1,46 +0,0 @@
/*
 * Copyright 2013 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core.aggregation;

/**
 * Abstraction for a field.
 *
 * @author Oliver Gierke
 * @since 1.3
 */
public interface Field {

	/**
	 * Returns the name of the field.
	 *
	 * @return must not be {@literal null}.
	 */
	String getName();

	/**
	 * Returns the target of the field. In case no explicit target is available {@link #getName()} should be returned.
	 *
	 * @return must not be {@literal null}.
	 */
	String getTarget();

	/**
	 * Returns whether the Field is aliased, which means it has a name set different from the target.
	 *
	 * @return
	 */
	boolean isAliased();
}
@@ -1,298 +0,0 @@
/*
 * Copyright 2013-2014 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core.aggregation;

import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.Map;

import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;

/**
 * Value object to capture a list of {@link Field} instances.
 *
 * @author Oliver Gierke
 * @author Thomas Darimont
 * @since 1.3
 */
public final class Fields implements Iterable<Field> {

	private static final String AMBIGUOUS_EXCEPTION = "Found two fields both using '%s' as name: %s and %s! Please "
			+ "customize your field definitions to get to unique field names!";

	public static final String UNDERSCORE_ID = "_id";
	public static final String UNDERSCORE_ID_REF = "$_id";

	private final List<Field> fields;

	/**
	 * Creates a new {@link Fields} instance from the given {@link Field}s.
	 *
	 * @param fields must not be {@literal null} or empty.
	 * @return
	 */
	public static Fields from(Field... fields) {

		Assert.notNull(fields, "Fields must not be null!");
		return new Fields(Arrays.asList(fields));
	}

	/**
	 * Creates a new {@link Fields} instance for {@link Field}s with the given names.
	 *
	 * @param names must not be {@literal null}.
	 * @return
	 */
	public static Fields fields(String... names) {

		Assert.notNull(names, "Field names must not be null!");

		List<Field> fields = new ArrayList<Field>();

		for (String name : names) {
			fields.add(field(name));
		}

		return new Fields(fields);
	}

	/**
	 * Creates a {@link Field} with the given name.
	 *
	 * @param name must not be {@literal null} or empty.
	 * @return
	 */
	public static Field field(String name) {
		return new AggregationField(name);
	}

	public static Field field(String name, String target) {
		Assert.hasText(target, "Target must not be null or empty!");
		return new AggregationField(name, target);
	}

	/**
	 * Creates a new {@link Fields} instance using the given {@link Field}s.
	 *
	 * @param fields must not be {@literal null}.
	 */
	private Fields(List<Field> fields) {

		Assert.notNull(fields, "Fields must not be null!");

		this.fields = verify(fields);
	}

	private static final List<Field> verify(List<Field> fields) {

		Map<String, Field> reference = new HashMap<String, Field>();

		for (Field field : fields) {

			String name = field.getName();
			Field found = reference.get(name);

			if (found != null) {
				throw new IllegalArgumentException(String.format(AMBIGUOUS_EXCEPTION, name, found, field));
			}

			reference.put(name, field);
		}

		return fields;
	}

	private Fields(Fields existing, Field tail) {

		this.fields = new ArrayList<Field>(existing.fields.size() + 1);
		this.fields.addAll(existing.fields);
		this.fields.add(tail);
	}

	/**
	 * Creates a new {@link Fields} instance with a new {@link Field} of the given name added.
	 *
	 * @param name must not be {@literal null}.
	 * @return
	 */
	public Fields and(String name) {
		return and(new AggregationField(name));
	}

	public Fields and(String name, String target) {
		return and(new AggregationField(name, target));
	}

	public Fields and(Field field) {
		return new Fields(this, field);
	}

	public Fields and(Fields fields) {

		Fields result = this;

		for (Field field : fields) {
			result = result.and(field);
		}

		return result;
	}

	public Field getField(String name) {

		for (Field field : fields) {
			if (field.getName().equals(name)) {
				return field;
			}
		}

		return null;
	}

	/*
	 * (non-Javadoc)
	 * @see java.lang.Iterable#iterator()
	 */
	@Override
	public Iterator<Field> iterator() {
		return fields.iterator();
	}

	/**
	 * Value object to encapsulate a field in an aggregation operation.
	 *
	 * @author Oliver Gierke
	 */
	static class AggregationField implements Field {

		private final String name;
		private final String target;

		/**
		 * Creates an aggregation field with the given name. As no target is set explicitly, the name will be used as target
		 * as well.
		 *
		 * @param key
		 */
		public AggregationField(String key) {
			this(key, null);
		}

		public AggregationField(String name, String target) {

			String nameToSet = cleanUp(name);
			String targetToSet = cleanUp(target);

			Assert.hasText(nameToSet, "AggregationField name must not be null or empty!");

			if (target == null && name.contains(".")) {
				this.name = nameToSet.substring(nameToSet.indexOf('.') + 1);
				this.target = nameToSet;
			} else {
				this.name = nameToSet;
				this.target = targetToSet;
			}
		}

		private static final String cleanUp(String source) {

			if (source == null) {
				return source;
			}

			if (Aggregation.SystemVariable.isReferingToSystemVariable(source)) {
				return source;
			}

			int dollarIndex = source.lastIndexOf('$');
			return dollarIndex == -1 ? source : source.substring(dollarIndex + 1);
		}

		/*
		 * (non-Javadoc)
		 * @see org.springframework.data.mongodb.core.aggregation.Field#getKey()
		 */
		public String getName() {
			return name;
		}

		/*
		 * (non-Javadoc)
		 * @see org.springframework.data.mongodb.core.aggregation.Field#getAlias()
		 */
		public String getTarget() {
			return StringUtils.hasText(this.target) ? this.target : this.name;
		}

		/*
		 * (non-Javadoc)
		 * @see org.springframework.data.mongodb.core.aggregation.Field#isAliased()
		 */
		@Override
		public boolean isAliased() {
			return !getName().equals(getTarget());
		}

		/*
		 * (non-Javadoc)
		 * @see java.lang.Object#toString()
		 */
		@Override
		public String toString() {
			return String.format("AggregationField - name: %s, target: %s", name, target);
		}

		/*
		 * (non-Javadoc)
		 * @see java.lang.Object#equals(java.lang.Object)
		 */
		@Override
		public boolean equals(Object obj) {

			if (this == obj) {
				return true;
			}

			if (!(obj instanceof AggregationField)) {
				return false;
			}

			AggregationField that = (AggregationField) obj;

			return this.name.equals(that.name) && ObjectUtils.nullSafeEquals(this.target, that.target);
		}

		/*
		 * (non-Javadoc)
		 * @see java.lang.Object#hashCode()
		 */
		@Override
		public int hashCode() {

			int result = 17;

			result += 31 * name.hashCode();
			result += 31 * ObjectUtils.nullSafeHashCode(target);

			return result;
		}
	}
}
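A short sketch of plain vs. aliased field definitions; the field names are illustrative:

```java
import org.springframework.data.mongodb.core.aggregation.Field;
import org.springframework.data.mongodb.core.aggregation.Fields;

class FieldsSketch {

	public static void main(String[] args) {

		// "netPrice" keeps its own name, "tax" is an alias for the "details.taxRate" property.
		Fields fields = Fields.fields("netPrice").and("tax", "details.taxRate");

		for (Field field : fields) {
			System.out.println(field.getName() + " -> " + field.getTarget() + ", aliased: " + field.isAliased());
		}
		// netPrice -> netPrice, aliased: false
		// tax -> details.taxRate, aliased: true

		// Ambiguous names are rejected when building from Field instances:
		// Fields.from(Fields.field("a"), Fields.field("a", "b")); // IllegalArgumentException
	}
}
```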
@@ -1,32 +0,0 @@
/*
 * Copyright 2013 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core.aggregation;

/**
 * {@link AggregationOperation} that exposes new {@link ExposedFields} that can be used for later aggregation pipeline
 * {@code AggregationOperation}s.
 *
 * @author Thomas Darimont
 */
public interface FieldsExposingAggregationOperation extends AggregationOperation {

	/**
	 * Returns the fields exposed by the {@link AggregationOperation}.
	 *
	 * @return will never be {@literal null}.
	 */
	ExposedFields getFields();
}
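A hypothetical stage implementing this interface might look as follows; the `$project`-based rendering and the field names are illustration only, not an operation that exists in the framework:

```java
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.ExposedFields;
import org.springframework.data.mongodb.core.aggregation.Fields;
import org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

/**
 * Hypothetical stage that projects "origin" onto a new "source" field and exposes it to later stages.
 */
class AddSourceOperation implements FieldsExposingAggregationOperation {

	@Override
	public DBObject toDBObject(AggregationOperationContext context) {
		return new BasicDBObject("$project", new BasicDBObject("source", "$origin"));
	}

	@Override
	public ExposedFields getFields() {
		// Synthetic, because "source" does not exist on the original documents.
		return ExposedFields.synthetic(Fields.fields("source"));
	}
}
```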
@@ -1,46 +0,0 @@
/*
 * Copyright 2013 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core.aggregation;

import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.util.Assert;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

/**
 * @author Thomas Darimont
 * @since 1.3
 */
public class GeoNearOperation implements AggregationOperation {

	private final NearQuery nearQuery;

	public GeoNearOperation(NearQuery nearQuery) {

		Assert.notNull(nearQuery);
		this.nearQuery = nearQuery;
	}

	/*
	 * (non-Javadoc)
	 * @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
	 */
	@Override
	public DBObject toDBObject(AggregationOperationContext context) {
		return new BasicDBObject("$geoNear", context.getMappedObject(nearQuery.toDBObject()));
	}
}
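A sketch of producing a `$geoNear` stage from a `NearQuery`; coordinates and distance are illustrative, and the context would normally be supplied by the aggregation infrastructure:

```java
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.GeoNearOperation;
import org.springframework.data.mongodb.core.query.NearQuery;

import com.mongodb.DBObject;

class GeoNearSketch {

	// context would normally be provided by the surrounding Aggregation.
	static DBObject geoNearStage(AggregationOperationContext context) {

		NearQuery nearQuery = NearQuery.near(-73.99171, 40.738868).maxDistance(0.025);
		return new GeoNearOperation(nearQuery).toDBObject(context); // { "$geoNear" : { ... } }
	}
}
```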
@@ -1,384 +0,0 @@
/*
 * Copyright 2013-2014 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core.aggregation;

import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.Locale;

import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

/**
 * Encapsulates the aggregation framework {@code $group}-operation.
 *
 * @see http://docs.mongodb.org/manual/reference/aggregation/group/#stage._S_group
 * @author Sebastian Herold
 * @author Thomas Darimont
 * @author Oliver Gierke
 * @since 1.3
 */
public class GroupOperation implements FieldsExposingAggregationOperation {

	/**
	 * Holds the non-synthetic fields which are the fields of the group-id structure.
	 */
	private final ExposedFields idFields;

	private final List<Operation> operations;

	/**
	 * Creates a new {@link GroupOperation} including the given {@link Fields}.
	 *
	 * @param fields must not be {@literal null}.
	 */
	public GroupOperation(Fields fields) {

		this.idFields = ExposedFields.nonSynthetic(fields);
		this.operations = new ArrayList<Operation>();
	}

	/**
	 * Creates a new {@link GroupOperation} from the given {@link GroupOperation}.
	 *
	 * @param groupOperation must not be {@literal null}.
	 */
	protected GroupOperation(GroupOperation groupOperation) {
		this(groupOperation, Collections.<Operation> emptyList());
	}

	/**
	 * Creates a new {@link GroupOperation} from the given {@link GroupOperation} and the given {@link Operation}s.
	 *
	 * @param groupOperation
	 * @param nextOperations
	 */
	private GroupOperation(GroupOperation groupOperation, List<Operation> nextOperations) {

		Assert.notNull(groupOperation, "GroupOperation must not be null!");
		Assert.notNull(nextOperations, "NextOperations must not be null!");

		this.idFields = groupOperation.idFields;
		this.operations = new ArrayList<Operation>(nextOperations.size() + 1);
		this.operations.addAll(groupOperation.operations);
		this.operations.addAll(nextOperations);
	}

	/**
	 * Creates a new {@link GroupOperation} from the current one adding the given {@link Operation}.
	 *
	 * @param operation must not be {@literal null}.
	 * @return
	 */
	protected GroupOperation and(Operation operation) {
		return new GroupOperation(this, Arrays.asList(operation));
	}

	/**
	 * Builder for {@link GroupOperation}s on a field.
	 *
	 * @author Thomas Darimont
	 */
	public static final class GroupOperationBuilder {

		private final GroupOperation groupOperation;
		private final Operation operation;

		/**
		 * Creates a new {@link GroupOperationBuilder} from the given {@link GroupOperation} and {@link Operation}.
		 *
		 * @param groupOperation
		 * @param operation
		 */
		private GroupOperationBuilder(GroupOperation groupOperation, Operation operation) {

			Assert.notNull(groupOperation, "GroupOperation must not be null!");
			Assert.notNull(operation, "Operation must not be null!");

			this.groupOperation = groupOperation;
			this.operation = operation;
		}

		/**
		 * Allows to specify an alias for the new operation.
		 *
		 * @param alias
		 * @return
		 */
		public GroupOperation as(String alias) {
			return this.groupOperation.and(operation.withAlias(alias));
		}
	}

	/**
	 * Generates a {@link GroupOperationBuilder} for a {@code $sum}-expression.
	 * <p>
	 * Count expressions are emulated via {@code $sum: 1}.
	 * <p>
	 *
	 * @return
	 */
	public GroupOperationBuilder count() {
		return newBuilder(GroupOps.SUM, null, 1);
	}

	/**
	 * Generates a {@link GroupOperationBuilder} for a {@code $sum}-expression for the given field-reference.
	 *
	 * @param reference
	 * @return
	 */
	public GroupOperationBuilder sum(String reference) {
		return sum(reference, null);
	}

	private GroupOperationBuilder sum(String reference, Object value) {
		return newBuilder(GroupOps.SUM, reference, value);
	}

	/**
	 * Generates a {@link GroupOperationBuilder} for an {@code $addToSet}-expression for the given field-reference.
	 *
	 * @param reference
	 * @return
	 */
	public GroupOperationBuilder addToSet(String reference) {
		return addToSet(reference, null);
	}

	/**
	 * Generates a {@link GroupOperationBuilder} for an {@code $addToSet}-expression for the given value.
	 *
	 * @param value
	 * @return
	 */
	public GroupOperationBuilder addToSet(Object value) {
		return addToSet(null, value);
	}

	private GroupOperationBuilder addToSet(String reference, Object value) {
		return newBuilder(GroupOps.ADD_TO_SET, reference, value);
	}

	/**
	 * Generates a {@link GroupOperationBuilder} for a {@code $last}-expression for the given field-reference.
	 *
	 * @param reference
	 * @return
	 */
	public GroupOperationBuilder last(String reference) {
		return newBuilder(GroupOps.LAST, reference, null);
	}

	/**
	 * Generates a {@link GroupOperationBuilder} for a {@code $first}-expression for the given field-reference.
	 *
	 * @param reference
	 * @return
	 */
	public GroupOperationBuilder first(String reference) {
		return newBuilder(GroupOps.FIRST, reference, null);
	}

	/**
	 * Generates a {@link GroupOperationBuilder} for an {@code $avg}-expression for the given field-reference.
	 *
	 * @param reference
	 * @return
	 */
	public GroupOperationBuilder avg(String reference) {
		return newBuilder(GroupOps.AVG, reference, null);
	}

	/**
	 * Generates a {@link GroupOperationBuilder} for a {@code $push}-expression for the given field-reference.
	 *
	 * @param reference
	 * @return
	 */
	public GroupOperationBuilder push(String reference) {
		return push(reference, null);
	}

	/**
	 * Generates a {@link GroupOperationBuilder} for a {@code $push}-expression for the given value.
	 *
	 * @param value
	 * @return
	 */
	public GroupOperationBuilder push(Object value) {
		return push(null, value);
	}

	private GroupOperationBuilder push(String reference, Object value) {
		return newBuilder(GroupOps.PUSH, reference, value);
	}

	/**
	 * Generates a {@link GroupOperationBuilder} for a {@code $min}-expression for the given field-reference.
	 *
	 * @param reference
	 * @return
	 */
	public GroupOperationBuilder min(String reference) {
		return newBuilder(GroupOps.MIN, reference, null);
	}

	/**
	 * Generates a {@link GroupOperationBuilder} for a {@code $max}-expression for the given field-reference.
	 *
	 * @param reference
	 * @return
	 */
	public GroupOperationBuilder max(String reference) {
		return newBuilder(GroupOps.MAX, reference, null);
	}

	private GroupOperationBuilder newBuilder(Keyword keyword, String reference, Object value) {
		return new GroupOperationBuilder(this, new Operation(keyword, null, reference, value));
	}

	/*
	 * (non-Javadoc)
	 * @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getFields()
	 */
	@Override
	public ExposedFields getFields() {

		ExposedFields fields = this.idFields.and(new ExposedField(Fields.UNDERSCORE_ID, true));

		for (Operation operation : operations) {
			fields = fields.and(operation.asField());
		}

		return fields;
	}

	/*
	 * (non-Javadoc)
	 * @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
	 */
	@Override
	public com.mongodb.DBObject toDBObject(AggregationOperationContext context) {

		BasicDBObject operationObject = new BasicDBObject();

		if (idFields.exposesNoNonSyntheticFields()) {

			operationObject.put(Fields.UNDERSCORE_ID, null);

		} else if (idFields.exposesSingleNonSyntheticFieldOnly()) {

			FieldReference reference = context.getReference(idFields.iterator().next());
			operationObject.put(Fields.UNDERSCORE_ID, reference.toString());

		} else {

			BasicDBObject inner = new BasicDBObject();

			for (ExposedField field : idFields) {
				FieldReference reference = context.getReference(field);
				inner.put(field.getName(), reference.toString());
			}

			operationObject.put(Fields.UNDERSCORE_ID, inner);
		}

		for (Operation operation : operations) {
			operationObject.putAll(operation.toDBObject(context));
		}

		return new BasicDBObject("$group", operationObject);
	}

	interface Keyword {

		String toString();
	}

	private static enum GroupOps implements Keyword {

		SUM, LAST, FIRST, PUSH, AVG, MIN, MAX, ADD_TO_SET, COUNT;

		@Override
		public String toString() {

			String[] parts = name().split("_");

			StringBuilder builder = new StringBuilder();

			for (String part : parts) {
				String lowerCase = part.toLowerCase(Locale.US);
				builder.append(builder.length() == 0 ? lowerCase : StringUtils.capitalize(lowerCase));
			}

			return "$" + builder.toString();
		}
	}

	static class Operation implements AggregationOperation {

		private final Keyword op;
		private final String key;
		private final String reference;
		private final Object value;

		public Operation(Keyword op, String key, String reference, Object value) {

			this.op = op;
			this.key = key;
			this.reference = reference;
			this.value = value;
		}

		public Operation withAlias(String key) {
			return new Operation(op, key, reference, value);
		}

		public ExposedField asField() {
			return new ExposedField(key, true);
		}

		public DBObject toDBObject(AggregationOperationContext context) {
			return new BasicDBObject(key, new BasicDBObject(op.toString(), getValue(context)));
		}

		public Object getValue(AggregationOperationContext context) {

			if (reference == null) {
				return value;
			}

			if (Aggregation.SystemVariable.isReferingToSystemVariable(reference)) {
				return reference;
			}

			return context.getReference(reference).toString();
		}

		@Override
		public String toString() {
			return "Operation [op=" + op + ", key=" + key + ", reference=" + reference + ", value=" + value + "]";
		}
	}
}
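A sketch of building a group stage with the builder API above (field names are illustrative); rendered against a root context this comes out roughly as `{ "$group" : { "_id" : "$customerId", "orderCount" : { "$sum" : 1 }, "revenue" : { "$sum" : "$netAmount" } } }`:

```java
import org.springframework.data.mongodb.core.aggregation.Fields;
import org.springframework.data.mongodb.core.aggregation.GroupOperation;

class GroupOperationSketch {

	// Groups by "customerId" and computes an order count plus a revenue sum.
	static GroupOperation salesByCustomer() {

		return new GroupOperation(Fields.fields("customerId")) //
				.count().as("orderCount") //
				.sum("netAmount").as("revenue");
	}
}
```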
@@ -1,52 +0,0 @@
/*
 * Copyright 2013 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core.aggregation;

import org.springframework.util.Assert;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

/**
 * Encapsulates the {@code $limit}-operation
 *
 * @see http://docs.mongodb.org/manual/reference/aggregation/limit/
 * @author Thomas Darimont
 * @author Oliver Gierke
 * @since 1.3
 */
class LimitOperation implements AggregationOperation {

	private final long maxElements;

	/**
	 * @param maxElements Number of documents to consider.
	 */
	public LimitOperation(long maxElements) {

		Assert.isTrue(maxElements >= 0, "Maximum number of elements must be greater or equal to zero!");
		this.maxElements = maxElements;
	}

	/*
	 * (non-Javadoc)
	 * @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
	 */
	@Override
	public DBObject toDBObject(AggregationOperationContext context) {
		return new BasicDBObject("$limit", maxElements);
	}
}
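Since the class is package-private, callers normally go through the `Aggregation` API; assuming same-package access, rendering the stage is a one-liner (the context is not used by `$limit`):

```java
package org.springframework.data.mongodb.core.aggregation;

import com.mongodb.DBObject;

class LimitSketch {

	// $limit ignores the context; the stage is simply { "$limit" : 5 }.
	static DBObject limitStage(AggregationOperationContext context) {
		return new LimitOperation(5).toDBObject(context);
	}
}
```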
@@ -1,56 +0,0 @@
/*
 * Copyright 2013 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core.aggregation;

import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.util.Assert;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

/**
 * Encapsulates the {@code $match}-operation
 *
 * @see http://docs.mongodb.org/manual/reference/aggregation/match/
 * @author Sebastian Herold
 * @author Thomas Darimont
 * @author Oliver Gierke
 * @since 1.3
 */
public class MatchOperation implements AggregationOperation {

	private final Criteria criteria;

	/**
	 * Creates a new {@link MatchOperation} for the given {@link Criteria}.
	 *
	 * @param criteria must not be {@literal null}.
	 */
	public MatchOperation(Criteria criteria) {

		Assert.notNull(criteria, "Criteria must not be null!");
		this.criteria = criteria;
	}

	/*
	 * (non-Javadoc)
	 * @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
	 */
	@Override
	public DBObject toDBObject(AggregationOperationContext context) {
		return new BasicDBObject("$match", context.getMappedObject(criteria.getCriteriaObject()));
	}
}
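A sketch of rendering a `$match` stage from a `Criteria`; the criteria values are illustrative and the context would be supplied by the surrounding aggregation:

```java
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.MatchOperation;
import org.springframework.data.mongodb.core.query.Criteria;

import com.mongodb.DBObject;

class MatchSketch {

	// Renders { "$match" : { "status" : "ACTIVE" } } once the criteria document is mapped by the context.
	static DBObject matchStage(AggregationOperationContext context) {
		return new MatchOperation(Criteria.where("status").is("ACTIVE")).toDBObject(context);
	}
}
```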
@@ -1,940 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013-2014 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import java.util.ArrayList;
|
||||
import java.util.Arrays;
|
||||
import java.util.Collections;
|
||||
import java.util.List;
|
||||
|
||||
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
|
||||
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder.FieldProjection;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
import com.mongodb.BasicDBObject;
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* Encapsulates the aggregation framework {@code $project}-operation. Projection of field to be used in an
|
||||
* {@link Aggregation}. A projection is similar to a {@link Field} inclusion/exclusion but more powerful. It can
|
||||
* generate new fields, change values of given field etc.
|
||||
* <p>
|
||||
*
|
||||
* @see http://docs.mongodb.org/manual/reference/aggregation/project/
|
||||
* @author Tobias Trelle
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
* @since 1.3
|
||||
*/
|
||||
public class ProjectionOperation implements FieldsExposingAggregationOperation {
|
||||
|
||||
private static final List<Projection> NONE = Collections.emptyList();
|
||||
private static final String EXCLUSION_ERROR = "Exclusion of field %s not allowed. Projections by the mongodb "
|
||||
+ "aggregation framework only support the exclusion of the %s field!";
|
||||
|
||||
private final List<Projection> projections;
|
||||
|
||||
/**
|
||||
* Creates a new empty {@link ProjectionOperation}.
|
||||
*/
|
||||
public ProjectionOperation() {
|
||||
this(NONE, NONE);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link ProjectionOperation} including the given {@link Fields}.
|
||||
*
|
||||
* @param fields must not be {@literal null}.
|
||||
*/
|
||||
public ProjectionOperation(Fields fields) {
|
||||
this(NONE, ProjectionOperationBuilder.FieldProjection.from(fields));
|
||||
}
|
||||
|
||||
/**
|
||||
* Copy constructor to allow building up {@link ProjectionOperation} instances from already existing
|
||||
* {@link Projection}s.
|
||||
*
|
||||
* @param current must not be {@literal null}.
|
||||
* @param projections must not be {@literal null}.
|
||||
*/
|
||||
private ProjectionOperation(List<? extends Projection> current, List<? extends Projection> projections) {
|
||||
|
||||
Assert.notNull(current, "Current projections must not be null!");
|
||||
Assert.notNull(projections, "Projections must not be null!");
|
||||
|
||||
this.projections = new ArrayList<ProjectionOperation.Projection>(current.size() + projections.size());
|
||||
this.projections.addAll(current);
|
||||
this.projections.addAll(projections);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link ProjectionOperation} with the current {@link Projection}s and the given one.
|
||||
*
|
||||
* @param projection must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
private ProjectionOperation and(Projection projection) {
|
||||
return new ProjectionOperation(this.projections, Arrays.asList(projection));
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link ProjectionOperation} with the current {@link Projection}s replacing the last current one with
|
||||
* the given one.
|
||||
*
|
||||
* @param projection must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
private ProjectionOperation andReplaceLastOneWith(Projection projection) {
|
||||
|
||||
List<Projection> projections = this.projections.isEmpty() ? Collections.<Projection> emptyList() : this.projections
|
||||
.subList(0, this.projections.size() - 1);
|
||||
return new ProjectionOperation(projections, Arrays.asList(projection));
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link ProjectionOperationBuilder} to define a projection for the field with the given name.
|
||||
*
|
||||
* @param name must not be {@literal null} or empty.
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder and(String name) {
|
||||
return new ProjectionOperationBuilder(name, this, null);
|
||||
}
|
||||
|
||||
public ExpressionProjectionOperationBuilder andExpression(String expression, Object... params) {
|
||||
return new ExpressionProjectionOperationBuilder(expression, this, params);
|
||||
}
|
||||
|
||||
/**
|
||||
* Excludes the given fields from the projection.
|
||||
*
|
||||
* @param fieldNames must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperation andExclude(String... fieldNames) {
|
||||
|
||||
for (String fieldName : fieldNames) {
|
||||
Assert.isTrue(Fields.UNDERSCORE_ID.equals(fieldName),
|
||||
String.format(EXCLUSION_ERROR, fieldName, Fields.UNDERSCORE_ID));
|
||||
}
|
||||
|
||||
List<FieldProjection> excludeProjections = FieldProjection.from(Fields.fields(fieldNames), false);
|
||||
return new ProjectionOperation(this.projections, excludeProjections);
|
||||
}
|
||||
|
||||
/**
|
||||
* Includes the given fields into the projection.
|
||||
*
|
||||
* @param fieldNames must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperation andInclude(String... fieldNames) {
|
||||
|
||||
List<FieldProjection> projections = FieldProjection.from(Fields.fields(fieldNames), true);
|
||||
return new ProjectionOperation(this.projections, projections);
|
||||
}
|
||||
|
||||
/**
|
||||
* Includes the given fields into the projection.
|
||||
*
|
||||
* @param fields must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperation andInclude(Fields fields) {
|
||||
return new ProjectionOperation(this.projections, FieldProjection.from(fields, true));
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#getFields()
|
||||
*/
|
||||
@Override
|
||||
public ExposedFields getFields() {
|
||||
|
||||
ExposedFields fields = null;
|
||||
|
||||
for (Projection projection : projections) {
|
||||
ExposedField field = projection.getExposedField();
|
||||
fields = fields == null ? ExposedFields.from(field) : fields.and(field);
|
||||
}
|
||||
|
||||
return fields;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
|
||||
*/
|
||||
@Override
|
||||
public DBObject toDBObject(AggregationOperationContext context) {
|
||||
|
||||
BasicDBObject fieldObject = new BasicDBObject();
|
||||
|
||||
for (Projection projection : projections) {
|
||||
fieldObject.putAll(projection.toDBObject(context));
|
||||
}
|
||||
|
||||
return new BasicDBObject("$project", fieldObject);
|
||||
}
|
||||
|
||||
/**
|
||||
* Base class for {@link ProjectionOperationBuilder}s.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
private static abstract class AbstractProjectionOperationBuilder implements AggregationOperation {
|
||||
|
||||
protected final Object value;
|
||||
protected final ProjectionOperation operation;
|
||||
|
||||
/**
|
||||
* Creates a new {@link AbstractProjectionOperationBuilder} for the given value and {@link ProjectionOperation}.
|
||||
*
|
||||
* @param value must not be {@literal null}.
|
||||
* @param operation must not be {@literal null}.
|
||||
*/
|
||||
public AbstractProjectionOperationBuilder(Object value, ProjectionOperation operation) {
|
||||
|
||||
Assert.notNull(value, "value must not be null or empty!");
|
||||
Assert.notNull(operation, "ProjectionOperation must not be null!");
|
||||
|
||||
this.value = value;
|
||||
this.operation = operation;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
|
||||
*/
|
||||
@Override
|
||||
public DBObject toDBObject(AggregationOperationContext context) {
|
||||
return this.operation.toDBObject(context);
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the {@link ProjectionOperation} to be finally applied, using the given alias for the current projection.
|
||||
*
|
||||
* @param alias will never be {@literal null} or empty.
|
||||
* @return
|
||||
*/
|
||||
public abstract ProjectionOperation as(String alias);
|
||||
}
|
||||
|
||||
/**
|
||||
* A {@link ProjectionOperationBuilder} that is used for SpEL expression-based projections.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
public static class ExpressionProjectionOperationBuilder extends ProjectionOperationBuilder {
|
||||
|
||||
private final Object[] params;
|
||||
private final String expression;
|
||||
|
||||
/**
|
||||
* Creates a new {@link ExpressionProjectionOperationBuilder} for the given value, {@link ProjectionOperation} and
|
||||
* parameters.
|
||||
*
|
||||
* @param expression must not be {@literal null}.
|
||||
* @param operation must not be {@literal null}.
|
||||
* @param parameters
|
||||
*/
|
||||
public ExpressionProjectionOperationBuilder(String expression, ProjectionOperation operation, Object[] parameters) {
|
||||
|
||||
super(expression, operation, null);
|
||||
this.expression = expression;
|
||||
this.params = parameters.clone();
|
||||
}
|
||||
|
||||
/* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder#project(java.lang.String, java.lang.Object[])
|
||||
*/
|
||||
@Override
|
||||
public ProjectionOperationBuilder project(String operation, final Object... values) {
|
||||
|
||||
OperationProjection operationProjection = new OperationProjection(Fields.field(value.toString()), operation,
|
||||
values) {
|
||||
@Override
|
||||
protected List<Object> getOperationArguments(AggregationOperationContext context) {
|
||||
|
||||
List<Object> result = new ArrayList<Object>(values.length + 1);
|
||||
result.add(ExpressionProjection.toMongoExpression(context,
|
||||
ExpressionProjectionOperationBuilder.this.expression, ExpressionProjectionOperationBuilder.this.params));
|
||||
result.addAll(Arrays.asList(values));
|
||||
|
||||
return result;
|
||||
}
|
||||
};
|
||||
|
||||
return new ProjectionOperationBuilder(value, this.operation.and(operationProjection), operationProjection);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.AbstractProjectionOperationBuilder#as(java.lang.String)
|
||||
*/
|
||||
@Override
|
||||
public ProjectionOperation as(String alias) {
|
||||
|
||||
Field expressionField = Fields.field(alias, alias);
|
||||
return this.operation.and(new ExpressionProjection(expressionField, this.value.toString(), params));
|
||||
}
|
||||
|
||||
/**
|
||||
* A {@link Projection} based on a SpEL expression.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
static class ExpressionProjection extends Projection {
|
||||
|
||||
private static final SpelExpressionTransformer TRANSFORMER = new SpelExpressionTransformer();
|
||||
|
||||
private final String expression;
|
||||
private final Object[] params;
|
||||
|
||||
/**
|
||||
* Creates a new {@link ExpressionProjection} for the given field, SpEL expression and parameters.
|
||||
*
|
||||
* @param field must not be {@literal null}.
|
||||
* @param expression must not be {@literal null} or empty.
|
||||
* @param parameters must not be {@literal null}.
|
||||
*/
|
||||
public ExpressionProjection(Field field, String expression, Object[] parameters) {
|
||||
|
||||
super(field);
|
||||
|
||||
Assert.hasText(expression, "Expression must not be null or empty!");
|
||||
Assert.notNull(parameters, "Parameters must not be null!");
|
||||
|
||||
this.expression = expression;
|
||||
this.params = parameters.clone();
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
|
||||
*/
|
||||
@Override
|
||||
public DBObject toDBObject(AggregationOperationContext context) {
|
||||
return new BasicDBObject(getExposedField().getName(), toMongoExpression(context, expression, params));
|
||||
}
|
||||
|
||||
protected static Object toMongoExpression(AggregationOperationContext context, String expression, Object[] params) {
|
||||
return TRANSFORMER.transform(expression, context, params);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Builder for {@link ProjectionOperation}s on a field.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
public static class ProjectionOperationBuilder extends AbstractProjectionOperationBuilder {
|
||||
|
||||
private static final String NUMBER_NOT_NULL = "Number must not be null!";
|
||||
private static final String FIELD_REFERENCE_NOT_NULL = "Field reference must not be null!";
|
||||
|
||||
private final String name;
|
||||
private final OperationProjection previousProjection;
|
||||
|
||||
/**
|
||||
* Creates a new {@link ProjectionOperationBuilder} for the field with the given name on top of the given
|
||||
* {@link ProjectionOperation}.
|
||||
*
|
||||
* @param name must not be {@literal null} or empty.
|
||||
* @param operation must not be {@literal null}.
|
||||
* @param previousProjection the previous operation projection, may be {@literal null}.
|
||||
*/
|
||||
public ProjectionOperationBuilder(String name, ProjectionOperation operation, OperationProjection previousProjection) {
|
||||
super(name, operation);
|
||||
|
||||
this.name = name;
|
||||
this.previousProjection = previousProjection;
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link ProjectionOperationBuilder} for the field with the given value on top of the given
|
||||
* {@link ProjectionOperation}.
|
||||
*
|
||||
* @param value
|
||||
* @param operation
|
||||
* @param previousProjection
|
||||
*/
|
||||
protected ProjectionOperationBuilder(Object value, ProjectionOperation operation,
|
||||
OperationProjection previousProjection) {
|
||||
|
||||
super(value, operation);
|
||||
|
||||
this.name = null;
|
||||
this.previousProjection = previousProjection;
|
||||
}
|
||||
|
||||
/**
|
||||
* Projects the result of the previous operation onto the current field. Automatically adds an exclusion for
* {@code _id}, since the value it would hold by default now goes into the field just projected into.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperation previousOperation() {
|
||||
|
||||
return this.operation.andExclude(Fields.UNDERSCORE_ID) //
|
||||
.and(new PreviousOperationProjection(name));
|
||||
}
|
||||
|
||||
/**
|
||||
* Defines a nested field binding for the current field.
|
||||
*
|
||||
* @param fields must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperation nested(Fields fields) {
|
||||
return this.operation.and(new NestedFieldProjection(name, fields));
|
||||
}
|
||||
|
||||
/**
|
||||
* Specifies an alias for the previous projection operation.
*
* @param alias must not be {@literal null} or empty.
|
||||
* @return
|
||||
*/
|
||||
@Override
|
||||
public ProjectionOperation as(String alias) {
|
||||
|
||||
if (this.previousProjection != null) {
|
||||
return this.operation.andReplaceLastOneWith(this.previousProjection.withAlias(alias));
|
||||
} else {
|
||||
return this.operation.and(new FieldProjection(Fields.field(alias, name), null));
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates an {@code $add} expression that adds the given number to the previously mentioned field.
|
||||
*
|
||||
* @param number
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder plus(Number number) {
|
||||
|
||||
Assert.notNull(number, NUMBER_NOT_NULL);
|
||||
return project("add", number);
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates an {@code $add} expression that adds the value of the given field to the previously mentioned field.
|
||||
*
|
||||
* @param fieldReference
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder plus(String fieldReference) {
|
||||
|
||||
Assert.notNull(fieldReference, FIELD_REFERENCE_NOT_NULL);
|
||||
return project("add", Fields.field(fieldReference));
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates a {@code $subtract} expression that subtracts the given number from the previously mentioned field.
|
||||
*
|
||||
* @param number
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder minus(Number number) {
|
||||
|
||||
Assert.notNull(number, NUMBER_NOT_NULL);
|
||||
return project("subtract", number);
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates a {@code $subtract} expression that subtracts the value of the given field from the previously
* mentioned field.
|
||||
*
|
||||
* @param fieldReference
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder minus(String fieldReference) {
|
||||
|
||||
Assert.notNull(fieldReference, FIELD_REFERENCE_NOT_NULL);
|
||||
return project("subtract", Fields.field(fieldReference));
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates a {@code $multiply} expression that multiplies the given number with the previously mentioned field.
|
||||
*
|
||||
* @param number
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder multiply(Number number) {
|
||||
|
||||
Assert.notNull(number, NUMBER_NOT_NULL);
|
||||
return project("multiply", number);
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates a {@code $multiply} expression that multiplies the value of the given field with the previously
|
||||
* mentioned field.
|
||||
*
|
||||
* @param fieldReference
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder multiply(String fieldReference) {
|
||||
|
||||
Assert.notNull(fieldReference, FIELD_REFERENCE_NOT_NULL);
|
||||
return project("multiply", Fields.field(fieldReference));
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates a {@code $divide} expression that divides the previously mentioned field by the given number.
|
||||
*
|
||||
* @param number
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder divide(Number number) {
|
||||
|
||||
Assert.notNull(number, NUMBER_NOT_NULL);
|
||||
Assert.isTrue(Math.abs(number.intValue()) != 0, "Number must not be zero!");
|
||||
return project("divide", number);
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates a {@code $divide} expression that divides the previously mentioned field by the value of the given
* field.
|
||||
*
|
||||
* @param fieldReference
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder divide(String fieldReference) {
|
||||
|
||||
Assert.notNull(fieldReference, FIELD_REFERENCE_NOT_NULL);
|
||||
return project("divide", Fields.field(fieldReference));
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates a {@code $mod} expression that divides the previously mentioned field by the given number and returns
|
||||
* the remainder.
|
||||
*
|
||||
* @param number
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder mod(Number number) {
|
||||
|
||||
Assert.notNull(number, NUMBER_NOT_NULL);
|
||||
Assert.isTrue(Math.abs(number.intValue()) != 0, "Number must not be zero!");
|
||||
return project("mod", number);
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates a {@code $mod} expression that divides the previously mentioned field by the value of the given field
* and returns the remainder.
|
||||
*
|
||||
* @param fieldReference
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder mod(String fieldReference) {
|
||||
|
||||
Assert.notNull(fieldReference, FIELD_REFERENCE_NOT_NULL);
|
||||
return project("mod", Fields.field(fieldReference));
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
|
||||
*/
|
||||
@Override
|
||||
public DBObject toDBObject(AggregationOperationContext context) {
|
||||
return this.operation.toDBObject(context);
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds a generic projection for the current field.
|
||||
*
|
||||
* @param operation the operation key, e.g. {@code $add}.
|
||||
* @param values the values to be set for the projection operation.
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder project(String operation, Object... values) {
|
||||
OperationProjection operationProjection = new OperationProjection(Fields.field(value.toString()), operation,
|
||||
values);
|
||||
return new ProjectionOperationBuilder(value, this.operation.and(operationProjection), operationProjection);
|
||||
}
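// Usage sketch (added for illustration, not part of the original source): the arithmetic helpers above (plus, minus,
// multiply, divide, mod) all delegate to project(String, Object...). "netPrice" is a made-up field name.
//
//   Aggregation.project().and("netPrice").plus(10).as("priceWithShipping");
//
//   // via OperationProjection this renders roughly to:
//   // { "priceWithShipping" : { "$add" : [ "$netPrice" , 10 ] } }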
|
||||
|
||||
/**
|
||||
* A {@link Projection} to pull in the result of the previous operation.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
static class PreviousOperationProjection extends Projection {
|
||||
|
||||
private final String name;
|
||||
|
||||
/**
|
||||
* Creates a new {@link PreviousOperationProjection} for the field with the given name.
|
||||
*
|
||||
* @param name must not be {@literal null} or empty.
|
||||
*/
|
||||
public PreviousOperationProjection(String name) {
|
||||
super(Fields.field(name));
|
||||
this.name = name;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
|
||||
*/
|
||||
@Override
|
||||
public DBObject toDBObject(AggregationOperationContext context) {
|
||||
return new BasicDBObject(name, Fields.UNDERSCORE_ID_REF);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* A {@link FieldProjection} to map a result of a previous {@link AggregationOperation} to a new field.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
static class FieldProjection extends Projection {
|
||||
|
||||
private final Field field;
|
||||
private final Object value;
|
||||
|
||||
/**
|
||||
* Creates a new {@link FieldProjection} for the field of the given name, assigning the given value.
|
||||
*
|
||||
* @param name must not be {@literal null} or empty.
|
||||
* @param value
|
||||
*/
|
||||
public FieldProjection(String name, Object value) {
|
||||
this(Fields.field(name), value);
|
||||
}
|
||||
|
||||
private FieldProjection(Field field, Object value) {
|
||||
|
||||
super(field);
|
||||
|
||||
this.field = field;
|
||||
this.value = value;
|
||||
}
|
||||
|
||||
/**
|
||||
* Factory method to easily create {@link FieldProjection}s for the given {@link Fields}. Fields are projected as
|
||||
* references with their given name. A field {@code foo} will be projected as: {@code foo : 1 } .
|
||||
*
|
||||
* @param fields the {@link Fields} to in- or exclude, must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public static List<? extends Projection> from(Fields fields) {
|
||||
return from(fields, null);
|
||||
}
|
||||
|
||||
/**
|
||||
* Factory method to easily create {@link FieldProjection}s for the given {@link Fields}.
|
||||
*
|
||||
* @param fields the {@link Fields} to in- or exclude, must not be {@literal null}.
|
||||
* @param value to use for the given field.
|
||||
* @return
|
||||
*/
|
||||
public static List<FieldProjection> from(Fields fields, Object value) {
|
||||
|
||||
Assert.notNull(fields, "Fields must not be null!");
|
||||
List<FieldProjection> projections = new ArrayList<FieldProjection>();
|
||||
|
||||
for (Field field : fields) {
|
||||
projections.add(new FieldProjection(field, value));
|
||||
}
|
||||
|
||||
return projections;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
|
||||
*/
|
||||
@Override
|
||||
public DBObject toDBObject(AggregationOperationContext context) {
|
||||
return new BasicDBObject(field.getName(), renderFieldValue(context));
|
||||
}
|
||||
|
||||
private Object renderFieldValue(AggregationOperationContext context) {
|
||||
|
||||
// implicit reference or explicit include?
|
||||
if (value == null || Boolean.TRUE.equals(value)) {
|
||||
|
||||
if (Aggregation.SystemVariable.isReferingToSystemVariable(field.getTarget())) {
|
||||
return field.getTarget();
|
||||
}
|
||||
|
||||
// check whether referenced field exists in the context
|
||||
return context.getReference(field).getReferenceValue();
|
||||
|
||||
} else if (Boolean.FALSE.equals(value)) {
|
||||
|
||||
// render field as excluded
|
||||
return 0;
|
||||
}
|
||||
|
||||
return value;
|
||||
}
|
||||
}
|
||||
|
||||
static class OperationProjection extends Projection {
|
||||
|
||||
private final Field field;
|
||||
private final String operation;
|
||||
private final List<Object> values;
|
||||
|
||||
/**
|
||||
* Creates a new {@link OperationProjection} for the given field.
|
||||
*
|
||||
* @param field the name of the field to add the operation projection for, must not be {@literal null} or empty.
|
||||
* @param operation the actual operation key, must not be {@literal null} or empty.
|
||||
* @param values the values to pass into the operation, must not be {@literal null}.
|
||||
*/
|
||||
public OperationProjection(Field field, String operation, Object[] values) {
|
||||
|
||||
super(field);
|
||||
|
||||
Assert.hasText(operation, "Operation must not be null or empty!");
|
||||
Assert.notNull(values, "Values must not be null!");
|
||||
|
||||
this.field = field;
|
||||
this.operation = operation;
|
||||
this.values = Arrays.asList(values);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
|
||||
*/
|
||||
@Override
|
||||
public DBObject toDBObject(AggregationOperationContext context) {
|
||||
|
||||
DBObject inner = new BasicDBObject("$" + operation, getOperationArguments(context));
|
||||
|
||||
return new BasicDBObject(getField().getName(), inner);
|
||||
}
|
||||
|
||||
protected List<Object> getOperationArguments(AggregationOperationContext context) {
|
||||
|
||||
List<Object> result = new ArrayList<Object>(values.size());
|
||||
result.add(context.getReference(getField().getName()).toString());
|
||||
|
||||
for (Object element : values) {
|
||||
result.add(element instanceof Field ? context.getReference((Field) element).toString() : element);
|
||||
}
|
||||
|
||||
return result;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the field that holds the {@link OperationProjection}.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
protected Field getField() {
|
||||
return field;
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new instance of this {@link OperationProjection} with the given alias.
|
||||
*
|
||||
* @param alias the alias to set
|
||||
* @return
|
||||
*/
|
||||
public OperationProjection withAlias(String alias) {
|
||||
|
||||
final Field aliasedField = Fields.field(alias, this.field.getName());
|
||||
return new OperationProjection(aliasedField, operation, values.toArray()) {
|
||||
|
||||
/* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder.OperationProjection#getField()
|
||||
*/
|
||||
@Override
|
||||
protected Field getField() {
|
||||
return aliasedField;
|
||||
}
|
||||
|
||||
@Override
|
||||
protected List<Object> getOperationArguments(AggregationOperationContext context) {
|
||||
|
||||
// We have to make sure that we use the arguments from the "previous" OperationProjection that we replace
|
||||
// with this new instance.
|
||||
|
||||
return OperationProjection.this.getOperationArguments(context);
|
||||
}
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
static class NestedFieldProjection extends Projection {
|
||||
|
||||
private final String name;
|
||||
private final Fields fields;
|
||||
|
||||
public NestedFieldProjection(String name, Fields fields) {
|
||||
|
||||
super(Fields.field(name));
|
||||
this.name = name;
|
||||
this.fields = fields;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
|
||||
*/
|
||||
@Override
|
||||
public DBObject toDBObject(AggregationOperationContext context) {
|
||||
|
||||
DBObject nestedObject = new BasicDBObject();
|
||||
|
||||
for (Field field : fields) {
|
||||
nestedObject.put(field.getName(), context.getReference(field.getTarget()).toString());
|
||||
}
|
||||
|
||||
return new BasicDBObject(name, nestedObject);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Extracts the minute from a date expression.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder extractMinute() {
|
||||
return project("minute");
|
||||
}
|
||||
|
||||
/**
|
||||
* Extracts the hour from a date expression.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder extractHour() {
|
||||
return project("hour");
|
||||
}
|
||||
|
||||
/**
|
||||
* Extracts the second from a date expression.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder extractSecond() {
|
||||
return project("second");
|
||||
}
|
||||
|
||||
/**
|
||||
* Extracts the millisecond from a date expression.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder extractMillisecond() {
|
||||
return project("millisecond");
|
||||
}
|
||||
|
||||
/**
|
||||
* Extracts the year from a date expression.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder extractYear() {
|
||||
return project("year");
|
||||
}
|
||||
|
||||
/**
|
||||
* Extracts the month from a date expression.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder extractMonth() {
|
||||
return project("month");
|
||||
}
|
||||
|
||||
/**
|
||||
* Extracts the week from a date expression.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder extractWeek() {
|
||||
return project("week");
|
||||
}
|
||||
|
||||
/**
|
||||
* Extracts the dayOfYear from a date expression.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder extractDayOfYear() {
|
||||
return project("dayOfYear");
|
||||
}
|
||||
|
||||
/**
|
||||
* Extracts the dayOfMonth from a date expression.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder extractDayOfMonth() {
|
||||
return project("dayOfMonth");
|
||||
}
|
||||
|
||||
/**
|
||||
* Extracts the dayOfWeek from a date expression.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder extractDayOfWeek() {
|
||||
return project("dayOfWeek");
|
||||
}
|
||||
}
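// Usage sketch (added for illustration, not part of the original source): the extractXxx() methods above map onto
// the Aggregation Framework date operators. "joined" is a made-up date field.
//
//   Aggregation.project().and("joined").extractYear().as("year");
//
//   // renders roughly to { "year" : { "$year" : [ "$joined" ] } }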
|
||||
|
||||
/**
|
||||
* Base class for {@link Projection} implementations.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
private static abstract class Projection {
|
||||
|
||||
private final ExposedField field;
|
||||
|
||||
/**
|
||||
* Creates new {@link Projection} for the given {@link Field}.
|
||||
*
|
||||
* @param field must not be {@literal null}.
|
||||
*/
|
||||
public Projection(Field field) {
|
||||
|
||||
Assert.notNull(field, "Field must not be null!");
|
||||
this.field = new ExposedField(field, true);
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the field exposed by the {@link Projection}.
|
||||
*
|
||||
* @return will never be {@literal null}.
|
||||
*/
|
||||
public ExposedField getExposedField() {
|
||||
return field;
|
||||
}
|
||||
|
||||
/**
|
||||
* Renders the current {@link Projection} into a {@link DBObject} based on the given
|
||||
* {@link AggregationOperationContext}.
|
||||
*
|
||||
* @param context will never be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public abstract DBObject toDBObject(AggregationOperationContext context);
|
||||
}
|
||||
}
|
||||
@@ -1,54 +0,0 @@
|
||||
/*
 * Copyright 2013 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core.aggregation;

import org.springframework.util.Assert;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

/**
 * Encapsulates the aggregation framework {@code $skip}-operation.
 *
 * @see http://docs.mongodb.org/manual/reference/aggregation/skip/
 * @author Thomas Darimont
 * @author Oliver Gierke
 * @since 1.3
 */
public class SkipOperation implements AggregationOperation {

    private final long skipCount;

    /**
     * Creates a new {@link SkipOperation} skipping the given number of elements.
     *
     * @param skipCount number of documents to skip.
     */
    public SkipOperation(long skipCount) {

        Assert.isTrue(skipCount >= 0, "Skip count must not be negative!");
        this.skipCount = skipCount;
    }

    /*
     * (non-Javadoc)
     * @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
     */
    @Override
    public DBObject toDBObject(AggregationOperationContext context) {
        return new BasicDBObject("$skip", skipCount);
    }
}
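// Usage sketch (added for illustration, not part of the original source): a SkipOperation is typically obtained via
// the Aggregation.skip(...) factory method rather than instantiated directly, e.g. as part of a paging pipeline.
// The criteria below is made up.
//
//   Aggregation aggregation = Aggregation.newAggregation(
//       Aggregation.match(Criteria.where("status").is("ACTIVE")),
//       new SkipOperation(20),   // -> { "$skip" : 20 }
//       Aggregation.limit(10));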
|
||||
@@ -1,76 +0,0 @@
|
||||
/*
 * Copyright 2013 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core.aggregation;

import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.domain.Sort.Order;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.util.Assert;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

/**
 * Encapsulates the aggregation framework {@code $sort}-operation.
 *
 * @see http://docs.mongodb.org/manual/reference/aggregation/sort/#pipe._S_sort
 * @author Thomas Darimont
 * @author Oliver Gierke
 * @since 1.3
 */
public class SortOperation implements AggregationOperation {

    private final Sort sort;

    /**
     * Creates a new {@link SortOperation} for the given {@link Sort} instance.
     *
     * @param sort must not be {@literal null}.
     */
    public SortOperation(Sort sort) {

        Assert.notNull(sort, "Sort must not be null!");
        this.sort = sort;
    }

    public SortOperation and(Direction direction, String... fields) {
        return and(new Sort(direction, fields));
    }

    public SortOperation and(Sort sort) {
        return new SortOperation(this.sort.and(sort));
    }

    /*
     * (non-Javadoc)
     * @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
     */
    @Override
    public DBObject toDBObject(AggregationOperationContext context) {

        BasicDBObject object = new BasicDBObject();

        for (Order order : sort) {

            // Check reference
            FieldReference reference = context.getReference(order.getProperty());
            object.put(reference.getRaw(), order.isAscending() ? 1 : -1);
        }

        return new BasicDBObject("$sort", object);
    }
}
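// Usage sketch (added for illustration, not part of the original source): a SortOperation is typically created via
// Aggregation.sort(...); "count" and "title" are made-up property names.
//
//   SortOperation sort = new SortOperation(new Sort(Direction.DESC, "count")).and(Direction.ASC, "title");
//
//   // provided both properties are resolvable in the current AggregationOperationContext, this renders to:
//   // { "$sort" : { "count" : -1 , "title" : 1 } }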
|
||||
@@ -1,513 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013-2014 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import static org.springframework.data.mongodb.util.DBObjectUtils.*;
|
||||
|
||||
import java.util.ArrayList;
|
||||
import java.util.Collections;
|
||||
import java.util.List;
|
||||
|
||||
import org.springframework.core.GenericTypeResolver;
|
||||
import org.springframework.data.mongodb.core.spel.ExpressionNode;
|
||||
import org.springframework.data.mongodb.core.spel.ExpressionTransformationContextSupport;
|
||||
import org.springframework.data.mongodb.core.spel.LiteralNode;
|
||||
import org.springframework.data.mongodb.core.spel.MethodReferenceNode;
|
||||
import org.springframework.data.mongodb.core.spel.OperatorNode;
|
||||
import org.springframework.expression.spel.ExpressionState;
|
||||
import org.springframework.expression.spel.SpelNode;
|
||||
import org.springframework.expression.spel.SpelParserConfiguration;
|
||||
import org.springframework.expression.spel.ast.CompoundExpression;
|
||||
import org.springframework.expression.spel.ast.Indexer;
|
||||
import org.springframework.expression.spel.ast.InlineList;
|
||||
import org.springframework.expression.spel.ast.PropertyOrFieldReference;
|
||||
import org.springframework.expression.spel.standard.SpelExpression;
|
||||
import org.springframework.expression.spel.standard.SpelExpressionParser;
|
||||
import org.springframework.expression.spel.support.StandardEvaluationContext;
|
||||
import org.springframework.util.Assert;
|
||||
import org.springframework.util.NumberUtils;
|
||||
|
||||
import com.mongodb.BasicDBList;
|
||||
import com.mongodb.BasicDBObject;
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* Renders the AST of a SpEL expression as a MongoDB Aggregation Framework projection expression.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
class SpelExpressionTransformer implements AggregationExpressionTransformer {
|
||||
|
||||
// TODO: remove explicit usage of a configuration once SPR-11031 gets fixed
|
||||
private static final SpelParserConfiguration CONFIG = new SpelParserConfiguration(false, false);
|
||||
private static final SpelExpressionParser PARSER = new SpelExpressionParser(CONFIG);
|
||||
private final List<ExpressionNodeConversion<? extends ExpressionNode>> conversions;
|
||||
|
||||
/**
|
||||
* Creates a new {@link SpelExpressionTransformer}.
|
||||
*/
|
||||
public SpelExpressionTransformer() {
|
||||
|
||||
List<ExpressionNodeConversion<? extends ExpressionNode>> conversions = new ArrayList<ExpressionNodeConversion<? extends ExpressionNode>>();
|
||||
conversions.add(new OperatorNodeConversion(this));
|
||||
conversions.add(new LiteralNodeConversion(this));
|
||||
conversions.add(new IndexerNodeConversion(this));
|
||||
conversions.add(new InlineListNodeConversion(this));
|
||||
conversions.add(new PropertyOrFieldReferenceNodeConversion(this));
|
||||
conversions.add(new CompoundExpressionNodeConversion(this));
|
||||
conversions.add(new MethodReferenceNodeConversion(this));
|
||||
|
||||
this.conversions = Collections.unmodifiableList(conversions);
|
||||
}
|
||||
|
||||
/**
|
||||
* Transforms the given SpEL expression to a corresponding MongoDB expression against the given
|
||||
* {@link AggregationOperationContext} {@code context}.
|
||||
* <p>
|
||||
* Exposes the given {@code params} as <code>[0] ... [n]</code>.
|
||||
*
|
||||
* @param expression must not be {@literal null}
|
||||
* @param context must not be {@literal null}
|
||||
* @param params must not be {@literal null}
|
||||
* @return
|
||||
*/
|
||||
public Object transform(String expression, AggregationOperationContext context, Object... params) {
|
||||
|
||||
Assert.notNull(expression, "Expression must not be null!");
|
||||
Assert.notNull(context, "AggregationOperationContext must not be null!");
|
||||
Assert.notNull(params, "Parameters must not be null!");
|
||||
|
||||
SpelExpression spelExpression = (SpelExpression) PARSER.parseExpression(expression);
|
||||
ExpressionState state = new ExpressionState(new StandardEvaluationContext(params), CONFIG);
|
||||
ExpressionNode node = ExpressionNode.from(spelExpression.getAST(), state);
|
||||
|
||||
return transform(new AggregationExpressionTransformationContext<ExpressionNode>(node, null, null, context));
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.spel.ExpressionTransformer#transform(org.springframework.data.mongodb.core.spel.ExpressionTransformationContextSupport)
|
||||
*/
|
||||
public Object transform(AggregationExpressionTransformationContext<ExpressionNode> context) {
|
||||
return lookupConversionFor(context.getCurrentNode()).convert(context);
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns an appropriate {@link ExpressionNodeConversion} for the given {@code node}. Throws an
|
||||
* {@link IllegalArgumentException} if no conversion could be found.
|
||||
*
|
||||
* @param node
|
||||
* @return the appropriate {@link ExpressionNodeConversion} for the given {@link ExpressionNode}.
|
||||
*/
|
||||
@SuppressWarnings("unchecked")
|
||||
private ExpressionNodeConversion<ExpressionNode> lookupConversionFor(ExpressionNode node) {
|
||||
|
||||
for (ExpressionNodeConversion<? extends ExpressionNode> candidate : conversions) {
|
||||
if (candidate.supports(node)) {
|
||||
return (ExpressionNodeConversion<ExpressionNode>) candidate;
|
||||
}
|
||||
}
|
||||
|
||||
throw new IllegalArgumentException("Unsupported Element: " + node + " Type: " + node.getClass()
|
||||
+ " You probably have a syntax error in your SpEL expression!");
|
||||
}
|
||||
|
||||
/**
|
||||
* Abstract base class for {@link SpelNode} to (Db)-object conversions.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
private static abstract class ExpressionNodeConversion<T extends ExpressionNode> implements
|
||||
AggregationExpressionTransformer {
|
||||
|
||||
private final AggregationExpressionTransformer transformer;
|
||||
private final Class<? extends ExpressionNode> nodeType;
|
||||
|
||||
/**
|
||||
* Creates a new {@link ExpressionNodeConversion}.
|
||||
*
|
||||
* @param transformer must not be {@literal null}.
|
||||
*/
|
||||
@SuppressWarnings("unchecked")
|
||||
public ExpressionNodeConversion(AggregationExpressionTransformer transformer) {
|
||||
|
||||
Assert.notNull(transformer, "Transformer must not be null!");
|
||||
|
||||
this.nodeType = (Class<? extends ExpressionNode>) GenericTypeResolver.resolveTypeArgument(this.getClass(),
|
||||
ExpressionNodeConversion.class);
|
||||
this.transformer = transformer;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns whether the current conversion supports the given {@link ExpressionNode}. By default, the node type is
* matched against the generic type the subclass binds the type parameter to.
|
||||
*
|
||||
* @param node will never be {@literal null}.
|
||||
* @return true if {@literal this} conversion can be applied to the given {@code node}.
|
||||
*/
|
||||
protected boolean supports(ExpressionNode node) {
|
||||
return nodeType.equals(node.getClass());
|
||||
}
|
||||
|
||||
/**
|
||||
* Triggers the transformation for the given {@link ExpressionNode} and the given current context.
|
||||
*
|
||||
* @param node must not be {@literal null}.
|
||||
* @param context must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
protected Object transform(ExpressionNode node, AggregationExpressionTransformationContext<?> context) {
|
||||
|
||||
Assert.notNull(node, "ExpressionNode must not be null!");
|
||||
Assert.notNull(context, "AggregationExpressionTransformationContext must not be null!");
|
||||
|
||||
return transform(node, context.getParentNode(), null, context);
|
||||
}
|
||||
|
||||
/**
|
||||
* Triggers the transformation with the given new {@link ExpressionNode}, new parent node, the current operation and
|
||||
* the previous context.
|
||||
*
|
||||
* @param node must not be {@literal null}.
|
||||
* @param parent
|
||||
* @param operation
|
||||
* @param context must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
protected Object transform(ExpressionNode node, ExpressionNode parent, DBObject operation,
|
||||
AggregationExpressionTransformationContext<?> context) {
|
||||
|
||||
Assert.notNull(node, "ExpressionNode must not be null!");
|
||||
Assert.notNull(context, "AggregationExpressionTransformationContext must not be null!");
|
||||
|
||||
return transform(new AggregationExpressionTransformationContext<ExpressionNode>(node, parent, operation,
|
||||
context.getAggregationContext()));
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.NodeConversion#transform(org.springframework.data.mongodb.core.aggregation.AggregationExpressionTransformer.AggregationExpressionTransformationContext)
|
||||
*/
|
||||
@Override
|
||||
public Object transform(AggregationExpressionTransformationContext<ExpressionNode> context) {
|
||||
return transformer.transform(context);
|
||||
}
|
||||
|
||||
/**
|
||||
* Performs the actual conversion from {@link SpelNode} to the corresponding representation for MongoDB.
|
||||
*
|
||||
* @param context
|
||||
* @return
|
||||
*/
|
||||
protected abstract Object convert(AggregationExpressionTransformationContext<T> context);
|
||||
}
|
||||
|
||||
/**
|
||||
* An {@link ExpressionNodeConversion} that converts arithmetic operations.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
private static class OperatorNodeConversion extends ExpressionNodeConversion<OperatorNode> {
|
||||
|
||||
public OperatorNodeConversion(AggregationExpressionTransformer transformer) {
|
||||
super(transformer);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#convertSpelNodeToMongoObjectExpression(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionConversionContext)
|
||||
*/
|
||||
@Override
|
||||
protected Object convert(AggregationExpressionTransformationContext<OperatorNode> context) {
|
||||
|
||||
OperatorNode currentNode = context.getCurrentNode();
|
||||
|
||||
DBObject operationObject = createOperationObjectAndAddToPreviousArgumentsIfNecessary(context, currentNode);
|
||||
Object leftResult = transform(currentNode.getLeft(), currentNode, operationObject, context);
|
||||
|
||||
if (currentNode.isUnaryMinus()) {
|
||||
return convertUnaryMinusOp(context, leftResult);
|
||||
}
|
||||
|
||||
// we deliberately ignore the RHS result
|
||||
transform(currentNode.getRight(), currentNode, operationObject, context);
|
||||
|
||||
return operationObject;
|
||||
}
|
||||
|
||||
private DBObject createOperationObjectAndAddToPreviousArgumentsIfNecessary(
|
||||
AggregationExpressionTransformationContext<OperatorNode> context, OperatorNode currentNode) {
|
||||
|
||||
DBObject nextDbObject = new BasicDBObject(currentNode.getMongoOperator(), new BasicDBList());
|
||||
|
||||
if (!context.hasPreviousOperation()) {
|
||||
return nextDbObject;
|
||||
}
|
||||
|
||||
if (context.parentIsSameOperation()) {
|
||||
|
||||
// same operator applied in a row, e.g. 1 + 2 + 3 -> carry on with the operation and render it as $add: [1, 2, 3]
|
||||
nextDbObject = context.getPreviousOperationObject();
|
||||
} else if (!currentNode.isUnaryOperator()) {
|
||||
|
||||
// different operator -> add the context object for the next level to the argument list of the previous expression
|
||||
context.addToPreviousOperation(nextDbObject);
|
||||
}
|
||||
|
||||
return nextDbObject;
|
||||
}
|
||||
|
||||
private Object convertUnaryMinusOp(ExpressionTransformationContextSupport<OperatorNode> context, Object leftResult) {
|
||||
|
||||
Object result = leftResult instanceof Number ? leftResult
|
||||
: new BasicDBObject("$multiply", dbList(-1, leftResult));
|
||||
|
||||
if (leftResult != null && context.hasPreviousOperation()) {
|
||||
context.addToPreviousOperation(result);
|
||||
}
|
||||
|
||||
return result;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#supports(java.lang.Class)
|
||||
*/
|
||||
@Override
|
||||
protected boolean supports(ExpressionNode node) {
|
||||
return node.isMathematicalOperation();
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* An {@link ExpressionNodeConversion} that converts indexed expressions.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
private static class IndexerNodeConversion extends ExpressionNodeConversion<ExpressionNode> {
|
||||
|
||||
public IndexerNodeConversion(AggregationExpressionTransformer transformer) {
|
||||
super(transformer);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#convertSpelNodeToMongoObjectExpression(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionConversionContext)
|
||||
*/
|
||||
@Override
|
||||
protected Object convert(AggregationExpressionTransformationContext<ExpressionNode> context) {
|
||||
return context.addToPreviousOrReturn(context.getCurrentNode().getValue());
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.NodeConversion#supports(org.springframework.data.mongodb.core.spel.ExpressionNode)
|
||||
*/
|
||||
@Override
|
||||
protected boolean supports(ExpressionNode node) {
|
||||
return node.isOfType(Indexer.class);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* An {@link ExpressionNodeConversion} that converts in-line list expressions.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
private static class InlineListNodeConversion extends ExpressionNodeConversion<ExpressionNode> {
|
||||
|
||||
public InlineListNodeConversion(AggregationExpressionTransformer transformer) {
|
||||
super(transformer);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#convertSpelNodeToMongoObjectExpression(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionConversionContext)
|
||||
*/
|
||||
@Override
|
||||
protected Object convert(AggregationExpressionTransformationContext<ExpressionNode> context) {
|
||||
|
||||
ExpressionNode currentNode = context.getCurrentNode();
|
||||
|
||||
if (!currentNode.hasChildren()) {
|
||||
return null;
|
||||
}
|
||||
|
||||
// just take the first item
|
||||
return transform(currentNode.getChild(0), currentNode, null, context);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.NodeConversion#supports(org.springframework.data.mongodb.core.spel.ExpressionNode)
|
||||
*/
|
||||
@Override
|
||||
protected boolean supports(ExpressionNode node) {
|
||||
return node.isOfType(InlineList.class);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* An {@link ExpressionNodeConversion} that converts property or field reference expressions.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
private static class PropertyOrFieldReferenceNodeConversion extends ExpressionNodeConversion<ExpressionNode> {
|
||||
|
||||
public PropertyOrFieldReferenceNodeConversion(AggregationExpressionTransformer transformer) {
|
||||
super(transformer);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.NodeConversion#convert(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionTransformationContext)
|
||||
*/
|
||||
@Override
|
||||
protected Object convert(AggregationExpressionTransformationContext<ExpressionNode> context) {
|
||||
|
||||
String fieldReference = context.getFieldReference().toString();
|
||||
return context.addToPreviousOrReturn(fieldReference);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.NodeConversion#supports(org.springframework.data.mongodb.core.spel.ExpressionNode)
|
||||
*/
|
||||
@Override
|
||||
protected boolean supports(ExpressionNode node) {
|
||||
return node.isOfType(PropertyOrFieldReference.class);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* An {@link ExpressionNodeConversion} that converts literal expressions.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
private static class LiteralNodeConversion extends ExpressionNodeConversion<LiteralNode> {
|
||||
|
||||
public LiteralNodeConversion(AggregationExpressionTransformer transformer) {
|
||||
super(transformer);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#convertSpelNodeToMongoObjectExpression(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionConversionContext)
|
||||
*/
|
||||
@Override
|
||||
@SuppressWarnings("unchecked")
|
||||
protected Object convert(AggregationExpressionTransformationContext<LiteralNode> context) {
|
||||
|
||||
LiteralNode node = context.getCurrentNode();
|
||||
Object value = node.getValue();
|
||||
|
||||
if (context.hasPreviousOperation()) {
|
||||
|
||||
if (node.isUnaryMinus(context.getParentNode())) {
|
||||
// unary minus operator
|
||||
return NumberUtils.convertNumberToTargetClass(((Number) value).doubleValue() * -1,
|
||||
(Class<Number>) value.getClass()); // retain type, e.g. int to -int
|
||||
}
|
||||
|
||||
return context.addToPreviousOperation(value);
|
||||
}
|
||||
|
||||
return value;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#supports(org.springframework.expression.spel.SpelNode)
|
||||
*/
|
||||
@Override
|
||||
protected boolean supports(ExpressionNode node) {
|
||||
return node.isLiteral();
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* An {@link ExpressionNodeConversion} that converts method reference expressions.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
private static class MethodReferenceNodeConversion extends ExpressionNodeConversion<MethodReferenceNode> {
|
||||
|
||||
public MethodReferenceNodeConversion(AggregationExpressionTransformer transformer) {
|
||||
super(transformer);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#convertSpelNodeToMongoObjectExpression(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionConversionContext)
|
||||
*/
|
||||
@Override
|
||||
protected Object convert(AggregationExpressionTransformationContext<MethodReferenceNode> context) {
|
||||
|
||||
MethodReferenceNode node = context.getCurrentNode();
|
||||
List<Object> args = new ArrayList<Object>();
|
||||
|
||||
for (ExpressionNode childNode : node) {
|
||||
args.add(transform(childNode, context));
|
||||
}
|
||||
|
||||
return context.addToPreviousOrReturn(new BasicDBObject(node.getMethodName(), dbList(args.toArray())));
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* An {@link ExpressionNodeConversion} that converts compound expressions.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
private static class CompoundExpressionNodeConversion extends ExpressionNodeConversion<ExpressionNode> {
|
||||
|
||||
public CompoundExpressionNodeConversion(AggregationExpressionTransformer transformer) {
|
||||
super(transformer);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#convertSpelNodeToMongoObjectExpression(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionConversionContext)
|
||||
*/
|
||||
@Override
|
||||
protected Object convert(AggregationExpressionTransformationContext<ExpressionNode> context) {
|
||||
|
||||
ExpressionNode currentNode = context.getCurrentNode();
|
||||
|
||||
if (currentNode.hasfirstChildNotOfType(Indexer.class)) {
|
||||
// we have a property path expression like: foo.bar -> render as reference
|
||||
return context.addToPreviousOrReturn(context.getFieldReference().toString());
|
||||
}
|
||||
|
||||
return context.addToPreviousOrReturn(currentNode.getValue());
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.NodeConversion#supports(org.springframework.data.mongodb.core.spel.ExpressionNode)
|
||||
*/
|
||||
@Override
|
||||
protected boolean supports(ExpressionNode node) {
|
||||
return node.isOfType(CompoundExpression.class);
|
||||
}
|
||||
}
|
||||
}
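// Usage sketch (added for illustration, not part of the original source): this transformer backs
// ProjectionOperation.andExpression(...); external parameters are referenced as [0] ... [n]. The field names
// "netPrice" and "surcharge" are made up.
//
//   Aggregation.project().andExpression("(netPrice + surcharge) * [0]", 1.19).as("grossPrice");
//
//   // the SpEL AST is rendered into a native expression roughly equivalent to:
//   // { "grossPrice" : { "$multiply" : [ { "$add" : [ "$netPrice" , "$surcharge" ] } , 1.19 ] } }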
|
||||
@@ -1,103 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import static org.springframework.data.mongodb.core.aggregation.Fields.*;
|
||||
|
||||
import org.springframework.data.mapping.PropertyPath;
|
||||
import org.springframework.data.mapping.context.MappingContext;
|
||||
import org.springframework.data.mapping.context.PersistentPropertyPath;
|
||||
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
|
||||
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
|
||||
import org.springframework.data.mongodb.core.convert.QueryMapper;
|
||||
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
|
||||
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* {@link AggregationOperationContext} aware of a particular type and a {@link MappingContext} to potentially translate
|
||||
* property references into document field names.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
* @since 1.3
|
||||
*/
|
||||
public class TypeBasedAggregationOperationContext implements AggregationOperationContext {
|
||||
|
||||
private final Class<?> type;
|
||||
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
|
||||
private final QueryMapper mapper;
|
||||
|
||||
/**
|
||||
* Creates a new {@link TypeBasedAggregationOperationContext} for the given type, {@link MappingContext} and
|
||||
* {@link QueryMapper}.
|
||||
*
|
||||
* @param type must not be {@literal null}.
|
||||
* @param mappingContext must not be {@literal null}.
|
||||
* @param mapper must not be {@literal null}.
|
||||
*/
|
||||
public TypeBasedAggregationOperationContext(Class<?> type,
|
||||
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext, QueryMapper mapper) {
|
||||
|
||||
Assert.notNull(type, "Type must not be null!");
|
||||
Assert.notNull(mappingContext, "MappingContext must not be null!");
|
||||
Assert.notNull(mapper, "QueryMapper must not be null!");
|
||||
|
||||
this.type = type;
|
||||
this.mappingContext = mappingContext;
|
||||
this.mapper = mapper;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getMappedObject(com.mongodb.DBObject)
|
||||
*/
|
||||
@Override
|
||||
public DBObject getMappedObject(DBObject dbObject) {
|
||||
return mapper.getMappedObject(dbObject, mappingContext.getPersistentEntity(type));
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.Field)
|
||||
*/
|
||||
@Override
|
||||
public FieldReference getReference(Field field) {
|
||||
|
||||
PropertyPath.from(field.getTarget(), type);
|
||||
return getReferenceFor(field);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(java.lang.String)
|
||||
*/
|
||||
@Override
|
||||
public FieldReference getReference(String name) {
|
||||
return getReferenceFor(field(name));
|
||||
}
|
||||
|
||||
private FieldReference getReferenceFor(Field field) {
|
||||
|
||||
PersistentPropertyPath<MongoPersistentProperty> propertyPath = mappingContext.getPersistentPropertyPath(
|
||||
field.getTarget(), type);
|
||||
Field mappedField = field(propertyPath.getLeafProperty().getName(),
|
||||
propertyPath.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE));
|
||||
|
||||
return new FieldReference(new ExposedField(mappedField, true));
|
||||
}
|
||||
}
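For orientation, a minimal usage sketch of the context above. The domain type is passed in, and the pre-built `MappingContext` and `QueryMapper` instances as well as the `lastname` property are assumptions made for illustration:

```java
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;

class AggregationContextSketch {

    // Resolves the "lastname" property of the given domain type into a mapped field reference.
    FieldReference mapLastname(Class<?> domainType,
            MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext,
            QueryMapper queryMapper) {

        TypeBasedAggregationOperationContext context =
                new TypeBasedAggregationOperationContext(domainType, mappingContext, queryMapper);

        // Fails if the domain type has no "lastname" property; otherwise returns a
        // reference carrying the mapped Mongo field name for use in pipeline stages.
        return context.getReference("lastname");
    }
}
```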
@@ -1,86 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013-2014 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import java.util.List;
|
||||
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
/**
|
||||
* A {@code TypedAggregation} is a special {@link Aggregation} that holds information of the input aggregation type.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
public class TypedAggregation<I> extends Aggregation {
|
||||
|
||||
private final Class<I> inputType;
|
||||
|
||||
/**
|
||||
* Creates a new {@link TypedAggregation} from the given {@link AggregationOperation}s.
|
||||
*
|
||||
* @param inputType must not be {@literal null}.
|
||||
* @param operations must not be {@literal null} or empty.
|
||||
*/
|
||||
public TypedAggregation(Class<I> inputType, AggregationOperation... operations) {
|
||||
this(inputType, asAggregationList(operations));
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link TypedAggregation} from the given {@link AggregationOperation}s.
|
||||
*
|
||||
* @param inputType must not be {@literal null}.
|
||||
* @param operations must not be {@literal null} or empty.
|
||||
*/
|
||||
public TypedAggregation(Class<I> inputType, List<AggregationOperation> operations) {
|
||||
this(inputType, operations, DEFAULT_OPTIONS);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link TypedAggregation} from the given {@link AggregationOperation}s and the given
|
||||
* {@link AggregationOptions}.
|
||||
*
|
||||
* @param inputType must not be {@literal null}.
|
||||
* @param operations must not be {@literal null} or empty.
|
||||
* @param options must not be {@literal null}.
|
||||
*/
|
||||
public TypedAggregation(Class<I> inputType, List<AggregationOperation> operations, AggregationOptions options) {
|
||||
|
||||
super(operations, options);
|
||||
|
||||
Assert.notNull(inputType, "Input type must not be null!");
|
||||
this.inputType = inputType;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the input type for the {@link Aggregation}.
|
||||
*
|
||||
* @return the inputType will never be {@literal null}.
|
||||
*/
|
||||
public Class<I> getInputType() {
|
||||
return inputType;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.Aggregation#withOptions(org.springframework.data.mongodb.core.aggregation.AggregationOptions)
|
||||
*/
|
||||
public TypedAggregation<I> withOptions(AggregationOptions options) {
|
||||
|
||||
Assert.notNull(options, "AggregationOptions must not be null.");
|
||||
return new TypedAggregation<I>(inputType, operations, options);
|
||||
}
|
||||
}
|
||||
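A short usage sketch for `TypedAggregation`: the `Person` and `LastnameCount` classes and the `mongoTemplate` instance are hypothetical, but the pipeline methods are the ones exposed by `Aggregation`:

```java
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import java.util.List;

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.query.Criteria;

class TypedAggregationSketch {

    static class Person { String lastname; int age; }
    static class LastnameCount { long count; }

    List<LastnameCount> countAdultsPerLastname(MongoTemplate mongoTemplate) {

        // The input type is carried along so that "age" and "lastname" can be mapped
        // against the Person metadata when the pipeline is rendered.
        TypedAggregation<Person> aggregation = newAggregation(Person.class,
                match(Criteria.where("age").gte(21)),
                group("lastname").count().as("count"));

        AggregationResults<LastnameCount> results = mongoTemplate.aggregate(aggregation, LastnameCount.class);
        return results.getMappedResults();
    }
}
```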
@@ -1,55 +0,0 @@
/*
 * Copyright 2013 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core.aggregation;

import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.util.Assert;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

/**
 * Encapsulates the aggregation framework {@code $unwind}-operation.
 *
 * @see http://docs.mongodb.org/manual/reference/aggregation/unwind/#pipe._S_unwind
 * @author Thomas Darimont
 * @author Oliver Gierke
 * @since 1.3
 */
public class UnwindOperation implements AggregationOperation {

    private final ExposedField field;

    /**
     * Creates a new {@link UnwindOperation} for the given {@link Field}.
     *
     * @param field must not be {@literal null}.
     */
    public UnwindOperation(Field field) {

        Assert.notNull(field);
        this.field = new ExposedField(field, true);
    }

    /*
     * (non-Javadoc)
     * @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
     */
    @Override
    public DBObject toDBObject(AggregationOperationContext context) {
        return new BasicDBObject("$unwind", context.getReference(field).toString());
    }
}
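A minimal pipeline sketch using the `$unwind` operation above; the `tags` array property is hypothetical:

```java
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.Aggregation;

class UnwindSketch {

    // Renders to: { $unwind : "$tags" } followed by a count per tag value.
    Aggregation tagCounts() {
        return newAggregation(
                unwind("tags"),
                group("tags").count().as("n"));
    }
}
```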
@@ -1,5 +0,0 @@
/**
 * Support for the MongoDB aggregation framework.
 * @since 1.3
 */
package org.springframework.data.mongodb.core.aggregation;
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2014 the original author or authors.
|
||||
* Copyright 2011-2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -17,14 +17,12 @@ package org.springframework.data.mongodb.core.convert;
|
||||
|
||||
import java.util.ArrayList;
|
||||
import java.util.Arrays;
|
||||
import java.util.Collections;
|
||||
import java.util.HashMap;
|
||||
import java.util.HashSet;
|
||||
import java.util.LinkedHashSet;
|
||||
import java.util.List;
|
||||
import java.util.Locale;
|
||||
import java.util.Map;
|
||||
import java.util.Set;
|
||||
import java.util.concurrent.ConcurrentHashMap;
|
||||
import java.util.concurrent.ConcurrentMap;
|
||||
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
@@ -41,7 +39,6 @@ import org.springframework.data.convert.WritingConverter;
|
||||
import org.springframework.data.mapping.model.SimpleTypeHolder;
|
||||
import org.springframework.data.mongodb.core.convert.MongoConverters.BigDecimalToStringConverter;
|
||||
import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerToStringConverter;
|
||||
import org.springframework.data.mongodb.core.convert.MongoConverters.DBObjectToStringConverter;
|
||||
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigDecimalConverter;
|
||||
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigIntegerConverter;
|
||||
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToURLConverter;
|
||||
@@ -57,7 +54,6 @@ import org.springframework.util.Assert;
|
||||
* .
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
public class CustomConversions {
|
||||
|
||||
@@ -69,7 +65,7 @@ public class CustomConversions {
|
||||
private final Set<ConvertiblePair> writingPairs;
|
||||
private final Set<Class<?>> customSimpleTypes;
|
||||
private final SimpleTypeHolder simpleTypeHolder;
|
||||
private final ConcurrentMap<ConvertiblePair, CacheValue> customReadTargetTypes;
|
||||
private final Map<Class<?>, HashMap<Class<?>, CacheValue>> cache;
|
||||
|
||||
private final List<Object> converters;
|
||||
|
||||
@@ -89,34 +85,26 @@ public class CustomConversions {
|
||||
|
||||
Assert.notNull(converters);
|
||||
|
||||
this.readingPairs = new LinkedHashSet<ConvertiblePair>();
|
||||
this.writingPairs = new LinkedHashSet<ConvertiblePair>();
|
||||
this.readingPairs = new HashSet<ConvertiblePair>();
|
||||
this.writingPairs = new HashSet<ConvertiblePair>();
|
||||
this.customSimpleTypes = new HashSet<Class<?>>();
|
||||
this.customReadTargetTypes = new ConcurrentHashMap<GenericConverter.ConvertiblePair, CacheValue>();
|
||||
this.cache = new HashMap<Class<?>, HashMap<Class<?>, CacheValue>>();
|
||||
|
||||
List<Object> toRegister = new ArrayList<Object>();
|
||||
this.converters = new ArrayList<Object>();
|
||||
this.converters.add(CustomToStringConverter.INSTANCE);
|
||||
this.converters.add(BigDecimalToStringConverter.INSTANCE);
|
||||
this.converters.add(StringToBigDecimalConverter.INSTANCE);
|
||||
this.converters.add(BigIntegerToStringConverter.INSTANCE);
|
||||
this.converters.add(StringToBigIntegerConverter.INSTANCE);
|
||||
this.converters.add(URLToStringConverter.INSTANCE);
|
||||
this.converters.add(StringToURLConverter.INSTANCE);
|
||||
this.converters.addAll(JodaTimeConverters.getConvertersToRegister());
|
||||
this.converters.addAll(converters);
|
||||
|
||||
// Add user provided converters to make sure they can override the defaults
|
||||
toRegister.addAll(converters);
|
||||
toRegister.add(CustomToStringConverter.INSTANCE);
|
||||
toRegister.add(BigDecimalToStringConverter.INSTANCE);
|
||||
toRegister.add(StringToBigDecimalConverter.INSTANCE);
|
||||
toRegister.add(BigIntegerToStringConverter.INSTANCE);
|
||||
toRegister.add(StringToBigIntegerConverter.INSTANCE);
|
||||
toRegister.add(URLToStringConverter.INSTANCE);
|
||||
toRegister.add(StringToURLConverter.INSTANCE);
|
||||
toRegister.add(DBObjectToStringConverter.INSTANCE);
|
||||
|
||||
toRegister.addAll(JodaTimeConverters.getConvertersToRegister());
|
||||
toRegister.addAll(GeoConverters.getConvertersToRegister());
|
||||
|
||||
for (Object c : toRegister) {
|
||||
for (Object c : this.converters) {
|
||||
registerConversion(c);
|
||||
}
|
||||
|
||||
Collections.reverse(toRegister);
|
||||
|
||||
this.converters = Collections.unmodifiableList(toRegister);
|
||||
this.simpleTypeHolder = new SimpleTypeHolder(customSimpleTypes, MongoSimpleTypes.HOLDER);
|
||||
}
|
||||
|
||||
@@ -204,25 +192,25 @@ public class CustomConversions {
|
||||
*
|
||||
* @param pair
|
||||
*/
|
||||
private void register(ConverterRegistration converterRegistration) {
|
||||
private void register(ConverterRegistration context) {
|
||||
|
||||
ConvertiblePair pair = converterRegistration.getConvertiblePair();
|
||||
ConvertiblePair pair = context.getConvertiblePair();
|
||||
|
||||
if (converterRegistration.isReading()) {
|
||||
if (context.isReading()) {
|
||||
|
||||
readingPairs.add(pair);
|
||||
|
||||
if (LOG.isWarnEnabled() && !converterRegistration.isSimpleSourceType()) {
|
||||
if (LOG.isWarnEnabled() && !context.isSimpleSourceType()) {
|
||||
LOG.warn(String.format(READ_CONVERTER_NOT_SIMPLE, pair.getSourceType(), pair.getTargetType()));
|
||||
}
|
||||
}
|
||||
|
||||
if (converterRegistration.isWriting()) {
|
||||
if (context.isWriting()) {
|
||||
|
||||
writingPairs.add(pair);
|
||||
customSimpleTypes.add(pair.getSourceType());
|
||||
|
||||
if (LOG.isWarnEnabled() && !converterRegistration.isSimpleTargetType()) {
|
||||
if (LOG.isWarnEnabled() && !context.isSimpleTargetType()) {
|
||||
LOG.warn(String.format(WRITE_CONVERTER_NOT_SIMPLE, pair.getSourceType(), pair.getTargetType()));
|
||||
}
|
||||
}
|
||||
@@ -232,11 +220,11 @@ public class CustomConversions {
|
||||
* Returns the target type to convert to in case we have a custom conversion registered to convert the given source
|
||||
* type into a Mongo native one.
|
||||
*
|
||||
* @param sourceType must not be {@literal null}
|
||||
* @param source must not be {@literal null}
|
||||
* @return
|
||||
*/
|
||||
public Class<?> getCustomWriteTarget(Class<?> sourceType) {
|
||||
return getCustomWriteTarget(sourceType, null);
|
||||
public Class<?> getCustomWriteTarget(Class<?> source) {
|
||||
return getCustomWriteTarget(source, null);
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -244,78 +232,71 @@ public class CustomConversions {
|
||||
* of the given expected type though. If {@code expectedTargetType} is {@literal null} we will simply return the
|
||||
* first target type matching or {@literal null} if no conversion can be found.
|
||||
*
|
||||
* @param sourceType must not be {@literal null}
|
||||
* @param requestedTargetType
|
||||
* @param source must not be {@literal null}
|
||||
* @param expectedTargetType
|
||||
* @return
|
||||
*/
|
||||
public Class<?> getCustomWriteTarget(Class<?> sourceType, Class<?> requestedTargetType) {
|
||||
|
||||
Assert.notNull(sourceType);
|
||||
|
||||
return getCustomTarget(sourceType, requestedTargetType, writingPairs);
|
||||
public Class<?> getCustomWriteTarget(Class<?> source, Class<?> expectedTargetType) {
|
||||
Assert.notNull(source);
|
||||
return getCustomTarget(source, expectedTargetType, writingPairs);
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns whether we have a custom conversion registered to write into a Mongo native type. The returned type might
|
||||
* be a subclass of the given expected type though.
|
||||
* be a subclass oth the given expected type though.
|
||||
*
|
||||
* @param sourceType must not be {@literal null}
|
||||
* @param source must not be {@literal null}
|
||||
* @return
|
||||
*/
|
||||
public boolean hasCustomWriteTarget(Class<?> sourceType) {
|
||||
|
||||
Assert.notNull(sourceType);
|
||||
return hasCustomWriteTarget(sourceType, null);
|
||||
public boolean hasCustomWriteTarget(Class<?> source) {
|
||||
return hasCustomWriteTarget(source, null);
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns whether we have a custom conversion registered to write an object of the given source type into an object
|
||||
* of the given Mongo native target type.
|
||||
*
|
||||
* @param sourceType must not be {@literal null}.
|
||||
* @param requestedTargetType
|
||||
* @param source must not be {@literal null}.
|
||||
* @param expectedTargetType
|
||||
* @return
|
||||
*/
|
||||
public boolean hasCustomWriteTarget(Class<?> sourceType, Class<?> requestedTargetType) {
|
||||
|
||||
Assert.notNull(sourceType);
|
||||
return getCustomWriteTarget(sourceType, requestedTargetType) != null;
|
||||
public boolean hasCustomWriteTarget(Class<?> source, Class<?> expectedTargetType) {
|
||||
return getCustomWriteTarget(source, expectedTargetType) != null;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns whether we have a custom conversion registered to read the given source into the given target type.
|
||||
*
|
||||
* @param sourceType must not be {@literal null}
|
||||
* @param requestedTargetType must not be {@literal null}
|
||||
* @param source must not be {@literal null}
|
||||
* @param expectedTargetType must not be {@literal null}
|
||||
* @return
|
||||
*/
|
||||
public boolean hasCustomReadTarget(Class<?> sourceType, Class<?> requestedTargetType) {
|
||||
public boolean hasCustomReadTarget(Class<?> source, Class<?> expectedTargetType) {
|
||||
|
||||
Assert.notNull(sourceType);
|
||||
Assert.notNull(requestedTargetType);
|
||||
Assert.notNull(source);
|
||||
Assert.notNull(expectedTargetType);
|
||||
|
||||
return getCustomReadTarget(sourceType, requestedTargetType) != null;
|
||||
return getCustomReadTarget(source, expectedTargetType) != null;
|
||||
}
|
||||
|
||||
/**
|
||||
* Inspects the given {@link ConvertiblePair} for ones that have a source compatible type as source. Additionally
|
||||
* checks assignability of the target type if one is given.
|
||||
* checks assignabilty of the target type if one is given.
|
||||
*
|
||||
* @param sourceType must not be {@literal null}.
|
||||
* @param requestedTargetType can be {@literal null}.
|
||||
* @param pairs must not be {@literal null}.
|
||||
* @param source must not be {@literal null}
|
||||
* @param expectedTargetType
|
||||
* @param pairs must not be {@literal null}
|
||||
* @return
|
||||
*/
|
||||
private static Class<?> getCustomTarget(Class<?> sourceType, Class<?> requestedTargetType,
|
||||
Iterable<ConvertiblePair> pairs) {
|
||||
private static Class<?> getCustomTarget(Class<?> source, Class<?> expectedTargetType, Iterable<ConvertiblePair> pairs) {
|
||||
|
||||
Assert.notNull(sourceType);
|
||||
Assert.notNull(source);
|
||||
Assert.notNull(pairs);
|
||||
|
||||
for (ConvertiblePair typePair : pairs) {
|
||||
if (typePair.getSourceType().isAssignableFrom(sourceType)) {
|
||||
if (typePair.getSourceType().isAssignableFrom(source)) {
|
||||
Class<?> targetType = typePair.getTargetType();
|
||||
if (requestedTargetType == null || targetType.isAssignableFrom(requestedTargetType)) {
|
||||
if (expectedTargetType == null || targetType.isAssignableFrom(expectedTargetType)) {
|
||||
return targetType;
|
||||
}
|
||||
}
|
||||
@@ -324,33 +305,27 @@ public class CustomConversions {
|
||||
return null;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the actual target type for the given {@code sourceType} and {@code requestedTargetType}. Note that the
|
||||
* returned {@link Class} could be an assignable type to the given {@code requestedTargetType}.
|
||||
*
|
||||
* @param sourceType must not be {@literal null}.
|
||||
* @param requestedTargetType can be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
private Class<?> getCustomReadTarget(Class<?> sourceType, Class<?> requestedTargetType) {
|
||||
private Class<?> getCustomReadTarget(Class<?> source, Class<?> expectedTargetType) {
|
||||
|
||||
Assert.notNull(sourceType);
|
||||
Class<?> type = expectedTargetType == null ? PlaceholderType.class : expectedTargetType;
|
||||
|
||||
if (requestedTargetType == null) {
|
||||
return null;
|
||||
Map<Class<?>, CacheValue> map;
|
||||
CacheValue toReturn;
|
||||
|
||||
if ((map = cache.get(source)) == null || (toReturn = map.get(type)) == null) {
|
||||
|
||||
Class<?> target = getCustomTarget(source, type, readingPairs);
|
||||
|
||||
if (cache.get(source) == null) {
|
||||
cache.put(source, new HashMap<Class<?>, CacheValue>());
|
||||
}
|
||||
|
||||
ConvertiblePair lookupKey = new ConvertiblePair(sourceType, requestedTargetType);
|
||||
CacheValue readTargetTypeValue = customReadTargetTypes.get(lookupKey);
|
||||
|
||||
if (readTargetTypeValue != null) {
|
||||
return readTargetTypeValue.getType();
|
||||
Map<Class<?>, CacheValue> value = cache.get(source);
|
||||
toReturn = target == null ? CacheValue.NULL : new CacheValue(target);
|
||||
value.put(type, toReturn);
|
||||
}
|
||||
|
||||
readTargetTypeValue = CacheValue.of(getCustomTarget(sourceType, requestedTargetType, readingPairs));
|
||||
CacheValue cacheValue = customReadTargetTypes.putIfAbsent(lookupKey, readTargetTypeValue);
|
||||
|
||||
return cacheValue != null ? cacheValue.getType() : readTargetTypeValue.getType();
|
||||
return toReturn.clazz;
|
||||
}
|
||||
|
||||
@WritingConverter
|
||||
@@ -359,10 +334,8 @@ public class CustomConversions {
|
||||
INSTANCE;
|
||||
|
||||
public Set<ConvertiblePair> getConvertibleTypes() {
|
||||
|
||||
ConvertiblePair localeToString = new ConvertiblePair(Locale.class, String.class);
|
||||
ConvertiblePair booleanToString = new ConvertiblePair(Character.class, String.class);
|
||||
|
||||
return new HashSet<ConvertiblePair>(Arrays.asList(localeToString, booleanToString));
|
||||
}
|
||||
|
||||
@@ -371,29 +344,29 @@ public class CustomConversions {
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Placeholder type to allow registering not-found values in the converter cache.
|
||||
*
|
||||
* @author Patryk Wasik
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
private static class PlaceholderType {
|
||||
|
||||
}
|
||||
|
||||
/**
|
||||
* Wrapper to safely store {@literal null} values in the type cache.
|
||||
*
|
||||
* @author Patryk Wasik
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
private static class CacheValue {
|
||||
|
||||
private static final CacheValue ABSENT = new CacheValue(null);
|
||||
public static final CacheValue NULL = new CacheValue(null);
|
||||
private final Class<?> clazz;
|
||||
|
||||
private final Class<?> type;
|
||||
|
||||
public CacheValue(Class<?> type) {
|
||||
this.type = type;
|
||||
}
|
||||
|
||||
public Class<?> getType() {
|
||||
return type;
|
||||
}
|
||||
|
||||
static CacheValue of(Class<?> type) {
|
||||
return type == null ? ABSENT : new CacheValue(type);
|
||||
public CacheValue(Class<?> clazz) {
|
||||
this.clazz = clazz;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
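A small sketch of how user-provided converters interact with `CustomConversions`; the `Color` type and its converter are made up for illustration:

```java
import java.util.Arrays;

import org.springframework.core.convert.converter.Converter;
import org.springframework.data.mongodb.core.convert.CustomConversions;

class CustomConversionsSketch {

    enum Color { RED, GREEN }

    // Writes Color values as plain strings.
    enum ColorToStringConverter implements Converter<Color, String> {
        INSTANCE;

        public String convert(Color source) {
            return source.name();
        }
    }

    Class<?> demo() {

        CustomConversions conversions = new CustomConversions(Arrays.asList(ColorToStringConverter.INSTANCE));

        // User-provided converters are registered before the defaults so they can override them.
        boolean hasTarget = conversions.hasCustomWriteTarget(Color.class); // true
        return hasTarget ? conversions.getCustomWriteTarget(Color.class) : null; // String.class
    }
}
```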
@@ -1,123 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.convert;
|
||||
|
||||
import java.util.Arrays;
|
||||
import java.util.Iterator;
|
||||
import java.util.Map;
|
||||
|
||||
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
import com.mongodb.BasicDBObject;
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* Wrapper value object for a {@link BasicDBObject} to be able to access raw values by {@link MongoPersistentProperty}
|
||||
* references. The accessors will transparently resolve nested document values that a {@link MongoPersistentProperty}
|
||||
* might refer to through a path expression in field names.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
class DBObjectAccessor {
|
||||
|
||||
private final DBObject dbObject;
|
||||
|
||||
/**
|
||||
* Creates a new {@link DBObjectAccessor} for the given {@link DBObject}.
|
||||
*
|
||||
* @param dbObject must be a {@link BasicDBObject} effectively, must not be {@literal null}.
|
||||
*/
|
||||
public DBObjectAccessor(DBObject dbObject) {
|
||||
|
||||
Assert.notNull(dbObject, "DBObject must not be null!");
|
||||
Assert.isInstanceOf(BasicDBObject.class, dbObject, "Given DBObject must be a BasicDBObject!");
|
||||
|
||||
this.dbObject = dbObject;
|
||||
}
|
||||
|
||||
/**
|
||||
* Puts the given value into the backing {@link DBObject} based on the coordinates defined through the given
|
||||
* {@link MongoPersistentProperty}. By default this will be the plain field name. But field names might also consist
|
||||
* of path traversals so we might need to create intermediate {@link BasicDBObject}s.
|
||||
*
|
||||
* @param prop must not be {@literal null}.
|
||||
* @param value
|
||||
*/
|
||||
public void put(MongoPersistentProperty prop, Object value) {
|
||||
|
||||
Assert.notNull(prop, "MongoPersistentProperty must not be null!");
|
||||
String fieldName = prop.getFieldName();
|
||||
|
||||
Iterator<String> parts = Arrays.asList(fieldName.split("\\.")).iterator();
|
||||
DBObject dbObject = this.dbObject;
|
||||
|
||||
while (parts.hasNext()) {
|
||||
|
||||
String part = parts.next();
|
||||
|
||||
if (parts.hasNext()) {
|
||||
BasicDBObject nestedDbObject = new BasicDBObject();
|
||||
dbObject.put(part, nestedDbObject);
|
||||
dbObject = nestedDbObject;
|
||||
} else {
|
||||
dbObject.put(part, value);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the value the given {@link MongoPersistentProperty} refers to. By default this will be a direct field but
|
||||
* the method will also transparently resolve nested values the {@link MongoPersistentProperty} might refer to through
|
||||
* a path expression in the field name metadata.
|
||||
*
|
||||
* @param property must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
@SuppressWarnings("unchecked")
|
||||
public Object get(MongoPersistentProperty property) {
|
||||
|
||||
String fieldName = property.getFieldName();
|
||||
Iterator<String> parts = Arrays.asList(fieldName.split("\\.")).iterator();
|
||||
Map<Object, Object> source = this.dbObject.toMap();
|
||||
Object result = null;
|
||||
|
||||
while (source != null && parts.hasNext()) {
|
||||
|
||||
result = source.get(parts.next());
|
||||
|
||||
if (parts.hasNext()) {
|
||||
source = getAsMap(result);
|
||||
}
|
||||
}
|
||||
|
||||
return result;
|
||||
}
|
||||
|
||||
@SuppressWarnings("unchecked")
|
||||
private Map<Object, Object> getAsMap(Object source) {
|
||||
|
||||
if (source instanceof BasicDBObject) {
|
||||
return ((DBObject) source).toMap();
|
||||
}
|
||||
|
||||
if (source instanceof Map) {
|
||||
return (Map<Object, Object>) source;
|
||||
}
|
||||
|
||||
return null;
|
||||
}
|
||||
}
|
||||
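A sketch that mirrors the nesting logic of `DBObjectAccessor#put` for a dotted field name such as `address.city`, rebuilt with the plain driver API since the accessor itself is package-private:

```java
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

class DotPathSketch {

    // Intermediate documents are created for every path segment but the last one,
    // which receives the actual value.
    DBObject writeNested(String dotPath, Object value) {

        DBObject root = new BasicDBObject();
        DBObject current = root;
        String[] parts = dotPath.split("\\.");

        for (int i = 0; i < parts.length; i++) {
            if (i < parts.length - 1) {
                BasicDBObject nested = new BasicDBObject();
                current.put(parts[i], nested);
                current = nested;
            } else {
                current.put(parts[i], value);
            }
        }

        return root; // e.g. { "address" : { "city" : "Dresden" } }
    }
}
```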
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2012-2014 the original author or authors.
|
||||
* Copyright 2012 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -31,30 +31,18 @@ import com.mongodb.DBObject;
|
||||
*/
|
||||
class DBObjectPropertyAccessor extends MapAccessor {
|
||||
|
||||
static final MapAccessor INSTANCE = new DBObjectPropertyAccessor();
|
||||
static MapAccessor INSTANCE = new DBObjectPropertyAccessor();
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.context.expression.MapAccessor#getSpecificTargetClasses()
|
||||
*/
|
||||
@Override
|
||||
public Class<?>[] getSpecificTargetClasses() {
|
||||
return new Class[] { DBObject.class };
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.context.expression.MapAccessor#canRead(org.springframework.expression.EvaluationContext, java.lang.Object, java.lang.String)
|
||||
*/
|
||||
@Override
|
||||
public boolean canRead(EvaluationContext context, Object target, String name) {
|
||||
return true;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.context.expression.MapAccessor#read(org.springframework.expression.EvaluationContext, java.lang.Object, java.lang.String)
|
||||
*/
|
||||
@Override
|
||||
@SuppressWarnings("unchecked")
|
||||
public TypedValue read(EvaluationContext context, Object target, String name) {
|
||||
|
||||
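`DBObjectPropertyAccessor` itself is package-private, so the following sketch uses Spring's plain `MapAccessor` instead to show the same idea: SpEL expressions reading values straight out of a `DBObject`, which works here because `BasicDBObject` implements `Map`:

```java
import org.springframework.context.expression.MapAccessor;
import org.springframework.expression.Expression;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.expression.spel.support.StandardEvaluationContext;

import com.mongodb.BasicDBObject;

class SpelOnDbObjectSketch {

    Object readFirstname() {

        BasicDBObject source = new BasicDBObject("firstname", "Dave");

        // Register a map-based property accessor so SpEL can resolve "firstname".
        StandardEvaluationContext context = new StandardEvaluationContext(source);
        context.addPropertyAccessor(new MapAccessor());

        Expression expression = new SpelExpressionParser().parseExpression("firstname");
        return expression.getValue(context); // "Dave"
    }
}
```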
@@ -1,55 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013-2014 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.convert;
|
||||
|
||||
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
|
||||
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
|
||||
|
||||
import com.mongodb.DBRef;
|
||||
|
||||
/**
|
||||
* Used to resolve associations annotated with {@link org.springframework.data.mongodb.core.mapping.DBRef}.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
* @since 1.4
|
||||
*/
|
||||
public interface DbRefResolver {
|
||||
|
||||
/**
|
||||
* Resolves the given {@link DBRef} into an object of the given {@link MongoPersistentProperty}'s type. The method
|
||||
* might return a proxy object for the {@link DBRef} or resolve it immediately. In both cases the
|
||||
* {@link DbRefResolverCallback} will be used to obtain the actual backing object.
|
||||
*
|
||||
* @param property will never be {@literal null}.
|
||||
* @param dbref the {@link DBRef} to resolve.
|
||||
* @param callback will never be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback);
|
||||
|
||||
/**
|
||||
* Creates a {@link DBRef} instance for the given {@link org.springframework.data.mongodb.core.mapping.DBRef}
|
||||
* annotation, {@link MongoPersistentEntity} and id.
|
||||
*
|
||||
* @param annotation will never be {@literal null}.
|
||||
* @param entity will never be {@literal null}.
|
||||
* @param id will never be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
DBRef createDbRef(org.springframework.data.mongodb.core.mapping.DBRef annotation, MongoPersistentEntity<?> entity,
|
||||
Object id);
|
||||
}
|
||||
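A minimal, always-eager implementation sketch of the contract above; real applications would normally rely on `DefaultDbRefResolver`, and the `EagerDbRefResolver` name is made up:

```java
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DbRefResolverCallback;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;

import com.mongodb.DBRef;

class EagerDbRefResolver implements DbRefResolver {

    private final MongoDbFactory mongoDbFactory;

    EagerDbRefResolver(MongoDbFactory mongoDbFactory) {
        this.mongoDbFactory = mongoDbFactory;
    }

    // Ignores laziness and resolves the referenced document immediately.
    public Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback) {
        return callback.resolve(property);
    }

    // Builds the reference against the default database (the annotation's db() attribute
    // is ignored here for brevity).
    public DBRef createDbRef(org.springframework.data.mongodb.core.mapping.DBRef annotation,
            MongoPersistentEntity<?> entity, Object id) {
        return new DBRef(mongoDbFactory.getDb(), entity.getCollection(), id);
    }
}
```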
@@ -1,35 +0,0 @@
/*
 * Copyright 2013 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core.convert;

import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;

/**
 * Callback interface to be used in conjunction with {@link DbRefResolver}.
 *
 * @author Thomas Darimont
 * @author Oliver Gierke
 */
public interface DbRefResolverCallback {

    /**
     * Resolve the final object for the given {@link MongoPersistentProperty}.
     *
     * @param property will never be {@literal null}.
     * @return
     */
    Object resolve(MongoPersistentProperty property);
}
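A sketch of handing a callback to a resolver. The callback body simply follows the `DBRef` via the driver's `fetch()`, whereas the framework's own callback additionally converts the fetched document into the property type:

```java
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DbRefResolverCallback;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;

import com.mongodb.DBRef;

class CallbackSketch {

    Object resolve(DbRefResolver resolver, MongoPersistentProperty property, final DBRef dbref) {

        DbRefResolverCallback callback = new DbRefResolverCallback() {

            public Object resolve(MongoPersistentProperty property) {
                // Fetch the referenced document; conversion into the property type is omitted here.
                return dbref.fetch();
            }
        };

        // The resolver decides whether to invoke the callback eagerly or behind a lazy proxy.
        return resolver.resolveDbRef(property, dbref, callback);
    }
}
```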
@@ -1,383 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013-2014 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.convert;
|
||||
|
||||
import static org.springframework.util.ReflectionUtils.*;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.io.ObjectInputStream;
|
||||
import java.io.ObjectOutputStream;
|
||||
import java.io.Serializable;
|
||||
import java.lang.reflect.Method;
|
||||
|
||||
import org.aopalliance.intercept.MethodInterceptor;
|
||||
import org.aopalliance.intercept.MethodInvocation;
|
||||
import org.springframework.aop.framework.ProxyFactory;
|
||||
import org.springframework.cglib.proxy.Callback;
|
||||
import org.springframework.cglib.proxy.Enhancer;
|
||||
import org.springframework.cglib.proxy.Factory;
|
||||
import org.springframework.cglib.proxy.MethodProxy;
|
||||
import org.springframework.dao.DataAccessException;
|
||||
import org.springframework.dao.support.PersistenceExceptionTranslator;
|
||||
import org.springframework.data.mongodb.LazyLoadingException;
|
||||
import org.springframework.data.mongodb.MongoDbFactory;
|
||||
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
|
||||
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
|
||||
import org.springframework.objenesis.ObjenesisStd;
|
||||
import org.springframework.util.Assert;
|
||||
import org.springframework.util.ReflectionUtils;
|
||||
import org.springframework.util.StringUtils;
|
||||
|
||||
import com.mongodb.DB;
|
||||
import com.mongodb.DBRef;
|
||||
|
||||
/**
|
||||
* A {@link DbRefResolver} that resolves {@link org.springframework.data.mongodb.core.mapping.DBRef}s by delegating to a
|
||||
* {@link DbRefResolverCallback} that is able to generate lazy loading proxies.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
* @since 1.4
|
||||
*/
|
||||
public class DefaultDbRefResolver implements DbRefResolver {
|
||||
|
||||
private final MongoDbFactory mongoDbFactory;
|
||||
private final PersistenceExceptionTranslator exceptionTranslator;
|
||||
private final ObjenesisStd objenesis;
|
||||
|
||||
/**
|
||||
* Creates a new {@link DefaultDbRefResolver} with the given {@link MongoDbFactory}.
|
||||
*
|
||||
* @param mongoDbFactory must not be {@literal null}.
|
||||
*/
|
||||
public DefaultDbRefResolver(MongoDbFactory mongoDbFactory) {
|
||||
|
||||
Assert.notNull(mongoDbFactory, "MongoDbFactory translator must not be null!");
|
||||
|
||||
this.mongoDbFactory = mongoDbFactory;
|
||||
this.exceptionTranslator = mongoDbFactory.getExceptionTranslator();
|
||||
this.objenesis = new ObjenesisStd(true);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.convert.DbRefResolver#resolveDbRef(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty, org.springframework.data.mongodb.core.convert.DbRefResolverCallback)
|
||||
*/
|
||||
@Override
|
||||
public Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback) {
|
||||
|
||||
Assert.notNull(property, "Property must not be null!");
|
||||
Assert.notNull(callback, "Callback must not be null!");
|
||||
|
||||
if (isLazyDbRef(property)) {
|
||||
return createLazyLoadingProxy(property, dbref, callback);
|
||||
}
|
||||
|
||||
return callback.resolve(property);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.convert.DbRefResolver#created(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty, org.springframework.data.mongodb.core.mapping.MongoPersistentEntity, java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
public DBRef createDbRef(org.springframework.data.mongodb.core.mapping.DBRef annotation,
|
||||
MongoPersistentEntity<?> entity, Object id) {
|
||||
|
||||
DB db = mongoDbFactory.getDb();
|
||||
db = annotation != null && StringUtils.hasText(annotation.db()) ? mongoDbFactory.getDb(annotation.db()) : db;
|
||||
|
||||
return new DBRef(db, entity.getCollection(), id);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a proxy for the given {@link MongoPersistentProperty} using the given {@link DbRefResolverCallback} to
|
||||
* eventually resolve the value of the property.
|
||||
*
|
||||
* @param property must not be {@literal null}.
|
||||
* @param dbref can be {@literal null}.
|
||||
* @param callback must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
private Object createLazyLoadingProxy(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback) {
|
||||
|
||||
Class<?> propertyType = property.getType();
|
||||
LazyLoadingInterceptor interceptor = new LazyLoadingInterceptor(property, dbref, exceptionTranslator, callback);
|
||||
|
||||
if (!propertyType.isInterface()) {
|
||||
|
||||
Factory factory = (Factory) objenesis.newInstance(getEnhancedTypeFor(propertyType));
|
||||
factory.setCallbacks(new Callback[] { interceptor });
|
||||
|
||||
return factory;
|
||||
}
|
||||
|
||||
ProxyFactory proxyFactory = new ProxyFactory();
|
||||
|
||||
for (Class<?> type : propertyType.getInterfaces()) {
|
||||
proxyFactory.addInterface(type);
|
||||
}
|
||||
|
||||
proxyFactory.addInterface(LazyLoadingProxy.class);
|
||||
proxyFactory.addInterface(propertyType);
|
||||
proxyFactory.addAdvice(interceptor);
|
||||
|
||||
return proxyFactory.getProxy();
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the CGLib enhanced type for the given source type.
|
||||
*
|
||||
* @param type
|
||||
* @return
|
||||
*/
|
||||
private Class<?> getEnhancedTypeFor(Class<?> type) {
|
||||
|
||||
Enhancer enhancer = new Enhancer();
|
||||
enhancer.setSuperclass(type);
|
||||
enhancer.setCallbackType(org.springframework.cglib.proxy.MethodInterceptor.class);
|
||||
enhancer.setInterfaces(new Class[] { LazyLoadingProxy.class });
|
||||
|
||||
return enhancer.createClass();
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns whether the property shall be resolved lazily.
|
||||
*
|
||||
* @param property must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
private boolean isLazyDbRef(MongoPersistentProperty property) {
|
||||
return property.getDBRef() != null && property.getDBRef().lazy();
|
||||
}
|
||||
|
||||
/**
|
||||
* A {@link MethodInterceptor} that is used within a lazy loading proxy. The property resolving is delegated to a
|
||||
* {@link DbRefResolverCallback}. The resolving process is triggered by a method invocation on the proxy and is
|
||||
* guaranteed to be performed only once.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
static class LazyLoadingInterceptor implements MethodInterceptor, org.springframework.cglib.proxy.MethodInterceptor,
|
||||
Serializable {
|
||||
|
||||
private static final Method INITIALIZE_METHOD, TO_DBREF_METHOD;
|
||||
|
||||
private final DbRefResolverCallback callback;
|
||||
private final MongoPersistentProperty property;
|
||||
private final PersistenceExceptionTranslator exceptionTranslator;
|
||||
|
||||
private volatile boolean resolved;
|
||||
private Object result;
|
||||
private DBRef dbref;
|
||||
|
||||
static {
|
||||
try {
|
||||
INITIALIZE_METHOD = LazyLoadingProxy.class.getMethod("initialize");
|
||||
TO_DBREF_METHOD = LazyLoadingProxy.class.getMethod("toDBRef");
|
||||
} catch (Exception e) {
|
||||
throw new RuntimeException(e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link LazyLoadingInterceptor} for the given {@link MongoPersistentProperty},
|
||||
* {@link PersistenceExceptionTranslator} and {@link DbRefResolverCallback}.
|
||||
*
|
||||
* @param property must not be {@literal null}.
|
||||
* @param dbref can be {@literal null}.
|
||||
* @param callback must not be {@literal null}.
|
||||
*/
|
||||
public LazyLoadingInterceptor(MongoPersistentProperty property, DBRef dbref,
|
||||
PersistenceExceptionTranslator exceptionTranslator, DbRefResolverCallback callback) {
|
||||
|
||||
Assert.notNull(property, "Property must not be null!");
|
||||
Assert.notNull(exceptionTranslator, "Exception translator must not be null!");
|
||||
Assert.notNull(callback, "Callback must not be null!");
|
||||
|
||||
this.dbref = dbref;
|
||||
this.callback = callback;
|
||||
this.exceptionTranslator = exceptionTranslator;
|
||||
this.property = property;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.aopalliance.intercept.MethodInterceptor#invoke(org.aopalliance.intercept.MethodInvocation)
|
||||
*/
|
||||
@Override
|
||||
public Object invoke(MethodInvocation invocation) throws Throwable {
|
||||
return intercept(invocation.getThis(), invocation.getMethod(), invocation.getArguments(), null);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.cglib.proxy.MethodInterceptor#intercept(java.lang.Object, java.lang.reflect.Method, java.lang.Object[], org.springframework.cglib.proxy.MethodProxy)
|
||||
*/
|
||||
@Override
|
||||
public Object intercept(Object obj, Method method, Object[] args, MethodProxy proxy) throws Throwable {
|
||||
|
||||
if (INITIALIZE_METHOD.equals(method)) {
|
||||
return ensureResolved();
|
||||
}
|
||||
|
||||
if (TO_DBREF_METHOD.equals(method)) {
|
||||
return this.dbref;
|
||||
}
|
||||
|
||||
if (isObjectMethod(method) && Object.class.equals(method.getDeclaringClass())) {
|
||||
|
||||
if (ReflectionUtils.isToStringMethod(method)) {
|
||||
return proxyToString(proxy);
|
||||
}
|
||||
|
||||
if (ReflectionUtils.isEqualsMethod(method)) {
|
||||
return proxyEquals(proxy, args[0]);
|
||||
}
|
||||
|
||||
if (ReflectionUtils.isHashCodeMethod(method)) {
|
||||
return proxyHashCode(proxy);
|
||||
}
|
||||
}
|
||||
|
||||
Object target = ensureResolved();
|
||||
|
||||
if (target == null) {
|
||||
return null;
|
||||
}
|
||||
|
||||
return method.invoke(target, args);
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns a to string representation for the given {@code proxy}.
|
||||
*
|
||||
* @param proxy
|
||||
* @return
|
||||
*/
|
||||
private String proxyToString(Object proxy) {
|
||||
|
||||
StringBuilder description = new StringBuilder();
|
||||
if (dbref != null) {
|
||||
description.append(dbref.getRef());
|
||||
description.append(":");
|
||||
description.append(dbref.getId());
|
||||
} else {
|
||||
description.append(System.identityHashCode(proxy));
|
||||
}
|
||||
description.append("$").append(LazyLoadingProxy.class.getSimpleName());
|
||||
|
||||
return description.toString();
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the hashcode for the given {@code proxy}.
|
||||
*
|
||||
* @param proxy
|
||||
* @return
|
||||
*/
|
||||
private int proxyHashCode(Object proxy) {
|
||||
return proxyToString(proxy).hashCode();
|
||||
}
|
||||
|
||||
/**
|
||||
* Performs an equality check for the given {@code proxy}.
|
||||
*
|
||||
* @param proxy
|
||||
* @param that
|
||||
* @return
|
||||
*/
|
||||
private boolean proxyEquals(Object proxy, Object that) {
|
||||
|
||||
if (!(that instanceof LazyLoadingProxy)) {
|
||||
return false;
|
||||
}
|
||||
|
||||
if (that == proxy) {
|
||||
return true;
|
||||
}
|
||||
|
||||
return proxyToString(proxy).equals(that.toString());
|
||||
}
|
||||
|
||||
/**
|
||||
* Will trigger the resolution if the proxy is not resolved already or return a previously resolved result.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
private Object ensureResolved() {
|
||||
|
||||
if (!resolved) {
|
||||
this.result = resolve();
|
||||
this.resolved = true;
|
||||
}
|
||||
|
||||
return this.result;
|
||||
}
|
||||
|
||||
/**
|
||||
* Callback method for serialization.
|
||||
*
|
||||
* @param out
|
||||
* @throws IOException
|
||||
*/
|
||||
private void writeObject(ObjectOutputStream out) throws IOException {
|
||||
|
||||
ensureResolved();
|
||||
out.writeObject(this.result);
|
||||
}
|
||||
|
||||
/**
|
||||
* Callback method for deserialization.
|
||||
*
|
||||
* @param in
|
||||
* @throws IOException
|
||||
*/
|
||||
private void readObject(ObjectInputStream in) throws IOException {
|
||||
|
||||
try {
|
||||
this.resolved = true;
|
||||
this.result = in.readObject();
|
||||
} catch (ClassNotFoundException e) {
|
||||
throw new LazyLoadingException("Could not deserialize result", e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Resolves the proxy into its backing object.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
private synchronized Object resolve() {
|
||||
|
||||
if (!resolved) {
|
||||
|
||||
try {
|
||||
|
||||
return callback.resolve(property);
|
||||
|
||||
} catch (RuntimeException ex) {
|
||||
|
||||
DataAccessException translatedException = this.exceptionTranslator.translateExceptionIfPossible(ex);
|
||||
throw new LazyLoadingException("Unable to lazily resolve DBRef!", translatedException);
|
||||
}
|
||||
}
|
||||
|
||||
return result;
|
||||
}
|
||||
}
|
||||
}
|
||||
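A usage sketch for the lazy resolution added here: marking an association with `@DBRef(lazy = true)` makes the resolver hand out a proxy instead of fetching the referenced documents up front. `Customer` and `Account` are hypothetical domain types:

```java
import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
class Customer {

    @Id String id;

    // With lazy = true the referenced documents are not loaded while the Customer
    // itself is read; the first method call on the proxy triggers the actual lookup.
    @DBRef(lazy = true)
    List<Account> accounts;
}

@Document
class Account {

    @Id String id;
    double balance;
}
```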
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2013 the original author or authors.
|
||||
* Copyright 2011 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -18,19 +18,16 @@ package org.springframework.data.mongodb.core.convert;
|
||||
import java.util.Arrays;
|
||||
import java.util.List;
|
||||
import java.util.Map;
|
||||
import java.util.Set;
|
||||
|
||||
import org.springframework.data.convert.DefaultTypeMapper;
|
||||
import org.springframework.data.convert.SimpleTypeInformationMapper;
|
||||
import org.springframework.data.convert.DefaultTypeMapper;
|
||||
import org.springframework.data.convert.TypeAliasAccessor;
|
||||
import org.springframework.data.convert.TypeInformationMapper;
|
||||
import org.springframework.data.mapping.PersistentEntity;
|
||||
import org.springframework.data.mapping.context.MappingContext;
|
||||
import org.springframework.data.util.ClassTypeInformation;
|
||||
import org.springframework.data.util.TypeInformation;
|
||||
|
||||
import com.mongodb.BasicDBList;
|
||||
import com.mongodb.BasicDBObject;
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
@@ -40,43 +37,33 @@ import com.mongodb.DBObject;
|
||||
* respectively.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
public class DefaultMongoTypeMapper extends DefaultTypeMapper<DBObject> implements MongoTypeMapper {
|
||||
|
||||
public static final String DEFAULT_TYPE_KEY = "_class";
|
||||
@SuppressWarnings("rawtypes")//
|
||||
@SuppressWarnings("rawtypes")
|
||||
private static final TypeInformation<List> LIST_TYPE_INFO = ClassTypeInformation.from(List.class);
|
||||
@SuppressWarnings("rawtypes")//
|
||||
@SuppressWarnings("rawtypes")
|
||||
private static final TypeInformation<Map> MAP_TYPE_INFO = ClassTypeInformation.from(Map.class);
|
||||
|
||||
private final TypeAliasAccessor<DBObject> accessor;
|
||||
private final String typeKey;
|
||||
private String typeKey = DEFAULT_TYPE_KEY;
|
||||
|
||||
public DefaultMongoTypeMapper() {
|
||||
this(DEFAULT_TYPE_KEY);
|
||||
this(DEFAULT_TYPE_KEY, Arrays.asList(SimpleTypeInformationMapper.INSTANCE));
|
||||
}
|
||||
|
||||
public DefaultMongoTypeMapper(String typeKey) {
|
||||
this(typeKey, Arrays.asList(SimpleTypeInformationMapper.INSTANCE));
|
||||
super(new DBObjectTypeAliasAccessor(typeKey));
|
||||
this.typeKey = typeKey;
|
||||
}
|
||||
|
||||
public DefaultMongoTypeMapper(String typeKey, MappingContext<? extends PersistentEntity<?, ?>, ?> mappingContext) {
|
||||
this(typeKey, new DBObjectTypeAliasAccessor(typeKey), mappingContext, Arrays
|
||||
.asList(SimpleTypeInformationMapper.INSTANCE));
|
||||
super(new DBObjectTypeAliasAccessor(typeKey), mappingContext, Arrays.asList(SimpleTypeInformationMapper.INSTANCE));
|
||||
this.typeKey = typeKey;
|
||||
}
|
||||
|
||||
public DefaultMongoTypeMapper(String typeKey, List<? extends TypeInformationMapper> mappers) {
|
||||
this(typeKey, new DBObjectTypeAliasAccessor(typeKey), null, mappers);
|
||||
}
|
||||
|
||||
private DefaultMongoTypeMapper(String typeKey, TypeAliasAccessor<DBObject> accessor,
|
||||
MappingContext<? extends PersistentEntity<?, ?>, ?> mappingContext, List<? extends TypeInformationMapper> mappers) {
|
||||
|
||||
super(accessor, mappingContext, mappers);
|
||||
|
||||
super(new DBObjectTypeAliasAccessor(typeKey), mappers);
|
||||
this.typeKey = typeKey;
|
||||
this.accessor = accessor;
|
||||
}
|
||||
|
||||
/*
|
||||
@@ -87,31 +74,6 @@ public class DefaultMongoTypeMapper extends DefaultTypeMapper<DBObject> implemen
|
||||
return typeKey == null ? false : typeKey.equals(key);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.convert.MongoTypeMapper#writeTypeRestrictions(java.util.Set)
|
||||
*/
|
||||
@Override
|
||||
public void writeTypeRestrictions(DBObject result, Set<Class<?>> restrictedTypes) {
|
||||
|
||||
if (restrictedTypes == null || restrictedTypes.isEmpty()) {
|
||||
return;
|
||||
}
|
||||
|
||||
BasicDBList restrictedMappedTypes = new BasicDBList();
|
||||
|
||||
for (Class<?> restrictedType : restrictedTypes) {
|
||||
|
||||
Object typeAlias = getAliasFor(ClassTypeInformation.from(restrictedType));
|
||||
|
||||
if (typeAlias != null) {
|
||||
restrictedMappedTypes.add(typeAlias);
|
||||
}
|
||||
}
|
||||
|
||||
accessor.writeTypeTo(result, new BasicDBObject("$in", restrictedMappedTypes));
|
||||
}
|
||||
|
||||
/* (non-Javadoc)
|
||||
* @see org.springframework.data.convert.DefaultTypeMapper#getFallbackTypeFor(java.lang.Object)
|
||||
*/
|
||||
@@ -121,7 +83,6 @@ public class DefaultMongoTypeMapper extends DefaultTypeMapper<DBObject> implemen
|
||||
}
|
||||
|
||||
/**
|
||||
* {@link TypeAliasAccessor} to store aliases in a {@link DBObject}.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
|
||||
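A short configuration sketch around the type mapper; the `MappingMongoConverter` instance is assumed to be the one backing the template:

```java
import org.springframework.data.mongodb.core.convert.DefaultMongoTypeMapper;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;

class TypeMapperSketch {

    void customizeTypeKey(MappingMongoConverter converter) {

        // Store type information under "type" instead of the default "_class" key ...
        converter.setTypeMapper(new DefaultMongoTypeMapper("type"));

        // ... or pass null as the type key to suppress writing type information entirely.
        converter.setTypeMapper(new DefaultMongoTypeMapper(null));
    }
}
```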
@@ -1,520 +0,0 @@
|
||||
/*
|
||||
* Copyright 2014 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.convert;
|
||||
|
||||
import java.util.ArrayList;
|
||||
import java.util.Arrays;
|
||||
import java.util.Collection;
|
||||
import java.util.List;
|
||||
|
||||
import org.springframework.core.convert.converter.Converter;
|
||||
import org.springframework.data.convert.ReadingConverter;
|
||||
import org.springframework.data.convert.WritingConverter;
|
||||
import org.springframework.data.geo.Box;
|
||||
import org.springframework.data.geo.Circle;
|
||||
import org.springframework.data.geo.Distance;
|
||||
import org.springframework.data.geo.Metrics;
|
||||
import org.springframework.data.geo.Point;
|
||||
import org.springframework.data.geo.Polygon;
|
||||
import org.springframework.data.geo.Shape;
|
||||
import org.springframework.data.mongodb.core.geo.Sphere;
|
||||
import org.springframework.data.mongodb.core.query.GeoCommand;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
import com.mongodb.BasicDBList;
|
||||
import com.mongodb.BasicDBObject;
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* Wrapper class to contain useful geo structure converters for the usage with Mongo.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
* @since 1.5
|
||||
*/
|
||||
abstract class GeoConverters {
|
||||
|
||||
/**
|
||||
* Private constructor to prevent instantiation.
|
||||
*/
|
||||
private GeoConverters() {}
|
||||
|
||||
/**
|
||||
* Returns the geo converters to be registered.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
@SuppressWarnings("unchecked")
|
||||
public static Collection<? extends Object> getConvertersToRegister() {
|
||||
return Arrays.asList( //
|
||||
BoxToDbObjectConverter.INSTANCE //
|
||||
, PolygonToDbObjectConverter.INSTANCE //
|
||||
, CircleToDbObjectConverter.INSTANCE //
|
||||
, LegacyCircleToDbObjectConverter.INSTANCE //
|
||||
, SphereToDbObjectConverter.INSTANCE //
|
||||
, DbObjectToBoxConverter.INSTANCE //
|
||||
, DbObjectToPolygonConverter.INSTANCE //
|
||||
, DbObjectToCircleConverter.INSTANCE //
|
||||
, DbObjectToLegacyCircleConverter.INSTANCE //
|
||||
, DbObjectToSphereConverter.INSTANCE //
|
||||
, DbObjectToPointConverter.INSTANCE //
|
||||
, PointToDbObjectConverter.INSTANCE //
|
||||
, GeoCommandToDbObjectConverter.INSTANCE);
|
||||
}
|
||||
|
||||
/**
|
||||
* Converts a {@link List} of {@link Double}s into a {@link Point}.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @since 1.5
|
||||
*/
|
||||
@ReadingConverter
|
||||
public static enum DbObjectToPointConverter implements Converter<DBObject, Point> {
|
||||
|
||||
INSTANCE;
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
@SuppressWarnings("deprecation")
|
||||
public Point convert(DBObject source) {
|
||||
|
||||
Assert.isTrue(source.keySet().size() == 2, "Source must contain 2 elements");
|
||||
|
||||
return source == null ? null : new org.springframework.data.mongodb.core.geo.Point((Double) source.get("x"),
|
||||
(Double) source.get("y"));
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Converts a {@link Point} into a {@link List} of {@link Double}s.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @since 1.5
|
||||
*/
|
||||
public static enum PointToDbObjectConverter implements Converter<Point, DBObject> {
|
||||
|
||||
INSTANCE;
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
public DBObject convert(Point source) {
|
||||
return source == null ? null : new BasicDBObject("x", source.getX()).append("y", source.getY());
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Converts a {@link Box} into a {@link BasicDBList}.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @since 1.5
|
||||
*/
|
||||
@WritingConverter
|
||||
public static enum BoxToDbObjectConverter implements Converter<Box, DBObject> {
|
||||
|
||||
INSTANCE;
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
public DBObject convert(Box source) {
|
||||
|
||||
if (source == null) {
|
||||
return null;
|
||||
}
|
||||
|
||||
BasicDBObject result = new BasicDBObject();
|
||||
result.put("first", PointToDbObjectConverter.INSTANCE.convert(source.getFirst()));
|
||||
result.put("second", PointToDbObjectConverter.INSTANCE.convert(source.getSecond()));
|
||||
return result;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Converts a {@link BasicDBList} into a {@link org.springframework.data.mongodb.core.geo.Box}.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @since 1.5
|
||||
*/
|
||||
@ReadingConverter
|
||||
public static enum DbObjectToBoxConverter implements Converter<DBObject, Box> {
|
||||
|
||||
INSTANCE;
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
@SuppressWarnings("deprecation")
|
||||
public Box convert(DBObject source) {
|
||||
|
||||
if (source == null) {
|
||||
return null;
|
||||
}
|
||||
|
||||
Point first = DbObjectToPointConverter.INSTANCE.convert((DBObject) source.get("first"));
|
||||
Point second = DbObjectToPointConverter.INSTANCE.convert((DBObject) source.get("second"));
|
||||
|
||||
return new org.springframework.data.mongodb.core.geo.Box(first, second);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Converts a {@link Circle} into a {@link BasicDBList}.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @since 1.5
|
||||
*/
|
||||
public static enum CircleToDbObjectConverter implements Converter<Circle, DBObject> {
|
||||
|
||||
INSTANCE;
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
public DBObject convert(Circle source) {
|
||||
|
||||
if (source == null) {
|
||||
return null;
|
||||
}
|
||||
|
||||
DBObject result = new BasicDBObject();
|
||||
result.put("center", PointToDbObjectConverter.INSTANCE.convert(source.getCenter()));
|
||||
result.put("radius", source.getRadius().getNormalizedValue());
|
||||
result.put("metric", source.getRadius().getMetric().toString());
|
||||
return result;
|
||||
}
|
||||
}
|
||||
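For reference, a sketch of the document shape the circle converter above produces, rebuilt with the plain driver API (the converter enums live in a package-private class):

```java
import org.springframework.data.geo.Circle;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

class GeoShapeSketch {

    // Produces e.g. { "center" : { "x" : 12.3, "y" : 4.5 }, "radius" : 2.0, "metric" : "KILOMETERS" }
    DBObject circleAsDbObject(Circle circle) {

        DBObject center = new BasicDBObject("x", circle.getCenter().getX()).append("y", circle.getCenter().getY());

        DBObject result = new BasicDBObject();
        result.put("center", center);
        result.put("radius", circle.getRadius().getNormalizedValue());
        result.put("metric", circle.getRadius().getMetric().toString());
        return result;
    }
}
```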
|
||||
/**
|
||||
* Converts a {@link DBObject} into a {@link org.springframework.data.mongodb.core.geo.Circle}.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @since 1.5
|
||||
*/
|
||||
@ReadingConverter
|
||||
public static enum DbObjectToCircleConverter implements Converter<DBObject, Circle> {
|
||||
|
||||
INSTANCE;
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
public Circle convert(DBObject source) {
|
||||
|
||||
if (source == null) {
|
||||
return null;
|
||||
}
|
||||
|
||||
DBObject center = (DBObject) source.get("center");
|
||||
Double radius = (Double) source.get("radius");
|
||||
|
||||
Distance distance = new Distance(radius);
|
||||
|
||||
if (source.containsField("metric")) {
|
||||
|
||||
String metricString = (String) source.get("metric");
|
||||
Assert.notNull(metricString, "Metric must not be null!");
|
||||
|
||||
distance = distance.in(Metrics.valueOf(metricString));
|
||||
}
|
||||
|
||||
Assert.notNull(center, "Center must not be null!");
|
||||
Assert.notNull(radius, "Radius must not be null!");
|
||||
|
||||
return new Circle(DbObjectToPointConverter.INSTANCE.convert(center), distance);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Converts a {@link Circle} into a {@link BasicDBList}.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @since 1.5
|
||||
*/
|
||||
@SuppressWarnings("deprecation")
|
||||
public static enum LegacyCircleToDbObjectConverter implements
|
||||
Converter<org.springframework.data.mongodb.core.geo.Circle, DBObject> {
|
||||
|
||||
INSTANCE;
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
public DBObject convert(org.springframework.data.mongodb.core.geo.Circle source) {
|
||||
|
||||
if (source == null) {
|
||||
return null;
|
||||
}
|
||||
|
||||
DBObject result = new BasicDBObject();
|
||||
result.put("center", PointToDbObjectConverter.INSTANCE.convert(source.getCenter()));
|
||||
result.put("radius", source.getRadius());
|
||||
return result;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Converts a {@link BasicDBList} into a {@link org.springframework.data.mongodb.core.geo.Circle}.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @since 1.5
|
||||
*/
|
||||
@ReadingConverter
|
||||
@SuppressWarnings("deprecation")
|
||||
public static enum DbObjectToLegacyCircleConverter implements
|
||||
Converter<DBObject, org.springframework.data.mongodb.core.geo.Circle> {
|
||||
|
||||
INSTANCE;
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
public org.springframework.data.mongodb.core.geo.Circle convert(DBObject source) {
|
||||
|
||||
if (source == null) {
|
||||
return null;
|
||||
}
|
||||
|
||||
DBObject centerSource = (DBObject) source.get("center");
|
||||
Double radius = (Double) source.get("radius");
|
||||
|
||||
Assert.notNull(centerSource, "Center must not be null!");
|
||||
Assert.notNull(radius, "Radius must not be null!");
|
||||
|
||||
Point center = DbObjectToPointConverter.INSTANCE.convert(centerSource);
|
||||
return new org.springframework.data.mongodb.core.geo.Circle(center, radius);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Converts a {@link Sphere} into a {@link BasicDBList}.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @since 1.5
|
||||
*/
|
||||
public static enum SphereToDbObjectConverter implements Converter<Sphere, DBObject> {
|
||||
|
||||
INSTANCE;
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
public DBObject convert(Sphere source) {
|
||||
|
||||
if (source == null) {
|
||||
return null;
|
||||
}
|
||||
|
||||
DBObject result = new BasicDBObject();
|
||||
result.put("center", PointToDbObjectConverter.INSTANCE.convert(source.getCenter()));
|
||||
result.put("radius", source.getRadius().getNormalizedValue());
|
||||
result.put("metric", source.getRadius().getMetric().toString());
|
||||
return result;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Converts a {@link BasicDBList} into a {@link Sphere}.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @since 1.5
|
||||
*/
|
||||
@ReadingConverter
|
||||
public static enum DbObjectToSphereConverter implements Converter<DBObject, Sphere> {
|
||||
|
||||
INSTANCE;
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
public Sphere convert(DBObject source) {
|
||||
|
||||
if (source == null) {
|
||||
return null;
|
||||
}
|
||||
|
||||
DBObject center = (DBObject) source.get("center");
|
||||
Double radius = (Double) source.get("radius");
|
||||
|
||||
Distance distance = new Distance(radius);
|
||||
|
||||
if (source.containsField("metric")) {
|
||||
|
||||
String metricString = (String) source.get("metric");
|
||||
Assert.notNull(metricString, "Metric must not be null!");
|
||||
|
||||
distance = distance.in(Metrics.valueOf(metricString));
|
||||
}
|
||||
|
||||
Assert.notNull(center, "Center must not be null!");
|
||||
Assert.notNull(radius, "Radius must not be null!");
|
||||
|
||||
return new Sphere(DbObjectToPointConverter.INSTANCE.convert(center), distance);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Converts a {@link Polygon} into a {@link BasicDBList}.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @since 1.5
|
||||
*/
|
||||
public static enum PolygonToDbObjectConverter implements Converter<Polygon, DBObject> {
|
||||
|
||||
INSTANCE;
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
public DBObject convert(Polygon source) {
|
||||
|
||||
if (source == null) {
|
||||
return null;
|
||||
}
|
||||
|
||||
List<Point> points = source.getPoints();
|
||||
List<DBObject> pointTuples = new ArrayList<DBObject>(points.size());
|
||||
|
||||
for (Point point : points) {
|
||||
pointTuples.add(PointToDbObjectConverter.INSTANCE.convert(point));
|
||||
}
|
||||
|
||||
DBObject result = new BasicDBObject();
|
||||
result.put("points", pointTuples);
|
||||
return result;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Converts a {@link BasicDBList} into a {@link org.springframework.data.mongodb.core.geo.Polygon}.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @since 1.5
|
||||
*/
|
||||
@ReadingConverter
|
||||
public static enum DbObjectToPolygonConverter implements Converter<DBObject, Polygon> {
|
||||
|
||||
INSTANCE;
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
@SuppressWarnings({ "deprecation", "unchecked" })
|
||||
public Polygon convert(DBObject source) {
|
||||
|
||||
if (source == null) {
|
||||
return null;
|
||||
}
|
||||
|
||||
List<DBObject> points = (List<DBObject>) source.get("points");
|
||||
List<Point> newPoints = new ArrayList<Point>(points.size());
|
||||
|
||||
for (DBObject element : points) {
|
||||
|
||||
Assert.notNull(element, "Point elements of polygon must not be null!");
|
||||
newPoints.add(DbObjectToPointConverter.INSTANCE.convert(element));
|
||||
}
|
||||
|
||||
return new org.springframework.data.mongodb.core.geo.Polygon(newPoints);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Converts a {@link Sphere} into a {@link BasicDBList}.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @since 1.5
|
||||
*/
|
||||
public static enum GeoCommandToDbObjectConverter implements Converter<GeoCommand, DBObject> {
|
||||
|
||||
INSTANCE;
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
@SuppressWarnings("deprecation")
|
||||
public DBObject convert(GeoCommand source) {
|
||||
|
||||
if (source == null) {
|
||||
return null;
|
||||
}
|
||||
|
||||
BasicDBList argument = new BasicDBList();
|
||||
|
||||
Shape shape = source.getShape();
|
||||
|
||||
if (shape instanceof Box) {
|
||||
|
||||
argument.add(toList(((Box) shape).getFirst()));
|
||||
argument.add(toList(((Box) shape).getSecond()));
|
||||
|
||||
} else if (shape instanceof Circle) {
|
||||
|
||||
argument.add(toList(((Circle) shape).getCenter()));
|
||||
argument.add(((Circle) shape).getRadius().getNormalizedValue());
|
||||
|
||||
} else if (shape instanceof org.springframework.data.mongodb.core.geo.Circle) {
|
||||
|
||||
argument.add(toList(((org.springframework.data.mongodb.core.geo.Circle) shape).getCenter()));
|
||||
argument.add(((org.springframework.data.mongodb.core.geo.Circle) shape).getRadius());
|
||||
|
||||
} else if (shape instanceof Polygon) {
|
||||
|
||||
for (Point point : ((Polygon) shape).getPoints()) {
|
||||
argument.add(toList(point));
|
||||
}
|
||||
|
||||
} else if (shape instanceof Sphere) {
|
||||
|
||||
argument.add(toList(((Sphere) shape).getCenter()));
|
||||
argument.add(((Sphere) shape).getRadius().getNormalizedValue());
|
||||
}
|
||||
|
||||
return new BasicDBObject(source.getCommand(), argument);
|
||||
}
|
||||
}
|
||||
|
||||
static List<Double> toList(Point point) {
|
||||
return Arrays.asList(point.getX(), point.getY());
|
||||
}
|
||||
}
|
||||
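The writing converters above flatten each shape into a plain DBObject rather than a BasicDBList. A minimal sketch of the resulting document layout, not part of the changeset, using only MongoDB driver types and illustrative coordinate, radius and metric values:

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

class GeoShapeDocumentSketch {

	public static void main(String[] args) {

		// Layout written by BoxToDbObjectConverter: two points keyed "first" and "second".
		DBObject lowerLeft = new BasicDBObject("x", 0.0d).append("y", 0.0d);
		DBObject upperRight = new BasicDBObject("x", 2.0d).append("y", 3.0d);
		DBObject box = new BasicDBObject("first", lowerLeft).append("second", upperRight);

		// Layout written by CircleToDbObjectConverter: center, normalized radius and metric name.
		DBObject center = new BasicDBObject("x", 1.0d).append("y", 1.0d);
		DBObject circle = new BasicDBObject("center", center).append("radius", 0.5d).append("metric", "KILOMETERS");

		System.out.println(box);    // { "first" : { "x" : 0.0, "y" : 0.0 }, "second" : { "x" : 2.0, "y" : 3.0 } }
		System.out.println(circle); // { "center" : { "x" : 1.0, "y" : 1.0 }, "radius" : 0.5, "metric" : "KILOMETERS" }
	}
}

The metric is stored under its enum name so that the reading converters can restore it via Metrics.valueOf(...).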
@@ -1,45 +0,0 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;

import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver.LazyLoadingInterceptor;

import com.mongodb.DBRef;

/**
* Allows direct interaction with the underlying {@link LazyLoadingInterceptor}.
*
* @author Thomas Darimont
* @since 1.5
*/
public interface LazyLoadingProxy {

/**
* Initializes the proxy and returns the wrapped value.
*
* @return
* @since 1.5
*/
Object initialize();

/**
* Returns the {@link DBRef} represented by this {@link LazyLoadingProxy}, may be null.
*
* @return
* @since 1.5
*/
DBRef toDBRef();
}
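The interface lets callers reach the underlying reference without forcing a fetch. A small usage sketch, not part of the changeset; the helper names are made up for illustration and only assume the interface shown above:

import org.springframework.data.mongodb.core.convert.LazyLoadingProxy;

import com.mongodb.DBRef;

class LazyLoadingProxySketch {

	// Hypothetical helper: reads the DBRef of a lazily resolved @DBRef property without initializing it.
	static DBRef dbRefOf(Object property) {
		return property instanceof LazyLoadingProxy ? ((LazyLoadingProxy) property).toDBRef() : null;
	}

	// Hypothetical helper: forces resolution, returning non-proxied values unchanged.
	static Object unwrap(Object property) {
		return property instanceof LazyLoadingProxy ? ((LazyLoadingProxy) property).initialize() : property;
	}
}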
@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 by the original author(s).
* Copyright 2011-2013 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -29,10 +29,10 @@ import org.slf4j.LoggerFactory;
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.core.CollectionFactory;
import org.springframework.core.convert.ConversionException;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.data.convert.CollectionFactory;
import org.springframework.data.convert.EntityInstantiator;
import org.springframework.data.convert.TypeMapper;
import org.springframework.data.mapping.Association;
@@ -57,9 +57,11 @@ import org.springframework.data.util.TypeInformation;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
import org.springframework.util.StringUtils;

import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBObject;
import com.mongodb.DBRef;

@@ -69,40 +71,38 @@ import com.mongodb.DBRef;
*
* @author Oliver Gierke
* @author Jon Brisbin
* @author Patrik Wasik
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware {

protected static final Logger LOGGER = LoggerFactory.getLogger(MappingMongoConverter.class);
protected static final Logger log = LoggerFactory.getLogger(MappingMongoConverter.class);

protected final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
protected final SpelExpressionParser spelExpressionParser = new SpelExpressionParser();
protected final MongoDbFactory mongoDbFactory;
protected final QueryMapper idMapper;
protected final DbRefResolver dbRefResolver;
protected ApplicationContext applicationContext;
protected boolean useFieldAccessOnly = true;
protected MongoTypeMapper typeMapper;
protected String mapKeyDotReplacement = null;

private SpELContext spELContext;

/**
* Creates a new {@link MappingMongoConverter} given the new {@link DbRefResolver} and {@link MappingContext}.
* Creates a new {@link MappingMongoConverter} given the new {@link MongoDbFactory} and {@link MappingContext}.
*
* @param mongoDbFactory must not be {@literal null}.
* @param mappingContext must not be {@literal null}.
*/
@SuppressWarnings("deprecation")
public MappingMongoConverter(DbRefResolver dbRefResolver,
public MappingMongoConverter(MongoDbFactory mongoDbFactory,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {

super(ConversionServiceFactory.createDefaultConversionService());

Assert.notNull(dbRefResolver, "DbRefResolver must not be null!");
Assert.notNull(mappingContext, "MappingContext must not be null!");
Assert.notNull(mongoDbFactory);
Assert.notNull(mappingContext);

this.dbRefResolver = dbRefResolver;
this.mongoDbFactory = mongoDbFactory;
this.mappingContext = mappingContext;
this.typeMapper = new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, mappingContext);
this.idMapper = new QueryMapper(this);
@@ -110,19 +110,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
this.spELContext = new SpELContext(DBObjectPropertyAccessor.INSTANCE);
}

/**
* Creates a new {@link MappingMongoConverter} given the new {@link MongoDbFactory} and {@link MappingContext}.
*
* @deprecated use the constructor taking a {@link DbRefResolver} instead.
* @param mongoDbFactory must not be {@literal null}.
* @param mappingContext must not be {@literal null}.
*/
@Deprecated
public MappingMongoConverter(MongoDbFactory mongoDbFactory,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
this(new DefaultDbRefResolver(mongoDbFactory), mappingContext);
}

/**
* Configures the {@link MongoTypeMapper} to be used to add type information to {@link DBObject}s created by the
* converter and how to lookup type information from {@link DBObject}s when reading them. Uses a
@@ -136,15 +123,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
mappingContext) : typeMapper;
}

/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.MongoConverter#getTypeMapper()
*/
@Override
public MongoTypeMapper getTypeMapper() {
return this.typeMapper;
}

/**
* Configure the characters dots potentially contained in a {@link Map} shall be replaced with. By default we don't do
* any translation but rather reject a {@link Map} with keys containing dots causing the conversion for the entire
@@ -165,6 +143,17 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return mappingContext;
}

/**
* Configures whether to use field access only for entity mapping. Setting this to true will force the
* {@link MongoConverter} to not go through getters or setters even if they are present for getting and setting
* property values.
*
* @param useFieldAccessOnly
*/
public void setUseFieldAccessOnly(boolean useFieldAccessOnly) {
this.useFieldAccessOnly = useFieldAccessOnly;
}

/*
* (non-Javadoc)
* @see org.springframework.context.ApplicationContextAware#setApplicationContext(org.springframework.context.ApplicationContext)
@@ -234,7 +223,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
parent);
}

private <S extends Object> S read(final MongoPersistentEntity<S> entity, final DBObject dbo, final Object parent) {
private <S extends Object> S read(final MongoPersistentEntity<S> entity, final DBObject dbo, Object parent) {

final DefaultSpELExpressionEvaluator evaluator = new DefaultSpELExpressionEvaluator(dbo, spELContext);

@@ -242,7 +231,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
EntityInstantiator instantiator = instantiators.getInstantiatorFor(entity);
S instance = instantiator.createInstance(entity, provider);

final BeanWrapper<S> wrapper = BeanWrapper.create(instance, conversionService);
final BeanWrapper<MongoPersistentEntity<S>, S> wrapper = BeanWrapper.create(instance, conversionService);
final S result = wrapper.getBean();

// Set properties not already set in the constructor
@@ -254,27 +243,18 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}

Object obj = getValueInternal(prop, dbo, evaluator, result);
wrapper.setProperty(prop, obj);
wrapper.setProperty(prop, obj, useFieldAccessOnly);
}
});

// Handle associations
entity.doWithAssociations(new AssociationHandler<MongoPersistentProperty>() {
public void doWithAssociation(Association<MongoPersistentProperty> association) {
MongoPersistentProperty inverseProp = association.getInverse();
Object obj = getValueInternal(inverseProp, dbo, evaluator, result);

MongoPersistentProperty property = association.getInverse();
wrapper.setProperty(inverseProp, obj);

Object value = dbo.get(property.getName());
DBRef dbref = value instanceof DBRef ? (DBRef) value : null;
Object obj = dbRefResolver.resolveDbRef(property, dbref, new DbRefResolverCallback() {

@Override
public Object resolve(MongoPersistentProperty property) {
return getValueInternal(property, dbo, evaluator, parent);
}
});

wrapper.setProperty(property, obj);
}
});

@@ -294,12 +274,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Assert.isTrue(annotation != null, "The referenced property has to be mapped with @DBRef!");
}

// @see DATAMONGO-913
if (object instanceof LazyLoadingProxy) {
return ((LazyLoadingProxy) object).toDBRef();
}

return createDBRef(object, referingProperty);
return createDBRef(object, annotation);
}

/**
@@ -370,33 +345,37 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
throw new MappingException("No mapping metadata found for entity of type " + obj.getClass().getName());
}

final BeanWrapper<Object> wrapper = BeanWrapper.create(obj, conversionService);
final BeanWrapper<MongoPersistentEntity<Object>, Object> wrapper = BeanWrapper.create(obj, conversionService);
final MongoPersistentProperty idProperty = entity.getIdProperty();

if (!dbo.containsField("_id") && null != idProperty) {

boolean fieldAccessOnly = idProperty.usePropertyAccess() ? false : useFieldAccessOnly;

try {
Object id = wrapper.getProperty(idProperty, Object.class);
Object id = wrapper.getProperty(idProperty, Object.class, fieldAccessOnly);
dbo.put("_id", idMapper.convertId(id));
} catch (ConversionException ignored) {}
} catch (ConversionException ignored) {
}
}

// Write the properties
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
public void doWithPersistentProperty(MongoPersistentProperty prop) {

if (prop.equals(idProperty) || !prop.isWritable()) {
if (prop.equals(idProperty)) {
return;
}

Object propertyObj = wrapper.getProperty(prop);
boolean fieldAccessOnly = prop.usePropertyAccess() ? false : useFieldAccessOnly;

Object propertyObj = wrapper.getProperty(prop, prop.getType(), fieldAccessOnly);

if (null != propertyObj) {

if (!conversions.isSimpleType(propertyObj.getClass())) {
writePropertyInternal(propertyObj, dbo, prop);
} else {
writeSimpleInternal(propertyObj, dbo, prop);
writeSimpleInternal(propertyObj, dbo, prop.getFieldName());
}
}
}
@@ -406,7 +385,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
public void doWithAssociation(Association<MongoPersistentProperty> association) {
MongoPersistentProperty inverseProp = association.getInverse();
Class<?> type = inverseProp.getType();
Object propertyObj = wrapper.getProperty(inverseProp, type);
Object propertyObj = wrapper.getProperty(inverseProp, type, useFieldAccessOnly);
if (null != propertyObj) {
writePropertyInternal(propertyObj, dbo, inverseProp);
}
@@ -421,68 +400,47 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return;
}

DBObjectAccessor accessor = new DBObjectAccessor(dbo);

String name = prop.getFieldName();
TypeInformation<?> valueType = ClassTypeInformation.from(obj.getClass());
TypeInformation<?> type = prop.getTypeInformation();

if (valueType.isCollectionLike()) {
DBObject collectionInternal = createCollection(asCollection(obj), prop);
accessor.put(prop, collectionInternal);
dbo.put(name, collectionInternal);
return;
}

if (valueType.isMap()) {
DBObject mapDbObj = createMap((Map<Object, Object>) obj, prop);
accessor.put(prop, mapDbObj);
BasicDBObject mapDbObj = new BasicDBObject();
writeMapInternal((Map<Object, Object>) obj, mapDbObj, type);
dbo.put(name, mapDbObj);
return;
}

if (prop.isDbReference()) {

DBRef dbRefObj = null;

/*
* If we already have a LazyLoadingProxy, we use its cached DBRef value instead of
* unnecessarily initializing it only to convert it to a DBRef a few instructions later.
*/
if (obj instanceof LazyLoadingProxy) {
dbRefObj = ((LazyLoadingProxy) obj).toDBRef();
}

dbRefObj = dbRefObj != null ? dbRefObj : createDBRef(obj, prop);

DBRef dbRefObj = createDBRef(obj, prop.getDBRef());
if (null != dbRefObj) {
accessor.put(prop, dbRefObj);
dbo.put(name, dbRefObj);
return;
}
}

/*
* If we have a LazyLoadingProxy we make sure it is initialized first.
*/
if (obj instanceof LazyLoadingProxy) {
obj = ((LazyLoadingProxy) obj).initialize();
}

// Lookup potential custom target type
Class<?> basicTargetType = conversions.getCustomWriteTarget(obj.getClass(), null);

if (basicTargetType != null) {
accessor.put(prop, conversionService.convert(obj, basicTargetType));
dbo.put(name, conversionService.convert(obj, basicTargetType));
return;
}

Object existingValue = accessor.get(prop);
BasicDBObject propDbObj = existingValue instanceof BasicDBObject ? (BasicDBObject) existingValue
: new BasicDBObject();
BasicDBObject propDbObj = new BasicDBObject();
addCustomTypeKeyIfNecessary(type, obj, propDbObj);

MongoPersistentEntity<?> entity = isSubtype(prop.getType(), obj.getClass()) ? mappingContext
.getPersistentEntity(obj.getClass()) : mappingContext.getPersistentEntity(type);

writeInternal(obj, propDbObj, entity);
accessor.put(prop, propDbObj);
dbo.put(name, propDbObj);
}

private boolean isSubtype(Class<?> left, Class<?> right) {
@@ -527,49 +485,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
continue;
}

DBRef dbRef = createDBRef(element, property);
DBRef dbRef = createDBRef(element, property.getDBRef());
dbList.add(dbRef);
}

return dbList;
}

/**
* Writes the given {@link Map} using the given {@link MongoPersistentProperty} information.
*
* @param map must not {@literal null}.
* @param property must not be {@literal null}.
* @return
*/
protected DBObject createMap(Map<Object, Object> map, MongoPersistentProperty property) {

Assert.notNull(map, "Given map must not be null!");
Assert.notNull(property, "PersistentProperty must not be null!");

if (!property.isDbReference()) {
return writeMapInternal(map, new BasicDBObject(), property.getTypeInformation());
}

BasicDBObject dbObject = new BasicDBObject();

for (Map.Entry<Object, Object> entry : map.entrySet()) {

Object key = entry.getKey();
Object value = entry.getValue();

if (conversions.isSimpleType(key.getClass())) {

String simpleKey = potentiallyEscapeMapKey(key.toString());
dbObject.put(simpleKey, value != null ? createDBRef(value, property) : null);

} else {
throw new MappingException("Cannot use a complex object as a key value.");
}
}

return dbObject;
}

/**
* Populates the given {@link BasicDBList} with values from the given {@link Collection}.
*
@@ -680,7 +602,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*/
protected void addCustomTypeKeyIfNecessary(TypeInformation<?> type, Object value, DBObject dbObject) {

TypeInformation<?> actualType = type != null ? type.getActualType() : null;
TypeInformation<?> actualType = type != null ? type.getActualType() : type;
Class<?> reference = actualType == null ? Object.class : actualType.getType();

boolean notTheSameClass = !value.getClass().equals(reference);
@@ -700,11 +622,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
dbObject.put(key, getPotentiallyConvertedSimpleWrite(value));
}

private void writeSimpleInternal(Object value, DBObject dbObject, MongoPersistentProperty property) {
DBObjectAccessor accessor = new DBObjectAccessor(dbObject);
accessor.put(property, getPotentiallyConvertedSimpleWrite(value));
}

/**
* Checks whether we have a custom conversion registered for the given value into an arbitrary simple Mongo type.
* Returns the converted value if so. If not, we perform special enum handling or simply return the value as is.
@@ -753,7 +670,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return target.isAssignableFrom(value.getClass()) ? value : conversionService.convert(value, target);
}

protected DBRef createDBRef(Object target, MongoPersistentProperty property) {
protected DBRef createDBRef(Object target, org.springframework.data.mongodb.core.mapping.DBRef dbref) {

Assert.notNull(target);

@@ -762,7 +679,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}

MongoPersistentEntity<?> targetEntity = mappingContext.getPersistentEntity(target.getClass());
targetEntity = targetEntity == null ? targetEntity = mappingContext.getPersistentEntity(property) : targetEntity;

if (null == targetEntity) {
throw new MappingException("No mapping metadata found for " + target.getClass());
@@ -774,21 +690,17 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
throw new MappingException("No id property found on class " + targetEntity.getType());
}

Object id = null;

if (target.getClass().equals(idProperty.getType())) {
id = target;
} else {
BeanWrapper<Object> wrapper = BeanWrapper.create(target, conversionService);
id = wrapper.getProperty(idProperty, Object.class);
}
BeanWrapper<MongoPersistentEntity<Object>, Object> wrapper = BeanWrapper.create(target, conversionService);
Object id = wrapper.getProperty(idProperty, Object.class, useFieldAccessOnly);

if (null == id) {
throw new MappingException("Cannot create a reference to an object with a NULL id.");
}

return dbRefResolver.createDbRef(property == null ? null : property.getDBRef(), targetEntity,
idMapper.convertId(id));
DB db = mongoDbFactory.getDb();
db = dbref != null && StringUtils.hasText(dbref.db()) ? mongoDbFactory.getDb(dbref.db()) : db;

return new DBRef(db, targetEntity.getCollection(), idMapper.convertId(id));
}

protected Object getValueInternal(MongoPersistentProperty prop, DBObject dbo, SpELExpressionEvaluator eval,
@@ -805,6 +717,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param sourceValue must not be {@literal null}.
* @return the converted {@link Collection} or array, will never be {@literal null}.
*/
@SuppressWarnings("unchecked")
private Object readCollectionOrArray(TypeInformation<?> targetType, BasicDBList sourceValue, Object parent) {

Assert.notNull(targetType);
@@ -815,19 +728,19 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return getPotentiallyConvertedSimpleRead(new HashSet<Object>(), collectionType);
}

collectionType = Collection.class.isAssignableFrom(collectionType) ? collectionType : List.class;

Collection<Object> items = targetType.getType().isArray() ? new ArrayList<Object>() : CollectionFactory
.createCollection(collectionType, sourceValue.size());
TypeInformation<?> componentType = targetType.getComponentType();
Class<?> rawComponentType = componentType == null ? null : componentType.getType();

collectionType = Collection.class.isAssignableFrom(collectionType) ? collectionType : List.class;
Collection<Object> items = targetType.getType().isArray() ? new ArrayList<Object>() : CollectionFactory
.createCollection(collectionType, rawComponentType, sourceValue.size());

for (int i = 0; i < sourceValue.size(); i++) {

Object dbObjItem = sourceValue.get(i);

if (dbObjItem instanceof DBRef) {
items.add(DBRef.class.equals(rawComponentType) ? dbObjItem : read(componentType, readRef((DBRef) dbObjItem),
items.add(DBRef.class.equals(rawComponentType) ? dbObjItem : read(componentType, ((DBRef) dbObjItem).fetch(),
parent));
} else if (dbObjItem instanceof DBObject) {
items.add(read(componentType, (DBObject) dbObjItem, parent));
@@ -852,14 +765,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Assert.notNull(dbObject);

Class<?> mapType = typeMapper.readType(dbObject, type).getType();

TypeInformation<?> keyType = type.getComponentType();
Class<?> rawKeyType = keyType == null ? null : keyType.getType();

TypeInformation<?> valueType = type.getMapValueType();
Class<?> rawValueType = valueType == null ? null : valueType.getType();

Map<Object, Object> map = CollectionFactory.createMap(mapType, rawKeyType, dbObject.keySet().size());
Map<Object, Object> map = CollectionFactory.createMap(mapType, dbObject.keySet().size());
Map<String, Object> sourceMap = dbObject.toMap();

for (Entry<String, Object> entry : sourceMap.entrySet()) {
@@ -869,16 +775,20 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App

Object key = potentiallyUnescapeMapKey(entry.getKey());

if (rawKeyType != null) {
key = conversionService.convert(key, rawKeyType);
TypeInformation<?> keyTypeInformation = type.getComponentType();
if (keyTypeInformation != null) {
Class<?> keyType = keyTypeInformation.getType();
key = conversionService.convert(key, keyType);
}

Object value = entry.getValue();
TypeInformation<?> valueType = type.getMapValueType();
Class<?> rawValueType = valueType == null ? null : valueType.getType();

if (value instanceof DBObject) {
map.put(key, read(valueType, (DBObject) value, parent));
} else if (value instanceof DBRef) {
map.put(key, DBRef.class.equals(rawValueType) ? value : read(valueType, readRef((DBRef) value)));
map.put(key, DBRef.class.equals(rawValueType) ? value : read(valueType, ((DBRef) value).fetch()));
} else {
Class<?> valueClass = valueType == null ? null : valueType.getType();
map.put(key, getPotentiallyConvertedSimpleRead(value, valueClass));
@@ -924,17 +834,15 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return getPotentiallyConvertedSimpleWrite(obj);
}

TypeInformation<?> typeHint = typeInformation == null ? ClassTypeInformation.OBJECT : typeInformation;

if (obj instanceof BasicDBList) {
return maybeConvertList((BasicDBList) obj, typeHint);
return maybeConvertList((BasicDBList) obj);
}

if (obj instanceof DBObject) {
DBObject newValueDbo = new BasicDBObject();
for (String vk : ((DBObject) obj).keySet()) {
Object o = ((DBObject) obj).get(vk);
newValueDbo.put(vk, convertToMongoType(o, typeHint));
newValueDbo.put(vk, convertToMongoType(o));
}
return newValueDbo;
}
@@ -942,17 +850,17 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (obj instanceof Map) {
DBObject result = new BasicDBObject();
for (Map.Entry<Object, Object> entry : ((Map<Object, Object>) obj).entrySet()) {
result.put(entry.getKey().toString(), convertToMongoType(entry.getValue(), typeHint));
result.put(entry.getKey().toString(), convertToMongoType(entry.getValue()));
}
return result;
}

if (obj.getClass().isArray()) {
return maybeConvertList(Arrays.asList((Object[]) obj), typeHint);
return maybeConvertList(Arrays.asList((Object[]) obj));
}

if (obj instanceof Collection) {
return maybeConvertList((Collection<?>) obj, typeHint);
return maybeConvertList((Collection<?>) obj);
}

DBObject newDbo = new BasicDBObject();
@@ -965,13 +873,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return !obj.getClass().equals(typeInformation.getType()) ? newDbo : removeTypeInfoRecursively(newDbo);
}

public BasicDBList maybeConvertList(Iterable<?> source, TypeInformation<?> typeInformation) {

public BasicDBList maybeConvertList(Iterable<?> source) {
BasicDBList newDbl = new BasicDBList();
for (Object element : source) {
newDbl.add(convertToMongoType(element, typeInformation));
newDbl.add(convertToMongoType(element));
}

return newDbl;
}

@@ -1014,7 +920,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App

private class MongoDbPropertyValueProvider implements PropertyValueProvider<MongoPersistentProperty> {

private final DBObjectAccessor source;
private final DBObject source;
private final SpELExpressionEvaluator evaluator;
private final Object parent;

@@ -1027,7 +933,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Assert.notNull(source);
Assert.notNull(evaluator);

this.source = new DBObjectAccessor(source);
this.source = source;
this.evaluator = evaluator;
this.parent = parent;
}
@@ -1039,7 +945,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
public <T> T getPropertyValue(MongoPersistentProperty property) {

String expression = property.getSpelExpression();
Object value = expression != null ? evaluator.evaluate(expression) : source.get(property);
Object value = expression != null ? evaluator.evaluate(expression) : source.get(property.getFieldName());

if (value == null) {
return null;
@@ -1092,7 +998,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (conversions.hasCustomReadTarget(value.getClass(), rawType)) {
return (T) conversionService.convert(value, rawType);
} else if (value instanceof DBRef) {
return (T) (rawType.equals(DBRef.class) ? value : read(type, readRef((DBRef) value), parent));
return (T) (rawType.equals(DBRef.class) ? value : read(type, ((DBRef) value).fetch(), parent));
} else if (value instanceof BasicDBList) {
return (T) readCollectionOrArray(type, (BasicDBList) value, parent);
} else if (value instanceof DBObject) {
@@ -1101,14 +1007,4 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return (T) getPotentiallyConvertedSimpleRead(value, rawType);
}
}

/**
* Performs the fetch operation for the given {@link DBRef}.
*
* @param ref
* @return
*/
DBObject readRef(DBRef ref) {
return ref.fetch();
}
}

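The hunks above replace the MongoDbFactory-based constructor with a DbRefResolver-based one and keep the old signature only as a deprecated delegate. A hedged wiring sketch under that assumption, with the MongoDbFactory supplied from elsewhere:

import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;

class ConverterSetupSketch {

	MappingMongoConverter mappingMongoConverter(MongoDbFactory mongoDbFactory) throws Exception {

		MongoMappingContext mappingContext = new MongoMappingContext();

		// Preferred form after this change; the deprecated MongoDbFactory constructor delegates here.
		DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory);
		MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mappingContext);

		converter.afterPropertiesSet();
		return converter;
	}
}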
@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,7 +17,6 @@ package org.springframework.data.mongodb.core.convert;

import org.springframework.data.convert.EntityConverter;
import org.springframework.data.convert.EntityReader;
import org.springframework.data.convert.TypeMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;

@@ -27,17 +26,9 @@ import com.mongodb.DBObject;
* Central Mongo specific converter interface which combines {@link MongoWriter} and {@link MongoReader}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public interface MongoConverter extends
EntityConverter<MongoPersistentEntity<?>, MongoPersistentProperty, Object, DBObject>, MongoWriter<Object>,
EntityReader<Object, DBObject> {

/**
* Returns the {@link TypeMapper} being used to write type information into {@link DBObject}s created with that
* converter.
*
* @return will never be {@literal null}.
*/
MongoTypeMapper getTypeMapper();
}

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,23 +24,21 @@ import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionFailedException;
import org.springframework.core.convert.TypeDescriptor;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.util.StringUtils;

import com.mongodb.DBObject;

/**
* Wrapper class to contain useful converters for the usage with Mongo.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
abstract class MongoConverters {

/**
* Private constructor to prevent instantiation.
*/
private MongoConverters() {}
private MongoConverters() {

}

/**
* Simple singleton to convert {@link ObjectId}s to their {@link String} representation.
@@ -149,15 +147,4 @@ abstract class MongoConverters {
}
}
}

@ReadingConverter
public static enum DBObjectToStringConverter implements Converter<DBObject, String> {

INSTANCE;

@Override
public String convert(DBObject source) {
return source == null ? null : source.toString();
}
}
}

@@ -15,8 +15,6 @@
*/
package org.springframework.data.mongodb.core.convert;

import java.util.Set;

import org.springframework.data.convert.TypeMapper;

import com.mongodb.DBObject;
@@ -34,14 +32,4 @@ public interface MongoTypeMapper extends TypeMapper<DBObject> {
* @return
*/
boolean isTypeKey(String key);

/**
* Writes type restrictions to the given {@link DBObject}. This usually results in an {@code $in}-clause to be
* generated that restricts the type-key (e.g. {@code _class}) to be in the set of type aliases for the given
* {@code restrictedTypes}.
*
* @param result must not be {@literal null}
* @param restrictedTypes must not be {@literal null}
*/
void writeTypeRestrictions(DBObject result, Set<Class<?>> restrictedTypes);
}

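The new writeTypeRestrictions method is documented to emit an $in clause on the type key (by default "_class", per DefaultMongoTypeMapper.DEFAULT_TYPE_KEY used elsewhere in this changeset). A rough sketch of the shape of criteria it is expected to produce; using fully qualified class names as aliases is an assumption, since the real aliases come from the configured type mapper:

import java.util.ArrayList;
import java.util.List;
import java.util.Set;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

class TypeRestrictionSketch {

	// Approximates the criteria described in the Javadoc above; neither the "_class" key
	// nor the alias format is guaranteed, both depend on the MongoTypeMapper configuration.
	static DBObject restrictionFor(Set<Class<?>> restrictedTypes) {

		List<String> aliases = new ArrayList<String>();
		for (Class<?> type : restrictedTypes) {
			aliases.add(type.getName());
		}

		return new BasicDBObject("_class", new BasicDBObject("$in", aliases));
	}
}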
@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,27 +17,19 @@ package org.springframework.data.mongodb.core.convert;

import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.Iterator;
import java.util.List;
import java.util.Map.Entry;
import java.util.Set;

import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionException;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PropertyPath;
import org.springframework.data.mapping.PropertyReferenceException;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.PersistentPropertyPath;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty.PropertyToFieldNameConverter;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.util.Assert;

import com.mongodb.BasicDBList;
@@ -51,12 +43,11 @@ import com.mongodb.DBRef;
* @author Jon Brisbin
* @author Oliver Gierke
* @author Patryk Wasik
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class QueryMapper {

private static final List<String> DEFAULT_ID_NAMES = Arrays.asList("id", "_id");
private static final String N_OR_PATTERN = "\\$.*or";

private final ConversionService conversionService;
private final MongoConverter converter;
@@ -84,10 +75,9 @@ public class QueryMapper {
* @param entity can be {@literal null}.
* @return
*/
@SuppressWarnings("deprecation")
public DBObject getMappedObject(DBObject query, MongoPersistentEntity<?> entity) {

if (isNestedKeyword(query)) {
if (Keyword.isKeyword(query)) {
return getMappedKeyword(new Keyword(query), entity);
}

@@ -95,87 +85,51 @@ public class QueryMapper {

for (String key : query.keySet()) {

// TODO: remove one once QueryMapper can work with Query instances directly
if (Query.isRestrictedTypeKey(key)) {

@SuppressWarnings("unchecked")
Set<Class<?>> restrictedTypes = (Set<Class<?>>) query.get(key);
this.converter.getTypeMapper().writeTypeRestrictions(result, restrictedTypes);

continue;
}

if (isKeyword(key)) {
if (Keyword.isKeyword(key)) {
result.putAll(getMappedKeyword(new Keyword(query, key), entity));
continue;
}

Field field = createPropertyField(entity, key, mappingContext);
Entry<String, Object> entry = getMappedObjectForField(field, query.get(key));
Field field = entity == null ? new Field(key) : new MetadataBackedField(key, entity, mappingContext);

result.put(entry.getKey(), entry.getValue());
Object rawValue = query.get(key);
String newKey = field.getMappedKey();

if (Keyword.isKeyword(rawValue) && !field.isIdField()) {
Keyword keyword = new Keyword((DBObject) rawValue);
result.put(newKey, getMappedKeyword(field, keyword));
} else {
result.put(newKey, getMappedValue(field, query.get(key)));
}
}

return result;
}

/**
* Extracts the mapped object value for given field out of rawValue taking nested {@link Keyword}s into account
*
* @param field
* @param rawValue
* @return
*/
protected Entry<String, Object> getMappedObjectForField(Field field, Object rawValue) {

String key = field.getMappedKey();
Object value;

if (isNestedKeyword(rawValue) && !field.isIdField()) {
Keyword keyword = new Keyword((DBObject) rawValue);
value = getMappedKeyword(field, keyword);
} else {
value = getMappedValue(field, rawValue);
}

return createMapEntry(key, value);
}

/**
* @param entity
* @param key
* @param mappingContext
* @return
*/
protected Field createPropertyField(MongoPersistentEntity<?> entity, String key,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
return entity == null ? new Field(key) : new MetadataBackedField(key, entity, mappingContext);
}

/**
* Returns the given {@link DBObject} representing a keyword by mapping the keyword's value.
*
* @param keyword the {@link DBObject} representing a keyword (e.g. {@code $ne : … } )
* @param query the {@link DBObject} representing a keyword (e.g. {@code $ne : … } )
* @param entity
* @return
*/
protected DBObject getMappedKeyword(Keyword keyword, MongoPersistentEntity<?> entity) {
private DBObject getMappedKeyword(Keyword query, MongoPersistentEntity<?> entity) {

// $or/$nor
if (keyword.isOrOrNor() || keyword.hasIterableValue()) {
if (query.key.matches(N_OR_PATTERN) || query.value instanceof Iterable) {

Iterable<?> conditions = keyword.getValue();
Iterable<?> conditions = (Iterable<?>) query.value;
BasicDBList newConditions = new BasicDBList();

for (Object condition : conditions) {
newConditions.add(isDBObject(condition) ? getMappedObject((DBObject) condition, entity)
newConditions.add(condition instanceof DBObject ? getMappedObject((DBObject) condition, entity)
: convertSimpleOrDBObject(condition, entity));
}

return new BasicDBObject(keyword.getKey(), newConditions);
return new BasicDBObject(query.key, newConditions);
}

return new BasicDBObject(keyword.getKey(), convertSimpleOrDBObject(keyword.getValue(), entity));
return new BasicDBObject(query.key, convertSimpleOrDBObject(query.value, entity));
}

/**
@@ -185,15 +139,13 @@ public class QueryMapper {
* @param keyword
* @return
*/
protected DBObject getMappedKeyword(Field property, Keyword keyword) {
private DBObject getMappedKeyword(Field property, Keyword keyword) {

boolean needsAssociationConversion = property.isAssociation() && !keyword.isExists();
Object value = keyword.getValue();
Object value = needsAssociationConversion ? convertAssociation(keyword.value, property.getProperty())
: getMappedValue(property.with(keyword.key), keyword.value);

Object convertedValue = needsAssociationConversion ? convertAssociation(value, property) : getMappedValue(
property.with(keyword.getKey()), value);

return new BasicDBObject(keyword.key, convertedValue);
return new BasicDBObject(keyword.key, value);
}

/**
@@ -205,78 +157,43 @@ public class QueryMapper {
* @param newKey the key the value will be bound to eventually
* @return
*/
protected Object getMappedValue(Field documentField, Object value) {
private Object getMappedValue(Field documentField, Object value) {

if (documentField.isIdField()) {

if (isDBObject(value)) {
if (value instanceof DBObject) {
DBObject valueDbo = (DBObject) value;
DBObject resultDbo = new BasicDBObject(valueDbo.toMap());

if (valueDbo.containsField("$in") || valueDbo.containsField("$nin")) {
String inKey = valueDbo.containsField("$in") ? "$in" : "$nin";
List<Object> ids = new ArrayList<Object>();
for (Object id : (Iterable<?>) valueDbo.get(inKey)) {
ids.add(convertId(id));
}
resultDbo.put(inKey, ids.toArray(new Object[ids.size()]));
valueDbo.put(inKey, ids.toArray(new Object[ids.size()]));
} else if (valueDbo.containsField("$ne")) {
resultDbo.put("$ne", convertId(valueDbo.get("$ne")));
valueDbo.put("$ne", convertId(valueDbo.get("$ne")));
} else {
return getMappedObject(resultDbo, null);
return getMappedObject((DBObject) value, null);
}

return resultDbo;
return valueDbo;

} else {
return convertId(value);
}
}

if (isNestedKeyword(value)) {
if (Keyword.isKeyword(value)) {
return getMappedKeyword(new Keyword((DBObject) value), null);
}

if (isAssociationConversionNecessary(documentField, value)) {
return convertAssociation(value, documentField);
if (documentField.isAssociation()) {
return convertAssociation(value, documentField.getProperty());
}

return convertSimpleOrDBObject(value, documentField.getPropertyEntity());
}

/**
* Returns whether the given {@link Field} represents an association reference that together with the given value
* requires conversion to a {@link org.springframework.data.mongodb.core.mapping.DBRef} object. We check whether the
* type of the given value is compatible with the type of the given document field in order to deal with potential
* query field exclusions, since MongoDB uses the {@code int} {@literal 0} as an indicator for an excluded field.
*
* @param documentField must not be {@literal null}.
* @param value
* @return
*/
protected boolean isAssociationConversionNecessary(Field documentField, Object value) {

Assert.notNull(documentField, "Document field must not be null!");

if (value == null) {
return false;
}

if (!documentField.isAssociation()) {
return false;
}

Class<? extends Object> type = value.getClass();
MongoPersistentProperty property = documentField.getProperty();

if (property.getActualType().isAssignableFrom(type)) {
return true;
}

MongoPersistentEntity<?> entity = documentField.getPropertyEntity();
return entity.hasIdProperty() && entity.getIdProperty().getActualType().isAssignableFrom(type);
}

/**
* Retriggers mapping if the given source is a {@link DBObject} or simply invokes the
*
@@ -284,33 +201,17 @@ public class QueryMapper {
* @param entity
* @return
*/
protected Object convertSimpleOrDBObject(Object source, MongoPersistentEntity<?> entity) {
private Object convertSimpleOrDBObject(Object source, MongoPersistentEntity<?> entity) {

if (source instanceof BasicDBList) {
return delegateConvertToMongoType(source, entity);
return converter.convertToMongoType(source);
}

if (isDBObject(source)) {
if (source instanceof DBObject) {
return getMappedObject((DBObject) source, entity);
}

return delegateConvertToMongoType(source, entity);
}

/**
* Converts the given source Object to a mongo type with the type information of the original source type omitted.
* Subclasses may overwrite this method to retain the type information of the source type on the resulting mongo type.
*
* @param source
* @param entity
* @return the converted mongo type or null if source is null
*/
protected Object delegateConvertToMongoType(Object source, MongoPersistentEntity<?> entity) {
return converter.convertToMongoType(source, entity == null ? null : entity.getTypeInformation());
}

protected Object convertAssociation(Object source, Field field) {
return convertAssociation(source, field.getProperty());
return converter.convertToMongoType(source);
}

/**
@@ -320,16 +221,16 @@ public class QueryMapper {
* @param property
* @return
*/
protected Object convertAssociation(Object source, MongoPersistentProperty property) {
private Object convertAssociation(Object source, MongoPersistentProperty property) {

if (property == null || source == null || source instanceof DBRef || source instanceof DBObject) {
if (property == null || !property.isAssociation()) {
return source;
}

if (source instanceof Iterable) {
BasicDBList result = new BasicDBList();
for (Object element : (Iterable<?>) source) {
result.add(createDbRefFor(element, property));
result.add(element instanceof DBRef ? element : converter.toDBRef(element, property));
}
return result;
}
@@ -338,55 +239,13 @@ public class QueryMapper {
BasicDBObject result = new BasicDBObject();
DBObject dbObject = (DBObject) source;
for (String key : dbObject.keySet()) {
result.put(key, createDbRefFor(dbObject.get(key), property));
Object o = dbObject.get(key);
result.put(key, o instanceof DBRef ? o : converter.toDBRef(o, property));
}
return result;
}

return createDbRefFor(source, property);
}

/**
* Checks whether the given value is a {@link DBObject}.
*
* @param value can be {@literal null}.
* @return
*/
protected final boolean isDBObject(Object value) {
return value instanceof DBObject;
}

/**
* Creates a new {@link Entry} for the given {@link Field} with the given value.
*
* @param field must not be {@literal null}.
* @param value can be {@literal null}.
* @return
*/
protected final Entry<String, Object> createMapEntry(Field field, Object value) {
return createMapEntry(field.getMappedKey(), value);
}

/**
* Creates a new {@link Entry} with the given key and value.
*
* @param key must not be {@literal null} or empty.
* @param value can be {@literal null}
* @return
*/
private Entry<String, Object> createMapEntry(String key, Object value) {

Assert.hasText(key, "Key must not be null or empty!");
return Collections.singletonMap(key, value).entrySet().iterator().next();
}

private DBRef createDbRefFor(Object source, MongoPersistentProperty property) {

if (source instanceof DBRef) {
return (DBRef) source;
}

return converter.toDBRef(source, property);
return source == null || source instanceof DBRef ? source : converter.toDBRef(source, property);
}

/**
@@ -403,40 +262,7 @@ public class QueryMapper {
// Ignore
}

return delegateConvertToMongoType(id, null);
}

/**
* Returns whether the given {@link Object} is a keyword, i.e. if it's a {@link DBObject} with a keyword key.
*
* @param candidate
* @return
*/
protected boolean isNestedKeyword(Object candidate) {

if (!(candidate instanceof BasicDBObject)) {
return false;
}

BasicDBObject dbObject = (BasicDBObject) candidate;
Set<String> keys = dbObject.keySet();

if (keys.size() != 1) {
return false;
}

return isKeyword(keys.iterator().next().toString());
}

/**
* Returns whether the given {@link String} is a MongoDB keyword. The default implementation will check against the
* set of registered keywords returned by {@link #getKeywords()}.
*
* @param candidate
* @return
*/
protected boolean isKeyword(String candidate) {
return candidate.startsWith("$");
return converter.convertToMongoType(id);
}

/**
@@ -444,12 +270,10 @@
*
* @author Oliver Gierke
*/
static class Keyword {
private static class Keyword {

private static final String N_OR_PATTERN = "\\$.*or";

private final String key;
private final Object value;
String key;
Object value;

public Keyword(DBObject source, String key) {
this.key = key;
@@ -474,21 +298,25 @@
return "$exists".equalsIgnoreCase(key);
|
||||
}
|
||||
|
||||
public boolean isOrOrNor() {
|
||||
return key.matches(N_OR_PATTERN);
|
||||
/**
|
||||
* Returns whether the given value actually represents a keyword. If this returns {@literal true} it's safe to call
|
||||
* the constructor.
|
||||
*
|
||||
* @param value
|
||||
* @return
|
||||
*/
|
||||
public static boolean isKeyword(Object value) {
|
||||
|
||||
if (value instanceof String) {
|
||||
return ((String) value).startsWith("$");
|
||||
}
|
||||
|
||||
public boolean hasIterableValue() {
|
||||
return value instanceof Iterable;
|
||||
if (!(value instanceof DBObject)) {
|
||||
return false;
|
||||
}
|
||||
|
||||
public String getKey() {
|
||||
return key;
|
||||
}
|
||||
|
||||
@SuppressWarnings("unchecked")
|
||||
public <T> T getValue() {
|
||||
return (T) value;
|
||||
DBObject dbObject = (DBObject) value;
|
||||
return dbObject.keySet().size() == 1 && dbObject.keySet().iterator().next().startsWith("$");
|
||||
}
|
||||
}
|
||||
|
||||
@@ -497,14 +325,14 @@ public class QueryMapper {
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
protected static class Field {
|
||||
private static class Field {
|
||||
|
||||
private static final String ID_KEY = "_id";
|
||||
|
||||
protected final String name;
|
||||
|
||||
/**
|
||||
* Creates a new {@link DocumentField} without meta-information but the given name.
|
||||
* Creates a new {@link Field} without meta-information but the given name.
|
||||
*
|
||||
* @param name must not be {@literal null} or empty.
|
||||
*/
|
||||
@@ -515,7 +343,7 @@ public class QueryMapper {
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns a new {@link DocumentField} with the given name.
|
||||
* Returns a new {@link Field} with the given name.
|
||||
*
|
||||
* @param name must not be {@literal null} or empty.
|
||||
* @return
|
||||
@@ -534,9 +362,7 @@ public class QueryMapper {
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the underlying {@link MongoPersistentProperty} backing the field. For path traversals this will be the
|
||||
* property that represents the value to handle. This means it'll be the leaf property for plain paths or the
|
||||
* association property in case we refer to an association somewhere in the path.
|
||||
* Returns the underlying {@link MongoPersistentProperty} backing the field.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
@@ -570,36 +396,18 @@ public class QueryMapper {
|
||||
public String getMappedKey() {
|
||||
return isIdField() ? ID_KEY : name;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns whether the field references an association in case it refers to a nested field.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public boolean containsAssociation() {
|
||||
return false;
|
||||
}
|
||||
|
||||
public Association<MongoPersistentProperty> getAssociation() {
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Extension of {@link DocumentField} to be backed with mapping metadata.
|
||||
* Extension of {@link Field} to be backed with mapping metadata.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
protected static class MetadataBackedField extends Field {
|
||||
|
||||
private static final String INVALID_ASSOCIATION_REFERENCE = "Invalid path reference %s! Associations can only be pointed to directly or via their id property!";
|
||||
private static class MetadataBackedField extends Field {
|
||||
|
||||
private final MongoPersistentEntity<?> entity;
|
||||
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
|
||||
private final MongoPersistentProperty property;
|
||||
private final PersistentPropertyPath<MongoPersistentProperty> path;
|
||||
private final Association<MongoPersistentProperty> association;
|
||||
|
||||
/**
|
||||
* Creates a new {@link MetadataBackedField} with the given name, {@link MongoPersistentEntity} and
|
||||
@@ -611,21 +419,6 @@ public class QueryMapper {
|
||||
*/
|
||||
public MetadataBackedField(String name, MongoPersistentEntity<?> entity,
|
||||
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context) {
|
||||
this(name, entity, context, null);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link MetadataBackedField} with the given name, {@link MongoPersistentEntity} and
|
||||
* {@link MappingContext} with the given {@link MongoPersistentProperty}.
|
||||
*
|
||||
* @param name must not be {@literal null} or empty.
|
||||
* @param entity must not be {@literal null}.
|
||||
* @param context must not be {@literal null}.
|
||||
* @param property may be {@literal null}.
|
||||
*/
|
||||
public MetadataBackedField(String name, MongoPersistentEntity<?> entity,
|
||||
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context,
|
||||
MongoPersistentProperty property) {
|
||||
|
||||
super(name);
|
||||
|
||||
@@ -634,9 +427,8 @@ public class QueryMapper {
|
||||
this.entity = entity;
|
||||
this.mappingContext = context;
|
||||
|
||||
this.path = getPath(name);
|
||||
this.property = path == null ? property : path.getLeafProperty();
|
||||
this.association = findAssociation();
|
||||
PersistentPropertyPath<MongoPersistentProperty> path = getPath(name);
|
||||
this.property = path == null ? null : path.getLeafProperty();
|
||||
}
|
||||
|
||||
/*
|
||||
@@ -645,7 +437,7 @@ public class QueryMapper {
|
||||
*/
|
||||
@Override
|
||||
public MetadataBackedField with(String name) {
|
||||
return new MetadataBackedField(name, entity, mappingContext, property);
|
||||
return new MetadataBackedField(name, entity, mappingContext);
|
||||
}
|
||||
|
||||
/*
|
||||
@@ -670,7 +462,7 @@ public class QueryMapper {
|
||||
*/
|
||||
@Override
|
||||
public MongoPersistentProperty getProperty() {
|
||||
return association == null ? property : association.getInverse();
|
||||
return property;
|
||||
}
|
||||
|
||||
/*
|
||||
@@ -689,34 +481,9 @@ public class QueryMapper {
|
||||
*/
|
||||
@Override
|
||||
public boolean isAssociation() {
|
||||
return association != null;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.convert.QueryMapper.Field#getAssociation()
|
||||
*/
|
||||
@Override
|
||||
public Association<MongoPersistentProperty> getAssociation() {
|
||||
return association;
|
||||
}
|
||||
|
||||
/**
|
||||
* Finds the association property in the {@link PersistentPropertyPath}.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
private final Association<MongoPersistentProperty> findAssociation() {
|
||||
|
||||
if (this.path != null) {
|
||||
for (MongoPersistentProperty p : this.path) {
|
||||
if (p.isAssociation()) {
|
||||
return p.getAssociation();
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return null;
|
||||
MongoPersistentProperty property = getProperty();
|
||||
return property == null ? false : property.isAssociation();
|
||||
}
|
||||
|
||||
/*
|
||||
@@ -725,57 +492,19 @@ public class QueryMapper {
|
||||
*/
|
||||
@Override
|
||||
public String getMappedKey() {
|
||||
return path == null ? name : path.toDotPath(getPropertyConverter());
|
||||
|
||||
PersistentPropertyPath<MongoPersistentProperty> path = getPath(name);
|
||||
return path == null ? name : path.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE);
|
||||
}
|
||||
|
||||
protected PersistentPropertyPath<MongoPersistentProperty> getPath() {
|
||||
return path;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the {@link PersistentPropertyPath} for the given <code>pathExpression</code>.
|
||||
*
|
||||
* @param pathExpression
|
||||
* @return
|
||||
*/
|
||||
private PersistentPropertyPath<MongoPersistentProperty> getPath(String pathExpression) {
|
||||
private PersistentPropertyPath<MongoPersistentProperty> getPath(String name) {
|
||||
|
||||
try {
|
||||
|
||||
PropertyPath path = PropertyPath.from(pathExpression, entity.getTypeInformation());
|
||||
PersistentPropertyPath<MongoPersistentProperty> propertyPath = mappingContext.getPersistentPropertyPath(path);
|
||||
|
||||
Iterator<MongoPersistentProperty> iterator = propertyPath.iterator();
|
||||
boolean associationDetected = false;
|
||||
|
||||
while (iterator.hasNext()) {
|
||||
|
||||
MongoPersistentProperty property = iterator.next();
|
||||
|
||||
if (property.isAssociation()) {
|
||||
associationDetected = true;
|
||||
continue;
|
||||
}
|
||||
|
||||
if (associationDetected && !property.isIdProperty()) {
|
||||
throw new MappingException(String.format(INVALID_ASSOCIATION_REFERENCE, pathExpression));
|
||||
}
|
||||
}
|
||||
|
||||
return propertyPath;
|
||||
PropertyPath path = PropertyPath.from(name, entity.getTypeInformation());
|
||||
return mappingContext.getPersistentPropertyPath(path);
|
||||
} catch (PropertyReferenceException e) {
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Return the {@link Converter} to be used to create the mapped key. Default implementation will use
|
||||
* {@link PropertyToFieldNameConverter}.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
protected Converter<MongoPersistentProperty, String> getPropertyConverter() {
|
||||
return PropertyToFieldNameConverter.INSTANCE;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,275 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013-2014 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.convert;
|
||||
|
||||
import java.util.Arrays;
|
||||
import java.util.Iterator;
|
||||
import java.util.Map.Entry;
|
||||
|
||||
import org.springframework.core.convert.converter.Converter;
|
||||
import org.springframework.data.mapping.Association;
|
||||
import org.springframework.data.mapping.context.MappingContext;
|
||||
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
|
||||
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
|
||||
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty.PropertyToFieldNameConverter;
|
||||
import org.springframework.data.mongodb.core.query.Query;
|
||||
import org.springframework.data.mongodb.core.query.Update.Modifier;
|
||||
import org.springframework.data.mongodb.core.query.Update.Modifiers;
|
||||
import org.springframework.data.util.ClassTypeInformation;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
import com.mongodb.BasicDBObject;
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* A subclass of {@link QueryMapper} that retains type information on the mongo types.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
* @author Christoph Strobl
|
||||
*/
|
||||
public class UpdateMapper extends QueryMapper {
|
||||
|
||||
private final MongoConverter converter;
|
||||
|
||||
/**
|
||||
* Creates a new {@link UpdateMapper} using the given {@link MongoConverter}.
|
||||
*
|
||||
* @param converter must not be {@literal null}.
|
||||
*/
|
||||
public UpdateMapper(MongoConverter converter) {
|
||||
|
||||
super(converter);
|
||||
this.converter = converter;
|
||||
}
|
||||
|
||||
/**
|
||||
* Converts the given source object to a mongo type retaining the original type information of the source type on the
|
||||
* mongo type.
|
||||
*
|
||||
* @see org.springframework.data.mongodb.core.convert.QueryMapper#delegateConvertToMongoType(java.lang.Object,
|
||||
* org.springframework.data.mongodb.core.mapping.MongoPersistentEntity)
|
||||
*/
|
||||
@Override
|
||||
protected Object delegateConvertToMongoType(Object source, MongoPersistentEntity<?> entity) {
|
||||
return entity == null ? super.delegateConvertToMongoType(source, null) : converter.convertToMongoType(source,
|
||||
entity.getTypeInformation());
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.convert.QueryMapper#getMappedObjectForField(org.springframework.data.mongodb.core.convert.QueryMapper.Field, java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
protected Entry<String, Object> getMappedObjectForField(Field field, Object rawValue) {
|
||||
|
||||
if (isDBObject(rawValue)) {
|
||||
return createMapEntry(field, convertSimpleOrDBObject(rawValue, field.getPropertyEntity()));
|
||||
}
|
||||
|
||||
if (isQuery(rawValue)) {
|
||||
return createMapEntry(field,
|
||||
super.getMappedObject(((Query) rawValue).getQueryObject(), field.getPropertyEntity()));
|
||||
}
|
||||
|
||||
if (isUpdateModifier(rawValue)) {
|
||||
return getMappedUpdateModifier(field, rawValue);
|
||||
}
|
||||
|
||||
return super.getMappedObjectForField(field, getMappedValue(field, rawValue));
|
||||
}
|
||||
|
||||
private Entry<String, Object> getMappedUpdateModifier(Field field, Object rawValue) {
|
||||
Object value = null;
|
||||
|
||||
if (rawValue instanceof Modifier) {
|
||||
|
||||
value = getMappedValue((Modifier) rawValue);
|
||||
|
||||
} else if (rawValue instanceof Modifiers) {
|
||||
|
||||
DBObject modificationOperations = new BasicDBObject();
|
||||
|
||||
for (Modifier modifier : ((Modifiers) rawValue).getModifiers()) {
|
||||
modificationOperations.putAll(getMappedValue(modifier).toMap());
|
||||
}
|
||||
|
||||
value = modificationOperations;
|
||||
} else {
|
||||
throw new IllegalArgumentException(String.format("Unable to map value of type '%s'!", rawValue.getClass()));
|
||||
}
|
||||
|
||||
return createMapEntry(field, value);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.convert.QueryMapper#isAssociationConversionNecessary(org.springframework.data.mongodb.core.convert.QueryMapper.Field, java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
protected boolean isAssociationConversionNecessary(Field documentField, Object value) {
|
||||
return super.isAssociationConversionNecessary(documentField, value) || documentField.containsAssociation();
|
||||
}
|
||||
|
||||
private boolean isUpdateModifier(Object value) {
|
||||
return value instanceof Modifier || value instanceof Modifiers;
|
||||
}
|
||||
|
||||
private boolean isQuery(Object value) {
|
||||
return value instanceof Query;
|
||||
}
|
||||
|
||||
private DBObject getMappedValue(Modifier modifier) {
|
||||
|
||||
Object value = converter.convertToMongoType(modifier.getValue(), ClassTypeInformation.OBJECT);
|
||||
return new BasicDBObject(modifier.getKey(), value);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.convert.QueryMapper#createPropertyField(org.springframework.data.mongodb.core.mapping.MongoPersistentEntity, java.lang.String, org.springframework.data.mapping.context.MappingContext)
|
||||
*/
|
||||
@Override
|
||||
protected Field createPropertyField(MongoPersistentEntity<?> entity, String key,
|
||||
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
|
||||
|
||||
return entity == null ? super.createPropertyField(entity, key, mappingContext) : //
|
||||
new MetadataBackedUpdateField(entity, key, mappingContext);
|
||||
}
|
||||
|
||||
/**
|
||||
* {@link MetadataBackedField} that handles {@literal $} paths inside a field key. We clean up an update key
* containing a {@literal $} before handing it to the super class to make sure property lookups and transformations
* continue to work as expected. We provide a custom property converter to re-apply the cleaned-up {@literal $}s
* when constructing the mapped key.
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
private static class MetadataBackedUpdateField extends MetadataBackedField {
|
||||
|
||||
private final String key;
|
||||
|
||||
/**
|
||||
* Creates a new {@link MetadataBackedField} with the given {@link MongoPersistentEntity}, key and
|
||||
* {@link MappingContext}. We clean up the key before handing it up to the super class to make sure it continues to
|
||||
* work as expected.
|
||||
*
|
||||
* @param entity must not be {@literal null}.
|
||||
* @param key must not be {@literal null} or empty.
|
||||
* @param mappingContext must not be {@literal null}.
|
||||
*/
|
||||
public MetadataBackedUpdateField(MongoPersistentEntity<?> entity, String key,
|
||||
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
|
||||
|
||||
super(key.replaceAll("\\.\\$", ""), entity, mappingContext);
|
||||
this.key = key;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.convert.QueryMapper.MetadataBackedField#getMappedKey()
|
||||
*/
|
||||
@Override
|
||||
public String getMappedKey() {
|
||||
return this.getPath() == null ? key : super.getMappedKey();
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.convert.QueryMapper.MetadataBackedField#getPropertyConverter()
|
||||
*/
|
||||
@Override
|
||||
protected Converter<MongoPersistentProperty, String> getPropertyConverter() {
|
||||
return isAssociation() ? new AssociationConverter(getAssociation()) : new UpdatePropertyConverter(key);
|
||||
}
|
||||
|
||||
/**
|
||||
* Converter to skip all properties after an association property was rendered.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
private static class AssociationConverter implements Converter<MongoPersistentProperty, String> {
|
||||
|
||||
private final MongoPersistentProperty property;
|
||||
private boolean associationFound;
|
||||
|
||||
/**
|
||||
* Creates a new {@link AssociationConverter} for the given {@link Association}.
|
||||
*
|
||||
* @param association must not be {@literal null}.
|
||||
*/
|
||||
public AssociationConverter(Association<MongoPersistentProperty> association) {
|
||||
|
||||
Assert.notNull(association, "Association must not be null!");
|
||||
this.property = association.getInverse();
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
public String convert(MongoPersistentProperty source) {
|
||||
|
||||
if (associationFound) {
|
||||
return null;
|
||||
}
|
||||
|
||||
if (property.equals(source)) {
|
||||
associationFound = true;
|
||||
}
|
||||
|
||||
return source.getFieldName();
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Special {@link Converter} for {@link MongoPersistentProperty} instances that will concatenate the {@literal $}
|
||||
* contained in the source update key.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
private static class UpdatePropertyConverter implements Converter<MongoPersistentProperty, String> {
|
||||
|
||||
private final Iterator<String> iterator;
|
||||
|
||||
/**
|
||||
* Creates a new {@link UpdatePropertyConverter} with the given update key.
|
||||
*
|
||||
* @param updateKey must not be {@literal null} or empty.
|
||||
*/
|
||||
public UpdatePropertyConverter(String updateKey) {
|
||||
|
||||
Assert.hasText(updateKey, "Update key must not be null or empty!");
|
||||
|
||||
this.iterator = Arrays.asList(updateKey.split("\\.")).iterator();
|
||||
this.iterator.next();
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
public String convert(MongoPersistentProperty property) {
|
||||
|
||||
String mappedName = PropertyToFieldNameConverter.INSTANCE.convert(property);
|
||||
return iterator.hasNext() && iterator.next().equals("$") ? String.format("%s.$", mappedName) : mappedName;
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2010-2014 the original author or authors.
|
||||
* Copyright 2010-2011 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -16,31 +16,44 @@
|
||||
package org.springframework.data.mongodb.core.geo;
|
||||
|
||||
import java.util.ArrayList;
|
||||
import java.util.Arrays;
|
||||
import java.util.List;
|
||||
|
||||
import org.springframework.data.geo.Point;
|
||||
import org.springframework.data.mongodb.core.mapping.Field;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
/**
|
||||
* Represents a geospatial box value.
|
||||
* Represents a geospatial box value
|
||||
*
|
||||
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Box}. This class is scheduled to be
|
||||
* removed in the next major release.
|
||||
* @author Mark Pollack
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
@Deprecated
|
||||
public class Box extends org.springframework.data.geo.Box implements Shape {
|
||||
public class Box implements Shape {
|
||||
|
||||
public static final String COMMAND = "$box";
|
||||
@Field(order = 10)
|
||||
private final Point first;
|
||||
@Field(order = 20)
|
||||
private final Point second;
|
||||
|
||||
public Box(Point lowerLeft, Point upperRight) {
|
||||
super(lowerLeft, upperRight);
|
||||
Assert.notNull(lowerLeft);
|
||||
Assert.notNull(upperRight);
|
||||
this.first = lowerLeft;
|
||||
this.second = upperRight;
|
||||
}
|
||||
|
||||
public Box(double[] lowerLeft, double[] upperRight) {
|
||||
super(lowerLeft, upperRight);
|
||||
Assert.isTrue(lowerLeft.length == 2, "Point array has to have 2 elements!");
|
||||
Assert.isTrue(upperRight.length == 2, "Point array has to have 2 elements!");
|
||||
this.first = new Point(lowerLeft[0], lowerLeft[1]);
|
||||
this.second = new Point(upperRight[0], upperRight[1]);
|
||||
}
|
||||
|
||||
public Point getLowerLeft() {
|
||||
return first;
|
||||
}
|
||||
|
||||
public Point getUpperRight() {
|
||||
return second;
|
||||
}
|
||||
|
||||
/*
|
||||
@@ -48,28 +61,46 @@ public class Box extends org.springframework.data.geo.Box implements Shape {
|
||||
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
|
||||
*/
|
||||
public List<? extends Object> asList() {
|
||||
|
||||
List<List<Double>> list = new ArrayList<List<Double>>();
|
||||
|
||||
list.add(Arrays.asList(getFirst().getX(), getFirst().getY()));
|
||||
list.add(Arrays.asList(getSecond().getX(), getSecond().getY()));
|
||||
|
||||
list.add(getLowerLeft().asList());
|
||||
list.add(getUpperRight().asList());
|
||||
return list;
|
||||
}
|
||||
|
||||
public org.springframework.data.mongodb.core.geo.Point getLowerLeft() {
|
||||
return new org.springframework.data.mongodb.core.geo.Point(getFirst());
|
||||
}
|
||||
|
||||
public org.springframework.data.mongodb.core.geo.Point getUpperRight() {
|
||||
return new org.springframework.data.mongodb.core.geo.Point(getSecond());
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
|
||||
*/
|
||||
public String getCommand() {
|
||||
return COMMAND;
|
||||
return "$box";
|
||||
}
|
||||
|
||||
@Override
|
||||
public String toString() {
|
||||
return String.format("Box [%s, %s]", first, second);
|
||||
}
|
||||
|
||||
@Override
|
||||
public int hashCode() {
|
||||
|
||||
int result = 31;
|
||||
result += 17 * first.hashCode();
|
||||
result += 17 * second.hashCode();
|
||||
return result;
|
||||
}
|
||||
|
||||
@Override
|
||||
public boolean equals(Object obj) {
|
||||
if (this == obj) {
|
||||
return true;
|
||||
}
|
||||
if (obj == null) {
|
||||
return false;
|
||||
}
|
||||
if (getClass() != obj.getClass()) {
|
||||
return false;
|
||||
}
|
||||
Box that = (Box) obj;
|
||||
return this.first.equals(that.first) && this.second.equals(that.second);
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2010-2014 the original author or authors.
|
||||
* Copyright 2010-2011 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -16,32 +16,19 @@
|
||||
package org.springframework.data.mongodb.core.geo;
|
||||
|
||||
import java.util.ArrayList;
|
||||
import java.util.Arrays;
|
||||
import java.util.List;
|
||||
|
||||
import org.springframework.data.annotation.PersistenceConstructor;
|
||||
import org.springframework.data.geo.Distance;
|
||||
import org.springframework.data.geo.Metrics;
|
||||
import org.springframework.data.geo.Point;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
/**
|
||||
* Represents a geospatial circle value.
|
||||
* <p>
|
||||
* Note: We deliberately do not extend org.springframework.data.geo.Circle because introducing its distance concept
* would break the clients that use the old Circle API.
* Represents a geospatial circle value
|
||||
*
|
||||
* @author Mark Pollack
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Circle}. This class is scheduled to be
|
||||
* removed in the next major release.
|
||||
*/
|
||||
@Deprecated
|
||||
public class Circle implements Shape {
|
||||
|
||||
public static final String COMMAND = "$center";
|
||||
|
||||
private final Point center;
|
||||
private final double radius;
|
||||
|
||||
@@ -62,8 +49,7 @@ public class Circle implements Shape {
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link Circle} from the given coordinates and radius as {@link Distance} with a
|
||||
* {@link Metrics#NEUTRAL}.
|
||||
* Creates a new {@link Circle} from the given coordinates and radius.
|
||||
*
|
||||
* @param centerX
|
||||
* @param centerY
|
||||
@@ -96,11 +82,9 @@ public class Circle implements Shape {
|
||||
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
|
||||
*/
|
||||
public List<Object> asList() {
|
||||
|
||||
List<Object> result = new ArrayList<Object>();
|
||||
result.add(Arrays.asList(getCenter().getX(), getCenter().getY()));
|
||||
result.add(getCenter().asList());
|
||||
result.add(getRadius());
|
||||
|
||||
return result;
|
||||
}
|
||||
|
||||
@@ -109,7 +93,7 @@ public class Circle implements Shape {
|
||||
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
|
||||
*/
|
||||
public String getCommand() {
|
||||
return COMMAND;
|
||||
return "$center";
|
||||
}
|
||||
|
||||
/*
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2013-2014 the original author or authors.
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -18,13 +18,11 @@ package org.springframework.data.mongodb.core.geo;
|
||||
/**
|
||||
* Value object to create custom {@link Metric}s on the fly.
|
||||
*
|
||||
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Metric}. This class is scheduled to be
|
||||
* removed in the next major release.
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
@Deprecated
|
||||
public class CustomMetric extends org.springframework.data.geo.CustomMetric implements Metric {
|
||||
public class CustomMetric implements Metric {
|
||||
|
||||
private final double multiplier;
|
||||
|
||||
/**
|
||||
* Creates a custom {@link Metric} using the given multiplier.
|
||||
@@ -32,6 +30,14 @@ public class CustomMetric extends org.springframework.data.geo.CustomMetric impl
|
||||
* @param multiplier
|
||||
*/
|
||||
public CustomMetric(double multiplier) {
|
||||
super(multiplier);
|
||||
this.multiplier = multiplier;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.geo.Metric#getMultiplier()
|
||||
*/
|
||||
public double getMultiplier() {
|
||||
return multiplier;
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2010-2014 the original author or authors.
|
||||
* Copyright 2010-2011 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -15,19 +15,17 @@
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.geo;
|
||||
|
||||
import org.springframework.data.geo.Metric;
|
||||
import org.springframework.data.geo.Metrics;
|
||||
import org.springframework.util.ObjectUtils;
|
||||
|
||||
/**
|
||||
* Value object to represent distances in a given metric.
|
||||
*
|
||||
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Distance}. This class is scheduled to
|
||||
* be removed in the next major release.
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
@Deprecated
|
||||
public class Distance extends org.springframework.data.geo.Distance {
|
||||
public class Distance {
|
||||
|
||||
private final double value;
|
||||
private final Metric metric;
|
||||
|
||||
/**
|
||||
* Creates a new {@link Distance}.
|
||||
@@ -38,7 +36,110 @@ public class Distance extends org.springframework.data.geo.Distance {
|
||||
this(value, Metrics.NEUTRAL);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link Distance} with the given {@link Metric}.
|
||||
*
|
||||
* @param value
|
||||
* @param metric
|
||||
*/
|
||||
public Distance(double value, Metric metric) {
|
||||
super(value, metric);
|
||||
this.value = value;
|
||||
this.metric = metric == null ? Metrics.NEUTRAL : metric;
|
||||
}
|
||||
|
||||
/**
|
||||
* @return the value
|
||||
*/
|
||||
public double getValue() {
|
||||
return value;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the normalized value regarding the underlying {@link Metric}.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public double getNormalizedValue() {
|
||||
return value / metric.getMultiplier();
|
||||
}
|
||||
|
||||
/**
|
||||
* @return the metric
|
||||
*/
|
||||
public Metric getMetric() {
|
||||
return metric;
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds the given distance to the current one. The resulting {@link Distance} will be in the same metric as the
|
||||
* current one.
|
||||
*
|
||||
* @param other
|
||||
* @return
|
||||
*/
|
||||
public Distance add(Distance other) {
|
||||
double newNormalizedValue = getNormalizedValue() + other.getNormalizedValue();
|
||||
return new Distance(newNormalizedValue * metric.getMultiplier(), metric);
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds the given {@link Distance} to the current one and forces the result to be in a given {@link Metric}.
|
||||
*
|
||||
* @param other
|
||||
* @param metric
|
||||
* @return
|
||||
*/
|
||||
public Distance add(Distance other, Metric metric) {
|
||||
double newLeft = getNormalizedValue() * metric.getMultiplier();
|
||||
double newRight = other.getNormalizedValue() * metric.getMultiplier();
|
||||
return new Distance(newLeft + newRight, metric);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Object#equals(java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
public boolean equals(Object obj) {
|
||||
|
||||
if (this == obj) {
|
||||
return true;
|
||||
}
|
||||
|
||||
if (obj == null || !getClass().equals(obj.getClass())) {
|
||||
return false;
|
||||
}
|
||||
|
||||
Distance that = (Distance) obj;
|
||||
|
||||
return this.value == that.value && ObjectUtils.nullSafeEquals(this.metric, that.metric);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Object#hashCode()
|
||||
*/
|
||||
@Override
|
||||
public int hashCode() {
|
||||
int result = 17;
|
||||
result += 31 * Double.doubleToLongBits(value);
|
||||
result += 31 * ObjectUtils.nullSafeHashCode(metric);
|
||||
return result;
|
||||
}
|
||||
|
||||
/* (non-Javadoc)
|
||||
* @see java.lang.Object#toString()
|
||||
*/
|
||||
@Override
|
||||
public String toString() {
|
||||
|
||||
StringBuilder builder = new StringBuilder();
|
||||
builder.append(value);
|
||||
|
||||
if (metric != Metrics.NEUTRAL) {
|
||||
builder.append(" ").append(metric.toString());
|
||||
}
|
||||
|
||||
return builder.toString();
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2014 the original author or authors.
|
||||
* Copyright 2011 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -16,21 +16,19 @@
|
||||
package org.springframework.data.mongodb.core.geo;
|
||||
|
||||
import org.springframework.data.domain.Page;
|
||||
import org.springframework.data.domain.PageImpl;
|
||||
import org.springframework.data.domain.Pageable;
|
||||
|
||||
/**
|
||||
* Custom {@link Page} to carry the average distance retrieved from the {@link GeoResults} the {@link GeoPage} is set up
|
||||
* from.
|
||||
*
|
||||
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.GeoPage}. This class is scheduled to
|
||||
* be removed in the next major release.
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
@Deprecated
|
||||
public class GeoPage<T> extends org.springframework.data.geo.GeoPage<T> {
|
||||
public class GeoPage<T> extends PageImpl<GeoResult<T>> {
|
||||
|
||||
private static final long serialVersionUID = 23421312312412L;
|
||||
private final Distance averageDistance;
|
||||
|
||||
/**
|
||||
* Creates a new {@link GeoPage} from the given {@link GeoResults}.
|
||||
@@ -38,7 +36,8 @@ public class GeoPage<T> extends org.springframework.data.geo.GeoPage<T> {
|
||||
* @param content must not be {@literal null}.
|
||||
*/
|
||||
public GeoPage(GeoResults<T> results) {
|
||||
super(results);
|
||||
super(results.getContent());
|
||||
this.averageDistance = results.getAverageDistance();
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -49,6 +48,16 @@ public class GeoPage<T> extends org.springframework.data.geo.GeoPage<T> {
|
||||
* @param total
|
||||
*/
|
||||
public GeoPage(GeoResults<T> results, Pageable pageable, long total) {
|
||||
super(results, pageable, total);
|
||||
super(results.getContent(), pageable, total);
|
||||
this.averageDistance = results.getAverageDistance();
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the average distance of the underlying results.
|
||||
*
|
||||
* @return the averageDistance
|
||||
*/
|
||||
public Distance getAverageDistance() {
|
||||
return averageDistance;
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2014 the original author or authors.
|
||||
* Copyright 2011 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -15,16 +15,17 @@
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.geo;
|
||||
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
/**
* Value object capturing some arbitrary object plus a distance.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.GeoResult}. This class is scheduled to
* be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
|
||||
public class GeoResult<T> extends org.springframework.data.geo.GeoResult<T> {
|
||||
public class GeoResult<T> {
|
||||
|
||||
private final T content;
|
||||
private final Distance distance;
|
||||
|
||||
/**
|
||||
* Creates a new {@link GeoResult} for the given content and distance.
|
||||
@@ -33,6 +34,69 @@ public class GeoResult<T> extends org.springframework.data.geo.GeoResult<T> {
|
||||
* @param distance must not be {@literal null}.
|
||||
*/
|
||||
public GeoResult(T content, Distance distance) {
|
||||
super(content, distance);
|
||||
Assert.notNull(content);
|
||||
Assert.notNull(distance);
|
||||
this.content = content;
|
||||
this.distance = distance;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the actual content object.
|
||||
*
|
||||
* @return the content
|
||||
*/
|
||||
public T getContent() {
|
||||
return content;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the distance the actual content object has from the origin.
|
||||
*
|
||||
* @return the distance
|
||||
*/
|
||||
public Distance getDistance() {
|
||||
return distance;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Object#equals(java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
public boolean equals(Object obj) {
|
||||
|
||||
if (this == obj) {
|
||||
return true;
|
||||
}
|
||||
|
||||
if (obj == null || !getClass().equals(obj.getClass())) {
|
||||
return false;
|
||||
}
|
||||
|
||||
GeoResult<?> that = (GeoResult<?>) obj;
|
||||
|
||||
return this.content.equals(that.content) && this.distance.equals(that.distance);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Object#hashCode()
|
||||
*/
|
||||
@Override
|
||||
public int hashCode() {
|
||||
|
||||
int result = 17;
|
||||
result += 31 * distance.hashCode();
|
||||
result += 31 * content.hashCode();
|
||||
return result;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Object#toString()
|
||||
*/
|
||||
@Override
|
||||
public String toString() {
|
||||
return String.format("GeoResult [content: %s, distance: %s, ]", content.toString(), distance.toString());
|
||||
}
|
||||
}
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2014 the original author or authors.
|
||||
* Copyright 2011 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -15,23 +15,23 @@
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.geo;
|
||||
|
||||
import java.util.Collections;
|
||||
import java.util.Iterator;
|
||||
import java.util.List;
|
||||
|
||||
import org.springframework.data.annotation.PersistenceConstructor;
|
||||
import org.springframework.data.geo.Distance;
|
||||
import org.springframework.data.geo.GeoResult;
|
||||
import org.springframework.data.geo.Metric;
|
||||
import org.springframework.util.Assert;
|
||||
import org.springframework.util.StringUtils;
|
||||
|
||||
/**
|
||||
* Value object to capture {@link GeoResult}s as well as the average distance they have.
|
||||
*
|
||||
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.GeoResults}. This class is scheduled
|
||||
* to be removed in the next major release.
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
@Deprecated
|
||||
public class GeoResults<T> extends org.springframework.data.geo.GeoResults<T> {
|
||||
public class GeoResults<T> implements Iterable<GeoResult<T>> {
|
||||
|
||||
private final List<GeoResult<T>> results;
|
||||
private final Distance averageDistance;
|
||||
|
||||
/**
|
||||
* Creates a new {@link GeoResults} instance manually calculating the average distance from the distance values of the
|
||||
@@ -39,12 +39,12 @@ public class GeoResults<T> extends org.springframework.data.geo.GeoResults<T> {
|
||||
*
|
||||
* @param results must not be {@literal null}.
|
||||
*/
|
||||
public GeoResults(List<? extends GeoResult<T>> results) {
|
||||
super(results);
|
||||
public GeoResults(List<GeoResult<T>> results) {
|
||||
this(results, (Metric) null);
|
||||
}
|
||||
|
||||
public GeoResults(List<? extends GeoResult<T>> results, Metric metric) {
|
||||
super(results, metric);
|
||||
public GeoResults(List<GeoResult<T>> results, Metric metric) {
|
||||
this(results, calculateAverageDistance(results, metric));
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -54,7 +54,92 @@ public class GeoResults<T> extends org.springframework.data.geo.GeoResults<T> {
|
||||
* @param averageDistance
|
||||
*/
|
||||
@PersistenceConstructor
|
||||
public GeoResults(List<? extends GeoResult<T>> results, Distance averageDistance) {
|
||||
super(results, averageDistance);
|
||||
public GeoResults(List<GeoResult<T>> results, Distance averageDistance) {
|
||||
Assert.notNull(results);
|
||||
this.results = results;
|
||||
this.averageDistance = averageDistance;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the average distance of all {@link GeoResult}s in this list.
|
||||
*
|
||||
* @return the averageDistance
|
||||
*/
|
||||
public Distance getAverageDistance() {
|
||||
return averageDistance;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Iterable#iterator()
|
||||
*/
|
||||
public Iterator<GeoResult<T>> iterator() {
|
||||
return results.iterator();
|
||||
}
|
||||
|
||||
/**
* Returns the actual content of the {@link GeoResult}s as an unmodifiable {@link List}.
*
* @return
*/
public List<GeoResult<T>> getContent() {
return Collections.unmodifiableList(results);
}
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Object#equals(java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
public boolean equals(Object obj) {
|
||||
|
||||
if (this == obj) {
|
||||
return true;
|
||||
}
|
||||
|
||||
if (obj == null || !getClass().equals(obj.getClass())) {
|
||||
return false;
|
||||
}
|
||||
|
||||
GeoResults<?> that = (GeoResults<?>) obj;
|
||||
|
||||
return this.results.equals(that.results) && this.averageDistance == that.averageDistance;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Object#hashCode()
|
||||
*/
|
||||
@Override
|
||||
public int hashCode() {
|
||||
int result = 17;
|
||||
result += 31 * results.hashCode();
|
||||
result += 31 * averageDistance.hashCode();
|
||||
return result;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Object#toString()
|
||||
*/
|
||||
@Override
|
||||
public String toString() {
|
||||
return String.format("GeoResults: [averageDistance: %s, results: %s]", averageDistance.toString(),
|
||||
StringUtils.collectionToCommaDelimitedString(results));
|
||||
}
|
||||
|
||||
private static Distance calculateAverageDistance(List<? extends GeoResult<?>> results, Metric metric) {
|
||||
|
||||
if (results.isEmpty()) {
|
||||
return new Distance(0, metric);
|
||||
}
|
||||
|
||||
double averageDistance = 0;
|
||||
|
||||
for (GeoResult<?> result : results) {
|
||||
averageDistance += result.getDistance().getValue();
|
||||
}
|
||||
|
||||
return new Distance(averageDistance / results.size(), metric);
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,27 +1,16 @@
|
||||
/*
|
||||
* Copyright 2011-2014 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.geo;
|
||||
|
||||
/**
* Interface for {@link Metric}s that can be applied to a base scale.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Metric}. This class is scheduled to be
* removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public interface Metric extends org.springframework.data.geo.Metric {}
public interface Metric {

/**
* Returns the multiplier to calculate metrics values from a base scale.
*
* @return
*/
double getMultiplier();
}
@@ -1,18 +1,3 @@
|
||||
/*
|
||||
* Copyright 2011-2014 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.geo;
|
||||
|
||||
import org.springframework.data.mongodb.core.query.NearQuery;
|
||||
@@ -20,17 +5,11 @@ import org.springframework.data.mongodb.core.query.NearQuery;
|
||||
/**
|
||||
* Commonly used {@link Metrics} for {@link NearQuery}s.
|
||||
*
|
||||
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Metrics}. This class is scheduled to
|
||||
* be removed in the next major release.
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
@Deprecated
|
||||
public enum Metrics implements Metric {
|
||||
|
||||
KILOMETERS(org.springframework.data.geo.Metrics.KILOMETERS.getMultiplier()), //
|
||||
MILES(org.springframework.data.geo.Metrics.MILES.getMultiplier()), //
|
||||
NEUTRAL(org.springframework.data.geo.Metrics.NEUTRAL.getMultiplier()); //
|
||||
KILOMETERS(6378.137), MILES(3963.191), NEUTRAL(1);
|
||||
|
||||
private final double multiplier;
|
||||
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2010-2014 the original author or authors.
|
||||
* Copyright 2010-2011 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -19,37 +19,85 @@ import java.util.Arrays;
|
||||
import java.util.List;
|
||||
|
||||
import org.springframework.data.annotation.PersistenceConstructor;
|
||||
import org.springframework.data.mongodb.core.mapping.Field;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
/**
|
||||
* Represents a geospatial point value.
|
||||
*
|
||||
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Point}. This class is scheduled to be
|
||||
* removed in the next major release.
|
||||
* @author Mark Pollack
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
@Deprecated
|
||||
public class Point extends org.springframework.data.geo.Point {
|
||||
public class Point {
|
||||
|
||||
@Field(order = 10)
|
||||
private final double x;
|
||||
@Field(order = 20)
|
||||
private final double y;
|
||||
|
||||
@PersistenceConstructor
|
||||
public Point(double x, double y) {
|
||||
super(x, y);
|
||||
this.x = x;
|
||||
this.y = y;
|
||||
}
|
||||
|
||||
public Point(org.springframework.data.geo.Point point) {
|
||||
super(point);
|
||||
public Point(Point point) {
|
||||
Assert.notNull(point);
|
||||
this.x = point.x;
|
||||
this.y = point.y;
|
||||
}
|
||||
|
||||
public double getX() {
|
||||
return x;
|
||||
}
|
||||
|
||||
public double getY() {
|
||||
return y;
|
||||
}
|
||||
|
||||
public double[] asArray() {
|
||||
return new double[] { getX(), getY() };
|
||||
return new double[] { x, y };
|
||||
}
|
||||
|
||||
public List<Double> asList() {
|
||||
return asList(this);
|
||||
return Arrays.asList(x, y);
|
||||
}
|
||||
|
||||
public static List<Double> asList(org.springframework.data.geo.Point point) {
|
||||
return Arrays.asList(point.getX(), point.getY());
|
||||
@Override
|
||||
public int hashCode() {
|
||||
final int prime = 31;
|
||||
int result = 1;
|
||||
long temp;
|
||||
temp = Double.doubleToLongBits(x);
|
||||
result = prime * result + (int) (temp ^ (temp >>> 32));
|
||||
temp = Double.doubleToLongBits(y);
|
||||
result = prime * result + (int) (temp ^ (temp >>> 32));
|
||||
return result;
|
||||
}
|
||||
|
||||
@Override
|
||||
public boolean equals(Object obj) {
|
||||
if (this == obj) {
|
||||
return true;
|
||||
}
|
||||
if (obj == null) {
|
||||
return false;
|
||||
}
|
||||
if (getClass() != obj.getClass()) {
|
||||
return false;
|
||||
}
|
||||
Point other = (Point) obj;
|
||||
if (Double.doubleToLongBits(x) != Double.doubleToLongBits(other.x)) {
|
||||
return false;
|
||||
}
|
||||
if (Double.doubleToLongBits(y) != Double.doubleToLongBits(other.y)) {
|
||||
return false;
|
||||
}
|
||||
return true;
|
||||
}
|
||||
|
||||
@Override
|
||||
public String toString() {
|
||||
return String.format("Point [latitude=%f, longitude=%f]", x, y);
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2014 the original author or authors.
|
||||
* Copyright 2011 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -17,22 +17,19 @@ package org.springframework.data.mongodb.core.geo;
|
||||
|
||||
import java.util.ArrayList;
|
||||
import java.util.Arrays;
|
||||
import java.util.Iterator;
|
||||
import java.util.List;
|
||||
|
||||
import org.springframework.data.geo.Point;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
/**
|
||||
* Simple value object to represent a {@link Polygon}.
|
||||
*
|
||||
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Polygon}. This class is scheduled to be
|
||||
* removed in the next major release.
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
@Deprecated
|
||||
public class Polygon extends org.springframework.data.geo.Polygon implements Shape {
|
||||
public class Polygon implements Shape, Iterable<Point> {
|
||||
|
||||
public static final String COMMAND = "$polygon";
|
||||
private final List<Point> points;
|
||||
|
||||
/**
|
||||
* Creates a new {@link Polygon} for the given Points.
|
||||
@@ -42,17 +39,31 @@ public class Polygon extends org.springframework.data.geo.Polygon implements Sha
|
||||
* @param z
|
||||
* @param others
|
||||
*/
|
||||
public <P extends Point> Polygon(P x, P y, P z, P... others) {
|
||||
super(x, y, z, others);
|
||||
public Polygon(Point x, Point y, Point z, Point... others) {
|
||||
|
||||
Assert.notNull(x);
|
||||
Assert.notNull(y);
|
||||
Assert.notNull(z);
|
||||
Assert.notNull(others);
|
||||
|
||||
this.points = new ArrayList<Point>(3 + others.length);
|
||||
this.points.addAll(Arrays.asList(x, y, z));
|
||||
this.points.addAll(Arrays.asList(others));
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link Polygon} for the given Points.
|
||||
*
|
||||
* @param points
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
|
||||
*/
|
||||
public <P extends Point> Polygon(List<P> points) {
|
||||
super(points);
|
||||
public List<List<Double>> asList() {
|
||||
|
||||
List<List<Double>> result = new ArrayList<List<Double>>();
|
||||
|
||||
for (Point point : points) {
|
||||
result.add(point.asList());
|
||||
}
|
||||
|
||||
return result;
|
||||
}
|
||||
|
||||
/*
|
||||
@@ -60,33 +71,43 @@ public class Polygon extends org.springframework.data.geo.Polygon implements Sha
|
||||
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
|
||||
*/
|
||||
public String getCommand() {
|
||||
return COMMAND;
|
||||
return "$polygon";
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
|
||||
* @see java.lang.Iterable#iterator()
|
||||
*/
|
||||
public Iterator<Point> iterator() {
|
||||
return this.points.iterator();
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Object#equals(java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
public List<? extends Object> asList() {
|
||||
return asList(this);
|
||||
public boolean equals(Object obj) {
|
||||
|
||||
if (this == obj) {
|
||||
return true;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns a {@link List} of x,y-coordinate tuples of {@link Point}s from the given {@link Polygon}.
|
||||
*
|
||||
* @param polygon
|
||||
* @return
|
||||
if (obj == null || !getClass().equals(obj.getClass())) {
|
||||
return false;
|
||||
}
|
||||
|
||||
Polygon that = (Polygon) obj;
|
||||
|
||||
return this.points.equals(that.points);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Object#hashCode()
|
||||
*/
|
||||
public static List<? extends Object> asList(org.springframework.data.geo.Polygon polygon) {
|
||||
|
||||
List<Point> points = polygon.getPoints();
|
||||
List<List<Double>> tuples = new ArrayList<List<Double>>(points.size());
|
||||
|
||||
for (Point point : points) {
|
||||
tuples.add(Arrays.asList(point.getX(), point.getY()));
|
||||
}
|
||||
|
||||
return tuples;
|
||||
@Override
|
||||
public int hashCode() {
|
||||
return points.hashCode();
|
||||
}
|
||||
}
|
||||
|
||||
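Not part of the change set, but for orientation: a minimal sketch of how the legacy `Polygon` above serializes for MongoDB's `$polygon` operator, using only the legacy `org.springframework.data.mongodb.core.geo` types shown in this hunk (the class and coordinate values are illustrative).

```java
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.geo.Polygon;

public class PolygonExample {

	public static void main(String[] args) {

		// A triangle built from the legacy Point type used by this class.
		Polygon triangle = new Polygon(new Point(0, 0), new Point(10, 0), new Point(5, 5));

		// getCommand() names the operator, asList() yields the coordinate tuples MongoDB expects.
		System.out.println(triangle.getCommand()); // $polygon
		System.out.println(triangle.asList());     // [[0.0, 0.0], [10.0, 0.0], [5.0, 5.0]]
	}
}
```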
@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,13 +20,9 @@ import java.util.List;
/**
* Common interface for all shapes. Allows building MongoDB representations of them.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Shape}. This class is scheduled to be
* removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public interface Shape extends org.springframework.data.geo.Shape {
public interface Shape {

/**
* Returns the {@link Shape} as a list of usually {@link Double} or {@link List}s of {@link Double}s. Wildcard bound
@@ -1,161 +0,0 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;

import java.util.Arrays;
import java.util.List;

import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.geo.Circle;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Point;
import org.springframework.util.Assert;

/**
* Represents a geospatial sphere value.
*
* @author Thomas Darimont
* @since 1.5
*/
@SuppressWarnings("deprecation")
public class Sphere implements Shape {

public static final String COMMAND = "$centerSphere";
private final Point center;
private final Distance radius;

/**
* Creates a Sphere around the given center {@link Point} with the given radius.
*
* @param center must not be {@literal null}.
* @param radius must not be {@literal null}.
*/
@PersistenceConstructor
public Sphere(Point center, Distance radius) {

Assert.notNull(center);
Assert.notNull(radius);
Assert.isTrue(radius.getValue() >= 0, "Radius must not be negative!");

this.center = center;
this.radius = radius;
}

/**
* Creates a Sphere around the given center {@link Point} with the given radius.
*
* @param center
* @param radius
*/
public Sphere(Point center, double radius) {
this(center, new Distance(radius));
}

/**
* Creates a Sphere from the given {@link Circle}.
*
* @param circle
*/
public Sphere(Circle circle) {
this(circle.getCenter(), circle.getRadius());
}

/**
* Creates a Sphere from the given {@link Circle}.
*
* @param circle
*/
@Deprecated
public Sphere(org.springframework.data.mongodb.core.geo.Circle circle) {
this(circle.getCenter(), circle.getRadius());
}

/**
* Returns the center of the {@link Circle}.
*
* @return will never be {@literal null}.
*/
public org.springframework.data.mongodb.core.geo.Point getCenter() {
return new org.springframework.data.mongodb.core.geo.Point(this.center);
}

/**
* Returns the radius of the {@link Circle}.
*
* @return
*/
public Distance getRadius() {
return radius;
}

/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return String.format("Sphere [center=%s, radius=%s]", center, radius);
}

/* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {

if (this == obj) {
return true;
}

if (obj == null || !(obj instanceof Sphere)) {
return false;
}

Sphere that = (Sphere) obj;

return this.center.equals(that.center) && this.radius.equals(that.radius);
}

/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += 31 * center.hashCode();
result += 31 * radius.hashCode();
return result;
}

/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
*/
@Override
public List<? extends Object> asList() {
return Arrays.asList(Arrays.asList(center.getX(), center.getY()), this.radius.getValue());
}

/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
*/
@Override
public String getCommand() {
return COMMAND;
}
}
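A sketch (not part of the diff) of how the `Sphere` removed above is used, relying only on the constructors and the `asList()`/`getCommand()` methods shown in the hunk; the coordinate and radius values are illustrative, and with `$centerSphere` the radius is interpreted in radians.

```java
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.geo.Sphere;

public class SphereExample {

	public static void main(String[] args) {

		// A sphere centered at (0, 0) with a radius of 1.5.
		Sphere sphere = new Sphere(new Point(0, 0), new Distance(1.5));

		System.out.println(sphere.getCommand()); // $centerSphere
		System.out.println(sphere.asList());     // [[0.0, 0.0], 1.5]
	}
}
```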
@@ -1,18 +1,3 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
/**
* Support for MongoDB geo-spatial queries.
*/
@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,8 +26,6 @@ import java.lang.annotation.Target;
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Philipp Schneider
* @author Johno Crawford
*/
@Target({ ElementType.TYPE })
@Documented
@@ -36,12 +34,11 @@ public @interface CompoundIndex {

/**
* The actual index definition in JSON format. The keys of the JSON document are the fields to be indexed, the values
* define the index direction (1 for ascending, -1 for descending). <br />
* If left empty on nested document, the whole document will be indexed.
* define the index direction (1 for ascending, -1 for descending).
*
* @return
*/
String def() default "";
String def();

/**
* It does not actually make sense to use that attribute as the direction has to be defined in the {@link #def()}
@@ -52,24 +49,10 @@ public @interface CompoundIndex {
@Deprecated
IndexDirection direction() default IndexDirection.ASCENDING;

/**
* @see http://docs.mongodb.org/manual/core/index-unique/
* @return
*/
boolean unique() default false;

/**
* If set to true index will skip over any document that is missing the indexed field.
*
* @see http://docs.mongodb.org/manual/core/index-sparse/
* @return
*/
boolean sparse() default false;

/**
* @see http://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping
* @return
*/
boolean dropDups() default false;

/**
@@ -79,15 +62,6 @@ public @interface CompoundIndex {
*/
String name() default "";

/**
* If set to {@literal true} then MongoDB will ignore the given index name and instead generate a new name. Defaults
* to {@literal false}.
*
* @return
* @since 1.5
*/
boolean useGeneratedName() default false;

/**
* The collection the index will be created in. Will default to the collection the annotated domain class will be
* stored in.
@@ -95,23 +69,4 @@ public @interface CompoundIndex {
* @return
*/
String collection() default "";

/**
* If {@literal true} the index will be created in the background.
*
* @see http://docs.mongodb.org/manual/core/indexes/#background-construction
* @return
*/
boolean background() default false;

/**
* Configures the number of seconds after which the collection should expire. Defaults to -1 for no expiry.
*
* @deprecated TTL cannot be defined for {@link CompoundIndex} having more than one field as key. Will be removed in
* 1.6.
* @see http://docs.mongodb.org/manual/tutorial/expire-data/
* @return
*/
@Deprecated
int expireAfterSeconds() default -1;
}
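For reference, a usage sketch of the annotation above on a hypothetical `Person` domain class (not part of the change set); note that on the older side of this hunk `def()` has no default and therefore must be supplied.

```java
import org.springframework.data.mongodb.core.index.CompoundIndex;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
@CompoundIndex(name = "lastname_age_idx", def = "{'lastname': 1, 'age': -1}", unique = true)
public class Person {

	// Fields referenced by the index definition above.
	private String lastname;
	private int age;
}
```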
@@ -1,56 +0,0 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;

import org.springframework.util.Assert;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

/**
* Index definition to span multiple keys.
*
* @author Christoph Strobl
* @since 1.5
*/
public class CompoundIndexDefinition extends Index {

private DBObject keys;

/**
* Creates a new {@link CompoundIndexDefinition} for the given keys.
*
* @param keys must not be {@literal null}.
*/
public CompoundIndexDefinition(DBObject keys) {

Assert.notNull(keys, "Keys must not be null!");
this.keys = keys;
}

/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.Index#getIndexKeys()
*/
@Override
public DBObject getIndexKeys() {

BasicDBObject dbo = new BasicDBObject();
dbo.putAll(this.keys);
dbo.putAll(super.getIndexKeys());
return dbo;
}
}
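A sketch of how the removed `CompoundIndexDefinition` is meant to be used programmatically; the `MongoOperations` parameter, collection name, and the fluent `named(...)`/`unique()` calls inherited from `Index` are assumptions for illustration, not part of this hunk.

```java
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.index.CompoundIndexDefinition;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

public class CompoundIndexDefinitionExample {

	public static void ensureIndex(MongoOperations operations) {

		// Keys in insertion order: lastname ascending, age descending.
		DBObject keys = new BasicDBObject("lastname", 1).append("age", -1);

		// CompoundIndexDefinition extends Index, so the fluent Index API is available.
		CompoundIndexDefinition index = new CompoundIndexDefinition(keys);
		operations.indexOps("person").ensureIndex(index.named("lastname_age_idx").unique());
	}
}
```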
@@ -1,41 +0,0 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;

/**
* Geospatial index type.
*
* @author Laurent Canet
* @author Oliver Gierke
* @since 1.4
*/
public enum GeoSpatialIndexType {

/**
* Simple 2-Dimensional index for legacy-format points.
*/
GEO_2D,

/**
* 2D Index for GeoJSON-formatted data over a sphere. Only available in Mongo 2.4.
*/
GEO_2DSPHERE,

/**
* A haystack index for grouping results over small areas.
*/
GEO_HAYSTACK
}
@@ -1,5 +1,5 @@
/*
* Copyright 2010-2014 the original author or authors.
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -13,6 +13,7 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/

package org.springframework.data.mongodb.core.index;

import java.lang.annotation.ElementType;
@@ -23,9 +24,7 @@ import java.lang.annotation.Target;
/**
* Mark a field to be indexed using MongoDB's geospatial indexing feature.
*
* @author Jon Brisbin
* @author Laurent Canet
* @author Thomas Darimont
* @author Jon Brisbin <jbrisbin@vmware.com>
*/
@Target(ElementType.FIELD)
@Retention(RetentionPolicy.RUNTIME)
@@ -38,15 +37,6 @@ public @interface GeoSpatialIndexed {
*/
String name() default "";

/**
* If set to {@literal true} then MongoDB will ignore the given index name and instead generate a new name. Defaults
* to {@literal false}.
*
* @return
* @since 1.5
*/
boolean useGeneratedName() default false;

/**
* Name of the collection in which to create the index.
*
@@ -75,27 +65,4 @@ public @interface GeoSpatialIndexed {
*/
int bits() default 26;

/**
* The type of the geospatial index. Default is {@link GeoSpatialIndexType#GEO_2D}
*
* @since 1.4
* @return
*/
GeoSpatialIndexType type() default GeoSpatialIndexType.GEO_2D;

/**
* The bucket size for {@link GeoSpatialIndexType#GEO_HAYSTACK} indexes, in coordinate units.
*
* @since 1.4
* @return
*/
double bucketSize() default 1.0;

/**
* The name of the additional field to use for {@link GeoSpatialIndexType#GEO_HAYSTACK} indexes
*
* @since 1.4
* @return
*/
String additionalField() default "";
}
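For context, a sketch of the annotation in use on a hypothetical `Venue` class, combining it with the `GeoSpatialIndexType` enum removed above; the class, field, and index name are illustrative only.

```java
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.index.GeoSpatialIndexType;
import org.springframework.data.mongodb.core.index.GeoSpatialIndexed;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
public class Venue {

	// A 2dsphere index on the location field; name, type and (for haystack
	// indexes) bucketSize/additionalField come from the attributes above.
	@GeoSpatialIndexed(name = "location_2dsphere", type = GeoSpatialIndexType.GEO_2DSPHERE)
	private Point location;
}
```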
@@ -1,5 +1,5 @@
/*
* Copyright 2010-2014 the original author or authors.
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,7 +16,6 @@
package org.springframework.data.mongodb.core.index;

import org.springframework.util.Assert;
import org.springframework.util.StringUtils;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
@@ -26,19 +25,14 @@ import com.mongodb.DBObject;
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Laurent Canet
* @author Christoph Strobl
*/
public class GeospatialIndex implements IndexDefinition {

private final String field;
private String name;
private Integer min;
private Integer max;
private Integer bits;
private GeoSpatialIndexType type = GeoSpatialIndexType.GEO_2D;
private Double bucketSize = 1.0;
private String additionalField;
private Integer min = null;
private Integer max = null;
private Integer bits = null;

/**
* Creates a new {@link GeospatialIndex} for the given field.
@@ -46,123 +40,44 @@ public class GeospatialIndex implements IndexDefinition {
* @param field must not be empty or {@literal null}.
*/
public GeospatialIndex(String field) {

Assert.hasText(field, "Field must have text!");

Assert.hasText(field);
this.field = field;
}

/**
* @param name must not be {@literal null} or empty.
* @return
*/
public GeospatialIndex named(String name) {

this.name = name;
return this;
}

/**
* @param min
* @return
*/
public GeospatialIndex withMin(int min) {
this.min = Integer.valueOf(min);
return this;
}

/**
* @param max
* @return
*/
public GeospatialIndex withMax(int max) {
this.max = Integer.valueOf(max);
return this;
}

/**
* @param bits
* @return
*/
public GeospatialIndex withBits(int bits) {
this.bits = Integer.valueOf(bits);
return this;
}

/**
* @param type must not be {@literal null}.
* @return
*/
public GeospatialIndex typed(GeoSpatialIndexType type) {

Assert.notNull(type, "Type must not be null!");

this.type = type;
return this;
}

/**
* @param bucketSize
* @return
*/
public GeospatialIndex withBucketSize(double bucketSize) {
this.bucketSize = bucketSize;
return this;
}

/**
* @param fieldName.
* @return
*/
public GeospatialIndex withAdditionalField(String fieldName) {
this.additionalField = fieldName;
return this;
}

public DBObject getIndexKeys() {

DBObject dbo = new BasicDBObject();

switch (type) {

case GEO_2D:
dbo.put(field, "2d");
break;

case GEO_2DSPHERE:
dbo.put(field, "2dsphere");
break;

case GEO_HAYSTACK:
dbo.put(field, "geoHaystack");
if (!StringUtils.hasText(additionalField)) {
throw new IllegalArgumentException("When defining geoHaystack index, an additional field must be defined");
}
dbo.put(additionalField, 1);
break;

default:
throw new IllegalArgumentException("Unsupported geospatial index " + type);
}

return dbo;
}

public DBObject getIndexOptions() {

if (!StringUtils.hasText(name) && min == null && max == null && bucketSize == null) {
if (name == null && min == null && max == null) {
return null;
}

DBObject dbo = new BasicDBObject();
if (StringUtils.hasText(name)) {
if (name != null) {
dbo.put("name", name);
}

switch (type) {

case GEO_2D:

if (min != null) {
dbo.put("min", min);
}
@@ -172,20 +87,6 @@ public class GeospatialIndex implements IndexDefinition {
if (bits != null) {
dbo.put("bits", bits);
}
break;

case GEO_2DSPHERE:

break;

case GEO_HAYSTACK:

if (bucketSize != null) {
dbo.put("bucketSize", bucketSize);
}
break;
}

return dbo;
}

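The same indexes can be declared programmatically; a sketch using only the fluent methods visible in the class above (the `MongoOperations` parameter and collection name are assumed for illustration).

```java
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.index.GeoSpatialIndexType;
import org.springframework.data.mongodb.core.index.GeospatialIndex;

public class GeospatialIndexExample {

	public static void ensureIndexes(MongoOperations operations) {

		// Legacy 2d index with explicit bounds and precision.
		GeospatialIndex legacy2d = new GeospatialIndex("location")
				.typed(GeoSpatialIndexType.GEO_2D)
				.withMin(-180).withMax(180).withBits(26)
				.named("location_2d");
		operations.indexOps("venue").ensureIndex(legacy2d);

		// Haystack index: requires an additional field and a bucket size.
		GeospatialIndex haystack = new GeospatialIndex("location")
				.typed(GeoSpatialIndexType.GEO_HAYSTACK)
				.withAdditionalField("type")
				.withBucketSize(10);
		operations.indexOps("venue").ensureIndex(haystack);
	}
}
```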
@@ -1,5 +1,5 @@
/*
* Copyright 2010-2014 the original author or authors.
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,28 +17,19 @@ package org.springframework.data.mongodb.core.index;

import java.util.LinkedHashMap;
import java.util.Map;
import java.util.concurrent.TimeUnit;

import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

/**
* @author Oliver Gierke
* @author Christoph Strobl
*/
@SuppressWarnings("deprecation")
public class Index implements IndexDefinition {

public enum Duplicates {
RETAIN, DROP
}

private final Map<String, Direction> fieldSpec = new LinkedHashMap<String, Direction>();
private final Map<String, Order> fieldSpec = new LinkedHashMap<String, Order>();

private String name;

@@ -48,43 +39,15 @@ public class Index implements IndexDefinition {

private boolean sparse = false;

private boolean background = false;

private long expire = -1;

public Index() {}

public Index(String key, Direction direction) {
fieldSpec.put(key, direction);
public Index() {
}

/**
* Creates a new {@link Indexed} on the given key and {@link Order}.
*
* @deprecated use {@link #Index(String, Direction)} instead.
* @param key must not be {@literal null} or empty.
* @param order must not be {@literal null}.
*/
@Deprecated
public Index(String key, Order order) {
this(key, order.toDirection());
fieldSpec.put(key, order);
}

/**
* Adds the given field to the index.
*
* @deprecated use {@link #on(String, Direction)} instead.
* @param key must not be {@literal null} or empty.
* @param order must not be {@literal null}.
* @return
*/
@Deprecated
public Index on(String key, Order order) {
return on(key, order.toDirection());
}

public Index on(String key, Direction direction) {
fieldSpec.put(key, direction);
fieldSpec.put(key, order);
return this;
}

@@ -93,71 +56,16 @@ public class Index implements IndexDefinition {
return this;
}

/**
* Reject all documents that contain a duplicate value for the indexed field.
*
* @see http://docs.mongodb.org/manual/core/index-unique/
* @return
*/
public Index unique() {
this.unique = true;
return this;
}

/**
* Skip over any document that is missing the indexed field.
*
* @see http://docs.mongodb.org/manual/core/index-sparse/
* @return
*/
public Index sparse() {
this.sparse = true;
return this;
}

/**
* Build the index in background (non blocking).
*
* @return
* @since 1.5
*/
public Index background() {

this.background = true;
return this;
}

/**
* Specifies TTL in seconds.
*
* @param value
* @return
* @since 1.5
*/
public Index expire(long value) {
return expire(value, TimeUnit.SECONDS);
}

/**
* Specifies TTL with given {@link TimeUnit}.
*
* @param value
* @param unit
* @return
* @since 1.5
*/
public Index expire(long value, TimeUnit unit) {

Assert.notNull(unit, "TimeUnit for expiration must not be null.");
this.expire = unit.toSeconds(value);
return this;
}

/**
* @see http://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping
* @param duplicates
* @return
*/
public Index unique(Duplicates duplicates) {
if (duplicates == Duplicates.DROP) {
this.dropDuplicates = true;
@@ -168,15 +76,17 @@ public class Index implements IndexDefinition {
public DBObject getIndexKeys() {
DBObject dbo = new BasicDBObject();
for (String k : fieldSpec.keySet()) {
dbo.put(k, fieldSpec.get(k).equals(Direction.ASC) ? 1 : -1);
dbo.put(k, (fieldSpec.get(k).equals(Order.ASCENDING) ? 1 : -1));
}
return dbo;
}

public DBObject getIndexOptions() {

if (name == null && !unique) {
return null;
}
DBObject dbo = new BasicDBObject();
if (StringUtils.hasText(name)) {
if (name != null) {
dbo.put("name", name);
}
if (unique) {
@@ -188,13 +98,6 @@ public class Index implements IndexDefinition {
if (sparse) {
dbo.put("sparse", true);
}
if (background) {
dbo.put("background", true);
}
if (expire >= 0) {
dbo.put("expireAfterSeconds", expire);
}

return dbo;
}

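A sketch of the fluent `Index` API on the 1.5 side of this hunk (the `Order`-based overloads on the other side are the deprecated equivalents); the `MongoOperations` parameter, collection name, and field names are assumed for illustration.

```java
import java.util.concurrent.TimeUnit;

import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.index.Index;

public class IndexExample {

	public static void ensureIndexes(MongoOperations operations) {

		// Unique compound index on lastname/age, built in the background.
		Index compound = new Index()
				.on("lastname", Direction.ASC)
				.on("age", Direction.DESC)
				.named("lastname_age_idx")
				.unique()
				.background();
		operations.indexOps("person").ensureIndex(compound);

		// Separate TTL index: documents expire 90 days after the value in 'createdAt'.
		Index ttl = new Index().on("createdAt", Direction.ASC).expire(90, TimeUnit.DAYS);
		operations.indexOps("person").ensureIndex(ttl);
	}
}
```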
@@ -1,5 +1,5 @@
/*
* Copyright (c) 2011-2014 by the original author(s).
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,11 +20,11 @@ import com.mongodb.DBObject;

/**
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Christoph Strobl
*/
public interface IndexDefinition {

DBObject getIndexKeys();

DBObject getIndexOptions();

}
@@ -1,5 +1,5 @@
/*
* Copyright 2012-2014 the original author or authors.
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,7 +15,6 @@
*/
package org.springframework.data.mongodb.core.index;

import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
@@ -24,52 +23,33 @@ import org.springframework.util.ObjectUtils;
* Value object for an index field.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
@SuppressWarnings("deprecation")
public final class IndexField {

enum Type {
GEO, TEXT, DEFAULT;
}

private final String key;
private final Direction direction;
private final Type type;
private final Float weight;
private final Order order;
private final boolean isGeo;

private IndexField(String key, Direction direction, Type type) {
this(key, direction, type, Float.NaN);
}

private IndexField(String key, Direction direction, Type type, Float weight) {
private IndexField(String key, Order order, boolean isGeo) {

Assert.hasText(key);
Assert.isTrue(direction != null ^ (Type.GEO.equals(type) || Type.TEXT.equals(type)));
Assert.isTrue(order != null ^ isGeo);

this.key = key;
this.direction = direction;
this.type = type == null ? Type.DEFAULT : type;
this.weight = weight == null ? Float.NaN : weight;
this.order = order;
this.isGeo = isGeo;
}

/**
* Creates a default {@link IndexField} with the given key and {@link Order}.
*
* @deprecated use {@link #create(String, Direction)}.
* @param key must not be {@literal null} or empty.
* @param direction must not be {@literal null}.
* @param order must not be {@literal null}.
* @return
*/
@Deprecated
public static IndexField create(String key, Order order) {
Assert.notNull(order);
return new IndexField(key, order.toDirection(), Type.DEFAULT);
}

public static IndexField create(String key, Direction order) {
Assert.notNull(order);
return new IndexField(key, order, Type.DEFAULT);
return new IndexField(key, order, false);
}

/**
@@ -79,16 +59,7 @@ public final class IndexField {
* @return
*/
public static IndexField geo(String key) {
return new IndexField(key, null, Type.GEO);
}

/**
* Creates a text {@link IndexField} for the given key.
*
* @since 1.6
*/
public static IndexField text(String key, Float weight) {
return new IndexField(key, null, Type.TEXT, weight);
return new IndexField(key, null, true);
}

/**
@@ -99,42 +70,21 @@ public final class IndexField {
}

/**
* Returns the direction of the {@link IndexField} or {@literal null} in case we have a geo index field.
* Returns the order of the {@link IndexField} or {@literal null} in case we have a geo index field.
*
* @deprecated use {@link #getDirection()} instead.
* @return the direction
* @return the order
*/
@Deprecated
public Order getOrder() {
return Direction.ASC.equals(direction) ? Order.ASCENDING : Order.DESCENDING;
}

/**
* Returns the direction of the {@link IndexField} or {@literal null} in case we have a geo index field.
*
* @return the direction
*/
public Direction getDirection() {
return direction;
return order;
}

/**
* Returns whether the {@link IndexField} is a geo index field.
*
* @return true if type is {@link Type#GEO}.
* @return the isGeo
*/
public boolean isGeo() {
return Type.GEO.equals(type);
}

/**
* Returns whether the {@link IndexField} is a text index field.
*
* @return true if type is {@link Type#TEXT}
* @since 1.6
*/
public boolean isText() {
return Type.TEXT.equals(type);
return isGeo;
}

/*
@@ -154,8 +104,7 @@ public final class IndexField {

IndexField that = (IndexField) obj;

return this.key.equals(that.key) && ObjectUtils.nullSafeEquals(this.direction, that.direction)
&& this.type == that.type;
return this.key.equals(that.key) && ObjectUtils.nullSafeEquals(this.order, that.order) && this.isGeo == that.isGeo;
}

/*
@@ -167,9 +116,8 @@ public final class IndexField {

int result = 17;
result += 31 * ObjectUtils.nullSafeHashCode(key);
result += 31 * ObjectUtils.nullSafeHashCode(direction);
result += 31 * ObjectUtils.nullSafeHashCode(type);
result += 31 * ObjectUtils.nullSafeHashCode(weight);
result += 31 * ObjectUtils.nullSafeHashCode(order);
result += 31 * ObjectUtils.nullSafeHashCode(isGeo);
return result;
}

@@ -179,7 +127,6 @@ public final class IndexField {
*/
@Override
public String toString() {
return String.format("IndexField [ key: %s, direction: %s, type: %s, weight: %s]", key, direction, type, weight);
return String.format("IndexField [ key: %s, order: %s, isGeo: %s]", key, order, isGeo);
}

}
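A small sketch contrasting the two `IndexField` factory styles in this hunk, using the `Direction`-based API on the newer side (the field names are illustrative).

```java
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.index.IndexField;

public class IndexFieldExample {

	public static void main(String[] args) {

		// A regular ascending field and a geo field; geo fields carry no direction.
		IndexField age = IndexField.create("age", Direction.ASC);
		IndexField location = IndexField.geo("location");

		System.out.println(age.getDirection()); // ASC
		System.out.println(age.isGeo());        // false
		System.out.println(location.isGeo());   // true
	}
}
```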
@@ -1,5 +1,5 @@
/*
* Copyright 2002-2014 the original author or authors.
* Copyright 2002-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -23,11 +23,6 @@ import java.util.List;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;

/**
* @author Mark Pollack
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class IndexInfo {

private final List<IndexField> indexFields;
@@ -36,30 +31,14 @@ public class IndexInfo {
private final boolean unique;
private final boolean dropDuplicates;
private final boolean sparse;
private final String language;

/**
* @deprecated Will be removed in 1.7. Please use {@link #IndexInfo(List, String, boolean, boolean, boolean, String)}
* @param indexFields
* @param name
* @param unique
* @param dropDuplicates
* @param sparse
*/
@Deprecated
public IndexInfo(List<IndexField> indexFields, String name, boolean unique, boolean dropDuplicates, boolean sparse) {
this(indexFields, name, unique, dropDuplicates, sparse, "");
}

public IndexInfo(List<IndexField> indexFields, String name, boolean unique, boolean dropDuplicates, boolean sparse,
String language) {

this.indexFields = Collections.unmodifiableList(indexFields);
this.name = name;
this.unique = unique;
this.dropDuplicates = dropDuplicates;
this.sparse = sparse;
this.language = language;
}

/**
@@ -105,23 +84,14 @@ public class IndexInfo {
return sparse;
}

/**
* @return
* @since 1.6
*/
public String getLanguage() {
return language;
}

@Override
public String toString() {
return "IndexInfo [indexFields=" + indexFields + ", name=" + name + ", unique=" + unique + ", dropDuplicates="
+ dropDuplicates + ", sparse=" + sparse + ", language=" + language + "]";
+ dropDuplicates + ", sparse=" + sparse + "]";
}

@Override
public int hashCode() {

final int prime = 31;
int result = 1;
result = prime * result + (dropDuplicates ? 1231 : 1237);
@@ -129,7 +99,6 @@ public class IndexInfo {
result = prime * result + ((name == null) ? 0 : name.hashCode());
result = prime * result + (sparse ? 1231 : 1237);
result = prime * result + (unique ? 1231 : 1237);
result = prime * result + ObjectUtils.nullSafeHashCode(language);
return result;
}

@@ -168,9 +137,6 @@ public class IndexInfo {
if (unique != other.unique) {
return false;
}
if (!ObjectUtils.nullSafeEquals(language, other.language)) {
return false;
}
return true;
}
}
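Finally, a hedged sketch of where `IndexInfo` typically surfaces: reading index metadata back through the index operations API. The `MongoOperations` parameter, collection name, and the `getName()`/`getIndexFields()`/`isUnique()` accessors are assumptions for illustration; they are not part of the hunk above.

```java
import java.util.List;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.index.IndexInfo;

public class IndexInfoExample {

	public static void printIndexes(MongoOperations operations) {

		// Read the index metadata of the collection and print one line per index.
		List<IndexInfo> indexes = operations.indexOps("person").getIndexInfo();

		for (IndexInfo index : indexes) {
			System.out.println(index.getName() + " fields=" + index.getIndexFields() + " unique=" + index.isUnique());
		}
	}
}
```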
Some files were not shown because too many files have changed in this diff.