Compare commits
58 Commits
| SHA1 |
|---|
| 1202d2cbc2 |
| 0edf6c06ed |
| 0afd137e7d |
| bc3f44197b |
| 28bd631579 |
| 5396df9af4 |
| 2b864e9744 |
| 5accbbdac5 |
| 11d9f04fd1 |
| 67b91e446e |
| 1124841e17 |
| 10ccbf131d |
| b29930b512 |
| d671fb13ae |
| b0a10d19c3 |
| d8ef7e1472 |
| b9a25eabae |
| e92e5c737f |
| 6e46fb12cb |
| 031d446a1c |
| e6bab1ce60 |
| 2fffe0a5c4 |
| 2493de5f91 |
| bd11bab076 |
| b667984563 |
| 7ef167ed96 |
| 303a057d86 |
| 607072c0d3 |
| 11e9c562b3 |
| 220b211faa |
| c0c51fcc29 |
| ed9eddf10e |
| e23d73d55e |
| 9627fbaebf |
| ef93d4db0b |
| 928b5a7742 |
| 118a52a8d6 |
| b47e8ca3da |
| 8527d6eb43 |
| d645c778c3 |
| 5c47f1ae9e |
| 9f324bac19 |
| 186caba1ac |
| 2ebb7e801d |
| a932f3474e |
| e992456532 |
| 7b34c5cac4 |
| 16baf00f5e |
| 61a2c56a27 |
| e62437b64a |
| e9a7e887be |
| b91a66f6f9 |
| a047e54e5a |
| 8197ff57c8 |
| a2136719e1 |
| 7bcf142c8d |
| d50d03a80e |
| ba894a4511 |
README.md (138 changed lines)
@@ -1,68 +1,96 @@
|
||||
# Spring Data MongoDB
|
||||
Spring Data MongoDB
|
||||
======================
|
||||
|
||||
The primary goal of the [Spring Data](http://www.springsource.org/spring-data) project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
|
||||
|
||||
The Spring Data MongoDB project aims to provide a familiar and consistent Spring-based programming model for new datastores while retaining store-specific features and capabilities. The Spring Data MongoDB project provides integration with the MongoDB document database. Key functional areas of Spring Data MongoDB are a POJO centric model for interacting with a MongoDB DBCollection and easily writing a repository style data access layer.
|
||||
The Spring Data MongoDB aims to provide a familiar and consistent Spring-based programming model for for new datastores while retaining store-specific features and capabilities. The Spring Data MongoDB project provides integration with the MongoDB document database. Key functional areas of Spring Data MongoDB are a POJO centric model for interacting with a MongoDB DBCollection and easily writing a Repository style data access layer
|
||||
|
||||
## Getting Help
|
||||
Getting Help
|
||||
------------
|
||||
|
||||
For a comprehensive treatment of all the Spring Data MongoDB features, please refer to:
|
||||
For a comprehensive treatmet of all the Spring Data MongoDB features, please refer to the The [User Guide](http://static.springsource.org/spring-data/data-mongodb/docs/current/reference/html/)
|
||||
|
||||
* the [User Guide](http://static.springsource.org/spring-data/data-mongodb/docs/current/reference/html/)
|
||||
* the [JavaDocs](http://static.springsource.org/spring-data/data-mongodb/docs/current/api/) have extensive comments in them as well.
|
||||
* the home page of [Spring Data MongoDB](http://www.springsource.org/spring-data/mongodb) contains links to articles and other resources.
|
||||
* for more detailed questions, use the [forum](http://forum.springsource.org/forumdisplay.php?f=80).
|
||||
The [JavaDocs](http://static.springsource.org/spring-data/data-mongodb/docs/current/api/) have extensive comments in them as well.
|
||||
|
||||
The home page of [Spring Data MongoDB](http://www.springsource.org/spring-data/mongodb) contains links to articles and other resources.
|
||||
|
||||
For more detailed questions, use the [forum](http://forum.springsource.org/forumdisplay.php?f=80).
|
||||
|
||||
If you are new to Spring as well as to Spring Data, look for information about [Spring projects](http://www.springsource.org/projects).
|
||||
|
||||
|
||||
## Quick Start
|
||||
Quick Start
|
||||
-----------
|
||||
|
||||
### Maven configuration
|
||||
## MongoDB
|
||||
|
||||
Add the Maven dependency:
|
||||
For those in a hurry:
|
||||
|
||||
|
||||
* Download the jar through Maven:
|
||||
|
||||
```xml
|
||||
<dependency>
|
||||
<groupId>org.springframework.data</groupId>
|
||||
<artifactId>spring-data-mongodb</artifactId>
|
||||
<version>1.3.2.RELEASE</version>
|
||||
<version>1.2.3.RELEASE</version>
|
||||
</dependency>
|
||||
```
|
||||
|
||||
If you'd rather like the latest snapshots of the upcoming major version, use our Maven snapshot repository and declare the appropriate dependency version.
|
||||
|
||||
```xml
|
||||
<dependency>
|
||||
<groupId>org.springframework.data</groupId>
|
||||
<artifactId>spring-data-mongodb</artifactId>
|
||||
<version>1.4.0.BUILD-SNAPSHOT</version>
|
||||
</dependency>
|
||||
|
||||
<repository>
|
||||
<id>spring-libs-snapshot</id>
|
||||
<name>Spring Snapshot Repository</name>
|
||||
<url>http://repo.springsource.org/libs-snapshot</url>
|
||||
</repository>
|
||||
```
|
||||
|
||||
### MongoTemplate
|
||||
|
||||
MongoTemplate is the central support class for Mongo database operations. It provides:
MongoTemplate is the central support class for Mongo database operations. It provides

* Basic POJO mapping support to and from BSON
* Convenience methods to interact with the store (insert object, update objects) and MongoDB specific ones (geo-spatial operations, upserts, map-reduce etc.)
* Connection affinity callback
* Connection Affinity callback
* Exception translation into Spring's [technology agnostic DAO exception hierarchy](http://static.springsource.org/spring/docs/3.0.x/spring-framework-reference/html/dao.html#dao-exceptions).
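
The feature list above is easier to follow with a concrete snippet. The following is a minimal usage sketch, not taken from this changeset: the `Person` class, the database name `"database"` and the field values are assumptions made purely for illustration.

```java
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

import com.mongodb.Mongo;

public class MongoTemplateQuickStart {

	// Illustrative domain type; not part of the repository.
	static class Person {

		String id;
		String firstname;
		String lastname;

		Person() {}

		Person(String firstname, String lastname) {
			this.firstname = firstname;
			this.lastname = lastname;
		}

		@Override
		public String toString() {
			return firstname + " " + lastname;
		}
	}

	public static void main(String[] args) throws Exception {

		// Connects to a MongoDB instance on localhost and works against the
		// (assumed) database name "database".
		MongoTemplate template = new MongoTemplate(new Mongo(), "database");

		// Maps the POJO to BSON and inserts it into the "person" collection.
		template.insert(new Person("Oliver", "Gierke"));

		// Reads it back through the query API; driver errors surface as
		// Spring DataAccessExceptions.
		Person person = template.findOne(
				new Query(Criteria.where("lastname").is("Gierke")), Person.class);
		System.out.println(person);
	}
}
```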
|
||||
|
||||
### Spring Data repositories
|
||||
Future plans are to support optional logging and/or exception throwing based on WriteResult return value, common map-reduce operations, GridFS operations. A simple API for partial document updates is also planned.
|
||||
|
||||
To simplify the creation of data repositories Spring Data MongoDB provides a generic repository programming model. It will automatically create a repository proxy for you that adds implementations of finder methods you specify on an interface.
|
||||
### Easy Data Repository generation
|
||||
|
||||
For example, given a `Person` class with first and last name properties, a `PersonRepository` interface that can query for `Person` by last name and when the first name matches a like expression is shown below:
|
||||
To simplify the creation of data repositories a generic `Repository` interface and default implementation is provided. Furthermore, Spring will automatically create a Repository implementation for you that adds implementations of finder methods you specify on an interface.
|
||||
|
||||
The Repository interface is
|
||||
|
||||
```java
|
||||
public interface PersonRepository extends CrudRepository<Person, Long> {
|
||||
public interface Repository<T, ID extends Serializable> {
|
||||
|
||||
T save(T entity);
|
||||
|
||||
List<T> save(Iterable<? extends T> entities);
|
||||
|
||||
T findById(ID id);
|
||||
|
||||
boolean exists(ID id);
|
||||
|
||||
List<T> findAll();
|
||||
|
||||
Long count();
|
||||
|
||||
void delete(T entity);
|
||||
|
||||
void delete(Iterable<? extends T> entities);
|
||||
|
||||
void deleteAll();
|
||||
}
|
||||
```
|
||||
|
||||
|
||||
The `MongoRepository` extends `Repository` and will in future add more Mongo specific methods.
|
||||
|
||||
```java
|
||||
public interface MongoRepository<T, ID extends Serializable> extends Repository<T, ID> {
|
||||
}
|
||||
```
|
||||
|
||||
`SimpleMongoRepository` is the out of the box implementation of the `MongoRepository` you can use for basid CRUD operations.
|
||||
|
||||
To go beyond basic CRUD, extend the `MongoRepository` interface and supply your own finder methods that follow simple naming conventions such that they can be easily converted into queries.
|
||||
|
||||
For example, given a `Person` class with first and last name properties, a `PersonRepository` interface that can query for `Person` by last name and when the first name matches a regular expression is shown below
|
||||
|
||||
```java
|
||||
public interface PersonRepository extends MongoRepository<Person, Long> {
|
||||
|
||||
List<Person> findByLastname(String lastname);
|
||||
|
||||
@@ -70,28 +98,7 @@ public interface PersonRepository extends CrudRepository<Person, Long> {
|
||||
}
|
||||
```
|
||||
|
||||
The queries issued on execution will be derived from the method name. Extending `CrudRepository` causes CRUD methods being pulled into the interface so that you can easily save and find single entities and collections of them.
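
As an aside that is not part of this changeset: the derivation described above can be pictured by writing the equivalent query by hand. The helper below is hypothetical and only sketches what a finder such as `findByLastname` roughly corresponds to when expressed against `MongoTemplate` directly (it assumes a `Person` class with a `lastname` property, as in the example above).

```java
import java.util.List;

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

// Hypothetical illustration: roughly what repository.findByLastname(lastname)
// boils down to when written against MongoTemplate by hand.
class DerivedQueryIllustration {

	static List<Person> findByLastname(MongoTemplate template, String lastname) {
		return template.find(new Query(Criteria.where("lastname").is(lastname)), Person.class);
	}
}
```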
|
||||
|
||||
You can have Spring automatically create a proxy for the interface by using the following JavaConfig:
|
||||
|
||||
```java
|
||||
@Configuration
|
||||
@EnableMongoRepositories
|
||||
class ApplicationConfig extends AbstractMongoConfiguration {
|
||||
|
||||
@Override
|
||||
public Mongo mongo() throws Exception {
|
||||
return new Mongo();
|
||||
}
|
||||
|
||||
@Override
|
||||
protected String getDatabaseName() {
|
||||
return "springdata";
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
This sets up a connection to a local MongoDB instance and enables the detection of Spring Data repositories (through `@EnableMongoRepositories`). The same configuration would look like this in XML:
|
||||
You can have Spring automatically create a proxy for the interface as shown below:
|
||||
|
||||
```xml
|
||||
<bean id="template" class="org.springframework.data.document.mongodb.MongoTemplate">
|
||||
@@ -102,6 +109,7 @@ This sets up a connection to a local MongoDB instance and enables the detection
|
||||
</bean>
|
||||
</constructor-arg>
|
||||
<constructor-arg value="database" />
|
||||
<property name="defaultCollectionName" value="springdata" />
|
||||
</bean>
|
||||
|
||||
<mongo:repositories base-package="com.acme.repository" />
|
||||
@@ -109,16 +117,12 @@ This sets up a connection to a local MongoDB instance and enables the detection
|
||||
|
||||
This will find the repository interface and register a proxy object in the container. You can use it as shown below:
|
||||
|
||||
```java
|
||||
``java
|
||||
@Service
|
||||
public class MyService {
|
||||
|
||||
private final PersonRepository repository;
|
||||
|
||||
@Autowired
|
||||
public MyService(PersonRepository repository) {
|
||||
this.repository = repository;
|
||||
}
|
||||
private final PersonRepository repository;
|
||||
|
||||
public void doWork() {
|
||||
|
||||
@@ -130,12 +134,16 @@ public class MyService {
|
||||
person = repository.save(person);
|
||||
|
||||
List<Person> lastNameResults = repository.findByLastname("Gierke");
|
||||
|
||||
List<Person> firstNameResults = repository.findByFirstnameLike("Oli*");
|
||||
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## Contributing to Spring Data
|
||||
|
||||
Contributing to Spring Data
|
||||
---------------------------
|
||||
|
||||
Here are some ways for you to get involved in the community:
|
||||
|
||||
|
||||
pom.xml (47 changed lines)
@@ -5,7 +5,7 @@
|
||||
|
||||
<groupId>org.springframework.data</groupId>
|
||||
<artifactId>spring-data-mongodb-parent</artifactId>
|
||||
<version>1.4.0.M1</version>
|
||||
<version>1.2.5.BUILD-SNAPSHOT</version>
|
||||
<packaging>pom</packaging>
|
||||
|
||||
<name>Spring Data MongoDB</name>
|
||||
@@ -15,7 +15,7 @@
|
||||
<parent>
|
||||
<groupId>org.springframework.data.build</groupId>
|
||||
<artifactId>spring-data-parent</artifactId>
|
||||
<version>1.3.0.M1</version>
|
||||
<version>1.0.5.RELEASE</version>
|
||||
<relativePath>../spring-data-build/parent/pom.xml</relativePath>
|
||||
</parent>
|
||||
|
||||
@@ -29,19 +29,19 @@
|
||||
<properties>
|
||||
<project.type>multi</project.type>
|
||||
<dist.id>spring-data-mongodb</dist.id>
|
||||
<springdata.commons>1.7.0.M1</springdata.commons>
|
||||
<mongo>2.11.3</mongo>
|
||||
<springdata.commons>1.5.3.RELEASE</springdata.commons>
|
||||
<mongo>2.10.1</mongo>
|
||||
</properties>
|
||||
|
||||
<developers>
|
||||
<developer>
|
||||
<id>ogierke</id>
|
||||
<name>Oliver Gierke</name>
|
||||
<email>ogierke at gopivotal.com</email>
|
||||
<organization>Pivotal</organization>
|
||||
<organizationUrl>http://www.gopivotal.com</organizationUrl>
|
||||
<email>ogierke at vmware.com</email>
|
||||
<organization>SpringSource</organization>
|
||||
<organizationUrl>http://www.springsource.com</organizationUrl>
|
||||
<roles>
|
||||
<role>Project Lead</role>
|
||||
<role>Project Lean</role>
|
||||
</roles>
|
||||
<timezone>+1</timezone>
|
||||
</developer>
|
||||
@@ -49,8 +49,8 @@
|
||||
<id>trisberg</id>
|
||||
<name>Thomas Risberg</name>
|
||||
<email>trisberg at vmware.com</email>
|
||||
<organization>Pivotal</organization>
|
||||
<organizationUrl>http://www.gopivotal.com</organizationUrl>
|
||||
<organization>SpringSource</organization>
|
||||
<organizationUrl>http://www.springsource.com</organizationUrl>
|
||||
<roles>
|
||||
<role>Developer</role>
|
||||
</roles>
|
||||
@@ -59,9 +59,9 @@
|
||||
<developer>
|
||||
<id>mpollack</id>
|
||||
<name>Mark Pollack</name>
|
||||
<email>mpollack at gopivotal.com</email>
|
||||
<organization>Pivotal</organization>
|
||||
<organizationUrl>http://www.gopivotal.com</organizationUrl>
|
||||
<email>mpollack at vmware.com</email>
|
||||
<organization>SpringSource</organization>
|
||||
<organizationUrl>http://www.springsource.com</organizationUrl>
|
||||
<roles>
|
||||
<role>Developer</role>
|
||||
</roles>
|
||||
@@ -70,25 +70,14 @@
|
||||
<developer>
|
||||
<id>jbrisbin</id>
|
||||
<name>Jon Brisbin</name>
|
||||
<email>jbrisbin at gopivotal.com</email>
|
||||
<organization>Pivotal</organization>
|
||||
<organizationUrl>http://www.gopivotal.com</organizationUrl>
|
||||
<email>jbrisbin at vmware.com</email>
|
||||
<organization>SpringSource</organization>
|
||||
<organizationUrl>http://www.springsource.com</organizationUrl>
|
||||
<roles>
|
||||
<role>Developer</role>
|
||||
</roles>
|
||||
<timezone>-6</timezone>
|
||||
</developer>
|
||||
<developer>
|
||||
<id>tdarimont</id>
|
||||
<name>Thomas Darimont</name>
|
||||
<email>tdarimont at gopivotal.com</email>
|
||||
<organization>Pivotal</organization>
|
||||
<organizationUrl>http://www.gopivotal.com</organizationUrl>
|
||||
<roles>
|
||||
<role>Developer</role>
|
||||
</roles>
|
||||
<timezone>+1</timezone>
|
||||
</developer>
|
||||
</developers>
|
||||
|
||||
<dependencies>
|
||||
@@ -102,8 +91,8 @@
|
||||
|
||||
<repositories>
|
||||
<repository>
|
||||
<id>spring-libs-milestone</id>
|
||||
<url>http://repo.springsource.org/libs-milestone-local</url>
|
||||
<id>spring-libs-release</id>
|
||||
<url>http://repo.springsource.org/libs-release</url>
|
||||
</repository>
|
||||
</repositories>
|
||||
|
||||
|
||||
@@ -6,7 +6,7 @@
|
||||
<parent>
|
||||
<groupId>org.springframework.data</groupId>
|
||||
<artifactId>spring-data-mongodb-parent</artifactId>
|
||||
<version>1.4.0.M1</version>
|
||||
<version>1.2.5.BUILD-SNAPSHOT</version>
|
||||
<relativePath>../pom.xml</relativePath>
|
||||
</parent>
|
||||
|
||||
@@ -52,7 +52,7 @@
|
||||
<dependency>
|
||||
<groupId>org.springframework.data</groupId>
|
||||
<artifactId>spring-data-mongodb</artifactId>
|
||||
<version>1.4.0.M1</version>
|
||||
<version>1.2.5.BUILD-SNAPSHOT</version>
|
||||
</dependency>
|
||||
|
||||
<dependency>
|
||||
|
||||
@@ -12,7 +12,7 @@ Import-Template:
|
||||
org.aspectj.*;version="${aspectj:[1.0.0, 2.0.0)}",
|
||||
org.bson.*;version="0",
|
||||
org.slf4j.*;version="${slf4j:[=.=.=,+1.0.0)}",
|
||||
org.springframework.*;version="${spring:[=.=.=.=,+1.0.0)}",
|
||||
org.springframework.*;version="${spring30:[=.=.=.=,+1.0.0)}",
|
||||
org.springframework.data.*;version="${springdata.commons:[=.=.=.=,+1.0.0)}",
|
||||
org.springframework.data.mongodb.*;version="${project.version:[=.=.=.=,+1.0.0)}",
|
||||
org.w3c.dom.*;version="0"
|
||||
|
||||
@@ -13,7 +13,7 @@
|
||||
<parent>
|
||||
<groupId>org.springframework.data</groupId>
|
||||
<artifactId>spring-data-mongodb-parent</artifactId>
|
||||
<version>1.4.0.M1</version>
|
||||
<version>1.2.5.BUILD-SNAPSHOT</version>
|
||||
<relativePath>../pom.xml</relativePath>
|
||||
</parent>
|
||||
|
||||
|
||||
@@ -5,7 +5,7 @@
|
||||
<parent>
|
||||
<groupId>org.springframework.data</groupId>
|
||||
<artifactId>spring-data-mongodb-parent</artifactId>
|
||||
<version>1.4.0.M1</version>
|
||||
<version>1.2.5.BUILD-SNAPSHOT</version>
|
||||
<relativePath>../pom.xml</relativePath>
|
||||
</parent>
|
||||
|
||||
|
||||
@@ -1,176 +1,146 @@
|
||||
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
|
||||
<context version="7.1.9.205">
|
||||
<scope type="Project" name="spring-data-mongodb">
|
||||
<element type="TypeFilterReferenceOverridden" name="Filter">
|
||||
<element type="IncludeTypePattern" name="org.springframework.data.mongodb.**"/>
|
||||
<context version="7.0.3.1152">
|
||||
<scope name="spring-data-mongodb" type="Project">
|
||||
<element name="Filter" type="TypeFilterReferenceOverridden">
|
||||
<element name="org.springframework.data.mongodb.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
<architecture>
|
||||
<element type="Layer" name="Config">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="WeakTypePattern" name="**.config.**"/>
|
||||
<element name="Config" type="Layer">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.config.**" type="WeakTypePattern"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|GridFS" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Monitoring" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Repositories" type="AllowedDependency"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Monitoring"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories"/>
|
||||
</element>
|
||||
<element type="Layer" name="Repositories">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.repository.**"/>
|
||||
<element name="Repositories" type="Layer">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.repository.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
<element type="Subsystem" name="API">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.repository.*"/>
|
||||
<element name="API" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.repository.*" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
</element>
|
||||
<element type="Subsystem" name="Query">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.query.**"/>
|
||||
<element name="Query" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.query.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API" type="AllowedDependency"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API"/>
|
||||
</element>
|
||||
<element type="Subsystem" name="Implementation">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.support.**"/>
|
||||
<element name="Implementation" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.support.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Query" type="AllowedDependency"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Query"/>
|
||||
</element>
|
||||
<element type="Subsystem" name="Config">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.config.**"/>
|
||||
<element name="Config" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.config.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Implementation" type="AllowedDependency"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Implementation"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core" type="AllowedDependency"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core"/>
|
||||
</element>
|
||||
<element type="Layer" name="Monitoring">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.monitor.**"/>
|
||||
<element name="Monitoring" type="Layer">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.monitor.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core" type="AllowedDependency"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core"/>
|
||||
</element>
|
||||
<element type="Layer" name="GridFS">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.gridfs.**"/>
|
||||
<element name="GridFS" type="Layer">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.gridfs.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core" type="AllowedDependency"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core"/>
|
||||
</element>
|
||||
<element type="Layer" name="Core">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.core.**"/>
|
||||
<element name="Core" type="Layer">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.core.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
<element type="Subsystem" name="Mapping">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.mapping.**"/>
|
||||
<element name="Mapping" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.mapping.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
</element>
|
||||
<element type="Subsystem" name="Geospatial">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.geo.**"/>
|
||||
<element name="Geospatial" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.geo.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping"/>
|
||||
</element>
|
||||
<element type="Subsystem" name="Query">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.query.**"/>
|
||||
<element name="Query" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.query.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial"/>
|
||||
</element>
|
||||
<element type="Subsystem" name="Conversion">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.convert.**"/>
|
||||
<element name="Index" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.index.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query"/>
|
||||
</element>
|
||||
<element type="Subsystem" name="SpEL">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.spel.**"/>
|
||||
<element name="Core" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="**.core.**" type="WeakTypePattern"/>
|
||||
</element>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Index"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping"/>
|
||||
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query"/>
|
||||
</element>
|
||||
<element type="Subsystem" name="Aggregation">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.aggregation.**"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Conversion" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|SpEL" type="AllowedDependency"/>
|
||||
</element>
|
||||
<element type="Subsystem" name="Index">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="**.index.**"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
|
||||
</element>
|
||||
<element type="Subsystem" name="Core">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="WeakTypePattern" name="**.core.**"/>
|
||||
</element>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Aggregation" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Conversion" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Index" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
|
||||
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
|
||||
</element>
|
||||
</element>
|
||||
<element type="Subsystem" name="API">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="org.springframework.data.mongodb.*"/>
|
||||
</element>
|
||||
<stereotype name="Public"/>
|
||||
</element>
|
||||
</architecture>
|
||||
<workspace>
|
||||
<element type="JavaRootDirectory" name="src/main/java">
|
||||
<element name="src/main/java" type="JavaRootDirectory">
|
||||
<reference name="Project|spring-data-mongodb::BuildUnit|spring-data-mongodb"/>
|
||||
</element>
|
||||
<element type="JavaRootDirectory" name="target/classes">
|
||||
<element name="target/classes" type="JavaRootDirectory">
|
||||
<reference name="Project|spring-data-mongodb::BuildUnit|spring-data-mongodb"/>
|
||||
</element>
|
||||
</workspace>
|
||||
<physical>
|
||||
<element type="BuildUnit" name="spring-data-mongodb"/>
|
||||
<element name="spring-data-mongodb" type="BuildUnit"/>
|
||||
</physical>
|
||||
</scope>
|
||||
<scope type="External" name="External">
|
||||
<element type="TypeFilter" name="Filter">
|
||||
<element type="IncludeTypePattern" name="**"/>
|
||||
<element type="ExcludeTypePattern" name="java.**"/>
|
||||
<element type="ExcludeTypePattern" name="javax.**"/>
|
||||
<scope name="External" type="External">
|
||||
<element name="Filter" type="TypeFilter">
|
||||
<element name="**" type="IncludeTypePattern"/>
|
||||
<element name="java.**" type="ExcludeTypePattern"/>
|
||||
<element name="javax.**" type="ExcludeTypePattern"/>
|
||||
</element>
|
||||
<architecture>
|
||||
<element type="Subsystem" name="Spring">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="org.springframework.**"/>
|
||||
<element type="ExcludeTypePattern" name="org.springframework.data.**"/>
|
||||
<element name="Spring" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="org.springframework.**" type="IncludeTypePattern"/>
|
||||
<element name="org.springframework.data.**" type="ExcludeTypePattern"/>
|
||||
</element>
|
||||
</element>
|
||||
<element type="Subsystem" name="Spring Data Core">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="org.springframework.data.**"/>
|
||||
<element name="Spring Data Core" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="org.springframework.data.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
</element>
|
||||
<element type="Subsystem" name="Mongo Java Driver">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="com.mongodb.**"/>
|
||||
<element type="IncludeTypePattern" name="org.bson.**"/>
|
||||
<element name="Mongo Java Driver" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="com.mongodb.**" type="IncludeTypePattern"/>
|
||||
<element name="org.bson.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
</element>
|
||||
<element type="Subsystem" name="Querydsl">
|
||||
<element type="TypeFilter" name="Assignment">
|
||||
<element type="IncludeTypePattern" name="com.mysema.query.**"/>
|
||||
<element name="Querydsl" type="Subsystem">
|
||||
<element name="Assignment" type="TypeFilter">
|
||||
<element name="com.mysema.query.**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
</element>
|
||||
</architecture>
|
||||
</scope>
|
||||
<scope type="Global" name="Global">
|
||||
<element type="Configuration" name="Configuration"/>
|
||||
<element type="TypeFilter" name="Filter">
|
||||
<element type="IncludeTypePattern" name="**"/>
|
||||
<scope name="Global" type="Global">
|
||||
<element name="Configuration" type="Configuration"/>
|
||||
<element name="Filter" type="TypeFilter">
|
||||
<element name="**" type="IncludeTypePattern"/>
|
||||
</element>
|
||||
</scope>
|
||||
</context>
|
||||
|
||||
@@ -11,13 +11,12 @@
|
||||
<parent>
|
||||
<groupId>org.springframework.data</groupId>
|
||||
<artifactId>spring-data-mongodb-parent</artifactId>
|
||||
<version>1.4.0.M1</version>
|
||||
<version>1.2.5.BUILD-SNAPSHOT</version>
|
||||
<relativePath>../pom.xml</relativePath>
|
||||
</parent>
|
||||
|
||||
<properties>
|
||||
<validation>1.0.0.GA</validation>
|
||||
<objenesis>1.3</objenesis>
|
||||
</properties>
|
||||
|
||||
<dependencies>
|
||||
@@ -121,13 +120,6 @@
|
||||
<optional>true</optional>
|
||||
</dependency>
|
||||
|
||||
<dependency>
|
||||
<groupId>org.objenesis</groupId>
|
||||
<artifactId>objenesis</artifactId>
|
||||
<version>${objenesis}</version>
|
||||
<optional>true</optional>
|
||||
</dependency>
|
||||
|
||||
<dependency>
|
||||
<groupId>org.hibernate</groupId>
|
||||
<artifactId>hibernate-validator</artifactId>
|
||||
@@ -149,8 +141,8 @@
|
||||
|
||||
<plugin>
|
||||
<groupId>com.mysema.maven</groupId>
|
||||
<artifactId>apt-maven-plugin</artifactId>
|
||||
<version>1.0.8</version>
|
||||
<artifactId>maven-apt-plugin</artifactId>
|
||||
<version>1.0.4</version>
|
||||
<dependencies>
|
||||
<dependency>
|
||||
<groupId>com.mysema.querydsl</groupId>
|
||||
|
||||
@@ -1,34 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb;
|
||||
|
||||
import org.springframework.dao.UncategorizedDataAccessException;
|
||||
|
||||
/**
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
public class LazyLoadingException extends UncategorizedDataAccessException {
|
||||
|
||||
private static final long serialVersionUID = -7089224903873220037L;
|
||||
|
||||
/**
|
||||
* @param msg
|
||||
* @param cause
|
||||
*/
|
||||
public LazyLoadingException(String msg, Throwable cause) {
|
||||
super(msg, cause);
|
||||
}
|
||||
}
|
||||
@@ -13,9 +13,10 @@
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core;
|
||||
package org.springframework.data.mongodb;
|
||||
|
||||
import org.springframework.dao.DataIntegrityViolationException;
|
||||
import org.springframework.data.mongodb.core.MongoActionOperation;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
import com.mongodb.WriteResult;
|
||||
@@ -1,23 +1,6 @@
|
||||
/*
|
||||
* Copyright 2011-2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb;
|
||||
|
||||
import org.springframework.dao.DataAccessException;
|
||||
import org.springframework.dao.support.PersistenceExceptionTranslator;
|
||||
import org.springframework.data.mongodb.core.MongoExceptionTranslator;
|
||||
|
||||
import com.mongodb.DB;
|
||||
|
||||
@@ -25,7 +8,6 @@ import com.mongodb.DB;
|
||||
* Interface for factories creating {@link DB} instances.
|
||||
*
|
||||
* @author Mark Pollack
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
public interface MongoDbFactory {
|
||||
|
||||
@@ -45,11 +27,4 @@ public interface MongoDbFactory {
|
||||
* @throws DataAccessException
|
||||
*/
|
||||
DB getDb(String dbName) throws DataAccessException;
|
||||
|
||||
/**
|
||||
* Exposes a shared {@link MongoExceptionTranslator}.
|
||||
*
|
||||
* @return will never be {@literal null}.
|
||||
*/
|
||||
PersistenceExceptionTranslator getExceptionTranslator();
|
||||
}
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2013 the original author or authors.
|
||||
* Copyright 2011-2012 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -31,10 +31,7 @@ import org.springframework.data.mapping.context.MappingContextIsNewStrategyFacto
|
||||
import org.springframework.data.mongodb.core.MongoTemplate;
|
||||
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
|
||||
import org.springframework.data.mongodb.core.convert.CustomConversions;
|
||||
import org.springframework.data.mongodb.core.convert.DbRefResolver;
|
||||
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
|
||||
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
|
||||
import org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy;
|
||||
import org.springframework.data.mongodb.core.mapping.Document;
|
||||
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
|
||||
import org.springframework.data.support.CachingIsNewStrategyFactory;
|
||||
@@ -49,7 +46,6 @@ import com.mongodb.Mongo;
|
||||
*
|
||||
* @author Mark Pollack
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
@Configuration
|
||||
public abstract class AbstractMongoConfiguration {
|
||||
@@ -62,22 +58,12 @@ public abstract class AbstractMongoConfiguration {
|
||||
protected abstract String getDatabaseName();
|
||||
|
||||
/**
|
||||
* Return the name of the authentication database to use. Defaults to {@literal null} and will turn into the value
|
||||
* returned by {@link #getDatabaseName()} later on effectively.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
protected String getAuthenticationDatabaseName() {
|
||||
return null;
|
||||
}
|
||||
|
||||
/**
|
||||
* Return the {@link Mongo} instance to connect to. Annotate with {@link Bean} in case you want to expose a
|
||||
* {@link Mongo} instance to the {@link org.springframework.context.ApplicationContext}.
|
||||
* Return the {@link Mongo} instance to connect to.
|
||||
*
|
||||
* @return
|
||||
* @throws Exception
|
||||
*/
|
||||
@Bean
|
||||
public abstract Mongo mongo() throws Exception;
|
||||
|
||||
/**
|
||||
@@ -102,7 +88,14 @@ public abstract class AbstractMongoConfiguration {
|
||||
*/
|
||||
@Bean
|
||||
public SimpleMongoDbFactory mongoDbFactory() throws Exception {
|
||||
return new SimpleMongoDbFactory(mongo(), getDatabaseName(), getUserCredentials(), getAuthenticationDatabaseName());
|
||||
|
||||
UserCredentials credentials = getUserCredentials();
|
||||
|
||||
if (credentials == null) {
|
||||
return new SimpleMongoDbFactory(mongo(), getDatabaseName());
|
||||
} else {
|
||||
return new SimpleMongoDbFactory(mongo(), getDatabaseName(), credentials);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -142,10 +135,6 @@ public abstract class AbstractMongoConfiguration {
|
||||
mappingContext.setInitialEntitySet(getInitialEntitySet());
|
||||
mappingContext.setSimpleTypeHolder(customConversions().getSimpleTypeHolder());
|
||||
|
||||
if (abbreviateFieldNames()) {
|
||||
mappingContext.setFieldNamingStrategy(new CamelCaseAbbreviatingFieldNamingStrategy());
|
||||
}
|
||||
|
||||
return mappingContext;
|
||||
}
|
||||
|
||||
@@ -184,11 +173,8 @@ public abstract class AbstractMongoConfiguration {
|
||||
*/
|
||||
@Bean
|
||||
public MappingMongoConverter mappingMongoConverter() throws Exception {
|
||||
|
||||
DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory());
|
||||
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mongoMappingContext());
|
||||
MappingMongoConverter converter = new MappingMongoConverter(mongoDbFactory(), mongoMappingContext());
|
||||
converter.setCustomConversions(customConversions());
|
||||
|
||||
return converter;
|
||||
}
|
||||
|
||||
@@ -218,15 +204,4 @@ public abstract class AbstractMongoConfiguration {
|
||||
|
||||
return initialEntitySet;
|
||||
}
|
||||
|
||||
/**
|
||||
* Configures whether to abbreviate field names for domain objects by configuring a
|
||||
* {@link CamelCaseAbbreviatingFieldNamingStrategy} on the {@link MongoMappingContext} instance created. For advanced
|
||||
* customization needs, consider overriding {@link #mappingMongoConverter()}.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
protected boolean abbreviateFieldNames() {
|
||||
return false;
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2013 the original author or authors.
|
||||
* Copyright (c) 2011 by the original author(s).
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -13,14 +13,11 @@
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
|
||||
package org.springframework.data.mongodb.config;
|
||||
|
||||
/**
|
||||
* Constants to declare bean names used by the namespace configuration.
|
||||
*
|
||||
* @author Jon Brisbin
|
||||
* @author Oliver Gierke
|
||||
* @author Martin Baumgartner
|
||||
* @author Jon Brisbin <jbrisbin@vmware.com>
|
||||
*/
|
||||
public abstract class BeanNames {
|
||||
|
||||
@@ -31,6 +28,4 @@ public abstract class BeanNames {
|
||||
static final String VALIDATING_EVENT_LISTENER = "validatingMongoEventListener";
|
||||
static final String IS_NEW_STRATEGY_FACTORY = "isNewStrategyFactory";
|
||||
static final String DEFAULT_CONVERTER_BEAN_NAME = "mappingConverter";
|
||||
static final String MONGO_TEMPLATE = "mongoTemplate";
|
||||
static final String GRID_FS_TEMPLATE = "gridFsTemplate";
|
||||
}
|
||||
|
||||
@@ -1,70 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.config;
|
||||
|
||||
import java.lang.annotation.Documented;
|
||||
import java.lang.annotation.ElementType;
|
||||
import java.lang.annotation.Inherited;
|
||||
import java.lang.annotation.Retention;
|
||||
import java.lang.annotation.RetentionPolicy;
|
||||
import java.lang.annotation.Target;
|
||||
|
||||
import org.springframework.context.annotation.Import;
|
||||
import org.springframework.data.auditing.DateTimeProvider;
|
||||
import org.springframework.data.domain.AuditorAware;
|
||||
|
||||
/**
|
||||
* Annotation to enable auditing in MongoDB via annotation configuration.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
@Inherited
|
||||
@Documented
|
||||
@Target(ElementType.TYPE)
|
||||
@Retention(RetentionPolicy.RUNTIME)
|
||||
@Import(MongoAuditingRegistrar.class)
|
||||
public @interface EnableMongoAuditing {
|
||||
|
||||
/**
|
||||
* Configures the {@link AuditorAware} bean to be used to lookup the current principal.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
String auditorAwareRef() default "";
|
||||
|
||||
/**
|
||||
* Configures whether the creation and modification dates are set. Defaults to {@literal true}.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
boolean setDates() default true;
|
||||
|
||||
/**
|
||||
* Configures whether the entity shall be marked as modified on creation. Defaults to {@literal true}.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
boolean modifyOnCreate() default true;
|
||||
|
||||
/**
|
||||
* Configures a {@link DateTimeProvider} bean name that allows customizing the {@link org.joda.time.DateTime} to be
|
||||
* used for setting creation and modification dates.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
String dateTimeProviderRef() default "";
|
||||
}
|
||||
@@ -1,78 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.config;
|
||||
|
||||
import org.springframework.beans.factory.BeanDefinitionStoreException;
|
||||
import org.springframework.beans.factory.config.BeanDefinition;
|
||||
import org.springframework.beans.factory.support.AbstractBeanDefinition;
|
||||
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
|
||||
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
|
||||
import org.springframework.beans.factory.xml.BeanDefinitionParser;
|
||||
import org.springframework.beans.factory.xml.ParserContext;
|
||||
import org.springframework.data.config.BeanComponentDefinitionBuilder;
|
||||
import org.springframework.data.mongodb.gridfs.GridFsTemplate;
|
||||
import org.springframework.util.StringUtils;
|
||||
import org.w3c.dom.Element;
|
||||
|
||||
/**
|
||||
* {@link BeanDefinitionParser} to parse {@code gridFsTemplate} elements into {@link BeanDefinition}s.
|
||||
*
|
||||
* @author Martin Baumgartner
|
||||
*/
|
||||
class GridFsTemplateParser extends AbstractBeanDefinitionParser {
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
|
||||
*/
|
||||
@Override
|
||||
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
|
||||
throws BeanDefinitionStoreException {
|
||||
|
||||
String id = super.resolveId(element, definition, parserContext);
|
||||
return StringUtils.hasText(id) ? id : BeanNames.GRID_FS_TEMPLATE;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
|
||||
*/
|
||||
@Override
|
||||
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
|
||||
|
||||
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
|
||||
|
||||
String converterRef = element.getAttribute("converter-ref");
|
||||
String dbFactoryRef = element.getAttribute("db-factory-ref");
|
||||
|
||||
BeanDefinitionBuilder gridFsTemplateBuilder = BeanDefinitionBuilder.genericBeanDefinition(GridFsTemplate.class);
|
||||
|
||||
if (StringUtils.hasText(dbFactoryRef)) {
|
||||
gridFsTemplateBuilder.addConstructorArgReference(dbFactoryRef);
|
||||
} else {
|
||||
gridFsTemplateBuilder.addConstructorArgReference(BeanNames.DB_FACTORY);
|
||||
}
|
||||
|
||||
if (StringUtils.hasText(converterRef)) {
|
||||
gridFsTemplateBuilder.addConstructorArgReference(converterRef);
|
||||
} else {
|
||||
gridFsTemplateBuilder.addConstructorArgReference(BeanNames.DEFAULT_CONVERTER_BEAN_NAME);
|
||||
}
|
||||
|
||||
return (AbstractBeanDefinition) helper.getComponentIdButFallback(gridFsTemplateBuilder, BeanNames.GRID_FS_TEMPLATE)
|
||||
.getBeanDefinition();
|
||||
}
|
||||
}
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2013 the original author or authors.
|
||||
* Copyright 2011-2012 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -54,7 +54,6 @@ import org.springframework.data.mapping.context.MappingContextIsNewStrategyFacto
|
||||
import org.springframework.data.mongodb.core.convert.CustomConversions;
|
||||
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
|
||||
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
|
||||
import org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy;
|
||||
import org.springframework.data.mongodb.core.mapping.Document;
|
||||
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
|
||||
import org.springframework.data.mongodb.core.mapping.event.ValidatingMongoEventListener;
|
||||
@@ -70,7 +69,6 @@ import org.w3c.dom.Element;
|
||||
* @author Jon Brisbin
|
||||
* @author Oliver Gierke
|
||||
* @author Maciej Walkowiak
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
public class MappingMongoConverterParser implements BeanDefinitionParser {
|
||||
|
||||
@@ -106,11 +104,6 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
|
||||
converterBuilder.addConstructorArgReference(dbFactoryRef);
|
||||
converterBuilder.addConstructorArgReference(ctxRef);
|
||||
|
||||
String typeMapperRef = element.getAttribute("type-mapper-ref");
|
||||
if (StringUtils.hasText(typeMapperRef)) {
|
||||
converterBuilder.addPropertyReference("typeMapper", typeMapperRef);
|
||||
}
|
||||
|
||||
if (conversionsDefinition != null) {
|
||||
converterBuilder.addPropertyValue("customConversions", conversionsDefinition);
|
||||
}
|
||||
@@ -208,12 +201,6 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
|
||||
mappingContextBuilder.addPropertyValue("simpleTypeHolder", simpleTypesDefinition);
|
||||
}
|
||||
|
||||
String abbreviateFieldNames = element.getAttribute("abbreviate-field-names");
|
||||
if ("true".equals(abbreviateFieldNames)) {
|
||||
mappingContextBuilder.addPropertyValue("fieldNamingStrategy", new RootBeanDefinition(
|
||||
CamelCaseAbbreviatingFieldNamingStrategy.class));
|
||||
}
|
||||
|
||||
ctxRef = converterId + "." + MAPPING_CONTEXT;
|
||||
|
||||
parserContext.registerBeanComponent(componentDefinitionBuilder.getComponent(mappingContextBuilder, ctxRef));
|
||||
|
||||
@@ -1,105 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.config;
|
||||
|
||||
import java.lang.annotation.Annotation;
|
||||
|
||||
import org.springframework.beans.factory.config.BeanDefinition;
|
||||
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
|
||||
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
|
||||
import org.springframework.context.annotation.ImportBeanDefinitionRegistrar;
|
||||
import org.springframework.core.type.AnnotationMetadata;
|
||||
import org.springframework.data.auditing.IsNewAwareAuditingHandler;
|
||||
import org.springframework.data.auditing.config.AnnotationAuditingConfiguration;
|
||||
import org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport;
|
||||
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
|
||||
import org.springframework.data.mongodb.core.mapping.event.AuditingEventListener;
|
||||
import org.springframework.data.support.IsNewStrategyFactory;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
/**
|
||||
* {@link ImportBeanDefinitionRegistrar} to enable {@link EnableMongoAuditing} annotation.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAnnotation()
|
||||
*/
|
||||
@Override
|
||||
protected Class<? extends Annotation> getAnnotation() {
|
||||
return EnableMongoAuditing.class;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerBeanDefinitions(org.springframework.core.type.AnnotationMetadata, org.springframework.beans.factory.support.BeanDefinitionRegistry)
|
||||
*/
|
||||
@Override
|
||||
public void registerBeanDefinitions(AnnotationMetadata annotationMetadata, BeanDefinitionRegistry registry) {
|
||||
|
||||
Assert.notNull(annotationMetadata, "AnnotationMetadata must not be null!");
|
||||
Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
|
||||
|
||||
registerIsNewStrategyFactoryIfNecessary(registry);
|
||||
super.registerBeanDefinitions(annotationMetadata, registry);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditHandlerBeanDefinitionBuilder(org.springframework.data.auditing.config.AnnotationAuditingConfiguration)
|
||||
*/
|
||||
@Override
|
||||
protected BeanDefinitionBuilder getAuditHandlerBeanDefinitionBuilder(AnnotationAuditingConfiguration configuration) {
|
||||
|
||||
Assert.notNull(configuration, "AnnotationAuditingConfiguration must not be null!");
|
||||
|
||||
return configureDefaultAuditHandlerAttributes(configuration,
|
||||
BeanDefinitionBuilder.rootBeanDefinition(IsNewAwareAuditingHandler.class)).addConstructorArgReference(
|
||||
BeanNames.IS_NEW_STRATEGY_FACTORY);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerAuditListener(org.springframework.beans.factory.config.BeanDefinition, org.springframework.beans.factory.support.BeanDefinitionRegistry)
|
||||
*/
|
||||
@Override
|
||||
protected void registerAuditListenerBeanDefinition(BeanDefinition auditingHandlerDefinition,
|
||||
BeanDefinitionRegistry registry) {
|
||||
|
||||
Assert.notNull(auditingHandlerDefinition, "BeanDefinition must not be null!");
|
||||
Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
|
||||
|
||||
registerInfrastructureBeanWithId(BeanDefinitionBuilder.rootBeanDefinition(AuditingEventListener.class)
|
||||
.addConstructorArgValue(auditingHandlerDefinition).getRawBeanDefinition(),
|
||||
AuditingEventListener.class.getName(), registry);
|
||||
}
|
||||
|
||||
/**
|
||||
* @param registry, the {@link BeanDefinitionRegistry} to use to register an {@link IsNewStrategyFactory} to.
|
||||
*/
|
||||
private void registerIsNewStrategyFactoryIfNecessary(BeanDefinitionRegistry registry) {
|
||||
|
||||
if (!registry.containsBeanDefinition(BeanNames.IS_NEW_STRATEGY_FACTORY)) {
|
||||
registry.registerBeanDefinition(BeanNames.IS_NEW_STRATEGY_FACTORY,
|
||||
BeanDefinitionBuilder.rootBeanDefinition(MappingContextIsNewStrategyFactory.class)
|
||||
.addConstructorArgReference(BeanNames.MAPPING_CONTEXT).getBeanDefinition());
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -1,5 +1,5 @@
/*
 * Copyright 2011-2013 by the original author(s).
 * Copyright 2011-2012 by the original author(s).
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -41,7 +41,6 @@ import com.mongodb.MongoURI;
 *
 * @author Jon Brisbin
 * @author Oliver Gierke
 * @author Thomas Darimont
 */
public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {

@@ -71,7 +70,6 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
		String uri = element.getAttribute("uri");
		String mongoRef = element.getAttribute("mongo-ref");
		String dbname = element.getAttribute("dbname");

		BeanDefinition userCredentials = getUserCredentialsBeanDefinition(element, parserContext);

		// Common setup
@@ -94,9 +92,12 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
			dbFactoryBuilder.addConstructorArgValue(registerMongoBeanDefinition(element, parserContext));
		}

		dbFactoryBuilder.addConstructorArgValue(StringUtils.hasText(dbname) ? dbname : "db");
		dbname = StringUtils.hasText(dbname) ? dbname : "db";
		dbFactoryBuilder.addConstructorArgValue(dbname);

		if (userCredentials != null) {
			dbFactoryBuilder.addConstructorArgValue(userCredentials);
			dbFactoryBuilder.addConstructorArgValue(element.getAttribute("authentication-dbname"));
		}

		BeanDefinitionBuilder writeConcernPropertyEditorBuilder = getWriteConcernPropertyEditorBuilder();
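The hunk above wires the optional `authentication-dbname` attribute into the db-factory bean definition and defaults the database name to `db`. A rough programmatic sketch of what gets built is shown below; the four-argument `SimpleMongoDbFactory` constructor is assumed from the constructor arguments registered above, and the host and credentials are placeholders.

```java
import com.mongodb.Mongo;

import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

public class DbFactorySample {

	public static void main(String[] args) throws Exception {

		// Same defaulting as the parser: an empty dbname attribute falls back to "db".
		Mongo mongo = new Mongo("localhost");
		MongoDbFactory factory = new SimpleMongoDbFactory(mongo, "db",
				new UserCredentials("appUser", "secret"), "admin"); // assumed 4-arg constructor matching the wired args

		MongoTemplate template = new MongoTemplate(factory);
		System.out.println(template.getDb().getName());
	}
}
```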
@@ -1,5 +1,5 @@
/*
 * Copyright 2011-2013 the original author or authors.
 * Copyright 2011-2012 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -24,7 +24,6 @@ import org.springframework.data.repository.config.RepositoryConfigurationExtensi
 * {@link org.springframework.beans.factory.xml.NamespaceHandler} for Mongo DB configuration.
 *
 * @author Oliver Gierke
 * @author Martin Baumgartner
 */
public class MongoNamespaceHandler extends NamespaceHandlerSupport {

@@ -43,7 +42,5 @@ public class MongoNamespaceHandler extends NamespaceHandlerSupport {
		registerBeanDefinitionParser("db-factory", new MongoDbFactoryParser());
		registerBeanDefinitionParser("jmx", new MongoJmxParser());
		registerBeanDefinitionParser("auditing", new MongoAuditingBeanDefinitionParser());
		registerBeanDefinitionParser("template", new MongoTemplateParser());
		registerBeanDefinitionParser("gridFsTemplate", new GridFsTemplateParser());
	}
}
@@ -1,5 +1,5 @@
/*
 * Copyright 2011-2013 the original author or authors.
 * Copyright 2011-2012 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -33,7 +33,6 @@ import org.w3c.dom.Element;
 *
 * @author Mark Pollack
 * @author Oliver Gierke
 * @author Thomas Darimont
 */
abstract class MongoParsingUtils {

@@ -80,8 +79,6 @@ abstract class MongoParsingUtils {
		setPropertyValue(optionsDefBuilder, optionsElement, "write-timeout", "writeTimeout");
		setPropertyValue(optionsDefBuilder, optionsElement, "write-fsync", "writeFsync");
		setPropertyValue(optionsDefBuilder, optionsElement, "slave-ok", "slaveOk");
		setPropertyValue(optionsDefBuilder, optionsElement, "ssl", "ssl");
		setPropertyReference(optionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");

		mongoBuilder.addPropertyValue("mongoOptions", optionsDefBuilder.getBeanDefinition());
		return true;
@@ -1,86 +0,0 @@
/*
 * Copyright 2011-2013 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.config;

import static org.springframework.data.config.ParsingUtils.*;
import static org.springframework.data.mongodb.config.MongoParsingUtils.*;

import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;

/**
 * {@link BeanDefinitionParser} to parse {@code template} elements into {@link BeanDefinition}s.
 *
 * @author Martin Baumgartner
 */
class MongoTemplateParser extends AbstractBeanDefinitionParser {

	/*
	 * (non-Javadoc)
	 * @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
	 */
	@Override
	protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
			throws BeanDefinitionStoreException {

		String id = super.resolveId(element, definition, parserContext);
		return StringUtils.hasText(id) ? id : BeanNames.MONGO_TEMPLATE;
	}

	/*
	 * (non-Javadoc)
	 * @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
	 */
	@Override
	protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {

		BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);

		String converterRef = element.getAttribute("converter-ref");
		String dbFactoryRef = element.getAttribute("db-factory-ref");

		BeanDefinitionBuilder mongoTemplateBuilder = BeanDefinitionBuilder.genericBeanDefinition(MongoTemplate.class);
		setPropertyValue(mongoTemplateBuilder, element, "write-concern", "writeConcern");

		if (StringUtils.hasText(dbFactoryRef)) {
			mongoTemplateBuilder.addConstructorArgReference(dbFactoryRef);
		} else {
			mongoTemplateBuilder.addConstructorArgReference(BeanNames.DB_FACTORY);
		}

		if (StringUtils.hasText(converterRef)) {
			mongoTemplateBuilder.addConstructorArgReference(converterRef);
		}

		BeanDefinitionBuilder writeConcernPropertyEditorBuilder = getWriteConcernPropertyEditorBuilder();

		BeanComponentDefinition component = helper.getComponent(writeConcernPropertyEditorBuilder);
		parserContext.registerBeanComponent(component);

		return (AbstractBeanDefinition) helper.getComponentIdButFallback(mongoTemplateBuilder, BeanNames.MONGO_TEMPLATE)
				.getBeanDefinition();
	}
}
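The parser above translates `<mongo:template/>` elements into a `MongoTemplate` bean definition, falling back to the default db-factory and template bean names. A hedged sketch of the plain-Java equivalent follows; the host, database name and write concern are placeholders.

```java
import com.mongodb.Mongo;
import com.mongodb.WriteConcern;

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

public class TemplateSetupSample {

	public static void main(String[] args) throws Exception {

		// Roughly what the XML element wires up: a db factory, a template and a write concern.
		SimpleMongoDbFactory dbFactory = new SimpleMongoDbFactory(new Mongo("localhost"), "database");

		MongoTemplate template = new MongoTemplate(dbFactory);
		template.setWriteConcern(WriteConcern.SAFE);

		System.out.println(template.getCollectionNames());
	}
}
```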
@@ -1,5 +1,5 @@
/*
 * Copyright 2011-2013 the original author or authors.
 * Copyright 2011 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -15,8 +15,6 @@
 */
package org.springframework.data.mongodb.core;

import static org.springframework.data.domain.Sort.Direction.*;

import java.util.ArrayList;
import java.util.List;

@@ -24,6 +22,7 @@ import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexField;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.util.Assert;

import com.mongodb.DBCollection;
@@ -35,13 +34,9 @@ import com.mongodb.MongoException;
 *
 * @author Mark Pollack
 * @author Oliver Gierke
 * @author Komi Innocent
 */
public class DefaultIndexOperations implements IndexOperations {

	private static final Double ONE = Double.valueOf(1);
	private static final Double MINUS_ONE = Double.valueOf(-1);

	private final MongoOperations mongoOperations;
	private final String collectionName;

@@ -140,17 +135,12 @@ public class DefaultIndexOperations implements IndexOperations {

			Object value = keyDbObject.get(key);

			if ("2d".equals(value)) {
			if (Integer.valueOf(1).equals(value)) {
				indexFields.add(IndexField.create(key, Order.ASCENDING));
			} else if (Integer.valueOf(-1).equals(value)) {
				indexFields.add(IndexField.create(key, Order.DESCENDING));
			} else if ("2d".equals(value)) {
				indexFields.add(IndexField.geo(key));
			} else {

				Double keyValue = new Double(value.toString());

				if (ONE.equals(keyValue)) {
					indexFields.add(IndexField.create(key, ASC));
				} else if (MINUS_ONE.equals(keyValue)) {
					indexFields.add(IndexField.create(key, DESC));
				}
			}
		}
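The revised parsing above maps stored index key values of `1`, `-1` and `"2d"` onto ascending, descending and geo `IndexField`s. A small sketch of how that surfaces through `IndexOperations` is shown below; the collection and field names are made up, and the `Order`-based `Index.on(…)` overload is assumed to match the imports above.

```java
import java.util.List;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.index.Index;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.query.Order;

public class IndexSample {

	// Assumes a configured MongoOperations instance is passed in.
	public static void listIndexes(MongoOperations operations) {

		// Create an ascending index and read it back; the key parsing shown above
		// turns the stored 1 / -1 / "2d" values into IndexField instances.
		operations.indexOps("people").ensureIndex(new Index().on("lastname", Order.ASCENDING));

		List<IndexInfo> indexes = operations.indexOps("people").getIndexInfo();
		for (IndexInfo index : indexes) {
			System.out.println(index.getName() + " -> " + index.getIndexFields());
		}
	}
}
```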
@@ -1,5 +1,5 @@
/*
 * Copyright 2011-2013 the original author or authors.
 * Copyright 2011-2012 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -27,7 +27,6 @@ import com.mongodb.Mongo;
 * Mongo server administration exposed via JMX annotations
 *
 * @author Mark Pollack
 * @author Thomas Darimont
 */
@ManagedResource(description = "Mongo Admin Operations")
public class MongoAdmin implements MongoAdminOperations {
@@ -35,7 +34,6 @@ public class MongoAdmin implements MongoAdminOperations {
	private final Mongo mongo;
	private String username;
	private String password;
	private String authenticationDatabaseName;

	public MongoAdmin(Mongo mongo) {
		Assert.notNull(mongo);
@@ -84,16 +82,7 @@ public class MongoAdmin implements MongoAdminOperations {
		this.password = password;
	}

	/**
	 * Sets the authenticationDatabaseName to use to authenticate with the Mongo database.
	 *
	 * @param authenticationDatabaseName The authenticationDatabaseName to use.
	 */
	public void setAuthenticationDatabaseName(String authenticationDatabaseName) {
		this.authenticationDatabaseName = authenticationDatabaseName;
	}

	DB getDB(String databaseName) {
		return MongoDbUtils.getDB(mongo, databaseName, new UserCredentials(username, password), authenticationDatabaseName);
		return MongoDbUtils.getDB(mongo, databaseName, new UserCredentials(username, password));
	}
}
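`MongoAdmin` gains an `authenticationDatabaseName` property above that is passed on to `MongoDbUtils.getDB(…)`. A hedged usage sketch follows; the host and credentials are placeholders, and the `dropDatabase(String)` call is assumed from `MongoAdminOperations`.

```java
import com.mongodb.Mongo;

import org.springframework.data.mongodb.core.MongoAdmin;

public class AdminSample {

	public static void main(String[] args) throws Exception {

		MongoAdmin admin = new MongoAdmin(new Mongo("localhost"));
		admin.setUsername("admin");                   // hypothetical credentials
		admin.setPassword("secret");
		admin.setAuthenticationDatabaseName("admin"); // the new property introduced above

		// JMX-exposed operation; assumed from the MongoAdminOperations interface.
		admin.dropDatabase("scratch");
	}
}
```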
@@ -33,7 +33,6 @@ import com.mongodb.Mongo;
 * @author Graeme Rocher
 * @author Oliver Gierke
 * @author Randy Watler
 * @author Thomas Darimont
 * @since 1.0
 */
public abstract class MongoDbUtils {
@@ -55,7 +54,7 @@ public abstract class MongoDbUtils {
	 * @return the {@link DB} connection
	 */
	public static DB getDB(Mongo mongo, String databaseName) {
		return doGetDB(mongo, databaseName, UserCredentials.NO_CREDENTIALS, true, databaseName);
		return doGetDB(mongo, databaseName, UserCredentials.NO_CREDENTIALS, true);
	}

	/**
@@ -67,22 +66,15 @@ public abstract class MongoDbUtils {
	 * @return the {@link DB} connection
	 */
	public static DB getDB(Mongo mongo, String databaseName, UserCredentials credentials) {
		return getDB(mongo, databaseName, credentials, databaseName);
	}

	public static DB getDB(Mongo mongo, String databaseName, UserCredentials credentials,
			String authenticationDatabaseName) {

		Assert.notNull(mongo, "No Mongo instance specified!");
		Assert.hasText(databaseName, "Database name must be given!");
		Assert.notNull(credentials, "Credentials must not be null, use UserCredentials.NO_CREDENTIALS!");
		Assert.hasText(authenticationDatabaseName, "Authentication database name must not be null or empty!");

		return doGetDB(mongo, databaseName, credentials, true, authenticationDatabaseName);
		return doGetDB(mongo, databaseName, credentials, true);
	}

	private static DB doGetDB(Mongo mongo, String databaseName, UserCredentials credentials, boolean allowCreate,
			String authenticationDatabaseName) {
	private static DB doGetDB(Mongo mongo, String databaseName, UserCredentials credentials, boolean allowCreate) {

		DbHolder dbHolder = (DbHolder) TransactionSynchronizationManager.getResource(mongo);

@@ -111,16 +103,14 @@ public abstract class MongoDbUtils {
		DB db = mongo.getDB(databaseName);
		boolean credentialsGiven = credentials.hasUsername() && credentials.hasPassword();

		DB authDb = databaseName.equals(authenticationDatabaseName) ? db : mongo.getDB(authenticationDatabaseName);
		synchronized (db) {

		synchronized (authDb) {

			if (credentialsGiven && !authDb.isAuthenticated()) {
			if (credentialsGiven && !db.isAuthenticated()) {

				String username = credentials.getUsername();
				String password = credentials.hasPassword() ? credentials.getPassword() : null;

				if (!authDb.authenticate(username, password == null ? null : password.toCharArray())) {
				if (!db.authenticate(username, password == null ? null : password.toCharArray())) {
					throw new CannotGetMongoDbConnectionException("Failed to authenticate to database [" + databaseName + "], "
							+ credentials.toString(), databaseName, credentials);
				}
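The new overload above lets callers authenticate against a database other than the one they work with. A short sketch based directly on the signature introduced in this hunk (host, database and credentials are placeholders):

```java
import com.mongodb.DB;
import com.mongodb.Mongo;

import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.core.MongoDbUtils;

public class DbUtilsSample {

	public static void main(String[] args) throws Exception {

		Mongo mongo = new Mongo("localhost");

		// Work with "appdb" while authenticating against "admin", using the overload added above.
		DB db = MongoDbUtils.getDB(mongo, "appdb", new UserCredentials("appUser", "secret"), "admin");

		System.out.println(db.getName());
	}
}
```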
@@ -1,5 +1,5 @@
/*
 * Copyright 2010-2013 the original author or authors.
 * Copyright 2010-2012 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -148,7 +148,6 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
	 * (non-Javadoc)
	 * @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
	 */
	@SuppressWarnings("deprecation")
	public void afterPropertiesSet() throws Exception {

		Mongo mongo;
@@ -1,5 +1,5 @@
/*
 * Copyright 2010-2013 the original author or authors.
 * Copyright 2010-2011 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -19,9 +19,6 @@ import java.util.Collection;
import java.util.List;
import java.util.Set;

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
@@ -48,8 +45,6 @@ import com.mongodb.WriteResult;
 * @author Thomas Risberg
 * @author Mark Pollack
 * @author Oliver Gierke
 * @author Tobias Trelle
 * @author Chuong Ngo
 */
public interface MongoOperations {

@@ -306,57 +301,6 @@ public interface MongoOperations {
	 */
	<T> GroupByResults<T> group(Criteria criteria, String inputCollectionName, GroupBy groupBy, Class<T> entityClass);

	/**
	 * Execute an aggregation operation. The raw results will be mapped to the given entity class. The name of the
	 * inputCollection is derived from the inputType of the aggregation.
	 *
	 * @param aggregation The {@link TypedAggregation} specification holding the aggregation operations, must not be
	 *          {@literal null}.
	 * @param collectionName The name of the input collection to use for the aggregation.
	 * @param outputType The parameterized type of the returned list, must not be {@literal null}.
	 * @return The results of the aggregation operation.
	 * @since 1.3
	 */
	<O> AggregationResults<O> aggregate(TypedAggregation<?> aggregation, String collectionName, Class<O> outputType);

	/**
	 * Execute an aggregation operation. The raw results will be mapped to the given entity class. The name of the
	 * inputCollection is derived from the inputType of the aggregation.
	 *
	 * @param aggregation The {@link TypedAggregation} specification holding the aggregation operations, must not be
	 *          {@literal null}.
	 * @param outputType The parameterized type of the returned list, must not be {@literal null}.
	 * @return The results of the aggregation operation.
	 * @since 1.3
	 */
	<O> AggregationResults<O> aggregate(TypedAggregation<?> aggregation, Class<O> outputType);

	/**
	 * Execute an aggregation operation. The raw results will be mapped to the given entity class.
	 *
	 * @param aggregation The {@link Aggregation} specification holding the aggregation operations, must not be
	 *          {@literal null}.
	 * @param inputType the inputType where the aggregation operation will read from, must not be {@literal null} or
	 *          empty.
	 * @param outputType The parameterized type of the returned list, must not be {@literal null}.
	 * @return The results of the aggregation operation.
	 * @since 1.3
	 */
	<O> AggregationResults<O> aggregate(Aggregation aggregation, Class<?> inputType, Class<O> outputType);

	/**
	 * Execute an aggregation operation. The raw results will be mapped to the given entity class.
	 *
	 * @param aggregation The {@link Aggregation} specification holding the aggregation operations, must not be
	 *          {@literal null}.
	 * @param collectionName the collection where the aggregation operation will read from, must not be {@literal null} or
	 *          empty.
	 * @param outputType The parameterized type of the returned list, must not be {@literal null}.
	 * @return The results of the aggregation operation.
	 * @since 1.3
	 */
	<O> AggregationResults<O> aggregate(Aggregation aggregation, String collectionName, Class<O> outputType);
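The `aggregate(…)` overloads above are the entry point to the aggregation framework introduced in 1.3. A hedged usage sketch follows; the entity, its fields and the result DTO are invented for illustration, and the mapping of the group key onto the DTO is simplified.

```java
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import java.util.List;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.query.Criteria;

public class AggregationSample {

	// Hypothetical input document.
	static class Person {
		String name;
		String country;
		int age;
	}

	// Hypothetical output type the "result" documents are mapped to.
	static class CountPerCountry {
		String country;
		long total;
	}

	public static void listCounts(MongoOperations operations) {

		// Pipeline: keep adults, then count them per country.
		TypedAggregation<Person> aggregation = newAggregation(Person.class,
				match(Criteria.where("age").gte(18)),
				group("country").count().as("total"));

		// Note: the group key ends up in "_id"; mapping it onto the DTO is glossed over here.
		AggregationResults<CountPerCountry> results = operations.aggregate(aggregation, CountPerCountry.class);
		List<CountPerCountry> counts = results.getMappedResults();
		System.out.println(counts.size());
	}
}
```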
	/**
	 * Execute a map-reduce operation. The map-reduce operation will be formed with an output type of INLINE
	 *
@@ -464,16 +408,11 @@ public interface MongoOperations {
	 *          specification
	 * @param entityClass the parameterized type of the returned list.
	 * @param collectionName name of the collection to retrieve the objects from
	 *
	 * @return the converted object
	 */
	<T> T findOne(Query query, Class<T> entityClass, String collectionName);

	boolean exists(Query query, String collectionName);

	boolean exists(Query query, Class<?> entityClass);

	boolean exists(Query query, Class<?> entityClass, String collectionName);

	/**
	 * Map the results of an ad-hoc query on the collection for the entity class to a List of the specified type.
	 * <p/>
@@ -503,6 +442,7 @@ public interface MongoOperations {
	 *          specification
	 * @param entityClass the parameterized type of the returned list.
	 * @param collectionName name of the collection to retrieve the objects from
	 *
	 * @return the List of converted objects
	 */
	<T> List<T> find(Query query, Class<T> entityClass, String collectionName);
@@ -524,6 +464,7 @@ public interface MongoOperations {
	 * @param id the id of the document to return
	 * @param entityClass the type to convert the document to
	 * @param collectionName the collection to query for the document
	 *
	 * @param <T>
	 * @return
	 */
@@ -569,6 +510,7 @@ public interface MongoOperations {
	 *          specification
	 * @param entityClass the parameterized type of the returned list.
	 * @param collectionName name of the collection to retrieve the objects from
	 *
	 * @return the converted object
	 */
	<T> T findAndRemove(Query query, Class<T> entityClass, String collectionName);
@@ -704,18 +646,6 @@ public interface MongoOperations {
	 */
	WriteResult upsert(Query query, Update update, String collectionName);

	/**
	 * Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
	 * combining the query document and the update document.
	 *
	 * @param query the query document that specifies the criteria used to select a record to be upserted
	 * @param update the update document that contains the updated object or $ operators to manipulate the existing object
	 * @param entityClass class of the pojo to be operated on
	 * @param collectionName name of the collection to update the object in
	 * @return the WriteResult which lets you access the results of the previous write.
	 */
	WriteResult upsert(Query query, Update update, Class<?> entityClass, String collectionName);
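For the upsert contract described above, a short sketch using the `Query`/`Update` builders; the collection and field names are placeholders.

```java
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import static org.springframework.data.mongodb.core.query.Update.*;

import org.springframework.data.mongodb.core.MongoOperations;

import com.mongodb.WriteResult;

public class UpsertSample {

	public static void upsertCounter(MongoOperations operations) {

		// Creates the document if no matching counter exists yet,
		// otherwise applies the update to the first match.
		WriteResult result = operations.upsert(
				query(where("name").is("pageViews")),
				update("value", 0),
				"counters");

		System.out.println(result.getN());
	}
}
```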
	/**
	 * Updates the first object that is found in the collection of the entity class that matches the query document with
	 * the provided update document.
@@ -740,19 +670,6 @@ public interface MongoOperations {
	 */
	WriteResult updateFirst(Query query, Update update, String collectionName);

	/**
	 * Updates the first object that is found in the specified collection that matches the query document criteria with
	 * the provided updated document.
	 *
	 * @param query the query document that specifies the criteria used to select a record to be updated
	 * @param update the update document that contains the updated object or $ operators to manipulate the existing
	 *          object.
	 * @param entityClass class of the pojo to be operated on
	 * @param collectionName name of the collection to update the object in
	 * @return the WriteResult which lets you access the results of the previous write.
	 */
	WriteResult updateFirst(Query query, Update update, Class<?> entityClass, String collectionName);

	/**
	 * Updates all objects that are found in the collection for the entity class that matches the query document criteria
	 * with the provided updated document.
@@ -777,19 +694,6 @@ public interface MongoOperations {
	 */
	WriteResult updateMulti(Query query, Update update, String collectionName);

	/**
	 * Updates all objects that are found in the collection for the entity class that matches the query document criteria
	 * with the provided updated document.
	 *
	 * @param query the query document that specifies the criteria used to select a record to be updated
	 * @param update the update document that contains the updated object or $ operators to manipulate the existing
	 *          object.
	 * @param entityClass class of the pojo to be operated on
	 * @param collectionName name of the collection to update the object in
	 * @return the WriteResult which lets you access the results of the previous write.
	 */
	WriteResult updateMulti(final Query query, final Update update, Class<?> entityClass, String collectionName);

	/**
	 * Remove the given object from the collection by id.
	 *
@@ -809,12 +713,11 @@ public interface MongoOperations {
	 * Remove all documents that match the provided query document criteria from the collection used to store the
	 * entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
	 *
	 * @param <T>
	 * @param query
	 * @param entityClass
	 */
	void remove(Query query, Class<?> entityClass);

	void remove(Query query, Class<?> entityClass, String collectionName);
	<T> void remove(Query query, Class<T> entityClass);

	/**
	 * Remove all documents from the specified collection that match the provided query document criteria. There is no
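To round off the update and remove contracts above, a hedged sketch combining `updateMulti` and `remove`; the document type and field names are invented.

```java
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import static org.springframework.data.mongodb.core.query.Update.*;

import org.springframework.data.mongodb.core.MongoOperations;

public class UpdateAndRemoveSample {

	// Hypothetical document type.
	static class Subscription {
		String id;
		String plan;
		boolean active;
	}

	public static void deactivateLegacyPlans(MongoOperations operations) {

		// Flip the flag on every matching document ...
		operations.updateMulti(query(where("plan").is("legacy")), update("active", false), Subscription.class);

		// ... and remove the ones that are no longer active.
		operations.remove(query(where("plan").is("legacy").and("active").is(false)), Subscription.class);
	}
}
```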
@@ -1,5 +1,5 @@
/*
 * Copyright 2010-2013 the original author or authors.
 * Copyright 2010-2011 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -15,91 +15,129 @@
 */
package org.springframework.data.mongodb.core;

import javax.net.ssl.SSLSocketFactory;
import com.mongodb.MongoOptions;

import org.springframework.beans.factory.FactoryBean;
import org.springframework.beans.factory.InitializingBean;

import com.mongodb.MongoOptions;

/**
 * A factory bean for construction of a {@link MongoOptions} instance.
 * A factory bean for construction of a MongoOptions instance
 *
 * @author Graeme Rocher
 * @author Mark Pollack
 * @author Mike Saavedra
 * @author Thomas Darimont
 * @Author Mark Pollack
 */
public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, InitializingBean {

	@SuppressWarnings("deprecation")//
	private final MongoOptions MONGO_OPTIONS = new MongoOptions();

	private static final MongoOptions MONGO_OPTIONS = new MongoOptions();
	/**
	 * number of connections allowed per host will block if run out
	 */
	private int connectionsPerHost = MONGO_OPTIONS.connectionsPerHost;

	/**
	 * multiplier for connectionsPerHost for # of threads that can block if connectionsPerHost is 10, and
	 * threadsAllowedToBlockForConnectionMultiplier is 5, then 50 threads can block more than that and an exception will
	 * be throw
	 */
	private int threadsAllowedToBlockForConnectionMultiplier = MONGO_OPTIONS.threadsAllowedToBlockForConnectionMultiplier;
	private int maxWaitTime = MONGO_OPTIONS.maxWaitTime;
	private int connectTimeout = MONGO_OPTIONS.connectTimeout;
	private int socketTimeout = MONGO_OPTIONS.socketTimeout;
	private boolean socketKeepAlive = MONGO_OPTIONS.socketKeepAlive;
	private boolean autoConnectRetry = MONGO_OPTIONS.autoConnectRetry;
	private long maxAutoConnectRetryTime = MONGO_OPTIONS.maxAutoConnectRetryTime;
	private int writeNumber;
	private int writeTimeout;
	private boolean writeFsync;
	@SuppressWarnings("deprecation") private boolean slaveOk = MONGO_OPTIONS.slaveOk;
	private boolean ssl;
	private SSLSocketFactory sslSocketFactory;

	/**
	 * Configures the maximum number of connections allowed per host until we will block.
	 * max wait time of a blocking thread for a connection
	 */
	private int maxWaitTime = MONGO_OPTIONS.maxWaitTime;

	/**
	 * connect timeout in milliseconds. 0 is default and infinite
	 */
	private int connectTimeout = MONGO_OPTIONS.connectTimeout;

	/**
	 * socket timeout. 0 is default and infinite
	 */
	private int socketTimeout = MONGO_OPTIONS.socketTimeout;

	/**
	 * This controls whether or not to have socket keep alive turned on (SO_KEEPALIVE).
	 *
	 * @param connectionsPerHost
	 * defaults to false
	 */
	public boolean socketKeepAlive = MONGO_OPTIONS.socketKeepAlive;

	/**
	 * this controls whether or not on a connect, the system retries automatically
	 */
	private boolean autoConnectRetry = MONGO_OPTIONS.autoConnectRetry;

	private long maxAutoConnectRetryTime = MONGO_OPTIONS.maxAutoConnectRetryTime;

	/**
	 * This specifies the number of servers to wait for on the write operation, and exception raising behavior.
	 *
	 * Defaults to 0.
	 */
	private int writeNumber;

	/**
	 * This controls timeout for write operations in milliseconds.
	 *
	 * Defaults to 0 (indefinite). Greater than zero is number of milliseconds to wait.
	 */
	private int writeTimeout;

	/**
	 * This controls whether or not to fsync.
	 *
	 * Defaults to false.
	 */
	private boolean writeFsync;

	/**
	 * Specifies if the driver is allowed to read from secondaries or slaves.
	 *
	 * Defaults to false
	 */
	@SuppressWarnings("deprecation")
	private boolean slaveOk = MONGO_OPTIONS.slaveOk;
	/**
	 * number of connections allowed per host will block if run out
	 */
	public void setConnectionsPerHost(int connectionsPerHost) {
		this.connectionsPerHost = connectionsPerHost;
	}

	/**
	 * A multiplier for connectionsPerHost for # of threads that can block a connection. If connectionsPerHost is 10, and
	 * threadsAllowedToBlockForConnectionMultiplier is 5, then 50 threads can block. If more threads try to block an
	 * exception will be thrown.
	 *
	 * @param threadsAllowedToBlockForConnectionMultiplier
	 *          multiplier for connectionsPerHost for # of threads that can block if connectionsPerHost is 10, and
	 *          threadsAllowedToBlockForConnectionMultiplier is 5, then 50 threads can block more than that and an exception will
	 *          be throw
	 */
	public void setThreadsAllowedToBlockForConnectionMultiplier(int threadsAllowedToBlockForConnectionMultiplier) {
		this.threadsAllowedToBlockForConnectionMultiplier = threadsAllowedToBlockForConnectionMultiplier;
	}

	/**
	 * Max wait time of a blocking thread for a connection.
	 *
	 * @param maxWaitTime
	 *          max wait time of a blocking thread for a connection
	 */
	public void setMaxWaitTime(int maxWaitTime) {
		this.maxWaitTime = maxWaitTime;
	}

	/**
	 * Configures the connect timeout in milliseconds. Defaults to 0 (infinite time).
	 *
	 * @param connectTimeout
	 *          connect timeout in milliseconds. 0 is default and infinite
	 */
	public void setConnectTimeout(int connectTimeout) {
		this.connectTimeout = connectTimeout;
	}

	/**
	 * Configures the socket timeout. Defaults to 0 (infinite time).
	 *
	 * @param socketTimeout
	 *          socket timeout. 0 is default and infinite
	 */
	public void setSocketTimeout(int socketTimeout) {
		this.socketTimeout = socketTimeout;
	}

	/**
	 * Configures whether or not to have socket keep alive turned on (SO_KEEPALIVE). Defaults to {@literal false}.
	 * This controls whether or not to have socket keep alive
	 *
	 * @param socketKeepAlive
	 */
@@ -124,33 +162,33 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
	}

	/**
	 * Configures the timeout for write operations in milliseconds. This defaults to {@literal 0} (indefinite).
	 * This controls timeout for write operations in milliseconds. The 'wtimeout' option to the getlasterror command.
	 *
	 * @param writeTimeout
	 * @param writeTimeout Defaults to 0 (indefinite). Greater than zero is number of milliseconds to wait.
	 */
	public void setWriteTimeout(int writeTimeout) {
		this.writeTimeout = writeTimeout;
	}

	/**
	 * Configures whether or not to fsync. The 'fsync' option to the getlasterror command. Defaults to {@literal false}.
	 * This controls whether or not to fsync. The 'fsync' option to the getlasterror command. Defaults to false.
	 *
	 * @param writeFsync to fsync on <code>write (true)<code>, otherwise {@literal false}.
	 * @param writeFsync to fsync on write (true), otherwise false.
	 */
	public void setWriteFsync(boolean writeFsync) {
		this.writeFsync = writeFsync;
	}

	/**
	 * Configures whether or not the system retries automatically on a failed connect. This defaults to {@literal false}.
	 * this controls whether or not on a connect, the system retries automatically
	 */
	public void setAutoConnectRetry(boolean autoConnectRetry) {
		this.autoConnectRetry = autoConnectRetry;
	}

	/**
	 * Configures the maximum amount of time in millisecons to spend retrying to open connection to the same server. This
	 * defaults to {@literal 0}, which means to use the default {@literal 15s} if {@link #autoConnectRetry} is on.
	 * The maximum amount of time in millisecons to spend retrying to open connection to the same server. Default is 0,
	 * which means to use the default 15s if autoConnectRetry is on.
	 *
	 * @param maxAutoConnectRetryTime the maxAutoConnectRetryTime to set
	 */
@@ -159,7 +197,7 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
	}

	/**
	 * Specifies if the driver is allowed to read from secondaries or slaves. Defaults to {@literal false}.
	 * Specifies if the driver is allowed to read from secondaries or slaves. Defaults to false.
	 *
	 * @param slaveOk true if the driver should read from secondaries or slaves.
	 */
@@ -167,39 +205,8 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
		this.slaveOk = slaveOk;
	}

	/**
	 * Specifies if the driver should use an SSL connection to Mongo. This defaults to {@literal false}. By default
	 * {@link SSLSocketFactory#getDefault()} will be used. See {@link #setSslSocketFactory(SSLSocketFactory)} if you want
	 * to configure a custom factory.
	 *
	 * @param ssl true if the driver should use an SSL connection.
	 * @see #setSslSocketFactory(SSLSocketFactory)
	 */
	public void setSsl(boolean ssl) {
		this.ssl = ssl;
	}

	/**
	 * Specifies the {@link SSLSocketFactory} to use for creating SSL connections to Mongo. Defaults to
	 * {@link SSLSocketFactory#getDefault()}. Implicitly activates {@link #setSsl(boolean)} if a non-{@literal null} value
	 * is given.
	 *
	 * @param sslSocketFactory the sslSocketFactory to use.
	 * @see #setSsl(boolean)
	 */
	public void setSslSocketFactory(SSLSocketFactory sslSocketFactory) {

		setSsl(sslSocketFactory != null);
		this.sslSocketFactory = sslSocketFactory;
	}

	/*
	 * (non-Javadoc)
	 * @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
	 */
	@SuppressWarnings("deprecation")
	public void afterPropertiesSet() {

		MONGO_OPTIONS.connectionsPerHost = connectionsPerHost;
		MONGO_OPTIONS.threadsAllowedToBlockForConnectionMultiplier = threadsAllowedToBlockForConnectionMultiplier;
		MONGO_OPTIONS.maxWaitTime = maxWaitTime;
@@ -212,32 +219,18 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
		MONGO_OPTIONS.w = writeNumber;
		MONGO_OPTIONS.wtimeout = writeTimeout;
		MONGO_OPTIONS.fsync = writeFsync;
		if (ssl) {
			MONGO_OPTIONS.setSocketFactory(sslSocketFactory != null ? sslSocketFactory : SSLSocketFactory.getDefault());
		}
	}

	/*
	 * (non-Javadoc)
	 * @see org.springframework.beans.factory.FactoryBean#getObject()
	 */
	public MongoOptions getObject() {
		return MONGO_OPTIONS;
	}

	/*
	 * (non-Javadoc)
	 * @see org.springframework.beans.factory.FactoryBean#getObjectType()
	 */
	public Class<?> getObjectType() {
		return MongoOptions.class;
	}

	/*
	 * (non-Javadoc)
	 * @see org.springframework.beans.factory.FactoryBean#isSingleton()
	 */
	public boolean isSingleton() {
		return true;
	}

}
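Taken together, the factory bean above simply copies its properties onto a `MongoOptions` instance in `afterPropertiesSet()` and activates SSL when requested. A programmatic usage sketch, with arbitrary values and defaults assumed for everything not set:

```java
import com.mongodb.Mongo;
import com.mongodb.MongoOptions;
import com.mongodb.ServerAddress;

import org.springframework.data.mongodb.core.MongoOptionsFactoryBean;

public class OptionsSample {

	public static void main(String[] args) throws Exception {

		MongoOptionsFactoryBean factory = new MongoOptionsFactoryBean();
		factory.setConnectionsPerHost(25);
		factory.setConnectTimeout(2000);
		factory.setSocketTimeout(5000);
		factory.setSsl(true); // falls back to SSLSocketFactory.getDefault() unless a custom factory is set
		factory.afterPropertiesSet();

		MongoOptions options = factory.getObject();
		Mongo mongo = new Mongo(new ServerAddress("localhost"), options);
		System.out.println(mongo.getMongoOptions());
	}
}
```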
@@ -46,27 +46,18 @@ import org.springframework.core.io.ResourceLoader;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.OptimisticLockingFailureException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.convert.EntityReader;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.BeanWrapper;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.MongoDataIntegrityViolationException;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.Fields;
import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
@@ -78,11 +69,9 @@ import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.data.mongodb.core.mapping.event.AfterConvertEvent;
import org.springframework.data.mongodb.core.mapping.event.AfterDeleteEvent;
import org.springframework.data.mongodb.core.mapping.event.AfterLoadEvent;
import org.springframework.data.mongodb.core.mapping.event.AfterSaveEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeDeleteEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeSaveEvent;
import org.springframework.data.mongodb.core.mapping.event.MongoMappingEvent;
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
@@ -123,10 +112,6 @@ import com.mongodb.util.JSONParseException;
 * @author Oliver Gierke
 * @author Amol Nayak
 * @author Patryk Wasik
 * @author Tobias Trelle
 * @author Sebastian Herold
 * @author Thomas Darimont
 * @author Chuong Ngo
 */
public class MongoTemplate implements MongoOperations, ApplicationContextAware {
@@ -148,9 +133,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
	private final MongoConverter mongoConverter;
	private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
	private final MongoDbFactory mongoDbFactory;
	private final PersistenceExceptionTranslator exceptionTranslator;
	private final QueryMapper queryMapper;
	private final UpdateMapper updateMapper;
	private final MongoExceptionTranslator exceptionTranslator = new MongoExceptionTranslator();
	private final QueryMapper mapper;

	private WriteConcern writeConcern;
	private WriteConcernResolver writeConcernResolver = DefaultWriteConcernResolver.INSTANCE;
@@ -202,10 +186,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
		Assert.notNull(mongoDbFactory);

		this.mongoDbFactory = mongoDbFactory;
		this.exceptionTranslator = mongoDbFactory.getExceptionTranslator();
		this.mongoConverter = mongoConverter == null ? getDefaultMongoConverter(mongoDbFactory) : mongoConverter;
		this.queryMapper = new QueryMapper(this.mongoConverter);
		this.updateMapper = new UpdateMapper(this.mongoConverter);
		this.mapper = new QueryMapper(this.mongoConverter);

		// We always have a mapping context in the converter, whether it's a simple one or not
		mappingContext = this.mongoConverter.getMappingContext();
@@ -498,24 +480,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
		}
	}

	public boolean exists(Query query, Class<?> entityClass) {
		return exists(query, entityClass, determineCollectionName(entityClass));
	}

	public boolean exists(Query query, String collectionName) {
		return exists(query, null, collectionName);
	}

	public boolean exists(Query query, Class<?> entityClass, String collectionName) {

		if (query == null) {
			throw new InvalidDataAccessApiUsageException("Query passed in to exist can't be null");
		}

		DBObject mappedQuery = queryMapper.getMappedObject(query.getQueryObject(), getPersistentEntity(entityClass));
		return execute(collectionName, new FindCallback(mappedQuery)).hasNext();
	}
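The `exists(…)` methods above are implemented in terms of the generic `execute(…)` callback, which is also available to application code. A hedged sketch; the collection name is a placeholder.

```java
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoOperations;

import com.mongodb.DBCollection;
import com.mongodb.MongoException;

public class ExecuteCallbackSample {

	public static long countDocuments(MongoOperations operations) {

		// Runs arbitrary driver code against the "people" collection,
		// with exception translation applied by the template.
		return operations.execute("people", new CollectionCallback<Long>() {
			public Long doInCollection(DBCollection collection) throws MongoException, DataAccessException {
				return collection.count();
			}
		});
	}
}
```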
	// Find methods that take a Query to express the query and that return a List of objects.

	public <T> List<T> find(Query query, Class<T> entityClass) {
@@ -639,7 +603,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
	private long count(Query query, Class<?> entityClass, String collectionName) {

		Assert.hasText(collectionName);
		final DBObject dbObject = query == null ? null : queryMapper.getMappedObject(query.getQueryObject(),
		final DBObject dbObject = query == null ? null : mapper.getMappedObject(query.getQueryObject(),
				entityClass == null ? null : mappingContext.getPersistentEntity(entityClass));

		return execute(collectionName, new CollectionCallback<Long>() {
@@ -704,9 +668,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {

		initializeVersionProperty(objectToSave);

		maybeEmitEvent(new BeforeConvertEvent<T>(objectToSave));
		BasicDBObject dbDoc = new BasicDBObject();

		DBObject dbDoc = toDbObject(objectToSave, writer);
		maybeEmitEvent(new BeforeConvertEvent<T>(objectToSave));
		writer.write(objectToSave, dbDoc);

		maybeEmitEvent(new BeforeSaveEvent<T>(objectToSave, dbDoc));
		Object id = insertDBObject(collectionName, dbDoc, objectToSave.getClass());
@@ -715,26 +680,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
		maybeEmitEvent(new AfterSaveEvent<T>(objectToSave, dbDoc));
	}

	/**
	 * @param objectToSave
	 * @param writer
	 * @return
	 */
	private <T> DBObject toDbObject(T objectToSave, MongoWriter<T> writer) {

		if (!(objectToSave instanceof String)) {
			DBObject dbDoc = new BasicDBObject();
			writer.write(objectToSave, dbDoc);
			return dbDoc;
		} else {
			try {
				return (DBObject) JSON.parse((String) objectToSave);
			} catch (JSONParseException e) {
				throw new MappingException("Could not parse given String to save into a JSON document!", e);
			}
		}
	}

	private void initializeVersionProperty(Object entity) {

		MongoPersistentEntity<?> mongoPersistentEntity = getPersistentEntity(entity.getClass());
@@ -874,9 +819,19 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {

		assertUpdateableIdIfNotSet(objectToSave);

		DBObject dbDoc = new BasicDBObject();

		maybeEmitEvent(new BeforeConvertEvent<T>(objectToSave));

		DBObject dbDoc = toDbObject(objectToSave, writer);
		if (!(objectToSave instanceof String)) {
			writer.write(objectToSave, dbDoc);
		} else {
			try {
				dbDoc = (DBObject) JSON.parse((String) objectToSave);
			} catch (JSONParseException e) {
				throw new MappingException("Could not parse given String to save into a JSON document!", e);
			}
		}

		maybeEmitEvent(new BeforeSaveEvent<T>(objectToSave, dbDoc));
		Object id = saveDBObject(collectionName, dbDoc, objectToSave.getClass());
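The extracted `toDbObject(…)` helper above lets `insert`/`save` accept a plain JSON `String` in addition to a mapped object. A hedged sketch; the collection name and payload are made up.

```java
import org.springframework.data.mongodb.core.MongoOperations;

public class JsonSaveSample {

	public static void saveRawJson(MongoOperations operations) {

		// The String is parsed via JSON.parse(...) as shown above;
		// invalid JSON results in a MappingException.
		operations.save("{ \"firstname\" : \"Joe\", \"age\" : 34 }", "people");
	}
}
```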
@@ -960,10 +915,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
		return doUpdate(collectionName, query, update, null, true, false);
	}

	public WriteResult upsert(Query query, Update update, Class<?> entityClass, String collectionName) {
		return doUpdate(collectionName, query, update, entityClass, true, false);
	}

	public WriteResult updateFirst(Query query, Update update, Class<?> entityClass) {
		return doUpdate(determineCollectionName(entityClass), query, update, entityClass, false, false);
	}
@@ -972,10 +923,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
		return doUpdate(collectionName, query, update, null, false, false);
	}

	public WriteResult updateFirst(Query query, Update update, Class<?> entityClass, String collectionName) {
		return doUpdate(collectionName, query, update, entityClass, false, false);
	}

	public WriteResult updateMulti(Query query, Update update, Class<?> entityClass) {
		return doUpdate(determineCollectionName(entityClass), query, update, entityClass, false, true);
	}
@@ -984,10 +931,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
		return doUpdate(collectionName, query, update, null, false, true);
	}

	public WriteResult updateMulti(final Query query, final Update update, Class<?> entityClass, String collectionName) {
		return doUpdate(collectionName, query, update, entityClass, false, true);
	}

	protected WriteResult doUpdate(final String collectionName, final Query query, final Update update,
			final Class<?> entityClass, final boolean upsert, final boolean multi) {

@@ -996,10 +939,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {

		MongoPersistentEntity<?> entity = entityClass == null ? null : getPersistentEntity(entityClass);

		DBObject queryObj = query == null ? new BasicDBObject() : queryMapper.getMappedObject(query.getQueryObject(),
		DBObject queryObj = query == null ? new BasicDBObject()
				: mapper.getMappedObject(query.getQueryObject(), entity);
		DBObject updateObj = update == null ? new BasicDBObject() : mapper.getMappedObject(update.getUpdateObject(),
				entity);
		DBObject updateObj = update == null ? new BasicDBObject() : updateMapper.getMappedObject(
				update.getUpdateObject(), entity);

		if (LOGGER.isDebugEnabled()) {
			LOGGER.debug("Calling update using query: " + queryObj + " and update: " + updateObj + " in collection: "
@@ -1089,35 +1032,24 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
		}
	}

	public void remove(Query query, String collectionName) {
		remove(query, null, collectionName);
	}

	public void remove(Query query, Class<?> entityClass) {
		remove(query, entityClass, determineCollectionName(entityClass));
	}

	public void remove(Query query, Class<?> entityClass, String collectionName) {
		doRemove(collectionName, query, entityClass);
	public <T> void remove(Query query, Class<T> entityClass) {
		Assert.notNull(query);
		doRemove(determineCollectionName(entityClass), query, entityClass);
	}

	protected <T> void doRemove(final String collectionName, final Query query, final Class<T> entityClass) {

		if (query == null) {
			throw new InvalidDataAccessApiUsageException("Query passed in to remove can't be null!");
			throw new InvalidDataAccessApiUsageException("Query passed in to remove can't be null");
		}

		Assert.hasText(collectionName, "Collection name must not be null or empty!");

		final DBObject queryObject = query.getQueryObject();
		final MongoPersistentEntity<?> entity = getPersistentEntity(entityClass);

		execute(collectionName, new CollectionCallback<Void>() {
			public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {

				maybeEmitEvent(new BeforeDeleteEvent<T>(queryObject, entityClass));

				DBObject dboq = queryMapper.getMappedObject(queryObject, entity);
				DBObject dboq = mapper.getMappedObject(queryObject, entity);

				MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.REMOVE, collectionName,
						entityClass, null, queryObject);
@@ -1130,14 +1062,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
				WriteResult wr = writeConcernToUse == null ? collection.remove(dboq) : collection.remove(dboq,
						writeConcernToUse);
				handleAnyWriteResultErrors(wr, dboq, MongoActionOperation.REMOVE);

				maybeEmitEvent(new AfterDeleteEvent<T>(queryObject, entityClass));

				return null;
			}
		});
	}

	public void remove(final Query query, String collectionName) {
		doRemove(collectionName, query, null);
	}
	public <T> List<T> findAll(Class<T> entityClass) {
		return executeFindMultiInternal(new FindCallback(null), null, new ReadDbObjectCallback<T>(mongoConverter,
				entityClass), determineCollectionName(entityClass));
@@ -1212,7 +1145,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
		if (criteria == null) {
			dbo.put("cond", null);
		} else {
			dbo.put("cond", queryMapper.getMappedObject(criteria.getCriteriaObject(), null));
			dbo.put("cond", mapper.getMappedObject(criteria.getCriteriaObject(), null));
		}
		// If initial document was a JavaScript string, potentially loaded by Spring's Resource abstraction, load it and
		// convert to DBObject
@@ -1261,64 +1194,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {

	}

	@Override
	public <O> AggregationResults<O> aggregate(TypedAggregation<?> aggregation, Class<O> outputType) {
		return aggregate(aggregation, determineCollectionName(aggregation.getInputType()), outputType);
	}

	@Override
	public <O> AggregationResults<O> aggregate(TypedAggregation<?> aggregation, String inputCollectionName,
			Class<O> outputType) {

		Assert.notNull(aggregation, "Aggregation pipeline must not be null!");

		AggregationOperationContext context = new TypeBasedAggregationOperationContext(aggregation.getInputType(),
				mappingContext, queryMapper);
		return aggregate(aggregation, inputCollectionName, outputType, context);
	}

	@Override
	public <O> AggregationResults<O> aggregate(Aggregation aggregation, Class<?> inputType, Class<O> outputType) {

		return aggregate(aggregation, determineCollectionName(inputType), outputType,
				new TypeBasedAggregationOperationContext(inputType, mappingContext, queryMapper));
	}

	@Override
	public <O> AggregationResults<O> aggregate(Aggregation aggregation, String collectionName, Class<O> outputType) {
		return aggregate(aggregation, collectionName, outputType, null);
	}

	protected <O> AggregationResults<O> aggregate(Aggregation aggregation, String collectionName, Class<O> outputType,
			AggregationOperationContext context) {

		Assert.hasText(collectionName, "Collection name must not be null or empty!");
		Assert.notNull(aggregation, "Aggregation pipeline must not be null!");
		Assert.notNull(outputType, "Output type must not be null!");

		AggregationOperationContext rootContext = context == null ? Aggregation.DEFAULT_CONTEXT : context;
		DBObject command = aggregation.toDbObject(collectionName, rootContext);

		if (LOGGER.isDebugEnabled()) {
			LOGGER.debug("Executing aggregation: {}", serializeToJsonSafely(command));
		}

		CommandResult commandResult = executeCommand(command);
		handleCommandError(commandResult, command);

		// map results
		@SuppressWarnings("unchecked")
		Iterable<DBObject> resultSet = (Iterable<DBObject>) commandResult.get("result");
		List<O> mappedResults = new ArrayList<O>();
		DbObjectCallback<O> callback = new UnwrapAndReadDbObjectCallback<O>(mongoConverter, outputType);

		for (DBObject dbObject : resultSet) {
			mappedResults.add(callback.doWith(dbObject));
		}

		return new AggregationResults<O>(mappedResults, commandResult);
	}

	protected String replaceWithResourceIfNecessary(String function) {

		String func = function;
@@ -1427,28 +1302,57 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
|
||||
}
|
||||
|
||||
	/**
	 * Map the results of an ad-hoc query on the default MongoDB collection to an object using the template's converter.
	 * The query document is specified as a standard {@link DBObject} and so is the fields specification.
	 * Map the results of an ad-hoc query on the default MongoDB collection to an object using the template's converter
	 * <p/>
	 * The query document is specified as a standard DBObject and so is the fields specification.
	 *
	 * @param collectionName name of the collection to retrieve the objects from.
	 * @param query the query document that specifies the criteria used to find a record.
	 * @param fields the document that specifies the fields to be returned.
	 * @param collectionName name of the collection to retrieve the objects from
	 * @param query the query document that specifies the criteria used to find a record
	 * @param fields the document that specifies the fields to be returned
	 * @param entityClass the parameterized type of the returned list.
	 * @return the {@link List} of converted objects.
	 * @return the List of converted objects.
	 */
	protected <T> T doFindOne(String collectionName, DBObject query, DBObject fields, Class<T> entityClass) {

		EntityReader<? super T, DBObject> readerToUse = this.mongoConverter;
		MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
		DBObject mappedQuery = queryMapper.getMappedObject(query, entity);
		DBObject mappedFields = fields == null ? null : queryMapper.getMappedObject(fields, entity);
		DBObject mappedQuery = mapper.getMappedObject(query, entity);

		if (LOGGER.isDebugEnabled()) {
			LOGGER.debug(String.format("findOne using query: %s fields: %s for class: %s in collection: %s",
					serializeToJsonSafely(query), mappedFields, entityClass, collectionName));
			return executeFindOneInternal(new FindOneCallback(mappedQuery, fields), new ReadDbObjectCallback<T>(readerToUse,
					entityClass), collectionName);
		}

		return executeFindOneInternal(new FindOneCallback(mappedQuery, mappedFields), new ReadDbObjectCallback<T>(
				this.mongoConverter, entityClass), collectionName);
	/**
	 * Map the results of an ad-hoc query on the default MongoDB collection to a List of the specified type. The object is
	 * converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless configured
	 * otherwise, an instance of MappingMongoConverter will be used. The query document is specified as a standard
	 * DBObject and so is the fields specification. Can be overridden by subclasses.
	 *
	 * @param collectionName name of the collection to retrieve the objects from
	 * @param query the query document that specifies the criteria used to find a record
	 * @param fields the document that specifies the fields to be returned
	 * @param entityClass the parameterized type of the returned list.
	 * @param preparer allows for customization of the DBCursor used when iterating over the result set, (apply limits,
	 *          skips and so on).
	 * @return the List of converted objects.
	 */
	protected <T> List<T> doFind(String collectionName, DBObject query, DBObject fields, Class<T> entityClass,
			CursorPreparer preparer) {
		return doFind(collectionName, query, fields, entityClass, preparer, new ReadDbObjectCallback<T>(mongoConverter,
				entityClass));
	}

	protected <S, T> List<T> doFind(String collectionName, DBObject query, DBObject fields, Class<S> entityClass,
			CursorPreparer preparer, DbObjectCallback<T> objectCallback) {

		MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);

		if (LOGGER.isDebugEnabled()) {
			LOGGER.debug(String.format("find using query: %s fields: %s for class: %s in collection: %s",
					serializeToJsonSafely(query), fields, entityClass, collectionName));
		}

		return executeFindMultiInternal(new FindCallback(mapper.getMappedObject(query, entity), fields), preparer,
				objectCallback, collectionName);
	}
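A minimal CursorPreparer sketch (illustrative, not part of the diff) showing the kind of customization the preparer parameter above allows, assuming only the DBCursor API of the 2.x Java driver:

```java
import org.springframework.data.mongodb.core.CursorPreparer;

import com.mongodb.DBCursor;

// Illustrative only: applies paging to the DBCursor before doFind(...) iterates it.
class PagingCursorPreparer implements CursorPreparer {

	private final int elementsToSkip;
	private final int maxElements;

	PagingCursorPreparer(int elementsToSkip, int maxElements) {
		this.elementsToSkip = elementsToSkip;
		this.maxElements = maxElements;
	}

	public DBCursor prepare(DBCursor cursor) {
		return cursor.skip(elementsToSkip).limit(maxElements);
	}
}
```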
	/**
@@ -1462,43 +1366,14 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
	 * @return the List of converted objects.
	 */
	protected <T> List<T> doFind(String collectionName, DBObject query, DBObject fields, Class<T> entityClass) {
		return doFind(collectionName, query, fields, entityClass, null, new ReadDbObjectCallback<T>(this.mongoConverter,
				entityClass));
	}

	/**
	 * Map the results of an ad-hoc query on the default MongoDB collection to a List of the specified type. The object is
	 * converted from the MongoDB native representation using an instance of {@see MongoConverter}. The query document is
	 * specified as a standard DBObject and so is the fields specification.
	 *
	 * @param collectionName name of the collection to retrieve the objects from.
	 * @param query the query document that specifies the criteria used to find a record.
	 * @param fields the document that specifies the fields to be returned.
	 * @param entityClass the parameterized type of the returned list.
	 * @param preparer allows for customization of the {@link DBCursor} used when iterating over the result set, (apply
	 *          limits, skips and so on).
	 * @return the {@link List} of converted objects.
	 */
	protected <T> List<T> doFind(String collectionName, DBObject query, DBObject fields, Class<T> entityClass,
			CursorPreparer preparer) {
		return doFind(collectionName, query, fields, entityClass, preparer, new ReadDbObjectCallback<T>(mongoConverter,
				entityClass));
	}

	protected <S, T> List<T> doFind(String collectionName, DBObject query, DBObject fields, Class<S> entityClass,
			CursorPreparer preparer, DbObjectCallback<T> objectCallback) {

		MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
		DBObject mappedFields = fields == null ? null : queryMapper.getMappedObject(fields, entity);
		DBObject mappedQuery = queryMapper.getMappedObject(query, entity);

		if (LOGGER.isDebugEnabled()) {
			LOGGER.debug(String.format("find using query: %s fields: %s for class: %s in collection: %s",
					serializeToJsonSafely(query), mappedFields, entityClass, collectionName));
			LOGGER.debug("find using query: " + query + " fields: " + fields + " for class: " + entityClass
					+ " in collection: " + collectionName);
		}

		return executeFindMultiInternal(new FindCallback(mappedQuery, mappedFields), preparer, objectCallback,
				collectionName);
		EntityReader<? super T, DBObject> readerToUse = this.mongoConverter;
		MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
		return executeFindMultiInternal(new FindCallback(mapper.getMappedObject(query, entity), fields), null,
				new ReadDbObjectCallback<T>(readerToUse, entityClass), collectionName);
	}
	protected DBObject convertToDbObject(CollectionOptions collectionOptions) {
@@ -1536,7 +1411,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
					+ entityClass + " in collection: " + collectionName);
		}
		MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
		return executeFindOneInternal(new FindAndRemoveCallback(queryMapper.getMappedObject(query, entity), fields, sort),
		return executeFindOneInternal(new FindAndRemoveCallback(mapper.getMappedObject(query, entity), fields, sort),
				new ReadDbObjectCallback<T>(readerToUse, entityClass), collectionName);
	}

@@ -1551,8 +1426,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {

		MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);

		DBObject mappedUpdate = queryMapper.getMappedObject(update.getUpdateObject(), entity);
		DBObject mappedQuery = queryMapper.getMappedObject(query, entity);
		DBObject mappedUpdate = mapper.getMappedObject(update.getUpdateObject(), entity);
		DBObject mappedQuery = mapper.getMappedObject(query, entity);

		if (LOGGER.isDebugEnabled()) {
			LOGGER.debug("findAndModify using query: " + mappedQuery + " fields: " + fields + " sort: " + sort
@@ -1827,9 +1702,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
	}

	private static final MongoConverter getDefaultMongoConverter(MongoDbFactory factory) {

		DbRefResolver dbRefResolver = new DefaultDbRefResolver(factory);
		MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, new MongoMappingContext());
		MappingMongoConverter converter = new MappingMongoConverter(factory, new MongoMappingContext());
		converter.afterPropertiesSet();
		return converter;
	}
@@ -1986,35 +1859,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
		}
	}

	class UnwrapAndReadDbObjectCallback<T> extends ReadDbObjectCallback<T> {

		public UnwrapAndReadDbObjectCallback(EntityReader<? super T, DBObject> reader, Class<T> type) {
			super(reader, type);
		}

		@Override
		public T doWith(DBObject object) {

			Object idField = object.get(Fields.UNDERSCORE_ID);

			if (!(idField instanceof DBObject)) {
				return super.doWith(object);
			}

			DBObject toMap = new BasicDBObject();
			DBObject nested = (DBObject) idField;
			toMap.putAll(nested);

			for (String key : object.keySet()) {
				if (!Fields.UNDERSCORE_ID.equals(key)) {
					toMap.put(key, object.get(key));
				}
			}

			return super.doWith(toMap);
		}
	}

	private enum DefaultWriteConcernResolver implements WriteConcernResolver {

		INSTANCE;
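To make the unwrapping above concrete, a small sketch (sample data only, not part of the diff) of the document shape UnwrapAndReadDbObjectCallback normalizes: a result whose _id is itself a document gets the nested keys hoisted to the top level before conversion.

```java
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

public class UnwrapShapeSketch {

	public static void main(String[] args) {

		// Input shape, e.g. produced by grouping on two fields (made-up sample values).
		DBObject id = new BasicDBObject("state", "CA").append("city", "Los Angeles");
		DBObject input = new BasicDBObject("_id", id).append("total", 42);

		// What the callback effectively hands on to ReadDbObjectCallback:
		// { "state" : "CA" , "city" : "Los Angeles" , "total" : 42 }
		DBObject toMap = new BasicDBObject();
		toMap.putAll((DBObject) input.get("_id"));
		for (String key : input.keySet()) {
			if (!"_id".equals(key)) {
				toMap.put(key, input.get(key));
			}
		}

		System.out.println(toMap);
	}
}
```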
@@ -1,5 +1,5 @@
/*
 * Copyright 2011-2013 the original author or authors.
 * Copyright 2011-2012 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -19,11 +19,9 @@ import java.net.UnknownHostException;

import org.springframework.beans.factory.DisposableBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;

import com.mongodb.DB;
import com.mongodb.Mongo;
@@ -36,7 +34,6 @@ import com.mongodb.WriteConcern;
 *
 * @author Mark Pollack
 * @author Oliver Gierke
 * @author Thomas Darimont
 */
public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {

@@ -44,9 +41,6 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
	private final String databaseName;
	private final boolean mongoInstanceCreated;
	private final UserCredentials credentials;
	private final PersistenceExceptionTranslator exceptionTranslator;
	private final String authenticationDatabaseName;

	private WriteConcern writeConcern;

	/**
@@ -56,7 +50,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
	 * @param databaseName database name, not be {@literal null} or empty.
	 */
	public SimpleMongoDbFactory(Mongo mongo, String databaseName) {
		this(mongo, databaseName, null);
		this(mongo, databaseName, UserCredentials.NO_CREDENTIALS, false);
	}

	/**
@@ -67,20 +61,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
	 * @param credentials username and password.
	 */
	public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials) {
		this(mongo, databaseName, credentials, false, null);
	}

	/**
	 * Create an instance of SimpleMongoDbFactory given the Mongo instance, database name, and username/password
	 *
	 * @param mongo Mongo instance, must not be {@literal null}.
	 * @param databaseName Database name, must not be {@literal null} or empty.
	 * @param credentials username and password.
	 * @param authenticationDatabaseName the database name to use for authentication
	 */
	public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials,
			String authenticationDatabaseName) {
		this(mongo, databaseName, credentials, false, authenticationDatabaseName);
		this(mongo, databaseName, credentials, false);
	}

	/**
@@ -91,14 +72,12 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
	 * @throws UnknownHostException
	 * @see MongoURI
	 */
	@SuppressWarnings("deprecation")
	public SimpleMongoDbFactory(MongoURI uri) throws MongoException, UnknownHostException {
		this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), parseChars(uri.getPassword())),
				true, uri.getDatabase());
		this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), parseChars(uri.getPassword())), true);
	}

	private SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials,
			boolean mongoInstanceCreated, String authenticationDatabaseName) {
			boolean mongoInstanceCreated) {

		Assert.notNull(mongo, "Mongo must not be null");
		Assert.hasText(databaseName, "Database name must not be empty");
@@ -109,12 +88,6 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
		this.databaseName = databaseName;
		this.mongoInstanceCreated = mongoInstanceCreated;
		this.credentials = credentials == null ? UserCredentials.NO_CREDENTIALS : credentials;
		this.exceptionTranslator = new MongoExceptionTranslator();
		this.authenticationDatabaseName = StringUtils.hasText(authenticationDatabaseName) ? authenticationDatabaseName
				: databaseName;

		Assert.isTrue(this.authenticationDatabaseName.matches("[\\w-]+"),
				"Authentication database name must only contain letters, numbers, underscores and dashes!");
	}

	/**
@@ -142,7 +115,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {

		Assert.hasText(dbName, "Database name must not be empty.");

		DB db = MongoDbUtils.getDB(mongo, dbName, credentials, authenticationDatabaseName);
		DB db = MongoDbUtils.getDB(mongo, dbName, credentials);

		if (writeConcern != null) {
			db.setWriteConcern(writeConcern);
@@ -165,13 +138,4 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
	private static String parseChars(char[] chars) {
		return chars == null ? null : String.valueOf(chars);
	}

	/*
	 * (non-Javadoc)
	 * @see org.springframework.data.mongodb.MongoDbFactory#getExceptionTranslator()
	 */
	@Override
	public PersistenceExceptionTranslator getExceptionTranslator() {
		return this.exceptionTranslator;
	}
}
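For reference, a minimal construction sketch using the authentication-database constructor introduced above (host, database and credential values are placeholders, not part of the diff):

```java
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

import com.mongodb.Mongo;

public class FactorySetupSketch {

	public static MongoTemplate mongoTemplate() throws Exception {

		// Authenticate against "admin" while working with the "shop" database
		// (all names here are placeholders).
		SimpleMongoDbFactory factory = new SimpleMongoDbFactory(new Mongo("localhost"),
				"shop", new UserCredentials("user", "secret"), "admin");

		return new MongoTemplate(factory);
	}
}
```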
@@ -1,305 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import static org.springframework.data.mongodb.core.aggregation.Fields.*;
|
||||
|
||||
import java.util.ArrayList;
|
||||
import java.util.Arrays;
|
||||
import java.util.List;
|
||||
|
||||
import org.springframework.data.domain.Sort;
|
||||
import org.springframework.data.domain.Sort.Direction;
|
||||
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
|
||||
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
|
||||
import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
|
||||
import org.springframework.data.mongodb.core.query.Criteria;
|
||||
import org.springframework.data.mongodb.core.query.SerializationUtils;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
import com.mongodb.BasicDBObject;
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* An {@code Aggregation} is a representation of a list of aggregation steps to be performed by the MongoDB Aggregation
|
||||
* Framework.
|
||||
*
|
||||
* @author Tobias Trelle
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
* @since 1.3
|
||||
*/
|
||||
public class Aggregation {
|
||||
|
||||
public static final AggregationOperationContext DEFAULT_CONTEXT = new NoOpAggregationOperationContext();
|
||||
|
||||
private final List<AggregationOperation> operations;
|
||||
|
||||
/**
|
||||
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
|
||||
*
|
||||
* @param operations must not be {@literal null} or empty.
|
||||
*/
|
||||
public static Aggregation newAggregation(List<? extends AggregationOperation> operations) {
|
||||
return newAggregation(operations.toArray(new AggregationOperation[operations.size()]));
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
|
||||
*
|
||||
* @param operations must not be {@literal null} or empty.
|
||||
*/
|
||||
public static Aggregation newAggregation(AggregationOperation... operations) {
|
||||
return new Aggregation(operations);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link TypedAggregation} for the given type and {@link AggregationOperation}s.
|
||||
*
|
||||
* @param type must not be {@literal null}.
|
||||
* @param operations must not be {@literal null} or empty.
|
||||
*/
|
||||
public static <T> TypedAggregation<T> newAggregation(Class<T> type, List<? extends AggregationOperation> operations) {
|
||||
return newAggregation(type, operations.toArray(new AggregationOperation[operations.size()]));
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link TypedAggregation} for the given type and {@link AggregationOperation}s.
|
||||
*
|
||||
* @param type must not be {@literal null}.
|
||||
* @param operations must not be {@literal null} or empty.
|
||||
*/
|
||||
public static <T> TypedAggregation<T> newAggregation(Class<T> type, AggregationOperation... operations) {
|
||||
return new TypedAggregation<T>(type, operations);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
|
||||
*
|
||||
* @param aggregationOperations must not be {@literal null} or empty.
|
||||
*/
|
||||
protected Aggregation(AggregationOperation... aggregationOperations) {
|
||||
|
||||
Assert.notNull(aggregationOperations, "AggregationOperations must not be null!");
|
||||
Assert.isTrue(aggregationOperations.length > 0, "At least one AggregationOperation has to be provided");
|
||||
|
||||
this.operations = Arrays.asList(aggregationOperations);
|
||||
}
|
||||
|
||||
/**
|
||||
* A pointer to the previous {@link AggregationOperation}.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public static String previousOperation() {
|
||||
return "_id";
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link ProjectionOperation} including the given fields.
|
||||
*
|
||||
* @param fields must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public static ProjectionOperation project(String... fields) {
|
||||
return project(fields(fields));
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link ProjectionOperation} including the given {@link Fields}.
|
||||
*
|
||||
* @param fields must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public static ProjectionOperation project(Fields fields) {
|
||||
return new ProjectionOperation(fields);
|
||||
}
|
||||
|
||||
/**
|
||||
* Factory method to create a new {@link UnwindOperation} for the field with the given name.
|
||||
*
|
||||
* @param field must not be {@literal null} or empty.
|
||||
* @return
|
||||
*/
|
||||
public static UnwindOperation unwind(String field) {
|
||||
return new UnwindOperation(field(field));
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link GroupOperation} for the given fields.
|
||||
*
|
||||
* @param fields must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public static GroupOperation group(String... fields) {
|
||||
return group(fields(fields));
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link GroupOperation} for the given {@link Fields}.
|
||||
*
|
||||
* @param fields must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public static GroupOperation group(Fields fields) {
|
||||
return new GroupOperation(fields);
|
||||
}
|
||||
|
||||
/**
|
||||
* Factory method to create a new {@link SortOperation} for the given {@link Sort}.
|
||||
*
|
||||
* @param sort must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public static SortOperation sort(Sort sort) {
|
||||
return new SortOperation(sort);
|
||||
}
|
||||
|
||||
/**
|
||||
* Factory method to create a new {@link SortOperation} for the given sort {@link Direction} and {@code fields}.
|
||||
*
|
||||
* @param direction must not be {@literal null}.
|
||||
* @param fields must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public static SortOperation sort(Direction direction, String... fields) {
|
||||
return new SortOperation(new Sort(direction, fields));
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link SkipOperation} skipping the given number of elements.
|
||||
*
|
||||
* @param elementsToSkip must not be less than zero.
|
||||
* @return
|
||||
*/
|
||||
public static SkipOperation skip(int elementsToSkip) {
|
||||
return new SkipOperation(elementsToSkip);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link LimitOperation} limiting the result to the given number of elements.
|
||||
*
|
||||
* @param maxElements must not be less than zero.
|
||||
* @return
|
||||
*/
|
||||
public static LimitOperation limit(long maxElements) {
|
||||
return new LimitOperation(maxElements);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link MatchOperation} using the given {@link Criteria}.
|
||||
*
|
||||
* @param criteria must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public static MatchOperation match(Criteria criteria) {
|
||||
return new MatchOperation(criteria);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link Fields} instance for the given field names.
|
||||
*
|
||||
* @see Fields#fields(String...)
|
||||
* @param fields must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public static Fields fields(String... fields) {
|
||||
return Fields.fields(fields);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link Fields} instance from the given field name and target reference.
|
||||
*
|
||||
* @param name must not be {@literal null} or empty.
|
||||
* @param target must not be {@literal null} or empty.
|
||||
* @return
|
||||
*/
|
||||
public static Fields bind(String name, String target) {
|
||||
return Fields.from(field(name, target));
|
||||
}
|
||||
|
||||
/**
|
||||
* Converts this {@link Aggregation} specification to a {@link DBObject}.
|
||||
*
|
||||
* @param inputCollectionName the name of the input collection
|
||||
* @return the {@code DBObject} representing this aggregation
|
||||
*/
|
||||
public DBObject toDbObject(String inputCollectionName, AggregationOperationContext rootContext) {
|
||||
|
||||
AggregationOperationContext context = rootContext;
|
||||
List<DBObject> operationDocuments = new ArrayList<DBObject>(operations.size());
|
||||
|
||||
for (AggregationOperation operation : operations) {
|
||||
|
||||
operationDocuments.add(operation.toDBObject(context));
|
||||
|
||||
if (operation instanceof FieldsExposingAggregationOperation) {
|
||||
FieldsExposingAggregationOperation exposedFieldsOperation = (FieldsExposingAggregationOperation) operation;
|
||||
context = new ExposedFieldsAggregationOperationContext(exposedFieldsOperation.getFields());
|
||||
}
|
||||
}
|
||||
|
||||
DBObject command = new BasicDBObject("aggregate", inputCollectionName);
|
||||
command.put("pipeline", operationDocuments);
|
||||
|
||||
return command;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Object#toString()
|
||||
*/
|
||||
@Override
|
||||
public String toString() {
|
||||
return SerializationUtils
|
||||
.serializeToJsonSafely(toDbObject("__collection__", new NoOpAggregationOperationContext()));
|
||||
}
|
||||
|
||||
/**
|
||||
* Simple {@link AggregationOperationContext} that just returns {@link FieldReference}s as is.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
private static class NoOpAggregationOperationContext implements AggregationOperationContext {
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getMappedObject(com.mongodb.DBObject)
|
||||
*/
|
||||
@Override
|
||||
public DBObject getMappedObject(DBObject dbObject) {
|
||||
return dbObject;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.ExposedFields.AvailableField)
|
||||
*/
|
||||
@Override
|
||||
public FieldReference getReference(Field field) {
|
||||
return new FieldReference(new ExposedField(field, true));
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(java.lang.String)
|
||||
*/
|
||||
@Override
|
||||
public FieldReference getReference(String name) {
|
||||
return new FieldReference(new ExposedField(new AggregationField(name), true));
|
||||
}
|
||||
}
|
||||
}
|
||||
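A usage sketch for the static factory methods above (not part of the diff): an untyped pipeline built from match, project, group, sort, skip and limit; the field and value names are made up, and toString() renders the aggregate command against a placeholder collection name, as shown in the class above.

```java
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.query.Criteria;

public class PipelineSketch {

	public static void main(String[] args) {

		// Untyped pipeline assembled from the factory methods of the Aggregation class.
		Aggregation aggregation = newAggregation(
				match(Criteria.where("status").is("ACTIVE")),
				project("customerId", "amount"),
				group("customerId").sum("amount").as("total"),
				sort(Direction.DESC, "total"),
				skip(0),
				limit(10));

		// Prints the serialized aggregate command for inspection.
		System.out.println(aggregation);
	}
}
```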
@@ -1,82 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import org.springframework.data.mongodb.core.aggregation.AggregationExpressionTransformer.AggregationExpressionTransformationContext;
|
||||
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
|
||||
import org.springframework.data.mongodb.core.spel.ExpressionNode;
|
||||
import org.springframework.data.mongodb.core.spel.ExpressionTransformationContextSupport;
|
||||
import org.springframework.data.mongodb.core.spel.ExpressionTransformer;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* Interface to type an {@link ExpressionTransformer} to the contained
|
||||
* {@link AggregationExpressionTransformationContext}.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
interface AggregationExpressionTransformer extends
|
||||
ExpressionTransformer<AggregationExpressionTransformationContext<ExpressionNode>> {
|
||||
|
||||
/**
|
||||
* A special {@link ExpressionTransformationContextSupport} to be aware of the {@link AggregationOperationContext}.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
public static class AggregationExpressionTransformationContext<T extends ExpressionNode> extends
|
||||
ExpressionTransformationContextSupport<T> {
|
||||
|
||||
private final AggregationOperationContext aggregationContext;
|
||||
|
||||
/**
|
||||
* Creates an {@link AggregationExpressionTransformationContext}.
|
||||
*
|
||||
* @param currentNode must not be {@literal null}.
|
||||
* @param parentNode
|
||||
* @param previousOperationObject
|
||||
* @param aggregationContext must not be {@literal null}.
|
||||
*/
|
||||
public AggregationExpressionTransformationContext(T currentNode, ExpressionNode parentNode,
|
||||
DBObject previousOperationObject, AggregationOperationContext context) {
|
||||
|
||||
super(currentNode, parentNode, previousOperationObject);
|
||||
|
||||
Assert.notNull(context, "AggregationOperationContext must not be null!");
|
||||
this.aggregationContext = context;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the underlying {@link AggregationOperationContext}.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public AggregationOperationContext getAggregationContext() {
|
||||
return aggregationContext;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the {@link FieldReference} for the current {@link ExpressionNode}.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public FieldReference getFieldReference() {
|
||||
return aggregationContext.getReference(getCurrentNode().getName());
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -1,37 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* Represents one single operation in an aggregation pipeline.
|
||||
*
|
||||
* @author Sebastian Herold
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
* @since 1.3
|
||||
*/
|
||||
public interface AggregationOperation {
|
||||
|
||||
/**
|
||||
* Turns the {@link AggregationOperation} into a {@link DBObject} by using the given
|
||||
* {@link AggregationOperationContext}.
|
||||
*
|
||||
* @return the DBObject
|
||||
*/
|
||||
DBObject toDBObject(AggregationOperationContext context);
|
||||
}
|
||||
@@ -1,55 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
|
||||
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* The context for an {@link AggregationOperation}.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
* @since 1.3
|
||||
*/
|
||||
public interface AggregationOperationContext {
|
||||
|
||||
/**
|
||||
* Returns the mapped {@link DBObject}, potentially converting the source considering mapping metadata etc.
|
||||
*
|
||||
* @param dbObject will never be {@literal null}.
|
||||
* @return must not be {@literal null}.
|
||||
*/
|
||||
DBObject getMappedObject(DBObject dbObject);
|
||||
|
||||
/**
|
||||
* Returns a {@link FieldReference} for the given field or {@literal null} if the context does not expose the given
|
||||
* field.
|
||||
*
|
||||
* @param field must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
FieldReference getReference(Field field);
|
||||
|
||||
/**
|
||||
* Returns the {@link FieldReference} for the field with the given name or {@literal null} if the context does not
|
||||
* expose a field with the given name.
|
||||
*
|
||||
* @param name must not be {@literal null} or empty.
|
||||
* @return
|
||||
*/
|
||||
FieldReference getReference(String name);
|
||||
}
|
||||
@@ -1,98 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import java.util.Collections;
|
||||
import java.util.Iterator;
|
||||
import java.util.List;
|
||||
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* Collects the results of executing an aggregation operation.
|
||||
*
|
||||
* @author Tobias Trelle
|
||||
* @author Oliver Gierke
|
||||
* @param <T> the type the results are mapped onto.
|
||||
* @since 1.3
|
||||
*/
|
||||
public class AggregationResults<T> implements Iterable<T> {
|
||||
|
||||
private final List<T> mappedResults;
|
||||
private final DBObject rawResults;
|
||||
private final String serverUsed;
|
||||
|
||||
/**
|
||||
* Creates a new {@link AggregationResults} instance from the given mapped and raw results.
|
||||
*
|
||||
* @param mappedResults must not be {@literal null}.
|
||||
* @param rawResults must not be {@literal null}.
|
||||
*/
|
||||
public AggregationResults(List<T> mappedResults, DBObject rawResults) {
|
||||
|
||||
Assert.notNull(mappedResults);
|
||||
Assert.notNull(rawResults);
|
||||
|
||||
this.mappedResults = Collections.unmodifiableList(mappedResults);
|
||||
this.rawResults = rawResults;
|
||||
this.serverUsed = parseServerUsed();
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the aggregation results.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public List<T> getMappedResults() {
|
||||
return mappedResults;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the unique mapped result. Assumes no result or exactly one.
|
||||
*
|
||||
* @return
|
||||
* @throws IllegalArgumentException in case more than one result is available.
|
||||
*/
|
||||
public T getUniqueMappedResult() {
|
||||
Assert.isTrue(mappedResults.size() < 2, "Expected unique result or null, but got more than one!");
|
||||
return mappedResults.size() == 1 ? mappedResults.get(0) : null;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Iterable#iterator()
|
||||
*/
|
||||
public Iterator<T> iterator() {
|
||||
return mappedResults.iterator();
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the server that has been used to perform the aggregation.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public String getServerUsed() {
|
||||
return serverUsed;
|
||||
}
|
||||
|
||||
private String parseServerUsed() {
|
||||
|
||||
Object object = rawResults.get("serverUsed");
|
||||
return object instanceof String ? (String) object : null;
|
||||
}
|
||||
}
|
||||
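A short consumption sketch for the AggregationResults type above (illustrative, not part of the diff), covering the three access paths it exposes:

```java
import java.util.List;

import org.springframework.data.mongodb.core.aggregation.AggregationResults;

public class ResultsConsumptionSketch {

	static <T> void consume(AggregationResults<T> results) {

		// 1. As an immutable list.
		List<T> list = results.getMappedResults();
		System.out.println("mapped results: " + list.size());

		// 2. Iterating directly, since AggregationResults implements Iterable.
		for (T element : results) {
			System.out.println(element);
		}

		// 3. As a single result; throws IllegalArgumentException if more than one element is present.
		T single = results.getUniqueMappedResult();
		System.out.println("unique result: " + single);
	}
}
```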
@@ -1,402 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import java.util.ArrayList;
|
||||
import java.util.Arrays;
|
||||
import java.util.Collections;
|
||||
import java.util.Iterator;
|
||||
import java.util.List;
|
||||
|
||||
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
|
||||
import org.springframework.util.Assert;
|
||||
import org.springframework.util.CompositeIterator;
|
||||
|
||||
/**
|
||||
* Value object to capture the fields exposed by an {@link AggregationOperation}.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
* @since 1.3
|
||||
*/
|
||||
public class ExposedFields implements Iterable<ExposedField> {
|
||||
|
||||
private static final List<ExposedField> NO_FIELDS = Collections.emptyList();
|
||||
private static final ExposedFields EMPTY = new ExposedFields(NO_FIELDS, NO_FIELDS);
|
||||
|
||||
private final List<ExposedField> originalFields;
|
||||
private final List<ExposedField> syntheticFields;
|
||||
|
||||
/**
|
||||
* Creates a new {@link ExposedFields} instance from the given {@link ExposedField}s.
|
||||
*
|
||||
* @param fields must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public static ExposedFields from(ExposedField... fields) {
|
||||
return from(Arrays.asList(fields));
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link ExposedFields} instance from the given {@link ExposedField}s.
|
||||
*
|
||||
* @param fields must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
private static ExposedFields from(List<ExposedField> fields) {
|
||||
|
||||
ExposedFields result = EMPTY;
|
||||
|
||||
for (ExposedField field : fields) {
|
||||
result = result.and(field);
|
||||
}
|
||||
|
||||
return result;
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates synthetic {@link ExposedFields} from the given {@link Fields}.
|
||||
*
|
||||
* @param fields must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public static ExposedFields synthetic(Fields fields) {
|
||||
return createFields(fields, true);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates non-synthetic {@link ExposedFields} from the given {@link Fields}.
|
||||
*
|
||||
* @param fields must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public static ExposedFields nonSynthetic(Fields fields) {
|
||||
return createFields(fields, false);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link ExposedFields} instance for the given fields in either a synthetic or non-synthetic way.
|
||||
*
|
||||
* @param fields must not be {@literal null}.
|
||||
* @param synthetic
|
||||
* @return
|
||||
*/
|
||||
private static ExposedFields createFields(Fields fields, boolean synthetic) {
|
||||
|
||||
Assert.notNull(fields, "Fields must not be null!");
|
||||
List<ExposedField> result = new ArrayList<ExposedField>();
|
||||
|
||||
for (Field field : fields) {
|
||||
result.add(new ExposedField(field, synthetic));
|
||||
}
|
||||
|
||||
return ExposedFields.from(result);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link ExposedFields} with the given original and synthetic fields.
|
||||
*
|
||||
* @param originals must not be {@literal null}.
|
||||
* @param synthetic must not be {@literal null}.
|
||||
*/
|
||||
private ExposedFields(List<ExposedField> originals, List<ExposedField> synthetic) {
|
||||
|
||||
this.originalFields = originals;
|
||||
this.syntheticFields = synthetic;
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link ExposedFields} adding the given {@link ExposedField}.
|
||||
*
|
||||
* @param field must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public ExposedFields and(ExposedField field) {
|
||||
|
||||
Assert.notNull(field, "Exposed field must not be null!");
|
||||
|
||||
ArrayList<ExposedField> result = new ArrayList<ExposedField>();
|
||||
result.addAll(field.synthetic ? syntheticFields : originalFields);
|
||||
result.add(field);
|
||||
|
||||
return new ExposedFields(field.synthetic ? originalFields : result, field.synthetic ? result : syntheticFields);
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the field with the given name or {@literal null} if no field with the given name is available.
|
||||
*
|
||||
* @param name
|
||||
* @return
|
||||
*/
|
||||
public ExposedField getField(String name) {
|
||||
|
||||
for (ExposedField field : this) {
|
||||
if (field.canBeReferredToBy(name)) {
|
||||
return field;
|
||||
}
|
||||
}
|
||||
|
||||
return null;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns whether the {@link ExposedFields} exposes no non-synthetic fields at all.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
boolean exposesNoNonSyntheticFields() {
|
||||
return originalFields.isEmpty();
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns whether the {@link ExposedFields} exposes a single non-synthetic field only.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
boolean exposesSingleNonSyntheticFieldOnly() {
|
||||
return originalFields.size() == 1;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns whether the {@link ExposedFields} exposes no fields at all.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
boolean exposesNoFields() {
|
||||
return exposedFieldsCount() == 0;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns whether the {@link ExposedFields} exposes a single field only.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
boolean exposesSingleFieldOnly() {
|
||||
return exposedFieldsCount() == 1;
|
||||
}
|
||||
|
||||
/**
|
||||
* @return
|
||||
*/
|
||||
private int exposedFieldsCount() {
|
||||
return originalFields.size() + syntheticFields.size();
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Iterable#iterator()
|
||||
*/
|
||||
@Override
|
||||
public Iterator<ExposedField> iterator() {
|
||||
|
||||
CompositeIterator<ExposedField> iterator = new CompositeIterator<ExposedField>();
|
||||
iterator.add(syntheticFields.iterator());
|
||||
iterator.add(originalFields.iterator());
|
||||
|
||||
return iterator;
|
||||
}
|
||||
|
||||
/**
|
||||
* A single exposed field.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
static class ExposedField implements Field {
|
||||
|
||||
private final boolean synthetic;
|
||||
private final Field field;
|
||||
|
||||
/**
|
||||
* Creates a new {@link ExposedField} with the given key.
|
||||
*
|
||||
* @param key must not be {@literal null} or empty.
|
||||
* @param synthetic whether the exposed field is synthetic.
|
||||
*/
|
||||
public ExposedField(String key, boolean synthetic) {
|
||||
this(Fields.field(key), synthetic);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link ExposedField} for the given {@link Field}.
|
||||
*
|
||||
* @param delegate must not be {@literal null}.
|
||||
* @param synthetic whether the exposed field is synthetic.
|
||||
*/
|
||||
public ExposedField(Field delegate, boolean synthetic) {
|
||||
|
||||
this.field = delegate;
|
||||
this.synthetic = synthetic;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.Field#getKey()
|
||||
*/
|
||||
@Override
|
||||
public String getName() {
|
||||
return field.getName();
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.Field#getTarget()
|
||||
*/
|
||||
@Override
|
||||
public String getTarget() {
|
||||
return field.getTarget();
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.Field#isAliased()
|
||||
*/
|
||||
@Override
|
||||
public boolean isAliased() {
|
||||
return field.isAliased();
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns whether the field can be referred to using the given name.
|
||||
*
|
||||
* @param input
|
||||
* @return
|
||||
*/
|
||||
public boolean canBeReferredToBy(String input) {
|
||||
return getTarget().equals(input);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Object#toString()
|
||||
*/
|
||||
@Override
|
||||
public String toString() {
|
||||
return String.format("AggregationField: %s, synthetic: %s", field, synthetic);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Object#equals(java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
public boolean equals(Object obj) {
|
||||
|
||||
if (this == obj) {
|
||||
return true;
|
||||
}
|
||||
|
||||
if (!(obj instanceof ExposedField)) {
|
||||
return false;
|
||||
}
|
||||
|
||||
ExposedField that = (ExposedField) obj;
|
||||
|
||||
return this.field.equals(that.field) && this.synthetic == that.synthetic;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Object#hashCode()
|
||||
*/
|
||||
@Override
|
||||
public int hashCode() {
|
||||
|
||||
int result = 17;
|
||||
|
||||
result += 31 * field.hashCode();
|
||||
result += 31 * (synthetic ? 0 : 1);
|
||||
|
||||
return result;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* A reference to an {@link ExposedField}.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
static class FieldReference {
|
||||
|
||||
private final ExposedField field;
|
||||
|
||||
/**
|
||||
* Creates a new {@link FieldReference} for the given {@link ExposedField}.
|
||||
*
|
||||
* @param field must not be {@literal null}.
|
||||
*/
|
||||
public FieldReference(ExposedField field) {
|
||||
|
||||
Assert.notNull(field, "ExposedField must not be null!");
|
||||
this.field = field;
|
||||
}
|
||||
|
||||
/**
|
||||
* @return
|
||||
*/
|
||||
public boolean isSynthetic() {
|
||||
return field.synthetic;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the raw, unqualified reference, i.e. the field reference without a {@literal $} prefix.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public String getRaw() {
|
||||
|
||||
String target = field.getTarget();
|
||||
return field.synthetic ? target : String.format("%s.%s", Fields.UNDERSCORE_ID, target);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Object#toString()
|
||||
*/
|
||||
@Override
|
||||
public String toString() {
|
||||
return String.format("$%s", getRaw());
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Object#equals(java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
public boolean equals(Object obj) {
|
||||
|
||||
if (this == obj) {
|
||||
return true;
|
||||
}
|
||||
|
||||
if (!(obj instanceof FieldReference)) {
|
||||
return false;
|
||||
}
|
||||
|
||||
FieldReference that = (FieldReference) obj;
|
||||
|
||||
return this.field.equals(that.field);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Object#hashCode()
|
||||
*/
|
||||
@Override
|
||||
public int hashCode() {
|
||||
return field.hashCode();
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -1,80 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
|
||||
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* {@link AggregationOperationContext} that combines the available field references from a given
|
||||
* {@code AggregationOperationContext} and an {@link FieldsExposingAggregationOperation}.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
* @since 1.4
|
||||
*/
|
||||
class ExposedFieldsAggregationOperationContext implements AggregationOperationContext {
|
||||
|
||||
private final ExposedFields exposedFields;
|
||||
|
||||
/**
|
||||
* Creates a new {@link ExposedFieldsAggregationOperationContext} from the given {@link ExposedFields}.
|
||||
*
|
||||
* @param exposedFields must not be {@literal null}.
|
||||
*/
|
||||
public ExposedFieldsAggregationOperationContext(ExposedFields exposedFields) {
|
||||
|
||||
Assert.notNull(exposedFields, "ExposedFields must not be null!");
|
||||
this.exposedFields = exposedFields;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getMappedObject(com.mongodb.DBObject)
|
||||
*/
|
||||
@Override
|
||||
public DBObject getMappedObject(DBObject dbObject) {
|
||||
return dbObject;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.ExposedFields.AvailableField)
|
||||
*/
|
||||
@Override
|
||||
public FieldReference getReference(Field field) {
|
||||
return getReference(field.getTarget());
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(java.lang.String)
|
||||
*/
|
||||
@Override
|
||||
public FieldReference getReference(String name) {
|
||||
|
||||
ExposedField field = exposedFields.getField(name);
|
||||
|
||||
if (field != null) {
|
||||
return new FieldReference(field);
|
||||
}
|
||||
|
||||
throw new IllegalArgumentException(String.format("Invalid reference '%s'!", name));
|
||||
}
|
||||
}
|
||||
@@ -1,46 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
/**
|
||||
* Abstraction for a field.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
* @since 1.3
|
||||
*/
|
||||
public interface Field {
|
||||
|
||||
/**
|
||||
* Returns the name of the field.
|
||||
*
|
||||
* @return must not be {@literal null}.
|
||||
*/
|
||||
String getName();
|
||||
|
||||
/**
|
||||
* Returns the target of the field. In case no explicit target is available {@link #getName()} should be returned.
|
||||
*
|
||||
* @return must not be {@literal null}.
|
||||
*/
|
||||
String getTarget();
|
||||
|
||||
/**
|
||||
* Returns whether the Field is aliased, which means it has a name set different from the target.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
boolean isAliased();
|
||||
}
|
||||
@@ -1,293 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import java.util.ArrayList;
|
||||
import java.util.Arrays;
|
||||
import java.util.HashMap;
|
||||
import java.util.Iterator;
|
||||
import java.util.List;
|
||||
import java.util.Map;
|
||||
|
||||
import org.springframework.util.Assert;
|
||||
import org.springframework.util.ObjectUtils;
|
||||
import org.springframework.util.StringUtils;
|
||||
|
||||
/**
|
||||
* Value object to capture a list of {@link Field} instances.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
* @since 1.3
|
||||
*/
|
||||
public class Fields implements Iterable<Field> {
|
||||
|
||||
private static final String AMBIGUOUS_EXCEPTION = "Found two fields both using '%s' as name: %s and %s! Please "
|
||||
+ "customize your field definitions to get to unique field names!";
|
||||
|
||||
public static String UNDERSCORE_ID = "_id";
|
||||
public static String UNDERSCORE_ID_REF = "$_id";
|
||||
|
||||
private final List<Field> fields;
|
||||
|
||||
/**
|
||||
* Creates a new {@link Fields} instance from the given {@link Fields}.
|
||||
*
|
||||
* @param fields must not be {@literal null} or empty.
|
||||
* @return
|
||||
*/
|
||||
public static Fields from(Field... fields) {
|
||||
|
||||
Assert.notNull(fields, "Fields must not be null!");
|
||||
return new Fields(Arrays.asList(fields));
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link Fields} instance for {@link Field}s with the given names.
|
||||
*
|
||||
* @param names must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public static Fields fields(String... names) {
|
||||
|
||||
Assert.notNull(names, "Field names must not be null!");
|
||||
|
||||
List<Field> fields = new ArrayList<Field>();
|
||||
|
||||
for (String name : names) {
|
||||
fields.add(field(name));
|
||||
}
|
||||
|
||||
return new Fields(fields);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a {@link Field} with the given name.
|
||||
*
|
||||
* @param name must not be {@literal null} or empty.
|
||||
* @return
|
||||
*/
|
||||
public static Field field(String name) {
|
||||
return new AggregationField(name);
|
||||
}
|
||||
|
||||
public static Field field(String name, String target) {
|
||||
Assert.hasText(target, "Target must not be null or empty!");
|
||||
return new AggregationField(name, target);
|
||||
}
|
||||
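A small sketch of the Fields factory methods above (illustrative, not part of the diff): "netPrice" is exposed under the alias "price", while "quantity" keeps its own name as target; all names are made up.

```java
import static org.springframework.data.mongodb.core.aggregation.Fields.*;

import org.springframework.data.mongodb.core.aggregation.Fields;

public class FieldsSketch {

	public static void main(String[] args) {

		Fields fields = from(field("price", "netPrice")).and("quantity");

		System.out.println(fields.getField("price"));    // aliased AggregationField
		System.out.println(fields.getField("quantity")); // name used as target
	}
}
```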
|
||||
/**
|
||||
* Creates a new {@link Fields} instance using the given {@link Field}s.
|
||||
*
|
||||
* @param fields must not be {@literal null}.
|
||||
*/
|
||||
private Fields(List<Field> fields) {
|
||||
|
||||
Assert.notNull(fields, "Fields must not be null!");
|
||||
|
||||
this.fields = verify(fields);
|
||||
}
|
||||
|
||||
private static final List<Field> verify(List<Field> fields) {
|
||||
|
||||
Map<String, Field> reference = new HashMap<String, Field>();
|
||||
|
||||
for (Field field : fields) {
|
||||
|
||||
String name = field.getName();
|
||||
Field found = reference.get(name);
|
||||
|
||||
if (found != null) {
|
||||
throw new IllegalArgumentException(String.format(AMBIGUOUS_EXCEPTION, name, found, field));
|
||||
}
|
||||
|
||||
reference.put(name, field);
|
||||
}
|
||||
|
||||
return fields;
|
||||
}
|
||||
|
||||
private Fields(Fields existing, Field tail) {
|
||||
|
||||
this.fields = new ArrayList<Field>(existing.fields.size() + 1);
|
||||
this.fields.addAll(existing.fields);
|
||||
this.fields.add(tail);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link Fields} instance with a new {@link Field} of the given name added.
|
||||
*
|
||||
* @param name must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public Fields and(String name) {
|
||||
return and(new AggregationField(name));
|
||||
}
|
||||
|
||||
public Fields and(String name, String target) {
|
||||
return and(new AggregationField(name, target));
|
||||
}
|
||||
|
||||
public Fields and(Field field) {
|
||||
return new Fields(this, field);
|
||||
}
|
||||
|
||||
public Fields and(Fields fields) {
|
||||
|
||||
Fields result = this;
|
||||
|
||||
for (Field field : fields) {
|
||||
result = result.and(field);
|
||||
}
|
||||
|
||||
return result;
|
||||
}
|
||||
|
||||
public Field getField(String name) {
|
||||
|
||||
for (Field field : fields) {
|
||||
if (field.getName().equals(name)) {
|
||||
return field;
|
||||
}
|
||||
}
|
||||
|
||||
return null;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Iterable#iterator()
|
||||
*/
|
||||
@Override
|
||||
public Iterator<Field> iterator() {
|
||||
return fields.iterator();
|
||||
}
|
||||
|
||||
/**
|
||||
* Value object to encapsulate a field in an aggregation operation.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
static class AggregationField implements Field {
|
||||
|
||||
private final String name;
|
||||
private final String target;
|
||||
|
||||
/**
|
||||
* Creates an aggregation field with the given name. As no target is set explicitly, the name will be used as target
|
||||
* as well.
|
||||
*
|
||||
* @param key
|
||||
*/
|
||||
public AggregationField(String key) {
|
||||
this(key, null);
|
||||
}
|
||||
|
||||
public AggregationField(String name, String target) {
|
||||
|
||||
String nameToSet = cleanUp(name);
|
||||
String targetToSet = cleanUp(target);
|
||||
|
||||
Assert.hasText(nameToSet, "AggregationField name must not be null or empty!");
|
||||
|
||||
if (target == null && name.contains(".")) {
|
||||
this.name = nameToSet.substring(nameToSet.indexOf(".") + 1);
|
||||
this.target = nameToSet;
|
||||
} else {
|
||||
this.name = nameToSet;
|
||||
this.target = targetToSet;
|
||||
}
|
||||
}
|
||||
|
||||
private static final String cleanUp(String source) {
|
||||
|
||||
if (source == null) {
|
||||
return source;
|
||||
}
|
||||
|
||||
int dollarIndex = source.lastIndexOf('$');
|
||||
return dollarIndex == -1 ? source : source.substring(dollarIndex + 1);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.Field#getKey()
|
||||
*/
|
||||
public String getName() {
|
||||
return name;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.Field#getAlias()
|
||||
*/
|
||||
public String getTarget() {
|
||||
return StringUtils.hasText(this.target) ? this.target : this.name;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.Field#isAliased()
|
||||
*/
|
||||
@Override
|
||||
public boolean isAliased() {
|
||||
return !getName().equals(getTarget());
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Object#toString()
|
||||
*/
|
||||
@Override
|
||||
public String toString() {
|
||||
return String.format("AggregationField - name: %s, target: %s", name, target);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Object#equals(java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
public boolean equals(Object obj) {
|
||||
|
||||
if (this == obj) {
|
||||
return true;
|
||||
}
|
||||
|
||||
if (!(obj instanceof AggregationField)) {
|
||||
return false;
|
||||
}
|
||||
|
||||
AggregationField that = (AggregationField) obj;
|
||||
|
||||
return this.name.equals(that.name) && ObjectUtils.nullSafeEquals(this.target, that.target);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Object#hashCode()
|
||||
*/
|
||||
@Override
|
||||
public int hashCode() {
|
||||
|
||||
int result = 17;
|
||||
|
||||
result += 31 * name.hashCode();
|
||||
result += 31 * ObjectUtils.nullSafeHashCode(target);
|
||||
|
||||
return result;
|
||||
}
|
||||
}
|
||||
}
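As a quick orientation for the API above, here is a small usage sketch; the field names are made up for illustration. It shows how an aliased field keeps its exposed name separate from its target, and how a dotted path keeps the last segment as the name while the full path becomes the target.

```java
import org.springframework.data.mongodb.core.aggregation.Field;
import org.springframework.data.mongodb.core.aggregation.Fields;

class FieldsUsageSketch {

    void sketch() {

        // two plain fields plus an aliased one reading from "netPrice"
        Fields fields = Fields.fields("firstname", "lastname").and("total", "netPrice");

        Field total = fields.getField("total");
        // total.getName()   -> "total"
        // total.getTarget() -> "netPrice"
        // total.isAliased() -> true

        // a dotted path keeps the last segment as name and the full path as target
        Field city = Fields.field("address.city");
        // city.getName()   -> "city"
        // city.getTarget() -> "address.city"
    }
}
```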
@@ -1,32 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
/**
|
||||
* {@link AggregationOperation} that exposes new {@link ExposedFields} that can be used for later aggregation pipeline
|
||||
* {@code AggregationOperation}s.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
public interface FieldsExposingAggregationOperation extends AggregationOperation {
|
||||
|
||||
/**
|
||||
* Returns the fields exposed by the {@link AggregationOperation}.
|
||||
*
|
||||
* @return will never be {@literal null}.
|
||||
*/
|
||||
ExposedFields getFields();
|
||||
}
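To illustrate the contract, below is a hypothetical implementation; the class name, the exposed "score" field and the rendered stage are invented for this sketch, and it is assumed to live in the org.springframework.data.mongodb.core.aggregation package so the ExposedFields/ExposedField types used here are accessible.

```java
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

// hypothetical pipeline stage that computes a "score" field and advertises it to later stages
class ScoreOperation implements FieldsExposingAggregationOperation {

    @Override
    public ExposedFields getFields() {
        // expose the synthetic field "score" so subsequent operations can reference it
        return ExposedFields.from(new ExposedField("score", true));
    }

    @Override
    public DBObject toDBObject(AggregationOperationContext context) {
        // render a simple $project stage that makes "score" available downstream
        return new BasicDBObject("$project", new BasicDBObject("score", "$points"));
    }
}
```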
@@ -1,46 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import org.springframework.data.mongodb.core.query.NearQuery;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
import com.mongodb.BasicDBObject;
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* Encapsulates the aggregation framework {@code $geoNear}-operation.
*
* @author Thomas Darimont
|
||||
* @since 1.3
|
||||
*/
|
||||
public class GeoNearOperation implements AggregationOperation {
|
||||
|
||||
private final NearQuery nearQuery;
|
||||
|
||||
public GeoNearOperation(NearQuery nearQuery) {
|
||||
|
||||
Assert.notNull(nearQuery, "NearQuery must not be null!");
|
||||
this.nearQuery = nearQuery;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
|
||||
*/
|
||||
@Override
|
||||
public DBObject toDBObject(AggregationOperationContext context) {
|
||||
return new BasicDBObject("$geoNear", context.getMappedObject(nearQuery.toDBObject()));
|
||||
}
|
||||
}
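A short usage sketch; the coordinates and distance are made up, and the rendered document is shown only approximately since the NearQuery contributes additional defaults of its own.

```java
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.GeoNearOperation;
import org.springframework.data.mongodb.core.query.NearQuery;

import com.mongodb.DBObject;

class GeoNearSketch {

    static DBObject render(AggregationOperationContext context) {

        NearQuery nearBerlin = NearQuery.near(13.4, 52.5).maxDistance(0.1);

        // yields roughly { "$geoNear" : { "near" : [ 13.4, 52.5 ], "maxDistance" : 0.1, ... } }
        return new GeoNearOperation(nearBerlin).toDBObject(context);
    }
}
```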
@@ -1,375 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import java.util.ArrayList;
|
||||
import java.util.Arrays;
|
||||
import java.util.Collections;
|
||||
import java.util.List;
|
||||
import java.util.Locale;
|
||||
|
||||
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
|
||||
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
|
||||
import org.springframework.util.Assert;
|
||||
import org.springframework.util.StringUtils;
|
||||
|
||||
import com.mongodb.BasicDBObject;
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* Encapsulates the aggregation framework {@code $group}-operation.
|
||||
*
|
||||
* @see http://docs.mongodb.org/manual/reference/aggregation/group/#stage._S_group
|
||||
* @author Sebastian Herold
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
* @since 1.3
|
||||
*/
|
||||
public class GroupOperation implements FieldsExposingAggregationOperation {
|
||||
|
||||
/**
|
||||
* Holds the non-synthetic fields which are the fields of the group-id structure.
|
||||
*/
|
||||
private final ExposedFields idFields;
|
||||
|
||||
private final List<Operation> operations;
|
||||
|
||||
/**
|
||||
* Creates a new {@link GroupOperation} including the given {@link Fields}.
|
||||
*
|
||||
* @param fields must not be {@literal null}.
|
||||
*/
|
||||
public GroupOperation(Fields fields) {
|
||||
|
||||
this.idFields = ExposedFields.nonSynthetic(fields);
|
||||
this.operations = new ArrayList<Operation>();
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link GroupOperation} from the given {@link GroupOperation}.
|
||||
*
|
||||
* @param groupOperation must not be {@literal null}.
|
||||
*/
|
||||
protected GroupOperation(GroupOperation groupOperation) {
|
||||
this(groupOperation, Collections.<Operation> emptyList());
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link GroupOperation} from the given {@link GroupOperation} and the given {@link Operation}s.
|
||||
*
|
||||
* @param groupOperation
|
||||
* @param nextOperations
|
||||
*/
|
||||
private GroupOperation(GroupOperation groupOperation, List<Operation> nextOperations) {
|
||||
|
||||
Assert.notNull(groupOperation, "GroupOperation must not be null!");
|
||||
Assert.notNull(nextOperations, "NextOperations must not be null!");
|
||||
|
||||
this.idFields = groupOperation.idFields;
|
||||
this.operations = new ArrayList<Operation>(nextOperations.size() + 1);
|
||||
this.operations.addAll(groupOperation.operations);
|
||||
this.operations.addAll(nextOperations);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link GroupOperation} from the current one adding the given {@link Operation}.
|
||||
*
|
||||
* @param operation must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
protected GroupOperation and(Operation operation) {
|
||||
return new GroupOperation(this, Arrays.asList(operation));
|
||||
}
|
||||
|
||||
/**
|
||||
* Builder for {@link GroupOperation}s on a field.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
public class GroupOperationBuilder {
|
||||
|
||||
private final GroupOperation groupOperation;
|
||||
private final Operation operation;
|
||||
|
||||
/**
|
||||
* Creates a new {@link GroupOperationBuilder} from the given {@link GroupOperation} and {@link Operation}.
|
||||
*
|
||||
* @param groupOperation
|
||||
* @param operation
|
||||
*/
|
||||
private GroupOperationBuilder(GroupOperation groupOperation, Operation operation) {
|
||||
|
||||
Assert.notNull(groupOperation, "GroupOperation must not be null!");
|
||||
Assert.notNull(operation, "Operation must not be null!");
|
||||
|
||||
this.groupOperation = groupOperation;
|
||||
this.operation = operation;
|
||||
}
|
||||
|
||||
/**
|
||||
* Allows specifying an alias for the new operation.
|
||||
*
|
||||
* @param alias
|
||||
* @return
|
||||
*/
|
||||
public GroupOperation as(String alias) {
|
||||
return this.groupOperation.and(operation.withAlias(alias));
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates an {@link GroupOperationBuilder} for a {@code $sum}-expression.
|
||||
* <p>
|
||||
* Count expressions are emulated via {@code $sum: 1}.
|
||||
* <p>
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public GroupOperationBuilder count() {
|
||||
return newBuilder(GroupOps.SUM, null, 1);
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates an {@link GroupOperationBuilder} for a {@code $sum}-expression for the given field-reference.
|
||||
*
|
||||
* @param reference
|
||||
* @return
|
||||
*/
|
||||
public GroupOperationBuilder sum(String reference) {
|
||||
return sum(reference, null);
|
||||
}
|
||||
|
||||
private GroupOperationBuilder sum(String reference, Object value) {
|
||||
return newBuilder(GroupOps.SUM, reference, value);
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates a {@link GroupOperationBuilder} for an {@code $addToSet}-expression for the given field-reference.
|
||||
*
|
||||
* @param reference
|
||||
* @return
|
||||
*/
|
||||
public GroupOperationBuilder addToSet(String reference) {
|
||||
return addToSet(reference, null);
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates a {@link GroupOperationBuilder} for an {@code $addToSet}-expression for the given value.
|
||||
*
|
||||
* @param value
|
||||
* @return
|
||||
*/
|
||||
public GroupOperationBuilder addToSet(Object value) {
|
||||
return addToSet(null, value);
|
||||
}
|
||||
|
||||
private GroupOperationBuilder addToSet(String reference, Object value) {
|
||||
return newBuilder(GroupOps.ADD_TO_SET, reference, value);
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates an {@link GroupOperationBuilder} for an {@code $last}-expression for the given field-reference.
|
||||
*
|
||||
* @param reference
|
||||
* @return
|
||||
*/
|
||||
public GroupOperationBuilder last(String reference) {
|
||||
return newBuilder(GroupOps.LAST, reference, null);
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates an {@link GroupOperationBuilder} for a {@code $first}-expression for the given field-reference.
|
||||
*
|
||||
* @param reference
|
||||
* @return
|
||||
*/
|
||||
public GroupOperationBuilder first(String reference) {
|
||||
return newBuilder(GroupOps.FIRST, reference, null);
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates an {@link GroupOperationBuilder} for an {@code $avg}-expression for the given field-reference.
|
||||
*
|
||||
* @param reference
|
||||
* @return
|
||||
*/
|
||||
public GroupOperationBuilder avg(String reference) {
|
||||
return newBuilder(GroupOps.AVG, reference, null);
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates an {@link GroupOperationBuilder} for an {@code $push}-expression for the given field-reference.
|
||||
*
|
||||
* @param reference
|
||||
* @return
|
||||
*/
|
||||
public GroupOperationBuilder push(String reference) {
|
||||
return push(reference, null);
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates an {@link GroupOperationBuilder} for an {@code $push}-expression for the given value.
|
||||
*
|
||||
* @param value
|
||||
* @return
|
||||
*/
|
||||
public GroupOperationBuilder push(Object value) {
|
||||
return push(null, value);
|
||||
}
|
||||
|
||||
private GroupOperationBuilder push(String reference, Object value) {
|
||||
return newBuilder(GroupOps.PUSH, reference, value);
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates a {@link GroupOperationBuilder} for a {@code $min}-expression for the given field-reference.
|
||||
*
|
||||
* @param reference
|
||||
* @return
|
||||
*/
|
||||
public GroupOperationBuilder min(String reference) {
|
||||
return newBuilder(GroupOps.MIN, reference, null);
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates a {@link GroupOperationBuilder} for a {@code $max}-expression for the given field-reference.
|
||||
*
|
||||
* @param reference
|
||||
* @return
|
||||
*/
|
||||
public GroupOperationBuilder max(String reference) {
|
||||
return newBuilder(GroupOps.MAX, reference, null);
|
||||
}
|
||||
|
||||
private GroupOperationBuilder newBuilder(Keyword keyword, String reference, Object value) {
|
||||
return new GroupOperationBuilder(this, new Operation(keyword, null, reference, value));
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getFields()
|
||||
*/
|
||||
@Override
|
||||
public ExposedFields getFields() {
|
||||
|
||||
ExposedFields fields = this.idFields.and(new ExposedField(Fields.UNDERSCORE_ID, true));
|
||||
|
||||
for (Operation operation : operations) {
|
||||
fields = fields.and(operation.asField());
|
||||
}
|
||||
|
||||
return fields;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
|
||||
*/
|
||||
@Override
|
||||
public com.mongodb.DBObject toDBObject(AggregationOperationContext context) {
|
||||
|
||||
BasicDBObject operationObject = new BasicDBObject();
|
||||
|
||||
if (idFields.exposesNoNonSyntheticFields()) {
|
||||
|
||||
operationObject.put(Fields.UNDERSCORE_ID, null);
|
||||
|
||||
} else if (idFields.exposesSingleNonSyntheticFieldOnly()) {
|
||||
|
||||
FieldReference reference = context.getReference(idFields.iterator().next());
|
||||
operationObject.put(Fields.UNDERSCORE_ID, reference.toString());
|
||||
|
||||
} else {
|
||||
|
||||
BasicDBObject inner = new BasicDBObject();
|
||||
|
||||
for (ExposedField field : idFields) {
|
||||
FieldReference reference = context.getReference(field);
|
||||
inner.put(field.getName(), reference.toString());
|
||||
}
|
||||
|
||||
operationObject.put(Fields.UNDERSCORE_ID, inner);
|
||||
}
|
||||
|
||||
for (Operation operation : operations) {
|
||||
operationObject.putAll(operation.toDBObject(context));
|
||||
}
|
||||
|
||||
return new BasicDBObject("$group", operationObject);
|
||||
}
|
||||
|
||||
interface Keyword {
|
||||
|
||||
String toString();
|
||||
}
|
||||
|
||||
private static enum GroupOps implements Keyword {
|
||||
|
||||
SUM, LAST, FIRST, PUSH, AVG, MIN, MAX, ADD_TO_SET, COUNT;
|
||||
|
||||
@Override
|
||||
public String toString() {
|
||||
|
||||
String[] parts = name().split("_");
|
||||
|
||||
StringBuilder builder = new StringBuilder();
|
||||
|
||||
for (String part : parts) {
|
||||
String lowerCase = part.toLowerCase(Locale.US);
|
||||
builder.append(builder.length() == 0 ? lowerCase : StringUtils.capitalize(lowerCase));
|
||||
}
|
||||
|
||||
return "$" + builder.toString();
|
||||
}
|
||||
}
|
||||
|
||||
static class Operation implements AggregationOperation {
|
||||
|
||||
private final Keyword op;
|
||||
private final String key;
|
||||
private final String reference;
|
||||
private final Object value;
|
||||
|
||||
public Operation(Keyword op, String key, String reference, Object value) {
|
||||
|
||||
this.op = op;
|
||||
this.key = key;
|
||||
this.reference = reference;
|
||||
this.value = value;
|
||||
}
|
||||
|
||||
public Operation withAlias(String key) {
|
||||
return new Operation(op, key, reference, value);
|
||||
}
|
||||
|
||||
public ExposedField asField() {
|
||||
return new ExposedField(key, true);
|
||||
}
|
||||
|
||||
public DBObject toDBObject(AggregationOperationContext context) {
|
||||
return new BasicDBObject(key, new BasicDBObject(op.toString(), getValue(context)));
|
||||
}
|
||||
|
||||
public Object getValue(AggregationOperationContext context) {
|
||||
return reference == null ? value : context.getReference(reference).toString();
|
||||
}
|
||||
|
||||
@Override
|
||||
public String toString() {
|
||||
return "Operation [op=" + op + ", key=" + key + ", reference=" + reference + ", value=" + value + "]";
|
||||
}
|
||||
}
|
||||
}
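A usage sketch for the builder API above, using invented field names. Note that GroupOps renders its constants in camel case, so ADD_TO_SET ends up as $addToSet in the resulting document. The rendered stage is shown approximately, as the exact field references depend on the rendering context.

```java
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.Fields;
import org.springframework.data.mongodb.core.aggregation.GroupOperation;

import com.mongodb.DBObject;

class GroupSketch {

    static DBObject render(AggregationOperationContext context) {

        GroupOperation byCustomer = new GroupOperation(Fields.fields("customerId")) //
                .count().as("orders") //
                .sum("amount").as("revenue") //
                .addToSet("productId").as("products");

        // roughly: { "$group" : { "_id" : "$customerId",
        //                         "orders" : { "$sum" : 1 },
        //                         "revenue" : { "$sum" : "$amount" },
        //                         "products" : { "$addToSet" : "$productId" } } }
        return byCustomer.toDBObject(context);
    }
}
```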
@@ -1,52 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
import com.mongodb.BasicDBObject;
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* Encapsulates the aggregation framework {@code $limit}-operation.
|
||||
*
|
||||
* @see http://docs.mongodb.org/manual/reference/aggregation/limit/
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
* @since 1.3
|
||||
*/
|
||||
class LimitOperation implements AggregationOperation {
|
||||
|
||||
private final long maxElements;
|
||||
|
||||
/**
|
||||
* @param maxElements Number of documents to consider.
|
||||
*/
|
||||
public LimitOperation(long maxElements) {
|
||||
|
||||
Assert.isTrue(maxElements >= 0, "Maximum number of elements must be greater or equal to zero!");
|
||||
this.maxElements = maxElements;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
|
||||
*/
|
||||
@Override
|
||||
public DBObject toDBObject(AggregationOperationContext context) {
|
||||
return new BasicDBObject("$limit", maxElements);
|
||||
}
|
||||
}
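Since LimitOperation is package-private in this revision, it is normally created through the aggregation entry points rather than directly; the sketch below therefore assumes it lives in the same package and only illustrates the rendered stage.

```java
import com.mongodb.DBObject;

// assumed to reside in org.springframework.data.mongodb.core.aggregation
class LimitSketch {

    static DBObject limitStage(AggregationOperationContext context) {
        // renders to { "$limit" : 10 }
        return new LimitOperation(10).toDBObject(context);
    }
}
```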
@@ -1,56 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import org.springframework.data.mongodb.core.query.Criteria;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
import com.mongodb.BasicDBObject;
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* Encapsulates the aggregation framework {@code $match}-operation.
|
||||
*
|
||||
* @see http://docs.mongodb.org/manual/reference/aggregation/match/
|
||||
* @author Sebastian Herold
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
* @since 1.3
|
||||
*/
|
||||
public class MatchOperation implements AggregationOperation {
|
||||
|
||||
private final Criteria criteria;
|
||||
|
||||
/**
|
||||
* Creates a new {@link MatchOperation} for the given {@link Criteria}.
|
||||
*
|
||||
* @param criteria must not be {@literal null}.
|
||||
*/
|
||||
public MatchOperation(Criteria criteria) {
|
||||
|
||||
Assert.notNull(criteria, "Criteria must not be null!");
|
||||
this.criteria = criteria;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
|
||||
*/
|
||||
@Override
|
||||
public DBObject toDBObject(AggregationOperationContext context) {
|
||||
return new BasicDBObject("$match", context.getMappedObject(criteria.getCriteriaObject()));
|
||||
}
|
||||
}
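A usage sketch with an invented criteria; the operation simply wraps the (potentially field-mapped) criteria document in a $match stage.

```java
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.MatchOperation;
import org.springframework.data.mongodb.core.query.Criteria;

import com.mongodb.DBObject;

class MatchSketch {

    static DBObject render(AggregationOperationContext context) {

        MatchOperation activeOnly = new MatchOperation(Criteria.where("status").is("ACTIVE"));

        // yields { "$match" : { "status" : "ACTIVE" } }
        return activeOnly.toDBObject(context);
    }
}
```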
@@ -1,774 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import java.util.ArrayList;
|
||||
import java.util.Arrays;
|
||||
import java.util.Collections;
|
||||
import java.util.List;
|
||||
|
||||
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
|
||||
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
|
||||
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder.FieldProjection;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
import com.mongodb.BasicDBList;
|
||||
import com.mongodb.BasicDBObject;
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* Encapsulates the aggregation framework {@code $project}-operation. Defines the projection of fields to be used in an
* {@link Aggregation}. A projection is similar to a {@link Field} inclusion/exclusion but more powerful: it can
* generate new fields, change the values of given fields, etc.
|
||||
* <p>
|
||||
*
|
||||
* @see http://docs.mongodb.org/manual/reference/aggregation/project/
|
||||
* @author Tobias Trelle
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
* @since 1.3
|
||||
*/
|
||||
public class ProjectionOperation implements FieldsExposingAggregationOperation {
|
||||
|
||||
private static final List<Projection> NONE = Collections.emptyList();
|
||||
private static final String EXCLUSION_ERROR = "Exclusion of field %s not allowed. Projections by the mongodb "
|
||||
+ "aggregation framework only support the exclusion of the %s field!";
|
||||
|
||||
private final List<Projection> projections;
|
||||
|
||||
/**
|
||||
* Creates a new empty {@link ProjectionOperation}.
|
||||
*/
|
||||
public ProjectionOperation() {
|
||||
this(NONE, NONE);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link ProjectionOperation} including the given {@link Fields}.
|
||||
*
|
||||
* @param fields must not be {@literal null}.
|
||||
*/
|
||||
public ProjectionOperation(Fields fields) {
|
||||
this(NONE, ProjectionOperationBuilder.FieldProjection.from(fields));
|
||||
}
|
||||
|
||||
/**
|
||||
* Copy constructor to allow building up {@link ProjectionOperation} instances from already existing
|
||||
* {@link Projection}s.
|
||||
*
|
||||
* @param current must not be {@literal null}.
|
||||
* @param projections must not be {@literal null}.
|
||||
*/
|
||||
private ProjectionOperation(List<? extends Projection> current, List<? extends Projection> projections) {
|
||||
|
||||
Assert.notNull(current, "Current projections must not be null!");
|
||||
Assert.notNull(projections, "Projections must not be null!");
|
||||
|
||||
this.projections = new ArrayList<ProjectionOperation.Projection>(current.size() + projections.size());
|
||||
this.projections.addAll(current);
|
||||
this.projections.addAll(projections);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link ProjectionOperation} with the current {@link Projection}s and the given one.
|
||||
*
|
||||
* @param projection must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
private ProjectionOperation and(Projection projection) {
|
||||
return new ProjectionOperation(this.projections, Arrays.asList(projection));
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link ProjectionOperation} with the current {@link Projection}s, replacing the last one with the
* given one.
|
||||
*
|
||||
* @param projection must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
private ProjectionOperation andReplaceLastOneWith(Projection projection) {
|
||||
|
||||
List<Projection> projections = this.projections.isEmpty() ? Collections.<Projection> emptyList() : this.projections
|
||||
.subList(0, this.projections.size() - 1);
|
||||
return new ProjectionOperation(projections, Arrays.asList(projection));
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link ProjectionOperationBuilder} to define a projection for the field with the given name.
|
||||
*
|
||||
* @param name must not be {@literal null} or empty.
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder and(String name) {
|
||||
return new ProjectionOperationBuilder(name, this, null);
|
||||
}
|
||||
|
||||
public ExpressionProjectionOperationBuilder andExpression(String expression, Object... params) {
|
||||
return new ExpressionProjectionOperationBuilder(expression, this, params);
|
||||
}
|
||||
|
||||
/**
|
||||
* Excludes the given fields from the projection.
|
||||
*
|
||||
* @param fieldNames must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperation andExclude(String... fieldNames) {
|
||||
|
||||
for (String fieldName : fieldNames) {
|
||||
Assert.isTrue(Fields.UNDERSCORE_ID.equals(fieldName),
|
||||
String.format(EXCLUSION_ERROR, fieldName, Fields.UNDERSCORE_ID));
|
||||
}
|
||||
|
||||
List<FieldProjection> excludeProjections = FieldProjection.from(Fields.fields(fieldNames), false);
|
||||
return new ProjectionOperation(this.projections, excludeProjections);
|
||||
}
|
||||
|
||||
/**
|
||||
* Includes the given fields into the projection.
|
||||
*
|
||||
* @param fieldNames must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperation andInclude(String... fieldNames) {
|
||||
|
||||
List<FieldProjection> projections = FieldProjection.from(Fields.fields(fieldNames), true);
|
||||
return new ProjectionOperation(this.projections, projections);
|
||||
}
|
||||
|
||||
/**
|
||||
* Includes the given fields into the projection.
|
||||
*
|
||||
* @param fields must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperation andInclude(Fields fields) {
|
||||
return new ProjectionOperation(this.projections, FieldProjection.from(fields, true));
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#getFields()
|
||||
*/
|
||||
@Override
|
||||
public ExposedFields getFields() {
|
||||
|
||||
ExposedFields fields = null;
|
||||
|
||||
for (Projection projection : projections) {
|
||||
ExposedField field = projection.getExposedField();
|
||||
fields = fields == null ? ExposedFields.from(field) : fields.and(field);
|
||||
}
|
||||
|
||||
return fields;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
|
||||
*/
|
||||
@Override
|
||||
public DBObject toDBObject(AggregationOperationContext context) {
|
||||
|
||||
BasicDBObject fieldObject = new BasicDBObject();
|
||||
|
||||
for (Projection projection : projections) {
|
||||
fieldObject.putAll(projection.toDBObject(context));
|
||||
}
|
||||
|
||||
return new BasicDBObject("$project", fieldObject);
|
||||
}
|
||||
|
||||
/**
|
||||
* Base class for {@link ProjectionOperationBuilder}s.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
private static abstract class AbstractProjectionOperationBuilder implements AggregationOperation {
|
||||
|
||||
protected final Object value;
|
||||
protected final ProjectionOperation operation;
|
||||
|
||||
/**
|
||||
* Creates a new {@link AbstractProjectionOperationBuilder} for the given value and {@link ProjectionOperation}.
|
||||
*
|
||||
* @param value must not be {@literal null}.
|
||||
* @param operation must not be {@literal null}.
|
||||
*/
|
||||
public AbstractProjectionOperationBuilder(Object value, ProjectionOperation operation) {
|
||||
|
||||
Assert.notNull(value, "value must not be null or empty!");
|
||||
Assert.notNull(operation, "ProjectionOperation must not be null!");
|
||||
|
||||
this.value = value;
|
||||
this.operation = operation;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
|
||||
*/
|
||||
@Override
|
||||
public DBObject toDBObject(AggregationOperationContext context) {
|
||||
return this.operation.toDBObject(context);
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the {@link ProjectionOperation} to be finally applied, using the given alias.
|
||||
*
|
||||
* @param alias will never be {@literal null} or empty.
|
||||
* @return
|
||||
*/
|
||||
public abstract ProjectionOperation as(String alias);
|
||||
}
|
||||
|
||||
/**
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
public static class ExpressionProjectionOperationBuilder extends AbstractProjectionOperationBuilder {
|
||||
|
||||
private final Object[] params;
|
||||
|
||||
/**
|
||||
* Creates a new {@link ExpressionProjectionOperationBuilder} for the given value, {@link ProjectionOperation} and
|
||||
* parameters.
|
||||
*
|
||||
* @param value must not be {@literal null}.
|
||||
* @param operation must not be {@literal null}.
|
||||
* @param parameters
|
||||
*/
|
||||
public ExpressionProjectionOperationBuilder(Object value, ProjectionOperation operation, Object[] parameters) {
|
||||
|
||||
super(value, operation);
|
||||
this.params = parameters;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.AbstractProjectionOperationBuilder#as(java.lang.String)
|
||||
*/
|
||||
@Override
|
||||
public ProjectionOperation as(String alias) {
|
||||
|
||||
Field expressionField = Fields.field(alias, "expr");
|
||||
return this.operation.and(new ExpressionProjection(expressionField, this.value.toString(), params));
|
||||
}
|
||||
|
||||
/**
|
||||
* A {@link Projection} based on a SpEL expression.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
static class ExpressionProjection extends Projection {
|
||||
|
||||
private static final SpelExpressionTransformer TRANSFORMER = new SpelExpressionTransformer();
|
||||
|
||||
private final String expression;
|
||||
private final Object[] params;
|
||||
|
||||
/**
|
||||
* Creates a new {@link ExpressionProjection} for the given field, SpEL expression and parameters.
|
||||
*
|
||||
* @param field must not be {@literal null}.
|
||||
* @param expression must not be {@literal null} or empty.
|
||||
* @param parameters must not be {@literal null}.
|
||||
*/
|
||||
public ExpressionProjection(Field field, String expression, Object[] parameters) {
|
||||
|
||||
super(field);
|
||||
|
||||
Assert.hasText(expression, "Expression must not be null!");
|
||||
Assert.notNull(parameters, "Parameters must not be null!");
|
||||
|
||||
this.expression = expression;
|
||||
this.params = parameters;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
|
||||
*/
|
||||
@Override
|
||||
public DBObject toDBObject(AggregationOperationContext context) {
|
||||
return new BasicDBObject(getExposedField().getName(), TRANSFORMER.transform(expression, context, params));
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Builder for {@link ProjectionOperation}s on a field.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
public static class ProjectionOperationBuilder extends AbstractProjectionOperationBuilder {
|
||||
|
||||
private final String name;
|
||||
private final ProjectionOperation operation;
|
||||
private final OperationProjection previousProjection;
|
||||
|
||||
/**
|
||||
* Creates a new {@link ProjectionOperationBuilder} for the field with the given name on top of the given
|
||||
* {@link ProjectionOperation}.
|
||||
*
|
||||
* @param name must not be {@literal null} or empty.
|
||||
* @param operation must not be {@literal null}.
|
||||
* @param previousProjection the previous operation projection, may be {@literal null}.
|
||||
*/
|
||||
public ProjectionOperationBuilder(String name, ProjectionOperation operation, OperationProjection previousProjection) {
|
||||
super(name, operation);
|
||||
|
||||
this.name = name;
|
||||
this.operation = operation;
|
||||
this.previousProjection = previousProjection;
|
||||
}
|
||||
|
||||
/**
|
||||
* Projects the result of the previous operation onto the current field. Automatically adds an exclusion for
* {@code _id}, as the value it would hold by default is now carried by the field just projected into.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperation previousOperation() {
|
||||
|
||||
return this.operation.andExclude(Fields.UNDERSCORE_ID) //
|
||||
.and(new PreviousOperationProjection(name));
|
||||
}
|
||||
|
||||
/**
|
||||
* Defines a nested field binding for the current field.
|
||||
*
|
||||
* @param fields must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperation nested(Fields fields) {
|
||||
return this.operation.and(new NestedFieldProjection(name, fields));
|
||||
}
|
||||
|
||||
/**
|
||||
* Allows specifying an alias for the previous projection operation.
*
* @param alias
|
||||
* @return
|
||||
*/
|
||||
@Override
|
||||
public ProjectionOperation as(String alias) {
|
||||
|
||||
if (this.previousProjection != null) {
|
||||
return this.operation.andReplaceLastOneWith(this.previousProjection.withAlias(alias));
|
||||
} else {
|
||||
return this.operation.and(new FieldProjection(Fields.field(alias, name), null));
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates an {@code $add} expression that adds the given number to the previously mentioned field.
|
||||
*
|
||||
* @param number
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder plus(Number number) {
|
||||
|
||||
Assert.notNull(number, "Number must not be null!");
|
||||
return project("add", number);
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates an {@code $add} expression that adds the value of the given field to the previously mentioned field.
|
||||
*
|
||||
* @param fieldReference
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder plus(String fieldReference) {
|
||||
|
||||
Assert.notNull(fieldReference, "Field reference must not be null!");
|
||||
return project("add", Fields.field(fieldReference));
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates a {@code $subtract} expression that subtracts the given number from the previously mentioned field.
|
||||
*
|
||||
* @param number
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder minus(Number number) {
|
||||
|
||||
Assert.notNull(number, "Number must not be null!");
|
||||
return project("subtract", number);
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates a {@code $subtract} expression that subtracts the value of the given field from the previously mentioned
* field.
|
||||
*
|
||||
* @param fieldReference
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder minus(String fieldReference) {
|
||||
|
||||
Assert.notNull(fieldReference, "Field reference must not be null!");
|
||||
return project("subtract", Fields.field(fieldReference));
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates an {@code $multiply} expression that multiplies the given number with the previously mentioned field.
|
||||
*
|
||||
* @param number
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder multiply(Number number) {
|
||||
|
||||
Assert.notNull(number, "Number must not be null!");
|
||||
return project("multiply", number);
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates an {@code $multiply} expression that multiplies the value of the given field with the previously
|
||||
* mentioned field.
|
||||
*
|
||||
* @param fieldReference
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder multiply(String fieldReference) {
|
||||
|
||||
Assert.notNull(fieldReference, "Field reference must not be null!");
|
||||
return project("multiply", Fields.field(fieldReference));
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates an {@code $divide} expression that divides the previously mentioned field by the given number.
|
||||
*
|
||||
* @param number
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder divide(Number number) {
|
||||
|
||||
Assert.notNull(number, "Number must not be null!");
|
||||
Assert.isTrue(Math.abs(number.intValue()) != 0, "Number must not be zero!");
|
||||
return project("divide", number);
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates a {@code $divide} expression that divides the previously mentioned field by the value of the given
* field.
|
||||
*
|
||||
* @param fieldReference
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder divide(String fieldReference) {
|
||||
|
||||
Assert.notNull(fieldReference, "Field reference must not be null!");
|
||||
return project("divide", Fields.field(fieldReference));
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates an {@code $mod} expression that divides the previously mentioned field by the given number and returns
|
||||
* the remainder.
|
||||
*
|
||||
* @param number
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder mod(Number number) {
|
||||
|
||||
Assert.notNull(number, "Number must not be null!");
|
||||
Assert.isTrue(Math.abs(number.intValue()) != 0, "Number must not be zero!");
|
||||
return project("mod", number);
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates a {@code $mod} expression that divides the previously mentioned field by the value of the given field
* and returns the remainder.
|
||||
*
|
||||
* @param fieldReference
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder mod(String fieldReference) {
|
||||
|
||||
Assert.notNull(fieldReference, "Field reference must not be null!");
|
||||
return project("mod", Fields.field(fieldReference));
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
|
||||
*/
|
||||
@Override
|
||||
public DBObject toDBObject(AggregationOperationContext context) {
|
||||
return this.operation.toDBObject(context);
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds a generic projection for the current field.
|
||||
*
|
||||
* @param operation the operation key, e.g. {@code $add}.
|
||||
* @param values the values to be set for the projection operation.
|
||||
* @return
|
||||
*/
|
||||
public ProjectionOperationBuilder project(String operation, Object... values) {
|
||||
OperationProjection projectionOperation = new OperationProjection(Fields.field(name), operation, values);
|
||||
return new ProjectionOperationBuilder(name, this.operation.and(projectionOperation), projectionOperation);
|
||||
}
|
||||
|
||||
/**
|
||||
* A {@link Projection} to pull in the result of the previous operation.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
static class PreviousOperationProjection extends Projection {
|
||||
|
||||
private final String name;
|
||||
|
||||
/**
|
||||
* Creates a new {@link PreviousOperationProjection} for the field with the given name.
|
||||
*
|
||||
* @param name must not be {@literal null} or empty.
|
||||
*/
|
||||
public PreviousOperationProjection(String name) {
|
||||
super(Fields.field(name));
|
||||
this.name = name;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
|
||||
*/
|
||||
@Override
|
||||
public DBObject toDBObject(AggregationOperationContext context) {
|
||||
return new BasicDBObject(name, Fields.UNDERSCORE_ID_REF);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* A {@link Projection} rendering a plain field inclusion, exclusion or the assignment of a given value.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
static class FieldProjection extends Projection {
|
||||
|
||||
private final Field field;
|
||||
private final Object value;
|
||||
|
||||
/**
|
||||
* Creates a new {@link FieldProjection} for the field of the given name, assigning the given value.
|
||||
*
|
||||
* @param name must not be {@literal null} or empty.
|
||||
* @param value
|
||||
*/
|
||||
public FieldProjection(String name, Object value) {
|
||||
this(Fields.field(name), value);
|
||||
}
|
||||
|
||||
private FieldProjection(Field field, Object value) {
|
||||
|
||||
super(field);
|
||||
|
||||
this.field = field;
|
||||
this.value = value;
|
||||
}
|
||||
|
||||
/**
|
||||
* Factory method to easily create {@link FieldProjection}s for the given {@link Fields}. Fields are projected as
|
||||
* references with their given name. A field {@code foo} will be projected as {@code foo : 1}.
|
||||
*
|
||||
* @param fields the {@link Fields} to in- or exclude, must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public static List<? extends Projection> from(Fields fields) {
|
||||
return from(fields, null);
|
||||
}
|
||||
|
||||
/**
|
||||
* Factory method to easily create {@link FieldProjection}s for the given {@link Fields}.
|
||||
*
|
||||
* @param fields the {@link Fields} to in- or exclude, must not be {@literal null}.
|
||||
* @param value to use for the given field.
|
||||
* @return
|
||||
*/
|
||||
public static List<FieldProjection> from(Fields fields, Object value) {
|
||||
|
||||
Assert.notNull(fields, "Fields must not be null!");
|
||||
List<FieldProjection> projections = new ArrayList<FieldProjection>();
|
||||
|
||||
for (Field field : fields) {
|
||||
projections.add(new FieldProjection(field, value));
|
||||
}
|
||||
|
||||
return projections;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
|
||||
*/
|
||||
@Override
|
||||
public DBObject toDBObject(AggregationOperationContext context) {
|
||||
return new BasicDBObject(field.getName(), renderFieldValue(context));
|
||||
}
|
||||
|
||||
private Object renderFieldValue(AggregationOperationContext context) {
|
||||
|
||||
// implicit reference or explicit include?
|
||||
if (value == null || Boolean.TRUE.equals(value)) {
|
||||
|
||||
// check whether referenced field exists in the context
|
||||
FieldReference reference = context.getReference(field.getTarget());
|
||||
return reference.isSynthetic() && !field.isAliased() ? 1 : reference.toString();
|
||||
|
||||
} else if (Boolean.FALSE.equals(value)) {
|
||||
|
||||
// render field as excluded
|
||||
return 0;
|
||||
}
|
||||
|
||||
return value;
|
||||
}
|
||||
}
|
||||
|
||||
static class OperationProjection extends Projection {
|
||||
|
||||
private final Field field;
|
||||
private final String operation;
|
||||
private final List<Object> values;
|
||||
|
||||
/**
|
||||
* Creates a new {@link OperationProjection} for the given field.
|
||||
*
|
||||
* @param field the field to add the operation projection for, must not be {@literal null}.
|
||||
* @param operation the actual operation key, must not be {@literal null} or empty.
|
||||
* @param values the values to pass into the operation, must not be {@literal null}.
|
||||
*/
|
||||
public OperationProjection(Field field, String operation, Object[] values) {
|
||||
|
||||
super(field);
|
||||
|
||||
Assert.hasText(operation, "Operation must not be null or empty!");
|
||||
Assert.notNull(values, "Values must not be null!");
|
||||
|
||||
this.field = field;
|
||||
this.operation = operation;
|
||||
this.values = Arrays.asList(values);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
|
||||
*/
|
||||
@Override
|
||||
public DBObject toDBObject(AggregationOperationContext context) {
|
||||
|
||||
BasicDBList values = new BasicDBList();
|
||||
values.addAll(buildReferences(context));
|
||||
|
||||
DBObject inner = new BasicDBObject("$" + operation, values);
|
||||
|
||||
return new BasicDBObject(this.field.getName(), inner);
|
||||
}
|
||||
|
||||
private List<Object> buildReferences(AggregationOperationContext context) {
|
||||
|
||||
List<Object> result = new ArrayList<Object>(values.size());
|
||||
result.add(context.getReference(field.getTarget()).toString());
|
||||
|
||||
for (Object element : values) {
|
||||
result.add(element instanceof Field ? context.getReference((Field) element).toString() : element);
|
||||
}
|
||||
|
||||
return result;
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new instance of this {@link OperationProjection} with the given alias.
|
||||
*
|
||||
* @param alias the alias to set
|
||||
* @return
|
||||
*/
|
||||
public OperationProjection withAlias(String alias) {
|
||||
return new OperationProjection(Fields.field(alias, this.field.getName()), operation, values.toArray());
|
||||
}
|
||||
}
|
||||
|
||||
static class NestedFieldProjection extends Projection {
|
||||
|
||||
private final String name;
|
||||
private final Fields fields;
|
||||
|
||||
public NestedFieldProjection(String name, Fields fields) {
|
||||
|
||||
super(Fields.field(name));
|
||||
this.name = name;
|
||||
this.fields = fields;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
|
||||
*/
|
||||
@Override
|
||||
public DBObject toDBObject(AggregationOperationContext context) {
|
||||
|
||||
DBObject nestedObject = new BasicDBObject();
|
||||
|
||||
for (Field field : fields) {
|
||||
nestedObject.put(field.getName(), context.getReference(field.getTarget()).toString());
|
||||
}
|
||||
|
||||
return new BasicDBObject(name, nestedObject);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Base class for {@link Projection} implementations.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
private static abstract class Projection {
|
||||
|
||||
private final ExposedField field;
|
||||
|
||||
/**
|
||||
* Creates new {@link Projection} for the given {@link Field}.
|
||||
*
|
||||
* @param field must not be {@literal null}.
|
||||
*/
|
||||
public Projection(Field field) {
|
||||
|
||||
Assert.notNull(field, "Field must not be null!");
|
||||
this.field = new ExposedField(field, true);
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the field exposed by the {@link Projection}.
|
||||
*
|
||||
* @return will never be {@literal null}.
|
||||
*/
|
||||
public ExposedField getExposedField() {
|
||||
return field;
|
||||
}
|
||||
|
||||
/**
|
||||
* Renders the current {@link Projection} into a {@link DBObject} based on the given
|
||||
* {@link AggregationOperationContext}.
|
||||
*
|
||||
* @param context will never be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public abstract DBObject toDBObject(AggregationOperationContext context);
|
||||
}
|
||||
|
||||
}
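A usage sketch tying the pieces above together, again with invented field names; the rendered document is shown for an untyped rendering context and is therefore only approximate.

```java
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.Fields;
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation;

import com.mongodb.DBObject;

class ProjectSketch {

    static DBObject render(AggregationOperationContext context) {

        ProjectionOperation project = new ProjectionOperation(Fields.fields("firstname", "lastname")) //
                .and("netPrice").plus("tax").as("totalPrice") //
                .andExclude("_id");

        // with an untyped context, roughly:
        // { "$project" : { "firstname" : 1, "lastname" : 1,
        //                  "totalPrice" : { "$add" : [ "$netPrice", "$tax" ] },
        //                  "_id" : 0 } }
        return project.toDBObject(context);
    }
}
```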
@@ -1,54 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
import com.mongodb.BasicDBObject;
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* Encapsulates the aggregation framework {@code $skip}-operation.
|
||||
*
|
||||
* @see http://docs.mongodb.org/manual/reference/aggregation/skip/
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
* @since 1.3
|
||||
*/
|
||||
public class SkipOperation implements AggregationOperation {
|
||||
|
||||
private final long skipCount;
|
||||
|
||||
/**
|
||||
* Creates a new {@link SkipOperation} skipping the given number of elements.
|
||||
*
|
||||
* @param skipCount number of documents to skip.
|
||||
*/
|
||||
public SkipOperation(long skipCount) {
|
||||
|
||||
Assert.isTrue(skipCount >= 0, "Skip count must not be negative!");
|
||||
this.skipCount = skipCount;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
|
||||
*/
|
||||
@Override
|
||||
public DBObject toDBObject(AggregationOperationContext context) {
|
||||
return new BasicDBObject("$skip", skipCount);
|
||||
}
|
||||
}
|
||||
@@ -1,76 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import org.springframework.data.domain.Sort;
|
||||
import org.springframework.data.domain.Sort.Direction;
|
||||
import org.springframework.data.domain.Sort.Order;
|
||||
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
import com.mongodb.BasicDBObject;
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* Encapsulates the aggregation framework {@code $sort}-operation.
|
||||
*
|
||||
* @see http://docs.mongodb.org/manual/reference/aggregation/sort/#pipe._S_sort
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
* @since 1.3
|
||||
*/
|
||||
public class SortOperation implements AggregationOperation {
|
||||
|
||||
private final Sort sort;
|
||||
|
||||
/**
|
||||
* Creates a new {@link SortOperation} for the given {@link Sort} instance.
|
||||
*
|
||||
* @param sort must not be {@literal null}.
|
||||
*/
|
||||
public SortOperation(Sort sort) {
|
||||
|
||||
Assert.notNull(sort, "Sort must not be null!");
|
||||
this.sort = sort;
|
||||
}
|
||||
|
||||
public SortOperation and(Direction direction, String... fields) {
|
||||
return and(new Sort(direction, fields));
|
||||
}
|
||||
|
||||
public SortOperation and(Sort sort) {
|
||||
return new SortOperation(this.sort.and(sort));
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
|
||||
*/
|
||||
@Override
|
||||
public DBObject toDBObject(AggregationOperationContext context) {
|
||||
|
||||
BasicDBObject object = new BasicDBObject();
|
||||
|
||||
for (Order order : sort) {
|
||||
|
||||
// Check reference
|
||||
FieldReference reference = context.getReference(order.getProperty());
|
||||
object.put(reference.getRaw(), order.isAscending() ? 1 : -1);
|
||||
}
|
||||
|
||||
return new BasicDBObject("$sort", object);
|
||||
}
|
||||
}
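A usage sketch combining the sort with the paging operation defined earlier in this change; the field names are invented and the rendered documents are approximate.

```java
import java.util.Arrays;
import java.util.List;

import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.SkipOperation;
import org.springframework.data.mongodb.core.aggregation.SortOperation;

import com.mongodb.DBObject;

class SortAndPageSketch {

    static List<DBObject> render(AggregationOperationContext context) {

        SortOperation sort = new SortOperation(new Sort(Direction.DESC, "revenue")).and(Direction.ASC, "customerId");
        SkipOperation skip = new SkipOperation(20);

        DBObject sortStage = sort.toDBObject(context); // { "$sort" : { "revenue" : -1, "customerId" : 1 } }
        DBObject skipStage = skip.toDBObject(context); // { "$skip" : 20 }

        return Arrays.asList(sortStage, skipStage);
    }
}
```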
@@ -1,513 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.aggregation;
|
||||
|
||||
import static org.springframework.data.mongodb.util.DBObjectUtils.*;
|
||||
|
||||
import java.util.ArrayList;
|
||||
import java.util.Collections;
|
||||
import java.util.List;
|
||||
|
||||
import org.springframework.core.GenericTypeResolver;
|
||||
import org.springframework.data.mongodb.core.spel.ExpressionNode;
|
||||
import org.springframework.data.mongodb.core.spel.ExpressionTransformationContextSupport;
|
||||
import org.springframework.data.mongodb.core.spel.LiteralNode;
|
||||
import org.springframework.data.mongodb.core.spel.MethodReferenceNode;
|
||||
import org.springframework.data.mongodb.core.spel.OperatorNode;
|
||||
import org.springframework.expression.spel.ExpressionState;
|
||||
import org.springframework.expression.spel.SpelNode;
|
||||
import org.springframework.expression.spel.SpelParserConfiguration;
|
||||
import org.springframework.expression.spel.ast.CompoundExpression;
|
||||
import org.springframework.expression.spel.ast.Indexer;
|
||||
import org.springframework.expression.spel.ast.InlineList;
|
||||
import org.springframework.expression.spel.ast.PropertyOrFieldReference;
|
||||
import org.springframework.expression.spel.standard.SpelExpression;
|
||||
import org.springframework.expression.spel.standard.SpelExpressionParser;
|
||||
import org.springframework.expression.spel.support.StandardEvaluationContext;
|
||||
import org.springframework.util.Assert;
|
||||
import org.springframework.util.NumberUtils;
|
||||
|
||||
import com.mongodb.BasicDBList;
|
||||
import com.mongodb.BasicDBObject;
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* Renders the AST of a SpEL expression as a MongoDB Aggregation Framework projection expression.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
class SpelExpressionTransformer implements AggregationExpressionTransformer {
|
||||
|
||||
// TODO: remove explicit usage of a configuration once SPR-11031 gets fixed
|
||||
private static final SpelParserConfiguration CONFIG = new SpelParserConfiguration(false, false);
|
||||
private static final SpelExpressionParser PARSER = new SpelExpressionParser(CONFIG);
|
||||
private final List<ExpressionNodeConversion<? extends ExpressionNode>> conversions;
|
||||
|
||||
/**
|
||||
* Creates a new {@link SpelExpressionTransformer}.
|
||||
*/
|
||||
public SpelExpressionTransformer() {
|
||||
|
||||
List<ExpressionNodeConversion<? extends ExpressionNode>> conversions = new ArrayList<ExpressionNodeConversion<? extends ExpressionNode>>();
|
||||
conversions.add(new OperatorNodeConversion(this));
|
||||
conversions.add(new LiteralNodeConversion(this));
|
||||
conversions.add(new IndexerNodeConversion(this));
|
||||
conversions.add(new InlineListNodeConversion(this));
|
||||
conversions.add(new PropertyOrFieldReferenceNodeConversion(this));
|
||||
conversions.add(new CompoundExpressionNodeConversion(this));
|
||||
conversions.add(new MethodReferenceNodeConversion(this));
|
||||
|
||||
this.conversions = Collections.unmodifiableList(conversions);
|
||||
}
|
||||
|
||||
/**
|
||||
* Transforms the given SpEL expression to a corresponding MongoDB expression against the given
|
||||
* {@link AggregationOperationContext} {@code context}.
|
||||
* <p>
|
||||
	 * Exposes the given {@code params} as <code>[0] ... [n]</code>.
|
||||
*
|
||||
* @param expression must not be {@literal null}
|
||||
* @param context must not be {@literal null}
|
||||
* @param params must not be {@literal null}
|
||||
* @return
|
||||
*/
|
||||
public Object transform(String expression, AggregationOperationContext context, Object... params) {
|
||||
|
||||
Assert.notNull(expression, "Expression must not be null!");
|
||||
Assert.notNull(context, "AggregationOperationContext must not be null!");
|
||||
Assert.notNull(params, "Parameters must not be null!");
|
||||
|
||||
SpelExpression spelExpression = (SpelExpression) PARSER.parseExpression(expression);
|
||||
ExpressionState state = new ExpressionState(new StandardEvaluationContext(params), CONFIG);
|
||||
ExpressionNode node = ExpressionNode.from(spelExpression.getAST(), state);
|
||||
|
||||
return transform(new AggregationExpressionTransformationContext<ExpressionNode>(node, null, null, context));
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.spel.ExpressionTransformer#transform(org.springframework.data.mongodb.core.spel.ExpressionTransformationContextSupport)
|
||||
*/
|
||||
public Object transform(AggregationExpressionTransformationContext<ExpressionNode> context) {
|
||||
return lookupConversionFor(context.getCurrentNode()).convert(context);
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns an appropriate {@link ExpressionNodeConversion} for the given {@code node}. Throws an
|
||||
* {@link IllegalArgumentException} if no conversion could be found.
|
||||
*
|
||||
* @param node
|
||||
* @return the appropriate {@link ExpressionNodeConversion} for the given {@link ExpressionNode}.
|
||||
*/
|
||||
@SuppressWarnings("unchecked")
|
||||
private ExpressionNodeConversion<ExpressionNode> lookupConversionFor(ExpressionNode node) {
|
||||
|
||||
for (ExpressionNodeConversion<? extends ExpressionNode> candidate : conversions) {
|
||||
if (candidate.supports(node)) {
|
||||
return (ExpressionNodeConversion<ExpressionNode>) candidate;
|
||||
}
|
||||
}
|
||||
|
||||
throw new IllegalArgumentException("Unsupported Element: " + node + " Type: " + node.getClass()
|
||||
+ " You probably have a syntax error in your SpEL expression!");
|
||||
}
|
||||
|
||||
/**
|
||||
* Abstract base class for {@link SpelNode} to (Db)-object conversions.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
private static abstract class ExpressionNodeConversion<T extends ExpressionNode> implements
|
||||
AggregationExpressionTransformer {
|
||||
|
||||
private final AggregationExpressionTransformer transformer;
|
||||
private final Class<? extends ExpressionNode> nodeType;
|
||||
|
||||
/**
|
||||
* Creates a new {@link ExpressionNodeConversion}.
|
||||
*
|
||||
* @param transformer must not be {@literal null}.
|
||||
*/
|
||||
@SuppressWarnings("unchecked")
|
||||
public ExpressionNodeConversion(AggregationExpressionTransformer transformer) {
|
||||
|
||||
Assert.notNull(transformer, "Transformer must not be null!");
|
||||
|
||||
this.nodeType = (Class<? extends ExpressionNode>) GenericTypeResolver.resolveTypeArgument(this.getClass(),
|
||||
ExpressionNodeConversion.class);
|
||||
this.transformer = transformer;
|
||||
}
|
||||
|
||||
/**
|
||||
		 * Returns whether the current conversion supports the given {@link ExpressionNode}. By default we match the
		 * node type against the generic type the subclass binds its type parameter to.
|
||||
*
|
||||
* @param node will never be {@literal null}.
|
||||
* @return true if {@literal this} conversion can be applied to the given {@code node}.
|
||||
*/
|
||||
protected boolean supports(ExpressionNode node) {
|
||||
return nodeType.equals(node.getClass());
|
||||
}
|
||||
|
||||
/**
|
||||
* Triggers the transformation for the given {@link ExpressionNode} and the given current context.
|
||||
*
|
||||
* @param node must not be {@literal null}.
|
||||
* @param context must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
protected Object transform(ExpressionNode node, AggregationExpressionTransformationContext<?> context) {
|
||||
|
||||
Assert.notNull(node, "ExpressionNode must not be null!");
|
||||
Assert.notNull(context, "AggregationExpressionTransformationContext must not be null!");
|
||||
|
||||
return transform(node, context.getParentNode(), null, context);
|
||||
}
|
||||
|
||||
/**
|
||||
* Triggers the transformation with the given new {@link ExpressionNode}, new parent node, the current operation and
|
||||
* the previous context.
|
||||
*
|
||||
* @param node must not be {@literal null}.
|
||||
* @param parent
|
||||
* @param operation
|
||||
* @param context must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
protected Object transform(ExpressionNode node, ExpressionNode parent, DBObject operation,
|
||||
AggregationExpressionTransformationContext<?> context) {
|
||||
|
||||
Assert.notNull(node, "ExpressionNode must not be null!");
|
||||
Assert.notNull(context, "AggregationExpressionTransformationContext must not be null!");
|
||||
|
||||
return transform(new AggregationExpressionTransformationContext<ExpressionNode>(node, parent, operation,
|
||||
context.getAggregationContext()));
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.NodeConversion#transform(org.springframework.data.mongodb.core.aggregation.AggregationExpressionTransformer.AggregationExpressionTransformationContext)
|
||||
*/
|
||||
@Override
|
||||
public Object transform(AggregationExpressionTransformationContext<ExpressionNode> context) {
|
||||
return transformer.transform(context);
|
||||
}
|
||||
|
||||
/**
|
||||
* Performs the actual conversion from {@link SpelNode} to the corresponding representation for MongoDB.
|
||||
*
|
||||
* @param context
|
||||
* @return
|
||||
*/
|
||||
protected abstract Object convert(AggregationExpressionTransformationContext<T> context);
|
||||
}
|
||||
|
||||
/**
|
||||
* A {@link ExpressionNodeConversion} that converts arithmetic operations.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
private static class OperatorNodeConversion extends ExpressionNodeConversion<OperatorNode> {
|
||||
|
||||
public OperatorNodeConversion(AggregationExpressionTransformer transformer) {
|
||||
super(transformer);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#convertSpelNodeToMongoObjectExpression(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionConversionContext)
|
||||
*/
|
||||
@Override
|
||||
protected Object convert(AggregationExpressionTransformationContext<OperatorNode> context) {
|
||||
|
||||
OperatorNode currentNode = context.getCurrentNode();
|
||||
|
||||
DBObject operationObject = createOperationObjectAndAddToPreviousArgumentsIfNecessary(context, currentNode);
|
||||
Object leftResult = transform(currentNode.getLeft(), currentNode, operationObject, context);
|
||||
|
||||
if (currentNode.isUnaryMinus()) {
|
||||
return convertUnaryMinusOp(context, leftResult);
|
||||
}
|
||||
|
||||
// we deliberately ignore the RHS result
|
||||
transform(currentNode.getRight(), currentNode, operationObject, context);
|
||||
|
||||
return operationObject;
|
||||
}
|
||||
|
||||
private DBObject createOperationObjectAndAddToPreviousArgumentsIfNecessary(
|
||||
AggregationExpressionTransformationContext<OperatorNode> context, OperatorNode currentNode) {
|
||||
|
||||
DBObject nextDbObject = new BasicDBObject(currentNode.getMongoOperator(), new BasicDBList());
|
||||
|
||||
if (!context.hasPreviousOperation()) {
|
||||
return nextDbObject;
|
||||
}
|
||||
|
||||
if (context.parentIsSameOperation()) {
|
||||
|
||||
// same operator applied in a row e.g. 1 + 2 + 3 carry on with the operation and render as $add: [1, 2 ,3]
|
||||
nextDbObject = context.getPreviousOperationObject();
|
||||
} else if (!currentNode.isUnaryOperator()) {
|
||||
|
||||
				// different operator -> add the context object for the next level to the argument list of the previous expression
|
||||
context.addToPreviousOperation(nextDbObject);
|
||||
}
|
||||
|
||||
return nextDbObject;
|
||||
}
|
||||
|
||||
private Object convertUnaryMinusOp(ExpressionTransformationContextSupport<OperatorNode> context, Object leftResult) {
|
||||
|
||||
Object result = leftResult instanceof Number ? leftResult
|
||||
: new BasicDBObject("$multiply", dbList(-1, leftResult));
|
||||
|
||||
if (leftResult != null && context.hasPreviousOperation()) {
|
||||
context.addToPreviousOperation(result);
|
||||
}
|
||||
|
||||
return result;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#supports(java.lang.Class)
|
||||
*/
|
||||
@Override
|
||||
protected boolean supports(ExpressionNode node) {
|
||||
return node.isMathematicalOperation();
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* A {@link ExpressionNodeConversion} that converts indexed expressions.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
private static class IndexerNodeConversion extends ExpressionNodeConversion<ExpressionNode> {
|
||||
|
||||
public IndexerNodeConversion(AggregationExpressionTransformer transformer) {
|
||||
super(transformer);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#convertSpelNodeToMongoObjectExpression(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionConversionContext)
|
||||
*/
|
||||
@Override
|
||||
protected Object convert(AggregationExpressionTransformationContext<ExpressionNode> context) {
|
||||
return context.addToPreviousOrReturn(context.getCurrentNode().getValue());
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.NodeConversion#supports(org.springframework.data.mongodb.core.spel.ExpressionNode)
|
||||
*/
|
||||
@Override
|
||||
protected boolean supports(ExpressionNode node) {
|
||||
return node.isOfType(Indexer.class);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* A {@link ExpressionNodeConversion} that converts in-line list expressions.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
private static class InlineListNodeConversion extends ExpressionNodeConversion<ExpressionNode> {
|
||||
|
||||
public InlineListNodeConversion(AggregationExpressionTransformer transformer) {
|
||||
super(transformer);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#convertSpelNodeToMongoObjectExpression(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionConversionContext)
|
||||
*/
|
||||
@Override
|
||||
protected Object convert(AggregationExpressionTransformationContext<ExpressionNode> context) {
|
||||
|
||||
ExpressionNode currentNode = context.getCurrentNode();
|
||||
|
||||
if (!currentNode.hasChildren()) {
|
||||
return null;
|
||||
}
|
||||
|
||||
// just take the first item
|
||||
return transform(currentNode.getChild(0), currentNode, null, context);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.NodeConversion#supports(org.springframework.data.mongodb.core.spel.ExpressionNode)
|
||||
*/
|
||||
@Override
|
||||
protected boolean supports(ExpressionNode node) {
|
||||
return node.isOfType(InlineList.class);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* A {@link ExpressionNodeConversion} that converts property or field reference expressions.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
private static class PropertyOrFieldReferenceNodeConversion extends ExpressionNodeConversion<ExpressionNode> {
|
||||
|
||||
public PropertyOrFieldReferenceNodeConversion(AggregationExpressionTransformer transformer) {
|
||||
super(transformer);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.NodeConversion#convert(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionTransformationContext)
|
||||
*/
|
||||
@Override
|
||||
protected Object convert(AggregationExpressionTransformationContext<ExpressionNode> context) {
|
||||
|
||||
String fieldReference = context.getFieldReference().toString();
|
||||
return context.addToPreviousOrReturn(fieldReference);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.NodeConversion#supports(org.springframework.data.mongodb.core.spel.ExpressionNode)
|
||||
*/
|
||||
@Override
|
||||
protected boolean supports(ExpressionNode node) {
|
||||
return node.isOfType(PropertyOrFieldReference.class);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* A {@link ExpressionNodeConversion} that converts literal expressions.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
private static class LiteralNodeConversion extends ExpressionNodeConversion<LiteralNode> {
|
||||
|
||||
public LiteralNodeConversion(AggregationExpressionTransformer transformer) {
|
||||
super(transformer);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#convertSpelNodeToMongoObjectExpression(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionConversionContext)
|
||||
*/
|
||||
@Override
|
||||
@SuppressWarnings("unchecked")
|
||||
protected Object convert(AggregationExpressionTransformationContext<LiteralNode> context) {
|
||||
|
||||
LiteralNode node = context.getCurrentNode();
|
||||
Object value = node.getValue();
|
||||
|
||||
if (context.hasPreviousOperation()) {
|
||||
|
||||
if (node.isUnaryMinus(context.getParentNode())) {
|
||||
// unary minus operator
|
||||
return NumberUtils.convertNumberToTargetClass(((Number) value).doubleValue() * -1,
|
||||
(Class<Number>) value.getClass()); // retain type, e.g. int to -int
|
||||
}
|
||||
|
||||
return context.addToPreviousOperation(value);
|
||||
}
|
||||
|
||||
return value;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#supports(org.springframework.expression.spel.SpelNode)
|
||||
*/
|
||||
@Override
|
||||
protected boolean supports(ExpressionNode node) {
|
||||
return node.isLiteral();
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* A {@link ExpressionNodeConversion} that converts method reference expressions.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
private static class MethodReferenceNodeConversion extends ExpressionNodeConversion<MethodReferenceNode> {
|
||||
|
||||
public MethodReferenceNodeConversion(AggregationExpressionTransformer transformer) {
|
||||
super(transformer);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#convertSpelNodeToMongoObjectExpression(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionConversionContext)
|
||||
*/
|
||||
@Override
|
||||
protected Object convert(AggregationExpressionTransformationContext<MethodReferenceNode> context) {
|
||||
|
||||
MethodReferenceNode node = context.getCurrentNode();
|
||||
List<Object> args = new ArrayList<Object>();
|
||||
|
||||
for (ExpressionNode childNode : node) {
|
||||
args.add(transform(childNode, context));
|
||||
}
|
||||
|
||||
return context.addToPreviousOrReturn(new BasicDBObject(node.getMethodName(), dbList(args.toArray())));
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* A {@link ExpressionNodeConversion} that converts method compound expressions.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
private static class CompoundExpressionNodeConversion extends ExpressionNodeConversion<ExpressionNode> {
|
||||
|
||||
public CompoundExpressionNodeConversion(AggregationExpressionTransformer transformer) {
|
||||
super(transformer);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#convertSpelNodeToMongoObjectExpression(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionConversionContext)
|
||||
*/
|
||||
@Override
|
||||
protected Object convert(AggregationExpressionTransformationContext<ExpressionNode> context) {
|
||||
|
||||
ExpressionNode currentNode = context.getCurrentNode();
|
||||
|
||||
if (currentNode.hasfirstChildNotOfType(Indexer.class)) {
|
||||
// we have a property path expression like: foo.bar -> render as reference
|
||||
return context.getFieldReference().toString();
|
||||
}
|
||||
|
||||
return context.addToPreviousOrReturn(currentNode.getValue());
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.NodeConversion#supports(org.springframework.data.mongodb.core.spel.ExpressionNode)
|
||||
*/
|
||||
@Override
|
||||
protected boolean supports(ExpressionNode node) {
|
||||
return node.isOfType(CompoundExpression.class);
|
||||
}
|
||||
}
|
||||
}
|
||||
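A hedged, test-style sketch of what the transformer above produces. It assumes same-package access (the class is package-private), that `Aggregation.DEFAULT_CONTEXT` denotes the context that performs no field mapping, and a hypothetical `netPrice` document field.

```java
package org.springframework.data.mongodb.core.aggregation;

import com.mongodb.util.JSON;

// Same-package sketch, exercising the transformer the way its unit tests do.
class SpelExpressionTransformerSketch {

	public static void main(String[] args) {

		SpelExpressionTransformer transformer = new SpelExpressionTransformer();

		// "netPrice" is a hypothetical field; [0] refers to the first bound parameter (1.19).
		Object expression = transformer.transform("netPrice * [0]", Aggregation.DEFAULT_CONTEXT, 1.19);

		// Expected to render roughly as: { "$multiply" : [ "$netPrice" , 1.19 ] }
		System.out.println(JSON.serialize(expression));
	}
}
```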
@@ -1,101 +0,0 @@
/*
 * Copyright 2013 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core.aggregation;

import static org.springframework.data.mongodb.core.aggregation.Fields.*;

import org.springframework.data.mapping.PropertyPath;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.PersistentPropertyPath;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.util.Assert;

import com.mongodb.DBObject;

/**
 * {@link AggregationOperationContext} aware of a particular type and a {@link MappingContext} to potentially translate
 * property references into document field names.
 *
 * @author Oliver Gierke
 * @since 1.3
 */
public class TypeBasedAggregationOperationContext implements AggregationOperationContext {

	private final Class<?> type;
	private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
	private final QueryMapper mapper;

	/**
	 * Creates a new {@link TypeBasedAggregationOperationContext} for the given type, {@link MappingContext} and
	 * {@link QueryMapper}.
	 *
	 * @param type must not be {@literal null}.
	 * @param mappingContext must not be {@literal null}.
	 * @param mapper must not be {@literal null}.
	 */
	public TypeBasedAggregationOperationContext(Class<?> type,
			MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext, QueryMapper mapper) {

		Assert.notNull(type, "Type must not be null!");
		Assert.notNull(mappingContext, "MappingContext must not be null!");
		Assert.notNull(mapper, "QueryMapper must not be null!");

		this.type = type;
		this.mappingContext = mappingContext;
		this.mapper = mapper;
	}

	/*
	 * (non-Javadoc)
	 * @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getMappedObject(com.mongodb.DBObject)
	 */
	@Override
	public DBObject getMappedObject(DBObject dbObject) {
		return mapper.getMappedObject(dbObject, mappingContext.getPersistentEntity(type));
	}

	/*
	 * (non-Javadoc)
	 * @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.Field)
	 */
	@Override
	public FieldReference getReference(Field field) {

		PropertyPath.from(field.getTarget(), type);
		return getReferenceFor(field);
	}

	/*
	 * (non-Javadoc)
	 * @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(java.lang.String)
	 */
	@Override
	public FieldReference getReference(String name) {

		PersistentPropertyPath<MongoPersistentProperty> propertyPath = mappingContext.getPersistentPropertyPath(name, type);

		return getReferenceFor(field(propertyPath.getLeafProperty().getName(),
				propertyPath.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE)));
	}

	private FieldReference getReferenceFor(Field field) {
		return new FieldReference(new ExposedField(field, true));
	}
}
@@ -1,51 +0,0 @@
/*
 * Copyright 2013 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core.aggregation;

import org.springframework.util.Assert;

/**
 * A {@code TypedAggregation} is a special {@link Aggregation} that holds information about the input aggregation type.
 *
 * @author Thomas Darimont
 * @author Oliver Gierke
 */
public class TypedAggregation<I> extends Aggregation {

	private final Class<I> inputType;

	/**
	 * Creates a new {@link TypedAggregation} from the given {@link AggregationOperation}s.
	 *
	 * @param inputType must not be {@literal null}.
	 * @param operations must not be {@literal null} or empty.
	 */
	public TypedAggregation(Class<I> inputType, AggregationOperation... operations) {

		super(operations);

		Assert.notNull(inputType, "Input type must not be null!");
		this.inputType = inputType;
	}

	/**
	 * Returns the input type for the {@link Aggregation}.
	 *
	 * @return the inputType will never be {@literal null}.
	 */
	public Class<I> getInputType() {
		return inputType;
	}
}
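A brief sketch of how a `TypedAggregation` is usually obtained through `Aggregation.newAggregation(Class, …)` and executed; the `Product` type and its properties are hypothetical.

```java
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;

import com.mongodb.DBObject;

public class TypedAggregationUsage {

	public AggregationResults<DBObject> averagePricePerCategory(MongoTemplate template) {

		// newAggregation(Class, ...) yields a TypedAggregation carrying the input type, so
		// property names can be mapped onto the Product collection's field names.
		TypedAggregation<Product> aggregation = newAggregation(Product.class, //
				group("category").avg("price").as("averagePrice"), //
				sort(Sort.Direction.ASC, "averagePrice"));

		return template.aggregate(aggregation, DBObject.class);
	}

	// Hypothetical domain type with category and price properties.
	static class Product {
		String category;
		double price;
	}
}
```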
@@ -1,55 +0,0 @@
/*
 * Copyright 2013 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core.aggregation;

import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.util.Assert;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

/**
 * Encapsulates the aggregation framework {@code $unwind}-operation.
 *
 * @see http://docs.mongodb.org/manual/reference/aggregation/unwind/#pipe._S_unwind
 * @author Thomas Darimont
 * @author Oliver Gierke
 * @since 1.3
 */
public class UnwindOperation implements AggregationOperation {

	private final ExposedField field;

	/**
	 * Creates a new {@link UnwindOperation} for the given {@link Field}.
	 *
	 * @param field must not be {@literal null}.
	 */
	public UnwindOperation(Field field) {

		Assert.notNull(field);
		this.field = new ExposedField(field, true);
	}

	/*
	 * (non-Javadoc)
	 * @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
	 */
	@Override
	public DBObject toDBObject(AggregationOperationContext context) {
		return new BasicDBObject("$unwind", context.getReference(field).toString());
	}
}
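A minimal sketch of the `$unwind` stage in a pipeline, assuming a hypothetical `tags` array field and an `article` collection.

```java
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;

import com.mongodb.DBObject;

public class UnwindUsage {

	// Unwinds the hypothetical "tags" array and counts documents per tag.
	public AggregationResults<DBObject> tagCounts(MongoTemplate template) {

		Aggregation aggregation = newAggregation( //
				unwind("tags"), //
				group("tags").count().as("n"));

		// "article" is a hypothetical collection name.
		return template.aggregate(aggregation, "article", DBObject.class);
	}
}
```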
@@ -1,5 +0,0 @@
/**
 * Support for the MongoDB aggregation framework.
 * @since 1.3
 */
package org.springframework.data.mongodb.core.aggregation;
@@ -19,7 +19,6 @@ import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Locale;
import java.util.Map;
@@ -40,7 +39,6 @@ import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigDecimalToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.DBObjectToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigDecimalConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigIntegerConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToURLConverter;
@@ -87,13 +85,12 @@ public class CustomConversions {

		Assert.notNull(converters);

		this.readingPairs = new LinkedHashSet<ConvertiblePair>();
		this.writingPairs = new LinkedHashSet<ConvertiblePair>();
		this.readingPairs = new HashSet<ConvertiblePair>();
		this.writingPairs = new HashSet<ConvertiblePair>();
		this.customSimpleTypes = new HashSet<Class<?>>();
		this.cache = new HashMap<Class<?>, HashMap<Class<?>, CacheValue>>();

		this.converters = new ArrayList<Object>();
		this.converters.addAll(converters);
		this.converters.add(CustomToStringConverter.INSTANCE);
		this.converters.add(BigDecimalToStringConverter.INSTANCE);
		this.converters.add(StringToBigDecimalConverter.INSTANCE);
@@ -101,8 +98,8 @@ public class CustomConversions {
		this.converters.add(StringToBigIntegerConverter.INSTANCE);
		this.converters.add(URLToStringConverter.INSTANCE);
		this.converters.add(StringToURLConverter.INSTANCE);
		this.converters.add(DBObjectToStringConverter.INSTANCE);
		this.converters.addAll(JodaTimeConverters.getConvertersToRegister());
		this.converters.addAll(converters);

		for (Object c : this.converters) {
			registerConversion(c);
@@ -240,7 +237,6 @@ public class CustomConversions {
	 * @return
	 */
	public Class<?> getCustomWriteTarget(Class<?> source, Class<?> expectedTargetType) {

		Assert.notNull(source);
		return getCustomTarget(source, expectedTargetType, writingPairs);
	}
@@ -1,123 +0,0 @@
/*
 * Copyright 2013 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core.convert;

import java.util.Arrays;
import java.util.Iterator;
import java.util.Map;

import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.util.Assert;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

/**
 * Wrapper value object for a {@link BasicDBObject} to be able to access raw values by {@link MongoPersistentProperty}
 * references. The accessors will transparently resolve nested document values that a {@link MongoPersistentProperty}
 * might refer to through a path expression in field names.
 *
 * @author Oliver Gierke
 */
class DBObjectAccessor {

	private final DBObject dbObject;

	/**
	 * Creates a new {@link DBObjectAccessor} for the given {@link DBObject}.
	 *
	 * @param dbObject must be a {@link BasicDBObject} effectively, must not be {@literal null}.
	 */
	public DBObjectAccessor(DBObject dbObject) {

		Assert.notNull(dbObject, "DBObject must not be null!");
		Assert.isInstanceOf(BasicDBObject.class, dbObject, "Given DBObject must be a BasicDBObject!");

		this.dbObject = dbObject;
	}

	/**
	 * Puts the given value into the backing {@link DBObject} based on the coordinates defined through the given
	 * {@link MongoPersistentProperty}. By default this will be the plain field name. But field names might also consist
	 * of path traversals so we might need to create intermediate {@link BasicDBObject}s.
	 *
	 * @param prop must not be {@literal null}.
	 * @param value
	 */
	public void put(MongoPersistentProperty prop, Object value) {

		Assert.notNull(prop, "MongoPersistentProperty must not be null!");
		String fieldName = prop.getFieldName();

		Iterator<String> parts = Arrays.asList(fieldName.split("\\.")).iterator();
		DBObject dbObject = this.dbObject;

		while (parts.hasNext()) {

			String part = parts.next();

			if (parts.hasNext()) {
				BasicDBObject nestedDbObject = new BasicDBObject();
				dbObject.put(part, nestedDbObject);
				dbObject = nestedDbObject;
			} else {
				dbObject.put(part, value);
			}
		}
	}

	/**
	 * Returns the value the given {@link MongoPersistentProperty} refers to. By default this will be a direct field but
	 * the method will also transparently resolve nested values the {@link MongoPersistentProperty} might refer to through
	 * a path expression in the field name metadata.
	 *
	 * @param property must not be {@literal null}.
	 * @return
	 */
	@SuppressWarnings("unchecked")
	public Object get(MongoPersistentProperty property) {

		String fieldName = property.getFieldName();
		Iterator<String> parts = Arrays.asList(fieldName.split("\\.")).iterator();
		Map<Object, Object> source = this.dbObject.toMap();
		Object result = null;

		while (source != null && parts.hasNext()) {

			result = source.get(parts.next());

			if (parts.hasNext()) {
				source = getAsMap(result);
			}
		}

		return result;
	}

	@SuppressWarnings("unchecked")
	private Map<Object, Object> getAsMap(Object source) {

		if (source instanceof BasicDBObject) {
			return ((DBObject) source).toMap();
		}

		if (source instanceof Map) {
			return (Map<Object, Object>) source;
		}

		return null;
	}
}
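A same-package sketch of the accessor above (the class is package-private); it assumes a `MongoPersistentProperty` whose mapped field name is the dotted path `address.city`.

```java
package org.springframework.data.mongodb.core.convert;

import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

// The property passed in is assumed to be mapped to the field name "address.city".
class DBObjectAccessorSketch {

	static DBObject writeAndReadCity(MongoPersistentProperty cityProperty) {

		DBObject sink = new BasicDBObject();
		DBObjectAccessor accessor = new DBObjectAccessor(sink);

		// Creates the intermediate { "address" : { ... } } document transparently.
		accessor.put(cityProperty, "Dresden");

		// Reads the value back by traversing the same path.
		Object city = accessor.get(cityProperty);
		assert "Dresden".equals(city);

		return sink; // { "address" : { "city" : "Dresden" } }
	}
}
```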
@@ -1,48 +0,0 @@
/*
 * Copyright 2013 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core.convert;

import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;

import com.mongodb.DBRef;

/**
 * Used to resolve associations annotated with {@link org.springframework.data.mongodb.core.mapping.DBRef}.
 *
 * @author Thomas Darimont
 */
public interface DbRefResolver {

	/**
	 * @param property will never be {@literal null}.
	 * @param callback will never be {@literal null}.
	 * @return
	 */
	Object resolveDbRef(MongoPersistentProperty property, DbRefResolverCallback callback);

	/**
	 * Creates a {@link DBRef} instance for the given {@link org.springframework.data.mongodb.core.mapping.DBRef}
	 * annotation, {@link MongoPersistentEntity} and id.
	 *
	 * @param annotation will never be {@literal null}.
	 * @param entity will never be {@literal null}.
	 * @param id will never be {@literal null}.
	 * @return
	 */
	DBRef createDbRef(org.springframework.data.mongodb.core.mapping.DBRef annotation, MongoPersistentEntity<?> entity,
			Object id);
}
@@ -1,35 +0,0 @@
/*
 * Copyright 2013 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core.convert;

import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;

/**
 * Callback interface to be used in conjunction with {@link DbRefResolver}.
 *
 * @author Thomas Darimont
 * @author Oliver Gierke
 */
public interface DbRefResolverCallback {

	/**
	 * Resolve the final object for the given {@link MongoPersistentProperty}.
	 *
	 * @param property will never be {@literal null}.
	 * @return
	 */
	Object resolve(MongoPersistentProperty property);
}
@@ -1,282 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.convert;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.io.ObjectInputStream;
|
||||
import java.io.ObjectOutputStream;
|
||||
import java.io.Serializable;
|
||||
import java.lang.reflect.Method;
|
||||
|
||||
import org.aopalliance.intercept.MethodInterceptor;
|
||||
import org.aopalliance.intercept.MethodInvocation;
|
||||
import org.objenesis.Objenesis;
|
||||
import org.objenesis.ObjenesisStd;
|
||||
import org.springframework.aop.framework.ProxyFactory;
|
||||
import org.springframework.cglib.proxy.Callback;
|
||||
import org.springframework.cglib.proxy.Enhancer;
|
||||
import org.springframework.cglib.proxy.Factory;
|
||||
import org.springframework.cglib.proxy.MethodProxy;
|
||||
import org.springframework.core.SpringVersion;
|
||||
import org.springframework.dao.DataAccessException;
|
||||
import org.springframework.dao.support.PersistenceExceptionTranslator;
|
||||
import org.springframework.data.mongodb.LazyLoadingException;
|
||||
import org.springframework.data.mongodb.MongoDbFactory;
|
||||
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
|
||||
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
|
||||
import org.springframework.util.Assert;
|
||||
import org.springframework.util.ClassUtils;
|
||||
import org.springframework.util.ReflectionUtils;
|
||||
import org.springframework.util.StringUtils;
|
||||
|
||||
import com.mongodb.DB;
|
||||
import com.mongodb.DBRef;
|
||||
|
||||
/**
|
||||
* A {@link DbRefResolver} that resolves {@link org.springframework.data.mongodb.core.mapping.DBRef}s by delegating to a
|
||||
 * {@link DbRefResolverCallback} that is able to generate lazy loading proxies.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
public class DefaultDbRefResolver implements DbRefResolver {
|
||||
|
||||
private static final boolean IS_SPRING_4_OR_BETTER = SpringVersion.getVersion().startsWith("4");
|
||||
private static final boolean OBJENESIS_PRESENT = ClassUtils.isPresent("org.objenesis.Objenesis", null);
|
||||
|
||||
private final MongoDbFactory mongoDbFactory;
|
||||
private final PersistenceExceptionTranslator exceptionTranslator;
|
||||
|
||||
/**
|
||||
* Creates a new {@link DefaultDbRefResolver} with the given {@link MongoDbFactory}.
|
||||
*
|
||||
* @param mongoDbFactory must not be {@literal null}.
|
||||
*/
|
||||
public DefaultDbRefResolver(MongoDbFactory mongoDbFactory) {
|
||||
|
||||
Assert.notNull(mongoDbFactory, "MongoDbFactory translator must not be null!");
|
||||
|
||||
this.mongoDbFactory = mongoDbFactory;
|
||||
this.exceptionTranslator = mongoDbFactory.getExceptionTranslator();
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.convert.DbRefResolver#resolveDbRef(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty, org.springframework.data.mongodb.core.convert.DbRefResolverCallback)
|
||||
*/
|
||||
@Override
|
||||
public Object resolveDbRef(MongoPersistentProperty property, DbRefResolverCallback callback) {
|
||||
|
||||
Assert.notNull(property, "Property must not be null!");
|
||||
Assert.notNull(callback, "Callback must not be null!");
|
||||
|
||||
if (isLazyDbRef(property)) {
|
||||
return createLazyLoadingProxy(property, callback);
|
||||
}
|
||||
|
||||
return callback.resolve(property);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.convert.DbRefResolver#created(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty, org.springframework.data.mongodb.core.mapping.MongoPersistentEntity, java.lang.Object)
|
||||
*/
|
||||
@Override
|
||||
public DBRef createDbRef(org.springframework.data.mongodb.core.mapping.DBRef annotation,
|
||||
MongoPersistentEntity<?> entity, Object id) {
|
||||
|
||||
DB db = mongoDbFactory.getDb();
|
||||
db = annotation != null && StringUtils.hasText(annotation.db()) ? mongoDbFactory.getDb(annotation.db()) : db;
|
||||
|
||||
return new DBRef(db, entity.getCollection(), id);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a proxy for the given {@link MongoPersistentProperty} using the given {@link DbRefResolverCallback} to
|
||||
* eventually resolve the value of the property.
|
||||
*
|
||||
* @param property must not be {@literal null}.
|
||||
* @param callback must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
private Object createLazyLoadingProxy(MongoPersistentProperty property, DbRefResolverCallback callback) {
|
||||
|
||||
ProxyFactory proxyFactory = new ProxyFactory();
|
||||
Class<?> propertyType = property.getType();
|
||||
|
||||
for (Class<?> type : propertyType.getInterfaces()) {
|
||||
proxyFactory.addInterface(type);
|
||||
}
|
||||
|
||||
LazyLoadingInterceptor interceptor = new LazyLoadingInterceptor(property, exceptionTranslator, callback);
|
||||
|
||||
if (propertyType.isInterface()) {
|
||||
proxyFactory.addInterface(propertyType);
|
||||
proxyFactory.addAdvice(interceptor);
|
||||
return proxyFactory.getProxy();
|
||||
}
|
||||
|
||||
proxyFactory.setProxyTargetClass(true);
|
||||
proxyFactory.setTargetClass(propertyType);
|
||||
|
||||
if (IS_SPRING_4_OR_BETTER || !OBJENESIS_PRESENT) {
|
||||
proxyFactory.addAdvice(interceptor);
|
||||
return proxyFactory.getProxy();
|
||||
}
|
||||
|
||||
return ObjenesisProxyEnhancer.enhanceAndGet(proxyFactory, propertyType, interceptor);
|
||||
}
|
||||
|
||||
/**
|
||||
* @param property
|
||||
* @return
|
||||
*/
|
||||
private boolean isLazyDbRef(MongoPersistentProperty property) {
|
||||
return property.getDBRef() != null && property.getDBRef().lazy();
|
||||
}
|
||||
|
||||
/**
|
||||
* A {@link MethodInterceptor} that is used within a lazy loading proxy. The property resolving is delegated to a
|
||||
* {@link DbRefResolverCallback}. The resolving process is triggered by a method invocation on the proxy and is
|
||||
* guaranteed to be performed only once.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
static class LazyLoadingInterceptor implements MethodInterceptor, org.springframework.cglib.proxy.MethodInterceptor,
|
||||
Serializable {
|
||||
|
||||
private final DbRefResolverCallback callback;
|
||||
private final MongoPersistentProperty property;
|
||||
private final PersistenceExceptionTranslator exceptionTranslator;
|
||||
|
||||
private volatile boolean resolved;
|
||||
private Object result;
|
||||
|
||||
/**
|
||||
* Creates a new {@link LazyLoadingInterceptor} for the given {@link MongoPersistentProperty},
|
||||
* {@link PersistenceExceptionTranslator} and {@link DbRefResolverCallback}.
|
||||
*
|
||||
* @param property must not be {@literal null}.
|
||||
* @param callback must not be {@literal null}.
|
||||
*/
|
||||
public LazyLoadingInterceptor(MongoPersistentProperty property, PersistenceExceptionTranslator exceptionTranslator,
|
||||
DbRefResolverCallback callback) {
|
||||
|
||||
Assert.notNull(property, "Property must not be null!");
|
||||
Assert.notNull(exceptionTranslator, "Exception translator must not be null!");
|
||||
Assert.notNull(callback, "Callback must not be null!");
|
||||
|
||||
this.callback = callback;
|
||||
this.exceptionTranslator = exceptionTranslator;
|
||||
this.property = property;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.aopalliance.intercept.MethodInterceptor#invoke(org.aopalliance.intercept.MethodInvocation)
|
||||
*/
|
||||
@Override
|
||||
public Object invoke(MethodInvocation invocation) throws Throwable {
|
||||
return intercept(invocation.getThis(), invocation.getMethod(), invocation.getArguments(), null);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.cglib.proxy.MethodInterceptor#intercept(java.lang.Object, java.lang.reflect.Method, java.lang.Object[], org.springframework.cglib.proxy.MethodProxy)
|
||||
*/
|
||||
@Override
|
||||
public Object intercept(Object obj, Method method, Object[] args, MethodProxy proxy) throws Throwable {
|
||||
return ReflectionUtils.isObjectMethod(method) ? method.invoke(obj, args) : method.invoke(ensureResolved(), args);
|
||||
}
|
||||
|
||||
private Object ensureResolved() {
|
||||
|
||||
if (!resolved) {
|
||||
this.result = resolve();
|
||||
this.resolved = true;
|
||||
}
|
||||
|
||||
return this.result;
|
||||
}
|
||||
|
||||
private void writeObject(ObjectOutputStream out) throws IOException {
|
||||
|
||||
ensureResolved();
|
||||
out.writeObject(this.result);
|
||||
}
|
||||
|
||||
private void readObject(ObjectInputStream in) throws IOException {
|
||||
|
||||
try {
|
||||
				this.resolved = true; // Object is guaranteed to be resolved after serialization
|
||||
this.result = in.readObject();
|
||||
} catch (ClassNotFoundException e) {
|
||||
throw new LazyLoadingException("Could not deserialize result", e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* @return
|
||||
*/
|
||||
private synchronized Object resolve() {
|
||||
|
||||
if (!resolved) {
|
||||
|
||||
try {
|
||||
|
||||
return callback.resolve(property);
|
||||
|
||||
} catch (RuntimeException ex) {
|
||||
|
||||
DataAccessException translatedException = this.exceptionTranslator.translateExceptionIfPossible(ex);
|
||||
throw new LazyLoadingException("Unable to lazily resolve DBRef!", translatedException);
|
||||
}
|
||||
}
|
||||
|
||||
return result;
|
||||
}
|
||||
|
||||
public boolean isResolved() {
|
||||
return resolved;
|
||||
}
|
||||
|
||||
public Object getResult() {
|
||||
return result;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
	 * Static class to accommodate the optional dependency on Objenesis.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
private static class ObjenesisProxyEnhancer {
|
||||
|
||||
private static final Objenesis OBJENESIS = new ObjenesisStd(true);
|
||||
|
||||
public static Object enhanceAndGet(ProxyFactory proxyFactory, Class<?> type,
|
||||
org.springframework.cglib.proxy.MethodInterceptor interceptor) {
|
||||
|
||||
Enhancer enhancer = new Enhancer();
|
||||
enhancer.setSuperclass(type);
|
||||
enhancer.setCallbackType(org.springframework.cglib.proxy.MethodInterceptor.class);
|
||||
|
||||
Factory factory = (Factory) OBJENESIS.newInstance(enhancer.createClass());
|
||||
factory.setCallbacks(new Callback[] { interceptor });
|
||||
return factory;
|
||||
}
|
||||
}
|
||||
}
|
||||
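For context, a hypothetical domain model that would exercise the lazy-loading path above via the `lazy` attribute of `@DBRef`; `Book` and `Publisher` are placeholder types.

```java
import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;

// The referenced publishers are only fetched when the proxied collection is first
// accessed, via the LazyLoadingInterceptor shown above.
@Document
public class Book {

	@Id String id;
	String title;

	@DBRef(lazy = true)
	List<Publisher> publishers;
}

@Document
class Publisher {

	@Id String id;
	String name;
}
```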
@@ -1,5 +1,5 @@
/*
 * Copyright 2011-2013 the original author or authors.
 * Copyright 2011 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -18,19 +18,16 @@ package org.springframework.data.mongodb.core.convert;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.Set;

import org.springframework.data.convert.DefaultTypeMapper;
import org.springframework.data.convert.SimpleTypeInformationMapper;
import org.springframework.data.convert.DefaultTypeMapper;
import org.springframework.data.convert.TypeAliasAccessor;
import org.springframework.data.convert.TypeInformationMapper;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;

import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

/**
@@ -40,43 +37,33 @@ import com.mongodb.DBObject;
 * respectively.
 *
 * @author Oliver Gierke
 * @author Thomas Darimont
 */
public class DefaultMongoTypeMapper extends DefaultTypeMapper<DBObject> implements MongoTypeMapper {

	public static final String DEFAULT_TYPE_KEY = "_class";
	@SuppressWarnings("rawtypes")//
	@SuppressWarnings("rawtypes")
	private static final TypeInformation<List> LIST_TYPE_INFO = ClassTypeInformation.from(List.class);
	@SuppressWarnings("rawtypes")//
	@SuppressWarnings("rawtypes")
	private static final TypeInformation<Map> MAP_TYPE_INFO = ClassTypeInformation.from(Map.class);

	private final TypeAliasAccessor<DBObject> accessor;
	private final String typeKey;
	private String typeKey = DEFAULT_TYPE_KEY;

	public DefaultMongoTypeMapper() {
		this(DEFAULT_TYPE_KEY);
		this(DEFAULT_TYPE_KEY, Arrays.asList(SimpleTypeInformationMapper.INSTANCE));
	}

	public DefaultMongoTypeMapper(String typeKey) {
		this(typeKey, Arrays.asList(SimpleTypeInformationMapper.INSTANCE));
		super(new DBObjectTypeAliasAccessor(typeKey));
		this.typeKey = typeKey;
	}

	public DefaultMongoTypeMapper(String typeKey, MappingContext<? extends PersistentEntity<?, ?>, ?> mappingContext) {
		this(typeKey, new DBObjectTypeAliasAccessor(typeKey), mappingContext, Arrays
				.asList(SimpleTypeInformationMapper.INSTANCE));
		super(new DBObjectTypeAliasAccessor(typeKey), mappingContext, Arrays.asList(SimpleTypeInformationMapper.INSTANCE));
		this.typeKey = typeKey;
	}

	public DefaultMongoTypeMapper(String typeKey, List<? extends TypeInformationMapper> mappers) {
		this(typeKey, new DBObjectTypeAliasAccessor(typeKey), null, mappers);
	}

	private DefaultMongoTypeMapper(String typeKey, TypeAliasAccessor<DBObject> accessor,
			MappingContext<? extends PersistentEntity<?, ?>, ?> mappingContext, List<? extends TypeInformationMapper> mappers) {

		super(accessor, mappingContext, mappers);

		super(new DBObjectTypeAliasAccessor(typeKey), mappers);
		this.typeKey = typeKey;
		this.accessor = accessor;
	}

	/*
@@ -87,31 +74,6 @@ public class DefaultMongoTypeMapper extends DefaultTypeMapper<DBObject> implemen
		return typeKey == null ? false : typeKey.equals(key);
	}

	/*
	 * (non-Javadoc)
	 * @see org.springframework.data.mongodb.core.convert.MongoTypeMapper#writeTypeRestrictions(java.util.Set)
	 */
	@Override
	public void writeTypeRestrictions(DBObject result, Set<Class<?>> restrictedTypes) {

		if (restrictedTypes == null || restrictedTypes.isEmpty()) {
			return;
		}

		BasicDBList restrictedMappedTypes = new BasicDBList();

		for (Class<?> restrictedType : restrictedTypes) {

			Object typeAlias = getAliasFor(ClassTypeInformation.from(restrictedType));

			if (typeAlias != null) {
				restrictedMappedTypes.add(typeAlias);
			}
		}

		accessor.writeTypeTo(result, new BasicDBObject("$in", restrictedMappedTypes));
	}

	/* (non-Javadoc)
	 * @see org.springframework.data.convert.DefaultTypeMapper#getFallbackTypeFor(java.lang.Object)
	 */
@@ -121,7 +83,6 @@ public class DefaultMongoTypeMapper extends DefaultTypeMapper<DBObject> implemen
	}

	/**
	 * {@link TypeAliasAccessor} to store aliases in a {@link DBObject}.
	 *
	 * @author Oliver Gierke
	 */
@@ -57,9 +57,11 @@ import org.springframework.data.util.TypeInformation;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
import org.springframework.util.StringUtils;

import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBObject;
import com.mongodb.DBRef;

@@ -69,8 +71,6 @@ import com.mongodb.DBRef;
*
* @author Oliver Gierke
* @author Jon Brisbin
* @author Patrik Wasik
* @author Thomas Darimont
*/
public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware {

@@ -78,8 +78,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App

protected final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
protected final SpelExpressionParser spelExpressionParser = new SpelExpressionParser();
protected final MongoDbFactory mongoDbFactory;
protected final QueryMapper idMapper;
protected final DbRefResolver dbRefResolver;
protected ApplicationContext applicationContext;
protected boolean useFieldAccessOnly = true;
protected MongoTypeMapper typeMapper;
@@ -88,21 +88,21 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
private SpELContext spELContext;

/**
* Creates a new {@link MappingMongoConverter} given the new {@link DbRefResolver} and {@link MappingContext}.
* Creates a new {@link MappingMongoConverter} given the new {@link MongoDbFactory} and {@link MappingContext}.
*
* @param mongoDbFactory must not be {@literal null}.
* @param mappingContext must not be {@literal null}.
*/
@SuppressWarnings("deprecation")
public MappingMongoConverter(DbRefResolver dbRefResolver,
public MappingMongoConverter(MongoDbFactory mongoDbFactory,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {

super(ConversionServiceFactory.createDefaultConversionService());

Assert.notNull(dbRefResolver, "DbRefResolver must not be null!");
Assert.notNull(mappingContext, "MappingContext must not be null!");
Assert.notNull(mongoDbFactory);
Assert.notNull(mappingContext);

this.dbRefResolver = dbRefResolver;
this.mongoDbFactory = mongoDbFactory;
this.mappingContext = mappingContext;
this.typeMapper = new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, mappingContext);
this.idMapper = new QueryMapper(this);
@@ -110,19 +110,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
this.spELContext = new SpELContext(DBObjectPropertyAccessor.INSTANCE);
}

/**
* Creates a new {@link MappingMongoConverter} given the new {@link MongoDbFactory} and {@link MappingContext}.
*
* @deprecated use the constructor taking a {@link DbRefResolver} instead.
* @param mongoDbFactory must not be {@literal null}.
* @param mappingContext must not be {@literal null}.
*/
@Deprecated
public MappingMongoConverter(MongoDbFactory mongoDbFactory,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
this(new DefaultDbRefResolver(mongoDbFactory), mappingContext);
}

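The constructor hunk above replaces the `MongoDbFactory`-based constructor with a `DbRefResolver`-based one and keeps the old signature around as a deprecated delegate. A short sketch of the two construction paths, assuming `mongoDbFactory` and `mappingContext` are already configured elsewhere:

```java
// Preferred path after this change
DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory);
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mappingContext);

// Deprecated path, still supported: delegates to the DbRefResolver-based constructor
MappingMongoConverter legacy = new MappingMongoConverter(mongoDbFactory, mappingContext);
```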
/**
* Configures the {@link MongoTypeMapper} to be used to add type information to {@link DBObject}s created by the
* converter and how to lookup type information from {@link DBObject}s when reading them. Uses a
@@ -136,15 +123,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
mappingContext) : typeMapper;
}

/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.MongoConverter#getTypeMapper()
*/
@Override
public MongoTypeMapper getTypeMapper() {
return this.typeMapper;
}

/**
* Configure the characters dots potentially contained in a {@link Map} shall be replaced with. By default we don't do
* any translation but rather reject a {@link Map} with keys containing dots causing the conversion for the entire
@@ -245,7 +223,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
parent);
}

private <S extends Object> S read(final MongoPersistentEntity<S> entity, final DBObject dbo, final Object parent) {
private <S extends Object> S read(final MongoPersistentEntity<S> entity, final DBObject dbo, Object parent) {

final DefaultSpELExpressionEvaluator evaluator = new DefaultSpELExpressionEvaluator(dbo, spELContext);

@@ -272,18 +250,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
// Handle associations
entity.doWithAssociations(new AssociationHandler<MongoPersistentProperty>() {
public void doWithAssociation(Association<MongoPersistentProperty> association) {

MongoPersistentProperty inverseProp = association.getInverse();

Object obj = dbRefResolver.resolveDbRef(inverseProp, new DbRefResolverCallback() {

@Override
public Object resolve(MongoPersistentProperty property) {
return getValueInternal(property, dbo, evaluator, parent);
}
});
Object obj = getValueInternal(inverseProp, dbo, evaluator, result);

wrapper.setProperty(inverseProp, obj);

}
});

@@ -384,7 +355,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
try {
Object id = wrapper.getProperty(idProperty, Object.class, fieldAccessOnly);
dbo.put("_id", idMapper.convertId(id));
} catch (ConversionException ignored) {}
} catch (ConversionException ignored) {
}
}

// Write the properties
@@ -403,7 +375,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (!conversions.isSimpleType(propertyObj.getClass())) {
writePropertyInternal(propertyObj, dbo, prop);
} else {
writeSimpleInternal(propertyObj, dbo, prop);
writeSimpleInternal(propertyObj, dbo, prop.getFieldName());
}
}
}
@@ -428,27 +400,27 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return;
}

DBObjectAccessor accessor = new DBObjectAccessor(dbo);

String name = prop.getFieldName();
TypeInformation<?> valueType = ClassTypeInformation.from(obj.getClass());
TypeInformation<?> type = prop.getTypeInformation();

if (valueType.isCollectionLike()) {
DBObject collectionInternal = createCollection(asCollection(obj), prop);
accessor.put(prop, collectionInternal);
dbo.put(name, collectionInternal);
return;
}

if (valueType.isMap()) {
DBObject mapDbObj = createMap((Map<Object, Object>) obj, prop);
accessor.put(prop, mapDbObj);
BasicDBObject mapDbObj = new BasicDBObject();
writeMapInternal((Map<Object, Object>) obj, mapDbObj, type);
dbo.put(name, mapDbObj);
return;
}

if (prop.isDbReference()) {
DBRef dbRefObj = createDBRef(obj, prop.getDBRef());
if (null != dbRefObj) {
accessor.put(prop, dbRefObj);
dbo.put(name, dbRefObj);
return;
}
}
@@ -457,20 +429,18 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Class<?> basicTargetType = conversions.getCustomWriteTarget(obj.getClass(), null);

if (basicTargetType != null) {
accessor.put(prop, conversionService.convert(obj, basicTargetType));
dbo.put(name, conversionService.convert(obj, basicTargetType));
return;
}

Object existingValue = accessor.get(prop);
BasicDBObject propDbObj = existingValue instanceof BasicDBObject ? (BasicDBObject) existingValue
: new BasicDBObject();
BasicDBObject propDbObj = new BasicDBObject();
addCustomTypeKeyIfNecessary(type, obj, propDbObj);

MongoPersistentEntity<?> entity = isSubtype(prop.getType(), obj.getClass()) ? mappingContext
.getPersistentEntity(obj.getClass()) : mappingContext.getPersistentEntity(type);

writeInternal(obj, propDbObj, entity);
accessor.put(prop, propDbObj);
dbo.put(name, propDbObj);
}

private boolean isSubtype(Class<?> left, Class<?> right) {
@@ -522,42 +492,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return dbList;
}

/**
* Writes the given {@link Map} using the given {@link MongoPersistentProperty} information.
*
* @param map must not {@literal null}.
* @param property must not be {@literal null}.
* @return
*/
protected DBObject createMap(Map<Object, Object> map, MongoPersistentProperty property) {

Assert.notNull(map, "Given map must not be null!");
Assert.notNull(property, "PersistentProperty must not be null!");

if (!property.isDbReference()) {
return writeMapInternal(map, new BasicDBObject(), property.getTypeInformation());
}

BasicDBObject dbObject = new BasicDBObject();

for (Map.Entry<Object, Object> entry : map.entrySet()) {

Object key = entry.getKey();
Object value = entry.getValue();

if (conversions.isSimpleType(key.getClass())) {

String simpleKey = potentiallyEscapeMapKey(key.toString());
dbObject.put(simpleKey, value != null ? createDBRef(value, property.getDBRef()) : null);

} else {
throw new MappingException("Cannot use a complex object as a key value.");
}
}

return dbObject;
}

/**
* Populates the given {@link BasicDBList} with values from the given {@link Collection}.
*
@@ -688,11 +622,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
dbObject.put(key, getPotentiallyConvertedSimpleWrite(value));
}

private void writeSimpleInternal(Object value, DBObject dbObject, MongoPersistentProperty property) {
DBObjectAccessor accessor = new DBObjectAccessor(dbObject);
accessor.put(property, getPotentiallyConvertedSimpleWrite(value));
}

/**
* Checks whether we have a custom conversion registered for the given value into an arbitrary simple Mongo type.
* Returns the converted value if so. If not, we perform special enum handling or simply return the value as is.
@@ -768,7 +697,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
throw new MappingException("Cannot create a reference to an object with a NULL id.");
}

return dbRefResolver.createDbRef(dbref, targetEntity, idMapper.convertId(id));
DB db = mongoDbFactory.getDb();
db = dbref != null && StringUtils.hasText(dbref.db()) ? mongoDbFactory.getDb(dbref.db()) : db;

return new DBRef(db, targetEntity.getCollection(), idMapper.convertId(id));
}

protected Object getValueInternal(MongoPersistentProperty prop, DBObject dbo, SpELExpressionEvaluator eval,
@@ -808,7 +740,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Object dbObjItem = sourceValue.get(i);

if (dbObjItem instanceof DBRef) {
items.add(DBRef.class.equals(rawComponentType) ? dbObjItem : read(componentType, readRef((DBRef) dbObjItem),
items.add(DBRef.class.equals(rawComponentType) ? dbObjItem : read(componentType, ((DBRef) dbObjItem).fetch(),
parent));
} else if (dbObjItem instanceof DBObject) {
items.add(read(componentType, (DBObject) dbObjItem, parent));
@@ -856,7 +788,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (value instanceof DBObject) {
map.put(key, read(valueType, (DBObject) value, parent));
} else if (value instanceof DBRef) {
map.put(key, DBRef.class.equals(rawValueType) ? value : read(valueType, readRef((DBRef) value)));
map.put(key, DBRef.class.equals(rawValueType) ? value : read(valueType, ((DBRef) value).fetch()));
} else {
Class<?> valueClass = valueType == null ? null : valueType.getType();
map.put(key, getPotentiallyConvertedSimpleRead(value, valueClass));
@@ -988,7 +920,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App

private class MongoDbPropertyValueProvider implements PropertyValueProvider<MongoPersistentProperty> {

private final DBObjectAccessor source;
private final DBObject source;
private final SpELExpressionEvaluator evaluator;
private final Object parent;

@@ -1001,7 +933,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Assert.notNull(source);
Assert.notNull(evaluator);

this.source = new DBObjectAccessor(source);
this.source = source;
this.evaluator = evaluator;
this.parent = parent;
}
@@ -1013,7 +945,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
public <T> T getPropertyValue(MongoPersistentProperty property) {

String expression = property.getSpelExpression();
Object value = expression != null ? evaluator.evaluate(expression) : source.get(property);
Object value = expression != null ? evaluator.evaluate(expression) : source.get(property.getFieldName());

if (value == null) {
return null;
@@ -1066,7 +998,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (conversions.hasCustomReadTarget(value.getClass(), rawType)) {
return (T) conversionService.convert(value, rawType);
} else if (value instanceof DBRef) {
return (T) (rawType.equals(DBRef.class) ? value : read(type, readRef((DBRef) value), parent));
return (T) (rawType.equals(DBRef.class) ? value : read(type, ((DBRef) value).fetch(), parent));
} else if (value instanceof BasicDBList) {
return (T) readCollectionOrArray(type, (BasicDBList) value, parent);
} else if (value instanceof DBObject) {
@@ -1075,14 +1007,4 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return (T) getPotentiallyConvertedSimpleRead(value, rawType);
}
}

/**
* Performs the fetch operation for the given {@link DBRef}.
*
* @param ref
* @return
*/
DBObject readRef(DBRef ref) {
return ref.fetch();
}
}

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,7 +17,6 @@ package org.springframework.data.mongodb.core.convert;

import org.springframework.data.convert.EntityConverter;
import org.springframework.data.convert.EntityReader;
import org.springframework.data.convert.TypeMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;

@@ -27,17 +26,9 @@ import com.mongodb.DBObject;
* Central Mongo specific converter interface which combines {@link MongoWriter} and {@link MongoReader}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public interface MongoConverter extends
EntityConverter<MongoPersistentEntity<?>, MongoPersistentProperty, Object, DBObject>, MongoWriter<Object>,
EntityReader<Object, DBObject> {

/**
* Returns thw {@link TypeMapper} being used to write type information into {@link DBObject}s created with that
* converter.
*
* @return will never be {@literal null}.
*/
MongoTypeMapper getTypeMapper();
}

@@ -24,11 +24,8 @@ import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionFailedException;
import org.springframework.core.convert.TypeDescriptor;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.util.StringUtils;

import com.mongodb.DBObject;

/**
* Wrapper class to contain useful converters for the usage with Mongo.
*
@@ -150,15 +147,4 @@ abstract class MongoConverters {
}
}
}

@ReadingConverter
public static enum DBObjectToStringConverter implements Converter<DBObject, String> {

INSTANCE;

@Override
public String convert(DBObject source) {
return source == null ? null : source.toString();
}
}
}

@@ -15,8 +15,6 @@
*/
package org.springframework.data.mongodb.core.convert;

import java.util.Set;

import org.springframework.data.convert.TypeMapper;

import com.mongodb.DBObject;
@@ -34,14 +32,4 @@ public interface MongoTypeMapper extends TypeMapper<DBObject> {
* @return
*/
boolean isTypeKey(String key);

/**
* Writes type restrictions to the given {@link DBObject}. This usually results in an {@code $in}-clause to be
* generated that restricts the type-key (e.g. {@code _class}) to be in the set of type aliases for the given
* {@code restrictedTypes}.
*
* @param result must not be {@literal null}
* @param restrictedTypes must not be {@literal null}
*/
void writeTypeRestrictions(DBObject result, Set<Class<?>> restrictedTypes);
}

@@ -30,7 +30,6 @@ import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.PersistentPropertyPath;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.util.Assert;

import com.mongodb.BasicDBList;
@@ -48,6 +47,7 @@ import com.mongodb.DBRef;
public class QueryMapper {

private static final List<String> DEFAULT_ID_NAMES = Arrays.asList("id", "_id");
private static final String N_OR_PATTERN = "\\$.*or";

private final ConversionService conversionService;
private final MongoConverter converter;
@@ -75,10 +75,9 @@ public class QueryMapper {
* @param entity can be {@literal null}.
* @return
*/
@SuppressWarnings("deprecation")
public DBObject getMappedObject(DBObject query, MongoPersistentEntity<?> entity) {

if (isNestedKeyword(query)) {
if (Keyword.isKeyword(query)) {
return getMappedKeyword(new Keyword(query), entity);
}

@@ -86,17 +85,7 @@ public class QueryMapper {

for (String key : query.keySet()) {

// TODO: remove one once QueryMapper can work with Query instances directly
if (Query.isRestrictedTypeKey(key)) {

@SuppressWarnings("unchecked")
Set<Class<?>> restrictedTypes = (Set<Class<?>>) query.get(key);
this.converter.getTypeMapper().writeTypeRestrictions(result, restrictedTypes);

continue;
}

if (isKeyword(key)) {
if (Keyword.isKeyword(key)) {
result.putAll(getMappedKeyword(new Keyword(query, key), entity));
continue;
}
@@ -106,11 +95,11 @@ public class QueryMapper {
Object rawValue = query.get(key);
String newKey = field.getMappedKey();

if (isNestedKeyword(rawValue) && !field.isIdField()) {
if (Keyword.isKeyword(rawValue) && !field.isIdField()) {
Keyword keyword = new Keyword((DBObject) rawValue);
result.put(newKey, getMappedKeyword(field, keyword));
} else {
result.put(newKey, getMappedValue(field, rawValue));
result.put(newKey, getMappedValue(field, query.get(key)));
}
}

@@ -120,16 +109,16 @@ public class QueryMapper {
/**
* Returns the given {@link DBObject} representing a keyword by mapping the keyword's value.
*
* @param keyword the {@link DBObject} representing a keyword (e.g. {@code $ne : … } )
* @param query the {@link DBObject} representing a keyword (e.g. {@code $ne : … } )
* @param entity
* @return
*/
private DBObject getMappedKeyword(Keyword keyword, MongoPersistentEntity<?> entity) {
private DBObject getMappedKeyword(Keyword query, MongoPersistentEntity<?> entity) {

// $or/$nor
if (keyword.isOrOrNor() || keyword.hasIterableValue()) {
if (query.key.matches(N_OR_PATTERN) || query.value instanceof Iterable) {

Iterable<?> conditions = keyword.getValue();
Iterable<?> conditions = (Iterable<?>) query.value;
BasicDBList newConditions = new BasicDBList();

for (Object condition : conditions) {
@@ -137,10 +126,10 @@ public class QueryMapper {
: convertSimpleOrDBObject(condition, entity));
}

return new BasicDBObject(keyword.getKey(), newConditions);
return new BasicDBObject(query.key, newConditions);
}

return new BasicDBObject(keyword.getKey(), convertSimpleOrDBObject(keyword.getValue(), entity));
return new BasicDBObject(query.key, convertSimpleOrDBObject(query.value, entity));
}

/**
@@ -153,12 +142,10 @@ public class QueryMapper {
private DBObject getMappedKeyword(Field property, Keyword keyword) {

boolean needsAssociationConversion = property.isAssociation() && !keyword.isExists();
Object value = keyword.getValue();
Object value = needsAssociationConversion ? convertAssociation(keyword.value, property.getProperty())
: getMappedValue(property.with(keyword.key), keyword.value);

Object convertedValue = needsAssociationConversion ? convertAssociation(value, property.getProperty())
: getMappedValue(property.with(keyword.getKey()), value);

return new BasicDBObject(keyword.key, convertedValue);
return new BasicDBObject(keyword.key, value);
}

/**
@@ -196,7 +183,7 @@ public class QueryMapper {
}
}

if (isNestedKeyword(value)) {
if (Keyword.isKeyword(value)) {
return getMappedKeyword(new Keyword((DBObject) value), null);
}

@@ -217,25 +204,13 @@ public class QueryMapper {
private Object convertSimpleOrDBObject(Object source, MongoPersistentEntity<?> entity) {

if (source instanceof BasicDBList) {
return delegateConvertToMongoType(source, entity);
return converter.convertToMongoType(source);
}

if (source instanceof DBObject) {
return getMappedObject((DBObject) source, entity);
}

return delegateConvertToMongoType(source, entity);
}

/**
* Converts the given source Object to a mongo type with the type information of the original source type omitted.
* Subclasses may overwrite this method to retain the type information of the source type on the resulting mongo type.
*
* @param source
* @param entity
* @return the converted mongo type or null if source is null
*/
protected Object delegateConvertToMongoType(Object source, MongoPersistentEntity<?> entity) {
return converter.convertToMongoType(source);
}

@@ -287,40 +262,7 @@ public class QueryMapper {
// Ignore
}

return delegateConvertToMongoType(id, null);
}

/**
* Returns whether the given {@link Object} is a keyword, i.e. if it's a {@link DBObject} with a keyword key.
*
* @param candidate
* @return
*/
protected boolean isNestedKeyword(Object candidate) {

if (!(candidate instanceof BasicDBObject)) {
return false;
}

BasicDBObject dbObject = (BasicDBObject) candidate;
Set<String> keys = dbObject.keySet();

if (keys.size() != 1) {
return false;
}

return isKeyword(keys.iterator().next().toString());
}

/**
* Returns whether the given {@link String} is a MongoDB keyword. The default implementation will check against the
* set of registered keywords returned by {@link #getKeywords()}.
*
* @param candidate
* @return
*/
protected boolean isKeyword(String candidate) {
return candidate.startsWith("$");
return converter.convertToMongoType(id);
}

/**
@@ -330,10 +272,8 @@ public class QueryMapper {
*/
private static class Keyword {

private static final String N_OR_PATTERN = "\\$.*or";

private final String key;
private final Object value;
String key;
Object value;

public Keyword(DBObject source, String key) {
this.key = key;
@@ -358,21 +298,25 @@ public class QueryMapper {
return "$exists".equalsIgnoreCase(key);
}

public boolean isOrOrNor() {
return key.matches(N_OR_PATTERN);
/**
* Returns whether the given value actually represents a keyword. If this returns {@literal true} it's safe to call
* the constructor.
*
* @param value
* @return
*/
public static boolean isKeyword(Object value) {

if (value instanceof String) {
return ((String) value).startsWith("$");
}

public boolean hasIterableValue() {
return value instanceof Iterable;
if (!(value instanceof DBObject)) {
return false;
}

public String getKey() {
return key;
}

@SuppressWarnings("unchecked")
public <T> T getValue() {
return (T) value;
DBObject dbObject = (DBObject) value;
return dbObject.keySet().size() == 1 && dbObject.keySet().iterator().next().startsWith("$");
}
}

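To illustrate the id handling that `getMappedObject` provides (central to the hunks above): the convenience keys `id` and `_id` are normalised to `_id` and their values run through id conversion. A rough sketch, assuming `converter` is a configured `MongoConverter` and `entity` the persistent entity of the queried class:

```java
QueryMapper mapper = new QueryMapper(converter);
DBObject mapped = mapper.getMappedObject(new BasicDBObject("id", "4711"), entity);
// mapped now looks roughly like: { "_id" : "4711" }, with the value converted where necessary
```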
@@ -388,7 +332,7 @@ public class QueryMapper {
protected final String name;

/**
* Creates a new {@link DocumentField} without meta-information but the given name.
* Creates a new {@link Field} without meta-information but the given name.
*
* @param name must not be {@literal null} or empty.
*/
@@ -399,7 +343,7 @@ public class QueryMapper {
}

/**
* Returns a new {@link DocumentField} with the given name.
* Returns a new {@link Field} with the given name.
*
* @param name must not be {@literal null} or empty.
* @return
@@ -455,7 +399,7 @@ public class QueryMapper {
}

/**
* Extension of {@link DocumentField} to be backed with mapping metadata.
* Extension of {@link Field} to be backed with mapping metadata.
*
* @author Oliver Gierke
*/

@@ -1,52 +0,0 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;

import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;

/**
* A subclass of {@link QueryMapper} that retains type information on the mongo types.
*
* @author Thomas Darimont
*/
public class UpdateMapper extends QueryMapper {

private final MongoWriter<?> converter;

/**
* Creates a new {@link UpdateMapper} using the given {@link MongoConverter}.
*
* @param converter must not be {@literal null}.
*/
public UpdateMapper(MongoConverter converter) {

super(converter);
this.converter = converter;
}

/**
* Converts the given source object to a mongo type retaining the original type information of the source type on the
* mongo type.
*
* @see org.springframework.data.mongodb.core.convert.QueryMapper#delegateConvertToMongoType(java.lang.Object,
* org.springframework.data.mongodb.core.mapping.MongoPersistentEntity)
*/
@Override
protected Object delegateConvertToMongoType(Object source, MongoPersistentEntity<?> entity) {
return entity == null ? super.delegateConvertToMongoType(source, null) : converter.convertToMongoType(source,
entity.getTypeInformation());
}
}
@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,8 +26,6 @@ import java.lang.annotation.Target;
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Philipp Schneider
* @author Johno Crawford
*/
@Target({ ElementType.TYPE })
@Documented
@@ -71,20 +69,4 @@ public @interface CompoundIndex {
* @return
*/
String collection() default "";

/**
* If {@literal true} the index will be created in the background.
*
* @see http://docs.mongodb.org/manual/core/indexes/#background-construction
* @return
*/
boolean background() default false;

/**
* Configures the number of seconds after which the collection should expire. Defaults to -1 for no expiry.
*
* @see http://docs.mongodb.org/manual/tutorial/expire-data/
* @return
*/
int expireAfterSeconds() default -1;
}

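The two attributes added above, `background` and `expireAfterSeconds`, are picked up by `MongoPersistentEntityIndexCreator` further down in this changeset. A hypothetical domain class showing how they might be combined with the existing `def` attribute (the class and field names are illustrative only):

```java
@CompoundIndex(def = "{ 'lastName' : 1, 'age' : -1 }", background = true, expireAfterSeconds = 3600)
class Person {

	String lastName;
	int age;   // documents expire an hour after insertion; index built in the background
}
```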
@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,20 +18,18 @@ package org.springframework.data.mongodb.core.index;
import java.util.LinkedHashMap;
import java.util.Map;

import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.query.Order;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

@SuppressWarnings("deprecation")
public class Index implements IndexDefinition {

public enum Duplicates {
RETAIN, DROP
}

private final Map<String, Direction> fieldSpec = new LinkedHashMap<String, Direction>();
private final Map<String, Order> fieldSpec = new LinkedHashMap<String, Order>();

private String name;

@@ -44,37 +42,12 @@ public class Index implements IndexDefinition {
public Index() {
}

public Index(String key, Direction direction) {
fieldSpec.put(key, direction);
}

/**
* Creates a new {@link Indexed} on the given key and {@link Order}.
*
* @deprecated use {@link #Index(String, Direction)} instead.
* @param key must not be {@literal null} or empty.
* @param order must not be {@literal null}.
*/
@Deprecated
public Index(String key, Order order) {
this(key, order.toDirection());
fieldSpec.put(key, order);
}

/**
* Adds the given field to the index.
*
* @deprecated use {@link #on(String, Direction)} instead.
* @param key must not be {@literal null} or empty.
* @param order must not be {@literal null}.
* @return
*/
@Deprecated
public Index on(String key, Order order) {
return on(key, order.toDirection());
}

public Index on(String key, Direction direction) {
fieldSpec.put(key, direction);
fieldSpec.put(key, order);
return this;
}

@@ -103,7 +76,7 @@ public class Index implements IndexDefinition {
public DBObject getIndexKeys() {
DBObject dbo = new BasicDBObject();
for (String k : fieldSpec.keySet()) {
dbo.put(k, fieldSpec.get(k).equals(Direction.ASC) ? 1 : -1);
dbo.put(k, (fieldSpec.get(k).equals(Order.ASCENDING) ? 1 : -1));
}
return dbo;
}

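The hunks above deprecate the `Order`-based constructor and `on` method in favour of `Sort.Direction`. A short sketch of the replacement API and what `getIndexKeys()` then yields:

```java
Index index = new Index().on("lastName", Direction.ASC).on("age", Direction.DESC);
DBObject keys = index.getIndexKeys();   // { "lastName" : 1, "age" : -1 }
```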
@@ -1,5 +1,5 @@
/*
* Copyright 2012-2013 the original author or authors.
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,7 +15,6 @@
*/
package org.springframework.data.mongodb.core.index;

import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
@@ -25,38 +24,30 @@ import org.springframework.util.ObjectUtils;
*
* @author Oliver Gierke
*/
@SuppressWarnings("deprecation")
public final class IndexField {

private final String key;
private final Direction direction;
private final Order order;
private final boolean isGeo;

private IndexField(String key, Direction direction, boolean isGeo) {
private IndexField(String key, Order order, boolean isGeo) {

Assert.hasText(key);
Assert.isTrue(direction != null ^ isGeo);
Assert.isTrue(order != null ^ isGeo);

this.key = key;
this.direction = direction;
this.order = order;
this.isGeo = isGeo;
}

/**
* Creates a default {@link IndexField} with the given key and {@link Order}.
*
* @deprecated use {@link #create(String, Direction)}.
* @param key must not be {@literal null} or emtpy.
* @param direction must not be {@literal null}.
* @param order must not be {@literal null}.
* @return
*/
@Deprecated
public static IndexField create(String key, Order order) {
Assert.notNull(order);
return new IndexField(key, order.toDirection(), false);
}

public static IndexField create(String key, Direction order) {
Assert.notNull(order);
return new IndexField(key, order, false);
}
@@ -79,23 +70,12 @@ public final class IndexField {
}

/**
* Returns the direction of the {@link IndexField} or {@literal null} in case we have a geo index field.
* Returns the order of the {@link IndexField} or {@literal null} in case we have a geo index field.
*
* @deprecated use {@link #getDirection()} instead.
* @return the direction
* @return the order
*/
@Deprecated
public Order getOrder() {
return Direction.ASC.equals(direction) ? Order.ASCENDING : Order.DESCENDING;
}

/**
* Returns the direction of the {@link IndexField} or {@literal null} in case we have a geo index field.
*
* @return the direction
*/
public Direction getDirection() {
return direction;
return order;
}

/**
@@ -124,8 +104,7 @@ public final class IndexField {

IndexField that = (IndexField) obj;

return this.key.equals(that.key) && ObjectUtils.nullSafeEquals(this.direction, that.direction)
&& this.isGeo == that.isGeo;
return this.key.equals(that.key) && ObjectUtils.nullSafeEquals(this.order, that.order) && this.isGeo == that.isGeo;
}

/*
@@ -137,7 +116,7 @@ public final class IndexField {

int result = 17;
result += 31 * ObjectUtils.nullSafeHashCode(key);
result += 31 * ObjectUtils.nullSafeHashCode(direction);
result += 31 * ObjectUtils.nullSafeHashCode(order);
result += 31 * ObjectUtils.nullSafeHashCode(isGeo);
return result;
}
@@ -148,6 +127,6 @@ public final class IndexField {
*/
@Override
public String toString() {
return String.format("IndexField [ key: %s, direction: %s, isGeo: %s]", key, direction, isGeo);
return String.format("IndexField [ key: %s, order: %s, isGeo: %s]", key, order, isGeo);
}
}

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -13,6 +13,7 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/

package org.springframework.data.mongodb.core.index;

import java.lang.annotation.ElementType;
@@ -23,10 +24,7 @@ import java.lang.annotation.Target;
/**
* Mark a field to be indexed using MongoDB's indexing feature.
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Philipp Schneider
* @author Johno Crawford
* @author Jon Brisbin <jbrisbin@vmware.com>
*/
@Target(ElementType.FIELD)
@Retention(RetentionPolicy.RUNTIME)
@@ -43,20 +41,4 @@ public @interface Indexed {
String name() default "";

String collection() default "";

/**
* If {@literal true} the index will be created in the background.
*
* @see http://docs.mongodb.org/manual/core/indexes/#background-construction
* @return
*/
boolean background() default false;

/**
* Configures the number of seconds after which the collection should expire. Defaults to -1 for no expiry.
*
* @see http://docs.mongodb.org/manual/tutorial/expire-data/
* @return
*/
int expireAfterSeconds() default -1;
}

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -43,8 +43,6 @@ import com.mongodb.util.JSON;
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Philipp Schneider
* @author Johno Crawford
*/
public class MongoPersistentEntityIndexCreator implements
ApplicationListener<MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty>> {
@@ -108,8 +106,7 @@ public class MongoPersistentEntityIndexCreator implements
String indexColl = StringUtils.hasText(index.collection()) ? index.collection() : entity.getCollection();
DBObject definition = (DBObject) JSON.parse(index.def());

ensureIndex(indexColl, index.name(), definition, index.unique(), index.dropDups(), index.sparse(),
index.background(), index.expireAfterSeconds());
ensureIndex(indexColl, index.name(), definition, index.unique(), index.dropDups(), index.sparse());

if (log.isDebugEnabled()) {
log.debug("Created compound index " + index);
@@ -143,8 +140,7 @@ public class MongoPersistentEntityIndexCreator implements
int direction = index.direction() == IndexDirection.ASCENDING ? 1 : -1;
DBObject definition = new BasicDBObject(persistentProperty.getFieldName(), direction);

ensureIndex(collection, name, definition, index.unique(), index.dropDups(), index.sparse(),
index.background(), index.expireAfterSeconds());
ensureIndex(collection, name, definition, index.unique(), index.dropDups(), index.sparse());

if (log.isDebugEnabled()) {
log.debug("Created property index " + index);
@@ -193,22 +189,15 @@ public class MongoPersistentEntityIndexCreator implements
* @param unique whether it shall be a unique index
* @param dropDups whether to drop duplicates
* @param sparse sparse or not
* @param background whether the index will be created in the background
* @param expireAfterSeconds the time to live for documents in the collection
*/
protected void ensureIndex(String collection, String name, DBObject indexDefinition, boolean unique,
boolean dropDups, boolean sparse, boolean background, int expireAfterSeconds) {
boolean dropDups, boolean sparse) {

DBObject opts = new BasicDBObject();
opts.put("name", name);
opts.put("dropDups", dropDups);
opts.put("sparse", sparse);
opts.put("unique", unique);
opts.put("background", background);

if (expireAfterSeconds != -1) {
opts.put("expireAfterSeconds", expireAfterSeconds);
}

mongoDbFactory.getDb().getCollection(collection).ensureIndex(indexDefinition, opts);
}

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,29 +15,20 @@
*/
package org.springframework.data.mongodb.core.mapping;

import java.lang.reflect.Field;
import java.util.Comparator;
import java.util.HashMap;
import java.util.Map;

import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.context.expression.BeanFactoryAccessor;
import org.springframework.context.expression.BeanFactoryResolver;
import org.springframework.data.annotation.Id;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.AssociationHandler;
import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mapping.model.BasicPersistentEntity;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.MongoCollectionUtils;
import org.springframework.data.util.TypeInformation;
import org.springframework.expression.Expression;
import org.springframework.expression.ParserContext;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.expression.spel.support.StandardEvaluationContext;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;

/**
@@ -46,12 +37,10 @@ import org.springframework.util.StringUtils;
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, MongoPersistentProperty> implements
MongoPersistentEntity<T>, ApplicationContextAware {

private static final String AMBIGUOUS_FIELD_MAPPING = "Ambiguous field mapping detected! Both %s and %s map to the same field name %s! Disambiguate using @DocumentField annotation!";
private final String collection;
private final SpelExpressionParser parser;
private final StandardEvaluationContext context;
@@ -100,19 +89,6 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
return expression.getValue(context, String.class);
}

/*
* (non-Javadoc)
* @see org.springframework.data.mapping.model.BasicPersistentEntity#verify()
*/
@Override
public void verify() {

AssertFieldNameUniquenessHandler handler = new AssertFieldNameUniquenessHandler();

doWithProperties(handler);
doWithAssociations(handler);
}

/**
* {@link Comparator} implementation inspecting the {@link MongoPersistentProperty}'s order.
*
@@ -139,91 +115,4 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
return o1.getFieldOrder() - o2.getFieldOrder();
}
}

/**
* As a general note: An implicit id property has a name that matches "id" or "_id". An explicit id property is one
* that is annotated with @see {@link Id}. The property id is updated according to the following rules: 1) An id
* property which is defined explicitly takes precedence over an implicitly defined id property. 2) In case of any
* ambiguity a @see {@link MappingException} is thrown.
*
* @param property - the new id property candidate
* @return
*/
@Override
protected MongoPersistentProperty returnPropertyIfBetterIdPropertyCandidateOrNull(MongoPersistentProperty property) {

Assert.notNull(property);

if (!property.isIdProperty()) {
return null;
}

MongoPersistentProperty currentIdProperty = getIdProperty();

boolean currentIdPropertyIsSet = currentIdProperty != null;
@SuppressWarnings("null")
boolean currentIdPropertyIsExplicit = currentIdPropertyIsSet ? currentIdProperty.isExplicitIdProperty() : false;
boolean newIdPropertyIsExplicit = property.isExplicitIdProperty();

if (!currentIdPropertyIsSet) {
return property;

}

@SuppressWarnings("null")
Field currentIdPropertyField = currentIdProperty.getField();

if (newIdPropertyIsExplicit && currentIdPropertyIsExplicit) {
throw new MappingException(String.format(
"Attempt to add explicit id property %s but already have an property %s registered "
+ "as explicit id. Check your mapping configuration!", property.getField(), currentIdPropertyField));

} else if (newIdPropertyIsExplicit && !currentIdPropertyIsExplicit) {
// explicit id property takes precedence over implicit id property
return property;

} else if (!newIdPropertyIsExplicit && currentIdPropertyIsExplicit) {
// no id property override - current property is explicitly defined

} else {
throw new MappingException(String.format(
"Attempt to add id property %s but already have an property %s registered "
+ "as id. Check your mapping configuration!", property.getField(), currentIdPropertyField));
}

return null;
}

/**
* Handler to collect {@link MongoPersistentProperty} instances and check that each of them is mapped to a distinct
* field name.
*
* @author Oliver Gierke
*/
private static class AssertFieldNameUniquenessHandler implements PropertyHandler<MongoPersistentProperty>,
AssociationHandler<MongoPersistentProperty> {

private final Map<String, MongoPersistentProperty> properties = new HashMap<String, MongoPersistentProperty>();

public void doWithPersistentProperty(MongoPersistentProperty persistentProperty) {
assertUniqueness(persistentProperty);
}

public void doWithAssociation(Association<MongoPersistentProperty> association) {
assertUniqueness(association.getInverse());
}

private void assertUniqueness(MongoPersistentProperty property) {

String fieldName = property.getFieldName();
MongoPersistentProperty existingProperty = properties.get(fieldName);

if (existingProperty != null) {
throw new MappingException(String.format(AMBIGUOUS_FIELD_MAPPING, property.toString(),
existingProperty.toString(), fieldName));
}

properties.put(fieldName, property);
}
}
}

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,10 +24,8 @@ import java.util.Set;
import org.bson.types.ObjectId;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.annotation.Id;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.model.AnnotationBasedPersistentProperty;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.util.ReflectionUtils;
import org.springframework.util.StringUtils;
@@ -39,7 +37,6 @@ import com.mongodb.DBObject;
*
* @author Oliver Gierke
* @author Patryk Wasik
* @author Thomas Darimont
*/
public class BasicMongoPersistentProperty extends AnnotationBasedPersistentProperty<MongoPersistentProperty> implements
MongoPersistentProperty {
@@ -63,8 +60,6 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
CAUSE_FIELD = ReflectionUtils.findField(Throwable.class, "cause");
}

private final FieldNamingStrategy fieldNamingStrategy;

/**
* Creates a new {@link BasicMongoPersistentProperty}.
*
@@ -72,14 +67,10 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
* @param propertyDescriptor
* @param owner
* @param simpleTypeHolder
* @param fieldNamingStrategy
*/
public BasicMongoPersistentProperty(Field field, PropertyDescriptor propertyDescriptor,
MongoPersistentEntity<?> owner, SimpleTypeHolder simpleTypeHolder, FieldNamingStrategy fieldNamingStrategy) {

MongoPersistentEntity<?> owner, SimpleTypeHolder simpleTypeHolder) {
super(field, propertyDescriptor, owner, simpleTypeHolder);
this.fieldNamingStrategy = fieldNamingStrategy == null ? PropertyNameFieldNamingStrategy.INSTANCE
: fieldNamingStrategy;

if (isIdProperty() && getFieldName() != ID_FIELD_NAME) {
LOG.warn("Customizing field name for id property not allowed! Custom name will not be considered!");
@@ -111,15 +102,6 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
return SUPPORTED_ID_PROPERTY_NAMES.contains(field.getName());
}

/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#isExplicitIdProperty()
*/
@Override
public boolean isExplicitIdProperty() {
return isAnnotationPresent(Id.class);
}

/**
* Returns the key to be used to store the value of the property inside a Mongo {@link DBObject}.
*
@@ -128,34 +110,12 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
public String getFieldName() {

if (isIdProperty()) {

if (owner == null) {
return ID_FIELD_NAME;
}

if (owner.getIdProperty() == null) {
return ID_FIELD_NAME;
}

if (owner.isIdProperty(this)) {
return ID_FIELD_NAME;
}
}

org.springframework.data.mongodb.core.mapping.Field annotation = findAnnotation(org.springframework.data.mongodb.core.mapping.Field.class);

if (annotation != null && StringUtils.hasText(annotation.value())) {
return annotation.value();
}

String fieldName = fieldNamingStrategy.getFieldName(this);

if (!StringUtils.hasText(fieldName)) {
throw new MappingException(String.format("Invalid (null or empty) field name returned for property %s by %s!",
this, fieldNamingStrategy.getClass()));
}

return fieldName;
org.springframework.data.mongodb.core.mapping.Field annotation = getField().getAnnotation(
org.springframework.data.mongodb.core.mapping.Field.class);
return annotation != null && StringUtils.hasText(annotation.value()) ? annotation.value() : field.getName();
}

/*

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -38,11 +38,10 @@ public class CachingMongoPersistentProperty extends BasicMongoPersistentProperty
* @param propertyDescriptor
* @param owner
* @param simpleTypeHolder
* @param fieldNamingStrategy
*/
public CachingMongoPersistentProperty(Field field, PropertyDescriptor propertyDescriptor,
MongoPersistentEntity<?> owner, SimpleTypeHolder simpleTypeHolder, FieldNamingStrategy fieldNamingStrategy) {
super(field, propertyDescriptor, owner, simpleTypeHolder, fieldNamingStrategy);
MongoPersistentEntity<?> owner, SimpleTypeHolder simpleTypeHolder) {
super(field, propertyDescriptor, owner, simpleTypeHolder);
}

/*

@@ -1,46 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.mapping;
|
||||
|
||||
import java.util.Locale;
|
||||
|
||||
/**
|
||||
* {@link FieldNamingStrategy} that abbreviates field names by using the very first letter of the camel case parts of
|
||||
* the {@link MongoPersistentProperty}'s name.
|
||||
*
|
||||
* @since 1.3
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
public class CamelCaseAbbreviatingFieldNamingStrategy implements FieldNamingStrategy {
|
||||
|
||||
private static final String CAMEL_CASE_PATTERN = "(?<!(^|[A-Z]))(?=[A-Z])|(?<!^)(?=[A-Z][a-z])";
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.mapping.FieldNamingStrategy#getFieldName(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty)
|
||||
*/
|
||||
public String getFieldName(MongoPersistentProperty property) {
|
||||
|
||||
String[] parts = property.getName().split(CAMEL_CASE_PATTERN);
|
||||
StringBuilder builder = new StringBuilder();
|
||||
|
||||
for (String part : parts) {
|
||||
builder.append(part.substring(0, 1).toLowerCase(Locale.US));
|
||||
}
|
||||
|
||||
return builder.toString();
|
||||
}
|
||||
}
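
To make the abbreviation concrete, here is a small sketch of what the strategy produces for a few made-up property names; the splitting follows the camel-case pattern declared above:

```java
import org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.mapping.FieldNamingStrategy;

class AbbreviationSample {

    // "firstName"   splits into ["first", "Name"]       -> stored as "fn"
    // "dateOfBirth" splits into ["date", "Of", "Birth"] -> stored as "dob"
    // a single-part name such as "lastname"             -> stored as "l"
    FieldNamingStrategy strategy = new CamelCaseAbbreviatingFieldNamingStrategy();
}
```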
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2013 by the original author(s).
|
||||
* Copyright 2011-2012 by the original author(s).
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -27,8 +27,7 @@ import org.springframework.data.annotation.Reference;
|
||||
* An annotation that indicates the annotated field is to be stored using a {@link com.mongodb.DBRef}.
|
||||
*
|
||||
* @author Jon Brisbin
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
* @authot Oliver Gierke
|
||||
*/
|
||||
@Documented
|
||||
@Retention(RetentionPolicy.RUNTIME)
|
||||
@@ -42,11 +41,4 @@ public @interface DBRef {
|
||||
* @return
|
||||
*/
|
||||
String db() default "";
|
||||
|
||||
/**
|
||||
* Controls whether the referenced entity should be loaded lazily. This defaults to {@literal false}.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
boolean lazy() default false;
|
||||
}
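
The new lazy attribute defaults to false, which keeps the eager resolution behaviour. A minimal sketch of both variants; the Account, Customer and Transaction types are made up:

```java
import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;

class Customer {}    // stand-in referenced type
class Transaction {} // stand-in referenced type

@Document
class Account {

    @Id
    String id;

    // Default: the referenced Customer document is resolved when the Account is loaded.
    @DBRef
    Customer owner;

    // lazy = true: resolution of the referenced documents is deferred until first access.
    @DBRef(lazy = true)
    List<Transaction> transactions;
}
```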
|
||||
|
||||
@@ -1,36 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.mapping;
|
||||
|
||||
/**
|
||||
* SPI interface to determine how to name document fields in case the field name is not manually defined.

|
||||
*
|
||||
* @see DocumentField
|
||||
* @see PropertyNameFieldNamingStrategy
|
||||
* @see CamelCaseAbbreviatingFieldNamingStrategy
|
||||
* @since 1.3
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
public interface FieldNamingStrategy {
|
||||
|
||||
/**
|
||||
* Returns the field name to be used for the given {@link MongoPersistentProperty}.
|
||||
*
|
||||
* @param property must not be {@literal null} or empty.
|
||||
* @return
|
||||
*/
|
||||
String getFieldName(MongoPersistentProperty property);
|
||||
}
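
Since the SPI is only consulted when no explicit @Field value is present, a custom strategy can change the on-disk naming globally. A minimal sketch of a snake-case strategy and of wiring it into the MongoMappingContext via the setFieldNamingStrategy(..) setter shown further down in this diff (the strategy class itself is a made-up example):

```java
import org.springframework.data.mongodb.core.mapping.FieldNamingStrategy;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;

class SnakeCaseFieldNamingStrategy implements FieldNamingStrategy {

    public String getFieldName(MongoPersistentProperty property) {
        // "dateOfBirth" -> "date_of_birth"
        return property.getName().replaceAll("(?<=[a-z0-9])(?=[A-Z])", "_").toLowerCase();
    }
}

class MappingContextSetup {

    MongoMappingContext mongoMappingContext() {
        MongoMappingContext context = new MongoMappingContext();
        context.setFieldNamingStrategy(new SnakeCaseFieldNamingStrategy());
        return context;
    }
}
```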
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2013 the original author or authors.
|
||||
* Copyright 2011-2012 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -37,9 +37,6 @@ import org.springframework.data.util.TypeInformation;
|
||||
public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersistentEntity<?>, MongoPersistentProperty>
|
||||
implements ApplicationContextAware {
|
||||
|
||||
private static final FieldNamingStrategy DEFAULT_NAMING_STRATEGY = PropertyNameFieldNamingStrategy.INSTANCE;
|
||||
|
||||
private FieldNamingStrategy fieldNamingStrategy = DEFAULT_NAMING_STRATEGY;
|
||||
private ApplicationContext context;
|
||||
|
||||
/**
|
||||
@@ -49,17 +46,6 @@ public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersis
|
||||
setSimpleTypeHolder(MongoSimpleTypes.HOLDER);
|
||||
}
|
||||
|
||||
/**
|
||||
* Configures the {@link FieldNamingStrategy} to be used to determine the field name if no manual mapping is applied.
|
||||
* Defaults to a strategy using the plain property name.
|
||||
*
|
||||
* @param fieldNamingStrategy the {@link FieldNamingStrategy} to be used to determine the field name if no manual
|
||||
* mapping is applied.
|
||||
*/
|
||||
public void setFieldNamingStrategy(FieldNamingStrategy fieldNamingStrategy) {
|
||||
this.fieldNamingStrategy = fieldNamingStrategy == null ? DEFAULT_NAMING_STRATEGY : fieldNamingStrategy;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mapping.context.AbstractMappingContext#shouldCreatePersistentEntityFor(org.springframework.data.util.TypeInformation)
|
||||
@@ -76,7 +62,7 @@ public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersis
|
||||
@Override
|
||||
public MongoPersistentProperty createPersistentProperty(Field field, PropertyDescriptor descriptor,
|
||||
BasicMongoPersistentEntity<?> owner, SimpleTypeHolder simpleTypeHolder) {
|
||||
return new CachingMongoPersistentProperty(field, descriptor, owner, simpleTypeHolder, fieldNamingStrategy);
|
||||
return new CachingMongoPersistentProperty(field, descriptor, owner, simpleTypeHolder);
|
||||
}
|
||||
|
||||
/*
|
||||
@@ -101,6 +87,8 @@ public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersis
|
||||
*/
|
||||
@Override
|
||||
public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
|
||||
|
||||
this.context = applicationContext;
|
||||
super.setApplicationContext(applicationContext);
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2013 the original author or authors.
|
||||
* Copyright 2011-2012 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -16,8 +16,6 @@
|
||||
package org.springframework.data.mongodb.core.mapping;
|
||||
|
||||
import org.springframework.core.convert.converter.Converter;
|
||||
import org.springframework.data.annotation.Id;
|
||||
import org.springframework.data.mapping.PersistentEntity;
|
||||
import org.springframework.data.mapping.PersistentProperty;
|
||||
|
||||
/**
|
||||
@@ -25,7 +23,6 @@ import org.springframework.data.mapping.PersistentProperty;
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
* @author Patryk Wasik
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
public interface MongoPersistentProperty extends PersistentProperty<MongoPersistentProperty> {
|
||||
|
||||
@@ -51,14 +48,6 @@ public interface MongoPersistentProperty extends PersistentProperty<MongoPersist
|
||||
*/
|
||||
boolean isDbReference();
|
||||
|
||||
/**
|
||||
* Returns whether the property is explicitly marked as an identifier property of the owning {@link PersistentEntity}.
|
||||
* A property is an explicit id property if it is annotated with {@link Id}.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
boolean isExplicitIdProperty();
|
||||
|
||||
/**
|
||||
* Returns the {@link DBRef} if the property is a reference.
|
||||
*
|
||||
|
||||
@@ -1,35 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.mapping;
|
||||
|
||||
/**
|
||||
* {@link FieldNamingStrategy} simply using the {@link MongoPersistentProperty}'s name.
|
||||
*
|
||||
* @since 1.3
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
public enum PropertyNameFieldNamingStrategy implements FieldNamingStrategy {
|
||||
|
||||
INSTANCE;
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.mapping.FieldNamingStrategy#getFieldName(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty)
|
||||
*/
|
||||
public String getFieldName(MongoPersistentProperty property) {
|
||||
return property.getName();
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,167 @@
|
||||
/*
|
||||
* Copyright 2012-2012 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.mapping;
|
||||
|
||||
import java.beans.PropertyDescriptor;
|
||||
import java.lang.reflect.Field;
|
||||
import java.util.Arrays;
|
||||
import java.util.List;
|
||||
|
||||
import org.springframework.data.mapping.Association;
|
||||
import org.springframework.data.mapping.context.AbstractMappingContext;
|
||||
import org.springframework.data.mapping.model.AbstractPersistentProperty;
|
||||
import org.springframework.data.mapping.model.BasicPersistentEntity;
|
||||
import org.springframework.data.mapping.model.SimpleTypeHolder;
|
||||
import org.springframework.data.mongodb.MongoCollectionUtils;
|
||||
import org.springframework.data.util.TypeInformation;
|
||||
|
||||
/**
|
||||
* @deprecated use {@link MongoMappingContext} instead.
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
@Deprecated
|
||||
public class SimpleMongoMappingContext extends
|
||||
AbstractMappingContext<SimpleMongoMappingContext.SimpleMongoPersistentEntity<?>, MongoPersistentProperty> {
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mapping.context.AbstractMappingContext#createPersistentEntity(org.springframework.data.util.TypeInformation)
|
||||
*/
|
||||
@Override
|
||||
protected <T> SimpleMongoPersistentEntity<T> createPersistentEntity(TypeInformation<T> typeInformation) {
|
||||
return new SimpleMongoPersistentEntity<T>(typeInformation);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mapping.context.AbstractMappingContext#createPersistentProperty(java.lang.reflect.Field, java.beans.PropertyDescriptor, org.springframework.data.mapping.model.MutablePersistentEntity, org.springframework.data.mapping.model.SimpleTypeHolder)
|
||||
*/
|
||||
@Override
|
||||
protected SimplePersistentProperty createPersistentProperty(Field field, PropertyDescriptor descriptor,
|
||||
SimpleMongoPersistentEntity<?> owner, SimpleTypeHolder simpleTypeHolder) {
|
||||
return new SimplePersistentProperty(field, descriptor, owner, simpleTypeHolder);
|
||||
}
|
||||
|
||||
static class SimplePersistentProperty extends AbstractPersistentProperty<MongoPersistentProperty> implements
|
||||
MongoPersistentProperty {
|
||||
|
||||
private static final List<String> ID_FIELD_NAMES = Arrays.asList("id", "_id");
|
||||
|
||||
/**
|
||||
* Creates a new {@link SimplePersistentProperty}.
|
||||
*
|
||||
* @param field
|
||||
* @param propertyDescriptor
|
||||
* @param information
|
||||
*/
|
||||
public SimplePersistentProperty(Field field, PropertyDescriptor propertyDescriptor, MongoPersistentEntity<?> owner,
|
||||
SimpleTypeHolder simpleTypeHolder) {
|
||||
super(field, propertyDescriptor, owner, simpleTypeHolder);
|
||||
}
|
||||
|
||||
/* (non-Javadoc)
|
||||
* @see org.springframework.data.mapping.BasicPersistentProperty#isIdProperty()
|
||||
*/
|
||||
public boolean isIdProperty() {
|
||||
return ID_FIELD_NAMES.contains(field.getName());
|
||||
}
|
||||
|
||||
/* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentProperty#getKey()
|
||||
*/
|
||||
public String getFieldName() {
|
||||
return isIdProperty() ? "_id" : getName();
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentProperty#getFieldOrder()
|
||||
*/
|
||||
public int getFieldOrder() {
|
||||
return Integer.MAX_VALUE;
|
||||
}
|
||||
|
||||
/* (non-Javadoc)
|
||||
* @see org.springframework.data.mapping.AbstractPersistentProperty#createAssociation()
|
||||
*/
|
||||
@Override
|
||||
protected Association<MongoPersistentProperty> createAssociation() {
|
||||
return new Association<MongoPersistentProperty>(this, null);
|
||||
}
|
||||
|
||||
/* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentProperty#isDbReference()
|
||||
*/
|
||||
public boolean isDbReference() {
|
||||
return false;
|
||||
}
|
||||
|
||||
/* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentProperty#getDBRef()
|
||||
*/
|
||||
public DBRef getDBRef() {
|
||||
return null;
|
||||
}
|
||||
|
||||
/* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentProperty#isVersion()
|
||||
*/
|
||||
public boolean isVersionProperty() {
|
||||
return false;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#usePropertyAccess()
|
||||
*/
|
||||
public boolean usePropertyAccess() {
|
||||
return false;
|
||||
}
|
||||
}
|
||||
|
||||
static class SimpleMongoPersistentEntity<T> extends BasicPersistentEntity<T, MongoPersistentProperty> implements
|
||||
MongoPersistentEntity<T> {
|
||||
|
||||
/**
|
||||
* @param information
|
||||
*/
|
||||
public SimpleMongoPersistentEntity(TypeInformation<T> information) {
|
||||
super(information);
|
||||
}
|
||||
|
||||
/* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentEntity#getCollection()
|
||||
*/
|
||||
public String getCollection() {
|
||||
return MongoCollectionUtils.getPreferredCollectionName(getType());
|
||||
}
|
||||
|
||||
/* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentEntity#getVersionProperty()
|
||||
*/
|
||||
public MongoPersistentProperty getVersionProperty() {
|
||||
return null;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentEntity#hasVersionProperty()
|
||||
*/
|
||||
public boolean hasVersionProperty() {
|
||||
return false;
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -1,50 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 by the original author(s).
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.mapping.event;
|
||||
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* Base class for delete events.
|
||||
*
|
||||
* @author Martin Baumgartner
|
||||
*/
|
||||
public abstract class AbstractDeleteEvent<T> extends MongoMappingEvent<DBObject> {
|
||||
|
||||
private static final long serialVersionUID = 1L;
|
||||
private final Class<T> type;
|
||||
|
||||
/**
|
||||
* Creates a new {@link AbstractDeleteEvent} for the given {@link DBObject} and type.
|
||||
*
|
||||
* @param dbo must not be {@literal null}.
|
||||
* @param type can be {@literal null}.
|
||||
*/
|
||||
public AbstractDeleteEvent(DBObject dbo, Class<T> type) {
|
||||
|
||||
super(dbo, dbo);
|
||||
this.type = type;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the type for which the {@link AbstractDeleteEvent} shall be invoked.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public Class<T> getType() {
|
||||
return type;
|
||||
}
|
||||
}
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2013 by the original author(s).
|
||||
* Copyright 2011 by the original author(s).
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -27,7 +27,6 @@ import com.mongodb.DBObject;
|
||||
*
|
||||
* @author Jon Brisbin
|
||||
* @author Oliver Gierke
|
||||
* @author Martin Baumgartner
|
||||
*/
|
||||
public abstract class AbstractMongoEventListener<E> implements ApplicationListener<MongoMappingEvent<?>> {
|
||||
|
||||
@@ -46,7 +45,6 @@ public abstract class AbstractMongoEventListener<E> implements ApplicationListen
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.context.ApplicationListener#onApplicationEvent(org.springframework.context.ApplicationEvent)
|
||||
*/
|
||||
@SuppressWarnings("rawtypes")
|
||||
public void onApplicationEvent(MongoMappingEvent<?> event) {
|
||||
|
||||
if (event instanceof AfterLoadEvent) {
|
||||
@@ -59,22 +57,6 @@ public abstract class AbstractMongoEventListener<E> implements ApplicationListen
|
||||
return;
|
||||
}
|
||||
|
||||
if (event instanceof AbstractDeleteEvent) {
|
||||
|
||||
Class<?> eventDomainType = ((AbstractDeleteEvent) event).getType();
|
||||
|
||||
if (eventDomainType != null && domainClass.isAssignableFrom(eventDomainType)) {
|
||||
if (event instanceof BeforeDeleteEvent) {
|
||||
onBeforeDelete(event.getDBObject());
|
||||
}
|
||||
if (event instanceof AfterDeleteEvent) {
|
||||
onAfterDelete(event.getDBObject());
|
||||
}
|
||||
}
|
||||
|
||||
return;
|
||||
}
|
||||
|
||||
@SuppressWarnings("unchecked")
|
||||
E source = (E) event.getSource();
|
||||
|
||||
@@ -96,43 +78,31 @@ public abstract class AbstractMongoEventListener<E> implements ApplicationListen
|
||||
|
||||
public void onBeforeConvert(E source) {
|
||||
if (LOG.isDebugEnabled()) {
|
||||
LOG.debug("onBeforeConvert({})", source);
|
||||
LOG.debug("onBeforeConvert(" + source + ")");
|
||||
}
|
||||
}
|
||||
|
||||
public void onBeforeSave(E source, DBObject dbo) {
|
||||
if (LOG.isDebugEnabled()) {
|
||||
LOG.debug("onBeforeSave({}, {})", source, dbo);
|
||||
LOG.debug("onBeforeSave(" + source + ", " + dbo + ")");
|
||||
}
|
||||
}
|
||||
|
||||
public void onAfterSave(E source, DBObject dbo) {
|
||||
if (LOG.isDebugEnabled()) {
|
||||
LOG.debug("onAfterSave({}, {})", source, dbo);
|
||||
LOG.debug("onAfterSave(" + source + ", " + dbo + ")");
|
||||
}
|
||||
}
|
||||
|
||||
public void onAfterLoad(DBObject dbo) {
|
||||
if (LOG.isDebugEnabled()) {
|
||||
LOG.debug("onAfterLoad({})", dbo);
|
||||
LOG.debug("onAfterLoad(" + dbo + ")");
|
||||
}
|
||||
}
|
||||
|
||||
public void onAfterConvert(DBObject dbo, E source) {
|
||||
if (LOG.isDebugEnabled()) {
|
||||
LOG.debug("onAfterConvert({}, {})", dbo, source);
|
||||
}
|
||||
}
|
||||
|
||||
public void onAfterDelete(DBObject dbo) {
|
||||
if (LOG.isDebugEnabled()) {
|
||||
LOG.debug("onAfterDelete({})", dbo);
|
||||
}
|
||||
}
|
||||
|
||||
public void onBeforeDelete(DBObject dbo) {
|
||||
if (LOG.isDebugEnabled()) {
|
||||
LOG.debug("onBeforeDelete({})", dbo);
|
||||
LOG.debug("onAfterConvert(" + dbo + "," + source + ")");
|
||||
}
|
||||
}
|
||||
}
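
Given the dispatch logic above, a delete event is only forwarded when the event's declared domain type is assignable to the listener's type parameter. A minimal sketch of a listener reacting to delete events for a made-up Customer type; it has to be registered as a bean in the same ApplicationContext as the MongoTemplate publishing the events:

```java
import org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener;

import com.mongodb.DBObject;

class Customer {} // stand-in domain type

class CustomerDeleteListener extends AbstractMongoEventListener<Customer> {

    @Override
    public void onBeforeDelete(DBObject dbo) {
        // the raw query document used for the remove operation
        System.out.println("About to delete Customer documents matching " + dbo);
    }

    @Override
    public void onAfterDelete(DBObject dbo) {
        System.out.println("Deleted Customer documents matching " + dbo);
    }
}
```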
|
||||
|
||||
@@ -1,39 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 by the original author(s).
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.mapping.event;
|
||||
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* Event being thrown after a single or a set of documents has/have been deleted. The {@link DBObject} held in the event
|
||||
* will be the query document <em>after</em> it has been mapped onto the domain type handled.
|
||||
*
|
||||
* @author Martin Baumgartner
|
||||
*/
|
||||
public class AfterDeleteEvent<T> extends AbstractDeleteEvent<T> {
|
||||
|
||||
private static final long serialVersionUID = 1L;
|
||||
|
||||
/**
|
||||
* Creates a new {@link AfterDeleteEvent} for the given {@link DBObject} and type.
|
||||
*
|
||||
* @param dbo must not be {@literal null}.
|
||||
* @param type can be {@literal null}.
|
||||
*/
|
||||
public AfterDeleteEvent(DBObject dbo, Class<T> type) {
|
||||
super(dbo, type);
|
||||
}
|
||||
}
|
||||
@@ -1,39 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 by the original author(s).
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.mapping.event;
|
||||
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* Event being thrown before a document is deleted. The {@link DBObject} held in the event will represent the query
|
||||
* document <em>before</em> being mapped based on the domain class handled.
|
||||
*
|
||||
* @author Martin Baumgartner
|
||||
*/
|
||||
public class BeforeDeleteEvent<T> extends AbstractDeleteEvent<T> {
|
||||
|
||||
private static final long serialVersionUID = -2627547705679734497L;
|
||||
|
||||
/**
|
||||
* Creates a new {@link BeforeDeleteEvent} for the given {@link DBObject} and type.
|
||||
*
|
||||
* @param dbo must not be {@literal null}.
|
||||
* @param type can be {@literal null}.
|
||||
*/
|
||||
public BeforeDeleteEvent(DBObject dbo, Class<T> type) {
|
||||
super(dbo, type);
|
||||
}
|
||||
}
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2011-2013 the original author or authors.
|
||||
* Copyright 2011-2012 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -17,19 +17,15 @@ package org.springframework.data.mongodb.core.mapping.event;
|
||||
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.context.ApplicationListener;
|
||||
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* {@link ApplicationListener} for Mongo mapping events logging the events.
|
||||
*
|
||||
* @author Jon Brisbin
|
||||
* @author Martin Baumgartner
|
||||
* @author Jon Brisbin <jbrisbin@vmware.com>
|
||||
*/
|
||||
public class LoggingEventListener extends AbstractMongoEventListener<Object> {
|
||||
|
||||
private static final Logger LOGGER = LoggerFactory.getLogger(LoggingEventListener.class);
|
||||
private static final Logger log = LoggerFactory.getLogger(LoggingEventListener.class);
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
@@ -37,7 +33,7 @@ public class LoggingEventListener extends AbstractMongoEventListener<Object> {
|
||||
*/
|
||||
@Override
|
||||
public void onBeforeConvert(Object source) {
|
||||
LOGGER.info("onBeforeConvert: {}", source);
|
||||
log.info("onBeforeConvert: " + source);
|
||||
}
|
||||
|
||||
/*
|
||||
@@ -46,7 +42,10 @@ public class LoggingEventListener extends AbstractMongoEventListener<Object> {
|
||||
*/
|
||||
@Override
|
||||
public void onBeforeSave(Object source, DBObject dbo) {
|
||||
LOGGER.info("onBeforeSave: {}, {}", source, dbo);
|
||||
try {
|
||||
log.info("onBeforeSave: " + source + ", " + dbo);
|
||||
} catch (Throwable ignored) {
|
||||
}
|
||||
}
|
||||
|
||||
/*
|
||||
@@ -55,7 +54,7 @@ public class LoggingEventListener extends AbstractMongoEventListener<Object> {
|
||||
*/
|
||||
@Override
|
||||
public void onAfterSave(Object source, DBObject dbo) {
|
||||
LOGGER.info("onAfterSave: {}, {}", source, dbo);
|
||||
log.info("onAfterSave: " + source + ", " + dbo);
|
||||
}
|
||||
|
||||
/*
|
||||
@@ -64,7 +63,7 @@ public class LoggingEventListener extends AbstractMongoEventListener<Object> {
|
||||
*/
|
||||
@Override
|
||||
public void onAfterLoad(DBObject dbo) {
|
||||
LOGGER.info("onAfterLoad: {}", dbo);
|
||||
log.info("onAfterLoad: " + dbo);
|
||||
}
|
||||
|
||||
/*
|
||||
@@ -73,24 +72,6 @@ public class LoggingEventListener extends AbstractMongoEventListener<Object> {
|
||||
*/
|
||||
@Override
|
||||
public void onAfterConvert(DBObject dbo, Object source) {
|
||||
LOGGER.info("onAfterConvert: {}, {}", dbo, source);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener#onAfterDelete(com.mongodb.DBObject)
|
||||
*/
|
||||
@Override
|
||||
public void onAfterDelete(DBObject dbo) {
|
||||
LOGGER.info("onAfterDelete: {}", dbo);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener#onBeforeDelete(com.mongodb.DBObject)
|
||||
*/
|
||||
@Override
|
||||
public void onBeforeDelete(DBObject dbo) {
|
||||
LOGGER.info("onBeforeDelete: {}", dbo);
|
||||
log.info("onAfterConvert: " + dbo + ", " + source);
|
||||
}
|
||||
}
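
One possible way to activate the listener is simply registering it as a bean next to the MongoTemplate, since mapping events are published through the ApplicationContext. A minimal sketch, assuming Java-based configuration is in use:

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.mapping.event.LoggingEventListener;

@Configuration
class MongoEventLoggingConfig {

    // Logs every mapping event (convert, save, load, delete) published by MongoTemplate.
    @Bean
    LoggingEventListener mappingEventsLogger() {
        return new LoggingEventListener();
    }
}
```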
|
||||
|
||||
@@ -17,26 +17,16 @@ package org.springframework.data.mongodb.core.query;
|
||||
|
||||
import java.util.HashMap;
|
||||
import java.util.Map;
|
||||
import java.util.Map.Entry;
|
||||
|
||||
import org.springframework.util.Assert;
|
||||
import org.springframework.util.ObjectUtils;
|
||||
|
||||
import com.mongodb.BasicDBObject;
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* @author Thomas Risberg
|
||||
* @author Oliver Gierke
|
||||
* @author Patryk Wasik
|
||||
*/
|
||||
public class Field {
|
||||
|
||||
private final Map<String, Integer> criteria = new HashMap<String, Integer>();
|
||||
private final Map<String, Object> slices = new HashMap<String, Object>();
|
||||
private final Map<String, Criteria> elemMatchs = new HashMap<String, Criteria>();
|
||||
private String postionKey;
|
||||
private int positionValue;
|
||||
|
||||
public Field include(String key) {
|
||||
criteria.put(key, Integer.valueOf(1));
|
||||
@@ -58,50 +48,14 @@ public class Field {
|
||||
return this;
|
||||
}
|
||||
|
||||
public Field elemMatch(String key, Criteria elemMatchCriteria) {
|
||||
elemMatchs.put(key, elemMatchCriteria);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* The array field must appear in the query. Only one positional {@code $} operator can appear in the projection and
|
||||
* only one array field can appear in the query.
|
||||
*
|
||||
* @param field query array field, must not be {@literal null} or empty.
|
||||
* @param value
|
||||
* @return
|
||||
*/
|
||||
public Field position(String field, int value) {
|
||||
|
||||
Assert.hasText(field, "DocumentField must not be null or empty!");
|
||||
|
||||
postionKey = field;
|
||||
positionValue = value;
|
||||
|
||||
return this;
|
||||
}
|
||||
|
||||
public DBObject getFieldsObject() {
|
||||
|
||||
DBObject dbo = new BasicDBObject();
|
||||
|
||||
for (String k : criteria.keySet()) {
|
||||
dbo.put(k, criteria.get(k));
|
||||
}
|
||||
|
||||
for (String k : slices.keySet()) {
|
||||
dbo.put(k, new BasicDBObject("$slice", slices.get(k)));
|
||||
}
|
||||
|
||||
for (Entry<String, Criteria> entry : elemMatchs.entrySet()) {
|
||||
DBObject dbObject = new BasicDBObject("$elemMatch", entry.getValue().getCriteriaObject());
|
||||
dbo.put(entry.getKey(), dbObject);
|
||||
}
|
||||
|
||||
if (postionKey != null) {
|
||||
dbo.put(postionKey + ".$", positionValue);
|
||||
}
|
||||
|
||||
return dbo;
|
||||
}
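
getFieldsObject() merges all projection parts into a single document. A minimal sketch of how the individual calls contribute to the result; the field names are made up:

```java
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Field;

import com.mongodb.DBObject;

class FieldProjectionSample {

    DBObject projection() {

        Field fields = new Field();
        fields.include("name");                                      // { "name" : 1 }
        fields.slice("comments", 5);                                 // { "comments" : { "$slice" : 5 } }
        fields.elemMatch("ratings", Criteria.where("stars").gte(4)); // { "ratings" : { "$elemMatch" : { "stars" : { "$gte" : 4 } } } }
        fields.position("scores", 1);                                // { "scores.$" : 1 }

        return fields.getFieldsObject(); // all of the above merged into one DBObject
    }
}
```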
|
||||
|
||||
@@ -130,15 +84,7 @@ public class Field {
|
||||
return false;
|
||||
}
|
||||
|
||||
if (!this.elemMatchs.equals(that.elemMatchs)) {
|
||||
return false;
|
||||
}
|
||||
|
||||
boolean samePositionKey = this.postionKey == null ? that.postionKey == null : this.postionKey
|
||||
.equals(that.postionKey);
|
||||
boolean samePositionValue = this.positionValue == that.positionValue;
|
||||
|
||||
return samePositionKey && samePositionValue;
|
||||
return true;
|
||||
}
|
||||
|
||||
/*
|
||||
@@ -151,10 +97,7 @@ public class Field {
|
||||
int result = 17;
|
||||
|
||||
result += 31 * ObjectUtils.nullSafeHashCode(this.criteria);
|
||||
result += 31 * ObjectUtils.nullSafeHashCode(this.elemMatchs);
|
||||
result += 31 * ObjectUtils.nullSafeHashCode(this.slices);
|
||||
result += 31 * ObjectUtils.nullSafeHashCode(this.postionKey);
|
||||
result += 31 * ObjectUtils.nullSafeHashCode(this.positionValue);
|
||||
|
||||
return result;
|
||||
}
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
/*
|
||||
* Copyright 2010-2013 the original author or authors.
|
||||
* Copyright 2010-2011 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
@@ -15,31 +15,11 @@
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.query;
|
||||
|
||||
import org.springframework.data.domain.Sort.Direction;
|
||||
|
||||
/**
|
||||
* An enum that specifies the ordering for sort or index specifications
|
||||
*
|
||||
* @deprecated prefer {@link Direction}
|
||||
* @author Thomas Risberg
|
||||
* @author Oliver Gierke
|
||||
* @author trisberg
|
||||
*/
|
||||
@Deprecated
|
||||
public enum Order {
|
||||
|
||||
ASCENDING {
|
||||
@Override
|
||||
public Direction toDirection() {
|
||||
return Direction.ASC;
|
||||
}
|
||||
},
|
||||
|
||||
DESCENDING {
|
||||
@Override
|
||||
public Direction toDirection() {
|
||||
return Direction.DESC;
|
||||
}
|
||||
};
|
||||
|
||||
public abstract Direction toDirection();
|
||||
ASCENDING, DESCENDING
|
||||
}
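
Since Order is deprecated in favour of org.springframework.data.domain.Sort.Direction, existing sort definitions are expected to migrate to Query.with(Sort). A minimal before/after sketch; the property name is made up:

```java
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.query.Query;

class SortMigrationSample {

    @SuppressWarnings("deprecation")
    Query oldStyle() {
        Query query = new Query();
        query.sort().on("lastname", org.springframework.data.mongodb.core.query.Order.ASCENDING);
        return query;
    }

    Query newStyle() {
        return new Query().with(new Sort(Direction.ASC, "lastname"));
    }
}
```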
|
||||
|
||||
@@ -19,15 +19,11 @@ import static org.springframework.data.mongodb.core.query.SerializationUtils.*;
|
||||
import static org.springframework.util.ObjectUtils.*;
|
||||
|
||||
import java.util.ArrayList;
|
||||
import java.util.Collections;
|
||||
import java.util.HashSet;
|
||||
import java.util.LinkedHashMap;
|
||||
import java.util.List;
|
||||
import java.util.Set;
|
||||
|
||||
import org.springframework.data.domain.Pageable;
|
||||
import org.springframework.data.domain.Sort;
|
||||
import org.springframework.data.domain.Sort.Order;
|
||||
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
@@ -37,16 +33,14 @@ import com.mongodb.DBObject;
|
||||
/**
|
||||
* @author Thomas Risberg
|
||||
* @author Oliver Gierke
|
||||
* @author Thomas Darimont
|
||||
*/
|
||||
public class Query {
|
||||
|
||||
private final static String RESTRICTED_TYPES_KEY = "_$RESTRICTED_TYPES";
|
||||
|
||||
private final Set<Class<?>> restrictedTypes = new HashSet<Class<?>>();
|
||||
private LinkedHashMap<String, Criteria> criteria = new LinkedHashMap<String, Criteria>();
|
||||
private Field fieldSpec;
|
||||
private Sort sort;
|
||||
private Sort coreSort;
|
||||
@SuppressWarnings("deprecation")
|
||||
private org.springframework.data.mongodb.core.query.Sort sort;
|
||||
private int skip;
|
||||
private int limit;
|
||||
private String hint;
|
||||
@@ -61,7 +55,8 @@ public class Query {
|
||||
return new Query(criteria);
|
||||
}
|
||||
|
||||
public Query() {}
|
||||
public Query() {
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link Query} using the given {@link Criteria}.
|
||||
@@ -120,6 +115,21 @@ public class Query {
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns a {@link org.springframework.data.mongodb.core.query.Sort} instance to define ordering properties.
|
||||
*
|
||||
* @deprecated use {@link #with(Sort)} instead
|
||||
* @return
|
||||
*/
|
||||
@Deprecated
|
||||
public org.springframework.data.mongodb.core.query.Sort sort() {
|
||||
if (this.sort == null) {
|
||||
this.sort = new org.springframework.data.mongodb.core.query.Sort();
|
||||
}
|
||||
|
||||
return this.sort;
|
||||
}
|
||||
|
||||
/**
|
||||
* Sets the given pagination information on the {@link Query} instance. Will transparently set {@code skip} and
|
||||
* {@code limit} as well as applying the {@link Sort} instance defined with the {@link Pageable}.
|
||||
@@ -151,62 +161,22 @@ public class Query {
|
||||
return this;
|
||||
}
|
||||
|
||||
for (Order order : sort) {
|
||||
if (order.isIgnoreCase()) {
|
||||
throw new IllegalArgumentException(String.format("Given sort contained an Order for %s with ignore case! "
|
||||
+ "MongoDB does not support sorting ignoring case currently!", order.getProperty()));
|
||||
}
|
||||
}
|
||||
|
||||
if (this.sort == null) {
|
||||
this.sort = sort;
|
||||
if (this.coreSort == null) {
|
||||
this.coreSort = sort;
|
||||
} else {
|
||||
this.sort = this.sort.and(sort);
|
||||
}
|
||||
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* @return the restrictedTypes
|
||||
*/
|
||||
public Set<Class<?>> getRestrictedTypes() {
|
||||
return restrictedTypes == null ? Collections.<Class<?>> emptySet() : restrictedTypes;
|
||||
}
|
||||
|
||||
/**
|
||||
* Restricts the query to only return document instances that are exactly of the given types.
|
||||
*
|
||||
* @param type may not be {@literal null}
|
||||
* @param additionalTypes may not be {@literal null}
|
||||
* @return
|
||||
*/
|
||||
public Query restrict(Class<?> type, Class<?>... additionalTypes) {
|
||||
|
||||
Assert.notNull(type, "Type must not be null!");
|
||||
Assert.notNull(additionalTypes, "AdditionalTypes must not be null");
|
||||
|
||||
restrictedTypes.add(type);
|
||||
for (Class<?> additionalType : additionalTypes) {
|
||||
restrictedTypes.add(additionalType);
|
||||
this.coreSort = this.coreSort.and(sort);
|
||||
}
|
||||
|
||||
return this;
|
||||
}
|
||||
|
||||
public DBObject getQueryObject() {
|
||||
|
||||
DBObject dbo = new BasicDBObject();
|
||||
for (String k : criteria.keySet()) {
|
||||
CriteriaDefinition c = criteria.get(k);
|
||||
DBObject cl = c.getCriteriaObject();
|
||||
dbo.putAll(cl);
|
||||
}
|
||||
|
||||
if (!restrictedTypes.isEmpty()) {
|
||||
dbo.put(RESTRICTED_TYPES_KEY, getRestrictedTypes());
|
||||
}
|
||||
|
||||
return dbo;
|
||||
}
|
||||
|
||||
@@ -217,20 +187,25 @@ public class Query {
|
||||
return fieldSpec.getFieldsObject();
|
||||
}
|
||||
|
||||
@SuppressWarnings("deprecation")
|
||||
public DBObject getSortObject() {
|
||||
|
||||
if (this.sort == null && this.sort == null) {
|
||||
if (this.coreSort == null && this.sort == null) {
|
||||
return null;
|
||||
}
|
||||
|
||||
DBObject dbo = new BasicDBObject();
|
||||
|
||||
if (this.sort != null) {
|
||||
for (org.springframework.data.domain.Sort.Order order : this.sort) {
|
||||
if (this.coreSort != null) {
|
||||
for (org.springframework.data.domain.Sort.Order order : this.coreSort) {
|
||||
dbo.put(order.getProperty(), order.isAscending() ? 1 : -1);
|
||||
}
|
||||
}
|
||||
|
||||
if (this.sort != null) {
|
||||
dbo.putAll(this.sort.getSortObject());
|
||||
}
|
||||
|
||||
return dbo;
|
||||
}
|
||||
|
||||
@@ -305,17 +280,4 @@ public class Query {
|
||||
|
||||
return result;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns whether the given key is the one used to hold the type restriction information.
|
||||
*
|
||||
* @deprecated don't call this method as the restricted type handling will undergo some significant changes going
|
||||
* forward.
|
||||
* @param key
|
||||
* @return
|
||||
*/
|
||||
@Deprecated
|
||||
public static boolean isRestrictedTypeKey(String key) {
|
||||
return RESTRICTED_TYPES_KEY.equals(key);
|
||||
}
|
||||
}
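
The new restrict(..) call limits results to documents that were persisted as exactly the given types; the restriction travels inside the query object under the internal key shown above and is evaluated during mapping. A minimal sketch, assuming a made-up Contact/Person hierarchy stored in one collection:

```java
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

class Contact {}                // stand-in base type
class Person extends Contact {} // stand-in subtype

class RestrictSample {

    Query personsNamedMatthews() {
        // Matches lastname = "Matthews", but only documents persisted as Person.
        return Query.query(Criteria.where("lastname").is("Matthews")).restrict(Person.class);
    }
}
```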
|
||||
|
||||
@@ -0,0 +1,56 @@
|
||||
/*
|
||||
* Copyright 2010-2012 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.query;
|
||||
|
||||
import java.util.LinkedHashMap;
|
||||
import java.util.Map;
|
||||
|
||||
import com.mongodb.BasicDBObject;
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* Helper class to define sorting criteria for a Query instance.
|
||||
*
|
||||
* @author Thomas Risberg
|
||||
* @author Oliver Gierke
|
||||
* @deprecated use {@link org.springframework.data.domain.Sort} instead. See
|
||||
* {@link Query#with(org.springframework.data.domain.Sort)}.
|
||||
*/
|
||||
@Deprecated
|
||||
public class Sort {
|
||||
|
||||
private Map<String, Order> fieldSpec = new LinkedHashMap<String, Order>();
|
||||
|
||||
public Sort() {
|
||||
}
|
||||
|
||||
public Sort(String key, Order order) {
|
||||
fieldSpec.put(key, order);
|
||||
}
|
||||
|
||||
public Sort on(String key, Order order) {
|
||||
fieldSpec.put(key, order);
|
||||
return this;
|
||||
}
|
||||
|
||||
public DBObject getSortObject() {
|
||||
DBObject dbo = new BasicDBObject();
|
||||
for (String k : fieldSpec.keySet()) {
|
||||
dbo.put(k, fieldSpec.get(k).equals(Order.ASCENDING) ? 1 : -1);
|
||||
}
|
||||
return dbo;
|
||||
}
|
||||
}
|
||||
@@ -31,7 +31,6 @@ import com.mongodb.DBObject;
|
||||
* @author Thomas Risberg
|
||||
* @author Mark Pollack
|
||||
* @author Oliver Gierke
|
||||
* @author Becca Gaspard
|
||||
*/
|
||||
public class Update {
|
||||
|
||||
@@ -91,18 +90,6 @@ public class Update {
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Update using the $setOnInsert update modifier
|
||||
*
|
||||
* @param key
|
||||
* @param value
|
||||
* @return
|
||||
*/
|
||||
public Update setOnInsert(String key, Object value) {
|
||||
addMultiFieldOperation("$setOnInsert", key, value);
|
||||
return this;
|
||||
}
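
setOnInsert(..) maps onto MongoDB's $setOnInsert modifier, which only takes effect when an upsert actually inserts a new document. A minimal sketch of combining it with a regular modifier; the field names are made up, and the resulting Update would typically be handed to an upsert on MongoTemplate:

```java
import java.util.Date;

import org.springframework.data.mongodb.core.query.Update;

class SetOnInsertSample {

    Update visitCounter() {
        // { "$inc" : { "visits" : 1 }, "$setOnInsert" : { "createdAt" : <now> } }
        return new Update()
                .inc("visits", 1)
                .setOnInsert("createdAt", new Date());
    }
}
```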
|
||||
|
||||
/**
|
||||
* Update using the $unset update modifier
|
||||
*
|
||||
|
||||
@@ -1,217 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.spel;
|
||||
|
||||
import java.util.Collections;
|
||||
import java.util.Iterator;
|
||||
|
||||
import org.springframework.expression.spel.ExpressionState;
|
||||
import org.springframework.expression.spel.SpelNode;
|
||||
import org.springframework.expression.spel.ast.Literal;
|
||||
import org.springframework.expression.spel.ast.MethodReference;
|
||||
import org.springframework.expression.spel.ast.Operator;
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
/**
|
||||
* A value object for nodes in an expression. Allows iterating over potentially available child {@link ExpressionNode}s.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
public class ExpressionNode implements Iterable<ExpressionNode> {
|
||||
|
||||
private static final Iterator<ExpressionNode> EMPTY_ITERATOR = Collections.<ExpressionNode> emptySet().iterator();
|
||||
|
||||
private final SpelNode node;
|
||||
private final ExpressionState state;
|
||||
|
||||
/**
|
||||
* Creates a new {@link ExpressionNode} from the given {@link SpelNode} and {@link ExpressionState}.
|
||||
*
|
||||
* @param node must not be {@literal null}.
|
||||
* @param state must not be {@literal null}.
|
||||
*/
|
||||
protected ExpressionNode(SpelNode node, ExpressionState state) {
|
||||
|
||||
Assert.notNull(node, "SpelNode must not be null!");
|
||||
Assert.notNull(state, "ExpressionState must not be null!");
|
||||
|
||||
this.node = node;
|
||||
this.state = state;
|
||||
}
|
||||
|
||||
/**
|
||||
* Factory method to create {@link ExpressionNode}'s according to the given {@link SpelNode} and
|
||||
* {@link ExpressionState}.
|
||||
*
|
||||
* @param node
|
||||
* @param state must not be {@literal null}.
|
||||
* @return an {@link ExpressionNode} for the given {@link SpelNode} or {@literal null} if {@literal null} was given
|
||||
* for the {@link SpelNode}.
|
||||
*/
|
||||
public static ExpressionNode from(SpelNode node, ExpressionState state) {
|
||||
|
||||
if (node == null) {
|
||||
return null;
|
||||
}
|
||||
|
||||
if (node instanceof Operator) {
|
||||
return new OperatorNode((Operator) node, state);
|
||||
}
|
||||
|
||||
if (node instanceof MethodReference) {
|
||||
return new MethodReferenceNode((MethodReference) node, state);
|
||||
}
|
||||
|
||||
if (node instanceof Literal) {
|
||||
return new LiteralNode((Literal) node, state);
|
||||
}
|
||||
|
||||
return new ExpressionNode(node, state);
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the name of the {@link ExpressionNode}.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public String getName() {
|
||||
return node.toStringAST();
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns whether the current {@link ExpressionNode} is backed by the given type.
|
||||
*
|
||||
* @param type must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public boolean isOfType(Class<?> type) {
|
||||
|
||||
Assert.notNull(type, "Type must not be null!");
|
||||
return type.isAssignableFrom(node.getClass());
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns whether the given {@link ExpressionNode} is representing the same backing node type as the current one.
|
||||
*
|
||||
* @param node
|
||||
* @return
|
||||
*/
|
||||
boolean isOfSameTypeAs(ExpressionNode node) {
|
||||
return node == null ? false : this.node.getClass().equals(node.node.getClass());
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns whether the {@link ExpressionNode} is a mathematical operation.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public boolean isMathematicalOperation() {
|
||||
return false;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns whether the {@link ExpressionNode} is a literal.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public boolean isLiteral() {
|
||||
return false;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the value of the current node.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public Object getValue() {
|
||||
return node.getValue(state);
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns whether the current node has child nodes.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public boolean hasChildren() {
|
||||
return node.getChildCount() != 0;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the child {@link ExpressionNode} with the given index.
|
||||
*
|
||||
* @param index must not be negative.
|
||||
* @return
|
||||
*/
|
||||
public ExpressionNode getChild(int index) {
|
||||
|
||||
Assert.isTrue(index >= 0);
|
||||
return from(node.getChild(index), state);
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns whether the {@link ExpressionNode} has a first child node that is not of the given type.
|
||||
*
|
||||
* @param type must not be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
public boolean hasfirstChildNotOfType(Class<?> type) {
|
||||
|
||||
Assert.notNull(type, "Type must not be null!");
|
||||
return hasChildren() && !node.getChild(0).getClass().equals(type);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a new {@link ExpressionNode} from the given {@link SpelNode}.
|
||||
*
|
||||
* @param node
|
||||
* @return
|
||||
*/
|
||||
protected ExpressionNode from(SpelNode node) {
|
||||
return from(node, state);
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see java.lang.Iterable#iterator()
|
||||
*/
|
||||
@Override
|
||||
public Iterator<ExpressionNode> iterator() {
|
||||
|
||||
if (!hasChildren()) {
|
||||
return EMPTY_ITERATOR;
|
||||
}
|
||||
|
||||
return new Iterator<ExpressionNode>() {
|
||||
|
||||
int index = 0;
|
||||
|
||||
@Override
|
||||
public boolean hasNext() {
|
||||
return index < node.getChildCount();
|
||||
}
|
||||
|
||||
@Override
|
||||
public ExpressionNode next() {
|
||||
return from(node.getChild(index++));
|
||||
}
|
||||
|
||||
@Override
|
||||
public void remove() {
|
||||
throw new UnsupportedOperationException();
|
||||
}
|
||||
};
|
||||
}
|
||||
}
|
||||
@@ -1,126 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.spel;
|
||||
|
||||
import org.springframework.util.Assert;
|
||||
|
||||
import com.mongodb.BasicDBList;
|
||||
import com.mongodb.DBObject;
|
||||
|
||||
/**
|
||||
* The context for an {@link ExpressionNode} transformation.
|
||||
*
|
||||
* @author Thomas Darimont
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
public class ExpressionTransformationContextSupport<T extends ExpressionNode> {
|
||||
|
||||
private final T currentNode;
|
||||
private final ExpressionNode parentNode;
|
||||
private final DBObject previousOperationObject;
|
||||
|
||||
/**
|
||||
* Creates a new {@link ExpressionTransformationContextSupport} for the given {@link ExpressionNode}s and an optional
|
||||
* previous operation.
|
||||
*
|
||||
* @param currentNode must not be {@literal null}.
|
||||
* @param parentNode
|
||||
* @param previousOperationObject
|
||||
*/
|
||||
public ExpressionTransformationContextSupport(T currentNode, ExpressionNode parentNode,
|
||||
DBObject previousOperationObject) {
|
||||
|
||||
Assert.notNull(currentNode, "currentNode must not be null!");
|
||||
|
||||
this.currentNode = currentNode;
|
||||
this.parentNode = parentNode;
|
||||
this.previousOperationObject = previousOperationObject;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the current {@link ExpressionNode}.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public T getCurrentNode() {
|
||||
return currentNode;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the parent {@link ExpressionNode} or {@literal null} if none available.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public ExpressionNode getParentNode() {
|
||||
return parentNode;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns the previously accumulated operation object or {@literal null} if none available. Rather than manually
|
||||
* adding values to the object yourself, prefer {@link #addToPreviousOrReturn(Object)}, which does so transparently if one is
|
||||
* present.
|
||||
*
|
||||
* @see #hasPreviousOperation()
|
||||
* @see #addToPreviousOrReturn(Object)
|
||||
* @return
|
||||
*/
|
||||
public DBObject getPreviousOperationObject() {
|
||||
return previousOperationObject;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns whether a previous operation is present.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public boolean hasPreviousOperation() {
|
||||
return getPreviousOperationObject() != null;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns whether the parent node is of the same operation as the current node.
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public boolean parentIsSameOperation() {
|
||||
return parentNode == null ? false : currentNode.isOfSameTypeAs(parentNode);
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds the given value to the previous operation and returns it.
|
||||
*
|
||||
* @param value
|
||||
* @return
|
||||
*/
|
||||
public DBObject addToPreviousOperation(Object value) {
|
||||
extractArgumentListFrom(previousOperationObject).add(value);
|
||||
return previousOperationObject;
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds the given value to the previous operation if one is present or returns the value to add as is.
|
||||
*
|
||||
* @param value
|
||||
* @return
|
||||
*/
|
||||
public Object addToPreviousOrReturn(Object value) {
|
||||
return hasPreviousOperation() ? addToPreviousOperation(value) : value;
|
||||
}
|
||||
|
||||
private BasicDBList extractArgumentListFrom(DBObject context) {
|
||||
return (BasicDBList) context.get(context.keySet().iterator().next());
|
||||
}
|
||||
}
|
||||
@@ -1,33 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.spel;
|
||||
|
||||
/**
|
||||
* SPI interface to implement components that can transform an {@link ExpressionTransformationContextSupport} into an
|
||||
* object.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
public interface ExpressionTransformer<T extends ExpressionTransformationContextSupport<?>> {
|
||||
|
||||
/**
|
||||
* Transforms the given {@link ExpressionTransformationContextSupport} into an Object.
|
||||
*
|
||||
* @param context will never be {@literal null}.
|
||||
* @return
|
||||
*/
|
||||
Object transform(T context);
|
||||
}
|
||||
@@ -1,72 +0,0 @@
|
||||
/*
|
||||
* Copyright 2013 the original author or authors.
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
package org.springframework.data.mongodb.core.spel;
|
||||
|
||||
import org.springframework.expression.spel.ExpressionState;
|
||||
import org.springframework.expression.spel.ast.FloatLiteral;
|
||||
import org.springframework.expression.spel.ast.IntLiteral;
|
||||
import org.springframework.expression.spel.ast.Literal;
|
||||
import org.springframework.expression.spel.ast.LongLiteral;
|
||||
import org.springframework.expression.spel.ast.NullLiteral;
|
||||
import org.springframework.expression.spel.ast.RealLiteral;
|
||||
import org.springframework.expression.spel.ast.StringLiteral;
|
||||
|
||||
/**
|
||||
* A node representing a literal in an expression.
|
||||
*
|
||||
* @author Oliver Gierke
|
||||
*/
|
||||
public class LiteralNode extends ExpressionNode {
|
||||
|
||||
private final Literal literal;
|
||||
|
||||
/**
|
||||
* Creates a new {@link LiteralNode} from the given {@link Literal} and {@link ExpressionState}.
|
||||
*
|
||||
* @param node must not be {@literal null}.
|
||||
* @param state must not be {@literal null}.
|
||||
*/
|
||||
LiteralNode(Literal node, ExpressionState state) {
|
||||
super(node, state);
|
||||
this.literal = node;
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns whether the given {@link ExpressionNode} is a unary minus.
|
||||
*
|
||||
* @param parent
|
||||
* @return
|
||||
*/
|
||||
public boolean isUnaryMinus(ExpressionNode parent) {
|
||||
|
||||
if (!(parent instanceof OperatorNode)) {
|
||||
return false;
|
||||
}
|
||||
|
||||
OperatorNode operator = (OperatorNode) parent;
|
||||
return operator.isUnaryMinus() && operator.getRight() == null;
|
||||
}
|
||||
|
||||
/*
|
||||
* (non-Javadoc)
|
||||
* @see org.springframework.data.mongodb.core.spel.ExpressionNode#isLiteral()
|
||||
*/
|
||||
@Override
|
||||
public boolean isLiteral() {
|
||||
return literal instanceof FloatLiteral || literal instanceof RealLiteral || literal instanceof IntLiteral
|
||||
|| literal instanceof LongLiteral || literal instanceof StringLiteral || literal instanceof NullLiteral;
|
||||
}
|
||||
}
|
||||
@@ -1,75 +0,0 @@
/*
 * Copyright 2013 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core.spel;

import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

import org.springframework.expression.spel.ExpressionState;
import org.springframework.expression.spel.ast.MethodReference;

/**
 * An {@link ExpressionNode} representing a method reference.
 * 
 * @author Oliver Gierke
 * @author Thomas Darimont
 */
public class MethodReferenceNode extends ExpressionNode {

	private static final Map<String, String> FUNCTIONS;

	static {

		Map<String, String> map = new HashMap<String, String>();

		map.put("concat", "$concat"); // Concatenates two strings.
		map.put("strcasecmp", "$strcasecmp"); // Compares two strings and returns an integer that reflects the comparison.
		map.put("substr", "$substr"); // Takes a string and returns portion of that string.
		map.put("toLower", "$toLower"); // Converts a string to lowercase.
		map.put("toUpper", "$toUpper"); // Converts a string to uppercase.

		map.put("dayOfYear", "$dayOfYear"); // Converts a date to a number between 1 and 366.
		map.put("dayOfMonth", "$dayOfMonth"); // Converts a date to a number between 1 and 31.
		map.put("dayOfWeek", "$dayOfWeek"); // Converts a date to a number between 1 and 7.
		map.put("year", "$year"); // Converts a date to the full year.
		map.put("month", "$month"); // Converts a date into a number between 1 and 12.
		map.put("week", "$week"); // Converts a date into a number between 0 and 53.
		map.put("hour", "$hour"); // Converts a date into a number between 0 and 23.
		map.put("minute", "$minute"); // Converts a date into a number between 0 and 59.
		map.put("second", "$second"); // Converts a date into a number between 0 and 59. May be 60 to account for leap
																	// seconds.
		map.put("millisecond", "$millisecond"); // Returns the millisecond portion of a date as an integer between 0 and
																						// 999.

		FUNCTIONS = Collections.unmodifiableMap(map);
	}

	MethodReferenceNode(MethodReference reference, ExpressionState state) {
		super(reference, state);
	}

	/**
	 * Returns the name of the method.
	 * 
	 * @return
	 */
	public String getMethodName() {

		String name = getName();
		String methodName = name.substring(0, name.indexOf('('));
		return FUNCTIONS.get(methodName);
	}
}
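For context on where the FUNCTIONS map is used: the keys are the method names that can appear inside an aggregation-framework SpEL expression, and the values are the MongoDB operators they are rendered to. A hedged usage sketch follows, with an invented `person` collection and field names, going through `project().andExpression(…)`.

```java
import static org.springframework.data.mongodb.core.aggregation.Aggregation.newAggregation;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.project;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;

import com.mongodb.DBObject;

class MethodReferenceSample {

	AggregationResults<DBObject> fullNames(MongoOperations operations) {

		// concat(lastname, firstname) is rendered to { $concat : [ "$lastname", "$firstname" ] }
		// through the FUNCTIONS lookup shown above.
		Aggregation aggregation = newAggregation(
				project().andExpression("concat(lastname, firstname)").as("fullName"));

		return operations.aggregate(aggregation, "person", DBObject.class);
	}
}
```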
@@ -1,120 +0,0 @@
/*
 * Copyright 2013 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core.spel;

import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

import org.springframework.expression.spel.ExpressionState;
import org.springframework.expression.spel.ast.OpDivide;
import org.springframework.expression.spel.ast.OpMinus;
import org.springframework.expression.spel.ast.OpModulus;
import org.springframework.expression.spel.ast.OpMultiply;
import org.springframework.expression.spel.ast.OpPlus;
import org.springframework.expression.spel.ast.Operator;

/**
 * An {@link ExpressionNode} representing an operator.
 * 
 * @author Oliver Gierke
 * @author Thomas Darimont
 */
public class OperatorNode extends ExpressionNode {

	private static final Map<String, String> OPERATORS;

	static {

		Map<String, String> map = new HashMap<String, String>(6);

		map.put("+", "$add");
		map.put("-", "$subtract");
		map.put("*", "$multiply");
		map.put("/", "$divide");
		map.put("%", "$mod");

		OPERATORS = Collections.unmodifiableMap(map);
	}

	private final Operator operator;

	/**
	 * Creates a new {@link OperatorNode} from the given {@link Operator} and {@link ExpressionState}.
	 * 
	 * @param node must not be {@literal null}.
	 * @param state must not be {@literal null}.
	 */
	OperatorNode(Operator node, ExpressionState state) {
		super(node, state);
		this.operator = node;
	}

	/* 
	 * (non-Javadoc)
	 * @see org.springframework.data.mongodb.core.spel.ExpressionNode#isMathematicalOperation()
	 */
	@Override
	public boolean isMathematicalOperation() {
		return operator instanceof OpMinus || operator instanceof OpPlus || operator instanceof OpMultiply
				|| operator instanceof OpDivide || operator instanceof OpModulus;
	}

	/**
	 * Returns whether the operator is unary.
	 * 
	 * @return
	 */
	public boolean isUnaryOperator() {
		return operator.getRightOperand() == null;
	}

	/**
	 * Returns the Mongo expression of the operator.
	 * 
	 * @return
	 */
	public String getMongoOperator() {
		return OPERATORS.get(operator.getOperatorName());
	}

	/**
	 * Returns whether the operator is a unary minus, e.g. -1.
	 * 
	 * @return
	 */
	public boolean isUnaryMinus() {
		return isUnaryOperator() && operator instanceof OpMinus;
	}

	/**
	 * Returns the left operand as {@link ExpressionNode}.
	 * 
	 * @return
	 */
	public ExpressionNode getLeft() {
		return from(operator.getLeftOperand());
	}

	/**
	 * Returns the right operand as {@link ExpressionNode}.
	 * 
	 * @return
	 */
	public ExpressionNode getRight() {
		return from(operator.getRightOperand());
	}
}
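Similarly, the OPERATORS map backs arithmetic SpEL expressions in aggregation projections. A minimal sketch, again with invented collection and field names, showing `netPrice * quantity` being rendered through `$multiply`:

```java
import static org.springframework.data.mongodb.core.aggregation.Aggregation.newAggregation;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.project;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;

import com.mongodb.DBObject;

class OperatorMappingSample {

	AggregationResults<DBObject> lineItemTotals(MongoOperations operations) {

		// "netPrice * quantity" is parsed to a SpEL OpMultiply, which the OPERATORS map above renders as
		// { $multiply : [ "$netPrice", "$quantity" ] } inside the resulting $project stage.
		Aggregation aggregation = newAggregation(
				project().andExpression("netPrice * quantity").as("total"));

		return operations.aggregate(aggregation, "lineItems", DBObject.class);
	}
}
```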
@@ -1,5 +0,0 @@
/**
 * Support classes to transform SpEL expressions into MongoDB expressions.
 * @since 1.4
 */
package org.springframework.data.mongodb.core.spel;
@@ -158,15 +158,7 @@ public class GridFsTemplate implements GridFsOperations, ResourcePatternResolver
	 * @see org.springframework.data.mongodb.gridfs.GridFsOperations#find(com.mongodb.DBObject)
	 */
	public List<GridFSDBFile> find(Query query) {

		if (query == null) {
			return getGridFs().find((DBObject) null);
		}

		DBObject queryObject = getMappedQuery(query.getQueryObject());
		DBObject sortObject = getMappedQuery(query.getSortObject());

		return getGridFs().find(queryObject, sortObject);
		return getGridFs().find(getMappedQuery(query));
	}

	/*
@@ -229,11 +221,7 @@ public class GridFsTemplate implements GridFsOperations, ResourcePatternResolver
	}

	private DBObject getMappedQuery(Query query) {
		return query == null ? null : getMappedQuery(query.getQueryObject());
	}

	private DBObject getMappedQuery(DBObject query) {
		return query == null ? null : queryMapper.getMappedObject(query, null);
		return query == null ? null : queryMapper.getMappedObject(query.getQueryObject(), null);
	}

	private GridFS getGridFs() {
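For reference, a small usage sketch of the `find(…)` method touched by this hunk; the query is built with the regular `Criteria`/`Query` API, and the filename value is a placeholder:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import java.util.List;

import org.springframework.data.mongodb.gridfs.GridFsTemplate;

import com.mongodb.gridfs.GridFSDBFile;

class GridFsFindSample {

	// Looks up all GridFS files whose filename matches the given value.
	List<GridFSDBFile> findByFilename(GridFsTemplate gridFsTemplate, String filename) {
		return gridFsTemplate.find(query(where("filename").is(filename)));
	}
}
```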
@@ -1,5 +1,5 @@
/*
 * Copyright 2012-2013 the original author or authors.
 * Copyright 2002-2013 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -1,5 +1,5 @@
/*
 * Copyright 2011-2013 the original author or authors.
 * Copyright 2011 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -15,25 +15,17 @@
 */
package org.springframework.data.mongodb.repository;

import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import org.springframework.data.annotation.QueryAnnotation;
import java.lang.annotation.*;

/**
 * Annotation to declare finder queries directly on repository methods. Both attributes allow using a placeholder
 * notation of {@code ?0}, {@code ?1} and so on.
 * 
 * @author Oliver Gierke
 * @author Thomas Darimont
 */
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@Documented
@QueryAnnotation
public @interface Query {

	/**
@@ -51,12 +43,4 @@ public @interface Query {
	 * @return
	 */
	String fields() default "";

	/**
	 * Returns whether the query defined should be executed as count projection.
	 * 
	 * @since 1.3
	 * @return
	 */
	boolean count() default false;
}
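A short, hypothetical repository illustrating the `count()` attribute shown above; the `Person` type and its `lastname` field are assumptions made for the example:

```java
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.Query;

class Person {

	@Id String id;
	String lastname;
}

interface PersonRepository extends MongoRepository<Person, String> {

	// Executed as a count projection rather than returning documents, thanks to count = true (since 1.3).
	@Query(value = "{ 'lastname' : ?0 }", count = true)
	long countByLastname(String lastname);
}
```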
@@ -53,6 +53,14 @@ public class MongoRepositoryBean<T> extends CdiRepositoryBean<T> {
		this.operations = operations;
	}

	/*
	 * (non-Javadoc)
	 * @see javax.enterprise.inject.spi.Bean#getScope()
	 */
	public Class<? extends Annotation> getScope() {
		return operations.getScope();
	}

	/*
	 * (non-Javadoc)
	 * @see org.springframework.data.repository.cdi.CdiRepositoryBean#create(javax.enterprise.context.spi.CreationalContext, java.lang.Class)
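Since `getScope()` above defers to the scope of the user-supplied `MongoOperations` CDI bean, here is a hedged sketch of what such a producer could look like; the database name, the chosen scope and the local `MongoClient` are assumptions:

```java
import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Produces;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;

import com.mongodb.MongoClient;

class MongoOperationsProducer {

	// The scope declared here is what MongoRepositoryBean#getScope() reports for the repository beans it creates.
	@Produces
	@ApplicationScoped
	MongoOperations mongoOperations() throws Exception {
		return new MongoTemplate(new MongoClient(), "database");
	}
}
```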
@@ -35,7 +35,6 @@ import org.springframework.data.repository.query.QueryLookupStrategy.Key;
 * {@link #basePackages()} or {@link #basePackageClasses()} it will trigger scanning of the package of annotated class.
 * 
 * @author Oliver Gierke
 * @author Thomas Darimont
 */
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@@ -120,10 +119,4 @@ public @interface EnableMongoRepositories {
	 * @return
	 */
	boolean createIndexesForQueryMethods() default false;

	/**
	 * Configures whether nested repository-interfaces (e.g. defined as inner classes) should be discovered by the
	 * repositories infrastructure.
	 */
	boolean considerNestedRepositories() default false;
}
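Finally, a minimal configuration sketch exercising the `considerNestedRepositories` flag from the hunk above; the base package is a placeholder:

```java
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.repository.config.EnableMongoRepositories;

@Configuration
@EnableMongoRepositories(basePackages = "com.example.repositories", considerNestedRepositories = true)
class ApplicationConfig {
	// Repository interfaces nested inside other classes in the configured package are picked up as well.
}
```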
Some files were not shown because too many files have changed in this diff.