Compare commits

...

50 Commits

Author SHA1 Message Date
Mark Paluch
479dc3a0d6 DATAMONGO-1755 - Release version 1.10.7 (Ingalls SR7). 2017-09-11 11:45:05 +02:00
Mark Paluch
59ebbd3d35 DATAMONGO-1755 - Prepare 1.10.7 (Ingalls SR7). 2017-09-11 11:44:20 +02:00
Mark Paluch
38556f522f DATAMONGO-1755 - Updated changelog. 2017-09-11 11:44:15 +02:00
Christoph Strobl
166304849a DATAMONGO-1772 - Fix UpdateMapper type key rendering for abstract list elements contained in concrete typed ones.
Original pull request: #497.
2017-09-05 10:58:55 +02:00
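For illustration, the fixed case arises when a concretely typed element holds a list whose declared element type is abstract; only then does each list element need its own type key. A minimal sketch of such a model (type names are assumptions, not taken from the fix):

```java
import java.util.List;

// The container is concretely typed, so it needs no type key itself...
class Order {
	List<LineItem> items; // ...but the declared element type is abstract,
}

// ...so the UpdateMapper must render a type key (_class) for every
// element written into "items", otherwise the documents cannot be
// mapped back to their concrete types later.
abstract class LineItem {}

class DiscountedLineItem extends LineItem {
	double discount;
}
```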
Mark Paluch
f71b38b731 DATAMONGO-1768 - Polishing.
Extend javadocs. Make methods static/reorder methods where possible. Formatting.

Original pull request: #496.
2017-08-25 10:48:42 +02:00
Christoph Strobl
c4af78d81d DATAMONGO-1768 - Allow ignoring type restriction when issuing QBE.
We now allow removing the type restriction inferred by the QBE mapping via an ignored path expression on the ExampleMatcher. This allows creating untyped QBE expressions that return all entities matching the query without limiting the result to types assignable to the probe itself.

Original pull request: #496.
2017-08-25 10:41:37 +02:00
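A minimal sketch of what this enables, assuming the inferred type restriction is keyed off the `_class` path (probe type and collection name are illustrative):

```java
import java.util.List;

import org.springframework.data.domain.Example;
import org.springframework.data.domain.ExampleMatcher;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

class UntypedExampleSketch {

	static class Person {
		String firstname;
	}

	List<Object> findAllLikeDave(MongoOperations operations) {

		Person probe = new Person();
		probe.firstname = "Dave";

		// Without the ignored path the mapped example carries a type
		// restriction and only returns entities assignable to Person.
		Example<Person> untyped = Example.of(probe,
				ExampleMatcher.matching().withIgnorePaths("_class"));

		return operations.find(new Query(Criteria.byExample(untyped)), Object.class, "people");
	}
}
```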
Oliver Gierke
a281ec83b5 DATAMONGO-1765 - Polishing.
Formatting.
2017-08-07 17:35:11 +02:00
Oliver Gierke
407087b3a7 DATAMONGO-1765 - DefaultDbRefResolver now maps duplicate references correctly.
On bulk resolution of a DBRef array we now map the resulting documents back to their ids to make sure that reoccurring identifiers are mapped to the corresponding documents.
2017-08-07 17:35:11 +02:00
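A sketch of the described strategy (an illustrative helper, not the actual DefaultDbRefResolver internals): fetch the referenced documents once, index them by `_id`, then materialize the original id list, duplicates included, from that index.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

import com.mongodb.BasicDBObject;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;

class BulkDbRefResolutionSketch {

	List<DBObject> bulkResolve(DBCollection collection, List<Object> ids) {

		// one round trip for all referenced documents
		List<DBObject> fetched = collection
				.find(new BasicDBObject("_id", new BasicDBObject("$in", ids))).toArray();

		// index the results by id...
		Map<Object, DBObject> byId = new LinkedHashMap<Object, DBObject>();
		for (DBObject document : fetched) {
			byId.put(document.get("_id"), document);
		}

		// ...so that a reoccurring id maps to the corresponding document
		List<DBObject> resolved = new ArrayList<DBObject>(ids.size());
		for (Object id : ids) {
			resolved.add(byId.get(id));
		}
		return resolved;
	}
}
```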
Mark Paluch
90411decce DATAMONGO-1756 - Polishing.
Add author tag.

Original pull request: #491.
2017-08-02 08:52:45 +02:00
Christoph Strobl
71135395c1 DATAMONGO-1756 - Fix nested field name resolution for arithmetic aggregation ops.
Original pull request: #491.
2017-08-02 08:52:45 +02:00
Oliver Gierke
9c43ece3a7 DATAMONGO-1750 - After release cleanups. 2017-07-27 00:21:39 +02:00
Oliver Gierke
283bfce2fe DATAMONGO-1750 - Prepare next development iteration. 2017-07-27 00:15:08 +02:00
Oliver Gierke
42cc6ff37f DATAMONGO-1750 - Release version 1.10.6 (Ingalls SR6). 2017-07-26 23:47:20 +02:00
Oliver Gierke
9ded78b13c DATAMONGO-1750 - Prepare 1.10.6 (Ingalls SR6). 2017-07-26 23:45:52 +02:00
Oliver Gierke
b0842a89fd DATAMONGO-1750 - Updated changelog. 2017-07-26 23:45:43 +02:00
Oliver Gierke
5a9eef7c96 DATAMONGO-1751 - Updated changelog. 2017-07-25 16:15:49 +02:00
Oliver Gierke
77425736e9 DATAMONGO-1717 - Updated changelog. 2017-07-25 10:04:03 +02:00
Oliver Gierke
6aa8f84428 DATAMONGO-1711 - After release cleanups. 2017-07-24 19:25:08 +02:00
Oliver Gierke
b83f2e9198 DATAMONGO-1711 - Prepare next development iteration. 2017-07-24 19:25:06 +02:00
Oliver Gierke
2171c814e8 DATAMONGO-1711 - Release version 1.10.5 (Ingalls SR5). 2017-07-24 18:44:18 +02:00
Oliver Gierke
d0e398a39c DATAMONGO-1711 - Prepare 1.10.5 (Ingalls SR5). 2017-07-24 18:43:23 +02:00
Oliver Gierke
428c60dee0 DATAMONGO-1711 - Updated changelog. 2017-07-24 18:43:16 +02:00
Oliver Gierke
80393b2dc2 DATAMONGO-1720 - Make sure benchmark module is not included by default.
The benchmarks module does not produce a JAR by default, which makes our Maven Central deployment fail, as every module has to produce one according to their rules. We now only include the benchmark module when the benchmarks profile is active.
2017-07-24 18:39:38 +02:00
Oliver Gierke
c15a542863 DATAMONGO-1744 - Improved setup of default MongoMappingContext instances created.
We now make sure that the SimpleTypeHolder produced by MongoCustomConversions is used to set up default MongoMappingContext instances in (Reactive)MongoTemplate and unit tests.
2017-07-19 15:21:15 +02:00
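A sketch of the described wiring, using the `CustomConversions` type present on this branch (the commit message's `MongoCustomConversions` is the 2.0 name):

```java
import java.util.Collections;

import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;

class MappingContextSetupSketch {

	MongoMappingContext newDefaultMappingContext() throws Exception {

		CustomConversions conversions = new CustomConversions(Collections.emptyList());

		MongoMappingContext context = new MongoMappingContext();
		// hand over the SimpleTypeHolder so types covered by custom
		// converters are treated as simple types and not introspected
		context.setSimpleTypeHolder(conversions.getSimpleTypeHolder());
		context.afterPropertiesSet();

		return context;
	}
}
```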
Mark Paluch
92c6db13dc DATAMONGO-1703 - Polishing.
Use lombok's Value for ObjectPathItem. Make methods accessible in DefaultDbRefResolver before calling. Use class.cast to avoid warnings. Update Javadoc.

Original pull request: #478.
2017-07-14 11:47:59 +02:00
Christoph Strobl
1681bcd15b DATAMONGO-1703 - Convert resolved DBRefs from source when they do not match the requested property type.
We now check whether already resolved DBRefs are assignable to the target property type. If not, we perform the conversion again to prevent a ClassCastException when assigning non-matching types.

Remove non applicable public modifiers in ObjectPath.

Original pull request: #478.
2017-07-14 11:47:59 +02:00
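The guard boils down to an assignability check before reuse; a hypothetical helper illustrating it (not the actual resolver code):

```java
import org.springframework.data.mongodb.core.convert.MongoConverter;

import com.mongodb.DBObject;

class DbRefReuseSketch {

	<T> T resolve(Object alreadyResolved, Class<T> propertyType, DBObject raw, MongoConverter converter) {

		// reuse the already resolved value only if it matches the
		// property type requested now
		if (propertyType.isInstance(alreadyResolved)) {
			return propertyType.cast(alreadyResolved);
		}
		// otherwise convert the raw document again instead of provoking
		// a ClassCastException on assignment
		return converter.read(propertyType, raw);
	}
}
```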
Mark Paluch
1f2d0da5ed DATAMONGO-1720 - Polishing.
Enhance benchmark statistics with Git/working tree details. Specify the byte encoding for the JSON-to-byte encoder.
Add a status code check to HttpResultsWriter to verify that the results were accepted. Convert spaces to tabs in pom.xml.

Original pull request: #483.
2017-07-13 15:17:13 +02:00
Christoph Strobl
8009bd2846 DATAMONGO-1720 - Add JMH based benchmarks for MappingMongoConverter.
Run the benchmark via the maven profile "benchmarks":

    mvn -P benchmarks clean test

Or run them customized:

    mvn -P benchmarks -DwarmupIterations=2 -DmeasurementIterations=5 -Dforks=1 clean test

Original pull request: #483.
2017-07-13 15:17:09 +02:00
Oliver Gierke
cbd9807f16 DATAMONGO-1725 - Prevent NullPointerException in CloseableIterableCursorAdapter.close(). 2017-07-05 13:15:45 +02:00
Oliver Gierke
f672b17dfc DATAMONGO-1729 - Open projections don't get field restrictions applied.
We now only apply a field restriction if the projection used for a query is closed.
2017-07-03 22:17:38 +02:00
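The distinction, roughly (interface names are illustrative): a closed projection exposes only methods that map to persistent properties, so the fields to fetch are known up front; an open projection computes values via SpEL and may touch any property, so restricting fields would break it.

```java
import org.springframework.beans.factory.annotation.Value;

// Closed projection: every method maps to a property, so the query can
// be limited to exactly these fields.
interface NamesOnly {

	String getFirstname();
	String getLastname();
}

// Open projection: the SpEL expression may access any property of the
// target, so no field restriction is applied.
interface FullNameOnly {

	@Value("#{target.firstname + ' ' + target.lastname}")
	String getFullName();
}
```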
Oliver Gierke
2a018b04ec DATAMONGO-1723 - ConfigurationExtensionUnitTests now need to provide a BeanDefinitionRegistry. 2017-06-26 16:53:39 +02:00
Mark Paluch
8e748ab1c2 DATAMONGO-1678 - Polishing.
Use Lombok's Value annotation for immutable value objects. Use IllegalArgumentException for NonNull validation exceptions. Trim whitespaces, formatting.

Original pull request: #472.
2017-06-26 13:28:28 +02:00
Christoph Strobl
49f9307884 DATAMONGO-1678 - Run bulk update / remove documents through type mappers.
We now make sure to run any query/update object through the Query-/UpdateMapper. This ensures that @Field annotations and potential custom conversions are processed correctly for update and remove operations.

Original pull request: #472.
2017-06-26 13:13:54 +02:00
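For illustration, with an entity mapping a property to a different field name (entity and values are assumptions), the bulk operation below now translates both query and update through the mappers:

```java
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.query.Update;

class BulkMappingSketch {

	static class Person {

		@Id String id;
		@Field("fn") String firstname; // stored as "fn" in MongoDB
	}

	void renameDave(MongoOperations operations) {

		// the query is mapped to { "fn" : "Dave" }, the update to
		// { "$set" : { "fn" : "Oliver" } }
		operations.bulkOps(BulkMode.UNORDERED, Person.class)
				.updateOne(query(where("firstname").is("Dave")), new Update().set("firstname", "Oliver"))
				.execute();
	}
}
```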
Christoph Strobl
ebd8491642 DATAMONGO-1697 - Update MongoOperations JavaDoc regarding mapping limitations.
We now explicitly mention mapping/support limitations for API variants such as count(Query, String) that lack the domain-type information required for field-specific mapping.
2017-06-19 10:31:20 +02:00
Christoph Strobl
f581677bf2 DATAMONGO-1718 - Polishing.
Add test and hand over Object.class as placeholder for required domain type.

Original Pull Request: #469
2017-06-16 13:31:37 +02:00
Borislav Rangelov
a2a172e559 DATAMONGO-1718 - Fix MongoTemplate::findAllAndRemove(Query,String) delegating to wrong overload.
Original Pull Request: #469 (by Borislav Rangelov).
2017-06-16 13:31:26 +02:00
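The String-based overload is now equivalent to handing `Object.class` over as a placeholder domain type, as described in the polishing commit above; a sketch of the behaviour (not the verbatim MongoTemplate code):

```java
import java.util.List;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Query;

class FindAllAndRemoveSketch {

	List<Object> removeAll(MongoOperations operations, Query query, String collectionName) {
		// Object.class acts as a placeholder for the required domain type
		return operations.findAllAndRemove(query, Object.class, collectionName);
	}
}
```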
Mark Paluch
61fd09bb43 DATAMONGO-1688 - Updated changelog. 2017-06-14 17:35:01 +02:00
Mark Paluch
4b11d415f1 DATAMONGO-1672 - After release cleanups. 2017-06-08 11:26:20 +02:00
Mark Paluch
beabbc0307 DATAMONGO-1672 - Prepare next development iteration. 2017-06-08 11:26:18 +02:00
Mark Paluch
ad55a8bab7 DATAMONGO-1672 - Release version 1.10.4 (Ingalls SR4). 2017-06-08 10:56:51 +02:00
Mark Paluch
79e0b44b5e DATAMONGO-1672 - Prepare 1.10.4 (Ingalls SR4). 2017-06-08 10:56:03 +02:00
Mark Paluch
b747e226d7 DATAMONGO-1672 - Updated changelog. 2017-06-08 10:55:58 +02:00
Mark Paluch
0bdd4a2eb4 DATAMONGO-1671 - Updated changelog. 2017-06-07 12:23:37 +02:00
Christoph Strobl
dda91f9d48 DATAMONGO-1699 - Upgrade travis-ci build to use MongoDB 3.4 server.
We now do it explicitly, as there seems to be almost no movement on getting the alias onto the whitelist.
2017-05-24 13:17:27 +02:00
Mark Paluch
d32afee0e0 DATAMONGO-1664 - Updated changelog. 2017-05-09 11:36:18 +02:00
Mark Paluch
5e4cced3e6 DATAMONGO-1205 - Polishing.
Add author tag. Extend year range in copyright header.

Original pull request: #397.
2017-04-20 08:36:04 +02:00
Martin Macko
375d59f7a2 DATAMONGO-1205 - Log only CyclicPropertyReferenceException message.
We now log CyclicPropertyReferenceException with its message only and no longer include the stack trace. The stack trace points to a location in the verifier and is not particularly useful for finding the offending code. This change makes the logging of CyclicPropertyReferenceException consistent.

Original pull request: #397.
2017-04-20 08:36:04 +02:00
Oliver Gierke
0fe197d608 DATAMONGO-1670 - Updated changelog. 2017-04-19 21:04:23 +02:00
Oliver Gierke
5325e46aaa DATAMONGO-1669 - After release cleanups. 2017-04-19 20:01:14 +02:00
Oliver Gierke
1fe5793bd7 DATAMONGO-1669 - Prepare next development iteration. 2017-04-19 20:01:12 +02:00
44 changed files with 2405 additions and 374 deletions

.travis.yml

@@ -23,7 +23,9 @@ env:
addons:
apt:
sources:
-  - mongodb-3.4-precise
+  - mongodb-upstart
+  - sourceline: 'deb [arch=amd64] http://repo.mongodb.org/apt/ubuntu precise/mongodb-org/3.4 multiverse'
+    key_url: 'https://www.mongodb.org/static/pgp/server-3.4.asc'
packages:
- mongodb-org-server
- mongodb-org-shell

lombok.config (new file)

@@ -0,0 +1,2 @@
lombok.nonNull.exceptionType = IllegalArgumentException
lombok.log.fieldName = LOG

pom.xml

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
-<version>1.10.3.RELEASE</version>
+<version>1.10.7.RELEASE</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
-<version>1.9.3.RELEASE</version>
+<version>1.9.7.RELEASE</version>
</parent>
<modules>
@@ -28,9 +28,10 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
-<springdata.commons>1.13.3.RELEASE</springdata.commons>
+<springdata.commons>1.13.7.RELEASE</springdata.commons>
<mongo>2.14.3</mongo>
<mongo.osgi>2.13.0</mongo.osgi>
<jmh.version>1.19</jmh.version>
</properties>
<developers>
@@ -216,6 +217,17 @@
</build>
</profile>
<profile>
<id>benchmarks</id>
<modules>
<module>spring-data-mongodb</module>
<module>spring-data-mongodb-cross-store</module>
<module>spring-data-mongodb-log4j</module>
<module>spring-data-mongodb-distribution</module>
<module>spring-data-mongodb-benchmarks</module>
</modules>
</profile>
</profiles>
<dependencies>

spring-data-mongodb-benchmarks/README.md (new file)

@@ -0,0 +1,76 @@
# Benchmarks
Benchmarks are based on [JMH](http://openjdk.java.net/projects/code-tools/jmh/).
# Running Benchmarks
Benchmark execution is disabled by default and can be activated via the `benchmarks` profile.
To run the benchmarks with default settings, use:
```bash
mvn -P benchmarks clean test
```
A basic report will be printed to the CLI.
```bash
# Run complete. Total time: 00:00:15
Benchmark Mode Cnt Score Error Units
MappingMongoConverterBenchmark.readObject thrpt 10 1920157,631 ± 64310,809 ops/s
MappingMongoConverterBenchmark.writeObject thrpt 10 782732,857 ± 53804,130 ops/s
```
## Running all Benchmarks of a specific class
To run all benchmarks of a specific class, provide its simple class name via the `benchmark` command line argument.
```bash
mvn -P benchmarks clean test -D benchmark=MappingMongoConverterBenchmark
```
## Running a single Benchmark
To run a single benchmark, provide the simple name of its containing class, followed by `#` and the method name, via the `benchmark` command line argument.
```bash
mvn -P benchmarks clean test -D benchmark=MappingMongoConverterBenchmark#readObjectWith2Properties
```
# Saving Benchmark Results
A detailed benchmark report is stored in JSON format in the `/target/reports/performance` directory.
To store the report in a different location use the `benchmarkReportDir` command line argument.
## MongoDB
Results can be directly piped to MongoDB by providing a valid [Connection String](https://docs.mongodb.com/manual/reference/connection-string/) via the `publishTo` command line argument.
```bash
mvn -P benchmarks clean test -D publishTo=mongodb://127.0.0.1:27017
```
NOTE: If the URI does not explicitly define a database, the default `spring-data-mongodb-benchmarks` is used.
## HTTP Endpoint
The benchmark report can also be posted as `application/json` to an HTTP endpoint by providing a valid URL via the `publishTo` command line argument.
```bash
mvn -P benchmarks clean test -D publishTo=http://127.0.0.1:8080/capture-benchmarks
```
# Customizing Benchmarks
The following options can be set via the command line.
Option | Default Value
--- | ---
warmupIterations | 10
warmupTime | 1 (seconds)
measurementIterations | 10
measurementTime | 1 (seconds)
forks | 1
benchmarkReportDir | /target/reports/performance (always relative to project root dir)
benchmark | .* (single benchmark via `classname#benchmark`)
publishTo | \[not set\] (mongodb-uri or http-endpoint)

spring-data-mongodb-benchmarks/pom.xml (new file)

@@ -0,0 +1,112 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.10.7.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-benchmarks</artifactId>
<packaging>jar</packaging>
<name>Spring Data MongoDB - Microbenchmarks</name>
<properties>
<!-- Skip tests by default; run only if -DskipTests=false is specified or benchmarks profile is activated -->
<skipTests>true</skipTests>
<bundlor.enabled>false</bundlor.enabled>
</properties>
<dependencies>
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>${junit}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.openjdk.jmh</groupId>
<artifactId>jmh-core</artifactId>
<version>${jmh.version}</version>
</dependency>
<dependency>
<groupId>org.openjdk.jmh</groupId>
<artifactId>jmh-generator-annprocess</artifactId>
<version>${jmh.version}</version>
<scope>provided</scope>
</dependency>
</dependencies>
<profiles>
<profile>
<id>benchmarks</id>
<properties>
<skipTests>false</skipTests>
</properties>
</profile>
</profiles>
<build>
<plugins>
<plugin>
<groupId>pl.project13.maven</groupId>
<artifactId>git-commit-id-plugin</artifactId>
<version>2.2.2</version>
<executions>
<execution>
<goals>
<goal>revision</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<artifactId>maven-jar-plugin</artifactId>
<executions>
<execution>
<id>default-jar</id>
<phase>never</phase>
</execution>
</executions>
</plugin>
<plugin>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<testSourceDirectory>${project.build.sourceDirectory}</testSourceDirectory>
<testClassesDirectory>${project.build.outputDirectory}</testClassesDirectory>
<excludes>
<exclude>**/AbstractMicrobenchmark.java</exclude>
<exclude>**/*$*.class</exclude>
<exclude>**/generated/*.class</exclude>
</excludes>
<includes>
<include>**/*Benchmark*</include>
</includes>
<systemPropertyVariables>
<benchmarkReportDir>${project.build.directory}/reports/performance</benchmarkReportDir>
<project.version>${project.version}</project.version>
<git.dirty>${git.dirty}</git.dirty>
<git.commit.id>${git.commit.id}</git.commit.id>
<git.branch>${git.branch}</git.branch>
</systemPropertyVariables>
</configuration>
</plugin>
</plugins>
</build>
</project>

DbRefMappingBenchmark.java (new file)

@@ -0,0 +1,111 @@
/*
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import lombok.Data;
import java.util.ArrayList;
import java.util.List;
import org.bson.types.ObjectId;
import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.Setup;
import org.openjdk.jmh.annotations.State;
import org.openjdk.jmh.annotations.TearDown;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.microbenchmark.AbstractMicrobenchmark;
import com.mongodb.MongoClient;
import com.mongodb.ServerAddress;
/**
* @author Christoph Strobl
*/
@State(Scope.Benchmark)
public class DbRefMappingBenchmark extends AbstractMicrobenchmark {
private static final String DB_NAME = "dbref-loading-benchmark";
private MongoClient client;
private MongoTemplate template;
private Query queryObjectWithDBRef;
private Query queryObjectWithDBRefList;
@Setup
public void setUp() throws Exception {
client = new MongoClient(new ServerAddress());
template = new MongoTemplate(client, DB_NAME);
List<RefObject> refObjects = new ArrayList<RefObject>();
for (int i = 0; i < 1; i++) {
RefObject o = new RefObject();
template.save(o);
refObjects.add(o);
}
ObjectWithDBRef singleDBRef = new ObjectWithDBRef();
singleDBRef.ref = refObjects.iterator().next();
template.save(singleDBRef);
ObjectWithDBRef multipleDBRefs = new ObjectWithDBRef();
multipleDBRefs.refList = refObjects;
template.save(multipleDBRefs);
queryObjectWithDBRef = query(where("id").is(singleDBRef.id));
queryObjectWithDBRefList = query(where("id").is(multipleDBRefs.id));
}
@TearDown
public void tearDown() {
client.dropDatabase(DB_NAME);
client.close();
}
@Benchmark // DATAMONGO-1720
public ObjectWithDBRef readSingleDbRef() {
return template.findOne(queryObjectWithDBRef, ObjectWithDBRef.class);
}
@Benchmark // DATAMONGO-1720
public ObjectWithDBRef readMultipleDbRefs() {
return template.findOne(queryObjectWithDBRefList, ObjectWithDBRef.class);
}
@Data
static class ObjectWithDBRef {
private @Id ObjectId id;
private @DBRef RefObject ref;
private @DBRef List<RefObject> refList;
}
@Data
static class RefObject {
private @Id String id;
private String someValue;
}
}

MappingMongoConverterBenchmark.java (new file)

@@ -0,0 +1,182 @@
/*
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.Getter;
import lombok.RequiredArgsConstructor;
import java.util.Arrays;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.UUID;
import org.bson.types.ObjectId;
import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.Setup;
import org.openjdk.jmh.annotations.State;
import org.openjdk.jmh.annotations.TearDown;
import org.springframework.data.annotation.Id;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.microbenchmark.AbstractMicrobenchmark;
import com.mongodb.BasicDBObject;
import com.mongodb.MongoClient;
import com.mongodb.ServerAddress;
import com.mongodb.util.JSON;
/**
* @author Christoph Strobl
*/
@State(Scope.Benchmark)
public class MappingMongoConverterBenchmark extends AbstractMicrobenchmark {
private static final String DB_NAME = "mapping-mongo-converter-benchmark";
private MongoClient client;
private MongoMappingContext mappingContext;
private MappingMongoConverter converter;
private BasicDBObject documentWith2Properties, documentWith2PropertiesAnd1Nested;
private Customer objectWith2PropertiesAnd1Nested;
private BasicDBObject documentWithFlatAndComplexPropertiesPlusListAndMap;
private SlightlyMoreComplexObject objectWithFlatAndComplexPropertiesPlusListAndMap;
@Setup
public void setUp() throws Exception {
client = new MongoClient(new ServerAddress());
this.mappingContext = new MongoMappingContext();
this.mappingContext.setInitialEntitySet(Collections.singleton(Customer.class));
this.mappingContext.afterPropertiesSet();
DbRefResolver dbRefResolver = new DefaultDbRefResolver(new SimpleMongoDbFactory(client, DB_NAME));
this.converter = new MappingMongoConverter(dbRefResolver, mappingContext);
this.converter.setCustomConversions(new CustomConversions(Collections.emptyList()));
this.converter.afterPropertiesSet();
// just a flat document
this.documentWith2Properties = new BasicDBObject("firstname", "Dave").append("lastname", "Matthews");
// document with a nested one
BasicDBObject address = new BasicDBObject("zipCode", "ABCDE").append("city", "Some Place");
this.documentWith2PropertiesAnd1Nested = new BasicDBObject("firstname", "Dave").//
append("lastname", "Matthews").//
append("address", address);
// object equivalent of documentWith2PropertiesAnd1Nested
this.objectWith2PropertiesAnd1Nested = new Customer("Dave", "Matthews", new Address("zipCode", "City"));
// a bit more challenging object with list & map conversion.
objectWithFlatAndComplexPropertiesPlusListAndMap = new SlightlyMoreComplexObject();
objectWithFlatAndComplexPropertiesPlusListAndMap.id = UUID.randomUUID().toString();
objectWithFlatAndComplexPropertiesPlusListAndMap.addressList = Arrays.asList(new Address("zip-1", "city-1"),
new Address("zip-2", "city-2"));
objectWithFlatAndComplexPropertiesPlusListAndMap.customer = objectWith2PropertiesAnd1Nested;
objectWithFlatAndComplexPropertiesPlusListAndMap.customerMap = new LinkedHashMap<String, Customer>();
objectWithFlatAndComplexPropertiesPlusListAndMap.customerMap.put("dave", objectWith2PropertiesAnd1Nested);
objectWithFlatAndComplexPropertiesPlusListAndMap.customerMap.put("deborah",
new Customer("Deborah Anne", "Dyer", new Address("?", "london")));
objectWithFlatAndComplexPropertiesPlusListAndMap.customerMap.put("eddie",
new Customer("Eddie", "Vedder", new Address("??", "Seattle")));
objectWithFlatAndComplexPropertiesPlusListAndMap.intOne = Integer.MIN_VALUE;
objectWithFlatAndComplexPropertiesPlusListAndMap.intTwo = Integer.MAX_VALUE;
objectWithFlatAndComplexPropertiesPlusListAndMap.location = new Point(-33.865143, 151.209900);
objectWithFlatAndComplexPropertiesPlusListAndMap.renamedField = "supercalifragilisticexpialidocious";
objectWithFlatAndComplexPropertiesPlusListAndMap.stringOne = "¯\\_(ツ)_/¯";
objectWithFlatAndComplexPropertiesPlusListAndMap.stringTwo = " (╯°□°)╯︵ ┻━┻";
// JSON equivalent of objectWithFlatAndComplexPropertiesPlusListAndMap
documentWithFlatAndComplexPropertiesPlusListAndMap = (BasicDBObject) JSON.parse(
"{ \"_id\" : \"517f6aee-e9e0-44f0-88ed-f3694a019f27\", \"intOne\" : -2147483648, \"intTwo\" : 2147483647, \"stringOne\" : \"¯\\\\_(ツ)_/¯\", \"stringTwo\" : \" (╯°□°)╯︵ ┻━┻\", \"explicit-field-name\" : \"supercalifragilisticexpialidocious\", \"location\" : { \"x\" : -33.865143, \"y\" : 151.2099 }, \"objectWith2PropertiesAnd1Nested\" : { \"firstname\" : \"Dave\", \"lastname\" : \"Matthews\", \"address\" : { \"zipCode\" : \"zipCode\", \"city\" : \"City\" } }, \"addressList\" : [{ \"zipCode\" : \"zip-1\", \"city\" : \"city-1\" }, { \"zipCode\" : \"zip-2\", \"city\" : \"city-2\" }], \"customerMap\" : { \"dave\" : { \"firstname\" : \"Dave\", \"lastname\" : \"Matthews\", \"address\" : { \"zipCode\" : \"zipCode\", \"city\" : \"City\" } }, \"deborah\" : { \"firstname\" : \"Deborah Anne\", \"lastname\" : \"Dyer\", \"address\" : { \"zipCode\" : \"?\", \"city\" : \"london\" } }, \"eddie\" : { \"firstname\" : \"Eddie\", \"lastname\" : \"Vedder\", \"address\" : { \"zipCode\" : \"??\", \"city\" : \"Seattle\" } } }, \"_class\" : \"org.springframework.data.mongodb.core.convert.MappingMongoConverterBenchmark$SlightlyMoreComplexObject\" }");
}
@TearDown
public void tearDown() {
client.dropDatabase(DB_NAME);
client.close();
}
@Benchmark // DATAMONGO-1720
public Customer readObjectWith2Properties() {
return converter.read(Customer.class, documentWith2Properties);
}
@Benchmark // DATAMONGO-1720
public Customer readObjectWith2PropertiesAnd1NestedObject() {
return converter.read(Customer.class, documentWith2PropertiesAnd1Nested);
}
@Benchmark // DATAMONGO-1720
public BasicDBObject writeObjectWith2PropertiesAnd1NestedObject() {
BasicDBObject sink = new BasicDBObject();
converter.write(objectWith2PropertiesAnd1Nested, sink);
return sink;
}
@Benchmark // DATAMONGO-1720
public Object readObjectWithListAndMapsOfComplexType() {
return converter.read(SlightlyMoreComplexObject.class, documentWithFlatAndComplexPropertiesPlusListAndMap);
}
@Benchmark // DATAMONGO-1720
public Object writeObjectWithListAndMapsOfComplexType() {
BasicDBObject sink = new BasicDBObject();
converter.write(objectWithFlatAndComplexPropertiesPlusListAndMap, sink);
return sink;
}
@Getter
@RequiredArgsConstructor
static class Customer {
private @Id ObjectId id;
private final String firstname, lastname;
private final Address address;
}
@Getter
@AllArgsConstructor
static class Address {
private String zipCode, city;
}
@Data
static class SlightlyMoreComplexObject {
@Id String id;
int intOne, intTwo;
String stringOne, stringTwo;
@Field("explicit-field-name") String renamedField;
Point location;
Customer customer;
List<Address> addressList;
Map<String, Customer> customerMap;
}
}

AbstractMicrobenchmark.java (new file)

@@ -0,0 +1,329 @@
/*
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.microbenchmark;
import java.io.File;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Collection;
import java.util.Date;
import org.junit.Test;
import org.openjdk.jmh.annotations.Fork;
import org.openjdk.jmh.annotations.Measurement;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.State;
import org.openjdk.jmh.annotations.Warmup;
import org.openjdk.jmh.results.RunResult;
import org.openjdk.jmh.results.format.ResultFormatType;
import org.openjdk.jmh.runner.Runner;
import org.openjdk.jmh.runner.options.ChainedOptionsBuilder;
import org.openjdk.jmh.runner.options.OptionsBuilder;
import org.openjdk.jmh.runner.options.TimeValue;
import org.springframework.core.env.StandardEnvironment;
import org.springframework.data.mongodb.microbenchmark.ResultsWriter.Utils;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ResourceUtils;
import org.springframework.util.StringUtils;
/**
* @author Christoph Strobl
*/
@Warmup(iterations = AbstractMicrobenchmark.WARMUP_ITERATIONS)
@Measurement(iterations = AbstractMicrobenchmark.MEASUREMENT_ITERATIONS)
@Fork(AbstractMicrobenchmark.FORKS)
@State(Scope.Thread)
public class AbstractMicrobenchmark {
static final int WARMUP_ITERATIONS = 5;
static final int MEASUREMENT_ITERATIONS = 10;
static final int FORKS = 1;
static final String[] JVM_ARGS = { "-server", "-XX:+HeapDumpOnOutOfMemoryError", "-Xms1024m", "-Xmx1024m",
"-XX:MaxDirectMemorySize=1024m" };
private final StandardEnvironment environment = new StandardEnvironment();
/**
* Run matching {@link org.openjdk.jmh.annotations.Benchmark} methods with options collected from
* {@link org.springframework.core.env.Environment}.
*
* @throws Exception
* @see #options(String)
*/
@Test
public void run() throws Exception {
String includes = includes();
if (!includes.contains(org.springframework.util.ClassUtils.getShortName(getClass()))) {
return;
}
publishResults(new Runner(options(includes).build()).run());
}
/**
* Get the regex for all benchmarks to be included in the run. By default every benchmark within classes matching the
current one's short name. <br />
* The {@literal benchmark} command line argument allows overriding the defaults using {@code #} as class / method
* name separator.
*
* @return never {@literal null}.
* @see org.springframework.util.ClassUtils#getShortName(Class)
*/
protected String includes() {
String tests = environment.getProperty("benchmark", String.class);
if (!StringUtils.hasText(tests)) {
return ".*" + org.springframework.util.ClassUtils.getShortName(getClass()) + ".*";
}
if (!tests.contains("#")) {
return ".*" + tests + ".*";
}
String[] args = tests.split("#");
return ".*" + args[0] + "." + args[1];
}
/**
* Collect all options for the {@link Runner}.
*
* @param includes regex for matching benchmarks to be included in the run.
* @return never {@literal null}.
* @throws Exception
*/
protected ChainedOptionsBuilder options(String includes) throws Exception {
ChainedOptionsBuilder optionsBuilder = new OptionsBuilder().include(includes).jvmArgs(jvmArgs());
optionsBuilder = warmup(optionsBuilder);
optionsBuilder = measure(optionsBuilder);
optionsBuilder = forks(optionsBuilder);
optionsBuilder = report(optionsBuilder);
return optionsBuilder;
}
/**
* JVM args to apply to {@link Runner} via its {@link org.openjdk.jmh.runner.options.Options}.
*
* @return {@link #JVM_ARGS} by default.
*/
protected String[] jvmArgs() {
String[] args = new String[JVM_ARGS.length];
System.arraycopy(JVM_ARGS, 0, args, 0, JVM_ARGS.length);
return args;
}
/**
* Read {@code warmupIterations} property from {@link org.springframework.core.env.Environment}.
*
* @return -1 if not set.
*/
protected int getWarmupIterations() {
return environment.getProperty("warmupIterations", Integer.class, -1);
}
/**
* Read {@code measurementIterations} property from {@link org.springframework.core.env.Environment}.
*
* @return -1 if not set.
*/
protected int getMeasurementIterations() {
return environment.getProperty("measurementIterations", Integer.class, -1);
}
/**
* Read {@code forks} property from {@link org.springframework.core.env.Environment}.
*
* @return -1 if not set.
*/
protected int getForksCount() {
return environment.getProperty("forks", Integer.class, -1);
}
/**
* Read {@code benchmarkReportDir} property from {@link org.springframework.core.env.Environment}.
*
* @return {@literal null} if not set.
*/
protected String getReportDirectory() {
return environment.getProperty("benchmarkReportDir");
}
/**
* Read {@code measurementTime} property from {@link org.springframework.core.env.Environment}.
*
* @return -1 if not set.
*/
protected long getMeasurementTime() {
return environment.getProperty("measurementTime", Long.class, -1L);
}
/**
* Read {@code warmupTime} property from {@link org.springframework.core.env.Environment}.
*
* @return -1 if not set.
*/
protected long getWarmupTime() {
return environment.getProperty("warmupTime", Long.class, -1L);
}
/**
* {@code project.version_yyyy-MM-dd_ClassName.json}, e.g.
* {@literal 1.11.0.BUILD-SNAPSHOT_2017-03-07_MappingMongoConverterBenchmark.json}
*
* @return the report file name.
*/
protected String reportFilename() {
StringBuilder sb = new StringBuilder();
if (environment.containsProperty("project.version")) {
sb.append(environment.getProperty("project.version"));
sb.append("_");
}
sb.append(new SimpleDateFormat("yyyy-MM-dd").format(new Date()));
sb.append("_");
sb.append(org.springframework.util.ClassUtils.getShortName(getClass()));
sb.append(".json");
return sb.toString();
}
/**
* Apply measurement options to {@link ChainedOptionsBuilder}.
*
* @param optionsBuilder must not be {@literal null}.
* @return {@link ChainedOptionsBuilder} with options applied.
* @see #getMeasurementIterations()
* @see #getMeasurementTime()
*/
private ChainedOptionsBuilder measure(ChainedOptionsBuilder optionsBuilder) {
int measurementIterations = getMeasurementIterations();
long measurementTime = getMeasurementTime();
if (measurementIterations > 0) {
optionsBuilder = optionsBuilder.measurementIterations(measurementIterations);
}
if (measurementTime > 0) {
optionsBuilder = optionsBuilder.measurementTime(TimeValue.seconds(measurementTime));
}
return optionsBuilder;
}
/**
* Apply warmup options to {@link ChainedOptionsBuilder}.
*
* @param optionsBuilder must not be {@literal null}.
* @return {@link ChainedOptionsBuilder} with options applied.
* @see #getWarmupIterations()
* @see #getWarmupTime()
*/
private ChainedOptionsBuilder warmup(ChainedOptionsBuilder optionsBuilder) {
int warmupIterations = getWarmupIterations();
long warmupTime = getWarmupTime();
if (warmupIterations > 0) {
optionsBuilder = optionsBuilder.warmupIterations(warmupIterations);
}
if (warmupTime > 0) {
optionsBuilder = optionsBuilder.warmupTime(TimeValue.seconds(warmupTime));
}
return optionsBuilder;
}
/**
* Apply forks option to {@link ChainedOptionsBuilder}.
*
* @param optionsBuilder must not be {@literal null}.
* @return {@link ChainedOptionsBuilder} with options applied.
* @see #getForksCount()
*/
private ChainedOptionsBuilder forks(ChainedOptionsBuilder optionsBuilder) {
int forks = getForksCount();
if (forks <= 0) {
return optionsBuilder;
}
return optionsBuilder.forks(forks);
}
/**
* Apply report option to {@link ChainedOptionsBuilder}.
*
* @param optionsBuilder must not be {@literal null}.
* @return {@link ChainedOptionsBuilder} with options applied.
* @throws IOException if report file cannot be created.
* @see #getReportDirectory()
*/
private ChainedOptionsBuilder report(ChainedOptionsBuilder optionsBuilder) throws IOException {
String reportDir = getReportDirectory();
if (!StringUtils.hasText(reportDir)) {
return optionsBuilder;
}
String reportFilePath = reportDir + (reportDir.endsWith(File.separator) ? "" : File.separator) + reportFilename();
File file = ResourceUtils.getFile(reportFilePath);
if (file.exists()) {
file.delete();
} else {
file.getParentFile().mkdirs();
file.createNewFile();
}
optionsBuilder.resultFormat(ResultFormatType.JSON);
optionsBuilder.result(reportFilePath);
return optionsBuilder;
}
/**
* Publish results to an external system.
*
* @param results must not be {@literal null}.
*/
private void publishResults(Collection<RunResult> results) {
if (CollectionUtils.isEmpty(results) || !environment.containsProperty("publishTo")) {
return;
}
String uri = environment.getProperty("publishTo");
try {
Utils.forUri(uri).write(results);
} catch (Exception e) {
System.err.println(String.format("Cannot save benchmark results to '%s'. Error was %s.", uri, e));
}
}
}

HttpResultsWriter.java (new file)

@@ -0,0 +1,86 @@
/*
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.microbenchmark;
import lombok.SneakyThrows;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLConnection;
import java.nio.charset.Charset;
import java.util.Collection;
import org.openjdk.jmh.results.RunResult;
import org.springframework.core.env.StandardEnvironment;
import org.springframework.util.CollectionUtils;
/**
* {@link ResultsWriter} implementation posting results over a {@link URLConnection}.
*
* @since 2.0
*/
class HttpResultsWriter implements ResultsWriter {
private final String url;
HttpResultsWriter(String url) {
this.url = url;
}
@Override
@SneakyThrows
public void write(Collection<RunResult> results) {
if (CollectionUtils.isEmpty(results)) {
return;
}
StandardEnvironment env = new StandardEnvironment();
String projectVersion = env.getProperty("project.version", "unknown");
String gitBranch = env.getProperty("git.branch", "unknown");
String gitDirty = env.getProperty("git.dirty", "no");
String gitCommitId = env.getProperty("git.commit.id", "unknown");
HttpURLConnection connection = (HttpURLConnection) new URL(url).openConnection();
connection.setConnectTimeout(1000);
connection.setReadTimeout(1000);
connection.setDoOutput(true);
connection.setRequestMethod("POST");
connection.setRequestProperty("Content-Type", "application/json");
connection.addRequestProperty("X-Project-Version", projectVersion);
connection.addRequestProperty("X-Git-Branch", gitBranch);
connection.addRequestProperty("X-Git-Dirty", gitDirty);
connection.addRequestProperty("X-Git-Commit-Id", gitCommitId);
OutputStream output = null;
try {
output = connection.getOutputStream();
output.write(ResultsWriter.Utils.jsonifyResults(results).getBytes(Charset.forName("UTF-8")));
} finally {
if (output != null) {
output.close();
}
}
if (connection.getResponseCode() >= 400) {
throw new IllegalStateException(
String.format("Status %d %s", connection.getResponseCode(), connection.getResponseMessage()));
}
}
}

MongoResultsWriter.java (new file)

@@ -0,0 +1,135 @@
/*
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.microbenchmark;
import java.net.UnknownHostException;
import java.util.Collection;
import java.util.Date;
import java.util.List;
import org.openjdk.jmh.results.RunResult;
import org.springframework.core.env.StandardEnvironment;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;
import com.mongodb.util.JSON;
/**
* MongoDB specific {@link ResultsWriter} implementation.
*
* @author Christoph Strobl
* @since 2.0
*/
class MongoResultsWriter implements ResultsWriter {
private final String uri;
MongoResultsWriter(String uri) {
this.uri = uri;
}
@Override
public void write(Collection<RunResult> results) {
Date now = new Date();
StandardEnvironment env = new StandardEnvironment();
String projectVersion = env.getProperty("project.version", "unknown");
String gitBranch = env.getProperty("git.branch", "unknown");
String gitDirty = env.getProperty("git.dirty", "no");
String gitCommitId = env.getProperty("git.commit.id", "unknown");
MongoClientURI uri = new MongoClientURI(this.uri);
MongoClient client = null;
try {
client = new MongoClient(uri);
} catch (UnknownHostException e) {
throw new RuntimeException(e);
}
String dbName = StringUtils.hasText(uri.getDatabase()) ? uri.getDatabase() : "spring-data-mongodb-benchmarks";
DB db = client.getDB(dbName);
for (BasicDBObject dbo : (List<BasicDBObject>) JSON.parse(Utils.jsonifyResults(results))) {
String collectionName = extractClass(dbo.get("benchmark").toString());
BasicDBObject sink = new BasicDBObject();
sink.append("_version", projectVersion);
sink.append("_branch", gitBranch);
sink.append("_commit", gitCommitId);
sink.append("_dirty", gitDirty);
sink.append("_method", extractBenchmarkName(dbo.get("benchmark").toString()));
sink.append("_date", now);
sink.append("_snapshot", projectVersion.toLowerCase().contains("snapshot"));
sink.putAll(dbo.toMap());
db.getCollection(collectionName).insert(fixDocumentKeys(sink));
}
client.close();
}
/**
* Replace {@code .} in document keys by {@code ,}, since MongoDB does not allow dots in field names.
*
* @param doc
* @return
*/
private BasicDBObject fixDocumentKeys(BasicDBObject doc) {
BasicDBObject sanitized = new BasicDBObject();
for (Object key : doc.keySet()) {
Object value = doc.get(key);
if (value instanceof BasicDBObject) {
value = fixDocumentKeys((BasicDBObject) value);
}
if (key instanceof String) {
String newKey = (String) key;
if (newKey.contains(".")) {
newKey = newKey.replace('.', ',');
}
sanitized.put(newKey, value);
} else {
sanitized.put(ObjectUtils.nullSafeToString(key).replace('.', ','), value);
}
}
return sanitized;
}
private static String extractClass(String source) {
String tmp = source.substring(0, source.lastIndexOf('.'));
return tmp.substring(tmp.lastIndexOf(".") + 1);
}
private static String extractBenchmarkName(String source) {
return source.substring(source.lastIndexOf(".") + 1);
}
}

ResultsWriter.java (new file)

@@ -0,0 +1,71 @@
/*
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.microbenchmark;
import lombok.SneakyThrows;
import java.io.ByteArrayOutputStream;
import java.io.PrintStream;
import java.nio.charset.Charset;
import java.util.Collection;
import org.openjdk.jmh.results.RunResult;
import org.openjdk.jmh.results.format.ResultFormatFactory;
import org.openjdk.jmh.results.format.ResultFormatType;
/**
* @author Christoph Strobl
* @since 2.0
*/
interface ResultsWriter {
/**
* Write the {@link RunResult}s.
*
* @param results can be {@literal null}.
*/
void write(Collection<RunResult> results);
/* non Java8 hack */
class Utils {
/**
* Get the uri specific {@link ResultsWriter}.
*
* @param uri must not be {@literal null}.
* @return the matching {@link ResultsWriter}, never {@literal null}.
*/
static ResultsWriter forUri(String uri) {
return uri.startsWith("mongodb:") ? new MongoResultsWriter(uri) : new HttpResultsWriter(uri);
}
/**
* Convert {@link RunResult}s to JMH Json representation.
*
* @param results
* @return json string representation of results.
* @see org.openjdk.jmh.results.format.JSONResultFormat
*/
@SneakyThrows
static String jsonifyResults(Collection<RunResult> results) {
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ResultFormatFactory.getInstance(ResultFormatType.JSON, new PrintStream(baos, true, "UTF-8")).writeOut(results);
return new String(baos.toByteArray(), Charset.forName("UTF-8"));
}
}
}

logback.xml (new file)

@@ -0,0 +1,14 @@
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<appender name="console" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%d %5p %40.40c:%4L - %m%n</pattern>
</encoder>
</appender>
<root level="error">
<appender-ref ref="console" />
</root>
</configuration>

pom.xml

@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
-<version>1.10.3.RELEASE</version>
+<version>1.10.7.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -48,7 +48,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
-<version>1.10.3.RELEASE</version>
+<version>1.10.7.RELEASE</version>
</dependency>
<dependency>

pom.xml

@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
-<version>1.10.3.RELEASE</version>
+<version>1.10.7.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

pom.xml

@@ -5,7 +5,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
-<version>1.10.3.RELEASE</version>
+<version>1.10.7.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

pom.xml

@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
-<version>1.10.3.RELEASE</version>
+<version>1.10.7.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

BulkOperations.java

@@ -1,5 +1,5 @@
/*
-* Copyright 2015-2016 the original author or authors.
+* Copyright 2015-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -28,7 +28,7 @@ import com.mongodb.BulkWriteResult;
* 2.6 and make use of low level bulk commands on the protocol level. This interface defines a fluent API to add
* multiple single operations or list of similar operations in sequence which can then eventually be executed by calling
* {@link #execute()}.
*
*
* @author Tobias Trelle
* @author Oliver Gierke
* @since 1.9
@@ -49,7 +49,7 @@ public interface BulkOperations {
/**
* Add a single insert to the bulk operation.
*
*
* @param documents the document to insert, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the insert added, will never be {@literal null}.
*/
@@ -57,7 +57,7 @@ public interface BulkOperations {
/**
* Add a list of inserts to the bulk operation.
*
*
* @param documents List of documents to insert, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the insert added, will never be {@literal null}.
*/
@@ -65,7 +65,7 @@ public interface BulkOperations {
/**
* Add a single update to the bulk operation. For the update request, only the first matching document is updated.
*
*
* @param query update criteria, must not be {@literal null}.
* @param update {@link Update} operation to perform, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
@@ -74,7 +74,7 @@ public interface BulkOperations {
/**
* Add a list of updates to the bulk operation. For each update request, only the first matching document is updated.
*
*
* @param updates Update operations to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
@@ -82,7 +82,7 @@ public interface BulkOperations {
/**
* Add a single update to the bulk operation. For the update request, all matching documents are updated.
*
*
* @param query Update criteria.
* @param update Update operation to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
@@ -91,7 +91,7 @@ public interface BulkOperations {
/**
* Add a list of updates to the bulk operation. For each update request, all matching documents are updated.
*
*
* @param updates Update operations to perform.
-* @return The bulk operation.
+* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
@@ -101,7 +101,7 @@ public interface BulkOperations {
/**
* Add a single upsert to the bulk operation. An upsert is an update if the set of matching documents is not empty,
* else an insert.
*
*
* @param query Update criteria.
* @param update Update operation to perform.
* @return The bulk operation.
@@ -112,7 +112,7 @@ public interface BulkOperations {
/**
* Add a list of upserts to the bulk operation. An upsert is an update if the set of matching documents is not empty,
* else an insert.
*
*
* @param updates Updates/insert operations to perform.
-* @return The bulk operation.
+* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
@@ -121,7 +121,7 @@ public interface BulkOperations {
/**
* Add a single remove operation to the bulk operation.
*
*
* @param remove the {@link Query} to select the documents to be removed, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the removal added, will never be {@literal null}.
*/
@@ -129,7 +129,7 @@ public interface BulkOperations {
/**
* Add a list of remove operations to the bulk operation.
*
*
* @param removes the remove operations to perform, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the removal added, will never be {@literal null}.
*/
@@ -137,9 +137,9 @@ public interface BulkOperations {
/**
* Execute all bulk operations using the default write concern.
*
*
* @return Result of the bulk operation providing counters for inserts/updates etc.
-* @throws {@link BulkOperationException} if an error occurred during bulk processing.
+* @throws org.springframework.data.mongodb.BulkOperationException if an error occurred during bulk processing.
*/
BulkWriteResult execute();
}

DefaultBulkOperations.java

@@ -15,11 +15,17 @@
*/
package org.springframework.data.mongodb.core;
import java.util.Arrays;
import lombok.NonNull;
import lombok.Value;
import java.util.Collections;
import java.util.List;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.Pair;
@@ -36,18 +42,18 @@ import com.mongodb.WriteConcern;
/**
* Default implementation for {@link BulkOperations}.
*
*
* @author Tobias Trelle
* @author Oliver Gierke
* @author Christoph Strobl
* @author Mark Paluch
* @since 1.9
*/
class DefaultBulkOperations implements BulkOperations {
private final MongoOperations mongoOperations;
-private final BulkMode bulkMode;
private final String collectionName;
-private final Class<?> entityType;
+private final BulkOperationContext bulkOperationContext;
private PersistenceExceptionTranslator exceptionTranslator;
private WriteConcernResolver writeConcernResolver;
@@ -56,35 +62,32 @@ class DefaultBulkOperations implements BulkOperations {
private BulkWriteOperation bulk;
/**
-* Creates a new {@link DefaultBulkOperations} for the given {@link MongoOperations}, {@link BulkMode}, collection
-* name and {@link WriteConcern}.
-*
-* @param mongoOperations The underlying {@link MongoOperations}, must not be {@literal null}.
-* @param bulkMode must not be {@literal null}.
-* @param collectionName Name of the collection to work on, must not be {@literal null} or empty.
-* @param entityType the entity type, can be {@literal null}.
+* Creates a new {@link DefaultBulkOperations} for the given {@link MongoOperations}, collection name and
+* {@link BulkOperationContext}.
+*
+* @param mongoOperations must not be {@literal null}.
+* @param collectionName must not be {@literal null}.
+* @param bulkOperationContext must not be {@literal null}.
+* @since 1.10.5
*/
-DefaultBulkOperations(MongoOperations mongoOperations, BulkMode bulkMode, String collectionName,
-Class<?> entityType) {
+DefaultBulkOperations(MongoOperations mongoOperations, String collectionName,
+BulkOperationContext bulkOperationContext) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
-Assert.notNull(bulkMode, "BulkMode must not be null!");
-Assert.hasText(collectionName, "Collection name must not be null or empty!");
+Assert.hasText(collectionName, "CollectionName must not be null nor empty!");
+Assert.notNull(bulkOperationContext, "BulkOperationContext must not be null!");
this.mongoOperations = mongoOperations;
-this.bulkMode = bulkMode;
this.collectionName = collectionName;
-this.entityType = entityType;
+this.bulkOperationContext = bulkOperationContext;
this.exceptionTranslator = new MongoExceptionTranslator();
this.writeConcernResolver = DefaultWriteConcernResolver.INSTANCE;
-this.bulk = initBulkOperation();
+this.bulk = getBulkWriteOptions(bulkOperationContext.getBulkMode());
}
/**
* Configures the {@link PersistenceExceptionTranslator} to be used. Defaults to {@link MongoExceptionTranslator}.
*
*
* @param exceptionTranslator can be {@literal null}.
*/
public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) {
@@ -93,7 +96,7 @@ class DefaultBulkOperations implements BulkOperations {
/**
* Configures the {@link WriteConcernResolver} to be used. Defaults to {@link DefaultWriteConcernResolver}.
*
*
* @param writeConcernResolver can be {@literal null}.
*/
public void setWriteConcernResolver(WriteConcernResolver writeConcernResolver) {
@@ -103,10 +106,10 @@ class DefaultBulkOperations implements BulkOperations {
/**
* Configures the default {@link WriteConcern} to be used. Defaults to {@literal null}.
*
*
* @param defaultWriteConcern can be {@literal null}.
*/
-public void setDefaultWriteConcern(WriteConcern defaultWriteConcern) {
+void setDefaultWriteConcern(WriteConcern defaultWriteConcern) {
this.defaultWriteConcern = defaultWriteConcern;
}
@@ -158,7 +161,7 @@ class DefaultBulkOperations implements BulkOperations {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
-return updateOne(Arrays.asList(Pair.of(query, update)));
+return updateOne(Collections.singletonList(Pair.of(query, update)));
}
/*
@@ -188,7 +191,7 @@ class DefaultBulkOperations implements BulkOperations {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
-return updateMulti(Arrays.asList(Pair.of(query, update)));
+return updateMulti(Collections.singletonList(Pair.of(query, update)));
}
/*
@@ -239,7 +242,7 @@ class DefaultBulkOperations implements BulkOperations {
Assert.notNull(query, "Query must not be null!");
-bulk.find(query.getQueryObject()).remove();
+bulk.find(getMappedQuery(query.getQueryObject())).remove();
return this;
}
@@ -267,27 +270,25 @@ class DefaultBulkOperations implements BulkOperations {
@Override
public BulkWriteResult execute() {
-MongoAction action = new MongoAction(defaultWriteConcern, MongoActionOperation.BULK, collectionName, entityType,
-null, null);
+MongoAction action = new MongoAction(defaultWriteConcern, MongoActionOperation.BULK, collectionName,
+bulkOperationContext.getEntityType(), null, null);
WriteConcern writeConcern = writeConcernResolver.resolve(action);
try {
return writeConcern == null ? bulk.execute() : bulk.execute(writeConcern);
} catch (BulkWriteException o_O) {
DataAccessException toThrow = exceptionTranslator.translateExceptionIfPossible(o_O);
throw toThrow == null ? o_O : toThrow;
} finally {
-this.bulk = initBulkOperation();
+this.bulk = getBulkWriteOptions(bulkOperationContext.getBulkMode());
}
}
/**
* Performs update and upsert bulk operations.
*
*
* @param query the {@link Query} to determine documents to update.
* @param update the {@link Update} to perform, must not be {@literal null}.
* @param upsert whether to upsert.
@@ -299,29 +300,37 @@ class DefaultBulkOperations implements BulkOperations {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
-BulkWriteRequestBuilder builder = bulk.find(query.getQueryObject());
+BulkWriteRequestBuilder builder = bulk.find(getMappedQuery(query.getQueryObject()));
if (upsert) {
if (multi) {
-builder.upsert().update(update.getUpdateObject());
+builder.upsert().update(getMappedUpdate(update.getUpdateObject()));
} else {
-builder.upsert().updateOne(update.getUpdateObject());
+builder.upsert().updateOne(getMappedUpdate(update.getUpdateObject()));
}
} else {
if (multi) {
-builder.update(update.getUpdateObject());
+builder.update(getMappedUpdate(update.getUpdateObject()));
} else {
-builder.updateOne(update.getUpdateObject());
+builder.updateOne(getMappedUpdate(update.getUpdateObject()));
}
}
return this;
}
-private final BulkWriteOperation initBulkOperation() {
+private DBObject getMappedUpdate(DBObject update) {
+return bulkOperationContext.getUpdateMapper().getMappedObject(update, bulkOperationContext.getEntity());
+}
+
+private DBObject getMappedQuery(DBObject query) {
+return bulkOperationContext.getQueryMapper().getMappedObject(query, bulkOperationContext.getEntity());
+}
+
+private BulkWriteOperation getBulkWriteOptions(BulkMode bulkMode) {
DBCollection collection = mongoOperations.getCollection(collectionName);
@@ -334,4 +343,25 @@ class DefaultBulkOperations implements BulkOperations {
throw new IllegalStateException("BulkMode was null!");
}
/**
* {@link BulkOperationContext} holds information about the
* {@link org.springframework.data.mongodb.core.BulkOperations.BulkMode}, the entity in use, as well as references to
* {@link QueryMapper} and {@link UpdateMapper}.
*
* @author Christoph Strobl
* @since 2.0
*/
@Value
static class BulkOperationContext {
@NonNull BulkMode bulkMode;
MongoPersistentEntity<?> entity;
@NonNull QueryMapper queryMapper;
@NonNull UpdateMapper updateMapper;
Class<?> getEntityType() {
return entity != null ? entity.getType() : null;
}
}
}
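The net effect of the new BulkOperationContext: queries and updates handed to bulk operations are now run through the QueryMapper/UpdateMapper before they reach the driver. A hedged usage sketch; Person, its @Field mapping, and the template variable are assumptions for illustration:

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.BulkOperations;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.query.Update;

class Person {
    @Field("fn") String firstName; // field name deliberately differs from the property name
}

void renameDaves(MongoTemplate template) {
    BulkOperations ops = template.bulkOps(BulkMode.UNORDERED, Person.class);
    // "firstName" is translated to the mapped field "fn" by the mappers carried in the context.
    ops.updateMulti(query(where("firstName").is("Dave")), new Update().set("firstName", "David"));
    ops.execute();
}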

View File

@@ -1,6 +1,6 @@
/*
* Copyright 2011-2015 the original author or authors.
*
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
@@ -48,7 +48,7 @@ import com.mongodb.WriteResult;
* Interface that specifies a basic set of MongoDB operations. Implemented by {@link MongoTemplate}. Not often used but
* a useful option for extensibility and testability (as it can be easily mocked, stubbed, or be the target of a JDK
* proxy).
*
*
* @author Thomas Risberg
* @author Mark Pollack
* @author Oliver Gierke
@@ -61,7 +61,7 @@ public interface MongoOperations {
/**
* The collection name used for the specified class by this template.
*
*
* @param entityClass must not be {@literal null}.
* @return
*/
@@ -71,7 +71,7 @@ public interface MongoOperations {
* Execute a MongoDB command expressed as a JSON string. This will call the method JSON.parse that is part of the
* MongoDB driver to convert the JSON string to a DBObject. Any errors that result from executing this command will be
* converted into Spring's DAO exception hierarchy.
*
*
* @param jsonCommand a MongoDB command expressed as a JSON string.
*/
CommandResult executeCommand(String jsonCommand);
@@ -79,7 +79,7 @@ public interface MongoOperations {
/**
* Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's DAO
* exception hierarchy.
*
*
* @param command a MongoDB command
*/
CommandResult executeCommand(DBObject command);
@@ -87,7 +87,7 @@ public interface MongoOperations {
/**
* Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's DAO
* exception hierarchy.
*
*
* @param command a MongoDB command
* @param options query options to use
* @deprecated since 1.7. Please use {@link #executeCommand(DBObject, ReadPreference)}, as the MongoDB Java driver
@@ -99,7 +99,7 @@ public interface MongoOperations {
/**
* Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's data
* access exception hierarchy.
*
*
* @param command a MongoDB command, must not be {@literal null}.
* @param readPreference read preferences to use, can be {@literal null}.
* @return
@@ -109,7 +109,7 @@ public interface MongoOperations {
/**
* Execute a MongoDB query and iterate over the query results on a per-document basis with a DocumentCallbackHandler.
*
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param collectionName name of the collection to retrieve the objects from
@@ -121,7 +121,7 @@ public interface MongoOperations {
* Executes a {@link DbCallback} translating any exceptions as necessary.
* <p/>
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
*
* @param <T> return type
* @param action callback object that specifies the MongoDB actions to perform on the passed in DB instance.
* @return a result object returned by the action or <tt>null</tt>
@@ -132,7 +132,7 @@ public interface MongoOperations {
* Executes the given {@link CollectionCallback} on the entity collection of the specified class.
* <p/>
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
*
* @param entityClass class that determines the collection to use
* @param <T> return type
* @param action callback object that specifies the MongoDB action
@@ -144,7 +144,7 @@ public interface MongoOperations {
* Executes the given {@link CollectionCallback} on the collection of the given name.
* <p/>
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
*
* @param <T> return type
* @param collectionName the name of the collection that specifies which DBCollection instance will be passed into
*          the callback action
* @param action callback object that specifies the MongoDB action
@@ -158,7 +158,7 @@ public interface MongoOperations {
* href="http://www.mongodb.org/display/DOCS/Java+Driver+Concurrency">Java Driver Concurrency</a>}
* <p/>
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
*
* @param <T> return type
* @param action callback that specified the MongoDB actions to perform on the DB instance
* @return a result object returned by the action or <tt>null</tt>
@@ -173,7 +173,7 @@ public interface MongoOperations {
* {@link Cursor}.
* <p>
* Returns a {@link CloseableIterator} that wraps the Mongo DB {@link Cursor} that needs to be closed.
*
*
* @param <T> element return type
* @param query must not be {@literal null}.
* @param entityType must not be {@literal null}.
@@ -187,7 +187,7 @@ public interface MongoOperations {
* by a Mongo DB {@link Cursor}.
* <p>
* Returns a {@link CloseableIterator} that wraps the Mongo DB {@link Cursor} that needs to be closed.
*
*
* @param <T> element return type
* @param query must not be {@literal null}.
* @param entityType must not be {@literal null}.
@@ -199,7 +199,7 @@ public interface MongoOperations {
/**
* Create an uncapped collection with a name based on the provided entity class.
*
*
* @param entityClass class that determines the collection to create
* @return the created collection
*/
@@ -207,7 +207,7 @@ public interface MongoOperations {
/**
* Create a collection with a name based on the provided entity class using the options.
*
*
* @param entityClass class that determines the collection to create
* @param collectionOptions options to use when creating the collection.
* @return the created collection
@@ -216,7 +216,7 @@ public interface MongoOperations {
/**
* Create an uncapped collection with the provided name.
*
*
* @param collectionName name of the collection
* @return the created collection
*/
@@ -224,7 +224,7 @@ public interface MongoOperations {
/**
* Create a collection with the provided name and options.
*
*
* @param collectionName name of the collection
* @param collectionOptions options to use when creating the collection.
* @return the created collection
@@ -233,7 +233,7 @@ public interface MongoOperations {
/**
* A set of collection names.
*
*
* @return set of collection names
*/
Set<String> getCollectionNames();
@@ -242,7 +242,7 @@ public interface MongoOperations {
* Get a collection by name, creating it if it doesn't exist.
* <p/>
* Translate any exceptions as necessary.
*
*
* @param collectionName name of the collection
* @return an existing collection or a newly created one.
*/
@@ -252,7 +252,7 @@ public interface MongoOperations {
* Check to see if a collection with a name indicated by the entity class exists.
* <p/>
* Translate any exceptions as necessary.
*
*
* @param entityClass class that determines the name of the collection
* @return true if a collection with the given name is found, false otherwise.
*/
@@ -262,7 +262,7 @@ public interface MongoOperations {
* Check to see if a collection with a given name exists.
* <p/>
* Translate any exceptions as necessary.
*
*
* @param collectionName name of the collection
* @return true if a collection with the given name is found, false otherwise.
*/
@@ -272,7 +272,7 @@ public interface MongoOperations {
* Drop the collection with the name indicated by the entity class.
* <p/>
* Translate any exceptions as necessary.
*
*
* @param entityClass class that determines the collection to drop/delete.
*/
<T> void dropCollection(Class<T> entityClass);
@@ -281,36 +281,39 @@ public interface MongoOperations {
* Drop the collection with the given name.
* <p/>
* Translate any exceptions as necessary.
*
*
* @param collectionName name of the collection to drop/delete.
*/
void dropCollection(String collectionName);
/**
* Returns the operations that can be performed on indexes
*
*
* @return index operations on the named collection
*/
IndexOperations indexOps(String collectionName);
/**
* Returns the operations that can be performed on indexes
*
*
* @return index operations on the named collection associated with the given entity class
*/
IndexOperations indexOps(Class<?> entityClass);
/**
* Returns the {@link ScriptOperations} that can be performed on {@link com.mongodb.DB} level.
*
*
* @return
* @since 1.7
*/
ScriptOperations scriptOps();
/**
* Returns a new {@link BulkOperations} for the given collection.
*
* Returns a new {@link BulkOperations} for the given collection. <br />
* <strong>NOTE:</strong> Any additional support for field mapping, etc. is not available for {@literal update} or
* {@literal remove} operations in bulk mode due to the lack of domain type information. Use
* {@link #bulkOps(BulkMode, Class, String)} to get full type specific support.
*
* @param mode the {@link BulkMode} to use for bulk operations, must not be {@literal null}.
* @param collectionName the name of the collection to work on, must not be {@literal null} or empty.
* @return {@link BulkOperations} on the named collection
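To make the NOTE concrete, a sketch of the untyped variant; without a domain type the criteria must already use raw document field names ("people" and "fn" are hypothetical):

BulkOperations raw = template.bulkOps(BulkMode.ORDERED, "people");
raw.remove(query(where("fn").is("Dave"))); // no mapping: "fn" is taken literally
raw.execute();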
@@ -319,7 +322,7 @@ public interface MongoOperations {
/**
* Returns a new {@link BulkOperations} for the given entity type.
*
*
* @param mode the {@link BulkMode} to use for bulk operations, must not be {@literal null}.
* @param entityType the entity class, must not be {@literal null}.
* @return {@link BulkOperations} on the named collection associated of the given entity class.
@@ -328,7 +331,7 @@ public interface MongoOperations {
/**
* Returns a new {@link BulkOperations} for the given entity type and collection name.
*
*
* @param mode the {@link BulkMode} to use for bulk operations, must not be {@literal null}.
* @param entityClass the entity class, must not be {@literal null}.
* @param collectionName the name of the collection to work on, must not be {@literal null} or empty.
@@ -344,7 +347,7 @@ public interface MongoOperations {
* <p/>
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
*
*
* @param entityClass the parameterized type of the returned list
* @return the converted collection
*/
@@ -358,7 +361,7 @@ public interface MongoOperations {
* <p/>
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
*
*
* @param entityClass the parameterized type of the returned list.
* @param collectionName name of the collection to retrieve the objects from
* @return the converted collection
@@ -368,7 +371,7 @@ public interface MongoOperations {
/**
* Execute a group operation over the entire collection. The group operation entity class should match the 'shape' of
* the returned object that takes into account the initial document structure as well as any finalize functions.
*
*
* @param criteria The criteria that restricts the rows that are considered for grouping. If not specified, all rows are
* considered.
* @param inputCollectionName the collection where the group operation will read from
@@ -383,7 +386,7 @@ public interface MongoOperations {
* Execute a group operation restricting the rows to those which match the provided Criteria. The group operation
* entity class should match the 'shape' of the returned object that takes into account the initial document structure
* as well as any finalize functions.
*
*
* @param criteria The criteria that restricts the rows that are considered for grouping. If not specified, all rows are
* considered.
* @param inputCollectionName the collection where the group operation will read from
@@ -397,7 +400,7 @@ public interface MongoOperations {
/**
* Execute an aggregation operation. The raw results will be mapped to the given entity class. The name of the
* inputCollection is derived from the inputType of the aggregation.
*
*
* @param aggregation The {@link TypedAggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param collectionName The name of the input collection to use for the aggregation.
@@ -410,7 +413,7 @@ public interface MongoOperations {
/**
* Execute an aggregation operation. The raw results will be mapped to the given entity class. The name of the
* inputCollection is derived from the inputType of the aggregation.
*
*
* @param aggregation The {@link TypedAggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param outputType The parameterized type of the returned list, must not be {@literal null}.
@@ -421,7 +424,7 @@ public interface MongoOperations {
/**
* Execute an aggregation operation. The raw results will be mapped to the given entity class.
*
*
* @param aggregation The {@link Aggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param inputType the inputType where the aggregation operation will read from, must not be {@literal null} or
@@ -434,7 +437,7 @@ public interface MongoOperations {
/**
* Execute an aggregation operation. The raw results will be mapped to the given entity class.
*
*
* @param aggregation The {@link Aggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param collectionName the collection where the aggregation operation will read from, must not be {@literal null} or
@@ -447,7 +450,7 @@ public interface MongoOperations {
/**
* Execute a map-reduce operation. The map-reduce operation will be formed with an output type of INLINE
*
*
* @param inputCollectionName the collection where the map-reduce will read from
* @param mapFunction The JavaScript map function
* @param reduceFunction The JavaScript reduce function
@@ -460,7 +463,7 @@ public interface MongoOperations {
/**
* Execute a map-reduce operation that takes additional map-reduce options.
*
*
* @param inputCollectionName the collection where the map-reduce will read from
* @param mapFunction The JavaScript map function
* @param reduceFunction The JavaScript reduce function
@@ -474,7 +477,7 @@ public interface MongoOperations {
/**
* Execute a map-reduce operation that takes a query. The map-reduce operation will be formed with an output type of
* INLINE
*
*
* @param query The query to use to select the data for the map phase
* @param inputCollectionName the collection where the map-reduce will read from
* @param mapFunction The JavaScript map function
@@ -488,7 +491,7 @@ public interface MongoOperations {
/**
* Execute a map-reduce operation that takes a query and additional map-reduce options
*
*
* @param query The query to use to select the data for the map phase
* @param inputCollectionName the collection where the map-reduce will read from
* @param mapFunction The JavaScript map function
@@ -505,7 +508,7 @@ public interface MongoOperations {
* information to determine the collection the query is run against. Note, that MongoDB limits the number of results
* by default. Make sure to add an explicit limit to the {@link NearQuery} if you expect a particular number of
* results.
*
*
* @param near must not be {@literal null}.
* @param entityClass must not be {@literal null}.
* @return
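A short sketch of adding the explicit limit the javadoc recommends; the point, distance, and Venue type are illustrative assumptions:

import org.springframework.data.geo.Distance;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.query.NearQuery;

NearQuery near = NearQuery.near(new Point(-73.99, 40.73))
        .maxDistance(new Distance(5, Metrics.KILOMETERS))
        .num(100); // explicit limit, since MongoDB caps geoNear results by default

GeoResults<Venue> results = template.geoNear(near, Venue.class);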
@@ -516,7 +519,7 @@ public interface MongoOperations {
* Returns {@link GeoResults} for all entities matching the given {@link NearQuery}. Note, that MongoDB limits the
* number of results by default. Make sure to add an explicit limit to the {@link NearQuery} if you expect a
* particular number of results.
*
*
* @param near must not be {@literal null}.
* @param entityClass must not be {@literal null}.
* @param collectionName the collection to trigger the query against. If no collection name is given the entity class
@@ -534,7 +537,7 @@ public interface MongoOperations {
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parameterized type of the returned list.
@@ -551,7 +554,7 @@ public interface MongoOperations {
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parameterized type of the returned list.
@@ -561,8 +564,10 @@ public interface MongoOperations {
<T> T findOne(Query query, Class<T> entityClass, String collectionName);
/**
* Determine whether the result of the given {@link Query} contains at least one element.
*
* Determine whether the result of the given {@link Query} contains at least one element. <br />
* <strong>NOTE:</strong> Any additional support for query/field mapping, etc. is not available due to the lack of
* domain type information. Use {@link #exists(Query, Class, String)} to get full type specific support.
*
* @param query the {@link Query} class that specifies the criteria used to find a record.
* @param collectionName name of the collection to check for objects.
* @return
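For comparison, the typed variant referenced by the NOTE maps property names onto field names before the check; Person and "people" are hypothetical:

boolean anyDave = template.exists(query(where("firstName").is("Dave")), Person.class, "people");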
@@ -571,7 +576,7 @@ public interface MongoOperations {
/**
* Determine whether the result of the given {@link Query} contains at least one element.
*
*
* @param query the {@link Query} class that specifies the criteria used to find a record.
* @param entityClass the parameterized type.
* @return
@@ -580,7 +585,7 @@ public interface MongoOperations {
/**
* Determine whether the result of the given {@link Query} contains at least one element.
*
*
* @param query the {@link Query} class that specifies the criteria used to find a record.
* @param entityClass the parameterized type.
* @param collectionName name of the collection to check for objects.
@@ -596,7 +601,7 @@ public interface MongoOperations {
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parameterized type of the returned list.
@@ -612,7 +617,7 @@ public interface MongoOperations {
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parameterized type of the returned list.
@@ -624,7 +629,7 @@ public interface MongoOperations {
/**
* Returns a document with the given id mapped onto the given class. The collection the query is run against will be
* derived from the given target class as well.
*
*
* @param <T>
* @param id the id of the document to return.
* @param entityClass the type the document shall be converted into.
@@ -634,7 +639,7 @@ public interface MongoOperations {
/**
* Returns the document with the given id from the given collection mapped onto the given target class.
*
*
* @param id the id of the document to return
* @param entityClass the type to convert the document to
* @param collectionName the collection to query for the document
@@ -646,7 +651,7 @@ public interface MongoOperations {
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* </a> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
@@ -658,7 +663,7 @@ public interface MongoOperations {
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* </a> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
@@ -672,7 +677,7 @@ public interface MongoOperations {
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* </a> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
@@ -686,7 +691,7 @@ public interface MongoOperations {
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* </a> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
@@ -707,7 +712,7 @@ public interface MongoOperations {
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parameterized type of the returned list.
@@ -724,7 +729,7 @@ public interface MongoOperations {
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parameterized type of the returned list.
@@ -735,7 +740,7 @@ public interface MongoOperations {
/**
* Returns the number of documents for the given {@link Query} by querying the collection of the given entity class.
*
*
* @param query
* @param entityClass must not be {@literal null}.
* @return
@@ -745,8 +750,8 @@ public interface MongoOperations {
/**
* Returns the number of documents for the given {@link Query} querying the given collection. The given {@link Query}
* must solely consist of document field references as we lack type information to map potential property references
* onto document fields. To make sure the query gets mapped, use {@link #count(Query, Class, String)}.
*
* onto document fields. Use {@link #count(Query, Class, String)} to get full type specific support.
*
* @param query
* @param collectionName must not be {@literal null} or empty.
* @return
@@ -757,7 +762,7 @@ public interface MongoOperations {
/**
* Returns the number of documents for the given {@link Query} by querying the given collection using the given entity
* class to map the given {@link Query}.
*
*
* @param query
* @param entityClass must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
@@ -778,7 +783,7 @@ public interface MongoOperations {
* <p/>
* <p/>
* Insert is used to initially store the object into the database. To update an existing object use the save method.
*
*
* @param objectToSave the object to store in the collection.
*/
void insert(Object objectToSave);
@@ -790,7 +795,7 @@ public interface MongoOperations {
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* Insert is used to initially store the object into the database. To update an existing object use the save method.
*
*
* @param objectToSave the object to store in the collection
* @param collectionName name of the collection to store the object in
*/
@@ -798,7 +803,7 @@ public interface MongoOperations {
/**
* Insert a Collection of objects into a collection in a single batch write to the database.
*
*
* @param batchToSave the list of objects to save.
* @param entityClass class that determines the collection to use
*/
@@ -806,7 +811,7 @@ public interface MongoOperations {
/**
* Insert a list of objects into the specified collection in a single batch write to the database.
*
*
* @param batchToSave the list of objects to save.
* @param collectionName name of the collection to store the object in
*/
@@ -815,7 +820,7 @@ public interface MongoOperations {
/**
* Insert a mixed Collection of objects into a database collection determining the collection name to use based on the
* class.
*
*
* @param objectsToSave the list of objects to save.
*/
void insertAll(Collection<? extends Object> objectsToSave);
@@ -832,7 +837,7 @@ public interface MongoOperations {
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert" >
* Spring's Type Conversion</a> for more details.
*
*
* @param objectToSave the object to store in the collection
*/
void save(Object objectToSave);
@@ -849,7 +854,7 @@ public interface MongoOperations {
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert">Spring's
* Type Conversion</a> for more details.
*
*
* @param objectToSave the object to store in the collection
* @param collectionName name of the collection to store the object in
*/
@@ -858,7 +863,7 @@ public interface MongoOperations {
/**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document.
*
*
* @param query the query document that specifies the criteria used to select a record to be upserted
* @param update the update document that contains the updated object or $ operators to manipulate the existing object
* @param entityClass class that determines the collection to use
@@ -868,8 +873,10 @@ public interface MongoOperations {
/**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document.
*
* combining the query document and the update document. <br />
* <strong>NOTE:</strong> Any additional support for field mapping, versions, etc. is not available due to the lack of
* domain type information. Use {@link #upsert(Query, Update, Class, String)} to get full type specific support.
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
@@ -881,7 +888,7 @@ public interface MongoOperations {
/**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document.
*
*
* @param query the query document that specifies the criteria used to select a record to be upserted
* @param update the update document that contains the updated object or $ operators to manipulate the existing object
* @param entityClass class of the pojo to be operated on
@@ -893,7 +900,7 @@ public interface MongoOperations {
/**
* Updates the first object that is found in the collection of the entity class that matches the query document with
* the provided update document.
*
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
@@ -904,8 +911,10 @@ public interface MongoOperations {
/**
* Updates the first object that is found in the specified collection that matches the query document criteria with
* the provided updated document.
*
* the provided updated document. <br />
* <strong>NOTE:</strong> Any additional support for field mapping, versions, etc. is not available due to the lack of
* domain type information. Use {@link #updateFirst(Query, Update, Class, String)} to get full type specific support.
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
@@ -917,7 +926,7 @@ public interface MongoOperations {
/**
* Updates the first object that is found in the specified collection that matches the query document criteria with
* the provided updated document.
*
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
@@ -930,7 +939,7 @@ public interface MongoOperations {
/**
* Updates all objects that are found in the collection for the entity class that matches the query document criteria
* with the provided updated document.
*
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
@@ -941,8 +950,10 @@ public interface MongoOperations {
/**
* Updates all objects that are found in the specified collection that matches the query document criteria with the
* provided updated document.
*
* provided updated document. <br />
* <strong>NOTE:</strong> Any additional support for field mapping, versions, etc. is not available due to the lack of
* domain type information. Use {@link #updateMulti(Query, Update, Class, String)} to get full type specific support.
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
@@ -954,7 +965,7 @@ public interface MongoOperations {
/**
* Updates all objects that are found in the collection for the entity class that matches the query document criteria
* with the provided updated document.
*
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
@@ -966,14 +977,14 @@ public interface MongoOperations {
/**
* Remove the given object from the collection by id.
*
*
* @param object
*/
WriteResult remove(Object object);
/**
* Removes the given object from the given collection.
*
*
* @param object
* @param collection must not be {@literal null} or empty.
*/
@@ -982,7 +993,7 @@ public interface MongoOperations {
/**
* Remove all documents that match the provided query document criteria from the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
*
*
* @param query
* @param entityClass
*/
@@ -991,7 +1002,7 @@ public interface MongoOperations {
/**
* Remove all documents that match the provided query document criteria from the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
*
*
* @param query
* @param entityClass
* @param collectionName
@@ -1000,18 +1011,22 @@ public interface MongoOperations {
/**
* Remove all documents from the specified collection that match the provided query document criteria. There is no
* conversion/mapping done for any criteria using the id field.
*
* conversion/mapping done for any criteria using the id field. <br />
* <strong>NOTE:</strong> Any additional support for field mapping is not available due to the lack of domain type
* information. Use {@link #remove(Query, Class, String)} to get full type specific support.
*
* @param query the query document that specifies the criteria used to remove a record
* @param collectionName name of the collection where the objects will be removed
*/
WriteResult remove(Query query, String collectionName);
/**
* Returns and removes all documents from the specified collection that match the provided query.
*
* @param query
* @param collectionName
* Returns and removes all documents from the specified collection that match the provided query. <br />
* <strong>NOTE:</strong> Any additional support for field mapping is not available due to the lack of domain type
* information. Use {@link #findAllAndRemove(Query, Class, String)} to get full type specific support.
*
* @param query must not be {@literal null}.
* @param collectionName must not be {@literal null}.
* @return
* @since 1.5
*/
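The typed counterpart recommended above both maps the query and converts the removed documents back to domain objects; Person and "people" are again hypothetical:

List<Person> removed = template.findAllAndRemove(query(where("firstName").is("Dave")), Person.class, "people");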
@@ -1019,7 +1034,7 @@ public interface MongoOperations {
/**
* Returns and removes all documents matching the given query from the collection used to store the entityClass.
*
*
* @param query
* @param entityClass
* @return
@@ -1031,7 +1046,7 @@ public interface MongoOperations {
* Returns and removes all documents that match the provided query document criteria from the collection used to
* store the entityClass. The Class parameter is also used to help convert the Id of the object if it is present in
* the query.
*
*
* @param query
* @param entityClass
* @param collectionName
@@ -1042,7 +1057,7 @@ public interface MongoOperations {
/**
* Returns the underlying {@link MongoConverter}.
*
*
* @return
*/
MongoConverter getConverter();

View File

@@ -61,12 +61,14 @@ import org.springframework.data.mapping.model.ConvertingPropertyAccessor;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.DefaultBulkOperations.BulkOperationContext;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.Fields;
import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
@@ -142,6 +144,7 @@ import com.mongodb.util.JSONParseException;
* @author Niko Schmuck
* @author Mark Paluch
* @author Laszlo Csontos
* @author Borislav Rangelov
*/
@SuppressWarnings("deprecation")
public class MongoTemplate implements MongoOperations, ApplicationContextAware {
@@ -574,10 +577,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
Assert.notNull(mode, "BulkMode must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
DefaultBulkOperations operations = new DefaultBulkOperations(this, mode, collectionName, entityType);
DefaultBulkOperations operations = new DefaultBulkOperations(this, collectionName,
new BulkOperationContext(mode, getPersistentEntity(entityType), queryMapper, updateMapper));
operations.setExceptionTranslator(exceptionTranslator);
operations.setWriteConcernResolver(writeConcernResolver);
operations.setDefaultWriteConcern(writeConcern);
return operations;
@@ -1514,7 +1517,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
*/
@Override
public <T> List<T> findAllAndRemove(Query query, String collectionName) {
return findAndRemove(query, null, collectionName);
return (List<T>) findAllAndRemove(query, Object.class, collectionName);
}
/*
@@ -2108,8 +2111,16 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
private static final MongoConverter getDefaultMongoConverter(MongoDbFactory factory) {
DbRefResolver dbRefResolver = new DefaultDbRefResolver(factory);
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, new MongoMappingContext());
CustomConversions conversions = new CustomConversions(Collections.emptyList());
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setSimpleTypeHolder(conversions.getSimpleTypeHolder());
mappingContext.afterPropertiesSet();
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mappingContext);
converter.setCustomConversions(conversions);
converter.afterPropertiesSet();
return converter;
}
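The same wiring sketched from user code: registering a custom Converter makes its source type a "simple" type, so the mapping context no longer introspects it as an entity. The converter below is hypothetical:

import java.util.Collections;

import org.springframework.core.convert.converter.Converter;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;

// Hypothetical converter, for illustration only.
class PointToStringConverter implements Converter<java.awt.Point, String> {
    public String convert(java.awt.Point source) {
        return source.x + "," + source.y;
    }
}

CustomConversions conversions = new CustomConversions(
        Collections.<Object> singletonList(new PointToStringConverter()));

MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setSimpleTypeHolder(conversions.getSimpleTypeHolder()); // java.awt.Point is now a simple type
mappingContext.afterPropertiesSet();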
@@ -2140,7 +2151,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* Returns all identifiers for the given documents. Will augment the given identifiers and fill in only the ones that
* are {@literal null} currently. This would've been better solved in {@link #insertDBObjectList(String, List)}
* directly but would require a signature change of that method.
*
*
* @param ids
* @param documents
* @return TODO: Remove for 2.0 and change method signature of {@link #insertDBObjectList(String, List)}.
@@ -2528,7 +2539,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
Cursor c = cursor;
try {
c.close();
if (c != null) {
c.close();
}
} catch (RuntimeException ex) {
throw potentiallyConvertRuntimeException(ex, exceptionTranslator);
} finally {

View File

@@ -1408,7 +1408,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
protected List<Object> getOperationArguments(AggregationOperationContext context) {
List<Object> result = new ArrayList<Object>(values.size());
result.add(context.getReference(getField().getName()).toString());
result.add(context.getReference(getField()).toString());
for (Object element : values) {
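This resolves nested property references via the full field reference rather than only the leaf name (DATAMONGO-1756). A hedged sketch of the case it fixes; Product and netPrice.amount are assumptions:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.Aggregation;

Aggregation agg = newAggregation(Product.class,
        project().and("netPrice.amount").plus(1).as("grossAmount"));
// now renders { $project : { grossAmount : { $add : [ "$netPrice.amount", 1 ] } } }
// rather than referencing only the leaf name "amount"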

View File

@@ -25,7 +25,9 @@ import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.aopalliance.intercept.MethodInterceptor;
import org.aopalliance.intercept.MethodInvocation;
@@ -46,7 +48,6 @@ import org.springframework.util.Assert;
import org.springframework.util.ReflectionUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.BasicDBObjectBuilder;
import com.mongodb.DB;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
@@ -62,6 +63,8 @@ import com.mongodb.DBRef;
*/
public class DefaultDbRefResolver implements DbRefResolver {
private static final String ID = "_id";
private final MongoDbFactory mongoDbFactory;
private final PersistenceExceptionTranslator exceptionTranslator;
private final ObjenesisStd objenesis;
@@ -144,10 +147,37 @@ public class DefaultDbRefResolver implements DbRefResolver {
ids.add(ref.getId());
}
Map<Object, DBObject> documentsById = getDocumentsById(ids, collection);
List<DBObject> result = new ArrayList<DBObject>(ids.size());
for (Object id : ids) {
result.add(documentsById.get(id));
}
return result;
}
/**
* Returns all documents with the given ids contained in the given collection mapped by their ids.
*
* @param ids must not be {@literal null}.
* @param collection must not be {@literal null} or empty.
* @return
*/
private Map<Object, DBObject> getDocumentsById(List<Object> ids, String collection) {
Assert.notNull(ids, "Ids must not be null!");
Assert.hasText(collection, "Collection must not be null or empty!");
DB db = mongoDbFactory.getDb();
List<DBObject> result = db.getCollection(collection)
.find(new BasicDBObjectBuilder().add("_id", new BasicDBObject("$in", ids)).get()).toArray();
Collections.sort(result, new DbRefByReferencePositionComparator(ids));
BasicDBObject query = new BasicDBObject(ID, new BasicDBObject("$in", ids));
List<DBObject> documents = db.getCollection(collection).find(query).toArray();
Map<Object, DBObject> result = new HashMap<Object, DBObject>(documents.size());
for (DBObject document : documents) {
result.put(document.get(ID), document);
}
return result;
}
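The practical effect of the id-keyed lookup (DATAMONGO-1765), sketched with hypothetical types: a @DBRef collection that references the same document twice now resolves both positions correctly.

class Playlist {
    @DBRef List<Song> songs; // may legitimately contain the same Song reference twice
}

Playlist playlist = template.findById(playlistId, Playlist.class);
// The bulk fetch previously returned each referenced document once, ordered by first
// occurrence, so duplicates could misalign; mapping documents back by _id lets every
// position in 'songs' resolve to its corresponding document.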
@@ -316,6 +346,8 @@ public class DefaultDbRefResolver implements DbRefResolver {
return null;
}
ReflectionUtils.makeAccessible(method);
return method.invoke(target, args);
}
@@ -467,7 +499,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
*/
@Override
public int compare(DBObject o1, DBObject o2) {
return Integer.compare(reference.indexOf(o1.get("_id")), reference.indexOf(o2.get("_id")));
return Integer.compare(reference.indexOf(o1.get(ID)), reference.indexOf(o2.get(ID)));
}
}
}

View File

@@ -74,7 +74,7 @@ import com.mongodb.DBRef;
/**
* {@link MongoConverter} that uses a {@link MappingContext} to do sophisticated mapping of domain objects to
* {@link DBObject}.
*
*
* @author Oliver Gierke
* @author Jon Brisbin
* @author Patrik Wasik
@@ -101,7 +101,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/**
* Creates a new {@link MappingMongoConverter} given the new {@link DbRefResolver} and {@link MappingContext}.
*
*
* @param dbRefResolver must not be {@literal null}.
* @param mappingContext must not be {@literal null}.
*/
@@ -123,7 +123,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/**
* Creates a new {@link MappingMongoConverter} given the new {@link MongoDbFactory} and {@link MappingContext}.
*
*
* @deprecated use the constructor taking a {@link DbRefResolver} instead.
* @param mongoDbFactory must not be {@literal null}.
* @param mappingContext must not be {@literal null}.
@@ -139,12 +139,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* converter and how to lookup type information from {@link DBObject}s when reading them. Uses a
* {@link DefaultMongoTypeMapper} by default. Setting this to {@literal null} will reset the {@link TypeMapper} to the
* default one.
*
*
* @param typeMapper the typeMapper to set
*/
public void setTypeMapper(MongoTypeMapper typeMapper) {
this.typeMapper = typeMapper == null
? new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, mappingContext) : typeMapper;
? new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, mappingContext)
: typeMapper;
}
/*
@@ -161,7 +162,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* any translation but rather reject a {@link Map} with keys containing dots causing the conversion for the entire
* object to fail. If further customization of the translation is needed, have a look at
* {@link #potentiallyEscapeMapKey(String)} as well as {@link #potentiallyUnescapeMapKey(String)}.
*
*
* @param mapKeyDotReplacement the mapKeyDotReplacement to set
*/
public void setMapKeyDotReplacement(String mapKeyDotReplacement) {
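A usage sketch of the replacement described above; '~' is an arbitrary choice:

MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mappingContext);
converter.setMapKeyDotReplacement("~");

// A map key such as "smtp.host" is persisted as "smtp~host" and translated back
// to "smtp.host" when the document is read.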
@@ -340,7 +341,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/**
* Root entry method into write conversion. Adds a type discriminator to the {@link DBObject}. Shouldn't be called for
* nested conversions.
*
*
* @see org.springframework.data.mongodb.core.convert.MongoWriter#write(java.lang.Object, com.mongodb.DBObject)
*/
public void write(final Object obj, final DBObject dbo) {
@@ -364,7 +365,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/**
* Internal write conversion method which should be used for nested invocations.
*
*
* @param obj
* @param dbo
*/
@@ -520,7 +521,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
addCustomTypeKeyIfNecessary(ClassTypeInformation.from(prop.getRawType()), obj, propDbObj);
MongoPersistentEntity<?> entity = isSubtype(prop.getType(), obj.getClass())
? mappingContext.getPersistentEntity(obj.getClass()) : mappingContext.getPersistentEntity(type);
? mappingContext.getPersistentEntity(obj.getClass())
: mappingContext.getPersistentEntity(type);
writeInternal(obj, propDbObj, entity);
accessor.put(prop, propDbObj);
@@ -534,7 +536,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* Returns given object as {@link Collection}. Will return the {@link Collection} as is if the source is a
* {@link Collection} already, will convert an array into a {@link Collection} or simply create a single element
* collection for everything else.
*
*
* @param source
* @return
*/
@@ -549,7 +551,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/**
* Writes the given {@link Collection} using the given {@link MongoPersistentProperty} information.
*
*
* @param collection must not be {@literal null}.
* @param property must not be {@literal null}.
* @return
@@ -577,7 +579,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/**
* Writes the given {@link Map} using the given {@link MongoPersistentProperty} information.
*
*
* @param map must not {@literal null}.
* @param property must not be {@literal null}.
* @return
@@ -613,7 +615,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/**
* Populates the given {@link BasicDBList} with values from the given {@link Collection}.
*
*
* @param source the collection to create a {@link BasicDBList} for, must not be {@literal null}.
* @param type the {@link TypeInformation} to consider or {@literal null} if unknown.
* @param sink the {@link BasicDBList} to write to.
@@ -643,7 +645,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/**
* Writes the given {@link Map} to the given {@link DBObject} considering the given {@link TypeInformation}.
*
*
* @param obj must not be {@literal null}.
* @param dbo must not be {@literal null}.
* @param propertyType must not be {@literal null}.
@@ -682,7 +684,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/**
* Prepares the given {@link Map} key to be converted into a {@link String}. Will invoke potentially registered custom
* conversions and escape dots from the result as they're not supported as {@link Map} keys in MongoDB.
*
*
* @param key must not be {@literal null}.
* @return
*/
@@ -697,7 +699,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/**
* Potentially replaces dots in the given map key with the configured map key replacement, or aborts the conversion
* if none is configured.
*
*
* @see #setMapKeyDotReplacement(String)
* @param source
* @return
@@ -720,7 +722,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/**
* Returns a {@link String} representation of the given {@link Map} key
*
*
* @param key
* @return
*/
@@ -731,13 +733,14 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
return conversions.hasCustomWriteTarget(key.getClass(), String.class)
? (String) getPotentiallyConvertedSimpleWrite(key) : key.toString();
? (String) getPotentiallyConvertedSimpleWrite(key)
: key.toString();
}
/**
* Translates any map key replacement in the given key just read back into a dot, in case a map key replacement has
* been configured.
*
*
* @param source
* @return
*/
@@ -767,7 +770,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/**
* Writes the given simple value to the given {@link DBObject}. Will store enum names for enum values.
*
*
* @param value
* @param dbObject must not be {@literal null}.
* @param key must not be {@literal null}.
@@ -784,7 +787,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/**
* Checks whether we have a custom conversion registered for the given value into an arbitrary simple Mongo type.
* Returns the converted value if so. If not, we perform special enum handling or simply return the value as is.
*
*
* @param value
* @return
*/
@@ -806,7 +809,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/**
* Checks whether we have a custom conversion for the given simple object. Converts the given value if so, applies
* {@link Enum} handling or returns the value as is.
*
*
* @param value
* @param target must not be {@literal null}.
* @return
@@ -879,7 +882,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/**
* Reads the given {@link BasicDBList} into a collection of the given {@link TypeInformation}.
*
*
* @param targetType must not be {@literal null}.
* @param sourceValue must not be {@literal null}.
* @param path must not be {@literal null}.
@@ -929,7 +932,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/**
* Reads the given {@link DBObject} into a {@link Map}. Will recursively resolve nested {@link Map}s as well.
*
*
* @param type the {@link Map} {@link TypeInformation} to be used to unmarshall this {@link DBObject}.
* @param dbObject must not be {@literal null}
* @param path must not be {@literal null}
@@ -1031,7 +1034,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
for (Entry<Object, Object> entry : ((Map<Object, Object>) obj).entrySet()) {
TypeInformation<? extends Object> valueTypeHint = typeHint != null && typeHint.getMapValueType() != null
? typeHint.getMapValueType() : typeHint;
? typeHint.getMapValueType()
: typeHint;
converted.put(getPotentiallyConvertedSimpleWrite(entry.getKey()).toString(),
convertToMongoType(entry.getValue(), valueTypeHint));
@@ -1170,7 +1174,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/**
* Extension of {@link SpELExpressionParameterValueProvider} to recursively trigger value conversion on the raw
* resolved SpEL value.
*
*
* @author Oliver Gierke
*/
private class ConverterAwareSpELExpressionParameterValueProvider
@@ -1180,7 +1184,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/**
* Creates a new {@link ConverterAwareSpELExpressionParameterValueProvider}.
*
*
* @param evaluator must not be {@literal null}.
* @param conversionService must not be {@literal null}.
* @param delegate must not be {@literal null}.
@@ -1228,11 +1232,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return (T) dbref;
}
Object object = dbref == null ? null : path.getPathItem(dbref.getId(), dbref.getCollectionName());
return (T) (object != null ? object : readAndConvertDBRef(dbref, type, path, rawType));
T object = dbref == null ? null : path.getPathItem(dbref.getId(), dbref.getCollectionName(), (Class<T>) rawType);
return object != null ? object : (T) readAndConvertDBRef(dbref, type, path, rawType);
}
private <T> T readAndConvertDBRef(DBRef dbref, TypeInformation<?> type, ObjectPath path, final Class<?> rawType) {
private <T> T readAndConvertDBRef(DBRef dbref, TypeInformation<?> type, ObjectPath path, Class<?> rawType) {
List<T> result = bulkReadAndConvertDBRefs(Collections.singletonList(dbref), type, path, rawType);
return CollectionUtils.isEmpty(result) ? null : result.iterator().next();
@@ -1262,7 +1266,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
List<DBObject> referencedRawDocuments = dbrefs.size() == 1
? Collections.singletonList(readRef(dbrefs.iterator().next())) : bulkReadRefs(dbrefs);
? Collections.singletonList(readRef(dbrefs.iterator().next()))
: bulkReadRefs(dbrefs);
String collectionName = dbrefs.iterator().next().getCollectionName();
List<T> targeList = new ArrayList<T>(dbrefs.size());
@@ -1297,7 +1302,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/**
* Performs the fetch operation for the given {@link DBRef}.
*
*
* @param ref
* @return
*/
@@ -1347,7 +1352,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/**
* Marker class used to indicate we have a non-root document object here that might be used within an update, so we
* need to preserve type hints for potential nested elements but need to remove them on top level.
*
*
* @author Christoph Strobl
* @since 1.8
*/

View File

@@ -28,6 +28,7 @@ import java.util.Stack;
import java.util.regex.Pattern;
import org.springframework.data.domain.Example;
import org.springframework.data.domain.ExampleMatcher;
import org.springframework.data.domain.ExampleMatcher.NullHandler;
import org.springframework.data.domain.ExampleMatcher.PropertyValueTransformer;
import org.springframework.data.domain.ExampleMatcher.StringMatcher;
@@ -41,6 +42,7 @@ import org.springframework.data.repository.core.support.ExampleMatcherAccessor;
import org.springframework.data.repository.query.parser.Part.Type;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
@@ -48,9 +50,13 @@ import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Mapper from {@link Example} to a query {@link DBObject}.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 1.8
* @see Example
* @see org.springframework.data.domain.ExampleMatcher
*/
public class MongoExampleMapper {
@@ -58,6 +64,11 @@ public class MongoExampleMapper {
private final MongoConverter converter;
private final Map<StringMatcher, Type> stringMatcherPartMapping = new HashMap<StringMatcher, Type>();
/**
* Create a new {@link MongoExampleMapper} given the {@link MongoConverter}.
*
* @param converter must not be {@literal null}.
*/
public MongoExampleMapper(MongoConverter converter) {
this.converter = converter;
@@ -99,8 +110,10 @@ public class MongoExampleMapper {
DBObject reference = (DBObject) converter.convertToMongoType(example.getProbe());
if (entity.hasIdProperty() && entity.getIdentifierAccessor(example.getProbe()).getIdentifier() == null) {
reference.removeField(entity.getIdProperty().getFieldName());
if (entity.hasIdProperty() && ClassUtils.isAssignable(entity.getType(), example.getProbeType())) {
if (entity.getIdentifierAccessor(example.getProbe()).getIdentifier() == null) {
reference.removeField(entity.getIdProperty().getFieldName());
}
}
ExampleMatcherAccessor matcherAccessor = new ExampleMatcherAccessor(example.getMatcher());
@@ -111,80 +124,7 @@ public class MongoExampleMapper {
: new BasicDBObject(SerializationUtils.flattenMap(reference));
DBObject result = example.getMatcher().isAllMatching() ? flattened : orConcatenate(flattened);
this.converter.getTypeMapper().writeTypeRestrictions(result, getTypesToMatch(example));
return result;
}
private static DBObject orConcatenate(DBObject source) {
List<DBObject> foo = new ArrayList<DBObject>(source.keySet().size());
for (String key : source.keySet()) {
foo.add(new BasicDBObject(key, source.get(key)));
}
return new BasicDBObject("$or", foo);
}
private Set<Class<?>> getTypesToMatch(Example<?> example) {
Set<Class<?>> types = new HashSet<Class<?>>();
for (TypeInformation<?> reference : mappingContext.getManagedTypes()) {
if (example.getProbeType().isAssignableFrom(reference.getType())) {
types.add(reference.getType());
}
}
return types;
}
private String getMappedPropertyPath(String path, Class<?> probeType) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(probeType);
Iterator<String> parts = Arrays.asList(path.split("\\.")).iterator();
final Stack<MongoPersistentProperty> stack = new Stack<MongoPersistentProperty>();
List<String> resultParts = new ArrayList<String>();
while (parts.hasNext()) {
final String part = parts.next();
MongoPersistentProperty prop = entity.getPersistentProperty(part);
if (prop == null) {
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
@Override
public void doWithPersistentProperty(MongoPersistentProperty property) {
if (property.getFieldName().equals(part)) {
stack.push(property);
}
}
});
if (stack.isEmpty()) {
return "";
}
prop = stack.pop();
}
resultParts.add(prop.getName());
if (prop.isEntity() && mappingContext.hasPersistentEntityFor(prop.getActualType())) {
entity = mappingContext.getPersistentEntity(prop.getActualType());
} else {
break;
}
}
return StringUtils.collectionToDelimitedString(resultParts, ".");
}
private void applyPropertySpecs(String path, DBObject source, Class<?> probeType,
@@ -246,7 +186,102 @@ public class MongoExampleMapper {
}
}
private boolean isEmptyIdProperty(Entry<String, Object> entry) {
private String getMappedPropertyPath(String path, Class<?> probeType) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(probeType);
Iterator<String> parts = Arrays.asList(path.split("\\.")).iterator();
final Stack<MongoPersistentProperty> stack = new Stack<MongoPersistentProperty>();
List<String> resultParts = new ArrayList<String>();
while (parts.hasNext()) {
final String part = parts.next();
MongoPersistentProperty prop = entity.getPersistentProperty(part);
if (prop == null) {
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
@Override
public void doWithPersistentProperty(MongoPersistentProperty property) {
if (property.getFieldName().equals(part)) {
stack.push(property);
}
}
});
if (stack.isEmpty()) {
return "";
}
prop = stack.pop();
}
resultParts.add(prop.getName());
if (prop.isEntity() && mappingContext.hasPersistentEntityFor(prop.getActualType())) {
entity = mappingContext.getPersistentEntity(prop.getActualType());
} else {
break;
}
}
return StringUtils.collectionToDelimitedString(resultParts, ".");
}
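// Worked example (hedged): for a probe type whose property "lastname" is mapped to
// the field "last_name", getMappedPropertyPath("last_name", Person.class) resolves
// back to the property path "lastname"; unresolvable segments yield "".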
private DBObject updateTypeRestrictions(DBObject query, Example<?> example) {
DBObject result = new BasicDBObject();
if (isTypeRestricting(example.getMatcher())) {
result.putAll(query);
this.converter.getTypeMapper().writeTypeRestrictions(result, getTypesToMatch(example));
return result;
}
for (String key : query.keySet()) {
if (!this.converter.getTypeMapper().isTypeKey(key)) {
result.put(key, query.get(key));
}
}
return result;
}
private boolean isTypeRestricting(ExampleMatcher matcher) {
if (matcher.getIgnoredPaths().isEmpty()) {
return true;
}
for (String path : matcher.getIgnoredPaths()) {
if (this.converter.getTypeMapper().isTypeKey(path)) {
return false;
}
}
return true;
}
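// Usage sketch (hedged; Person and operations are assumed stand-ins): registering
// the type key as an ignored path disables the inferred type restriction, turning
// a typed probe into an untyped query:
//
//   Example<?> example = Example.of(probe, ExampleMatcher.matching().withIgnorePaths("_class"));
//   List<Person> result = operations.find(new Query(new Criteria().alike(example)), Person.class);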
private Set<Class<?>> getTypesToMatch(Example<?> example) {
Set<Class<?>> types = new HashSet<Class<?>>();
for (TypeInformation<?> reference : mappingContext.getManagedTypes()) {
if (example.getProbeType().isAssignableFrom(reference.getType())) {
types.add(reference.getType());
}
}
return types;
}
private static boolean isEmptyIdProperty(Entry<String, Object> entry) {
return entry.getKey().equals("_id") && entry.getValue() == null;
}
@@ -272,4 +307,15 @@ public class MongoExampleMapper {
dbo.put("$options", "i");
}
}
private static DBObject orConcatenate(DBObject source) {
List<DBObject> or = new ArrayList<DBObject>(source.keySet().size());
for (String key : source.keySet()) {
or.add(new BasicDBObject(key, source.get(key)));
}
return new BasicDBObject("$or", or);
}
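// e.g. a flattened source of { "firstname" : "jon", "lastname" : "snow" } becomes
// { "$or" : [ { "firstname" : "jon" }, { "lastname" : "snow" } ] }.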
}

View File

@@ -15,12 +15,16 @@
*/
package org.springframework.data.mongodb.core.convert;
import lombok.Value;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.DBObject;
@@ -30,16 +34,18 @@ import com.mongodb.DBObject;
when resolving more nested objects. This makes it possible to avoid re-resolving object instances that are logically
equivalent to already resolved ones.
* <p>
* An immutable ordered set of target objects for {@link DBObject} to {@link Object} conversions. Object paths can be
* constructed by the {@link #toObjectPath(Object)} method and extended via {@link #push(Object)}.
*
* An immutable ordered set of target objects for {@link DBObject} to {@link Object} conversions. Object paths
* can be extended via {@link #push(Object, MongoPersistentEntity, Object)}.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Mark Paluch
* @author Christoph Strobl
* @since 1.6
*/
class ObjectPath {
public static final ObjectPath ROOT = new ObjectPath();
static final ObjectPath ROOT = new ObjectPath();
private final List<ObjectPathItem> items;
@@ -50,7 +56,7 @@ class ObjectPath {
/**
* Creates a new {@link ObjectPath} from the given parent {@link ObjectPath} by adding the provided
* {@link ObjectPathItem} to it.
*
* @param parent can be {@literal null}.
* @param item
*/
@@ -68,9 +74,9 @@ class ObjectPath {
* @param object must not be {@literal null}.
* @param entity must not be {@literal null}.
* @param id must not be {@literal null}.
* @return
* @return new instance of {@link ObjectPath}.
*/
public ObjectPath push(Object object, MongoPersistentEntity<?> entity, Object id) {
ObjectPath push(Object object, MongoPersistentEntity<?> entity, Object id) {
Assert.notNull(object, "Object must not be null!");
Assert.notNull(entity, "MongoPersistentEntity must not be null!");
@@ -80,14 +86,15 @@ class ObjectPath {
}
/**
* Returns the object with the given id and stored in the given collection if it's contained in the
* {@link ObjectPath}.
*
* @param id must not be {@literal null}.
* @param collection must not be {@literal null} or empty.
* @return
* @deprecated use {@link #getPathItem(Object, String, Class)}.
*/
public Object getPathItem(Object id, String collection) {
@Deprecated
Object getPathItem(Object id, String collection) {
Assert.notNull(id, "Id must not be null!");
Assert.hasText(collection, "Collection name must not be null!");
@@ -96,11 +103,7 @@ class ObjectPath {
Object object = item.getObject();
if (object == null) {
continue;
}
if (item.getIdValue() == null) {
if (object == null || item.getIdValue() == null) {
continue;
}
@@ -112,16 +115,49 @@ class ObjectPath {
return null;
}
/**
* Get the object with the given {@literal id}, stored in the given {@literal collection}, that is assignable to the
* given {@literal type}, or {@literal null} if no match is found.
*
* @param id must not be {@literal null}.
* @param collection must not be {@literal null} or empty.
* @param type must not be {@literal null}.
* @return {@literal null} when no match is found.
* @since 2.0
*/
<T> T getPathItem(Object id, String collection, Class<T> type) {
Assert.notNull(id, "Id must not be null!");
Assert.hasText(collection, "Collection name must not be null!");
Assert.notNull(type, "Type must not be null!");
for (ObjectPathItem item : items) {
Object object = item.getObject();
if (object == null || item.getIdValue() == null) {
continue;
}
if (collection.equals(item.getCollection()) && id.equals(item.getIdValue())
&& ClassUtils.isAssignable(type, object.getClass())) {
return type.cast(object);
}
}
return null;
}
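// Usage sketch (hedged; instance, entity and the literals are assumed stand-ins,
// compare ObjectPathUnitTests below):
//
//   ObjectPath path = ObjectPath.ROOT.push(instance, entity, "id-1");
//   EntityOne cached = path.getPathItem("id-1", "one", EntityOne.class);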
/**
* Returns the current object of the {@link ObjectPath} or {@literal null} if the path is empty.
*
* @return
*/
public Object getCurrentObject() {
Object getCurrentObject() {
return items.isEmpty() ? null : items.get(items.size() - 1).getObject();
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@@ -135,7 +171,7 @@ class ObjectPath {
List<String> strings = new ArrayList<String>(items.size());
for (ObjectPathItem item : items) {
strings.add(item.object.toString());
strings.add(ObjectUtils.nullSafeToString(item.object));
}
return StringUtils.collectionToDelimitedString(strings, " -> ");
@@ -143,40 +179,16 @@ class ObjectPath {
/**
* An item in an {@link ObjectPath}.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Mark Paluch
*/
@Value
private static class ObjectPathItem {
private final Object object;
private final Object idValue;
private final String collection;
/**
* Creates a new {@link ObjectPathItem}.
*
* @param object
* @param idValue
* @param collection
*/
ObjectPathItem(Object object, Object idValue, String collection) {
this.object = object;
this.idValue = idValue;
this.collection = collection;
}
public Object getObject() {
return object;
}
public Object getIdValue() {
return idValue;
}
public String getCollection() {
return collection;
}
Object object;
Object idValue;
String collection;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2016 the original author or authors.
* Copyright 2013-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.core.convert;
import java.util.Collection;
import java.util.Map.Entry;
import org.springframework.core.convert.converter.Converter;
@@ -161,6 +162,10 @@ public class UpdateMapper extends QueryMapper {
return info;
}
if (source instanceof Collection) {
return NESTED_DOCUMENT;
}
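// Hedged note: treating Collection values as nested documents preserves type hints
// for the contained elements (e.g. interface- or abstract-typed list entries) while
// the enclosing concrete element gets none; see the DATAMONGO-1772 cases in
// UpdateMapperUnitTests.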
if (!type.equals(source.getClass())) {
return info;
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2014-2015 the original author or authors.
* Copyright 2014-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -53,9 +53,10 @@ import com.mongodb.util.JSON;
* indexed. <br />
* All {@link MongoPersistentProperty} of the {@link MongoPersistentEntity} are inspected for potential indexes by
* scanning related annotations.
*
* @author Christoph Strobl
* @author Thomas Darimont
* @author Martin Macko
* @since 1.5
*/
public class MongoPersistentEntityIndexResolver implements IndexResolver {
@@ -66,7 +67,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
/**
* Create new {@link MongoPersistentEntityIndexResolver}.
*
* @param mappingContext must not be {@literal null}.
*/
public MongoPersistentEntityIndexResolver(MongoMappingContext mappingContext) {
@@ -87,7 +88,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
* Resolve the {@link IndexDefinition}s for given {@literal root} entity by traversing {@link MongoPersistentProperty}
* scanning for index annotations {@link Indexed}, {@link CompoundIndex} and {@link GeospatialIndex}. The given
* {@literal root} has therefore to be annotated with {@link Document}.
*
* @param root must not be null.
* @return List of {@link IndexDefinitionHolder}. Will never be {@code null}.
* @throws IllegalArgumentException in case of missing {@link Document} annotation marking root entities.
@@ -133,7 +134,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
/**
* Recursively resolve and inspect properties of given {@literal type} for indexes to be created.
*
* @param type
* @param path The {@literal "dot} path.
* @param collection
@@ -264,7 +265,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
appendTextIndexInformation(propertyDotPath, indexDefinitionBuilder,
mappingContext.getPersistentEntity(persistentProperty.getActualType()), optionsForNestedType, guard);
} catch (CyclicPropertyReferenceException e) {
LOGGER.info(e.getMessage(), e);
LOGGER.info(e.getMessage());
} catch (InvalidDataAccessApiUsageException e) {
LOGGER.info(String.format("Potentially invalid index structure discovered. Breaking operation for %s.",
entity.getName()), e);
@@ -281,7 +282,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
/**
* Create {@link IndexDefinition} wrapped in {@link IndexDefinitionHolder} for {@link CompoundIndexes} of given type.
*
* @param dotPath The properties {@literal "dot"} path representation from its document root.
* @param fallbackCollection
* @param type
@@ -361,7 +362,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
/**
* Creates {@link IndexDefinition} wrapped in {@link IndexDefinitionHolder} out of {@link Indexed} for given
* {@link MongoPersistentProperty}.
*
* @param dotPath The properties {@literal "dot"} path representation from its document root.
* @param collection
* @param persistentProperty
@@ -402,7 +403,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
/**
* Creates {@link IndexDefinition} wrapped in {@link IndexDefinitionHolder} out of {@link GeoSpatialIndexed} for
* {@link MongoPersistentProperty}.
*
* @param dotPath The properties {@literal "dot"} path representation from its document root.
* @param collection
* @param persistentProperty
@@ -479,7 +480,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
/**
* {@link CycleGuard} holds information about properties and the paths for accessing those. This information is used
* to detect potential cycles within the references.
*
* @author Christoph Strobl
*/
static class CycleGuard {
@@ -529,24 +530,24 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
/**
* Path defines the property and its full path from the document root. <br />
* A {@link Path} with {@literal spring.data.mongodb} would be created for the property {@code Three.mongodb}.
*
* <pre>
* <code>
* &#64;Document
* class One {
* Two spring;
* }
*
* class Two {
* Three data;
* }
*
* class Three {
* String mongodb;
* }
* </code>
* </pre>
*
* @author Christoph Strobl
*/
static class Path {
@@ -569,7 +570,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
* the current path. Given {@literal foo.bar.bar} cycles if {@literal foo.bar} has already been visited and
* {@code class Bar} contains a property of type {@code Bar}. The previously mentioned path would not cycle if
* {@code class Bar} contained a property of type {@code SomeEntity} named {@literal bar}.
*
* @param property
* @param path
* @return
@@ -618,7 +619,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
/**
* Implementation of {@link IndexDefinition} holding additional (property)path information used for creating the
* index. The path itself is the properties {@literal "dot"} path representation from its root document.
*
* @author Christoph Strobl
* @since 1.5
*/
@@ -630,7 +631,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
/**
* Create a new {@link IndexDefinitionHolder}.
*
* @param path
*/
public IndexDefinitionHolder(String path, IndexDefinition definition, String collection) {
@@ -646,7 +647,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
/**
* Get the {@literal "dot"} path used to create the index.
*
* @return
*/
public String getPath() {
@@ -655,7 +656,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
/**
* Get the {@literal raw} {@link IndexDefinition}.
*
* @return
*/
public IndexDefinition getIndexDefinition() {

View File

@@ -99,7 +99,7 @@ public class PartTreeMongoQuery extends AbstractMongoQuery {
ReturnedType returnedType = processor.withDynamicProjection(accessor).getReturnedType();
if (returnedType.isProjecting()) {
if (returnedType.needsCustomConstruction()) {
Field fields = query.fields();

View File

@@ -28,6 +28,10 @@ import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.mongodb.BulkOperationException;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.DefaultBulkOperations.BulkOperationContext;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
@@ -43,7 +47,7 @@ import com.mongodb.WriteConcern;
/**
* Integration tests for {@link DefaultBulkOperations}.
*
* @author Tobias Trelle
* @author Oliver Gierke
* @author Christoph Strobl
@@ -67,17 +71,18 @@ public class DefaultBulkOperationsIntegrationTests {
@Test(expected = IllegalArgumentException.class) // DATAMONGO-934
public void rejectsNullMongoOperations() {
new DefaultBulkOperations(null, null, COLLECTION_NAME, null);
new DefaultBulkOperations(null, COLLECTION_NAME, new BulkOperationContext(BulkMode.ORDERED, null, null, null));
}
@Test(expected = IllegalArgumentException.class) // DATAMONGO-934
public void rejectsNullCollectionName() {
new DefaultBulkOperations(operations, null, null, null);
new DefaultBulkOperations(operations, null, new BulkOperationContext(BulkMode.ORDERED, null, null, null));
}
@Test(expected = IllegalArgumentException.class) // DATAMONGO-934
public void rejectsEmptyCollectionName() {
new DefaultBulkOperations(operations, null, "", null);
new DefaultBulkOperations(operations, "", new BulkOperationContext(BulkMode.ORDERED, null, null, null));
}
@Test // DATAMONGO-934
@@ -193,7 +198,7 @@ public class DefaultBulkOperationsIntegrationTests {
@Test // DATAMONGO-934
public void mixedBulkOrdered() {
BulkWriteResult result = createBulkOps(BulkMode.ORDERED).insert(newDoc("1", "v1")).//
BulkWriteResult result = createBulkOps(BulkMode.ORDERED, BaseDoc.class).insert(newDoc("1", "v1")).//
updateOne(where("_id", "1"), set("value", "v2")).//
remove(where("value", "v2")).//
execute();
@@ -215,8 +220,8 @@ public class DefaultBulkOperationsIntegrationTests {
List<Pair<Query, Update>> updates = Arrays.asList(Pair.of(where("value", "v2"), set("value", "v3")));
List<Query> removes = Arrays.asList(where("_id", "1"));
BulkWriteResult result = createBulkOps(BulkMode.ORDERED).insert(inserts).updateMulti(updates).remove(removes)
.execute();
BulkWriteResult result = createBulkOps(BulkMode.ORDERED, BaseDoc.class).insert(inserts).updateMulti(updates)
.remove(removes).execute();
assertThat(result, notNullValue());
assertThat(result.getInsertedCount(), is(3));
@@ -232,7 +237,7 @@ public class DefaultBulkOperationsIntegrationTests {
specialDoc.value = "normal-value";
specialDoc.specialValue = "special-value";
createBulkOps(BulkMode.ORDERED).insert(Arrays.asList(specialDoc)).execute();
createBulkOps(BulkMode.ORDERED, SpecialDoc.class).insert(Arrays.asList(specialDoc)).execute();
BaseDoc doc = operations.findOne(where("_id", specialDoc.id), BaseDoc.class, COLLECTION_NAME);
@@ -266,11 +271,21 @@ public class DefaultBulkOperationsIntegrationTests {
}
private BulkOperations createBulkOps(BulkMode mode) {
DefaultBulkOperations operations = new DefaultBulkOperations(this.operations, mode, COLLECTION_NAME, null);
operations.setDefaultWriteConcern(WriteConcern.ACKNOWLEDGED);
return operations;
}
private BulkOperations createBulkOps(BulkMode mode) {
return createBulkOps(mode, null);
}
private BulkOperations createBulkOps(BulkMode mode, Class<?> entityType) {
MongoPersistentEntity<?> entity = entityType != null
? operations.getConverter().getMappingContext().getPersistentEntity(entityType) : null;
BulkOperationContext bulkOperationContext = new BulkOperationContext(mode, entity,
new QueryMapper(operations.getConverter()), new UpdateMapper(operations.getConverter()));
DefaultBulkOperations bulkOps = new DefaultBulkOperations(operations, COLLECTION_NAME, bulkOperationContext);
bulkOps.setDefaultWriteConcern(WriteConcern.ACKNOWLEDGED);
return bulkOps;
}
private void insertSomeDocuments() {

View File

@@ -0,0 +1,159 @@
/*
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.hamcrest.MatcherAssert.*;
import static org.mockito.Mockito.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import static org.springframework.data.mongodb.test.util.IsBsonObject.*;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.ArgumentCaptor;
import org.mockito.Captor;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.DefaultBulkOperations.BulkOperationContext;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Update;
import com.mongodb.BulkUpdateRequestBuilder;
import com.mongodb.BulkWriteOperation;
import com.mongodb.BulkWriteRequestBuilder;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
/**
* Unit tests for {@link DefaultBulkOperations}.
*
* @author Christoph Strobl
* @author Mark Paluch
*/
@RunWith(MockitoJUnitRunner.class)
public class DefaultBulkOperationsUnitTests {
@Mock MongoTemplate template;
@Mock DBCollection collection;
@Mock DbRefResolver dbRefResolver;
@Captor ArgumentCaptor<DBObject> captor;
@Mock BulkWriteOperation bulk;
@Mock BulkWriteRequestBuilder bulkWriteRequestBuilder;
@Mock BulkUpdateRequestBuilder bulkUpdateRequestBuilder;
MongoConverter converter;
MongoMappingContext mappingContext;
DefaultBulkOperations ops;
@Before
public void before() throws Exception {
when(bulk.find(any(DBObject.class))).thenReturn(bulkWriteRequestBuilder);
when(bulkWriteRequestBuilder.upsert()).thenReturn(bulkUpdateRequestBuilder);
when(collection.initializeOrderedBulkOperation()).thenReturn(bulk);
mappingContext = new MongoMappingContext();
mappingContext.afterPropertiesSet();
converter = new MappingMongoConverter(dbRefResolver, mappingContext);
when(template.getCollection(anyString())).thenReturn(collection);
ops = new DefaultBulkOperations(template, "collection-1",
new BulkOperationContext(BulkMode.ORDERED, mappingContext.getPersistentEntity(SomeDomainType.class),
new QueryMapper(converter), new UpdateMapper(converter)));
}
@Test // DATAMONGO-1678
public void updateOneShouldMapQueryAndUpdate() {
ops.updateOne(new BasicQuery("{firstName:1}"), new Update().set("lastName", "targaryen")).execute();
verify(bulk).find(captor.capture());
verify(bulkWriteRequestBuilder).updateOne(captor.capture());
assertThat(captor.getAllValues().get(0), isBsonObject().containing("first_name", 1));
assertThat(captor.getAllValues().get(1), isBsonObject().containing("$set.last_name", "targaryen"));
}
@Test // DATAMONGO-1678
public void bulkUpdateShouldMapQueryAndUpdateCorrectly() {
ops.updateMulti(query(where("firstName").is("danerys")), Update.update("firstName", "queen danerys")).execute();
verify(bulk).find(captor.capture());
verify(bulkWriteRequestBuilder).update(captor.capture());
assertThat(captor.getAllValues().get(0), isBsonObject().containing("first_name", "danerys"));
assertThat(captor.getAllValues().get(1), isBsonObject().containing("$set.first_name", "queen danerys"));
}
@Test // DATAMONGO-1678
public void bulkUpdateManyShouldMapQueryAndUpdateCorrectly() {
ops.updateOne(query(where("firstName").is("danerys")), Update.update("firstName", "queen danerys")).execute();
verify(bulk).find(captor.capture());
verify(bulkWriteRequestBuilder).updateOne(captor.capture());
assertThat(captor.getAllValues().get(0), isBsonObject().containing("first_name", "danerys"));
assertThat(captor.getAllValues().get(1), isBsonObject().containing("$set.first_name", "queen danerys"));
}
@Test // DATAMONGO-1678
public void bulkUpsertShouldMapQueryAndUpdateCorrectly() {
ops.upsert(query(where("firstName").is("danerys")), Update.update("firstName", "queen danerys")).execute();
verify(bulk).find(captor.capture());
verify(bulkUpdateRequestBuilder).update(captor.capture());
assertThat(captor.getAllValues().get(0), isBsonObject().containing("first_name", "danerys"));
assertThat(captor.getAllValues().get(1), isBsonObject().containing("$set.first_name", "queen danerys"));
}
@Test // DATAMONGO-1678
public void bulkRemoveShouldMapQueryCorrectly() {
ops.remove(query(where("firstName").is("danerys"))).execute();
verify(bulk).find(captor.capture());
assertThat(captor.getValue(), isBsonObject().containing("first_name", "danerys"));
}
class SomeDomainType {
@Id String id;
Gender gender;
@Field("first_name") String firstName;
@Field("last_name") String lastName;
}
enum Gender {
M, F
}
}

View File

@@ -0,0 +1,111 @@
/*
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.hamcrest.MatcherAssert.*;
import static org.hamcrest.Matchers.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import lombok.Data;
import java.net.UnknownHostException;
import org.junit.Before;
import org.junit.Test;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;
import com.mongodb.MongoClient;
/**
* {@link org.springframework.data.mongodb.core.mapping.DBRef} related integration tests for
* {@link org.springframework.data.mongodb.core.MongoTemplate}.
*
* @author Christoph Strobl
*/
public class MongoTemplateDbRefTests {
MongoTemplate template;
@Before
public void setUp() throws UnknownHostException {
template = new MongoTemplate(new MongoClient(), "mongo-template-dbref-tests");
template.dropCollection(RefCycleLoadingIntoDifferentTypeRoot.class);
template.dropCollection(RefCycleLoadingIntoDifferentTypeIntermediate.class);
template.dropCollection(RefCycleLoadingIntoDifferentTypeRootView.class);
}
@Test // DATAMONGO-1703
public void shouldLoadRefIntoDifferentTypeCorrectly() {
// init root
RefCycleLoadingIntoDifferentTypeRoot root = new RefCycleLoadingIntoDifferentTypeRoot();
root.id = "root-1";
root.content = "jon snow";
template.save(root);
// init one and set view id ref to root.id
RefCycleLoadingIntoDifferentTypeIntermediate intermediate = new RefCycleLoadingIntoDifferentTypeIntermediate();
intermediate.id = "one-1";
intermediate.refToRootView = new RefCycleLoadingIntoDifferentTypeRootView();
intermediate.refToRootView.id = root.id;
template.save(intermediate);
// add one ref to root
root.refToIntermediate = intermediate;
template.save(root);
RefCycleLoadingIntoDifferentTypeRoot loaded = template.findOne(query(where("id").is(root.id)),
RefCycleLoadingIntoDifferentTypeRoot.class);
assertThat(loaded.content, is(equalTo("jon snow")));
assertThat(loaded.getRefToIntermediate(), is(instanceOf(RefCycleLoadingIntoDifferentTypeIntermediate.class)));
assertThat(loaded.getRefToIntermediate().getRefToRootView(),
is(instanceOf(RefCycleLoadingIntoDifferentTypeRootView.class)));
assertThat(loaded.getRefToIntermediate().getRefToRootView().getContent(), is(equalTo("jon snow")));
}
@Data
@Document(collection = "cycle-with-different-type-root")
static class RefCycleLoadingIntoDifferentTypeRoot {
@Id String id;
String content;
@DBRef RefCycleLoadingIntoDifferentTypeIntermediate refToIntermediate;
}
@Data
@Document(collection = "cycle-with-different-type-intermediate")
static class RefCycleLoadingIntoDifferentTypeIntermediate {
@Id String id;
@DBRef RefCycleLoadingIntoDifferentTypeRootView refToRootView;
}
@Data
@Document(collection = "cycle-with-different-type-root")
static class RefCycleLoadingIntoDifferentTypeRootView {
@Id String id;
String content;
}
}

View File

@@ -3210,6 +3210,25 @@ public class MongoTemplateTests {
assertThat(loaded.getValue(), instanceOf(decimal128Type));
}
@Test // DATAMONGO-1718
public void findAndRemoveAllWithoutExplicitDomainTypeShouldRemoveAndReturnEntitiesCorrectly() {
Sample jon = new Sample("1", "jon snow");
Sample bran = new Sample("2", "bran stark");
Sample rickon = new Sample("3", "rickon stark");
template.save(jon);
template.save(bran);
template.save(rickon);
List<Sample> result = template.findAllAndRemove(query(where("field").regex(".*stark$")),
template.determineCollectionName(Sample.class));
assertThat(result, hasSize(2));
assertThat(result, containsInAnyOrder(bran, rickon));
assertThat(template.count(new BasicQuery("{}"), template.determineCollectionName(Sample.class)), is(equalTo(1L)));
}
static class TypeWithNumbers {
@Id String id;

View File

@@ -166,6 +166,32 @@ public class QueryByExampleTests {
assertThat(result, hasItems(p1, p2));
}
@Test // DATAMONGO-1768
public void typedExampleMatchesNothingIfTypesDoNotMatch() {
NotAPersonButStillMatchingFields probe = new NotAPersonButStillMatchingFields();
probe.lastname = "stark";
Query query = new Query(new Criteria().alike(Example.of(probe)));
List<Person> result = operations.find(query, Person.class);
assertThat(result, hasSize(0));
}
@Test // DATAMONGO-1768
public void untypedExampleMatchesCorrectly() {
NotAPersonButStillMatchingFields probe = new NotAPersonButStillMatchingFields();
probe.lastname = "stark";
Query query = new Query(
new Criteria().alike(Example.of(probe, ExampleMatcher.matching().withIgnorePaths("_class"))));
List<Person> result = operations.find(query, Person.class);
assertThat(result, hasSize(2));
assertThat(result, hasItems(p1, p3));
}
@Document(collection = "dramatis-personae")
@EqualsAndHashCode
@ToString
@@ -175,4 +201,12 @@ public class QueryByExampleTests {
String firstname, middlename;
@Field("last_name") String lastname;
}
@EqualsAndHashCode
@ToString
static class NotAPersonButStillMatchingFields {
String firstname, middlename;
@Field("last_name") String lastname;
}
}

View File

@@ -40,7 +40,7 @@ import com.mongodb.util.JSON;
/**
* Unit tests for {@link Aggregation}.
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
@@ -564,6 +564,16 @@ public class AggregationUnitTests {
assertThat(getAsDBObject(fields, "foosum"), isBsonObject().containing("$first.$cond.else", "no-answer"));
}
@Test // DATAMONGO-1756
public void projectOperationShouldRenderNestedFieldNamesCorrectly() {
DBObject agg = newAggregation(project().and("value1.value").plus("value2.value").as("val")).toDbObject("collection",
Aggregation.DEFAULT_CONTEXT);
assertThat((BasicDBObject) extractPipelineElement(agg, 0, "$project"), is(equalTo(new BasicDBObject("val",
new BasicDBObject("$add", new BasicDbListBuilder().add("$value1.value").add("$value2.value").get())))));
}
private DBObject extractPipelineElement(DBObject agg, int index, String operation) {
List<DBObject> pipeline = (List<DBObject>) agg.get("pipeline");

View File

@@ -47,6 +47,7 @@ import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.test.util.BasicDbListBuilder;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
@@ -54,10 +55,11 @@ import com.mongodb.util.JSON;
/**
* Unit tests for {@link TypeBasedAggregationOperationContext}.
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Mark Paluch
* @author Christoph Strobl
*/
@RunWith(MockitoJUnitRunner.class)
public class TypeBasedAggregationOperationContextUnitTests {
@@ -336,6 +338,19 @@ public class TypeBasedAggregationOperationContextUnitTests {
assertThat(age, isBsonObject().containing("$ifNull.[1]._class", Age.class.getName()));
}
@Test // DATAMONGO-1756
public void projectOperationShouldRenderNestedFieldNamesCorrectlyForTypedAggregation() {
AggregationOperationContext context = getContext(Wrapper.class);
DBObject agg = newAggregation(Wrapper.class, project().and("nested1.value1").plus("nested2.value2").as("val"))
.toDbObject("collection", context);
BasicDBObject project = (BasicDBObject) getPipelineElementFromAggregationAt(agg, 0).get("$project");
assertThat(project, is(equalTo(new BasicDBObject("val", new BasicDBObject("$add",
new BasicDbListBuilder().add("$nested1.value1").add("$field2.nestedValue2").get())))));
}
@Document(collection = "person")
public static class FooPerson {
@@ -406,4 +421,15 @@ public class TypeBasedAggregationOperationContextUnitTests {
String name;
}
static class Wrapper {
Nested nested1;
@org.springframework.data.mongodb.core.mapping.Field("field2") Nested nested2;
}
static class Nested {
String value1;
@org.springframework.data.mongodb.core.mapping.Field("nestedValue2") String value2;
}
}

View File

@@ -64,7 +64,7 @@ public class DefaultDbRefResolverUnitTests {
when(factoryMock.getDb()).thenReturn(dbMock);
when(dbMock.getCollection(anyString())).thenReturn(collectionMock);
when(collectionMock.find(Mockito.any(DBObject.class))).thenReturn(cursorMock);
when(cursorMock.toArray()).thenReturn(Collections.<DBObject>emptyList());
when(cursorMock.toArray()).thenReturn(Collections.<DBObject> emptyList());
resolver = new DefaultDbRefResolver(factoryMock);
}
@@ -100,7 +100,7 @@ public class DefaultDbRefResolverUnitTests {
@Test // DATAMONGO-1194
public void bulkFetchShouldReturnEarlyForEmptyLists() {
resolver.bulkFetch(Collections.<DBRef>emptyList());
resolver.bulkFetch(Collections.<DBRef> emptyList());
verify(collectionMock, never()).find(Mockito.any(DBObject.class));
}
@@ -118,4 +118,17 @@ public class DefaultDbRefResolverUnitTests {
assertThat(resolver.bulkFetch(Arrays.asList(ref1, ref2)), contains(o1, o2));
}
@Test // DATAMONGO-1765
public void bulkFetchContainsDuplicates() {
DBObject document = new BasicDBObject("_id", new ObjectId());
DBRef ref1 = new DBRef("collection-1", document.get("_id"));
DBRef ref2 = new DBRef("collection-1", document.get("_id"));
when(cursorMock.toArray()).thenReturn(Arrays.asList(document));
assertThat(resolver.bulkFetch(Arrays.asList(ref1, ref2)), contains(document, document));
}
}

View File

@@ -115,11 +115,15 @@ public class MappingMongoConverterUnitTests {
@Before
public void setUp() {
CustomConversions conversions = new CustomConversions();
mappingContext = new MongoMappingContext();
mappingContext.setApplicationContext(context);
mappingContext.setSimpleTypeHolder(conversions.getSimpleTypeHolder());
mappingContext.afterPropertiesSet();
converter = new MappingMongoConverter(resolver, mappingContext);
converter.setCustomConversions(conversions);
converter.afterPropertiesSet();
}

View File

@@ -24,6 +24,7 @@ import static org.springframework.data.mongodb.test.util.IsBsonObject.*;
import java.util.Arrays;
import java.util.List;
import java.util.Set;
import java.util.regex.Pattern;
import org.bson.BSONObject;
@@ -36,8 +37,7 @@ import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.data.annotation.Id;
import org.springframework.data.domain.Example;
import org.springframework.data.domain.ExampleMatcher;
import org.springframework.data.domain.ExampleMatcher.GenericPropertyMatchers;
import org.springframework.data.domain.ExampleMatcher.StringMatcher;
import org.springframework.data.domain.ExampleMatcher.*;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.QueryMapperUnitTests.ClassWithGeoTypes;
@@ -47,6 +47,7 @@ import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.test.util.IsBsonObject;
import org.springframework.data.util.TypeInformation;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
@@ -288,7 +289,7 @@ public class MongoExampleMapperUnitTests {
DBObject dbo = mapper.getMappedExample(of(probe), context.getPersistentEntity(WithDBRef.class));
com.mongodb.DBRef reference = getTypedValue(dbo, "referenceDocument", com.mongodb.DBRef.class);
assertThat(reference.getId(), Is.<Object>is("200"));
assertThat(reference.getId(), Is.<Object> is("200"));
assertThat(reference.getCollectionName(), is("refDoc"));
}
@@ -311,8 +312,8 @@ public class MongoExampleMapperUnitTests {
DBObject dbo = mapper.getMappedExample(of(probe), context.getPersistentEntity(WithDBRef.class));
assertThat(dbo.get("legacyPoint.x"), Is.<Object>is(10D));
assertThat(dbo.get("legacyPoint.y"), Is.<Object>is(20D));
assertThat(dbo.get("legacyPoint.x"), Is.<Object> is(10D));
assertThat(dbo.get("legacyPoint.y"), Is.<Object> is(20D));
}
@Test // DATAMONGO-1245
@@ -426,6 +427,52 @@ public class MongoExampleMapperUnitTests {
assertThat(mapper.getMappedExample(example), isBsonObject().containing("$or").containing("_class"));
}
@Test // DATAMONGO-1768
public void allowIgnoringTypeRestrictionBySettingUpTypeKeyAsAnIgnoredPath() {
WrapperDocument probe = new WrapperDocument();
probe.flatDoc = new FlatDocument();
probe.flatDoc.stringValue = "conflux";
DBObject dbo = mapper.getMappedExample(Example.of(probe, ExampleMatcher.matching().withIgnorePaths("_class")));
assertThat(dbo, isBsonObject().notContaining("_class"));
}
@Test // DATAMONGO-1768
public void allowIgnoringTypeRestrictionBySettingUpTypeKeyAsAnIgnoredPathWhenUsingCustomTypeMapper() {
WrapperDocument probe = new WrapperDocument();
probe.flatDoc = new FlatDocument();
probe.flatDoc.stringValue = "conflux";
MappingMongoConverter mappingMongoConverter = new MappingMongoConverter(new DefaultDbRefResolver(factory), context);
mappingMongoConverter.setTypeMapper(new DefaultMongoTypeMapper() {
@Override
public boolean isTypeKey(String key) {
return "_foo".equals(key);
}
@Override
public void writeTypeRestrictions(DBObject dbo, Set<Class<?>> restrictedTypes) {
dbo.put("_foo", "bar");
}
@Override
public void writeType(TypeInformation<?> info, DBObject sink) {
sink.put("_foo", "bar");
}
});
mappingMongoConverter.afterPropertiesSet();
DBObject dbo = new MongoExampleMapper(mappingMongoConverter)
.getMappedExample(Example.of(probe, ExampleMatcher.matching().withIgnorePaths("_foo")));
assertThat(dbo, isBsonObject().notContaining("_class").notContaining("_foo"));
}
static class FlatDocument {
@Id String id;

View File

@@ -0,0 +1,104 @@
/*
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import static org.hamcrest.MatcherAssert.*;
import static org.hamcrest.Matchers.*;
import org.junit.Before;
import org.junit.Test;
import org.springframework.data.mongodb.core.mapping.BasicMongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.util.ClassTypeInformation;
/**
* Unit tests for {@link ObjectPath}.
*
* @author Christoph Strobl
*/
public class ObjectPathUnitTests {
MongoPersistentEntity<EntityOne> one;
MongoPersistentEntity<EntityTwo> two;
MongoPersistentEntity<EntityThree> three;
@Before
public void setUp() {
one = new BasicMongoPersistentEntity(ClassTypeInformation.from(EntityOne.class));
two = new BasicMongoPersistentEntity(ClassTypeInformation.from(EntityTwo.class));
three = new BasicMongoPersistentEntity(ClassTypeInformation.from(EntityThree.class));
}
@Test // DATAMONGO-1703
public void getPathItemShouldReturnMatch() {
ObjectPath path = ObjectPath.ROOT.push(new EntityOne(), one, "id-1");
assertThat(path.getPathItem("id-1", "one", EntityOne.class), is(notNullValue()));
}
@Test // DATAMONGO-1703
public void getPathItemShouldReturnNullWhenNoTypeMatchFound() {
ObjectPath path = ObjectPath.ROOT.push(new EntityOne(), one, "id-1");
assertThat(path.getPathItem("id-1", "one", EntityThree.class), is(nullValue()));
}
@Test // DATAMONGO-1703
public void getPathItemShouldReturnCachedItemWhenIdAndCollectionMatchAndIsAssignable() {
ObjectPath path = ObjectPath.ROOT.push(new EntityTwo(), one, "id-1");
assertThat(path.getPathItem("id-1", "one", EntityOne.class), is(notNullValue()));
}
@Test // DATAMONGO-1703
public void getPathItemShouldReturnNullWhenIdAndCollectionMatchButNotAssignable() {
ObjectPath path = ObjectPath.ROOT.push(new EntityOne(), one, "id-1");
assertThat(path.getPathItem("id-1", "one", EntityTwo.class), is(nullValue()));
}
@Test // DATAMONGO-1703
public void getPathItemShouldReturnObjectWhenIdAndCollectionMatchAndAssignableToInterface() {
ObjectPath path = ObjectPath.ROOT.push(new EntityThree(), one, "id-1");
assertThat(path.getPathItem("id-1", "one", ValueInterface.class), is(notNullValue()));
}
@Document(collection = "one")
static class EntityOne {
}
static class EntityTwo extends EntityOne {
}
interface ValueInterface {
}
@Document(collection = "three")
static class EntityThree implements ValueInterface {
}
}

View File

@@ -901,6 +901,34 @@ public class UpdateMapperUnitTests {
}
}
@Test // DATAMONGO-1772
public void mappingShouldAddTypeKeyInListOfInterfaceTypeContainedInConcreteObjectCorrectly() {
ConcreteInner inner = new ConcreteInner();
inner.interfaceTypeList = Collections.<SomeInterfaceType> singletonList(new SomeInterfaceImpl());
List<ConcreteInner> list = Collections.singletonList(inner);
DBObject mappedUpdate = mapper.getMappedObject(new Update().set("concreteInnerList", list).getUpdateObject(),
context.getPersistentEntity(Outer.class));
assertThat(mappedUpdate, isBsonObject().containing("$set.concreteInnerList.[0].interfaceTypeList.[0]._class")
.notContaining("$set.concreteInnerList.[0]._class"));
}
@Test // DATAMONGO-1772
public void mappingShouldAddTypeKeyInListOfAbstractTypeContainedInConcreteObjectCorrectly() {
ConcreteInner inner = new ConcreteInner();
inner.abstractTypeList = Collections.<SomeAbstractType> singletonList(new SomeInterfaceImpl());
List<ConcreteInner> list = Collections.singletonList(inner);
DBObject mappedUpdate = mapper.getMappedObject(new Update().set("concreteInnerList", list).getUpdateObject(),
context.getPersistentEntity(Outer.class));
assertThat(mappedUpdate, isBsonObject().containing("$set.concreteInnerList.[0].abstractTypeList.[0]._class")
.notContaining("$set.concreteInnerList.[0]._class"));
}
static class DomainTypeWrappingConcreteyTypeHavingListOfInterfaceTypeAttributes {
ListModelWrapper concreteTypeWithListAttributeOfInterfaceType;
}
@@ -1187,4 +1215,26 @@ public class UpdateMapperUnitTests {
Integer intValue;
int primIntValue;
}
static class Outer {
List<ConcreteInner> concreteInnerList;
}
static class ConcreteInner {
List<SomeInterfaceType> interfaceTypeList;
List<SomeAbstractType> abstractTypeList;
}
interface SomeInterfaceType {
}
static abstract class SomeAbstractType {
}
static class SomeInterfaceImpl extends SomeAbstractType implements SomeInterfaceType {
}
}

View File

@@ -20,6 +20,8 @@ import static org.junit.Assert.*;
import java.util.Collection;
import org.junit.Test;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.DefaultListableBeanFactory;
import org.springframework.core.env.Environment;
import org.springframework.core.env.StandardEnvironment;
import org.springframework.core.io.ResourceLoader;
@@ -43,8 +45,10 @@ public class MongoRepositoryConfigurationExtensionUnitTests {
StandardAnnotationMetadata metadata = new StandardAnnotationMetadata(Config.class, true);
ResourceLoader loader = new PathMatchingResourcePatternResolver();
Environment environment = new StandardEnvironment();
BeanDefinitionRegistry registry = new DefaultListableBeanFactory();
RepositoryConfigurationSource configurationSource = new AnnotationRepositoryConfigurationSource(metadata,
EnableMongoRepositories.class, loader, environment);
EnableMongoRepositories.class, loader, environment, registry);
@Test // DATAMONGO-1009
public void isStrictMatchIfDomainTypeIsAnnotatedWithDocument() {

View File

@@ -30,6 +30,7 @@ import org.junit.rules.ExpectedException;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
@@ -173,6 +174,14 @@ public class PartTreeMongoQueryUnitTests {
assertThat(query.getFieldsObject().get("firstname"), is((Object) 1));
}
@Test // DATAMONGO-1729
public void doesNotCreateFieldsObjectForOpenProjection() {
org.springframework.data.mongodb.core.query.Query query = deriveQueryFromMethod("findAllBy");
assertThat(query.getFieldsObject(), is(nullValue()));
}
private org.springframework.data.mongodb.core.query.Query deriveQueryFromMethod(String method, Object... args) {
Class<?>[] types = new Class<?>[args.length];
@@ -233,6 +242,8 @@ public class PartTreeMongoQueryUnitTests {
@Query(fields = "{ 'firstname' : 1 }")
List<Person> findBySex(Sex sex);
OpenProjection findAllBy();
}
interface PersonProjection {
@@ -257,4 +268,12 @@ public class PartTreeMongoQueryUnitTests {
this.lastname = lastname;
}
}
interface OpenProjection {
String getFirstname();
@Value("#{target.firstname + ' ' + target.lastname}")
String getFullname();
}
}

View File

@@ -1,6 +1,129 @@
Spring Data MongoDB Changelog
=============================
Changes in version 1.10.7.RELEASE (2017-09-11)
----------------------------------------------
* DATAMONGO-1772 - Type hint not added when updating nested list elements with inheritance.
* DATAMONGO-1768 - QueryByExample FindOne : probe type.
* DATAMONGO-1765 - Duplicate elements in DBRefs list not correctly mapped.
* DATAMONGO-1756 - Aggregation project and arithmetic operation not working with nested fields.
* DATAMONGO-1755 - Release 1.10.7 (Ingalls SR7).
Changes in version 1.10.6.RELEASE (2017-07-26)
----------------------------------------------
* DATAMONGO-1750 - Release 1.10.6 (Ingalls SR6).
Changes in version 2.0.0.RC2 (2017-07-25)
-----------------------------------------
* DATAMONGO-1753 - IndexEnsuringQueryCreationListener should skip queries without criteria.
* DATAMONGO-1752 - Executing repository methods with closed projection fails.
* DATAMONGO-1751 - Release 2.0 RC2 (Kay).
Changes in version 2.0.0.RC1 (2017-07-25)
-----------------------------------------
* DATAMONGO-1748 - Add Kotlin extensions for Criteria API.
* DATAMONGO-1746 - Inherit Project Reactor version from dependency management.
* DATAMONGO-1744 - Improve default setup for MappingMongoConverter.
* DATAMONGO-1739 - Change TerminatingFindOperation.stream() to return a Stream directly.
* DATAMONGO-1738 - Move to fluent API for repository query execution.
* DATAMONGO-1735 - Sort and fields objects in Query should not be null.
* DATAMONGO-1734 - Add count() & exists to fluent API.
* DATAMONGO-1733 - Allow usage of projection interfaces in FluentMongoOperations.
* DATAMONGO-1730 - Adapt to API changes in mapping subsystem.
* DATAMONGO-1729 - Open projection does not fetch all properties.
* DATAMONGO-1728 - ExecutableFindOperation.find(…).first() fails with NPE.
* DATAMONGO-1726 - Add terminating findOne/findFirst methods to FluentMongoOperations returning null value instead of Optional.
* DATAMONGO-1725 - Potential NullPointerException in CloseableIterableCursorAdapter.
* DATAMONGO-1723 - Fix unit tests after API changes in Spring Data Commons.
* DATAMONGO-1721 - Fix dependency cycles.
* DATAMONGO-1720 - Add JMH benchmark module.
* DATAMONGO-1719 - Add fluent alternative for ReactiveMongoOperations.
* DATAMONGO-1718 - MongoTemplate.findAndRemoveAll(Query, String) delegates to wrong overload.
* DATAMONGO-1717 - Release 2.0 RC1 (Kay).
* DATAMONGO-1715 - Remove spring-data-mongodb-log4j module.
* DATAMONGO-1713 - MongoCredentialPropertyEditor improperly resolves the credential string.
* DATAMONGO-1705 - Deprecate cross-store support.
* DATAMONGO-1703 - Allow referencing views in object graphs containing circular dependencies.
* DATAMONGO-1702 - Switch repository implementation to use fragments.
* DATAMONGO-1697 - @Version used by @EnableMongoAuditing does not increase when using collection name in MongoTemplate's updateFirst. Unexpected, not described by reference documentation.
* DATAMONGO-1682 - Add partial index support to ReactiveIndexOperations.
* DATAMONGO-1678 - DefaultBulkOperations do not map Query and Update objects properly.
* DATAMONGO-1646 - Support reactive aggregation streaming.
* DATAMONGO-1519 - Change MongoTemplate.insertDBObjectList(…) to return List<Object> instead of List<ObjectId>.
Changes in version 1.10.5.RELEASE (2017-07-24)
----------------------------------------------
* DATAMONGO-1744 - Improve default setup for MappingMongoConverter.
* DATAMONGO-1729 - Open projection does not fetch all properties.
* DATAMONGO-1725 - Potential NullPointerException in CloseableIterableCursorAdapter.
* DATAMONGO-1723 - Fix unit tests after API changes in Spring Data Commons.
* DATAMONGO-1720 - Add JMH benchmark module.
* DATAMONGO-1718 - MongoTemplate.findAndRemoveAll(Query, String) delegates to wrong overload.
* DATAMONGO-1711 - Release 1.10.5 (Ingalls SR5).
* DATAMONGO-1703 - Allow referencing views in object graphs containing circular dependencies.
* DATAMONGO-1697 - @Version used by @EnableMongoAuditing does not increase when using collection name in MongoTemplate's updateFirst. Unexpected, not described by reference documentation.
* DATAMONGO-1678 - DefaultBulkOperations do not map Query and Update objects properly.
Changes in version 2.0.0.M4 (2017-06-14)
----------------------------------------
* DATAMONGO-1716 - Upgrade to Reactive Streams driver 1.5.0.
* DATAMONGO-1714 - Deprecate MongoLog4jAppender.
* DATAMONGO-1712 - Adapt to ReactiveCrudRepository.findById(Publisher) and existsById(Publisher).
* DATAMONGO-1710 - Adapt to changed AnnotationUtils.getValue(…) and OperatorNode.getRightOperand() behavior.
* DATAMONGO-1707 - Upgrade to Reactor 3.1 M2.
* DATAMONGO-1699 - Upgrade travis.yml to use MongoDB 3.4.
* DATAMONGO-1695 - Make sure GridFsResource.getContentType() reads type from new location within file metadata.
* DATAMONGO-1693 - Support collation in ReactiveMongoTemplate.createCollection.
* DATAMONGO-1690 - Adapt to QuerydslPredicateExecutor API changes.
* DATAMONGO-1689 - Provide Kotlin extensions for Class based methods in MongoOperations / ReactiveMongoOperations.
* DATAMONGO-1688 - Release 2.0 M4 (Kay).
* DATAMONGO-1687 - Creating capped collection with CollectionOptions.empty().capped(…) causes NPE.
* DATAMONGO-1686 - Upgrade to MongoDB reactive streams driver 1.4.
* DATAMONGO-1685 - Adapt QueryByExampleExecutor API changes.
* DATAMONGO-1619 - Use ReactiveQueryByExampleExecutor in ReactiveMongoRepository.
* DATAMONGO-1563 - Add TemplateWrapper to reduce method overloads on MongoTemplate.
Changes in version 1.10.4.RELEASE (2017-06-08)
----------------------------------------------
* DATAMONGO-1699 - Upgrade travis.yml to use MongoDB 3.4.
* DATAMONGO-1672 - Release 1.10.4 (Ingalls SR4).
* DATAMONGO-1205 - CyclicPropertyReferenceException logged with stack trace.
Changes in version 1.9.11.RELEASE (2017-06-07)
----------------------------------------------
* DATAMONGO-1671 - Release 1.9.11 (Hopper SR11).
* DATAMONGO-1205 - CyclicPropertyReferenceException logged with stack trace.
Changes in version 2.0.0.M3 (2017-05-09)
----------------------------------------
* DATAMONGO-1684 - Adapt documentation to removed JodaTime DateMidnight support.
* DATAMONGO-1679 - Adapt to API changes in CrudRepository.
* DATAMONGO-1674 - Adapt to Range API changes.
* DATAMONGO-1668 - Adapt to changed Mono and Flux error handling API.
* DATAMONGO-1667 - Rename @InfiniteStream to @Tailable.
* DATAMONGO-1666 - Constructor creation with bulk fetching of DBRefs uses List instead of collection type.
* DATAMONGO-1665 - Adapt to API changes in Reactor 3.1.
* DATAMONGO-1664 - Release 2.0 M3 (Kay).
* DATAMONGO-1662 - Section "Projection Expressions" contains error "Aggregate".
* DATAMONGO-1660 - Extract CustomConversions into Spring Data Commons.
* DATAMONGO-1518 - Add support for Collations.
* DATAMONGO-1325 - Add support for $sample to aggregation.
* DATAMONGO-1205 - CyclicPropertyReferenceException logged with stack trace.
Changes in version 1.9.10.RELEASE (2017-04-19)
----------------------------------------------
* DATAMONGO-1670 - Release 1.9.10 (Hopper SR10).
Changes in version 1.10.3.RELEASE (2017-04-19)
----------------------------------------------
* DATAMONGO-1669 - Release 1.10.3 (Ingalls SR3).

View File

@@ -1,4 +1,4 @@
Spring Data MongoDB 1.10.3
Spring Data MongoDB 1.10.7
Copyright (c) [2010-2015] Pivotal Software, Inc.
This product is licensed to you under the Apache License, Version 2.0 (the "License").