Compare commits


77 Commits

Author SHA1 Message Date
Mark Paluch
a3882a5e5c DATAMONGO-2187 - Release version 2.1.5 (Lovelace SR5). 2019-02-13 09:56:38 +01:00
Mark Paluch
8194772388 DATAMONGO-2187 - Prepare 2.1.5 (Lovelace SR5). 2019-02-13 09:55:35 +01:00
Mark Paluch
12f18850dc DATAMONGO-2187 - Updated changelog. 2019-02-13 09:55:29 +01:00
Mark Paluch
816c1da248 DATAMONGO-2196 - Polishing.
Fix stubbing in a test that sneaked in through a back port that did not account for the change in which method is used for entity removal.

Original pull request: #641.
2019-02-12 10:47:12 +01:00
Christoph Strobl
5a78f19781 DATAMONGO-2196 - Remove now applies WriteConcern to single Document delete operations.
We now make sure to apply the WriteConcern correctly when calling deleteOne on MongoCollection.

Original pull request: #641.
2019-02-07 15:20:36 +01:00
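
A minimal sketch of the driver-level call this affects; the collection name, database handle, and id value are hypothetical:

    import com.mongodb.WriteConcern;
    import com.mongodb.client.MongoCollection;
    import com.mongodb.client.model.Filters;
    import org.bson.Document;
    import org.bson.types.ObjectId;

    // The template now routes the configured WriteConcern through
    // withWriteConcern(...) before invoking deleteOne(...).
    MongoCollection<Document> collection = database.getCollection("person"); // database is assumed
    collection.withWriteConcern(WriteConcern.MAJORITY)
            .deleteOne(Filters.eq("_id", new ObjectId("5c58e1a234c4b1a2b3c4d5e6")));
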
Mark Paluch
698837921b DATAMONGO-2193 - Polishing.
Reformat code.

Original pull request: #640.
2019-02-05 11:44:03 +01:00
Christoph Strobl
0f7fc7880b DATAMONGO-2193 - Fix String <> ObjectId conversion for non-Id properties.
We now make sure to only convert valid ObjectId Strings if the property can be considered an id property.

Original pull request: #640.
2019-02-05 11:43:55 +01:00
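
A sketch of the distinction, assuming a hypothetical entity whose id maps to an ObjectId and which also has a plain String property holding a 24-character hex value:

    import org.springframework.data.mongodb.core.query.Criteria;
    import org.springframework.data.mongodb.core.query.Query;

    // "id" is the id property, "code" is not (both hypothetical).
    Query byId = Query.query(Criteria.where("id").is("5c58e1a234c4b1a2b3c4d5e6"));     // value mapped to ObjectId
    Query byCode = Query.query(Criteria.where("code").is("5c58e1a234c4b1a2b3c4d5e6")); // value now stays a String
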
Christoph Strobl
6e42f49b08 DATAMONGO-2189 - Polishing.
Assert that the returned object is not the same as the saved one, and move a helper method.

Original Pull Request: #638
2019-01-28 13:41:27 +01:00
Mark Paluch
bdfe4e99ed DATAMONGO-2189 - Fix AfterSaveEvent to contain the saved entity in ReactiveMongoTemplate.insert(…).
ReactiveMongoTemplate.insert(…) now uses the saved entity when emitting AfterSaveEvent. This change affects immutable objects that use Id generation. Previously, the to-be-saved entity instance was used, which left the Id unpopulated.

Original Pull Request: #638
2019-01-28 11:50:43 +01:00
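
A sketch of where the difference becomes visible, assuming a hypothetical immutable Person document with a generated id:

    import org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener;
    import org.springframework.data.mongodb.core.mapping.event.AfterSaveEvent;

    public class PersonSaveListener extends AbstractMongoEventListener<Person> {

        @Override
        public void onAfterSave(AfterSaveEvent<Person> event) {
            // With the fix, event.getSource() is the saved instance, so the
            // generated id is populated even for immutable entities.
            Person saved = event.getSource();
        }
    }
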
Mark Paluch
85aa3927a6 DATAMONGO-2145 - After release cleanups. 2019-01-10 13:48:12 +01:00
Mark Paluch
33c4e4294f DATAMONGO-2145 - Prepare next development iteration. 2019-01-10 13:48:10 +01:00
Mark Paluch
a89ab387cc DATAMONGO-2145 - Release version 2.1.4 (Lovelace SR4). 2019-01-10 12:35:56 +01:00
Mark Paluch
e52b8c9d38 DATAMONGO-2145 - Prepare 2.1.4 (Lovelace SR4). 2019-01-10 12:34:54 +01:00
Mark Paluch
4dbf4795db DATAMONGO-2145 - Updated changelog. 2019-01-10 12:34:48 +01:00
Mark Paluch
8e4c6f68ae DATAMONGO-2144 - Updated changelog. 2019-01-10 12:26:36 +01:00
Mark Paluch
fddbd126ea DATAMONGO-2143 - Updated changelog. 2019-01-10 11:01:21 +01:00
Christoph Strobl
ee5b26ab1c DATAMONGO-2168 - Polishing.
MetadataBackedField no longer fails when the Path detects a reference to a field within java.lang.Class. This can happen when splitting the property name via camel case: the first part matches 'class', which resolves to the getClass() call on java.lang.Object. If the second part then also maps to a method on Class (like getName()), an error was thrown.

Original Pull Request: #631
2019-01-09 17:06:29 +01:00
Mark Paluch
01e9a2ed67 DATAMONGO-2168 - Convert assertions to AssertJ.
Original Pull Request: #631
2019-01-09 17:06:29 +01:00
Mark Paluch
10107c7b81 DATAMONGO-2168 - Do not map type key in QueryMapper.
QueryMapper now excludes the type key during mapping and retains the value as-is. This change fixes an issue in which the mapper attempted to map type keys using the entity model. Type key resolution for _class failed silently, while other type keys such as className failed in property path resolution and made the failure visible.

Original Pull Request: #631
2019-01-09 17:06:29 +01:00
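
For example, a query referencing the type key directly is now passed through as-is; Person and the "_class" key are the defaults assumed here:

    import org.springframework.data.mongodb.core.query.Criteria;
    import org.springframework.data.mongodb.core.query.Query;

    // The type key is excluded from entity mapping, so its value is retained
    // instead of being resolved as a property path.
    Query byType = Query.query(Criteria.where("_class").is(Person.class.getName()));
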
Mark Paluch
abe7876086 DATAMONGO-2155 - Polishing.
Reduce visibility of MappedUpdate to package-protected to avoid exposure. Rename UpdateDefinition.incVersion() to inc(). Reintroduce doUpdate() methods accepting Update and delegating to the new doUpdate() methods to preserve binary compatibility.

Original pull request: #625.
2019-01-09 16:35:19 +01:00
Christoph Strobl
a759dff5fd DATAMONGO-2155 - Introduce UpdateDefinition.
Original pull request: #625.
2019-01-09 16:35:19 +01:00
Oliver Drotbohm
9f8d081ef3 DATAMONGO-2155 - Polishing.
Original pull request: #625.
2019-01-09 16:35:19 +01:00
Christoph Strobl
b8f6030441 DATAMONGO-2155 - Bypass mapping for already mapped updates.
We now make sure that mapped updates (as in doSaveVersioned and doUpdate) are mapped only once, as mapping is required only once. Mapping already-mapped query/update objects comes with undesired side effects such as following invalid property paths or reduced availability of type information.

We now make sure to map key/value pairs of Map-like properties to the value's domain type and apply potentially registered custom converters to the keys.
Fixed an invalid test for DATAMONGO-1423 that did not check the application of the registered converter.

Original pull request: #625.
2019-01-09 16:33:31 +01:00
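
Roughly, the user-facing Update type implements the new UpdateDefinition contract, which lets the template distinguish a user-supplied update from one that has already been run through the mapper; a minimal sketch (field names hypothetical):

    import org.springframework.data.mongodb.core.query.Update;
    import org.springframework.data.mongodb.core.query.UpdateDefinition;

    // Update implements UpdateDefinition; the template maps it exactly once
    // and works with the already-mapped representation internally.
    UpdateDefinition update = new Update().set("name", "Walter").inc("visits", 1);
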
Mark Paluch
267decf189 DATAMONGO-2173 - Polishing.
Set interrupted thread state after catching InterruptedException. Fix potential NPE by checking the cursor. Streamline generics to not hide class-level generic types.

Original pull request: #634.
2019-01-09 13:00:10 +01:00
Christoph Strobl
3a7492c68d DATAMONGO-2173 - Translate and forward exceptions during CursorReadingTask#start() to ErrorHandler.
We now make sure to translate and pass on errors during the cursor initialization procedure to the configured error handler.

Original pull request: #634.
2019-01-09 13:00:10 +01:00
Christoph Strobl
273088b6a8 DATAMONGO-2174 - Fix InvalidPersistentPropertyPath exception when updating documents.
MetadataBackedField.getPath() now returns null instead of throwing an error for fields that are not part of the domain model. This allows adding any field when updating an entity.

Original pull request: #633.
2019-01-09 10:34:48 +01:00
Mark Paluch
723b481f82 DATAMONGO-2179 - Fixed broken auditing for entities using optimistic locking via batch save.
The previous implementation of (Reactive)MongoTemplate.doInsertBatch(…) prematurely initialized the version property so that the entity wasn't considered new by the auditing subsystem. Even worse, for primitive version properties, the initialization kept the property at a value of 0, so that the just-persisted entity was still considered new. This meant that, via the repository route, inserts were triggered even for subsequent attempts to save an entity, which caused duplicate key exceptions.

We now make sure we fire the BeforeConvertEvent before the version property is initialized or updated. Also, the initialization of the property now sets primitive properties to 1 initially.

Related tickets: DATAMONGO-2139, DATAMONGO-2150.

Original Pull Request: #632
2019-01-08 08:22:38 +01:00
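
A sketch of an affected entity shape (names hypothetical):

    import java.time.Instant;
    import org.springframework.data.annotation.CreatedDate;
    import org.springframework.data.annotation.Id;
    import org.springframework.data.annotation.Version;

    class Account {

        @Id String id;

        // Primitive version property: now initialized to 1 on the first insert,
        // so the entity is no longer treated as new on subsequent saves.
        @Version long version;

        // Populated because BeforeConvertEvent now fires while the entity is
        // still considered new by the auditing subsystem.
        @CreatedDate Instant createdAt;
    }
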
Mark Paluch
8a34bc46a2 DATAMONGO-2181 - Consider repository collection name in ReactiveMongoRepository.saveAll(…).
We now consider the collection name that is bound to the repository when inserting entities that are all new. Previously, the collection name was derived from the entity.

Original Pull Request: #632
2019-01-08 08:15:41 +01:00
Mark Paluch
bb4c16f4cd DATAMONGO-2170 - Polishing.
Use ObjectUtils to compute the hash code, as the previous hash code implementation contained artifacts that do not belong there. Extract test method.

Original pull request: #629.
2019-01-07 13:11:02 +01:00
Christoph Strobl
cf5b7c9763 DATAMONGO-2170 - Return null instead of empty string for IndexInfo#getPartialFilterExpression when not set.
We now return null instead of an empty string when calling IndexInfo#getPartialFilterExpression. The method has been marked as returning null values before, and we now comply with that contract and return value expectation.

Original pull request: #629.
2019-01-07 13:11:02 +01:00
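
Callers should guard accordingly; a sketch with an assumed IndexInfo instance:

    // getPartialFilterExpression() is declared @Nullable and now actually returns
    // null when no partial filter expression is set, instead of an empty String.
    String filter = indexInfo.getPartialFilterExpression();
    if (filter != null) {
        // inspect the filter expression
    }
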
Mark Paluch
f4414e98a2 DATAMONGO-2175 - Update copyright years to 2019. 2019-01-02 14:11:51 +01:00
Christoph Strobl
a97bfd2a37 DATAMONGO-2160 - Updated changelog. 2018-12-11 11:43:12 +01:00
Mark Paluch
9fe0f5c984 DATAMONGO-2150 - Polishing.
Fix imperative auditing test to use intended persist mechanism. Remove final keywords from method args and local variables in ReactiveMongoTemplate. Rename DBObject to Document.

Original Pull Request: #627
2018-12-07 14:39:10 +01:00
Mark Paluch
718a7ffe8c DATAMONGO-2150 - Fixed broken auditing for entities using optimistic locking.
The previous implementation of ReactiveMongoTemplate.doSaveVersioned(…) prematurely initialized the version property so that the entity wasn't considered new by the auditing subsystem. Even worse, for primitive version properties, the initialization kept the property at a value of 0, so that the just-persisted entity was still considered new. This meant that, via the repository route, inserts were triggered even for subsequent attempts to save an entity, which caused duplicate key exceptions.

We now make sure we fire the BeforeConvertEvent before the version property is initialized or updated. Also, the initialization of the property now sets primitive properties to 1 initially.

Added integration tests for the auditing via ReactiveMongoTemplate and repositories.

Related ticket: DATAMONGO-2139.

Original Pull Request: #627
2018-12-07 14:38:59 +01:00
Mark Paluch
f7106dc425 DATAMONGO-2156 - Polishing.
Original pull request: #626.
2018-12-05 14:40:13 +01:00
Mark Paluch
0698f8bcb8 DATAMONGO-2156 - Remove dependency to javax.xml.bind.
We no longer use DatatypeConverter to convert byte[] to its Base64 representation but use Spring Framework's Base64 utils instead.

Original pull request: #626.
2018-12-05 14:40:13 +01:00
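
A sketch of the replacement, using Spring Framework's Base64Utils in place of javax.xml.bind.DatatypeConverter:

    import org.springframework.util.Base64Utils;

    byte[] bytes = { 0x0A, 0x0B };
    // Replaces DatatypeConverter.printBase64Binary(bytes) and drops the
    // dependency on javax.xml.bind, which is no longer part of the JDK.
    String base64 = Base64Utils.encodeToString(bytes);
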
Mark Paluch
3effd9ae6f DATAMONGO-2149 - Polishing.
Add ticket reference to follow-up ticket regarding array matching on partial DBRef expressions.

Related ticket: DATAMONGO-2154

Original pull request: #623.
2018-11-30 14:53:56 +01:00
Christoph Strobl
7002cd1456 DATAMONGO-2149 - Fix $slice in fields projection when pointing to array of DBRefs.
We no longer try to convert the actual slice parameters into a DBRef.

Original pull request: #623.
2018-11-30 14:52:58 +01:00
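
For example (field name hypothetical), slicing a projected array of DBRefs:

    import org.springframework.data.mongodb.core.query.Query;

    // "friends" is assumed to be a List of DBRefs; the slice arguments (offset
    // and size) are plain numbers and are no longer converted into DBRefs.
    Query query = new Query();
    query.fields().slice("friends", 0, 2);
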
Mark Paluch
a15d488657 DATAMONGO-2148 - Polishing.
Add author tag. Add logging for ReactiveMongoTemplate.count(…) and findDistinct(…) operations. Fix variable names.

Original pull request: #620.
2018-11-28 17:24:52 +01:00
Cimon Lucas (LCM)
44651581b1 DATAMONGO-2148 - Add query logging for MongoTemplate.count(…).
Original pull request: #620.
2018-11-28 17:24:46 +01:00
Mark Paluch
6d64f5b2b2 DATAMONGO-2121 - After release cleanups. 2018-11-27 14:23:35 +01:00
Mark Paluch
0c52a29ba8 DATAMONGO-2121 - Prepare next development iteration. 2018-11-27 14:23:33 +01:00
Mark Paluch
bd8bd4f568 DATAMONGO-2121 - Release version 2.1.3 (Lovelace SR3). 2018-11-27 13:43:24 +01:00
Mark Paluch
c75f29dc42 DATAMONGO-2121 - Prepare 2.1.3 (Lovelace SR3). 2018-11-27 13:42:17 +01:00
Mark Paluch
e493af7266 DATAMONGO-2121 - Updated changelog. 2018-11-27 13:42:07 +01:00
Mark Paluch
8d892e5924 DATAMONGO-2109 - Updated changelog. 2018-11-27 12:36:47 +01:00
Mark Paluch
053299f243 DATAMONGO-2110 - Updated changelog. 2018-11-27 11:27:21 +01:00
Mark Paluch
872659cc00 DATAMONGO-2119 - Polishing.
Convert anonymous JSON callback class into a private static one. Use an expressive Pattern constant.

Original pull request: #621.
2018-11-23 09:48:26 +01:00
Christoph Strobl
96978a6194 DATAMONGO-2119 - Allow SpEL usage for annotated $regex query.
Original pull request: #621.
2018-11-23 09:48:26 +01:00
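
A sketch of the now-supported usage in a repository (entity and method hypothetical):

    import java.util.List;
    import org.springframework.data.mongodb.repository.MongoRepository;
    import org.springframework.data.mongodb.repository.Query;

    interface PersonRepository extends MongoRepository<Person, String> {

        // SpEL can now be used inside an annotated $regex query;
        // ?#{[0]} refers to the first method argument.
        @Query("{ 'name' : { '$regex' : ?#{[0]} } }")
        List<Person> findByNameRegex(String pattern);
    }
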
Oliver Drotbohm
2253d3e301 DATAMONGO-2108 - Fixed broken auditing for entities using optimistic locking.
The previous implementation of MongoTemplate.doSaveVersioned(…) prematurely initialized the version property so that the entity wasn't considered new by the auditing subsystem. Even worse, for primitive version properties, the initialization kept the property at a value of 0, so that the just-persisted entity was still considered new. This meant that, via the repository route, inserts were triggered even for subsequent attempts to save an entity, which caused duplicate key exceptions.

We now make sure we fire the BeforeConvertEvent before the version property is initialized or updated. Also, the initialization of the property now sets primitive properties to 1 initially.

Added integration tests for the auditing via MongoOperations and repositories.
2018-11-22 15:05:31 +01:00
Mark Paluch
5982ee84f7 DATAMONGO-2130 - Polishing.
Replace duplicate checks to ClientSession.hasActiveTransaction() with MongoResourceHolder.hasActiveTransaction(). Introduce MongoResourceHolder.getRequiredSession() to avoid nullability warnings.

Original pull request: #618.
2018-11-16 12:58:40 +01:00
Christoph Strobl
dd2af6462d DATAMONGO-2130 - Polishing.
Set timeout for InetAddress host lookup to reduce test execution time.

Original pull request: #618.
2018-11-16 12:58:40 +01:00
Christoph Strobl
622643bf24 DATAMONGO-2130 - Fix Repository count & exists inside transaction.
We now make sure invocations on repository count and exists methods delegate to countDocuments when inside a transaction.

Original pull request: #618.
2018-11-16 12:58:40 +01:00
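
The distinction matters because MongoDB rejects the count command inside multi-document transactions; a sketch with a hypothetical repository:

    import org.springframework.transaction.annotation.Transactional;

    // When a transaction is active, count() and exists…() now delegate to
    // collection.countDocuments(...) instead of the plain count command.
    @Transactional
    public long countPeople(PersonRepository repository) {
        return repository.count();
    }
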
Oliver Drotbohm
51cc55baac DATAMONGO-2135 - Default to intermediate List for properties typed to Collection.
We now defensively create a List rather than a LinkedHashSet (which Spring's CollectionFactory.createCollection(…) defaults to) to make sure we're not accidentally dropping values that are considered equal according to their Java class definition.
2018-11-15 15:26:36 +01:00
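
A sketch of the case this guards against (types hypothetical):

    import java.util.Collection;

    class Basket {

        // Declared against the Collection interface: values are now materialized
        // into a List while reading, so two elements that are equal() both survive.
        // The previous LinkedHashSet default would silently drop one of them.
        Collection<Item> items;
    }
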
Mark Paluch
0b106e5649 DATAMONGO-2107 - After release cleanups. 2018-10-29 13:59:17 +01:00
Mark Paluch
8975d93ab3 DATAMONGO-2107 - Prepare next development iteration. 2018-10-29 13:59:15 +01:00
Mark Paluch
e25b6c49f5 DATAMONGO-2107 - Release version 2.1.2 (Lovelace SR2). 2018-10-29 12:53:51 +01:00
Mark Paluch
7a70c205de DATAMONGO-2107 - Prepare 2.1.2 (Lovelace SR2). 2018-10-29 12:52:54 +01:00
Mark Paluch
6045efa450 DATAMONGO-2107 - Updated changelog. 2018-10-29 12:52:45 +01:00
Mark Paluch
7b0816b3ee DATAMONGO-2118 - Polishing.
Fix typo in reactive repositories reference documentation.

Original pull request: #611.
2018-10-26 10:08:03 +02:00
Mona Mohamadinia
14e4ea736d DATAMONGO-2118 - Fix typo in repositories reference documentation.
Original pull request: #611.
2018-10-26 10:08:03 +02:00
Mark Paluch
32e7d9ab7f DATAMONGO-2098 - Polishing.
Annotate methods and parameters with Nullable. Use diamond syntax where appropriate.

Original pull request: #612.
2018-10-25 15:35:26 +02:00
Zied Yaich
7f35ad9e45 DATAMONGO-2098 - Fix typo in MappingMongoConverterParser method.
Original pull request: #612.
2018-10-25 15:35:26 +02:00
Mark Paluch
60228f6e5a DATAMONGO-2113 - Polishing.
Increase subscription await timeout to allow for slow system processing such as on TravisCI.

Original pull request: #615.
2018-10-25 14:33:28 +02:00
Christoph Strobl
7604492b7f DATAMONGO-2113 - Polishing.
Use AssertJ in tests.

Original pull request: #615.
2018-10-25 14:33:28 +02:00
Christoph Strobl
4680fe0e77 DATAMONGO-2113 - Fix resumeTimestamp conversion for change streams.
We now use the first 32 bits of the timestamp to create the instant and ignore the ordinal value.

Original pull request: #615.
2018-10-25 14:33:28 +02:00
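
A BSON timestamp packs the seconds-since-epoch into its upper 32 bits and an ordinal into the lower 32; a sketch of the corrected conversion:

    import java.time.Instant;
    import org.bson.BsonTimestamp;

    BsonTimestamp timestamp = new BsonTimestamp(1540000000, 7);
    // getTime() yields the seconds component; the ordinal (7) is ignored.
    Instant instant = Instant.ofEpochSecond(timestamp.getTime());
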
Mark Paluch
b4228c88d3 DATAMONGO-2083 - Updated changelog. 2018-10-15 14:19:03 +02:00
Mark Paluch
f6ef8c94c8 DATAMONGO-2084 - Updated changelog. 2018-10-15 12:46:24 +02:00
Mark Paluch
0d0dafa85e DATAMONGO-2094 - After release cleanups. 2018-10-15 11:12:14 +02:00
Mark Paluch
29aa34619f DATAMONGO-2094 - Prepare next development iteration. 2018-10-15 11:12:12 +02:00
Mark Paluch
7f19f769c4 DATAMONGO-2094 - Release version 2.1.1 (Lovelace SR1). 2018-10-15 10:42:04 +02:00
Mark Paluch
a40e89d90a DATAMONGO-2094 - Prepare 2.1.1 (Lovelace SR1). 2018-10-15 10:40:57 +02:00
Mark Paluch
6b2350200a DATAMONGO-2094 - Updated changelog. 2018-10-15 10:40:53 +02:00
Mark Paluch
fb50b0f6e7 DATAMONGO-2096 - Polishing.
Migrate assertions to AssertJ.

Original pull request: #613.
2018-10-05 15:02:38 +02:00
Christoph Strobl
ab568229b5 DATAMONGO-2096 - Fix target field name for GraphLookup aggregation operation.
We now make sure to use the target field name instead of the alias when processing GraphLookupOperation.

Original pull request: #613.
2018-10-05 15:02:38 +02:00
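
For reference, a typical GraphLookupOperation (collection and field names hypothetical); the rendered $graphLookup now emits the target field name rather than a property alias:

    import org.springframework.data.mongodb.core.aggregation.GraphLookupOperation;

    GraphLookupOperation graphLookup = GraphLookupOperation.builder()
            .from("employees")
            .startWith("reportsTo")
            .connectFrom("reportsTo")
            .connectTo("name")
            .as("reportingHierarchy");
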
Mark Paluch
7f9c1bd774 DATAMONGO-2061 - After release cleanups. 2018-09-21 07:46:17 -04:00
Mark Paluch
670a0978da DATAMONGO-2061 - Prepare next development iteration. 2018-09-21 07:46:16 -04:00
188 changed files with 2947 additions and 13143 deletions


@@ -1,110 +0,0 @@
/*
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
*/
import java.net.*;
import java.io.*;
import java.nio.channels.*;
import java.util.Properties;
public class MavenWrapperDownloader {
/**
* Default URL to download the maven-wrapper.jar from, if no 'downloadUrl' is provided.
*/
private static final String DEFAULT_DOWNLOAD_URL =
"https://repo.maven.apache.org/maven2/io/takari/maven-wrapper/0.4.2/maven-wrapper-0.4.2.jar";
/**
* Path to the maven-wrapper.properties file, which might contain a downloadUrl property to
* use instead of the default one.
*/
private static final String MAVEN_WRAPPER_PROPERTIES_PATH =
".mvn/wrapper/maven-wrapper.properties";
/**
* Path where the maven-wrapper.jar will be saved to.
*/
private static final String MAVEN_WRAPPER_JAR_PATH =
".mvn/wrapper/maven-wrapper.jar";
/**
* Name of the property which should be used to override the default download url for the wrapper.
*/
private static final String PROPERTY_NAME_WRAPPER_URL = "wrapperUrl";
public static void main(String args[]) {
System.out.println("- Downloader started");
File baseDirectory = new File(args[0]);
System.out.println("- Using base directory: " + baseDirectory.getAbsolutePath());
// If the maven-wrapper.properties exists, read it and check if it contains a custom
// wrapperUrl parameter.
File mavenWrapperPropertyFile = new File(baseDirectory, MAVEN_WRAPPER_PROPERTIES_PATH);
String url = DEFAULT_DOWNLOAD_URL;
if(mavenWrapperPropertyFile.exists()) {
FileInputStream mavenWrapperPropertyFileInputStream = null;
try {
mavenWrapperPropertyFileInputStream = new FileInputStream(mavenWrapperPropertyFile);
Properties mavenWrapperProperties = new Properties();
mavenWrapperProperties.load(mavenWrapperPropertyFileInputStream);
url = mavenWrapperProperties.getProperty(PROPERTY_NAME_WRAPPER_URL, url);
} catch (IOException e) {
System.out.println("- ERROR loading '" + MAVEN_WRAPPER_PROPERTIES_PATH + "'");
} finally {
try {
if(mavenWrapperPropertyFileInputStream != null) {
mavenWrapperPropertyFileInputStream.close();
}
} catch (IOException e) {
// Ignore ...
}
}
}
System.out.println("- Downloading from: : " + url);
File outputFile = new File(baseDirectory.getAbsolutePath(), MAVEN_WRAPPER_JAR_PATH);
if(!outputFile.getParentFile().exists()) {
if(!outputFile.getParentFile().mkdirs()) {
System.out.println(
"- ERROR creating output direcrory '" + outputFile.getParentFile().getAbsolutePath() + "'");
}
}
System.out.println("- Downloading to: " + outputFile.getAbsolutePath());
try {
downloadFileFromURL(url, outputFile);
System.out.println("Done");
System.exit(0);
} catch (Throwable e) {
System.out.println("- Error downloading");
e.printStackTrace();
System.exit(1);
}
}
private static void downloadFileFromURL(String urlString, File destination) throws Exception {
URL website = new URL(urlString);
ReadableByteChannel rbc;
rbc = Channels.newChannel(website.openStream());
FileOutputStream fos = new FileOutputStream(destination);
fos.getChannel().transferFrom(rbc, 0, Long.MAX_VALUE);
fos.close();
rbc.close();
}
}

Binary file not shown.


@@ -1 +0,0 @@
distributionUrl=https://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.5.4/apache-maven-3.5.4-bin.zip


@@ -1,14 +0,0 @@
FROM openjdk:11-jdk
RUN apt-get update && apt-get install -y apt-transport-https
RUN apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 9DA31620334BD75D9DCB49F368818C72E52529D4
RUN echo "deb http://repo.mongodb.org/apt/debian stretch/mongodb-org/4.0 main" | tee /etc/apt/sources.list.d/mongodb-org-4.0.list
RUN apt-get update
RUN apt-get install -y mongodb-org=4.0.3 mongodb-org-server=4.0.3 mongodb-org-shell=4.0.3 mongodb-org-mongos=4.0.3 mongodb-org-tools=4.0.3
RUN apt-get clean \
&& rm -rf /var/lib/apt/lists/*


@@ -1,14 +0,0 @@
FROM openjdk:8-jdk
RUN apt-get update && apt-get install -y apt-transport-https
RUN apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 9DA31620334BD75D9DCB49F368818C72E52529D4
RUN echo "deb http://repo.mongodb.org/apt/debian stretch/mongodb-org/4.0 main" | tee /etc/apt/sources.list.d/mongodb-org-4.0.list
RUN apt-get update
RUN apt-get install -y mongodb-org=4.0.3 mongodb-org-server=4.0.3 mongodb-org-shell=4.0.3 mongodb-org-mongos=4.0.3 mongodb-org-tools=4.0.3
RUN apt-get clean \
&& rm -rf /var/lib/apt/lists/*


@@ -1,39 +0,0 @@
== Running CI tasks locally
Since Concourse is built on top of Docker, it's easy to:
* Debug what went wrong on your local machine.
* Test out a tweak to your `test.sh` script before sending it out.
* Experiment against a new image before submitting your pull request.
All of these use cases are great reasons to essentially run what Concourse does on your local machine.
IMPORTANT: To do this you must have Docker installed on your machine.
1. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github springci/spring-data-8-jdk-with-mongodb /bin/bash`
+
This will launch the Docker image and mount your source code at `spring-data-mongodb-github`.
+
Next, run the `test.sh` script from inside the container:
+
2. `PROFILE=none spring-data-mongodb-github/ci/test.sh`
Since the container is binding to your source, you can make edits from your IDE and continue to run build jobs.
If you need to test the `build.sh` script, do this:
1. `mkdir /tmp/spring-data-mongodb-artifactory`
2. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github --mount type=bind,source="/tmp/spring-data-mongodb-artifactory",target=/spring-data-mongodb-artifactory springci/spring-data-8-jdk-with-mongodb /bin/bash`
+
This will launch the Docker image and mount your source code at `spring-data-mongodb-github` and the temporary
artifactory output directory at `spring-data-mongodb-artifactory`.
+
Next, run the `build.sh` script from inside the container:
+
3. `spring-data-mongodb-github/ci/build.sh`
IMPORTANT: `build.sh` doesn't actually push to Artifactory so don't worry about accidentally deploying anything.
It just deploys to a local folder. That way, the `artifactory-resource` later in the pipeline can pick up these artifacts
and deliver them to artifactory.
NOTE: Docker containers can eat up disk space fast! From time to time, run `docker system prune` to clean out old images.


@@ -1,15 +0,0 @@
#!/bin/bash
set -euo pipefail
[[ -d $PWD/maven && ! -d $HOME/.m2 ]] && ln -s $PWD/maven $HOME/.m2
spring_data_mongodb_artifactory=$(pwd)/spring-data-mongodb-artifactory
rm -rf $HOME/.m2/repository/org/springframework/data 2> /dev/null || :
cd spring-data-mongodb-github
./mvnw deploy \
-Dmaven.test.skip=true \
-DaltDeploymentRepository=distribution::default::file://${spring_data_mongodb_artifactory} \


@@ -1,19 +0,0 @@
---
platform: linux
image_resource:
type: docker-image
source:
repository: springci/spring-data-8-jdk-with-mongodb
inputs:
- name: spring-data-mongodb-github
outputs:
- name: spring-data-mongodb-artifactory
caches:
- path: maven
run:
path: spring-data-mongodb-github/ci/build.sh


@@ -1,14 +0,0 @@
#!/bin/bash
set -euo pipefail
mkdir -p /data/db
mongod &
[[ -d $PWD/maven && ! -d $HOME/.m2 ]] && ln -s $PWD/maven $HOME/.m2
rm -rf $HOME/.m2/repository/org/springframework/data/mongodb 2> /dev/null || :
cd spring-data-mongodb-github
./mvnw clean dependency:list test -P${PROFILE} -Dsort


@@ -1,16 +0,0 @@
---
platform: linux
image_resource:
type: docker-image
source:
repository: springci/spring-data-8-jdk-with-mongodb
inputs:
- name: spring-data-mongodb-github
caches:
- path: maven
run:
path: spring-data-mongodb-github/ci/test.sh

mvnw (vendored)

@@ -1,286 +0,0 @@
#!/bin/sh
# ----------------------------------------------------------------------------
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
# ----------------------------------------------------------------------------
# ----------------------------------------------------------------------------
# Maven2 Start Up Batch script
#
# Required ENV vars:
# ------------------
# JAVA_HOME - location of a JDK home dir
#
# Optional ENV vars
# -----------------
# M2_HOME - location of maven2's installed home dir
# MAVEN_OPTS - parameters passed to the Java VM when running Maven
# e.g. to debug Maven itself, use
# set MAVEN_OPTS=-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000
# MAVEN_SKIP_RC - flag to disable loading of mavenrc files
# ----------------------------------------------------------------------------
if [ -z "$MAVEN_SKIP_RC" ] ; then
if [ -f /etc/mavenrc ] ; then
. /etc/mavenrc
fi
if [ -f "$HOME/.mavenrc" ] ; then
. "$HOME/.mavenrc"
fi
fi
# OS specific support. $var _must_ be set to either true or false.
cygwin=false;
darwin=false;
mingw=false
case "`uname`" in
CYGWIN*) cygwin=true ;;
MINGW*) mingw=true;;
Darwin*) darwin=true
# Use /usr/libexec/java_home if available, otherwise fall back to /Library/Java/Home
# See https://developer.apple.com/library/mac/qa/qa1170/_index.html
if [ -z "$JAVA_HOME" ]; then
if [ -x "/usr/libexec/java_home" ]; then
export JAVA_HOME="`/usr/libexec/java_home`"
else
export JAVA_HOME="/Library/Java/Home"
fi
fi
;;
esac
if [ -z "$JAVA_HOME" ] ; then
if [ -r /etc/gentoo-release ] ; then
JAVA_HOME=`java-config --jre-home`
fi
fi
if [ -z "$M2_HOME" ] ; then
## resolve links - $0 may be a link to maven's home
PRG="$0"
# need this for relative symlinks
while [ -h "$PRG" ] ; do
ls=`ls -ld "$PRG"`
link=`expr "$ls" : '.*-> \(.*\)$'`
if expr "$link" : '/.*' > /dev/null; then
PRG="$link"
else
PRG="`dirname "$PRG"`/$link"
fi
done
saveddir=`pwd`
M2_HOME=`dirname "$PRG"`/..
# make it fully qualified
M2_HOME=`cd "$M2_HOME" && pwd`
cd "$saveddir"
# echo Using m2 at $M2_HOME
fi
# For Cygwin, ensure paths are in UNIX format before anything is touched
if $cygwin ; then
[ -n "$M2_HOME" ] &&
M2_HOME=`cygpath --unix "$M2_HOME"`
[ -n "$JAVA_HOME" ] &&
JAVA_HOME=`cygpath --unix "$JAVA_HOME"`
[ -n "$CLASSPATH" ] &&
CLASSPATH=`cygpath --path --unix "$CLASSPATH"`
fi
# For Mingw, ensure paths are in UNIX format before anything is touched
if $mingw ; then
[ -n "$M2_HOME" ] &&
M2_HOME="`(cd "$M2_HOME"; pwd)`"
[ -n "$JAVA_HOME" ] &&
JAVA_HOME="`(cd "$JAVA_HOME"; pwd)`"
# TODO classpath?
fi
if [ -z "$JAVA_HOME" ]; then
javaExecutable="`which javac`"
if [ -n "$javaExecutable" ] && ! [ "`expr \"$javaExecutable\" : '\([^ ]*\)'`" = "no" ]; then
# readlink(1) is not available as standard on Solaris 10.
readLink=`which readlink`
if [ ! `expr "$readLink" : '\([^ ]*\)'` = "no" ]; then
if $darwin ; then
javaHome="`dirname \"$javaExecutable\"`"
javaExecutable="`cd \"$javaHome\" && pwd -P`/javac"
else
javaExecutable="`readlink -f \"$javaExecutable\"`"
fi
javaHome="`dirname \"$javaExecutable\"`"
javaHome=`expr "$javaHome" : '\(.*\)/bin'`
JAVA_HOME="$javaHome"
export JAVA_HOME
fi
fi
fi
if [ -z "$JAVACMD" ] ; then
if [ -n "$JAVA_HOME" ] ; then
if [ -x "$JAVA_HOME/jre/sh/java" ] ; then
# IBM's JDK on AIX uses strange locations for the executables
JAVACMD="$JAVA_HOME/jre/sh/java"
else
JAVACMD="$JAVA_HOME/bin/java"
fi
else
JAVACMD="`which java`"
fi
fi
if [ ! -x "$JAVACMD" ] ; then
echo "Error: JAVA_HOME is not defined correctly." >&2
echo " We cannot execute $JAVACMD" >&2
exit 1
fi
if [ -z "$JAVA_HOME" ] ; then
echo "Warning: JAVA_HOME environment variable is not set."
fi
CLASSWORLDS_LAUNCHER=org.codehaus.plexus.classworlds.launcher.Launcher
# traverses directory structure from process work directory to filesystem root
# first directory with .mvn subdirectory is considered project base directory
find_maven_basedir() {
if [ -z "$1" ]
then
echo "Path not specified to find_maven_basedir"
return 1
fi
basedir="$1"
wdir="$1"
while [ "$wdir" != '/' ] ; do
if [ -d "$wdir"/.mvn ] ; then
basedir=$wdir
break
fi
# workaround for JBEAP-8937 (on Solaris 10/Sparc)
if [ -d "${wdir}" ]; then
wdir=`cd "$wdir/.."; pwd`
fi
# end of workaround
done
echo "${basedir}"
}
# concatenates all lines of a file
concat_lines() {
if [ -f "$1" ]; then
echo "$(tr -s '\n' ' ' < "$1")"
fi
}
BASE_DIR=`find_maven_basedir "$(pwd)"`
if [ -z "$BASE_DIR" ]; then
exit 1;
fi
##########################################################################################
# Extension to allow automatically downloading the maven-wrapper.jar from Maven-central
# This allows using the maven wrapper in projects that prohibit checking in binary data.
##########################################################################################
if [ -r "$BASE_DIR/.mvn/wrapper/maven-wrapper.jar" ]; then
if [ "$MVNW_VERBOSE" = true ]; then
echo "Found .mvn/wrapper/maven-wrapper.jar"
fi
else
if [ "$MVNW_VERBOSE" = true ]; then
echo "Couldn't find .mvn/wrapper/maven-wrapper.jar, downloading it ..."
fi
jarUrl="https://repo.maven.apache.org/maven2/io/takari/maven-wrapper/0.4.2/maven-wrapper-0.4.2.jar"
while IFS="=" read key value; do
case "$key" in (wrapperUrl) jarUrl="$value"; break ;;
esac
done < "$BASE_DIR/.mvn/wrapper/maven-wrapper.properties"
if [ "$MVNW_VERBOSE" = true ]; then
echo "Downloading from: $jarUrl"
fi
wrapperJarPath="$BASE_DIR/.mvn/wrapper/maven-wrapper.jar"
if command -v wget > /dev/null; then
if [ "$MVNW_VERBOSE" = true ]; then
echo "Found wget ... using wget"
fi
wget "$jarUrl" -O "$wrapperJarPath"
elif command -v curl > /dev/null; then
if [ "$MVNW_VERBOSE" = true ]; then
echo "Found curl ... using curl"
fi
curl -o "$wrapperJarPath" "$jarUrl"
else
if [ "$MVNW_VERBOSE" = true ]; then
echo "Falling back to using Java to download"
fi
javaClass="$BASE_DIR/.mvn/wrapper/MavenWrapperDownloader.java"
if [ -e "$javaClass" ]; then
if [ ! -e "$BASE_DIR/.mvn/wrapper/MavenWrapperDownloader.class" ]; then
if [ "$MVNW_VERBOSE" = true ]; then
echo " - Compiling MavenWrapperDownloader.java ..."
fi
# Compiling the Java class
("$JAVA_HOME/bin/javac" "$javaClass")
fi
if [ -e "$BASE_DIR/.mvn/wrapper/MavenWrapperDownloader.class" ]; then
# Running the downloader
if [ "$MVNW_VERBOSE" = true ]; then
echo " - Running MavenWrapperDownloader.java ..."
fi
("$JAVA_HOME/bin/java" -cp .mvn/wrapper MavenWrapperDownloader "$MAVEN_PROJECTBASEDIR")
fi
fi
fi
fi
##########################################################################################
# End of extension
##########################################################################################
export MAVEN_PROJECTBASEDIR=${MAVEN_BASEDIR:-"$BASE_DIR"}
if [ "$MVNW_VERBOSE" = true ]; then
echo $MAVEN_PROJECTBASEDIR
fi
MAVEN_OPTS="$(concat_lines "$MAVEN_PROJECTBASEDIR/.mvn/jvm.config") $MAVEN_OPTS"
# For Cygwin, switch paths to Windows format before running java
if $cygwin; then
[ -n "$M2_HOME" ] &&
M2_HOME=`cygpath --path --windows "$M2_HOME"`
[ -n "$JAVA_HOME" ] &&
JAVA_HOME=`cygpath --path --windows "$JAVA_HOME"`
[ -n "$CLASSPATH" ] &&
CLASSPATH=`cygpath --path --windows "$CLASSPATH"`
[ -n "$MAVEN_PROJECTBASEDIR" ] &&
MAVEN_PROJECTBASEDIR=`cygpath --path --windows "$MAVEN_PROJECTBASEDIR"`
fi
WRAPPER_LAUNCHER=org.apache.maven.wrapper.MavenWrapperMain
exec "$JAVACMD" \
$MAVEN_OPTS \
-classpath "$MAVEN_PROJECTBASEDIR/.mvn/wrapper/maven-wrapper.jar" \
"-Dmaven.home=${M2_HOME}" "-Dmaven.multiModuleProjectDirectory=${MAVEN_PROJECTBASEDIR}" \
${WRAPPER_LAUNCHER} $MAVEN_CONFIG "$@"

mvnw.cmd (vendored)

@@ -1,161 +0,0 @@
@REM ----------------------------------------------------------------------------
@REM Licensed to the Apache Software Foundation (ASF) under one
@REM or more contributor license agreements. See the NOTICE file
@REM distributed with this work for additional information
@REM regarding copyright ownership. The ASF licenses this file
@REM to you under the Apache License, Version 2.0 (the
@REM "License"); you may not use this file except in compliance
@REM with the License. You may obtain a copy of the License at
@REM
@REM http://www.apache.org/licenses/LICENSE-2.0
@REM
@REM Unless required by applicable law or agreed to in writing,
@REM software distributed under the License is distributed on an
@REM "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
@REM KIND, either express or implied. See the License for the
@REM specific language governing permissions and limitations
@REM under the License.
@REM ----------------------------------------------------------------------------
@REM ----------------------------------------------------------------------------
@REM Maven2 Start Up Batch script
@REM
@REM Required ENV vars:
@REM JAVA_HOME - location of a JDK home dir
@REM
@REM Optional ENV vars
@REM M2_HOME - location of maven2's installed home dir
@REM MAVEN_BATCH_ECHO - set to 'on' to enable the echoing of the batch commands
@REM MAVEN_BATCH_PAUSE - set to 'on' to wait for a key stroke before ending
@REM MAVEN_OPTS - parameters passed to the Java VM when running Maven
@REM e.g. to debug Maven itself, use
@REM set MAVEN_OPTS=-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000
@REM MAVEN_SKIP_RC - flag to disable loading of mavenrc files
@REM ----------------------------------------------------------------------------
@REM Begin all REM lines with '@' in case MAVEN_BATCH_ECHO is 'on'
@echo off
@REM set title of command window
title %0
@REM enable echoing my setting MAVEN_BATCH_ECHO to 'on'
@if "%MAVEN_BATCH_ECHO%" == "on" echo %MAVEN_BATCH_ECHO%
@REM set %HOME% to equivalent of $HOME
if "%HOME%" == "" (set "HOME=%HOMEDRIVE%%HOMEPATH%")
@REM Execute a user defined script before this one
if not "%MAVEN_SKIP_RC%" == "" goto skipRcPre
@REM check for pre script, once with legacy .bat ending and once with .cmd ending
if exist "%HOME%\mavenrc_pre.bat" call "%HOME%\mavenrc_pre.bat"
if exist "%HOME%\mavenrc_pre.cmd" call "%HOME%\mavenrc_pre.cmd"
:skipRcPre
@setlocal
set ERROR_CODE=0
@REM To isolate internal variables from possible post scripts, we use another setlocal
@setlocal
@REM ==== START VALIDATION ====
if not "%JAVA_HOME%" == "" goto OkJHome
echo.
echo Error: JAVA_HOME not found in your environment. >&2
echo Please set the JAVA_HOME variable in your environment to match the >&2
echo location of your Java installation. >&2
echo.
goto error
:OkJHome
if exist "%JAVA_HOME%\bin\java.exe" goto init
echo.
echo Error: JAVA_HOME is set to an invalid directory. >&2
echo JAVA_HOME = "%JAVA_HOME%" >&2
echo Please set the JAVA_HOME variable in your environment to match the >&2
echo location of your Java installation. >&2
echo.
goto error
@REM ==== END VALIDATION ====
:init
@REM Find the project base dir, i.e. the directory that contains the folder ".mvn".
@REM Fallback to current working directory if not found.
set MAVEN_PROJECTBASEDIR=%MAVEN_BASEDIR%
IF NOT "%MAVEN_PROJECTBASEDIR%"=="" goto endDetectBaseDir
set EXEC_DIR=%CD%
set WDIR=%EXEC_DIR%
:findBaseDir
IF EXIST "%WDIR%"\.mvn goto baseDirFound
cd ..
IF "%WDIR%"=="%CD%" goto baseDirNotFound
set WDIR=%CD%
goto findBaseDir
:baseDirFound
set MAVEN_PROJECTBASEDIR=%WDIR%
cd "%EXEC_DIR%"
goto endDetectBaseDir
:baseDirNotFound
set MAVEN_PROJECTBASEDIR=%EXEC_DIR%
cd "%EXEC_DIR%"
:endDetectBaseDir
IF NOT EXIST "%MAVEN_PROJECTBASEDIR%\.mvn\jvm.config" goto endReadAdditionalConfig
@setlocal EnableExtensions EnableDelayedExpansion
for /F "usebackq delims=" %%a in ("%MAVEN_PROJECTBASEDIR%\.mvn\jvm.config") do set JVM_CONFIG_MAVEN_PROPS=!JVM_CONFIG_MAVEN_PROPS! %%a
@endlocal & set JVM_CONFIG_MAVEN_PROPS=%JVM_CONFIG_MAVEN_PROPS%
:endReadAdditionalConfig
SET MAVEN_JAVA_EXE="%JAVA_HOME%\bin\java.exe"
set WRAPPER_JAR="%MAVEN_PROJECTBASEDIR%\.mvn\wrapper\maven-wrapper.jar"
set WRAPPER_LAUNCHER=org.apache.maven.wrapper.MavenWrapperMain
set DOWNLOAD_URL="https://repo.maven.apache.org/maven2/io/takari/maven-wrapper/0.4.2/maven-wrapper-0.4.2.jar"
FOR /F "tokens=1,2 delims==" %%A IN (%MAVEN_PROJECTBASEDIR%\.mvn\wrapper\maven-wrapper.properties) DO (
IF "%%A"=="wrapperUrl" SET DOWNLOAD_URL=%%B
)
@REM Extension to allow automatically downloading the maven-wrapper.jar from Maven-central
@REM This allows using the maven wrapper in projects that prohibit checking in binary data.
if exist %WRAPPER_JAR% (
echo Found %WRAPPER_JAR%
) else (
echo Couldn't find %WRAPPER_JAR%, downloading it ...
echo Downloading from: %DOWNLOAD_URL%
powershell -Command "(New-Object Net.WebClient).DownloadFile('%DOWNLOAD_URL%', '%WRAPPER_JAR%')"
echo Finished downloading %WRAPPER_JAR%
)
@REM End of extension
%MAVEN_JAVA_EXE% %JVM_CONFIG_MAVEN_PROPS% %MAVEN_OPTS% %MAVEN_DEBUG_OPTS% -classpath %WRAPPER_JAR% "-Dmaven.multiModuleProjectDirectory=%MAVEN_PROJECTBASEDIR%" %WRAPPER_LAUNCHER% %MAVEN_CONFIG% %*
if ERRORLEVEL 1 goto error
goto end
:error
set ERROR_CODE=1
:end
@endlocal & set ERROR_CODE=%ERROR_CODE%
if not "%MAVEN_SKIP_RC%" == "" goto skipRcPost
@REM check for post script, once with legacy .bat ending and once with .cmd ending
if exist "%HOME%\mavenrc_post.bat" call "%HOME%\mavenrc_post.bat"
if exist "%HOME%\mavenrc_post.cmd" call "%HOME%\mavenrc_post.cmd"
:skipRcPost
@REM pause the script if MAVEN_BATCH_PAUSE is set to 'on'
if "%MAVEN_BATCH_PAUSE%" == "on" pause
if "%MAVEN_TERMINATE_CMD%" == "on" exit %ERROR_CODE%
exit /B %ERROR_CODE%

pom.xml

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.2.0.M2</version>
<version>2.1.5.RELEASE</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>2.2.0.M2</version>
<version>2.1.5.RELEASE</version>
</parent>
<modules>
@@ -27,9 +27,9 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>2.2.0.M2</springdata.commons>
<mongo>3.10.1</mongo>
<mongo.reactivestreams>1.11.0</mongo.reactivestreams>
<springdata.commons>2.1.5.RELEASE</springdata.commons>
<mongo>3.8.2</mongo>
<mongo.reactivestreams>1.9.2</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>
</properties>
@@ -137,6 +137,25 @@
<module>spring-data-mongodb-benchmarks</module>
</modules>
</profile>
<profile>
<id>distribute</id>
<build>
<plugins>
<plugin>
<groupId>org.asciidoctor</groupId>
<artifactId>asciidoctor-maven-plugin</artifactId>
<configuration>
<attributes>
<mongo-reactivestreams>${mongo.reactivestreams}</mongo-reactivestreams>
<reactor>${reactor}</reactor>
</attributes>
</configuration>
</plugin>
</plugins>
</build>
</profile>
</profiles>
<dependencies>
@@ -150,8 +169,8 @@
<repositories>
<repository>
<id>spring-libs-milestone</id>
<url>https://repo.spring.io/libs-milestone</url>
<id>spring-libs-release</id>
<url>https://repo.spring.io/libs-release</url>
</repository>
</repositories>


@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.2.0.M2</version>
<version>2.1.5.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -87,7 +87,6 @@
<plugin>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<useSystemClassLoader>false</useSystemClassLoader>
<testSourceDirectory>${project.build.sourceDirectory}</testSourceDirectory>
<testClassesDirectory>${project.build.outputDirectory}</testClassesDirectory>
<excludes>


@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.2.0.M2</version>
<version>2.1.5.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -50,7 +50,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>2.2.0.M2</version>
<version>2.1.5.RELEASE</version>
</dependency>
<!-- reactive -->


@@ -1,6 +1,5 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
@@ -14,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.2.0.M2</version>
<version>2.1.5.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -36,15 +35,8 @@
<plugin>
<groupId>org.asciidoctor</groupId>
<artifactId>asciidoctor-maven-plugin</artifactId>
<configuration>
<attributes>
<mongo-reactivestreams>${mongo.reactivestreams}</mongo-reactivestreams>
<reactor>${reactor}</reactor>
</attributes>
</configuration>
</plugin>
</plugins>
</build>
</project>


@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.2.0.M2</version>
<version>2.1.5.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -83,14 +83,14 @@
<!-- reactive -->
<dependency>
<groupId>org.mongodb</groupId>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-driver-reactivestreams</artifactId>
<version>${mongo.reactivestreams}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-driver-async</artifactId>
<version>${mongo}</version>
<optional>true</optional>
@@ -107,7 +107,7 @@
</dependency>
<dependency>
<groupId>io.projectreactor</groupId>
<groupId>io.projectreactor</groupId>
<artifactId>reactor-core</artifactId>
<optional>true</optional>
</dependency>
@@ -119,14 +119,14 @@
</dependency>
<dependency>
<groupId>io.reactivex</groupId>
<groupId>io.reactivex</groupId>
<artifactId>rxjava</artifactId>
<version>${rxjava}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex</groupId>
<groupId>io.reactivex</groupId>
<artifactId>rxjava-reactive-streams</artifactId>
<version>${rxjava-reactive-streams}</version>
<optional>true</optional>
@@ -264,35 +264,41 @@
<dependency>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-stdlib</artifactId>
<version>${kotlin}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-reflect</artifactId>
<version>${kotlin}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.jetbrains.kotlinx</groupId>
<artifactId>kotlinx-coroutines-core</artifactId>
<version>${kotlin-coroutines}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.jetbrains.kotlinx</groupId>
<artifactId>kotlinx-coroutines-reactor</artifactId>
<version>${kotlin-coroutines}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.mockk</groupId>
<artifactId>mockk</artifactId>
<version>${mockk}</version>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-test</artifactId>
<version>${kotlin}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.nhaarman</groupId>
<artifactId>mockito-kotlin</artifactId>
<version>1.5.0</version>
<scope>test</scope>
<exclusions>
<exclusion>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-stdlib</artifactId>
</exclusion>
<exclusion>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-reflect</artifactId>
</exclusion>
<exclusion>
<groupId>org.mockito</groupId>
<artifactId>mockito-core</artifactId>
</exclusion>
</exclusions>
</dependency>
</dependencies>
@@ -329,7 +335,6 @@
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<useSystemClassLoader>false</useSystemClassLoader>
<useFile>false</useFile>
<includes>
<include>**/*Tests.java</include>


@@ -87,7 +87,6 @@ public abstract class MongoConfigurationSupport {
mappingContext.setInitialEntitySet(getInitialEntitySet());
mappingContext.setSimpleTypeHolder(customConversions().getSimpleTypeHolder());
mappingContext.setFieldNamingStrategy(fieldNamingStrategy());
mappingContext.setAutoIndexCreation(autoIndexCreation());
return mappingContext;
}
@@ -191,16 +190,4 @@ public abstract class MongoConfigurationSupport {
return abbreviateFieldNames() ? new CamelCaseAbbreviatingFieldNamingStrategy()
: PropertyNameFieldNamingStrategy.INSTANCE;
}
/**
* Configure whether to automatically create indices for domain types by deriving the
* {@link org.springframework.data.mongodb.core.index.IndexDefinition} from the entity or not.
*
* @return {@literal true} by default. <br />
* <strong>INFO</strong>: As of 3.x the default will be set to {@literal false}.
* @since 2.2
*/
protected boolean autoIndexCreation() {
return true;
}
}


@@ -17,7 +17,6 @@ package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import java.io.UnsupportedEncodingException;
import java.lang.reflect.Method;
import java.net.URLDecoder;
import java.util.ArrayList;
import java.util.Arrays;
@@ -27,7 +26,6 @@ import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.springframework.lang.Nullable;
import org.springframework.util.ReflectionUtils;
import org.springframework.util.StringUtils;
import com.mongodb.MongoCredential;
@@ -80,23 +78,12 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
verifyUserNamePresent(userNameAndPassword);
credentials.add(MongoCredential.createGSSAPICredential(userNameAndPassword[0]));
} else if ("MONGODB-CR".equals(authMechanism)) {
} else if (MongoCredential.MONGODB_CR_MECHANISM.equals(authMechanism)) {
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
Method createCRCredentialMethod = ReflectionUtils.findMethod(MongoCredential.class,
"createMongoCRCredential", String.class, String.class, char[].class);
if (createCRCredentialMethod == null) {
throw new IllegalArgumentException("MONGODB-CR is no longer supported.");
}
MongoCredential credential = MongoCredential.class
.cast(ReflectionUtils.invokeMethod(createCRCredentialMethod, null, userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
credentials.add(credential);
credentials.add(MongoCredential.createMongoCRCredential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else if (MongoCredential.MONGODB_X509_MECHANISM.equals(authMechanism)) {
verifyUserNamePresent(userNameAndPassword);


@@ -20,7 +20,6 @@ import lombok.EqualsAndHashCode;
import java.time.Instant;
import java.util.concurrent.atomic.AtomicReferenceFieldUpdater;
import org.bson.BsonTimestamp;
import org.bson.BsonValue;
import org.bson.Document;
import org.springframework.data.mongodb.core.convert.MongoConverter;
@@ -85,19 +84,8 @@ public class ChangeStreamEvent<T> {
@Nullable
public Instant getTimestamp() {
return getBsonTimestamp() != null ? converter.getConversionService().convert(raw.getClusterTime(), Instant.class)
: null;
}
/**
* Get the {@link ChangeStreamDocument#getClusterTime() cluster time}.
*
* @return can be {@literal null}.
* @since 2.2
*/
@Nullable
public BsonTimestamp getBsonTimestamp() {
return raw != null ? raw.getClusterTime() : null;
return raw != null && raw.getClusterTime() != null
? converter.getConversionService().convert(raw.getClusterTime(), Instant.class) : null;
}
/**


@@ -21,15 +21,12 @@ import java.time.Instant;
import java.util.Arrays;
import java.util.Optional;
import org.bson.BsonTimestamp;
import org.bson.BsonValue;
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.ObjectUtils;
import com.mongodb.client.model.changestream.ChangeStreamDocument;
import com.mongodb.client.model.changestream.FullDocument;
@@ -50,7 +47,7 @@ public class ChangeStreamOptions {
private @Nullable BsonValue resumeToken;
private @Nullable FullDocument fullDocumentLookup;
private @Nullable Collation collation;
private @Nullable Object resumeTimestamp;
private @Nullable Instant resumeTimestamp;
protected ChangeStreamOptions() {}
@@ -86,15 +83,7 @@ public class ChangeStreamOptions {
* @return {@link Optional#empty()} if not set.
*/
public Optional<Instant> getResumeTimestamp() {
return Optional.ofNullable(resumeTimestamp).map(timestamp -> asTimestampOfType(timestamp, Instant.class));
}
/**
* @return {@link Optional#empty()} if not set.
* @since 2.2
*/
public Optional<BsonTimestamp> getResumeBsonTimestamp() {
return Optional.ofNullable(resumeTimestamp).map(timestamp -> asTimestampOfType(timestamp, BsonTimestamp.class));
return Optional.ofNullable(resumeTimestamp);
}
/**
@@ -114,29 +103,6 @@ public class ChangeStreamOptions {
return new ChangeStreamOptionsBuilder();
}
private static <T> T asTimestampOfType(Object timestamp, Class<T> targetType) {
return targetType.cast(doGetTimestamp(timestamp, targetType));
}
private static <T> Object doGetTimestamp(Object timestamp, Class<T> targetType) {
if (ClassUtils.isAssignableValue(targetType, timestamp)) {
return timestamp;
}
if (timestamp instanceof Instant) {
return new BsonTimestamp((int) ((Instant) timestamp).getEpochSecond(), 0);
}
if (timestamp instanceof BsonTimestamp) {
return Instant.ofEpochSecond(((BsonTimestamp) timestamp).getTime());
}
throw new IllegalArgumentException(
"o_O that should actually not happen. The timestamp should be an Instant or a BsonTimestamp but was "
+ ObjectUtils.nullSafeClassName(timestamp));
}
/**
* Builder for creating {@link ChangeStreamOptions}.
*
@@ -149,7 +115,7 @@ public class ChangeStreamOptions {
private @Nullable BsonValue resumeToken;
private @Nullable FullDocument fullDocumentLookup;
private @Nullable Collation collation;
private @Nullable Object resumeTimestamp;
private @Nullable Instant resumeTimestamp;
private ChangeStreamOptionsBuilder() {}
@@ -258,21 +224,6 @@ public class ChangeStreamOptions {
return this;
}
/**
* Set the cluster time to resume from.
*
* @param resumeTimestamp must not be {@literal null}.
* @return this.
* @since 2.2
*/
public ChangeStreamOptionsBuilder resumeAt(BsonTimestamp resumeTimestamp) {
Assert.notNull(resumeTimestamp, "ResumeTimestamp must not be null!");
this.resumeTimestamp = resumeTimestamp;
return this;
}
/**
* @return the built {@link ChangeStreamOptions}
*/


@@ -38,10 +38,11 @@ import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.LinkedMultiValueMap;
import org.springframework.util.MultiValueMap;
import com.mongodb.util.JSONParseException;
/**
* Common operations performed on an entity in the context of it's mapping metadata.
*
@@ -164,15 +165,8 @@ class EntityOperations {
try {
return Document.parse(source);
} catch (org.bson.json.JsonParseException o_O) {
} catch (JSONParseException | org.bson.json.JsonParseException o_O) {
throw new MappingException("Could not parse given String to save into a JSON document!", o_O);
} catch (RuntimeException o_O) {
// legacy 3.x exception
if (ClassUtils.matchesTypeName(o_O.getClass(), "JSONParseException")) {
throw new MappingException("Could not parse given String to save into a JSON document!", o_O);
}
throw o_O;
}
}
@@ -205,16 +199,6 @@ class EntityOperations {
*/
Query getByIdQuery();
/**
* Returns the {@link Query} to remove an entity by its {@literal id} and if applicable {@literal version}.
*
* @return the {@link Query} to use for removing the entity. Never {@literal null}.
* @since 2.2
*/
default Query getRemoveByQuery() {
return isVersionedEntity() ? getQueryForVersion() : getByIdQuery();
}
/**
* Returns the {@link Query} to find the entity in its current version.
*
@@ -245,11 +229,9 @@ class EntityOperations {
}
/**
* Returns the value of the version if the entity {@link #isVersionedEntity() has a version property}.
* Returns the value of the version if the entity has a version property, {@literal null} otherwise.
*
* @return the entity version. Can be {@literal null}.
* @throws IllegalStateException if the entity does not define a {@literal version} property. Make sure to check
* {@link #isVersionedEntity()}.
* @return
*/
@Nullable
Object getVersion();
@@ -305,8 +287,8 @@ class EntityOperations {
/**
* Returns the current version value if the entity has a version property.
*
* @return the current version or {@literal null} in case it's uninitialized.
* @throws IllegalStateException if the entity does not define a {@literal version} property.
* @return the current version or {@literal null} in case it's uninitialized or the entity doesn't expose a version
* property.
*/
@Nullable
Number getVersion();
@@ -508,10 +490,10 @@ class EntityOperations {
public Query getQueryForVersion() {
MongoPersistentProperty idProperty = entity.getRequiredIdProperty();
MongoPersistentProperty versionProperty = entity.getRequiredVersionProperty();
MongoPersistentProperty property = entity.getRequiredVersionProperty();
return new Query(Criteria.where(idProperty.getName()).is(getId())//
.and(versionProperty.getName()).is(getVersion()));
.and(property.getName()).is(getVersion()));
}
/*


@@ -92,7 +92,7 @@ public class MappedDocument {
* mapped to the specific domain type.
*
* @author Christoph Strobl
* @since 2.2
* @since 2.1.4
*/
class MappedUpdate implements UpdateDefinition {
@@ -137,14 +137,5 @@ public class MappedDocument {
public Boolean isIsolated() {
return delegate.isIsolated();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#getArrayFilters()
*/
@Override
public List<ArrayFilter> getArrayFilters() {
return delegate.getArrayFilters();
}
}
}


@@ -41,8 +41,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
private static final MongoClientOptions DEFAULT_MONGO_OPTIONS = MongoClientOptions.builder().build();
// TODO: Mongo Driver 4 - use application name insetad of description if not available
private @Nullable String description = DEFAULT_MONGO_OPTIONS.getApplicationName();
private @Nullable String description = DEFAULT_MONGO_OPTIONS.getDescription();
private int minConnectionsPerHost = DEFAULT_MONGO_OPTIONS.getMinConnectionsPerHost();
private int connectionsPerHost = DEFAULT_MONGO_OPTIONS.getConnectionsPerHost();
private int threadsAllowedToBlockForConnectionMultiplier = DEFAULT_MONGO_OPTIONS
@@ -52,8 +51,6 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
private int maxConnectionLifeTime = DEFAULT_MONGO_OPTIONS.getMaxConnectionLifeTime();
private int connectTimeout = DEFAULT_MONGO_OPTIONS.getConnectTimeout();
private int socketTimeout = DEFAULT_MONGO_OPTIONS.getSocketTimeout();
// TODO: Mongo Driver 4 - check if available
private boolean socketKeepAlive = DEFAULT_MONGO_OPTIONS.isSocketKeepAlive();
private @Nullable ReadPreference readPreference = DEFAULT_MONGO_OPTIONS.getReadPreference();
private DBDecoderFactory dbDecoderFactory = DEFAULT_MONGO_OPTIONS.getDbDecoderFactory();
@@ -61,8 +58,6 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
private @Nullable WriteConcern writeConcern = DEFAULT_MONGO_OPTIONS.getWriteConcern();
private @Nullable SocketFactory socketFactory = DEFAULT_MONGO_OPTIONS.getSocketFactory();
private boolean cursorFinalizerEnabled = DEFAULT_MONGO_OPTIONS.isCursorFinalizerEnabled();
// TODO: Mongo Driver 4 - remove this option
private boolean alwaysUseMBeans = DEFAULT_MONGO_OPTIONS.isAlwaysUseMBeans();
private int heartbeatFrequency = DEFAULT_MONGO_OPTIONS.getHeartbeatFrequency();
private int minHeartbeatFrequency = DEFAULT_MONGO_OPTIONS.getMinHeartbeatFrequency();
@@ -79,7 +74,6 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
*
* @param description
*/
// TODO: Mongo Driver 4 - deprecate that one and add application name
public void setDescription(@Nullable String description) {
this.description = description;
}
@@ -241,7 +235,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
}
/**
* This controls if the driver should use an SSL connection. Defaults to {@literal false}.
* This controls if the driver should use an SSL connection. Defaults to |@literal false}.
*
* @param ssl
*/
@@ -291,7 +285,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
.cursorFinalizerEnabled(cursorFinalizerEnabled) //
.dbDecoderFactory(dbDecoderFactory) //
.dbEncoderFactory(dbEncoderFactory) //
.applicationName(description) // TODO: Mongo Driver 4 - use application name if description not available
.description(description) //
.heartbeatConnectTimeout(heartbeatConnectTimeout) //
.heartbeatFrequency(heartbeatFrequency) //
.heartbeatSocketTimeout(heartbeatSocketTimeout) //
@@ -303,9 +297,8 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
.readPreference(readPreference) //
.requiredReplicaSetName(requiredReplicaSetName) //
.serverSelectionTimeout(serverSelectionTimeout) //
.sslEnabled(ssl) //
.socketFactory(socketFactoryToUse) // TODO: Mongo Driver 4 - remove if not available
.socketKeepAlive(socketKeepAlive) // TODO: Mongo Driver 4 - remove if not available
.socketFactory(socketFactoryToUse) //
.socketKeepAlive(socketKeepAlive) //
.socketTimeout(socketTimeout) //
.threadsAllowedToBlockForConnectionMultiplier(threadsAllowedToBlockForConnectionMultiplier) //
.writeConcern(writeConcern).build();
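As a usage sketch of the factory bean touched above (standalone wiring shown for brevity; in practice the bean is usually declared in a Spring context, and afterPropertiesSet() declares a checked exception):
MongoClientOptionsFactoryBean factory = new MongoClientOptionsFactoryBean();
factory.setDescription("my-app"); // maps onto MongoClientOptions#description again after this back-port
factory.setSsl(true);             // enables SSL; the bean then picks an SSL-capable socket factory
factory.afterPropertiesSet();
MongoClientOptions options = factory.getObject();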

View File

@@ -1350,10 +1350,7 @@ public interface MongoOperations extends FluentMongoOperations {
UpdateResult updateMulti(Query query, Update update, Class<?> entityClass, String collectionName);
/**
* Remove the given object from the collection by {@literal id} and (if applicable) its
* {@link org.springframework.data.annotation.Version}. <br />
* Use {@link DeleteResult#getDeletedCount()} for insight whether an {@link DeleteResult#wasAcknowledged()
* acknowledged} remove operation was successful or not.
* Remove the given object from the collection by id.
*
* @param object must not be {@literal null}.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
@@ -1361,10 +1358,7 @@ public interface MongoOperations extends FluentMongoOperations {
DeleteResult remove(Object object);
/**
* Removes the given object from the given collection by {@literal id} and (if applicable) its
* {@link org.springframework.data.annotation.Version}. <br />
* Use {@link DeleteResult#getDeletedCount()} for insight whether an {@link DeleteResult#wasAcknowledged()
* acknowledged} remove operation was successful or not.
* Removes the given object from the given collection.
*
* @param object must not be {@literal null}.
* @param collectionName name of the collection where the objects will be removed, must not be {@literal null} or empty.
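The removed javadoc hints at how callers can inspect the outcome; a short sketch (the mongoOperations and person names are assumed here):
DeleteResult result = mongoOperations.remove(person);
if (result.wasAcknowledged() && result.getDeletedCount() == 0) {
    // nothing was deleted: the document was gone already, or (with the
    // version-aware 2.2 behaviour) the stored version did not match
}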

View File

@@ -68,16 +68,7 @@ import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.Fields;
import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.JsonSchemaMapper;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
import org.springframework.data.mongodb.core.convert.MongoJsonSchemaMapper;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.convert.*;
import org.springframework.data.mongodb.core.index.IndexOperations;
import org.springframework.data.mongodb.core.index.IndexOperationsProvider;
import org.springframework.data.mongodb.core.index.MongoMappingEventPublisher;
@@ -104,7 +95,6 @@ import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.data.mongodb.core.query.UpdateDefinition.ArrayFilter;
import org.springframework.data.mongodb.core.validation.Validator;
import org.springframework.data.projection.SpelAwareProxyProjectionFactory;
import org.springframework.data.util.CloseableIterator;
@@ -256,14 +246,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
mappingContext = this.mongoConverter.getMappingContext();
// We create indexes based on mapping events
if (mappingContext instanceof MongoMappingContext) {
MongoMappingContext mappingContext = (MongoMappingContext) this.mappingContext;
if (mappingContext.isAutoIndexCreation()) {
indexCreator = new MongoPersistentEntityIndexCreator(mappingContext, this);
eventPublisher = new MongoMappingEventPublisher(indexCreator);
mappingContext.setApplicationEventPublisher(eventPublisher);
indexCreator = new MongoPersistentEntityIndexCreator((MongoMappingContext) mappingContext, this);
eventPublisher = new MongoMappingEventPublisher(indexCreator);
if (mappingContext instanceof ApplicationEventPublisherAware) {
((ApplicationEventPublisherAware) mappingContext).setApplicationEventPublisher(eventPublisher);
}
}
}
@@ -973,7 +959,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Document nearDocument = near.toDocument();
Document command = new Document("geoNear", collection);
command.putAll(queryMapper.getMappedObject(nearDocument, Optional.empty()));
command.putAll(nearDocument);
if (nearDocument.containsKey("query")) {
Document query = (Document) nearDocument.get("query");
@@ -1570,7 +1556,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
return doUpdate(collectionName, query, update, entityClass, false, true);
}
protected UpdateResult doUpdate(final String collectionName, final Query query, final UpdateDefinition update,
protected UpdateResult doUpdate(final String collectionName, final Query query, final Update update,
@Nullable final Class<?> entityClass, final boolean upsert, final boolean multi) {
return doUpdate(collectionName, query, (UpdateDefinition) update, entityClass, upsert, multi);
}
private UpdateResult doUpdate(final String collectionName, final Query query, final UpdateDefinition update,
@Nullable final Class<?> entityClass, final boolean upsert, final boolean multi) {
Assert.notNull(collectionName, "CollectionName must not be null!");
@@ -1588,11 +1579,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
UpdateOptions opts = new UpdateOptions();
opts.upsert(upsert);
if (update.hasArrayFilters()) {
opts.arrayFilters(
update.getArrayFilters().stream().map(ArrayFilter::asDocument).collect(Collectors.toList()));
}
Document queryObj = new Document();
if (query != null) {
@@ -1601,8 +1587,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
query.getCollation().map(Collation::toMongoCollation).ifPresent(opts::collation);
}
Document updateObj = update instanceof MappedUpdate ? update.getUpdateObject()
: updateMapper.getMappedObject(update.getUpdateObject(), entity);
Document updateObj = update instanceof MappedUpdate ? update.getUpdateObject() : updateMapper.getMappedObject(update.getUpdateObject(), entity);
if (multi && update.isIsolated() && !queryObj.containsKey("$isolated")) {
queryObj.put("$isolated", 1);
@@ -1637,8 +1622,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
});
}
private void increaseVersionForUpdateIfNecessary(@Nullable MongoPersistentEntity<?> persistentEntity,
UpdateDefinition update) {
private void increaseVersionForUpdateIfNecessary(@Nullable MongoPersistentEntity<?> persistentEntity, UpdateDefinition update) {
if (persistentEntity != null && persistentEntity.hasVersionProperty()) {
String versionFieldName = persistentEntity.getRequiredVersionProperty().getFieldName();
@@ -1653,7 +1637,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Assert.notNull(object, "Object must not be null!");
return remove(object, operations.determineCollectionName(object.getClass()));
Query query = operations.forEntity(object).getByIdQuery();
return remove(query, object.getClass());
}
@Override
@@ -1662,7 +1648,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Assert.notNull(object, "Object must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
Query query = operations.forEntity(object).getRemoveByQuery();
Query query = operations.forEntity(object).getByIdQuery();
return doRemove(collectionName, query, object.getClass(), false);
}
@@ -2557,9 +2543,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
collectionName);
}
return executeFindOneInternal(
new FindAndModifyCallback(mappedQuery, fields, sort, mappedUpdate,
update.getArrayFilters().stream().map(ArrayFilter::asDocument).collect(Collectors.toList()), options),
return executeFindOneInternal(new FindAndModifyCallback(mappedQuery, fields, sort, mappedUpdate, options),
new ReadDocumentCallback<>(readerToUse, entityClass, collectionName), collectionName);
}
@@ -2916,16 +2900,14 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
private final Document fields;
private final Document sort;
private final Document update;
private final List<Document> arrayFilters;
private final FindAndModifyOptions options;
public FindAndModifyCallback(Document query, Document fields, Document sort, Document update,
List<Document> arrayFilters, FindAndModifyOptions options) {
FindAndModifyOptions options) {
this.query = query;
this.fields = fields;
this.sort = sort;
this.update = update;
this.arrayFilters = arrayFilters;
this.options = options;
}
@@ -2943,10 +2925,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
options.getCollation().map(Collation::toMongoCollation).ifPresent(opts::collation);
if (!arrayFilters.isEmpty()) {
opts.arrayFilters(arrayFilters);
}
return collection.findOneAndUpdate(query, update, opts);
}
}
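The hunks above strip the array-filter plumbing from this back-port; on the 2.2 line the feature is driven roughly like this (Student and its fields are assumed domain names):
Update update = new Update()
        .set("grades.$[g].mean", 100)                    // positional update via an identifier
        .filterArray(Criteria.where("g.grade").gte(85)); // array filter surfaced through UpdateDefinition#getArrayFilters()
template.updateMulti(new Query(Criteria.where("semester").is(1)), update, Student.class);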

View File

@@ -31,6 +31,7 @@ import java.util.function.Consumer;
import java.util.function.Function;
import java.util.stream.Collectors;
import org.bson.BsonTimestamp;
import org.bson.BsonValue;
import org.bson.Document;
import org.bson.codecs.Codec;
@@ -40,7 +41,6 @@ import org.reactivestreams.Publisher;
import org.reactivestreams.Subscriber;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
@@ -93,7 +93,6 @@ import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.data.mongodb.core.query.UpdateDefinition.ArrayFilter;
import org.springframework.data.mongodb.core.validation.Validator;
import org.springframework.data.projection.SpelAwareProxyProjectionFactory;
import org.springframework.data.util.Optionals;
@@ -234,15 +233,12 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
if (this.mappingContext instanceof MongoMappingContext) {
MongoMappingContext mongoMappingContext = (MongoMappingContext) this.mappingContext;
this.indexCreator = new ReactiveMongoPersistentEntityIndexCreator(mongoMappingContext, this::indexOps);
this.eventPublisher = new MongoMappingEventPublisher(this.indexCreatorListener);
if (mongoMappingContext.isAutoIndexCreation()) {
this.indexCreator = new ReactiveMongoPersistentEntityIndexCreator(mongoMappingContext, this::indexOps);
this.eventPublisher = new MongoMappingEventPublisher(this.indexCreatorListener);
mongoMappingContext.setApplicationEventPublisher(this.eventPublisher);
this.mappingContext.getPersistentEntities()
.forEach(entity -> onCheckForIndexes(entity, subscriptionExceptionHandler));
}
mongoMappingContext.setApplicationEventPublisher(this.eventPublisher);
this.mappingContext.getPersistentEntities()
.forEach(entity -> onCheckForIndexes(entity, subscriptionExceptionHandler));
}
}
@@ -1616,7 +1612,12 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return doUpdate(collectionName, query, update, entityClass, false, true);
}
protected Mono<UpdateResult> doUpdate(String collectionName, Query query, @Nullable UpdateDefinition update,
protected Mono<UpdateResult> doUpdate(String collectionName, Query query, @Nullable Update update,
@Nullable Class<?> entityClass, boolean upsert, boolean multi) {
return doUpdate(collectionName, query, (UpdateDefinition) update, entityClass, upsert, multi);
}
private Mono<UpdateResult> doUpdate(String collectionName, Query query, @Nullable UpdateDefinition update,
@Nullable Class<?> entityClass, boolean upsert, boolean multi) {
MongoPersistentEntity<?> entity = entityClass == null ? null : getPersistentEntity(entityClass);
@@ -1642,11 +1643,6 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
UpdateOptions updateOptions = new UpdateOptions().upsert(upsert);
query.getCollation().map(Collation::toMongoCollation).ifPresent(updateOptions::collation);
if (update.hasArrayFilters()) {
updateOptions.arrayFilters(update.getArrayFilters().stream().map(ArrayFilter::asDocument)
.map(it -> queryMapper.getMappedObject(it, entity)).collect(Collectors.toList()));
}
if (!UpdateMapper.isUpdateObject(updateObj)) {
ReplaceOptions replaceOptions = new ReplaceOptions();
@@ -1695,7 +1691,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return false;
}
return document.containsKey(persistentEntity.getRequiredVersionProperty().getFieldName());
return document.containsKey(persistentEntity.getRequiredIdProperty().getFieldName());
}
/*
@@ -1724,7 +1720,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
Assert.notNull(object, "Object must not be null!");
return remove(operations.forEntity(object).getRemoveByQuery(), object.getClass());
return remove(operations.forEntity(object).getByIdQuery(), object.getClass());
}
/*
@@ -1736,7 +1732,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
Assert.notNull(object, "Object must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
return doRemove(collectionName, operations.forEntity(object).getRemoveByQuery(), object.getClass());
return doRemove(collectionName, operations.forEntity(object).getByIdQuery(), object.getClass());
}
private void assertUpdateableIdIfNotSet(Object value) {
@@ -1794,14 +1790,15 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
Document queryObject = query.getQueryObject();
MongoPersistentEntity<?> entity = getPersistentEntity(entityClass);
Document removeQuery = queryMapper.getMappedObject(queryObject, entity);
return execute(collectionName, collection -> {
maybeEmitEvent(new BeforeDeleteEvent<>(removeQuery, entityClass, collectionName));
Document removeQuey = queryMapper.getMappedObject(queryObject, entity);
maybeEmitEvent(new BeforeDeleteEvent<>(removeQuey, entityClass, collectionName));
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.REMOVE, collectionName, entityClass,
null, removeQuery);
null, removeQuey);
DeleteOptions deleteOptions = new DeleteOptions();
query.getCollation().map(Collation::toMongoCollation).ifPresent(deleteOptions::collation);
@@ -1811,13 +1808,13 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Remove using query: {} in collection: {}.",
new Object[] { serializeToJsonSafely(removeQuery), collectionName });
new Object[] { serializeToJsonSafely(removeQuey), collectionName });
}
if (query.getLimit() > 0 || query.getSkip() > 0) {
FindPublisher<Document> cursor = new QueryFindPublisherPreparer(query, entityClass)
.prepare(collection.find(removeQuery)) //
.prepare(collection.find(removeQuey)) //
.projection(MappedDocument.getIdOnlyProjection());
return Flux.from(cursor) //
@@ -1828,10 +1825,10 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return collectionToUse.deleteMany(MappedDocument.getIdIn(val), deleteOptions);
});
} else {
return collectionToUse.deleteMany(removeQuery, deleteOptions);
return collectionToUse.deleteMany(removeQuey, deleteOptions);
}
}).doOnNext(it -> maybeEmitEvent(new AfterDeleteEvent<>(queryObject, entityClass, collectionName))) //
}).doOnNext(deleteResult -> maybeEmitEvent(new AfterDeleteEvent<>(queryObject, entityClass, collectionName)))
.next();
}
@@ -1932,7 +1929,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
publisher = options.getResumeToken().map(BsonValue::asDocument).map(publisher::resumeAfter).orElse(publisher);
publisher = options.getCollation().map(Collation::toMongoCollation).map(publisher::collation).orElse(publisher);
publisher = options.getResumeBsonTimestamp().map(publisher::startAtOperationTime).orElse(publisher);
publisher = options.getResumeTimestamp().map(it -> new BsonTimestamp((int) it.getEpochSecond(), 0))
.map(publisher::startAtOperationTime).orElse(publisher);
publisher = publisher.fullDocument(options.getFullDocumentLookup().orElse(fullDocument));
return Flux.from(publisher).map(document -> new ChangeStreamEvent<>(document, targetType, getConverter()));
@@ -2146,7 +2144,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
Flux<T> flux = find(query, entityClass, collectionName);
return Flux.from(flux).collectList().filter(it -> !it.isEmpty())
return Flux.from(flux).collectList()
.flatMapMany(list -> Flux.from(remove(operations.getByIdInQuery(list), entityClass, collectionName))
.flatMap(deleteResult -> Flux.fromIterable(list)));
}
@@ -2374,7 +2372,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
collectionName));
}
return executeFindOneInternal(new FindAndModifyCallback(mappedQuery, fields, sort, mappedUpdate, update.getArrayFilters().stream().map(ArrayFilter::asDocument).collect(Collectors.toList()), options),
return executeFindOneInternal(new FindAndModifyCallback(mappedQuery, fields, sort, mappedUpdate, options),
new ReadDocumentCallback<>(this.mongoConverter, entityClass, collectionName), collectionName);
});
}
@@ -2758,7 +2756,6 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
private final Document fields;
private final Document sort;
private final Document update;
private final List<Document> arrayFilters;
private final FindAndModifyOptions options;
@Override
@@ -2774,12 +2771,12 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return collection.findOneAndDelete(query, findOneAndDeleteOptions);
}
FindOneAndUpdateOptions findOneAndUpdateOptions = convertToFindOneAndUpdateOptions(options, fields, sort, arrayFilters);
FindOneAndUpdateOptions findOneAndUpdateOptions = convertToFindOneAndUpdateOptions(options, fields, sort);
return collection.findOneAndUpdate(query, update, findOneAndUpdateOptions);
}
private static FindOneAndUpdateOptions convertToFindOneAndUpdateOptions(FindAndModifyOptions options, Document fields,
Document sort, List<Document> arrayFilters) {
private FindOneAndUpdateOptions convertToFindOneAndUpdateOptions(FindAndModifyOptions options, Document fields,
Document sort) {
FindOneAndUpdateOptions result = new FindOneAndUpdateOptions();
@@ -2792,7 +2789,6 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
}
result = options.getCollation().map(Collation::toMongoCollation).map(result::collation).orElse(result);
result.arrayFilters(arrayFilters);
return result;
}
@@ -3193,7 +3189,6 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
// Double check type as Spring infrastructure does not consider nested generics
if (entity instanceof MongoPersistentEntity) {
onCheckForIndexes((MongoPersistentEntity<?>) entity, subscriptionExceptionHandler);
}
}
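For reference, resuming a change stream with the options handled in the hunks above might look like this (the resume point is an assumed value):
ChangeStreamOptions options = ChangeStreamOptions.builder()
        .resumeAt(Instant.now().minusSeconds(30)) // mapped to a BsonTimestamp as shown in the hunk
        .build();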

View File

@@ -20,7 +20,6 @@ import java.util.Map.Entry;
import org.bson.Document;
import org.bson.conversions.Bson;
import org.bson.types.ObjectId;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.BeansException;
@@ -515,7 +514,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (idProperty != null && !dbObjectAccessor.hasValue(idProperty)) {
Object value = idMapper.convertId(accessor.getProperty(idProperty), idProperty.getFieldType());
Object value = idMapper.convertId(accessor.getProperty(idProperty));
if (value != null) {
dbObjectAccessor.put(idProperty, value);
@@ -619,7 +618,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return;
}
MongoPersistentEntity<?> entity = valueType.isSubTypeOf(prop.getType())
MongoPersistentEntity<?> entity = isSubTypeOf(obj.getClass(), prop.getType())
? mappingContext.getRequiredPersistentEntity(obj.getClass())
: mappingContext.getRequiredPersistentEntity(type);
@@ -976,8 +975,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
throw new MappingException("Cannot create a reference to an object with a NULL id.");
}
return dbRefResolver.createDbRef(property == null ? null : property.getDBRef(), entity,
idMapper.convertId(id, idProperty != null ? idProperty.getFieldType() : ObjectId.class));
return dbRefResolver.createDbRef(property == null ? null : property.getDBRef(), entity, idMapper.convertId(id));
}
throw new MappingException("No id property found on class " + entity.getType());
@@ -1007,8 +1005,9 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Assert.notNull(targetType, "Target type must not be null!");
Assert.notNull(path, "Object path must not be null!");
Class<?> collectionType = targetType.isSubTypeOf(Collection.class) //
? targetType.getType() //
Class<?> collectionType = targetType.getType();
collectionType = isSubTypeOf(collectionType, Collection.class) //
? collectionType //
: List.class;
TypeInformation<?> componentType = targetType.getComponentType() != null //
@@ -1625,6 +1624,17 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return true;
}
/**
* Returns whether the given type is a sub type of the given reference, i.e. assignable but not the exact same type.
*
* @param type must not be {@literal null}.
* @param reference must not be {@literal null}.
* @return
*/
private static boolean isSubTypeOf(Class<?> type, Class<?> reference) {
return !type.equals(reference) && reference.isAssignableFrom(type);
}
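The contract of the restored helper, in examples:
isSubTypeOf(ArrayList.class, Collection.class);  // true  - assignable, not the same type
isSubTypeOf(Collection.class, Collection.class); // false - exact same type
isSubTypeOf(String.class, Collection.class);     // false - not assignable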
/**
* Marker class used to indicate we have a non root document object here that might be used within an update - so we
* need to preserve type hints for potential nested elements but need to remove it on top level.

View File

@@ -18,8 +18,6 @@ package org.springframework.data.mongodb.core.convert;
import org.bson.BsonValue;
import org.bson.Document;
import org.bson.conversions.Bson;
import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionException;
import org.springframework.data.convert.EntityConverter;
import org.springframework.data.convert.EntityReader;
import org.springframework.data.convert.TypeMapper;
@@ -85,18 +83,7 @@ public interface MongoConverter
if (sourceDocument.containsKey("$ref") && sourceDocument.containsKey("$id")) {
Object id = sourceDocument.get("$id");
String collection = sourceDocument.getString("$ref");
MongoPersistentEntity<?> entity = getMappingContext().getPersistentEntity(targetType);
if (entity != null && entity.hasIdProperty()) {
id = convertId(id, entity.getIdProperty().getFieldType());
}
DBRef ref = sourceDocument.containsKey("$db") ? new DBRef(sourceDocument.getString("$db"), collection, id)
: new DBRef(collection, id);
sourceDocument = dbRefResolver.fetch(ref);
sourceDocument = dbRefResolver.fetch(new DBRef(sourceDocument.getString("$ref"), sourceDocument.get("$id")));
if (sourceDocument == null) {
return null;
}
@@ -115,38 +102,4 @@ public interface MongoConverter
}
return getConversionService().convert(source, targetType);
}
/**
* Converts the given raw id value into either {@link ObjectId} or {@link String}.
*
* @param id
* @param targetType
* @return {@literal null} if source {@literal id} is already {@literal null}.
* @since 2.2
*/
@Nullable
default Object convertId(@Nullable Object id, Class<?> targetType) {
if (id == null) {
return null;
}
if (ClassUtils.isAssignable(ObjectId.class, targetType)) {
if (id instanceof String) {
if (ObjectId.isValid(id.toString())) {
return new ObjectId(id.toString());
}
}
}
try {
return getConversionService().canConvert(id.getClass(), targetType)
? getConversionService().convert(id, targetType)
: convertToMongoType(id, null);
} catch (ConversionException o_O) {
return convertToMongoType(id, null);
}
}
}
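Behaviour of the removed MongoConverter#convertId(Object, Class) default method, sketched with illustrative values (converter is an assumed MongoConverter instance):
converter.convertId("5ca4a34a0df52d08a1a2f3f4", ObjectId.class); // valid 24-char hex -> new ObjectId(...)
converter.convertId("user-42", ObjectId.class);                  // not a valid hex id -> falls back, remains a String value
converter.convertId(null, ObjectId.class);                       // -> null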

View File

@@ -15,12 +15,9 @@
*/
package org.springframework.data.mongodb.core.convert;
import static org.springframework.data.convert.ConverterBuilder.*;
import java.math.BigDecimal;
import java.math.BigInteger;
import java.net.MalformedURLException;
import java.net.URI;
import java.net.URL;
import java.time.Instant;
import java.util.ArrayList;
@@ -93,8 +90,6 @@ abstract class MongoConverters {
converters.add(BinaryToByteArrayConverter.INSTANCE);
converters.add(BsonTimestampToInstantConverter.INSTANCE);
converters.add(reading(String.class, URI.class, URI::create).andWriting(URI::toString));
return converters;
}

View File

@@ -15,25 +15,14 @@
*/
package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.HashSet;
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.*;
import java.util.Map.Entry;
import java.util.Optional;
import java.util.Set;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.bson.BsonValue;
import org.bson.Document;
import org.bson.conversions.Bson;
import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionException;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.domain.Example;
@@ -57,7 +46,6 @@ import org.springframework.data.util.TypeInformation;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
@@ -257,16 +245,7 @@ public class QueryMapper {
*/
protected Field createPropertyField(@Nullable MongoPersistentEntity<?> entity, String key,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
if (entity == null) {
return new Field(key);
}
if (Field.ID_KEY.equals(key)) {
return new MetadataBackedField(key, entity, mappingContext, entity.getIdProperty());
}
return new MetadataBackedField(key, entity, mappingContext);
return entity == null ? new Field(key) : new MetadataBackedField(key, entity, mappingContext);
}
/**
@@ -344,11 +323,11 @@ public class QueryMapper {
String inKey = valueDbo.containsField("$in") ? "$in" : "$nin";
List<Object> ids = new ArrayList<Object>();
for (Object id : (Iterable<?>) valueDbo.get(inKey)) {
ids.add(convertId(id, getIdTypeForField(documentField)));
ids.add(convertId(id));
}
resultDbo.put(inKey, ids);
} else if (valueDbo.containsField("$ne")) {
resultDbo.put("$ne", convertId(valueDbo.get("$ne"), getIdTypeForField(documentField)));
resultDbo.put("$ne", convertId(valueDbo.get("$ne")));
} else {
return getMappedObject(resultDbo, Optional.empty());
}
@@ -363,18 +342,18 @@ public class QueryMapper {
String inKey = valueDbo.containsKey("$in") ? "$in" : "$nin";
List<Object> ids = new ArrayList<Object>();
for (Object id : (Iterable<?>) valueDbo.get(inKey)) {
ids.add(convertId(id, getIdTypeForField(documentField)));
ids.add(convertId(id));
}
resultDbo.put(inKey, ids);
} else if (valueDbo.containsKey("$ne")) {
resultDbo.put("$ne", convertId(valueDbo.get("$ne"), getIdTypeForField(documentField)));
resultDbo.put("$ne", convertId(valueDbo.get("$ne")));
} else {
return getMappedObject(resultDbo, Optional.empty());
}
return resultDbo;
} else {
return convertId(value, getIdTypeForField(documentField));
return convertId(value);
}
}
@@ -389,14 +368,6 @@ public class QueryMapper {
return convertSimpleOrDocument(value, documentField.getPropertyEntity());
}
private boolean isIdField(Field documentField) {
return documentField.getProperty() != null && documentField.getProperty().isIdProperty();
}
private Class<?> getIdTypeForField(Field documentField) {
return isIdField(documentField) ? documentField.getProperty().getFieldType() : ObjectId.class;
}
/**
* Returns whether the given {@link Field} represents an association reference that together with the given value
* requires conversion to a {@link org.springframework.data.mongodb.core.mapping.DBRef} object. We check whether the
@@ -517,14 +488,7 @@ public class QueryMapper {
if (source instanceof DBRef) {
DBRef ref = (DBRef) source;
Object id = convertId(ref.getId(),
property != null && property.isIdProperty() ? property.getFieldType() : ObjectId.class);
if (StringUtils.hasText(ref.getDatabaseName())) {
return new DBRef(ref.getDatabaseName(), ref.getCollectionName(), id);
} else {
return new DBRef(ref.getCollectionName(), id);
}
return new DBRef(ref.getCollectionName(), convertId(ref.getId()));
}
if (source instanceof Iterable) {
@@ -605,24 +569,24 @@ public class QueryMapper {
*
* @param id
* @return
* @since 2.2
*/
@Nullable
public Object convertId(@Nullable Object id) {
return convertId(id, ObjectId.class);
}
/**
* Converts the given raw id value into either {@link ObjectId} or {@link Class targetType}.
*
* @param id can be {@literal null}.
* @param targetType
* @return the converted {@literal id} or {@literal null} if the source was already {@literal null}.
* @since 2.2
*/
@Nullable
public Object convertId(@Nullable Object id, Class<?> targetType) {
return converter.convertId(id, targetType);
if (id == null) {
return null;
}
if (id instanceof String) {
return ObjectId.isValid(id.toString()) ? conversionService.convert(id, ObjectId.class) : id;
}
try {
return conversionService.canConvert(id.getClass(), ObjectId.class) ? conversionService.convert(id, ObjectId.class)
: delegateConvertToMongoType(id, null);
} catch (ConversionException o_O) {
return delegateConvertToMongoType(id, null);
}
}
/**
@@ -770,8 +734,6 @@ public class QueryMapper {
*/
protected static class Field {
protected static final Pattern POSITIONAL_OPERATOR = Pattern.compile("\\$\\[.*\\]");
private static final String ID_KEY = "_id";
protected final String name;
@@ -1044,10 +1006,7 @@ public class QueryMapper {
try {
String rawPath = pathExpression.replaceAll("\\.\\d+", "") //
.replaceAll(POSITIONAL_OPERATOR.pattern(), "");
PropertyPath path = PropertyPath.from(rawPath, entity.getTypeInformation());
PropertyPath path = PropertyPath.from(pathExpression.replaceAll("\\.\\d+", ""), entity.getTypeInformation());
if (isPathToJavaLangClassProperty(path)) {
return null;
@@ -1196,11 +1155,6 @@ public class QueryMapper {
return true;
}
Matcher matcher = POSITIONAL_OPERATOR.matcher(partial);
if (matcher.find()) {
return true;
}
try {
Long.valueOf(partial);
return true;

View File

@@ -289,7 +289,7 @@ public class UpdateMapper extends QueryMapper {
public MetadataBackedUpdateField(MongoPersistentEntity<?> entity, String key,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
super(key.replaceAll("\\.\\$(\\[.*\\])?", ""), entity, mappingContext);
super(key.replaceAll("\\.\\$", ""), entity, mappingContext);
this.key = key;
}
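What the two regex variants strip from an update key, side by side (illustrative inputs):
"grades.$.mean".replaceAll("\\.\\$(\\[.*\\])?", "");    // -> "grades.mean"    (2.2 pattern)
"grades.$[g].mean".replaceAll("\\.\\$(\\[.*\\])?", ""); // -> "grades.mean"    (2.2 pattern also handles filtered positions)
"grades.$[g].mean".replaceAll("\\.\\$", "");            // -> "grades[g].mean" (2.1 pattern leaves the filter residue behind)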

View File

@@ -55,8 +55,7 @@ public @interface CompoundIndex {
/**
* @return
* @see <a href=
* "https://docs.mongodb.org/manual/core/index-unique/">https://docs.mongodb.org/manual/core/index-unique/</a>
* @see <a href="https://docs.mongodb.org/manual/core/index-unique/">https://docs.mongodb.org/manual/core/index-unique/</a>
*/
boolean unique() default false;
@@ -64,15 +63,13 @@ public @interface CompoundIndex {
* If set to true index will skip over any document that is missing the indexed field.
*
* @return
* @see <a href=
* "https://docs.mongodb.org/manual/core/index-sparse/">https://docs.mongodb.org/manual/core/index-sparse/</a>
* @see <a href="https://docs.mongodb.org/manual/core/index-sparse/">https://docs.mongodb.org/manual/core/index-sparse/</a>
*/
boolean sparse() default false;
/**
* @return
* @see <a href=
* "https://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping">https://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping</a>
* @see <a href="https://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping">https://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping</a>
* @deprecated since 2.1. No longer supported by MongoDB as of server version 3.0.
*/
@Deprecated
@@ -134,8 +131,7 @@ public @interface CompoundIndex {
* If {@literal true} the index will be created in the background.
*
* @return
* @see <a href=
* "https://docs.mongodb.org/manual/core/indexes/#background-construction">https://docs.mongodb.org/manual/core/indexes/#background-construction</a>
* @see <a href="https://docs.mongodb.org/manual/core/indexes/#background-construction">https://docs.mongodb.org/manual/core/indexes/#background-construction</a>
*/
boolean background() default false;

View File

@@ -15,54 +15,25 @@
*/
package org.springframework.data.mongodb.core.index;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexResolver.IndexDefinitionHolder;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
/**
* {@link IndexResolver} finds those {@link IndexDefinition}s to be created for a given class.
*
* @author Christoph Strobl
* @author Thomas Darimont
* @author Mark Paluch
* @since 1.5
*/
public interface IndexResolver {
interface IndexResolver {
/**
* Creates a new {@link IndexResolver} given {@link MongoMappingContext}.
*
* @param mappingContext must not be {@literal null}.
* @return the new {@link IndexResolver}.
* @since 2.2
*/
static IndexResolver create(MongoMappingContext mappingContext) {
Assert.notNull(mappingContext, "MongoMappingContext must not be null!");
return new MongoPersistentEntityIndexResolver(mappingContext);
}
/**
* Find and create {@link IndexDefinition}s for properties of given {@link TypeInformation}. {@link IndexDefinition}s
* are created for properties and types with {@link Indexed}, {@link CompoundIndexes} or {@link GeoSpatialIndexed}.
* Find and create {@link IndexDefinition}s for properties of given {@link TypeInformation}. {@link IndexDefinition}s are created
* for properties and types with {@link Indexed}, {@link CompoundIndexes} or {@link GeoSpatialIndexed}.
*
* @param typeInformation
* @return Empty {@link Iterable} in case no {@link IndexDefinition} could be resolved for type.
*/
Iterable<? extends IndexDefinition> resolveIndexFor(TypeInformation<?> typeInformation);
/**
* Find and create {@link IndexDefinition}s for properties of given {@link TypeInformation}. {@link IndexDefinition}s
* are created for properties and types with {@link Indexed}, {@link CompoundIndexes} or {@link GeoSpatialIndexed}.
*
* @param entityType
* @return Empty {@link Iterable} in case no {@link IndexDefinition} could be resolved for type.
* @since 2.2
*/
default Iterable<? extends IndexDefinition> resolveIndexFor(Class<?> entityType) {
return resolveIndexFor(ClassTypeInformation.from(entityType));
}
Iterable<? extends IndexDefinitionHolder> resolveIndexFor(TypeInformation<?> typeInformation);
}
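The removed 2.2 factory enables manual index setup along these lines (mongoMappingContext, mongoTemplate and DomainType are assumed names):
IndexResolver resolver = IndexResolver.create(mongoMappingContext); // 2.2 API removed by this back-port
IndexOperations indexOps = mongoTemplate.indexOps(DomainType.class);
resolver.resolveIndexFor(DomainType.class).forEach(indexOps::ensureIndex);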

View File

@@ -31,7 +31,7 @@ import java.lang.annotation.Target;
* @author Christoph Strobl
* @author Jordi Llach
*/
@Target({ ElementType.ANNOTATION_TYPE, ElementType.FIELD })
@Target({ElementType.ANNOTATION_TYPE, ElementType.FIELD})
@Retention(RetentionPolicy.RUNTIME)
public @interface Indexed {
@@ -39,8 +39,7 @@ public @interface Indexed {
* If set to true reject all documents that contain a duplicate value for the indexed field.
*
* @return
* @see <a href=
* "https://docs.mongodb.org/manual/core/index-unique/">https://docs.mongodb.org/manual/core/index-unique/</a>
* @see <a href="https://docs.mongodb.org/manual/core/index-unique/">https://docs.mongodb.org/manual/core/index-unique/</a>
*/
boolean unique() default false;
@@ -50,15 +49,13 @@ public @interface Indexed {
* If set to true index will skip over any document that is missing the indexed field.
*
* @return
* @see <a href=
* "https://docs.mongodb.org/manual/core/index-sparse/">https://docs.mongodb.org/manual/core/index-sparse/</a>
* @see <a href="https://docs.mongodb.org/manual/core/index-sparse/">https://docs.mongodb.org/manual/core/index-sparse/</a>
*/
boolean sparse() default false;
/**
* @return
* @see <a href=
* "https://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping">https://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping</a>
* @see <a href="https://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping">https://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping</a>
* @deprecated since 2.1. No longer supported by MongoDB as of server version 3.0.
*/
@Deprecated
@@ -118,8 +115,7 @@ public @interface Indexed {
* If {@literal true} the index will be created in the background.
*
* @return
* @see <a href=
* "https://docs.mongodb.org/manual/core/indexes/#background-construction">https://docs.mongodb.org/manual/core/indexes/#background-construction</a>
* @see <a href="https://docs.mongodb.org/manual/core/indexes/#background-construction">https://docs.mongodb.org/manual/core/indexes/#background-construction</a>
*/
boolean background() default false;
@@ -127,8 +123,7 @@ public @interface Indexed {
* Configures the number of seconds after which the collection should expire. Defaults to -1 for no expiry.
*
* @return
* @see <a href=
* "https://docs.mongodb.org/manual/tutorial/expire-data/">https://docs.mongodb.org/manual/tutorial/expire-data/</a>
* @see <a href="https://docs.mongodb.org/manual/tutorial/expire-data/">https://docs.mongodb.org/manual/tutorial/expire-data/</a>
*/
int expireAfterSeconds() default -1;
}

View File

@@ -1,84 +0,0 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import java.util.Collections;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentSkipListSet;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* @author Christoph Strobl
* @since 2.2
*/
class JustOnceLogger {
private static final Map<String, Set<String>> KNOWN_LOGS = new ConcurrentHashMap<>();
private static final String AUTO_INDEX_CREATION_CONFIG_CHANGE;
static {
AUTO_INDEX_CREATION_CONFIG_CHANGE = "Automatic index creation will be disabled by default as of Spring Data MongoDB 3.x."
+ System.lineSeparator()
+ "\tPlease use 'MongoMappingContext#setAutoIndexCreation(boolean)' or override 'MongoConfigurationSupport#autoIndexCreation()' to be explicit."
+ System.lineSeparator()
+ "\tHowever, we recommend setting up indices manually in an application ready block. You may use index derivation there as well."
+ System.lineSeparator() + System.lineSeparator() //
+ "\t> -----------------------------------------------------------------------------------------"
+ System.lineSeparator() //
+ "\t> @EventListener(ApplicationReadyEvent.class)" + System.lineSeparator() //
+ "\t> public void initIndicesAfterStartup() {" + System.lineSeparator() //
+ "\t>" + System.lineSeparator() //
+ "\t> IndexOperations indexOps = mongoTemplate.indexOps(DomainType.class);" + System.lineSeparator()//
+ "\t>" + System.lineSeparator() //
+ "\t> IndexResolver resolver = new MongoPersistentEntityIndexResolver(mongoMappingContext);"
+ System.lineSeparator() //
+ "\t> resolver.resolveIndexFor(DomainType.class).forEach(indexOps::ensureIndex);" + System.lineSeparator() //
+ "\t> }" + System.lineSeparator() //
+ "\t> -----------------------------------------------------------------------------------------"
+ System.lineSeparator();
}
static void logWarnIndexCreationConfigurationChange(String loggerName) {
warnOnce(loggerName, AUTO_INDEX_CREATION_CONFIG_CHANGE);
}
static void warnOnce(String loggerName, String message) {
Logger logger = LoggerFactory.getLogger(loggerName);
if (!logger.isWarnEnabled()) {
return;
}
if (!KNOWN_LOGS.containsKey(loggerName)) {
KNOWN_LOGS.put(loggerName, new ConcurrentSkipListSet<>(Collections.singleton(message)));
logger.warn(message);
} else {
Set<String> messages = KNOWN_LOGS.get(loggerName);
if (messages.contains(message)) {
return;
}
messages.add(message);
logger.warn(message);
}
}
}

View File

@@ -63,13 +63,11 @@ public class MongoPersistentEntityIndexCreator implements ApplicationListener<Ma
/**
* Creates a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* {@link MongoDbFactory}.
*
* @param mappingContext must not be {@literal null}.
* @param mappingContext must not be {@literal null}.
* @param indexOperationsProvider must not be {@literal null}.
*/
public MongoPersistentEntityIndexCreator(MongoMappingContext mappingContext,
IndexOperationsProvider indexOperationsProvider) {
this(mappingContext, indexOperationsProvider, IndexResolver.create(mappingContext));
public MongoPersistentEntityIndexCreator(MongoMappingContext mappingContext, IndexOperationsProvider indexOperationsProvider) {
this(mappingContext, indexOperationsProvider, new MongoPersistentEntityIndexResolver(mappingContext));
}
/**
@@ -80,8 +78,8 @@ public class MongoPersistentEntityIndexCreator implements ApplicationListener<Ma
* @param mongoDbFactory must not be {@literal null}.
* @param indexResolver must not be {@literal null}.
*/
public MongoPersistentEntityIndexCreator(MongoMappingContext mappingContext,
IndexOperationsProvider indexOperationsProvider, IndexResolver indexResolver) {
public MongoPersistentEntityIndexCreator(MongoMappingContext mappingContext, IndexOperationsProvider indexOperationsProvider,
IndexResolver indexResolver) {
Assert.notNull(mappingContext, "MongoMappingContext must not be null!");
Assert.notNull(indexOperationsProvider, "IndexOperationsProvider must not be null!");
@@ -110,7 +108,6 @@ public class MongoPersistentEntityIndexCreator implements ApplicationListener<Ma
// Double check type as Spring infrastructure does not consider nested generics
if (entity instanceof MongoPersistentEntity) {
checkForIndexes((MongoPersistentEntity<?>) entity);
}
}
@@ -134,16 +131,8 @@ public class MongoPersistentEntityIndexCreator implements ApplicationListener<Ma
private void checkForAndCreateIndexes(MongoPersistentEntity<?> entity) {
if (entity.isAnnotationPresent(Document.class)) {
for (IndexDefinition indexDefinition : indexResolver.resolveIndexFor(entity.getTypeInformation())) {
JustOnceLogger.logWarnIndexCreationConfigurationChange(this.getClass().getName());
IndexDefinitionHolder indexToCreate = indexDefinition instanceof IndexDefinitionHolder
? (IndexDefinitionHolder) indexDefinition
: new IndexDefinitionHolder("", indexDefinition, entity.getCollection());
for (IndexDefinitionHolder indexToCreate : indexResolver.resolveIndexFor(entity.getTypeInformation())) {
createIndex(indexToCreate);
}
}
}
@@ -157,8 +146,8 @@ public class MongoPersistentEntityIndexCreator implements ApplicationListener<Ma
} catch (UncategorizedMongoDbException ex) {
if (ex.getCause() instanceof MongoException
&& MongoDbErrorCodes.isDataIntegrityViolationCode(((MongoException) ex.getCause()).getCode())) {
if (ex.getCause() instanceof MongoException &&
MongoDbErrorCodes.isDataIntegrityViolationCode(((MongoException) ex.getCause()).getCode())) {
IndexInfo existingIndex = fetchIndexInformation(indexDefinition);
String message = "Cannot create index for '%s' in collection '%s' with keys '%s' and options '%s'.";

View File

@@ -63,7 +63,7 @@ public class ReactiveMongoPersistentEntityIndexCreator {
*/
public ReactiveMongoPersistentEntityIndexCreator(MongoMappingContext mappingContext,
ReactiveIndexOperationsProvider operationsProvider) {
this(mappingContext, operationsProvider, IndexResolver.create(mappingContext));
this(mappingContext, operationsProvider, new MongoPersistentEntityIndexResolver(mappingContext));
}
/**
@@ -125,12 +125,7 @@ public class ReactiveMongoPersistentEntityIndexCreator {
List<Mono<?>> publishers = new ArrayList<>();
if (entity.isAnnotationPresent(Document.class)) {
for (IndexDefinition indexDefinition : indexResolver.resolveIndexFor(entity.getTypeInformation())) {
IndexDefinitionHolder indexToCreate = indexDefinition instanceof IndexDefinitionHolder
? (IndexDefinitionHolder) indexDefinition
: new IndexDefinitionHolder("", indexDefinition, entity.getCollection());
for (IndexDefinitionHolder indexToCreate : indexResolver.resolveIndexFor(entity.getTypeInformation())) {
publishers.add(createIndex(indexToCreate));
}
}
@@ -140,8 +135,6 @@ public class ReactiveMongoPersistentEntityIndexCreator {
Mono<String> createIndex(IndexDefinitionHolder indexDefinition) {
JustOnceLogger.logWarnIndexCreationConfigurationChange(this.getClass().getName());
return operationsProvider.indexOps(indexDefinition.getCollection()).ensureIndex(indexDefinition) //
.onErrorResume(ReactiveMongoPersistentEntityIndexCreator::isDataIntegrityViolation,
e -> translateException(e, indexDefinition));

View File

@@ -67,7 +67,8 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
/**
* Creates a new {@link BasicMongoPersistentProperty}.
*
* @param property
* @param field
* @param propertyDescriptor
* @param owner
* @param simpleTypeHolder
* @param fieldNamingStrategy
@@ -143,32 +144,6 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
return fieldName;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#getFieldType()
*/
@Override
public Class<?> getFieldType() {
if (!isIdProperty()) {
return getType();
}
MongoId idAnnotation = findAnnotation(MongoId.class);
if (idAnnotation == null) {
return FieldType.OBJECT_ID.getJavaClass();
}
FieldType fieldType = idAnnotation.targetType();
if (fieldType == FieldType.IMPLICIT) {
return getType();
}
return fieldType.getJavaClass();
}
/**
* @return true if {@link org.springframework.data.mongodb.core.mapping.Field} having non blank
* {@link org.springframework.data.mongodb.core.mapping.Field#value()} present.

View File

@@ -33,7 +33,6 @@ public class CachingMongoPersistentProperty extends BasicMongoPersistentProperty
private @Nullable boolean dbRefResolved;
private @Nullable DBRef dbref;
private @Nullable String fieldName;
private @Nullable Class<?> fieldType;
private @Nullable Boolean usePropertyAccess;
private @Nullable Boolean isTransient;
@@ -90,20 +89,6 @@ public class CachingMongoPersistentProperty extends BasicMongoPersistentProperty
return this.fieldName;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.BasicMongoPersistentProperty#getFieldType()
*/
@Override
public Class<?> getFieldType() {
if (this.fieldType == null) {
this.fieldType = super.getFieldType();
}
return this.fieldType;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.model.AnnotationBasedPersistentProperty#usePropertyAccess()

View File

@@ -1,65 +0,0 @@
/*
* Copyright 2018-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
import org.bson.types.ObjectId;
/**
* Enumeration of field value types that can be used to represent a {@link org.bson.Document} field value. This
* enumeration contains a subset of {@link org.bson.BsonType} that is supported by the mapping and conversion
* components.
* <p/>
* Bson types are identified by a {@code byte} {@link #getBsonType() value}. This enumeration typically returns the
* according bson type value except for {@link #IMPLICIT} which is a marker to derive the field type from a property.
*
* @author Mark Paluch
* @since 2.2
* @see org.bson.BsonType
*/
public enum FieldType {
/**
* Implicit type that is derived from the property value.
*/
IMPLICIT(-1, Object.class), STRING(2, String.class), OBJECT_ID(7, ObjectId.class);
private final int bsonType;
private final Class<?> javaClass;
FieldType(int bsonType, Class<?> javaClass) {
this.bsonType = bsonType;
this.javaClass = javaClass;
}
/**
* Returns the BSON type identifier. Can be {@code -1} if {@link FieldType} maps to a synthetic Bson type.
*
* @return the BSON type identifier. Can be {@code -1} if {@link FieldType} maps to a synthetic Bson type.
*/
public int getBsonType() {
return bsonType;
}
/**
* Returns the Java class used to represent the type.
*
* @return the Java class used to represent the type.
*/
public Class<?> getJavaClass() {
return javaClass;
}
}
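A quick illustration of the removed enum's accessors:
FieldType.OBJECT_ID.getJavaClass(); // org.bson.types.ObjectId
FieldType.STRING.getBsonType();     // 2
FieldType.IMPLICIT.getBsonType();   // -1, the synthetic marker type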

View File

@@ -1,61 +0,0 @@
/*
* Copyright 2018-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import org.springframework.core.annotation.AliasFor;
import org.springframework.data.annotation.Id;
/**
* {@link MongoId} represents a MongoDB specific {@link Id} annotation that allows customizing {@literal id} conversion.
* Id properties use {@link org.springframework.data.mongodb.core.mapping.FieldType#IMPLICIT} as the default
* {@literal id's} target type. This means that the actual property value is used. No conversion attempts to any other
* type are made. <br />
* In contrast to {@link Id &#64;Id}, {@link String} {@literal id's} are stored as such even when the actual value
* represents a valid {@link org.bson.types.ObjectId#isValid(String) ObjectId hex String}. To trigger {@link String} to
* {@link org.bson.types.ObjectId} conversion use {@link MongoId#targetType() &#64;MongoId(FieldType.OBJECT_ID)}.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.2
*/
@Id
@Retention(RetentionPolicy.RUNTIME)
@Target({ ElementType.FIELD, ElementType.METHOD, ElementType.ANNOTATION_TYPE })
public @interface MongoId {
/**
* @return the preferred id type.
* @see #targetType()
*/
@AliasFor("targetType")
FieldType value() default FieldType.IMPLICIT;
/**
* Get the preferred {@literal _id} type to be used. Defaults to {@link FieldType#IMPLICIT} which uses the property's
* type. If defined different, the given value is attempted to be converted into the desired target type via
* {@link org.springframework.data.mongodb.core.convert.MongoConverter#convertId(Object, Class)}.
*
* @return the preferred {@literal id} type. {@link FieldType#IMPLICIT} by default.
*/
@AliasFor("value")
FieldType targetType() default FieldType.IMPLICIT;
}
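A hypothetical domain type using the removed annotation, per the javadoc above:
@Document
class Person {
    @MongoId(FieldType.OBJECT_ID) String id; // String values holding a valid hex id are converted to ObjectId on write
}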

View File

@@ -43,7 +43,6 @@ public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersis
private FieldNamingStrategy fieldNamingStrategy = DEFAULT_NAMING_STRATEGY;
private @Nullable ApplicationContext context;
private boolean autoIndexCreation = true;
/**
* Creates a new {@link MongoMappingContext}.
@@ -102,30 +101,4 @@ public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersis
this.context = applicationContext;
}
/**
* Returns whether auto-index creation is enabled or disabled. <br />
* <strong>NOTE:</strong>Index creation should happen at a well-defined time that is ideally controlled by the
* application itself.
*
* @return {@literal true} when auto-index creation is enabled; {@literal false} otherwise.
* @since 2.2
* @see org.springframework.data.mongodb.core.index.Indexed
*/
public boolean isAutoIndexCreation() {
return autoIndexCreation;
}
/**
* Enables/disables auto-index creation. <br />
* <strong>NOTE:</strong>Index creation should happen at a well-defined time that is ideally controlled by the
* application itself.
*
* @param autoCreateIndexes set to {@literal false} to disable auto-index creation.
* @since 2.2
* @see org.springframework.data.mongodb.core.index.Indexed
*/
public void setAutoIndexCreation(boolean autoCreateIndexes) {
this.autoIndexCreation = autoCreateIndexes;
}
}
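Usage of the removed switch, as the note above recommends (a minimal sketch):
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setAutoIndexCreation(false); // 2.2 API: opt out and create indexes at a well-defined time instead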

View File

@@ -38,15 +38,6 @@ public interface MongoPersistentProperty extends PersistentProperty<MongoPersist
*/
String getFieldName();
/**
* Returns the {@link Class Java FieldType} of the field a property is persisted to.
*
* @return
* @since 2.2
* @see FieldType
*/
Class<?> getFieldType();
/**
* Returns the order of the field if defined. Will return -1 if undefined.
*

View File

@@ -115,7 +115,8 @@ class ChangeStreamTask extends CursorReadingTask<ChangeStreamDocument<Document>,
.orElseGet(() -> ClassUtils.isAssignable(Document.class, targetType) ? FullDocument.DEFAULT
: FullDocument.UPDATE_LOOKUP);
startAt = changeStreamOptions.getResumeBsonTimestamp().orElse(null);
startAt = changeStreamOptions.getResumeTimestamp().map(it -> new BsonTimestamp((int) it.getEpochSecond(), 0))
.orElse(null);
}
MongoDatabase db = StringUtils.hasText(options.getDatabaseName())
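The 2.1 fallback above derives the operation-time resume point from an Instant; roughly:
Instant resumeAt = Instant.now().minusSeconds(60); // illustrative resume point
BsonTimestamp startAt = new BsonTimestamp((int) resumeAt.getEpochSecond(), 0); // increment 0, as in the hunk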

View File

@@ -26,10 +26,10 @@ import java.util.Map.Entry;
import java.util.regex.Pattern;
import java.util.stream.Collectors;
import org.bson.BSON;
import org.bson.BsonRegularExpression;
import org.bson.Document;
import org.bson.types.Binary;
import org.springframework.data.domain.Example;
import org.springframework.data.geo.Circle;
import org.springframework.data.geo.Point;
@@ -67,20 +67,6 @@ public class Criteria implements CriteriaDefinition {
*/
private static final Object NOT_SET = new Object();
private static final int[] FLAG_LOOKUP = new int[Character.MAX_VALUE];
static {
FLAG_LOOKUP['g'] = 256;
FLAG_LOOKUP['i'] = Pattern.CASE_INSENSITIVE;
FLAG_LOOKUP['m'] = Pattern.MULTILINE;
FLAG_LOOKUP['s'] = Pattern.DOTALL;
FLAG_LOOKUP['c'] = Pattern.CANON_EQ;
FLAG_LOOKUP['x'] = Pattern.COMMENTS;
FLAG_LOOKUP['d'] = Pattern.UNIX_LINES;
FLAG_LOOKUP['t'] = Pattern.LITERAL;
FLAG_LOOKUP['u'] = Pattern.UNICODE_CASE;
}
private @Nullable String key;
private List<Criteria> criteriaChain;
private LinkedHashMap<String, Object> criteria = new LinkedHashMap<String, Object>();
@@ -464,7 +450,7 @@ public class Criteria implements CriteriaDefinition {
Assert.notNull(regex, "Regex string must not be null!");
return Pattern.compile(regex, regexFlags(options));
return Pattern.compile(regex, options == null ? 0 : BSON.regexFlags(options));
}
/**
@@ -919,47 +905,6 @@ public class Criteria implements CriteriaDefinition {
|| (value instanceof GeoCommand && ((GeoCommand) value).getShape() instanceof GeoJson);
}
/**
* Lookup the MongoDB specific flags for a given regex option string.
*
* @param s the Regex option/flag to look up. Can be {@literal null}.
* @return zero if given {@link String} is {@literal null} or empty.
* @since 2.2
*/
private static int regexFlags(@Nullable String s) {
int flags = 0;
if (s == null) {
return flags;
}
for (final char f : s.toLowerCase().toCharArray()) {
flags |= regexFlag(f);
}
return flags;
}
/**
* Lookup the MongoDB specific flags for a given character.
*
* @param c the Regex option/flag to look up.
* @return the corresponding {@link Pattern} flag value.
* @throws IllegalArgumentException for unknown flags
* @since 2.2
*/
private static int regexFlag(char c) {
int flag = FLAG_LOOKUP[c];
if (flag == 0) {
throw new IllegalArgumentException(String.format("Unrecognized flag [%c]", c));
}
return flag;
}
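
A short sketch of how the lookup plays out for a regex criteria; the option String "im" resolves through FLAG_LOOKUP to the combined Pattern flags:

// "im" -> Pattern.CASE_INSENSITIVE | Pattern.MULTILINE via FLAG_LOOKUP.
Criteria criteria = Criteria.where("name").regex("^spring", "im");
// An unknown option such as "z" raises IllegalArgumentException in regexFlag(char).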
/**
* MongoDB specific <a href="https://docs.mongodb.com/manual/reference/operator/query-bitwise/">bitwise query
* operators</a> like {@code $bitsAllClear, $bitsAllSet,...} for usage with {@link Criteria#bits()} and {@link Query}.
@@ -1136,7 +1081,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Default implementation of {@link BitwiseCriteriaOperators}.
*
* @author Christoph Strobl
* @currentRead Beyond the Shadows - Brent Weeks
*/

View File

@@ -1,159 +0,0 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.query;
import java.math.BigDecimal;
import java.math.RoundingMode;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Metric;
import org.springframework.data.geo.Metrics;
/**
* {@link Metric} and {@link Distance} conversions using the metric system.
*
* @author Mark Paluch
* @since 2.2
*/
class MetricConversion {
private static final BigDecimal METERS_MULTIPLIER = new BigDecimal(Metrics.KILOMETERS.getMultiplier())
.multiply(new BigDecimal(1000));
// to achieve a calculation that is accurate to 0.3 meters
private static final int PRECISION = 8;
/**
* Return meters to {@code metric} multiplier.
*
* @param metric the target metric.
* @return the multiplier to convert meters into the given {@code metric}.
*/
protected static double getMetersToMetricMultiplier(Metric metric) {
ConversionMultiplier conversionMultiplier = ConversionMultiplier.builder().from(METERS_MULTIPLIER).to(metric)
.build();
return conversionMultiplier.multiplier().doubleValue();
}
/**
* Return {@code distance} in meters.
*
* @param distance the distance to convert.
* @return the given {@code distance} in meters.
*/
protected static double getDistanceInMeters(Distance distance) {
return new BigDecimal(distance.getValue()).multiply(getMetricToMetersMultiplier(distance.getMetric()))
.doubleValue();
}
/**
* Return {@code metric} to meters multiplier.
*
* @param metric the source metric.
* @return the multiplier to convert the given {@code metric} into meters.
*/
private static BigDecimal getMetricToMetersMultiplier(Metric metric) {
ConversionMultiplier conversionMultiplier = ConversionMultiplier.builder().from(metric).to(METERS_MULTIPLIER)
.build();
return conversionMultiplier.multiplier();
}
/**
* Provides a multiplier to convert between various metrics. Metrics must share the same base scale and provide a
* multiplier to convert between the base scale and its own metric.
*
* @author Mark Paluch
*/
private static class ConversionMultiplier {
private final BigDecimal source;
private final BigDecimal target;
ConversionMultiplier(Number source, Number target) {
if (source instanceof BigDecimal) {
this.source = (BigDecimal) source;
} else {
this.source = new BigDecimal(source.doubleValue());
}
if (target instanceof BigDecimal) {
this.target = (BigDecimal) target;
} else {
this.target = new BigDecimal(target.doubleValue());
}
}
/**
* Returns the multiplier to convert a number from the {@code source} metric to the {@code target} metric.
*
* @return the conversion multiplier.
*/
BigDecimal multiplier() {
return target.divide(source, PRECISION, RoundingMode.HALF_UP);
}
/**
* Creates a new {@link ConversionMultiplierBuilder}.
*
* @return a new {@link ConversionMultiplierBuilder}.
*/
static ConversionMultiplierBuilder builder() {
return new ConversionMultiplierBuilder();
}
}
/**
* Builder for {@link ConversionMultiplier}.
*
* @author Mark Paluch
*/
private static class ConversionMultiplierBuilder {
private Number from;
private Number to;
ConversionMultiplierBuilder() {}
ConversionMultiplierBuilder from(Number from) {
this.from = from;
return this;
}
ConversionMultiplierBuilder from(Metric from) {
this.from = from.getMultiplier();
return this;
}
ConversionMultiplierBuilder to(Number to) {
this.to = to;
return this;
}
ConversionMultiplierBuilder to(Metric to) {
this.to = to.getMultiplier();
return this;
}
ConversionMultiplier build() {
return new ConversionMultiplier(this.from, this.to);
}
}
}
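
A sketch of the conversions this package-private helper provides, using the constants defined above:

// Meters -> Kilometers multiplier: 6378.137 / 6378137 = 0.001.
double multiplier = MetricConversion.getMetersToMetricMultiplier(Metrics.KILOMETERS);
// 0.4 Kilometers expressed in meters: 400.0.
double meters = MetricConversion.getDistanceInMeters(new Distance(0.4, Metrics.KILOMETERS));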

View File

@@ -24,147 +24,12 @@ import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Metric;
import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.geo.GeoJsonPoint;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* Builder class to build near-queries. <br />
* The MongoDB {@code $geoNear} operator allows usage of a {@literal GeoJSON Point} or a legacy coordinate pair. Though
* syntactically different, there's no difference between {@code near: [-73.99171, 40.738868]} and {@code near: { type:
* "Point", coordinates: [-73.99171, 40.738868] } } for the MongoDB server.<br />
* <br />
* Please note that there is a huge difference in the distance calculation. The legacy format (for near) operates
* upon {@literal Radians} on an Earth-like sphere, whereas the {@literal GeoJSON} format uses {@literal Meters}. The
* actual type within the document is of no concern at this point.<br />
* To avoid a serious headache make sure to set the {@link Metric} to the desired unit of measure, which ensures the
* distance is calculated correctly.<br />
* <p />
* In other words: <br />
* Assume you've got 5 Documents like the ones below: <br />
*
* <pre>
* <code>
* {
* "_id" : ObjectId("5c10f3735d38908db52796a5"),
* "name" : "Penn Station",
* "location" : { "type" : "Point", "coordinates" : [ -73.99408, 40.75057 ] }
* }
* {
* "_id" : ObjectId("5c10f3735d38908db52796a6"),
* "name" : "10gen Office",
* "location" : { "type" : "Point", "coordinates" : [ -73.99171, 40.738868 ] }
* }
* {
* "_id" : ObjectId("5c10f3735d38908db52796a9"),
* "name" : "City Bakery ",
* "location" : { "type" : "Point", "coordinates" : [ -73.992491, 40.738673 ] }
* }
* {
* "_id" : ObjectId("5c10f3735d38908db52796aa"),
* "name" : "Splash Bar",
* "location" : { "type" : "Point", "coordinates" : [ -73.992491, 40.738673 ] }
* }
* {
* "_id" : ObjectId("5c10f3735d38908db52796ab"),
* "name" : "Momofuku Milk Bar",
* "location" : { "type" : "Point", "coordinates" : [ -73.985839, 40.731698 ] }
* }
* </code>
* </pre>
*
* Fetching all Documents within a 400 Meter radius from {@code [-73.99171, 40.738868] } would look like this using
* {@literal GeoJSON}:
*
* <pre>
* <code>
* {
* $geoNear: {
* maxDistance: 400,
* num: 10,
* near: { type: "Point", coordinates: [-73.99171, 40.738868] },
* spherical:true,
* key: "location",
* distanceField: "distance"
* }
* }
*
* </code>
* </pre>
*
* resulting in the following 3 Documents.
*
* <pre>
* <code>
* {
* "_id" : ObjectId("5c10f3735d38908db52796a6"),
* "name" : "10gen Office",
* "location" : { "type" : "Point", "coordinates" : [ -73.99171, 40.738868 ] }
* "distance" : 0.0 // Meters
* }
* {
* "_id" : ObjectId("5c10f3735d38908db52796a9"),
* "name" : "City Bakery ",
* "location" : { "type" : "Point", "coordinates" : [ -73.992491, 40.738673 ] }
* "distance" : 69.3582262492474 // Meters
* }
* {
* "_id" : ObjectId("5c10f3735d38908db52796aa"),
* "name" : "Splash Bar",
* "location" : { "type" : "Point", "coordinates" : [ -73.992491, 40.738673 ] }
* "distance" : 69.3582262492474 // Meters
* }
* </code>
* </pre>
*
* Using legacy coordinate pairs, one operates upon radians as discussed before. Assume we use {@link Metrics#KILOMETERS}
* when constructing the geoNear command. The {@link Metric} will make sure the distance multiplier is set correctly, so
* the command is rendered like this:
*
* <pre>
* <code>
* {
* $geoNear: {
* maxDistance: 0.0000627142377, // 400 Meters
* distanceMultiplier: 6378.137,
* num: 10,
* near: [-73.99171, 40.738868],
* spherical:true,
* key: "location",
* distanceField: "distance"
* }
* }
* </code>
* </pre>
*
* Please note the calculated distance now uses {@literal Kilometers} instead of {@literal Meters} as unit of measure,
* so we need to multiply it by 1000 to match up to {@literal Meters} as in the {@literal GeoJSON} variant. <br />
* Still, as we've been requesting the {@link Distance} in {@link Metrics#KILOMETERS}, {@link Distance#getValue()}
* reflects exactly this.
*
* <pre>
* <code>
* {
* "_id" : ObjectId("5c10f3735d38908db52796a6"),
* "name" : "10gen Office",
* "location" : { "type" : "Point", "coordinates" : [ -73.99171, 40.738868 ] }
* "distance" : 0.0 // Kilometers
* }
* {
* "_id" : ObjectId("5c10f3735d38908db52796a9"),
* "name" : "City Bakery ",
* "location" : { "type" : "Point", "coordinates" : [ -73.992491, 40.738673 ] }
* "distance" : 0.0693586286032982 // Kilometers
* }
* {
* "_id" : ObjectId("5c10f3735d38908db52796aa"),
* "name" : "Splash Bar",
* "location" : { "type" : "Point", "coordinates" : [ -73.992491, 40.738673 ] }
* "distance" : 0.0693586286032982 // Kilometers
* }
* </code>
* </pre>
* Builder class to build near-queries.
*
* @author Oliver Gierke
* @author Thomas Darimont
@@ -224,14 +89,10 @@ public final class NearQuery {
}
/**
* Creates a new {@link NearQuery} starting at the given {@link Point}. <br />
* <strong>NOTE:</strong> There is a difference in using {@link Point} versus {@link GeoJsonPoint}. {@link Point}
* values are rendered as coordinate pairs in the legacy format and operate upon radians, whereas the
* {@link GeoJsonPoint}, according to its specification, uses {@literal meters} as unit of measure. This may lead to
* different results when using a {@link Metrics#NEUTRAL neutral Metric}.
* Creates a new {@link NearQuery} starting at the given {@link Point}.
*
* @param point must not be {@literal null}.
* @return new instance of {@link NearQuery}.
* @return
*/
public static NearQuery near(Point point) {
return near(point, Metrics.NEUTRAL);
@@ -240,15 +101,11 @@ public final class NearQuery {
/**
* Creates a {@link NearQuery} starting near the given {@link Point} using the given {@link Metric} to adapt given
* values to further configuration. E.g. setting a {@link #maxDistance(double)} will be interpreted as a value of the
* initially set {@link Metric}. <br />
* <strong>NOTE:</strong> There is a difference in using {@link Point} versus {@link GeoJsonPoint}. {@link Point}
* values are rendered as coordinate pairs in the legacy format and operate upon radians, whereas the
* {@link GeoJsonPoint}, according to its specification, uses {@literal meters} as unit of measure. This may lead to
* different results when using a {@link Metrics#NEUTRAL neutral Metric}.
* initially set {@link Metric}.
*
* @param point must not be {@literal null}.
* @param metric must not be {@literal null}.
* @return new instance of {@link NearQuery}.
* @return
*/
public static NearQuery near(Point point, Metric metric) {
return new NearQuery(point, metric);
@@ -559,46 +416,25 @@ public final class NearQuery {
}
if (maxDistance != null) {
document.put("maxDistance", getDistanceValueInRadiantsOrMeters(maxDistance));
document.put("maxDistance", maxDistance.getNormalizedValue());
}
if (minDistance != null) {
document.put("minDistance", getDistanceValueInRadiantsOrMeters(minDistance));
document.put("minDistance", minDistance.getNormalizedValue());
}
if (metric != null) {
document.put("distanceMultiplier", getDistanceMultiplier());
document.put("distanceMultiplier", metric.getMultiplier());
}
if (num != null) {
document.put("num", num);
}
if (usesGeoJson()) {
document.put("near", point);
} else {
document.put("near", Arrays.asList(point.getX(), point.getY()));
}
document.put("near", Arrays.asList(point.getX(), point.getY()));
document.put("spherical", spherical ? spherical : usesGeoJson());
document.put("spherical", spherical);
return document;
}
private double getDistanceMultiplier() {
return usesMetricSystem() ? MetricConversion.getMetersToMetricMultiplier(metric) : metric.getMultiplier();
}
private double getDistanceValueInRadiantsOrMeters(Distance distance) {
return usesMetricSystem() ? MetricConversion.getDistanceInMeters(distance) : distance.getNormalizedValue();
}
private boolean usesMetricSystem() {
return usesGeoJson();
}
private boolean usesGeoJson() {
return point instanceof GeoJsonPoint;
}
}
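
A sketch contrasting the two flavors discussed in the class Javadoc; both queries target the same location, but the legacy variant renders the distance in radians while the GeoJSON variant renders meters:

// Legacy coordinate pair: maxDistance is rendered in radians.
NearQuery legacy = NearQuery.near(new Point(-73.99171, 40.738868), Metrics.KILOMETERS)
		.maxDistance(new Distance(0.4, Metrics.KILOMETERS));

// GeoJSON point: maxDistance is rendered in meters (400.0).
NearQuery geoJson = NearQuery.near(new GeoJsonPoint(-73.99171, 40.738868), Metrics.KILOMETERS)
		.maxDistance(new Distance(0.4, Metrics.KILOMETERS));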

View File

@@ -15,7 +15,6 @@
*/
package org.springframework.data.mongodb.core.query;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.Iterator;
@@ -23,10 +22,10 @@ import java.util.LinkedHashMap;
import java.util.Map;
import org.bson.Document;
import org.springframework.core.convert.converter.Converter;
import org.springframework.lang.Nullable;
import org.springframework.util.ObjectUtils;
import com.mongodb.util.JSON;
/**
* Utility methods for JSON serialization.
@@ -119,32 +118,19 @@ public abstract class SerializationUtils {
}
try {
String json = value instanceof Document ? ((Document) value).toJson() : serializeValue(value);
return json.replaceAll("\":", "\" :").replaceAll("\\{\"", "{ \"");
return value instanceof Document ? ((Document) value).toJson() : JSON.serialize(value);
} catch (Exception e) {
if (value instanceof Collection) {
return toString((Collection<?>) value);
} else if (value instanceof Map) {
return toString((Map<?, ?>) value);
} else if (ObjectUtils.isArray(value)) {
return toString(Arrays.asList(ObjectUtils.toObjectArray(value)));
} else {
return String.format("{ \"$java\" : %s }", value.toString());
}
}
}
public static String serializeValue(@Nullable Object value) {
if (value == null) {
return "null";
}
String documentJson = new Document("toBeEncoded", value).toJson();
return documentJson.substring(documentJson.indexOf(':') + 1, documentJson.length() - 1).trim();
}
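
A sketch of the safe serialization these methods support; values the driver cannot encode fall back to the "$java" placeholder shown in the catch block:

// Renders roughly as: { "name" : "spring" }
String json = SerializationUtils.serializeToJsonSafely(new Document("name", "spring"));
// serializeValue(null) returns "null"; serializeValue("a") returns "\"a\"".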
private static String toString(Map<?, ?> source) {
return iterableToDelimitedString(source.entrySet(), "{ ", " }",
entry -> String.format("\"%s\" : %s", entry.getKey(), serializeToJsonSafely(entry.getValue())));

View File

@@ -15,7 +15,6 @@
*/
package org.springframework.data.mongodb.core.query;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
@@ -27,7 +26,6 @@ import java.util.Objects;
import java.util.Set;
import org.bson.Document;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Direction;
@@ -60,7 +58,6 @@ public class Update implements UpdateDefinition {
private Set<String> keysToUpdate = new HashSet<>();
private Map<String, Object> modifierOps = new LinkedHashMap<>();
private Map<String, PushOperatorBuilder> pushCommandBuilders = new LinkedHashMap<>(1);
private List<ArrayFilter> arrayFilters = new ArrayList<>();
/**
* Static factory method to create an Update using the provided key
@@ -402,35 +399,6 @@ public class Update implements UpdateDefinition {
return this;
}
/**
* Filter elements in an array that match the given criteria for update. {@link CriteriaDefinition} is passed directly
* to the driver without further type or field mapping.
*
* @param criteria must not be {@literal null}.
* @return this.
* @since 2.2
*/
public Update filterArray(CriteriaDefinition criteria) {
this.arrayFilters.add(criteria::getCriteriaObject);
return this;
}
/**
* Filter elements in an array that match the given criteria for update. {@code expression} is used directly with the
* driver without further type or field mapping.
*
* @param identifier the filter criteria name referenced by the positional operator {@code $[identifier]}.
* @param expression the positional operator filter expression.
* @return this.
* @since 2.2
*/
public Update filterArray(String identifier, Object expression) {
this.arrayFilters.add(() -> new Document(identifier, expression));
return this;
}
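
A sketch of the classic arrayFilters use case (requires MongoDB 3.6 or newer): raise every grade below 60 to 60:

// { $set: { "grades.$[element]": 60 } } with arrayFilters: [ { element: { $lt: 60 } } ].
Update update = new Update().set("grades.$[element]", 60)
		.filterArray(Criteria.where("element").lt(60));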
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#isIsolated()
@@ -447,14 +415,6 @@ public class Update implements UpdateDefinition {
return new Document(modifierOps);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#getArrayFilters()
*/
public List<ArrayFilter> getArrayFilters() {
return Collections.unmodifiableList(this.arrayFilters);
}
/**
* This method is no longer called; override {@link #addMultiFieldOperation(String, String, Object)} instead.
*

View File

@@ -15,8 +15,6 @@
*/
package org.springframework.data.mongodb.core.query;
import java.util.List;
import org.bson.Document;
/**
@@ -24,7 +22,7 @@ import org.bson.Document;
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.2
* @since 2.1.4
*/
public interface UpdateDefinition {
@@ -55,37 +53,4 @@ public interface UpdateDefinition {
* @param key must not be {@literal null}.
*/
void inc(String key);
/**
* Get the specification which elements to modify in an array field. {@link ArrayFilter} are passed directly to the
* driver without further type or field mapping.
*
* @return never {@literal null}.
* @since 2.2
*/
List<ArrayFilter> getArrayFilters();
/**
* @return {@literal true} if {@link UpdateDefinition} contains {@link #getArrayFilters() array filters}.
* @since 2.2
*/
default boolean hasArrayFilters() {
return !getArrayFilters().isEmpty();
}
/**
* A filter to specify which elements to modify in an array field.
*
* @since 2.2
*/
interface ArrayFilter {
/**
* Get the {@link Document} representation of the filter to apply. The returned {@link Document} is used directly
* with the driver without further type or field mapping.
*
* @return never {@literal null}.
*/
Document asDocument();
}
}
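
Since ArrayFilter declares only asDocument(), a filter can be supplied as a lambda when no CriteriaDefinition mapping is wanted; a minimal sketch:

// arrayFilters entry: { "element": { "$gte": 100 } }.
UpdateDefinition.ArrayFilter filter = () -> new Document("element", new Document("$gte", 100));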

View File

@@ -45,149 +45,122 @@ public class MethodReferenceNode extends ExpressionNode {
Map<String, AggregationMethodReference> map = new HashMap<String, AggregationMethodReference>();
// BOOLEAN OPERATORS
map.put("and", arrayArgRef().forOperator("$and"));
map.put("or", arrayArgRef().forOperator("$or"));
map.put("not", arrayArgRef().forOperator("$not"));
map.put("and", arrayArgumentAggregationMethodReference().forOperator("$and"));
map.put("or", arrayArgumentAggregationMethodReference().forOperator("$or"));
map.put("not", arrayArgumentAggregationMethodReference().forOperator("$not"));
// SET OPERATORS
map.put("setEquals", arrayArgRef().forOperator("$setEquals"));
map.put("setIntersection", arrayArgRef().forOperator("$setIntersection"));
map.put("setUnion", arrayArgRef().forOperator("$setUnion"));
map.put("setDifference", arrayArgRef().forOperator("$setDifference"));
map.put("setEquals", arrayArgumentAggregationMethodReference().forOperator("$setEquals"));
map.put("setIntersection", arrayArgumentAggregationMethodReference().forOperator("$setIntersection"));
map.put("setUnion", arrayArgumentAggregationMethodReference().forOperator("$setUnion"));
map.put("setDifference", arrayArgumentAggregationMethodReference().forOperator("$setDifference"));
// 2nd.
map.put("setIsSubset", arrayArgRef().forOperator("$setIsSubset"));
map.put("anyElementTrue", arrayArgRef().forOperator("$anyElementTrue"));
map.put("allElementsTrue", arrayArgRef().forOperator("$allElementsTrue"));
map.put("setIsSubset", arrayArgumentAggregationMethodReference().forOperator("$setIsSubset"));
map.put("anyElementTrue", arrayArgumentAggregationMethodReference().forOperator("$anyElementTrue"));
map.put("allElementsTrue", arrayArgumentAggregationMethodReference().forOperator("$allElementsTrue"));
// COMPARISON OPERATORS
map.put("cmp", arrayArgRef().forOperator("$cmp"));
map.put("eq", arrayArgRef().forOperator("$eq"));
map.put("gt", arrayArgRef().forOperator("$gt"));
map.put("gte", arrayArgRef().forOperator("$gte"));
map.put("lt", arrayArgRef().forOperator("$lt"));
map.put("lte", arrayArgRef().forOperator("$lte"));
map.put("ne", arrayArgRef().forOperator("$ne"));
map.put("cmp", arrayArgumentAggregationMethodReference().forOperator("$cmp"));
map.put("eq", arrayArgumentAggregationMethodReference().forOperator("$eq"));
map.put("gt", arrayArgumentAggregationMethodReference().forOperator("$gt"));
map.put("gte", arrayArgumentAggregationMethodReference().forOperator("$gte"));
map.put("lt", arrayArgumentAggregationMethodReference().forOperator("$lt"));
map.put("lte", arrayArgumentAggregationMethodReference().forOperator("$lte"));
map.put("ne", arrayArgumentAggregationMethodReference().forOperator("$ne"));
// ARITHMETIC OPERATORS
map.put("abs", singleArgRef().forOperator("$abs"));
map.put("add", arrayArgRef().forOperator("$add"));
map.put("ceil", singleArgRef().forOperator("$ceil"));
map.put("divide", arrayArgRef().forOperator("$divide"));
map.put("exp", singleArgRef().forOperator("$exp"));
map.put("floor", singleArgRef().forOperator("$floor"));
map.put("ln", singleArgRef().forOperator("$ln"));
map.put("log", arrayArgRef().forOperator("$log"));
map.put("log10", singleArgRef().forOperator("$log10"));
map.put("mod", arrayArgRef().forOperator("$mod"));
map.put("multiply", arrayArgRef().forOperator("$multiply"));
map.put("pow", arrayArgRef().forOperator("$pow"));
map.put("sqrt", singleArgRef().forOperator("$sqrt"));
map.put("subtract", arrayArgRef().forOperator("$subtract"));
map.put("trunc", singleArgRef().forOperator("$trunc"));
map.put("abs", singleArgumentAggregationMethodReference().forOperator("$abs"));
map.put("add", arrayArgumentAggregationMethodReference().forOperator("$add"));
map.put("ceil", singleArgumentAggregationMethodReference().forOperator("$ceil"));
map.put("divide", arrayArgumentAggregationMethodReference().forOperator("$divide"));
map.put("exp", singleArgumentAggregationMethodReference().forOperator("$exp"));
map.put("floor", singleArgumentAggregationMethodReference().forOperator("$floor"));
map.put("ln", singleArgumentAggregationMethodReference().forOperator("$ln"));
map.put("log", arrayArgumentAggregationMethodReference().forOperator("$log"));
map.put("log10", singleArgumentAggregationMethodReference().forOperator("$log10"));
map.put("mod", arrayArgumentAggregationMethodReference().forOperator("$mod"));
map.put("multiply", arrayArgumentAggregationMethodReference().forOperator("$multiply"));
map.put("pow", arrayArgumentAggregationMethodReference().forOperator("$pow"));
map.put("sqrt", singleArgumentAggregationMethodReference().forOperator("$sqrt"));
map.put("subtract", arrayArgumentAggregationMethodReference().forOperator("$subtract"));
map.put("trunc", singleArgumentAggregationMethodReference().forOperator("$trunc"));
// STRING OPERATORS
map.put("concat", arrayArgRef().forOperator("$concat"));
map.put("strcasecmp", arrayArgRef().forOperator("$strcasecmp"));
map.put("substr", arrayArgRef().forOperator("$substr"));
map.put("toLower", singleArgRef().forOperator("$toLower"));
map.put("toUpper", singleArgRef().forOperator("$toUpper"));
map.put("indexOfBytes", arrayArgRef().forOperator("$indexOfBytes"));
map.put("indexOfCP", arrayArgRef().forOperator("$indexOfCP"));
map.put("split", arrayArgRef().forOperator("$split"));
map.put("strLenBytes", singleArgRef().forOperator("$strLenBytes"));
map.put("strLenCP", singleArgRef().forOperator("$strLenCP"));
map.put("substrCP", arrayArgRef().forOperator("$substrCP"));
map.put("trim", mapArgRef().forOperator("$trim").mappingParametersTo("input", "chars"));
map.put("ltrim", mapArgRef().forOperator("$ltrim").mappingParametersTo("input", "chars"));
map.put("rtrim", mapArgRef().forOperator("$rtrim").mappingParametersTo("input", "chars"));
map.put("concat", arrayArgumentAggregationMethodReference().forOperator("$concat"));
map.put("strcasecmp", arrayArgumentAggregationMethodReference().forOperator("$strcasecmp"));
map.put("substr", arrayArgumentAggregationMethodReference().forOperator("$substr"));
map.put("toLower", singleArgumentAggregationMethodReference().forOperator("$toLower"));
map.put("toUpper", singleArgumentAggregationMethodReference().forOperator("$toUpper"));
map.put("strcasecmp", arrayArgumentAggregationMethodReference().forOperator("$strcasecmp"));
map.put("indexOfBytes", arrayArgumentAggregationMethodReference().forOperator("$indexOfBytes"));
map.put("indexOfCP", arrayArgumentAggregationMethodReference().forOperator("$indexOfCP"));
map.put("split", arrayArgumentAggregationMethodReference().forOperator("$split"));
map.put("strLenBytes", singleArgumentAggregationMethodReference().forOperator("$strLenBytes"));
map.put("strLenCP", singleArgumentAggregationMethodReference().forOperator("$strLenCP"));
map.put("substrCP", arrayArgumentAggregationMethodReference().forOperator("$substrCP"));
// TEXT SEARCH OPERATORS
map.put("meta", singleArgRef().forOperator("$meta"));
map.put("meta", singleArgumentAggregationMethodReference().forOperator("$meta"));
// ARRAY OPERATORS
map.put("arrayElemAt", arrayArgRef().forOperator("$arrayElemAt"));
map.put("concatArrays", arrayArgRef().forOperator("$concatArrays"));
map.put("filter", mapArgRef().forOperator("$filter") //
map.put("arrayElemAt", arrayArgumentAggregationMethodReference().forOperator("$arrayElemAt"));
map.put("concatArrays", arrayArgumentAggregationMethodReference().forOperator("$concatArrays"));
map.put("filter", mapArgumentAggregationMethodReference().forOperator("$filter") //
.mappingParametersTo("input", "as", "cond"));
map.put("isArray", singleArgRef().forOperator("$isArray"));
map.put("size", singleArgRef().forOperator("$size"));
map.put("slice", arrayArgRef().forOperator("$slice"));
map.put("reverseArray", singleArgRef().forOperator("$reverseArray"));
map.put("reduce", mapArgRef().forOperator("$reduce").mappingParametersTo("input", "initialValue", "in"));
map.put("zip", mapArgRef().forOperator("$zip").mappingParametersTo("inputs", "useLongestLength", "defaults"));
map.put("in", arrayArgRef().forOperator("$in"));
map.put("arrayToObject", singleArgRef().forOperator("$arrayToObject"));
map.put("indexOfArray", arrayArgRef().forOperator("$indexOfArray"));
map.put("range", arrayArgRef().forOperator("$range"));
map.put("isArray", singleArgumentAggregationMethodReference().forOperator("$isArray"));
map.put("size", singleArgumentAggregationMethodReference().forOperator("$size"));
map.put("slice", arrayArgumentAggregationMethodReference().forOperator("$slice"));
map.put("reverseArray", singleArgumentAggregationMethodReference().forOperator("$reverseArray"));
map.put("reduce", mapArgumentAggregationMethodReference().forOperator("$reduce").mappingParametersTo("input",
"initialValue", "in"));
map.put("zip", mapArgumentAggregationMethodReference().forOperator("$zip").mappingParametersTo("inputs",
"useLongestLength", "defaults"));
map.put("in", arrayArgumentAggregationMethodReference().forOperator("$in"));
// VARIABLE OPERATORS
map.put("map", mapArgRef().forOperator("$map") //
map.put("map", mapArgumentAggregationMethodReference().forOperator("$map") //
.mappingParametersTo("input", "as", "in"));
map.put("let", mapArgRef().forOperator("$let").mappingParametersTo("vars", "in"));
map.put("let", mapArgumentAggregationMethodReference().forOperator("$let").mappingParametersTo("vars", "in"));
// LITERAL OPERATORS
map.put("literal", singleArgRef().forOperator("$literal"));
map.put("literal", singleArgumentAggregationMethodReference().forOperator("$literal"));
// DATE OPERATORS
map.put("dayOfYear", singleArgRef().forOperator("$dayOfYear"));
map.put("dayOfMonth", singleArgRef().forOperator("$dayOfMonth"));
map.put("dayOfWeek", singleArgRef().forOperator("$dayOfWeek"));
map.put("year", singleArgRef().forOperator("$year"));
map.put("month", singleArgRef().forOperator("$month"));
map.put("week", singleArgRef().forOperator("$week"));
map.put("hour", singleArgRef().forOperator("$hour"));
map.put("minute", singleArgRef().forOperator("$minute"));
map.put("second", singleArgRef().forOperator("$second"));
map.put("millisecond", singleArgRef().forOperator("$millisecond"));
map.put("dateToString", mapArgRef().forOperator("$dateToString") //
map.put("dayOfYear", singleArgumentAggregationMethodReference().forOperator("$dayOfYear"));
map.put("dayOfMonth", singleArgumentAggregationMethodReference().forOperator("$dayOfMonth"));
map.put("dayOfWeek", singleArgumentAggregationMethodReference().forOperator("$dayOfWeek"));
map.put("year", singleArgumentAggregationMethodReference().forOperator("$year"));
map.put("month", singleArgumentAggregationMethodReference().forOperator("$month"));
map.put("week", singleArgumentAggregationMethodReference().forOperator("$week"));
map.put("hour", singleArgumentAggregationMethodReference().forOperator("$hour"));
map.put("minute", singleArgumentAggregationMethodReference().forOperator("$minute"));
map.put("second", singleArgumentAggregationMethodReference().forOperator("$second"));
map.put("millisecond", singleArgumentAggregationMethodReference().forOperator("$millisecond"));
map.put("dateToString", mapArgumentAggregationMethodReference().forOperator("$dateToString") //
.mappingParametersTo("format", "date"));
map.put("dateFromString", mapArgRef().forOperator("$dateFromString") //
.mappingParametersTo("dateString", "format", "timezone", "onError", "onNull"));
map.put("dateFromParts", mapArgRef().forOperator("$dateFromParts").mappingParametersTo("year", "month", "day",
"hour", "minute", "second", "milliseconds", "timezone"));
map.put("isoDateFromParts", mapArgRef().forOperator("$dateFromParts").mappingParametersTo("isoWeekYear", "isoWeek",
"isoDayOfWeek", "hour", "minute", "second", "milliseconds", "timezone"));
map.put("dateToParts", mapArgRef().forOperator("$dateToParts") //
.mappingParametersTo("date", "timezone", "iso8601"));
map.put("isoDayOfWeek", singleArgRef().forOperator("$isoDayOfWeek"));
map.put("isoWeek", singleArgRef().forOperator("$isoWeek"));
map.put("isoWeekYear", singleArgRef().forOperator("$isoWeekYear"));
map.put("isoDayOfWeek", singleArgumentAggregationMethodReference().forOperator("$isoDayOfWeek"));
map.put("isoWeek", singleArgumentAggregationMethodReference().forOperator("$isoWeek"));
map.put("isoWeekYear", singleArgumentAggregationMethodReference().forOperator("$isoWeekYear"));
// CONDITIONAL OPERATORS
map.put("cond", mapArgRef().forOperator("$cond") //
map.put("cond", mapArgumentAggregationMethodReference().forOperator("$cond") //
.mappingParametersTo("if", "then", "else"));
map.put("ifNull", arrayArgRef().forOperator("$ifNull"));
map.put("ifNull", arrayArgumentAggregationMethodReference().forOperator("$ifNull"));
// GROUP OPERATORS
map.put("sum", arrayArgRef().forOperator("$sum"));
map.put("avg", arrayArgRef().forOperator("$avg"));
map.put("first", singleArgRef().forOperator("$first"));
map.put("last", singleArgRef().forOperator("$last"));
map.put("max", arrayArgRef().forOperator("$max"));
map.put("min", arrayArgRef().forOperator("$min"));
map.put("push", singleArgRef().forOperator("$push"));
map.put("addToSet", singleArgRef().forOperator("$addToSet"));
map.put("stdDevPop", arrayArgRef().forOperator("$stdDevPop"));
map.put("stdDevSamp", arrayArgRef().forOperator("$stdDevSamp"));
map.put("sum", arrayArgumentAggregationMethodReference().forOperator("$sum"));
map.put("avg", arrayArgumentAggregationMethodReference().forOperator("$avg"));
map.put("first", singleArgumentAggregationMethodReference().forOperator("$first"));
map.put("last", singleArgumentAggregationMethodReference().forOperator("$last"));
map.put("max", arrayArgumentAggregationMethodReference().forOperator("$max"));
map.put("min", arrayArgumentAggregationMethodReference().forOperator("$min"));
map.put("push", singleArgumentAggregationMethodReference().forOperator("$push"));
map.put("addToSet", singleArgumentAggregationMethodReference().forOperator("$addToSet"));
map.put("stdDevPop", arrayArgumentAggregationMethodReference().forOperator("$stdDevPop"));
map.put("stdDevSamp", arrayArgumentAggregationMethodReference().forOperator("$stdDevSamp"));
// TYPE OPERATORS
map.put("type", singleArgRef().forOperator("$type"));
// OBJECT OPERATORS
map.put("objectToArray", singleArgRef().forOperator("$objectToArray"));
map.put("mergeObjects", arrayArgRef().forOperator("$mergeObjects"));
// CONVERT OPERATORS
map.put("convert", mapArgRef().forOperator("$convert") //
.mappingParametersTo("input", "to", "onError", "onNull"));
map.put("toBool", singleArgRef().forOperator("$toBool"));
map.put("toDate", singleArgRef().forOperator("$toDate"));
map.put("toDecimal", singleArgRef().forOperator("$toDecimal"));
map.put("toDouble", singleArgRef().forOperator("$toDouble"));
map.put("toInt", singleArgRef().forOperator("$toInt"));
map.put("toLong", singleArgRef().forOperator("$toLong"));
map.put("toObjectId", singleArgRef().forOperator("$toObjectId"));
map.put("toString", singleArgRef().forOperator("$toString"));
map.put("type", singleArgumentAggregationMethodReference().forOperator("$type"));
FUNCTIONS = Collections.unmodifiableMap(map);
}
@@ -199,7 +172,7 @@ public class MethodReferenceNode extends ExpressionNode {
/**
* Returns the name of the method.
*
* @deprecated since 1.10. Please use {@link #getMethodReference()}.
* @Deprecated since 1.10. Please use {@link #getMethodReference()}.
*/
@Nullable
@Deprecated
@@ -282,7 +255,7 @@ public class MethodReferenceNode extends ExpressionNode {
*
* @return never {@literal null}.
*/
static AggregationMethodReference singleArgRef() {
static AggregationMethodReference singleArgumentAggregationMethodReference() {
return new AggregationMethodReference(null, ArgumentType.SINGLE, null);
}
@@ -291,7 +264,7 @@ public class MethodReferenceNode extends ExpressionNode {
*
* @return never {@literal null}.
*/
static AggregationMethodReference arrayArgRef() {
static AggregationMethodReference arrayArgumentAggregationMethodReference() {
return new AggregationMethodReference(null, ArgumentType.ARRAY, null);
}
@@ -300,7 +273,7 @@ public class MethodReferenceNode extends ExpressionNode {
*
* @return never {@literal null}.
*/
static AggregationMethodReference mapArgRef() {
static AggregationMethodReference mapArgumentAggregationMethodReference() {
return new AggregationMethodReference(null, ArgumentType.MAP, null);
}
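
A sketch of how the table above is consumed: a method reference inside a SpEL aggregation expression is rendered into the mapped operator document. The projection below assumes the usual aggregation entry points from org.springframework.data.mongodb.core.aggregation:

// "substr(name, 0, 3)" is resolved via the $substr method reference to
// { "$substr" : [ "$name", 0, 3 ] }.
ProjectionOperation projection = Aggregation.project().andExpression("substr(name, 0, 3)").as("prefix");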

View File

@@ -1,266 +0,0 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.gridfs;
import lombok.RequiredArgsConstructor;
import reactor.core.CoreSubscriber;
import reactor.core.publisher.Mono;
import reactor.core.publisher.Operators;
import reactor.util.concurrent.Queues;
import reactor.util.context.Context;
import java.nio.ByteBuffer;
import java.util.Queue;
import java.util.concurrent.atomic.AtomicIntegerFieldUpdater;
import java.util.concurrent.atomic.AtomicLongFieldUpdater;
import java.util.function.BiConsumer;
import org.reactivestreams.Publisher;
import org.reactivestreams.Subscription;
import org.springframework.core.io.buffer.DataBuffer;
import org.springframework.core.io.buffer.DataBufferUtils;
import org.springframework.core.io.buffer.DefaultDataBufferFactory;
import com.mongodb.reactivestreams.client.Success;
import com.mongodb.reactivestreams.client.gridfs.AsyncInputStream;
/**
* Adapter accepting a binary stream {@link Publisher} and emitting its content through {@link AsyncInputStream}.
* <p>
* This adapter subscribes to the binary {@link Publisher} as soon as the first chunk gets {@link #read(ByteBuffer)
* requested}. Requests are queued and binary chunks are requested from the {@link Publisher}. As soon as the
* {@link Publisher} emits items, chunks are provided to the read request which completes the number-of-written-bytes
* {@link Publisher}.
* <p>
* {@link AsyncInputStream} is supposed to work as a sequential callback API that is called until reaching EOF.
* {@link #close()} is propagated as cancellation signal to the binary {@link Publisher}.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.2
*/
@RequiredArgsConstructor
class AsyncInputStreamAdapter implements AsyncInputStream {
private static final AtomicLongFieldUpdater<AsyncInputStreamAdapter> DEMAND = AtomicLongFieldUpdater
.newUpdater(AsyncInputStreamAdapter.class, "demand");
private static final AtomicIntegerFieldUpdater<AsyncInputStreamAdapter> SUBSCRIBED = AtomicIntegerFieldUpdater
.newUpdater(AsyncInputStreamAdapter.class, "subscribed");
private static final int SUBSCRIPTION_NOT_SUBSCRIBED = 0;
private static final int SUBSCRIPTION_SUBSCRIBED = 1;
private final Publisher<? extends DataBuffer> buffers;
private final Context subscriberContext;
private final DefaultDataBufferFactory factory = new DefaultDataBufferFactory();
private volatile Subscription subscription;
private volatile boolean cancelled;
private volatile boolean complete;
private volatile Throwable error;
private final Queue<BiConsumer<DataBuffer, Integer>> readRequests = Queues.<BiConsumer<DataBuffer, Integer>> small()
.get();
// see DEMAND
volatile long demand;
// see SUBSCRIBED
volatile int subscribed = SUBSCRIPTION_NOT_SUBSCRIBED;
/*
* (non-Javadoc)
* @see com.mongodb.reactivestreams.client.gridfs.AsyncInputStream#read(java.nio.ByteBuffer)
*/
@Override
public Publisher<Integer> read(ByteBuffer dst) {
return Mono.create(sink -> {
readRequests.offer((db, bytecount) -> {
try {
if (error != null) {
sink.error(error);
return;
}
if (bytecount == -1) {
sink.success(-1);
return;
}
ByteBuffer byteBuffer = db.asByteBuffer();
int toWrite = byteBuffer.remaining();
dst.put(byteBuffer);
sink.success(toWrite);
} catch (Exception e) {
sink.error(e);
} finally {
DataBufferUtils.release(db);
}
});
request(1);
});
}
/*
* (non-Javadoc)
* @see com.mongodb.reactivestreams.client.gridfs.AsyncInputStream#skip(long)
*/
@Override
public Publisher<Long> skip(long bytesToSkip) {
throw new UnsupportedOperationException("Skip is currently not implemented");
}
/*
* (non-Javadoc)
* @see com.mongodb.reactivestreams.client.gridfs.AsyncInputStream#close()
*/
@Override
public Publisher<Success> close() {
return Mono.create(sink -> {
cancelled = true;
if (error != null) {
sink.error(error);
return;
}
sink.success(Success.SUCCESS);
});
}
protected void request(int n) {
if (complete) {
terminatePendingReads();
return;
}
Operators.addCap(DEMAND, this, n);
if (SUBSCRIBED.get(this) == SUBSCRIPTION_NOT_SUBSCRIBED) {
if (SUBSCRIBED.compareAndSet(this, SUBSCRIPTION_NOT_SUBSCRIBED, SUBSCRIPTION_SUBSCRIBED)) {
buffers.subscribe(new DataBufferCoreSubscriber());
}
} else {
Subscription subscription = this.subscription;
if (subscription != null) {
requestFromSubscription(subscription);
}
}
}
void requestFromSubscription(Subscription subscription) {
long demand = DEMAND.get(AsyncInputStreamAdapter.this);
if (cancelled) {
subscription.cancel();
}
if (demand > 0 && DEMAND.compareAndSet(AsyncInputStreamAdapter.this, demand, demand - 1)) {
subscription.request(1);
}
}
/**
* Terminates pending reads with empty buffers.
*/
void terminatePendingReads() {
BiConsumer<DataBuffer, Integer> readers;
while ((readers = readRequests.poll()) != null) {
readers.accept(factory.wrap(new byte[0]), -1);
}
}
private class DataBufferCoreSubscriber implements CoreSubscriber<DataBuffer> {
@Override
public Context currentContext() {
return AsyncInputStreamAdapter.this.subscriberContext;
}
@Override
public void onSubscribe(Subscription s) {
AsyncInputStreamAdapter.this.subscription = s;
Operators.addCap(DEMAND, AsyncInputStreamAdapter.this, -1);
s.request(1);
}
@Override
public void onNext(DataBuffer dataBuffer) {
if (cancelled || complete) {
DataBufferUtils.release(dataBuffer);
Operators.onNextDropped(dataBuffer, AsyncInputStreamAdapter.this.subscriberContext);
return;
}
BiConsumer<DataBuffer, Integer> poll = AsyncInputStreamAdapter.this.readRequests.poll();
if (poll == null) {
DataBufferUtils.release(dataBuffer);
Operators.onNextDropped(dataBuffer, AsyncInputStreamAdapter.this.subscriberContext);
subscription.cancel();
return;
}
poll.accept(dataBuffer, dataBuffer.readableByteCount());
requestFromSubscription(subscription);
}
@Override
public void onError(Throwable t) {
if (AsyncInputStreamAdapter.this.cancelled || AsyncInputStreamAdapter.this.complete) {
Operators.onErrorDropped(t, AsyncInputStreamAdapter.this.subscriberContext);
return;
}
AsyncInputStreamAdapter.this.error = t;
AsyncInputStreamAdapter.this.complete = true;
terminatePendingReads();
}
@Override
public void onComplete() {
AsyncInputStreamAdapter.this.complete = true;
terminatePendingReads();
}
}
}
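
A usage sketch, assuming dataBuffers is a Flux<DataBuffer> provided elsewhere and an empty subscriber context suffices:

AsyncInputStream stream = new AsyncInputStreamAdapter(dataBuffers, Context.empty());
// Each read completes with the number of bytes written to the buffer, -1 at EOF.
Mono.from(stream.read(ByteBuffer.allocate(8192))).subscribe(bytesRead -> { /* consume */ });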

View File

@@ -1,78 +0,0 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.gridfs;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import org.reactivestreams.Publisher;
import org.springframework.core.io.buffer.DataBuffer;
import org.springframework.core.io.buffer.DataBufferFactory;
import org.springframework.core.io.buffer.DataBufferUtils;
import com.mongodb.reactivestreams.client.gridfs.AsyncInputStream;
/**
* Utility methods to create adapters between {@link Publisher} of {@link DataBuffer} and {@link AsyncInputStream}.
*
* @author Mark Paluch
* @since 2.2
*/
class BinaryStreamAdapters {
/**
* Creates a {@link Flux} emitting {@link DataBuffer} by reading binary chunks from {@link AsyncInputStream}.
* Publisher termination (completion, error, cancellation) closes the {@link AsyncInputStream}.
* <p/>
* The resulting {@link org.reactivestreams.Publisher} filters empty binary chunks and uses {@link DataBufferFactory}
* settings to determine the chunk size.
*
* @param inputStream must not be {@literal null}.
* @param dataBufferFactory must not be {@literal null}.
* @return {@link Flux} emitting {@link DataBuffer}s.
* @see DataBufferFactory#allocateBuffer()
*/
static Flux<DataBuffer> toPublisher(AsyncInputStream inputStream, DataBufferFactory dataBufferFactory) {
return DataBufferPublisherAdapter.createBinaryStream(inputStream, dataBufferFactory) //
.filter(it -> {
if (it.readableByteCount() == 0) {
DataBufferUtils.release(it);
return false;
}
return true;
});
}
/**
* Creates a {@link Mono} emitting a {@link AsyncInputStream} to consume a {@link Publisher} emitting
* {@link DataBuffer} and exposing the binary stream through {@link AsyncInputStream}. {@link DataBuffer}s are
* released by the adapter during consumption.
* <p/>
* This method returns a {@link Mono} to retain the {@link reactor.util.context.Context subscriber context}.
*
* @param dataBuffers must not be {@literal null}.
* @return {@link Mono} emitting {@link AsyncInputStream}.
* @see DataBufferUtils#release(DataBuffer)
*/
static Mono<AsyncInputStream> toAsyncInputStream(Publisher<? extends DataBuffer> dataBuffers) {
return Mono.create(sink -> {
sink.success(new AsyncInputStreamAdapter(dataBuffers, sink.currentContext()));
});
}
}
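
A round-trip sketch, assuming dataBuffers is a Publisher<DataBuffer> provided by the caller:

Mono<AsyncInputStream> inputStream = BinaryStreamAdapters.toAsyncInputStream(dataBuffers);
Flux<DataBuffer> chunks = inputStream
		.flatMapMany(it -> BinaryStreamAdapters.toPublisher(it, new DefaultDataBufferFactory()));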

View File

@@ -1,219 +0,0 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.gridfs;
import lombok.RequiredArgsConstructor;
import reactor.core.CoreSubscriber;
import reactor.core.publisher.Flux;
import reactor.core.publisher.FluxSink;
import reactor.core.publisher.Mono;
import reactor.core.publisher.Operators;
import reactor.util.context.Context;
import java.nio.ByteBuffer;
import java.util.concurrent.atomic.AtomicIntegerFieldUpdater;
import java.util.concurrent.atomic.AtomicLongFieldUpdater;
import org.reactivestreams.Publisher;
import org.reactivestreams.Subscription;
import org.springframework.core.io.buffer.DataBuffer;
import org.springframework.core.io.buffer.DataBufferFactory;
import org.springframework.core.io.buffer.DataBufferUtils;
import com.mongodb.reactivestreams.client.gridfs.AsyncInputStream;
/**
* Utility to adapt a {@link AsyncInputStream} to a {@link Publisher} emitting {@link DataBuffer}.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.2
*/
class DataBufferPublisherAdapter {
/**
* Creates a {@link Publisher} emitting {@link DataBuffer}s by reading binary chunks from {@link AsyncInputStream}.
* Closes the {@link AsyncInputStream} once the {@link Publisher} terminates.
*
* @param inputStream must not be {@literal null}.
* @param dataBufferFactory must not be {@literal null}.
* @return the resulting {@link Publisher}.
*/
static Flux<DataBuffer> createBinaryStream(AsyncInputStream inputStream, DataBufferFactory dataBufferFactory) {
State state = new State(inputStream, dataBufferFactory);
return Flux.usingWhen(Mono.just(inputStream), it -> {
return Flux.<DataBuffer> create((sink) -> {
sink.onDispose(state::close);
sink.onCancel(state::close);
sink.onRequest(n -> {
state.request(sink, n);
});
});
}, AsyncInputStream::close, AsyncInputStream::close, AsyncInputStream::close) //
.concatMap(Flux::just, 1);
}
@RequiredArgsConstructor
static class State {
private static final AtomicLongFieldUpdater<State> DEMAND = AtomicLongFieldUpdater.newUpdater(State.class, "demand");
private static final AtomicIntegerFieldUpdater<State> STATE = AtomicIntegerFieldUpdater.newUpdater(State.class, "state");
private static final AtomicIntegerFieldUpdater<State> READ = AtomicIntegerFieldUpdater.newUpdater(State.class, "read");
private static final int STATE_OPEN = 0;
private static final int STATE_CLOSED = 1;
private static final int READ_NONE = 0;
private static final int READ_IN_PROGRESS = 1;
final AsyncInputStream inputStream;
final DataBufferFactory dataBufferFactory;
// see DEMAND
volatile long demand;
// see STATE
volatile int state = STATE_OPEN;
// see READ_IN_PROGRESS
volatile int read = READ_NONE;
void request(FluxSink<DataBuffer> sink, long n) {
Operators.addCap(DEMAND, this, n);
if (onShouldRead()) {
emitNext(sink);
}
}
boolean onShouldRead() {
return !isClosed() && getDemand() > 0 && onWantRead();
}
boolean onWantRead() {
return READ.compareAndSet(this, READ_NONE, READ_IN_PROGRESS);
}
boolean onReadDone() {
return READ.compareAndSet(this, READ_IN_PROGRESS, READ_NONE);
}
long getDemand() {
return DEMAND.get(this);
}
void close() {
STATE.compareAndSet(this, STATE_OPEN, STATE_CLOSED);
}
boolean isClosed() {
return STATE.get(this) == STATE_CLOSED;
}
/**
* Emit the next {@link DataBuffer}.
*
* @param sink
*/
void emitNext(FluxSink<DataBuffer> sink) {
DataBuffer dataBuffer = dataBufferFactory.allocateBuffer();
ByteBuffer intermediate = ByteBuffer.allocate(dataBuffer.capacity());
Mono.from(inputStream.read(intermediate)).subscribe(new BufferCoreSubscriber(sink, dataBuffer, intermediate));
}
private class BufferCoreSubscriber implements CoreSubscriber<Integer> {
private final FluxSink<DataBuffer> sink;
private final DataBuffer dataBuffer;
private final ByteBuffer intermediate;
BufferCoreSubscriber(FluxSink<DataBuffer> sink, DataBuffer dataBuffer, ByteBuffer intermediate) {
this.sink = sink;
this.dataBuffer = dataBuffer;
this.intermediate = intermediate;
}
@Override
public Context currentContext() {
return sink.currentContext();
}
@Override
public void onSubscribe(Subscription s) {
s.request(1);
}
@Override
public void onNext(Integer bytes) {
if (isClosed()) {
onReadDone();
DataBufferUtils.release(dataBuffer);
Operators.onNextDropped(dataBuffer, sink.currentContext());
return;
}
intermediate.flip();
dataBuffer.write(intermediate);
sink.next(dataBuffer);
try {
if (bytes == -1) {
sink.complete();
}
} finally {
onReadDone();
}
}
@Override
public void onError(Throwable t) {
if (isClosed()) {
Operators.onErrorDropped(t, sink.currentContext());
return;
}
onReadDone();
DataBufferUtils.release(dataBuffer);
Operators.onNextDropped(dataBuffer, sink.currentContext());
sink.error(t);
}
@Override
public void onComplete() {
if (onShouldRead()) {
emitNext(sink);
}
}
}
}
}

View File

@@ -1,104 +0,0 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.gridfs;
import java.util.Optional;
import org.bson.Document;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.client.gridfs.model.GridFSUploadOptions;
/**
* Base class offering common tasks like query mapping and {@link GridFSUploadOptions} computation to be shared across
* imperative and reactive implementations.
*
* @author Christoph Strobl
* @since 2.2
*/
class GridFsOperationsSupport {
private final QueryMapper queryMapper;
private final MongoConverter converter;
/**
* @param converter must not be {@literal null}.
*/
GridFsOperationsSupport(MongoConverter converter) {
Assert.notNull(converter, "MongoConverter must not be null!");
this.converter = converter;
this.queryMapper = new QueryMapper(converter);
}
/**
* @param query the query to pass through a {@link QueryMapper} to apply type conversion.
* @return never {@literal null}.
*/
protected Document getMappedQuery(Document query) {
return queryMapper.getMappedObject(query, Optional.empty());
}
/**
* Compute the {@link GridFSUploadOptions} to be used from the given {@literal contentType} and {@literal metadata}
* {@link Document}.
*
* @param contentType can be {@literal null}.
* @param metadata can be {@literal null}.
* @return never {@literal null}.
*/
protected GridFSUploadOptions computeUploadOptionsFor(@Nullable String contentType, @Nullable Document metadata) {
Document targetMetadata = new Document();
if (StringUtils.hasText(contentType)) {
targetMetadata.put(GridFsResource.CONTENT_TYPE_FIELD, contentType);
}
if (metadata != null) {
targetMetadata.putAll(metadata);
}
GridFSUploadOptions options = new GridFSUploadOptions();
options.metadata(targetMetadata);
return options;
}
/**
* Convert a given {@literal value} into a {@link Document}.
*
* @param value can be {@literal null}.
* @return an empty {@link Document} if the source value is {@literal null}.
*/
protected Document toDocument(@Nullable Object value) {
if (value instanceof Document) {
return (Document) value;
}
Document document = new Document();
if (value != null) {
converter.write(value, document);
}
return document;
}
}
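
A sketch of the option computation as seen from a subclass; the metadata document ends up carrying the content type under GridFsResource.CONTENT_TYPE_FIELD:

GridFSUploadOptions options = computeUploadOptionsFor("text/plain", new Document("origin", "import"));
// options.getMetadata() now contains the "origin" entry plus the content type field.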

View File

@@ -29,6 +29,7 @@ import org.bson.types.ObjectId;
import org.springframework.core.io.support.ResourcePatternResolver;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
@@ -39,6 +40,7 @@ import com.mongodb.client.gridfs.GridFSBucket;
import com.mongodb.client.gridfs.GridFSBuckets;
import com.mongodb.client.gridfs.GridFSFindIterable;
import com.mongodb.client.gridfs.model.GridFSFile;
import com.mongodb.client.gridfs.model.GridFSUploadOptions;
/**
* {@link GridFsOperations} implementation to store content into MongoDB GridFS.
@@ -52,11 +54,13 @@ import com.mongodb.client.gridfs.model.GridFSFile;
* @author Hartmut Lang
* @author Niklas Helge Hanft
*/
public class GridFsTemplate extends GridFsOperationsSupport implements GridFsOperations, ResourcePatternResolver {
public class GridFsTemplate implements GridFsOperations, ResourcePatternResolver {
private final MongoDbFactory dbFactory;
private final @Nullable String bucket;
private final MongoConverter converter;
private final QueryMapper queryMapper;
/**
* Creates a new {@link GridFsTemplate} using the given {@link MongoDbFactory} and {@link MongoConverter}.
@@ -77,12 +81,14 @@ public class GridFsTemplate extends GridFsOperationsSupport implements GridFsOpe
*/
public GridFsTemplate(MongoDbFactory dbFactory, MongoConverter converter, @Nullable String bucket) {
super(converter);
Assert.notNull(dbFactory, "MongoDbFactory must not be null!");
Assert.notNull(converter, "MongoConverter must not be null!");
this.dbFactory = dbFactory;
this.converter = converter;
this.bucket = bucket;
this.queryMapper = new QueryMapper(converter);
}
/*
@@ -131,9 +137,16 @@ public class GridFsTemplate extends GridFsOperationsSupport implements GridFsOpe
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.GridFsOperations#store(java.io.InputStream, java.lang.String, java.lang.String, java.lang.Object)
*/
public ObjectId store(InputStream content, @Nullable String filename, @Nullable String contentType,
@Nullable Object metadata) {
return store(content, filename, contentType, toDocument(metadata));
public ObjectId store(InputStream content, @Nullable String filename, @Nullable String contentType, @Nullable Object metadata) {
Document document = null;
if (metadata != null) {
document = new Document();
converter.write(metadata, document);
}
return store(content, filename, contentType, document);
}
/*
@@ -148,11 +161,25 @@ public class GridFsTemplate extends GridFsOperationsSupport implements GridFsOpe
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.GridFsOperations#store(java.io.InputStream, java.lang.String, com.mongodb.Document)
*/
public ObjectId store(InputStream content, @Nullable String filename, @Nullable String contentType,
@Nullable Document metadata) {
public ObjectId store(InputStream content, @Nullable String filename, @Nullable String contentType, @Nullable Document metadata) {
Assert.notNull(content, "InputStream must not be null!");
return getGridFs().uploadFromStream(filename, content, computeUploadOptionsFor(contentType, metadata));
GridFSUploadOptions options = new GridFSUploadOptions();
Document mData = new Document();
if (StringUtils.hasText(contentType)) {
mData.put(GridFsResource.CONTENT_TYPE_FIELD, contentType);
}
if (metadata != null) {
mData.putAll(metadata);
}
options.metadata(mData);
return getGridFs().uploadFromStream(filename, content, options);
}
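A sketch of the upload options assembled above, assuming GridFsResource.CONTENT_TYPE_FIELD resolves to "_contentType" (the key used elsewhere in this diff):

GridFSUploadOptions options = new GridFSUploadOptions();
Document mData = new Document();
mData.put("_contentType", "application/pdf"); // GridFsResource.CONTENT_TYPE_FIELD
mData.put("owner", "alice");                  // caller-supplied metadata, merged via putAll(...)
options.metadata(mData);
// resulting file metadata: { "_contentType" : "application/pdf", "owner" : "alice" }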
/*
@@ -183,8 +210,8 @@ public class GridFsTemplate extends GridFsOperationsSupport implements GridFsOpe
*/
public void delete(Query query) {
for (GridFSFile gridFSFile : find(query)) {
getGridFs().delete(((BsonObjectId) gridFSFile.getId()).getValue());
for (GridFSFile x : find(query)) {
getGridFs().delete(((BsonObjectId) x.getId()).getValue());
}
}
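Usage sketch for delete(Query), using the Query/GridFsCriteria static imports seen later in this diff; the template instance is assumed:

// deletes every GridFS file whose filename is "report.pdf"
gridFsTemplate.delete(query(whereFilename().is("report.pdf")));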
@@ -219,9 +246,9 @@ public class GridFsTemplate extends GridFsOperationsSupport implements GridFsOpe
}
/*
* (non-Javadoc)
* @see org.springframework.core.io.support.ResourcePatternResolver#getResources(java.lang.String)
*/
public GridFsResource[] getResources(String locationPattern) {
if (!StringUtils.hasText(locationPattern)) {
@@ -245,6 +272,10 @@ public class GridFsTemplate extends GridFsOperationsSupport implements GridFsOpe
return new GridFsResource[] { getResource(locationPattern) };
}
private Document getMappedQuery(Document query) {
return queryMapper.getMappedObject(query, Optional.empty());
}
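getMappedQuery(...) funnels the raw query Document through the QueryMapper so that values such as String ids can be converted to ObjectIds. An illustrative sketch (the id value is made up):

Document raw = new Document("_id", "5c5a3d2b1f3a4c0001a1b2c3");
Document mapped = queryMapper.getMappedObject(raw, Optional.empty());
// depending on the mapping metadata, "_id" may now hold an ObjectId instead of a String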
private GridFSBucket getGridFs() {
MongoDatabase db = dbFactory.getDb();

View File

@@ -1,233 +0,0 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.gridfs;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import org.bson.Document;
import org.bson.types.ObjectId;
import org.reactivestreams.Publisher;
import org.springframework.core.io.buffer.DataBuffer;
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.lang.Nullable;
import com.mongodb.client.gridfs.model.GridFSFile;
import com.mongodb.reactivestreams.client.gridfs.AsyncInputStream;
/**
* Collection of operations to store and read files from MongoDB GridFS using reactive infrastructure.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.2
*/
public interface ReactiveGridFsOperations {
/**
* Stores the given content into a file with the given name.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @return a {@link Mono} emitting the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just
* created.
*/
default Mono<ObjectId> store(Publisher<DataBuffer> content, String filename) {
return store(content, filename, (Object) null);
}
/**
* Stores the given content into a file applying the given metadata.
*
* @param content must not be {@literal null}.
* @param metadata can be {@literal null}.
* @return a {@link Mono} emitting the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just
* created.
*/
default Mono<ObjectId> store(Publisher<DataBuffer> content, @Nullable Object metadata) {
return store(content, null, metadata);
}
/**
* Stores the given content into a file applying the given metadata.
*
* @param content must not be {@literal null}.
* @param metadata can be {@literal null}.
* @return a {@link Mono} emitting the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just
* created.
*/
default Mono<ObjectId> store(Publisher<DataBuffer> content, @Nullable Document metadata) {
return store(content, null, metadata);
}
/**
* Stores the given content into a file with the given name and content type.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param contentType can be {@literal null}.
* @return a {@link Mono} emitting the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just
* created.
*/
default Mono<ObjectId> store(Publisher<DataBuffer> content, @Nullable String filename, @Nullable String contentType) {
return store(content, filename, contentType, (Object) null);
}
/**
* Stores the given content into a file with the given name using the given metadata. The metadata object will be
* marshalled before writing.
*
* @param content must not be {@literal null}.
* @param filename can be {@literal null} or empty.
* @param metadata can be {@literal null}.
* @return a {@link Mono} emitting the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just
* created.
*/
default Mono<ObjectId> store(Publisher<DataBuffer> content, @Nullable String filename, @Nullable Object metadata) {
return store(content, filename, null, metadata);
}
/**
* Stores the given content into a file with the given name and content type using the given metadata. The metadata
* object will be marshalled before writing.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param contentType can be {@literal null}.
* @param metadata can be {@literal null}
* @return a {@link Mono} emitting the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just
* created.
*/
Mono<ObjectId> store(AsyncInputStream content, @Nullable String filename, @Nullable String contentType,
@Nullable Object metadata);
/**
* Stores the given content into a file with the given name and content type using the given metadata. The metadata
* object will be marshalled before writing.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param contentType can be {@literal null}.
* @param metadata can be {@literal null}
* @return a {@link Mono} emitting the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just
* created.
*/
Mono<ObjectId> store(Publisher<DataBuffer> content, @Nullable String filename, @Nullable String contentType,
@Nullable Object metadata);
/**
* Stores the given content into a file with the given name using the given metadata.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param metadata can be {@literal null}.
* @return a {@link Mono} emitting the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just
* created.
*/
default Mono<ObjectId> store(Publisher<DataBuffer> content, @Nullable String filename, @Nullable Document metadata) {
return store(content, filename, null, metadata);
}
/**
* Stores the given content into a file with the given name and content type using the given metadata.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param contentType can be {@literal null}.
* @param metadata can be {@literal null}.
* @return a {@link Mono} emitting the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just
* created.
*/
Mono<ObjectId> store(AsyncInputStream content, @Nullable String filename, @Nullable String contentType,
@Nullable Document metadata);
/**
* Stores the given content into a file with the given name and content type using the given metadata.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param contentType can be {@literal null}.
* @param metadata can be {@literal null}.
* @return a {@link Mono} emitting the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just
* created.
*/
Mono<ObjectId> store(Publisher<DataBuffer> content, @Nullable String filename, @Nullable String contentType,
@Nullable Document metadata);
/**
* Returns a {@link Flux} emitting all files matching the given query. <br />
* <strong>Note:</strong> Currently, {@link Sort} criteria defined on the {@link Query} will not be honored, as MongoDB
* does not support ordering for GridFS file access.
*
* @see <a href="https://jira.mongodb.org/browse/JAVA-431">MongoDB Jira: JAVA-431</a>
* @param query must not be {@literal null}.
* @return {@link Flux#empty()} if no match found.
*/
Flux<GridFSFile> find(Query query);
/**
* Returns a {@link Mono} emitting a single {@link com.mongodb.client.gridfs.model.GridFSFile} matching the given
* query or {@link Mono#empty()} in case no file matches. <br />
* <strong>Note:</strong> If more than one file matches the given query, the resulting {@link Mono} emits an error. If
* you want to obtain the first file found, use {@link #findFirst(Query)}.
*
* @param query must not be {@literal null}.
* @return {@link Mono#empty()} if no match found.
*/
Mono<GridFSFile> findOne(Query query);
/**
* Returns a {@link Mono} emitting the first {@link com.mongodb.client.gridfs.model.GridFSFile} matching the given
* query or {@link Mono#empty()} in case no file matches.
*
* @param query must not be {@literal null}.
* @return {@link Mono#empty()} if no match found.
*/
Mono<GridFSFile> findFirst(Query query);
/**
* Deletes all files matching the given {@link Query}.
*
* @param query must not be {@literal null}.
* @return a {@link Mono} signalling operation completion.
*/
Mono<Void> delete(Query query);
/**
* Returns a {@link Mono} emitting the {@link ReactiveGridFsResource} with the given file name.
*
* @param filename must not be {@literal null}.
* @return {@link Mono#empty()} if no match found.
*/
Mono<ReactiveGridFsResource> getResource(String filename);
/**
* Returns a {@link Mono} emitting the {@link ReactiveGridFsResource} for a {@link GridFSFile}.
*
* @param file must not be {@literal null}.
* @return {@link Mono#empty()} if no match found.
*/
Mono<ReactiveGridFsResource> getResource(GridFSFile file);
/**
* Returns a {@link Flux} emitting all {@link ReactiveGridFsResource}s matching the given file name pattern.
*
* @param filenamePattern must not be {@literal null}.
* @return {@link Flux#empty()} if no match found.
*/
Flux<ReactiveGridFsResource> getResources(String filenamePattern);
}
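A usage sketch of the interface, assuming an implementation named operations, the Query/GridFsCriteria static imports, and the usual DataBuffer/StandardCharsets imports; all names are illustrative:

DataBufferFactory factory = new DefaultDataBufferFactory();
Flux<DataBuffer> content = Flux.just(factory.wrap("hello".getBytes(StandardCharsets.UTF_8)));

Mono<ObjectId> id = operations.store(content, "hello.txt", "text/plain");
Mono<GridFSFile> file = operations.findOne(query(whereFilename().is("hello.txt")));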

View File

@@ -1,180 +0,0 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.gridfs;
import reactor.core.publisher.Flux;
import java.io.ByteArrayInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStream;
import org.reactivestreams.Publisher;
import org.springframework.core.io.AbstractResource;
import org.springframework.core.io.Resource;
import org.springframework.core.io.buffer.DataBuffer;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.client.gridfs.model.GridFSFile;
/**
* Reactive {@link GridFSFile} based {@link Resource} implementation.
*
* @author Mark Paluch
* @since 2.2
*/
public class ReactiveGridFsResource extends AbstractResource {
static final String CONTENT_TYPE_FIELD = "_contentType";
private static final ByteArrayInputStream EMPTY_INPUT_STREAM = new ByteArrayInputStream(new byte[0]);
private final @Nullable GridFSFile file;
private final String filename;
private final Flux<DataBuffer> content;
/**
* Creates a new, absent {@link ReactiveGridFsResource}.
*
* @param filename filename of the absent resource.
* @param content the content {@link Publisher}; {@link Flux#empty()} for an absent resource.
* @since 2.1
*/
private ReactiveGridFsResource(String filename, Publisher<DataBuffer> content) {
this.file = null;
this.filename = filename;
this.content = Flux.from(content);
}
/**
* Creates a new {@link ReactiveGridFsResource} from the given {@link GridFSFile}.
*
* @param file must not be {@literal null}.
* @param content the content {@link Publisher} of the file.
*/
public ReactiveGridFsResource(GridFSFile file, Publisher<DataBuffer> content) {
this.file = file;
this.filename = file.getFilename();
this.content = Flux.from(content);
}
/**
* Obtain an absent {@link ReactiveGridFsResource}.
*
* @param filename filename of the absent resource, must not be {@literal null}.
* @return never {@literal null}.
* @since 2.1
*/
public static ReactiveGridFsResource absent(String filename) {
Assert.notNull(filename, "Filename must not be null");
return new ReactiveGridFsResource(filename, Flux.empty());
}
/*
* (non-Javadoc)
* @see org.springframework.core.io.InputStreamResource#getInputStream()
*/
@Override
public InputStream getInputStream() throws IllegalStateException {
throw new UnsupportedOperationException();
}
/*
* (non-Javadoc)
* @see org.springframework.core.io.AbstractResource#contentLength()
*/
@Override
public long contentLength() throws IOException {
verifyExists();
return file.getLength();
}
/*
* (non-Javadoc)
* @see org.springframework.core.io.AbstractResource#getFilename()
*/
@Override
public String getFilename() throws IllegalStateException {
return filename;
}
/*
* (non-Javadoc)
* @see org.springframework.core.io.AbstractResource#exists()
*/
@Override
public boolean exists() {
return file != null;
}
/*
* (non-Javadoc)
* @see org.springframework.core.io.AbstractResource#lastModified()
*/
@Override
public long lastModified() throws IOException {
verifyExists();
return file.getUploadDate().getTime();
}
/*
* (non-Javadoc)
* @see org.springframework.core.io.AbstractResource#getDescription()
*/
@Override
public String getDescription() {
return String.format("GridFs resource [%s]", this.getFilename());
}
/**
* Returns the {@link Resource}'s id.
*
* @return never {@literal null}.
* @throws IllegalStateException if the file does not {@link #exists()}.
*/
public Object getId() {
Assert.state(exists(), () -> String.format("%s does not exist.", getDescription()));
return file.getId();
}
/**
* Retrieve the download stream.
*
* @return the content as {@link Flux} of {@link DataBuffer}, or a {@link Flux} emitting a
*         {@link FileNotFoundException} if the resource does not {@link #exists()}.
*/
public Flux<DataBuffer> getDownloadStream() {
if (!exists()) {
return Flux.error(new FileNotFoundException(String.format("%s does not exist.", getDescription())));
}
return content;
}
private void verifyExists() throws FileNotFoundException {
if (!exists()) {
throw new FileNotFoundException(String.format("%s does not exist.", getDescription()));
}
}
}
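The absent-resource contract in action; a minimal sketch:

ReactiveGridFsResource resource = ReactiveGridFsResource.absent("missing.txt");
resource.exists();                       // false
resource.getDownloadStream()             // Flux.error(FileNotFoundException)
        .onErrorResume(e -> Flux.empty())
        .subscribe();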

View File

@@ -1,275 +0,0 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.gridfs;
import static org.springframework.data.mongodb.core.query.Query.*;
import static org.springframework.data.mongodb.gridfs.GridFsCriteria.*;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import org.bson.Document;
import org.bson.types.ObjectId;
import org.reactivestreams.Publisher;
import org.springframework.core.io.buffer.DataBuffer;
import org.springframework.core.io.buffer.DataBufferFactory;
import org.springframework.core.io.buffer.DefaultDataBufferFactory;
import org.springframework.dao.IncorrectResultSizeDataAccessException;
import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.client.gridfs.model.GridFSFile;
import com.mongodb.reactivestreams.client.MongoDatabase;
import com.mongodb.reactivestreams.client.gridfs.AsyncInputStream;
import com.mongodb.reactivestreams.client.gridfs.GridFSBucket;
import com.mongodb.reactivestreams.client.gridfs.GridFSBuckets;
import com.mongodb.reactivestreams.client.gridfs.GridFSDownloadStream;
import com.mongodb.reactivestreams.client.gridfs.GridFSFindPublisher;
/**
* {@link ReactiveGridFsOperations} implementation to store content into MongoDB GridFS. By default uses
* {@link DefaultDataBufferFactory} to create {@link DataBuffer buffers}.
*
* @author Mark Paluch
* @since 2.2
*/
public class ReactiveGridFsTemplate extends GridFsOperationsSupport implements ReactiveGridFsOperations {
private final ReactiveMongoDatabaseFactory dbFactory;
private final DataBufferFactory dataBufferFactory;
private final @Nullable String bucket;
/**
* Creates a new {@link ReactiveGridFsTemplate} using the given {@link ReactiveMongoDatabaseFactory} and
* {@link MongoConverter}.
*
* @param dbFactory must not be {@literal null}.
* @param converter must not be {@literal null}.
*/
public ReactiveGridFsTemplate(ReactiveMongoDatabaseFactory dbFactory, MongoConverter converter) {
this(dbFactory, converter, null);
}
/**
* Creates a new {@link ReactiveGridFsTemplate} using the given {@link ReactiveMongoDatabaseFactory} and
* {@link MongoConverter}.
*
* @param dbFactory must not be {@literal null}.
* @param converter must not be {@literal null}.
* @param bucket can be {@literal null}.
*/
public ReactiveGridFsTemplate(ReactiveMongoDatabaseFactory dbFactory, MongoConverter converter,
@Nullable String bucket) {
this(new DefaultDataBufferFactory(), dbFactory, converter, bucket);
}
/**
* Creates a new {@link ReactiveGridFsTemplate} using the given {@link DataBufferFactory},
* {@link ReactiveMongoDatabaseFactory} and {@link MongoConverter}.
*
* @param dataBufferFactory must not be {@literal null}.
* @param dbFactory must not be {@literal null}.
* @param converter must not be {@literal null}.
* @param bucket can be {@literal null}.
*/
public ReactiveGridFsTemplate(DataBufferFactory dataBufferFactory, ReactiveMongoDatabaseFactory dbFactory,
MongoConverter converter, @Nullable String bucket) {
super(converter);
Assert.notNull(dataBufferFactory, "DataBufferFactory must not be null!");
Assert.notNull(dbFactory, "ReactiveMongoDatabaseFactory must not be null!");
this.dataBufferFactory = dataBufferFactory;
this.dbFactory = dbFactory;
this.bucket = bucket;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.ReactiveGridFsOperations#store(com.mongodb.reactivestreams.client.gridfs.AsyncInputStream, java.lang.String, java.lang.String, java.lang.Object)
*/
@Override
public Mono<ObjectId> store(AsyncInputStream content, @Nullable String filename, @Nullable String contentType,
@Nullable Object metadata) {
return store(content, filename, contentType, toDocument(metadata));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.ReactiveGridFsOperations#store(org.reactivestreams.Publisher, java.lang.String, java.lang.String, java.lang.Object)
*/
@Override
public Mono<ObjectId> store(Publisher<DataBuffer> content, @Nullable String filename, @Nullable String contentType,
@Nullable Object metadata) {
return store(content, filename, contentType, toDocument(metadata));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.ReactiveGridFsOperations#store(com.mongodb.reactivestreams.client.gridfs.AsyncInputStream, java.lang.String, java.lang.String, org.bson.Document)
*/
@Override
public Mono<ObjectId> store(AsyncInputStream content, @Nullable String filename, @Nullable String contentType,
@Nullable Document metadata) {
Assert.notNull(content, "InputStream must not be null!");
return Mono.from(getGridFs().uploadFromStream(filename, content, computeUploadOptionsFor(contentType, metadata)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.ReactiveGridFsOperations#store(org.reactivestreams.Publisher, java.lang.String, java.lang.String, org.bson.Document)
*/
@Override
public Mono<ObjectId> store(Publisher<DataBuffer> content, @Nullable String filename, @Nullable String contentType,
@Nullable Document metadata) {
Assert.notNull(content, "Content must not be null!");
return BinaryStreamAdapters.toAsyncInputStream(content).flatMap(it -> store(it, filename, contentType, metadata));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.ReactiveGridFsOperations#find(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public Flux<GridFSFile> find(Query query) {
return Flux.from(prepareQuery(query));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.ReactiveGridFsOperations#findOne(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public Mono<GridFSFile> findOne(Query query) {
return Flux.from(prepareQuery(query).limit(2)) //
.collectList() //
.flatMap(it -> {
if (it.isEmpty()) {
return Mono.empty();
}
if (it.size() > 1) {
return Mono.error(new IncorrectResultSizeDataAccessException(
"Query " + SerializationUtils.serializeToJsonSafely(query) + " returned non unique result.", 1));
}
return Mono.just(it.get(0));
});
}
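findOne(...) reads at most two elements, which is just enough to detect an ambiguous result without draining the cursor. A behavioral sketch (template wiring assumed):

template.findOne(query(whereFilename().is("dup.txt")))
        .doOnError(IncorrectResultSizeDataAccessException.class,
                e -> System.err.println("more than one file matched"))
        .subscribe();
// for first-match-wins semantics, use findFirst(...) instead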
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.ReactiveGridFsOperations#findFirst(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public Mono<GridFSFile> findFirst(Query query) {
return Flux.from(prepareQuery(query).limit(1)).next();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.ReactiveGridFsOperations#delete(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public Mono<Void> delete(Query query) {
return find(query).flatMap(it -> getGridFs().delete(it.getId())).then();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.ReactiveGridFsOperations#getResource(java.lang.String)
*/
@Override
public Mono<ReactiveGridFsResource> getResource(String location) {
Assert.notNull(location, "Filename must not be null!");
return findOne(query(whereFilename().is(location))).flatMap(this::getResource)
.defaultIfEmpty(ReactiveGridFsResource.absent(location));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.ReactiveGridFsOperations#getResource(com.mongodb.client.gridfs.model.GridFSFile)
*/
@Override
public Mono<ReactiveGridFsResource> getResource(GridFSFile file) {
Assert.notNull(file, "GridFSFile must not be null!");
return Mono.fromSupplier(() -> {
GridFSDownloadStream stream = getGridFs().openDownloadStream(file.getObjectId());
return new ReactiveGridFsResource(file, BinaryStreamAdapters.toPublisher(stream, dataBufferFactory));
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.ReactiveGridFsOperations#getResources(java.lang.String)
*/
@Override
public Flux<ReactiveGridFsResource> getResources(String locationPattern) {
if (!StringUtils.hasText(locationPattern)) {
return Flux.empty();
}
AntPath path = new AntPath(locationPattern);
if (path.isPattern()) {
Flux<GridFSFile> files = find(query(whereFilename().regex(path.toRegex())));
return files.flatMap(this::getResource);
}
return getResource(locationPattern).flux();
}
protected GridFSFindPublisher prepareQuery(Query query) {
Assert.notNull(query, "Query must not be null!");
Document queryObject = getMappedQuery(query.getQueryObject());
Document sortObject = getMappedQuery(query.getSortObject());
GridFSFindPublisher publisherToUse = getGridFs().find(queryObject).sort(sortObject);
Integer cursorBatchSize = query.getMeta().getCursorBatchSize();
if (cursorBatchSize != null) {
publisherToUse = publisherToUse.batchSize(cursorBatchSize);
}
return publisherToUse;
}
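prepareQuery(...) also propagates a cursor batch size from the Query meta, if one is set; an illustrative caller, assuming Query#cursorBatchSize is available as in recent Spring Data MongoDB versions:

Query query = query(whereFilename().regex("reports/.*"));
query.cursorBatchSize(64); // ends up in GridFSFindPublisher.batchSize(64)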
protected GridFSBucket getGridFs() {
MongoDatabase db = dbFactory.getMongoDatabase();
return bucket == null ? GridFSBuckets.create(db) : GridFSBuckets.create(db, bucket);
}
}

View File

@@ -19,20 +19,27 @@ import java.lang.annotation.Annotation;
import java.util.Collection;
import java.util.Collections;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.RootBeanDefinition;
import org.springframework.core.annotation.AnnotationAttributes;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mongodb.config.BeanNames;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.support.MongoRepositoryFactoryBean;
import org.springframework.data.repository.config.AnnotationRepositoryConfigurationSource;
import org.springframework.data.repository.config.RepositoryConfigurationExtension;
import org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport;
import org.springframework.data.repository.config.RepositoryConfigurationSource;
import org.springframework.data.repository.config.XmlRepositoryConfigurationSource;
import org.springframework.data.repository.core.RepositoryMetadata;
import org.w3c.dom.Element;
/**
* {@link org.springframework.data.repository.config.RepositoryConfigurationExtension} for MongoDB.
* {@link RepositoryConfigurationExtension} for MongoDB.
*
* @author Oliver Gierke
* @author Mark Paluch
@@ -112,6 +119,25 @@ public class MongoRepositoryConfigurationExtension extends RepositoryConfigurati
builder.addPropertyValue("createIndexesForQueryMethods", attributes.getBoolean("createIndexesForQueryMethods"));
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport#registerBeansForRoot(org.springframework.beans.factory.support.BeanDefinitionRegistry, org.springframework.data.repository.config.RepositoryConfigurationSource)
*/
@Override
public void registerBeansForRoot(BeanDefinitionRegistry registry, RepositoryConfigurationSource configurationSource) {
super.registerBeansForRoot(registry, configurationSource);
if (!registry.containsBeanDefinition(BeanNames.MAPPING_CONTEXT_BEAN_NAME)) {
RootBeanDefinition definition = new RootBeanDefinition(MongoMappingContext.class);
definition.setRole(AbstractBeanDefinition.ROLE_INFRASTRUCTURE);
definition.setSource(configurationSource.getSource());
registry.registerBeanDefinition(BeanNames.MAPPING_CONTEXT_BEAN_NAME, definition);
}
}
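The effect is that a shared MongoMappingContext bean is registered exactly once for repository configurations such as this (illustrative):

@Configuration
@EnableMongoRepositories(basePackages = "com.example.repositories")
class RepositoryConfig {
    // BeanNames.MAPPING_CONTEXT_BEAN_NAME is only registered if absent
}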
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport#useRepositoryConfiguration(org.springframework.data.repository.core.RepositoryMetadata)

View File

@@ -0,0 +1,666 @@
/*
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.query;
import lombok.EqualsAndHashCode;
import lombok.RequiredArgsConstructor;
import lombok.Value;
import lombok.experimental.UtilityClass;
import java.io.StringWriter;
import java.util.ArrayList;
import java.util.Collection;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.NoSuchElementException;
import java.util.UUID;
import java.util.function.Function;
import java.util.function.Supplier;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.bson.codecs.BinaryCodec;
import org.bson.codecs.Codec;
import org.bson.codecs.UuidCodec;
import org.bson.json.JsonWriter;
import org.bson.types.Binary;
import org.springframework.data.mongodb.CodecRegistryProvider;
import org.springframework.data.mongodb.repository.query.StringBasedMongoQuery.ParameterBinding;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.expression.EvaluationContext;
import org.springframework.expression.Expression;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.Base64Utils;
import org.springframework.util.CollectionUtils;
import org.springframework.util.StringUtils;
import com.mongodb.DBObject;
import com.mongodb.MongoClient;
import com.mongodb.util.JSON;
/**
* {@link ExpressionEvaluatingParameterBinder} allows evaluation, conversion and binding of parameters to placeholders
* within a {@link String}.
*
* @author Christoph Strobl
* @author Thomas Darimont
* @author Oliver Gierke
* @author Mark Paluch
* @since 1.9
*/
class ExpressionEvaluatingParameterBinder {
private final SpelExpressionParser expressionParser;
private final QueryMethodEvaluationContextProvider evaluationContextProvider;
private final CodecRegistryProvider codecRegistryProvider;
/**
* Creates a new {@link ExpressionEvaluatingParameterBinder}.
*
* @param expressionParser must not be {@literal null}.
* @param evaluationContextProvider must not be {@literal null}.
*/
public ExpressionEvaluatingParameterBinder(SpelExpressionParser expressionParser,
QueryMethodEvaluationContextProvider evaluationContextProvider) {
Assert.notNull(expressionParser, "ExpressionParser must not be null!");
Assert.notNull(evaluationContextProvider, "EvaluationContextProvider must not be null!");
this.expressionParser = expressionParser;
this.evaluationContextProvider = evaluationContextProvider;
this.codecRegistryProvider = () -> MongoClient.getDefaultCodecRegistry();
}
/**
* Bind values provided by {@link MongoParameterAccessor} to placeholders in {@literal raw} while considering
* potential conversions and parameter types.
*
* @param raw can be empty.
* @param accessor must not be {@literal null}.
* @param bindingContext must not be {@literal null}.
* @return the bound query string; the given {@code raw} value is returned as-is if it is empty.
*/
public String bind(String raw, MongoParameterAccessor accessor, BindingContext bindingContext) {
if (!StringUtils.hasText(raw)) {
return raw;
}
return replacePlaceholders(raw, accessor, bindingContext);
}
/**
* Replaces the parameter placeholders with the actual parameter values from the given {@link ParameterBinding}s.
*
* @param input must not be {@literal null} or empty.
* @param accessor must not be {@literal null}.
* @param bindingContext must not be {@literal null}.
* @return
*/
private String replacePlaceholders(String input, MongoParameterAccessor accessor, BindingContext bindingContext) {
if (!bindingContext.hasBindings()) {
return input;
}
if (input.matches("^\\?\\d+$")) {
return getParameterValueForBinding(accessor, bindingContext.getParameters(),
bindingContext.getBindings().iterator().next());
}
Matcher matcher = createReplacementPattern(bindingContext.getBindings()).matcher(input);
StringBuffer buffer = new StringBuffer();
int parameterIndex = 0;
while (matcher.find()) {
Placeholder placeholder = extractPlaceholder(parameterIndex++, matcher);
ParameterBinding binding = bindingContext.getBindingFor(placeholder);
String valueForBinding = getParameterValueForBinding(accessor, bindingContext.getParameters(), binding);
// appendReplacement does not like unescaped $ sign and others, so we need to quote that stuff first
matcher.appendReplacement(buffer, Matcher.quoteReplacement(valueForBinding));
if (StringUtils.hasText(placeholder.getSuffix())) {
buffer.append(placeholder.getSuffix());
}
if (placeholder.isQuoted()) {
postProcessQuotedBinding(buffer, valueForBinding,
!binding.isExpression() ? accessor.getBindableValue(binding.getParameterIndex()) : null,
binding.isExpression());
}
}
matcher.appendTail(buffer);
return buffer.toString();
}
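A sketch of what the placeholder replacement produces for a simple annotated query (values illustrative):

// raw input:       { 'firstname' : ?0 }
// bound value ?0:  "Matthews"
// bound output:    { 'firstname' : "Matthews" }
//
// quoted placeholders go through the quote post-processing:
// raw input:       { 'lastname' : '?0' }  ->  { 'lastname' : "Matthews" }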
/**
* Sanitizes a String binding by replacing single-quoted values with double quotes, which prevents single quotes
* contained in the replacement from interfering with the JSON parsing. Complex objects are taken care of by removing
* the quotation entirely.
*
* @param buffer the {@link StringBuffer} to operate upon.
* @param valueForBinding the actual binding value.
* @param raw the raw binding value
* @param isExpression {@literal true} if the binding value results from a SpEL expression.
*/
private void postProcessQuotedBinding(StringBuffer buffer, String valueForBinding, @Nullable Object raw,
boolean isExpression) {
int quotationMarkIndex = buffer.length() - valueForBinding.length() - 1;
char quotationMark = buffer.charAt(quotationMarkIndex);
while (quotationMark != '\'' && quotationMark != '"') {
quotationMarkIndex--;
if (quotationMarkIndex < 0) {
throw new IllegalArgumentException("Could not find opening quotes for quoted parameter");
}
quotationMark = buffer.charAt(quotationMarkIndex);
}
// remove quotation char before the complex object string
if (valueForBinding.startsWith("{") && (raw instanceof DBObject || isExpression)) {
buffer.deleteCharAt(quotationMarkIndex);
} else {
if (isExpression) {
buffer.deleteCharAt(quotationMarkIndex);
return;
}
if (quotationMark == '\'') {
buffer.replace(quotationMarkIndex, quotationMarkIndex + 1, "\"");
}
buffer.append("\"");
}
}
/**
* Returns the serialized value to be used for the given {@link ParameterBinding}.
*
* @param accessor must not be {@literal null}.
* @param parameters
* @param binding must not be {@literal null}.
* @return
*/
@SuppressWarnings("unchecked")
private String getParameterValueForBinding(MongoParameterAccessor accessor, MongoParameters parameters,
ParameterBinding binding) {
Object value = binding.isExpression()
? evaluateExpression(binding.getExpression(), parameters, accessor.getValues())
: accessor.getBindableValue(binding.getParameterIndex());
if (value instanceof String && binding.isQuoted()) {
if (binding.isExpression() && ((String) value).startsWith("{")) {
return (String) value;
}
return binding.isExpression() ? JSON.serialize(value) : QuotedString.unquote(JSON.serialize(value));
}
return EncodableValue.create(value).encode(codecRegistryProvider, binding.isQuoted());
}
/**
* Evaluates the given {@code expressionString}.
*
* @param expressionString must not be {@literal null} or empty.
* @param parameters must not be {@literal null}.
* @param parameterValues must not be {@literal null}.
* @return
*/
@Nullable
private Object evaluateExpression(String expressionString, MongoParameters parameters, Object[] parameterValues) {
EvaluationContext evaluationContext = evaluationContextProvider.getEvaluationContext(parameters, parameterValues);
Expression expression = expressionParser.parseExpression(expressionString);
return expression.getValue(evaluationContext, Object.class);
}
/**
* Creates a replacement {@link Pattern} for all {@link ParameterBinding#getParameter() binding parameters}, including
* a potential trailing quotation mark.
*
* @param bindings
* @return
*/
private Pattern createReplacementPattern(List<ParameterBinding> bindings) {
StringBuilder regex = new StringBuilder();
for (ParameterBinding binding : bindings) {
regex.append("|");
regex.append("(" + Pattern.quote(binding.getParameter()) + ")");
regex.append("([\\w.]*");
regex.append("(\\W?['\"]|\\w*')?)");
}
return Pattern.compile(regex.substring(1));
}
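For a single binding ?0 the generated pattern looks roughly like this (rendering is illustrative):

// (\Q?0\E)([\w.]*(\W?['"]|\w*')?)
//  group 1: the placeholder itself
//  group 2: a trailing suffix (property path or closing quote) that
//           extractPlaceholder(...) strips off again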
/**
* Extracts the placeholder, stripping any trailing quotation mark that might have resulted from the
* {@link #createReplacementPattern(List) pattern} used.
*
* @param parameterIndex The actual parameter index.
* @param matcher The actual {@link Matcher}.
* @return
*/
private Placeholder extractPlaceholder(int parameterIndex, Matcher matcher) {
String rawPlaceholder = matcher.group(parameterIndex * 3 + 1);
String suffix = matcher.group(parameterIndex * 3 + 2);
if (!StringUtils.hasText(rawPlaceholder)) {
rawPlaceholder = matcher.group();
if (rawPlaceholder.matches(".*\\d$")) {
suffix = "";
} else {
int index = rawPlaceholder.replaceAll("[^\\?0-9]*$", "").length() - 1;
if (index > 0 && rawPlaceholder.length() > index) {
suffix = rawPlaceholder.substring(index + 1);
}
}
if (QuotedString.endsWithQuote(rawPlaceholder)) {
rawPlaceholder = rawPlaceholder.substring(0,
rawPlaceholder.length() - (StringUtils.hasText(suffix) ? suffix.length() : 1));
}
}
if (StringUtils.hasText(suffix)) {
boolean quoted = QuotedString.endsWithQuote(suffix);
return Placeholder.of(parameterIndex, rawPlaceholder, quoted,
quoted ? QuotedString.unquoteSuffix(suffix) : suffix);
}
return Placeholder.of(parameterIndex, rawPlaceholder, false, null);
}
/**
* @author Christoph Strobl
* @author Mark Paluch
* @since 1.9
*/
static class BindingContext {
final MongoParameters parameters;
final Map<Placeholder, ParameterBinding> bindings;
/**
* Creates a new {@link BindingContext}.
*
* @param parameters
* @param bindings
*/
public BindingContext(MongoParameters parameters, List<ParameterBinding> bindings) {
this.parameters = parameters;
this.bindings = mapBindings(bindings);
}
/**
* @return {@literal true} when the list of bindings is not empty.
*/
boolean hasBindings() {
return !CollectionUtils.isEmpty(bindings);
}
/**
* Get unmodifiable list of {@link ParameterBinding}s.
*
* @return never {@literal null}.
*/
public List<ParameterBinding> getBindings() {
return new ArrayList<ParameterBinding>(bindings.values());
}
/**
* Get the concrete {@link ParameterBinding} for a given {@literal placeholder}.
*
* @param placeholder must not be {@literal null}.
* @return
* @throws java.util.NoSuchElementException
* @since 1.10
*/
ParameterBinding getBindingFor(Placeholder placeholder) {
if (!bindings.containsKey(placeholder)) {
throw new NoSuchElementException(String.format("Could not find binding for placeholder '%s'.", placeholder));
}
return bindings.get(placeholder);
}
/**
* Get the associated {@link MongoParameters}.
*
* @return
*/
public MongoParameters getParameters() {
return parameters;
}
private static Map<Placeholder, ParameterBinding> mapBindings(List<ParameterBinding> bindings) {
Map<Placeholder, ParameterBinding> map = new LinkedHashMap<Placeholder, ParameterBinding>(bindings.size(), 1);
int parameterIndex = 0;
for (ParameterBinding binding : bindings) {
map.put(Placeholder.of(parameterIndex++, binding.getParameter(), binding.isQuoted(), null), binding);
}
return map;
}
}
/**
* Encapsulates a quoted/unquoted parameter placeholder.
*
* @author Mark Paluch
* @since 1.9
*/
@Value(staticConstructor = "of")
@EqualsAndHashCode(exclude = { "quoted", "suffix" })
static class Placeholder {
private int parameterIndex;
private final String parameter;
private final boolean quoted;
private final @Nullable String suffix;
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return quoted ? String.format("'%s'", parameter + (suffix != null ? suffix : ""))
: parameter + (suffix != null ? suffix : "");
}
}
/**
* Utility to handle quoted strings using single/double quotes.
*
* @author Mark Paluch
*/
@UtilityClass
static class QuotedString {
/**
* @param string
* @return {@literal true} if {@literal string} ends with a single/double quote.
*/
static boolean endsWithQuote(String string) {
return string.endsWith("'") || string.endsWith("\"");
}
/**
* Remove trailing quoting from {@literal quoted}.
*
* @param quoted
* @return {@literal quoted} with removed quotes.
*/
public static String unquoteSuffix(String quoted) {
return quoted.substring(0, quoted.length() - 1);
}
/**
* Remove leading and trailing quoting from {@literal quoted}.
*
* @param quoted
* @return {@literal quoted} with removed quotes.
*/
public static String unquote(String quoted) {
return quoted.substring(1, quoted.length() - 1);
}
}
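QuotedString behavior at a glance; a minimal sketch:

QuotedString.endsWithQuote("abc\"");    // true
QuotedString.unquoteSuffix("value'");   // "value"
QuotedString.unquote("\"Matthews\"");   // "Matthews"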
/**
* Value object encapsulating a bindable value that can be encoded to be represented as JSON (BSON).
*
* @author Mark Paluch
*/
abstract static class EncodableValue {
/**
* Obtain an {@link EncodableValue} given {@code value}.
*
* @param value the value to encode, may be {@literal null}.
* @return the {@link EncodableValue} for {@code value}.
*/
@SuppressWarnings("unchecked")
public static EncodableValue create(@Nullable Object value) {
if (value instanceof byte[]) {
return new BinaryValue((byte[]) value);
}
if (value instanceof UUID) {
return new UuidValue((UUID) value);
}
if (value instanceof Collection) {
Collection<?> collection = (Collection<?>) value;
Class<?> commonElement = CollectionUtils.findCommonElementType(collection);
if (commonElement != null) {
if (UUID.class.isAssignableFrom(commonElement)) {
return new UuidCollection((Collection<UUID>) value);
}
if (byte[].class.isAssignableFrom(commonElement)) {
return new BinaryCollectionValue((Collection<byte[]>) value);
}
}
}
return new ObjectValue(value);
}
/**
* Encode the encapsulated value.
*
* @param provider
* @param quoted
* @return
*/
public abstract String encode(CodecRegistryProvider provider, boolean quoted);
/**
* Encode a {@code value} to JSON.
*
* @param provider
* @param value
* @param defaultCodec
* @param <V>
* @return
*/
protected <V> String encode(CodecRegistryProvider provider, V value, Supplier<Codec<V>> defaultCodec) {
StringWriter writer = new StringWriter();
doEncode(provider, writer, value, defaultCodec);
return writer.toString();
}
/**
* Encode a {@link Collection} to JSON and potentially apply a {@link Function mapping function} before encoding.
*
* @param provider
* @param value
* @param mappingFunction
* @param defaultCodec
* @param <I> Input value type.
* @param <V> Target type.
* @return
*/
protected <I, V> String encodeCollection(CodecRegistryProvider provider, Iterable<I> value,
Function<I, V> mappingFunction, Supplier<Codec<V>> defaultCodec) {
StringWriter writer = new StringWriter();
writer.append("[");
value.forEach(it -> {
if (writer.getBuffer().length() > 1) {
writer.append(", ");
}
doEncode(provider, writer, mappingFunction.apply(it), defaultCodec);
});
writer.append("]");
writer.flush();
return writer.toString();
}
@SuppressWarnings("unchecked")
private <V> void doEncode(CodecRegistryProvider provider, StringWriter writer, V value,
Supplier<Codec<V>> defaultCodec) {
Codec<V> codec = provider.getCodecFor((Class<V>) value.getClass()).orElseGet(defaultCodec);
JsonWriter jsonWriter = new JsonWriter(writer);
codec.encode(jsonWriter, value, null);
jsonWriter.flush();
}
}
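The create(...) factory dispatches on the runtime type; a sketch of the mapping (values illustrative):

EncodableValue.create(new byte[] { 1, 2 });                      // BinaryValue    -> { "$binary" : ... }
EncodableValue.create(UUID.randomUUID());                        // UuidValue      -> { "$binary" : ... }
EncodableValue.create(Collections.singleton(UUID.randomUUID())); // UuidCollection -> BSON list of $binary
EncodableValue.create("plain");                                  // ObjectValue    -> JSON.serialize(...)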
/**
* {@link EncodableValue} for {@code byte[]} to render to {@literal $binary}.
*/
@RequiredArgsConstructor
static class BinaryValue extends EncodableValue {
private final byte[] value;
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.ExpressionEvaluatingParameterBinder.EncodableValue#encode(org.springframework.data.mongodb.CodecRegistryProvider, boolean)
*/
@Override
public String encode(CodecRegistryProvider provider, boolean quoted) {
if (quoted) {
return Base64Utils.encodeToString(this.value);
}
return encode(provider, new Binary(this.value), BinaryCodec::new);
}
}
/**
* {@link EncodableValue} for {@link Collection} containing only {@code byte[]} items to render to a BSON list
* containing {@literal $binary}.
*/
@RequiredArgsConstructor
static class BinaryCollectionValue extends EncodableValue {
private final Collection<byte[]> value;
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.ExpressionEvaluatingParameterBinder.EncodableValue#encode(org.springframework.data.mongodb.CodecRegistryProvider, boolean)
*/
@Override
public String encode(CodecRegistryProvider provider, boolean quoted) {
return encodeCollection(provider, this.value, Binary::new, BinaryCodec::new);
}
}
/**
* {@link EncodableValue} for {@link UUID} to render to {@literal $binary}.
*/
@RequiredArgsConstructor
static class UuidValue extends EncodableValue {
private final UUID value;
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.ExpressionEvaluatingParameterBinder.EncodableValue#encode(org.springframework.data.mongodb.CodecRegistryProvider, boolean)
*/
@Override
public String encode(CodecRegistryProvider provider, boolean quoted) {
if (quoted) {
return this.value.toString();
}
return encode(provider, this.value, UuidCodec::new);
}
}
/**
* {@link EncodableValue} for {@link Collection} containing only {@link UUID} items to render to a BSON list
* containing {@literal $binary}.
*/
@RequiredArgsConstructor
static class UuidCollection extends EncodableValue {
private final Collection<UUID> value;
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.ExpressionEvaluatingParameterBinder.EncodableValue#encode(org.springframework.data.mongodb.CodecRegistryProvider, boolean)
*/
@Override
public String encode(CodecRegistryProvider provider, boolean quoted) {
return encodeCollection(provider, this.value, Function.identity(), UuidCodec::new);
}
}
/**
* Fallback-{@link EncodableValue} for {@link Object}-typed values.
*/
@RequiredArgsConstructor
static class ObjectValue extends EncodableValue {
private final @Nullable Object value;
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.ExpressionEvaluatingParameterBinder.EncodableValue#encode(org.springframework.data.mongodb.CodecRegistryProvider, boolean)
*/
@Override
public String encode(CodecRegistryProvider provider, boolean quoted) {
return JSON.serialize(this.value);
}
}
}

View File

@@ -16,13 +16,11 @@
package org.springframework.data.mongodb.repository.query;
import org.springframework.data.repository.core.EntityInformation;
import org.springframework.lang.Nullable;
/**
* Mongo specific {@link EntityInformation}.
*
* @author Oliver Gierke
* @author Mark Paluch
*/
public interface MongoEntityInformation<T, ID> extends EntityInformation<T, ID> {
@@ -39,26 +37,4 @@ public interface MongoEntityInformation<T, ID> extends EntityInformation<T, ID>
* @return
*/
String getIdAttribute();
/**
* Returns whether the entity uses optimistic locking.
*
* @return true if the entity defines a {@link org.springframework.data.annotation.Version} property.
* @since 2.2
*/
default boolean isVersioned() {
return false;
}
/**
* Returns the version value for the entity or {@literal null} if the entity is not {@link #isVersioned() versioned}.
*
* @param entity must not be {@literal null}
* @return can be {@literal null}.
* @since 2.2
*/
@Nullable
default Object getVersion(T entity) {
return null;
}
}

View File

@@ -25,9 +25,7 @@ import java.util.regex.Pattern;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.domain.Range;
import org.springframework.data.domain.Range.Bound;
import org.springframework.data.domain.Sort;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Metrics;
@@ -44,6 +42,7 @@ import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.data.mongodb.core.query.MongoRegexCreator;
import org.springframework.data.mongodb.core.query.MongoRegexCreator.MatchMode;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.repository.query.ConvertingParameterAccessor.PotentiallyConvertingIterator;
import org.springframework.data.repository.query.parser.AbstractQueryCreator;
import org.springframework.data.repository.query.parser.Part;
import org.springframework.data.repository.query.parser.Part.IgnoreCaseType;
@@ -188,7 +187,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
case LESS_THAN_EQUAL:
return criteria.lte(parameters.next());
case BETWEEN:
return computeBetweenPart(criteria, parameters);
return criteria.gt(parameters.next()).lt(parameters.next());
case IS_NOT_NULL:
return criteria.ne(null);
case IS_NULL:
@@ -419,51 +418,6 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
return false;
}
/**
* Compute a {@link Type#BETWEEN} typed {@link Part} using {@link Criteria#gt(Object) $gt},
* {@link Criteria#gte(Object) $gte}, {@link Criteria#lt(Object) $lt} and {@link Criteria#lte(Object) $lte}.
* <p/>
* In case the first {@literal value} is actually a {@link Range} the lower and upper bounds of the {@link Range} are
* used according to their {@link Bound#isInclusive() inclusion} definition. Otherwise the {@literal value} is used
* for {@literal $gt} and {@link Iterator#next() parameters.next()} as {@literal $lt}.
*
* @param criteria must not be {@literal null}.
* @param parameters must not be {@literal null}.
* @return
* @since 2.2
*/
private static Criteria computeBetweenPart(Criteria criteria, Iterator<Object> parameters) {
Object value = parameters.next();
if (!(value instanceof Range)) {
return criteria.gt(value).lt(parameters.next());
}
Range<?> range = (Range<?>) value;
Optional<?> min = range.getLowerBound().getValue();
Optional<?> max = range.getUpperBound().getValue();
min.ifPresent(it -> {
if (range.getLowerBound().isInclusive()) {
criteria.gte(it);
} else {
criteria.gt(it);
}
});
max.ifPresent(it -> {
if (range.getUpperBound().isInclusive()) {
criteria.lte(it);
} else {
criteria.lt(it);
}
});
return criteria;
}
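computeBetweenPart(...) in terms of the queries it emits; a sketch using illustrative repository methods:

// findByAgeBetween(Range.from(Bound.inclusive(18)).to(Bound.exclusive(65)))
//   -> { "age" : { "$gte" : 18, "$lt" : 65 } }
// findByAgeBetween(18, 65)   // two plain values
//   -> { "age" : { "$gt" : 18, "$lt" : 65 } }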
private static MatchMode toMatchMode(Type type) {
switch (type) {

View File

@@ -270,7 +270,6 @@ public class MongoQueryMethod extends QueryMethod {
}
if (meta.maxScanDocuments() > 0) {
// TODO: Mongo 4 - removal
metaAttributes.setMaxScan(meta.maxScanDocuments());
}
@@ -283,8 +282,6 @@ public class MongoQueryMethod extends QueryMethod {
}
if (meta.snapshot()) {
// TODO: Mongo 4 - removal
metaAttributes.setSnapshot(meta.snapshot());
}

View File

@@ -16,7 +16,6 @@
package org.springframework.data.mongodb.repository.query;
import org.bson.Document;
import org.bson.json.JsonParseException;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
@@ -32,6 +31,10 @@ import org.springframework.data.repository.query.ReturnedType;
import org.springframework.data.repository.query.parser.PartTree;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.util.JSON;
import com.mongodb.util.JSONParseException;
/**
* {@link RepositoryQuery} implementation for Mongo.
*
@@ -111,12 +114,12 @@ public class PartTreeMongoQuery extends AbstractMongoQuery {
try {
BasicQuery result = new BasicQuery(query.getQueryObject(), Document.parse(fieldSpec));
BasicQuery result = new BasicQuery(query.getQueryObject(), new Document((BasicDBObject) JSON.parse(fieldSpec)));
result.setSortObject(query.getSortObject());
return result;
} catch (JsonParseException o_O) {
} catch (JSONParseException o_O) {
throw new IllegalStateException(String.format("Invalid query or field specification in %s!", getQueryMethod()),
o_O);
}

View File

@@ -16,7 +16,6 @@
package org.springframework.data.mongodb.repository.query;
import org.bson.Document;
import org.bson.json.JsonParseException;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
@@ -31,6 +30,8 @@ import org.springframework.data.repository.query.ReturnedType;
import org.springframework.data.repository.query.parser.PartTree;
import org.springframework.util.StringUtils;
import com.mongodb.util.JSONParseException;
/**
* Reactive PartTree {@link RepositoryQuery} implementation for Mongo.
*
@@ -109,7 +110,7 @@ public class ReactivePartTreeMongoQuery extends AbstractReactiveMongoQuery {
return result;
} catch (JsonParseException o_O) {
} catch (JSONParseException o_O) {
throw new IllegalStateException(String.format("Invalid query or field specification in %s!", getQueryMethod()),
o_O);
}

View File

@@ -15,15 +15,18 @@
*/
package org.springframework.data.mongodb.repository.query;
import org.bson.Document;
import java.util.ArrayList;
import java.util.List;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.util.json.ParameterBindingContext;
import org.springframework.data.mongodb.util.json.ParameterBindingDocumentCodec;
import org.springframework.data.mongodb.repository.query.ExpressionEvaluatingParameterBinder.BindingContext;
import org.springframework.data.mongodb.repository.query.StringBasedMongoQuery.ParameterBinding;
import org.springframework.data.mongodb.repository.query.StringBasedMongoQuery.ParameterBindingParser;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.util.Assert;
@@ -39,17 +42,16 @@ public class ReactiveStringBasedMongoQuery extends AbstractReactiveMongoQuery {
private static final String COUNT_EXISTS_AND_DELETE = "Manually defined query for %s cannot be a count and exists or delete query at the same time!";
private static final Logger LOG = LoggerFactory.getLogger(ReactiveStringBasedMongoQuery.class);
private static final ParameterBindingDocumentCodec CODEC = new ParameterBindingDocumentCodec();
private static final ParameterBindingParser BINDING_PARSER = ParameterBindingParser.INSTANCE;
private final String query;
private final String fieldSpec;
private final SpelExpressionParser expressionParser;
private final QueryMethodEvaluationContextProvider evaluationContextProvider;
private final boolean isCountQuery;
private final boolean isExistsQuery;
private final boolean isDeleteQuery;
private final List<ParameterBinding> queryParameterBindings;
private final List<ParameterBinding> fieldSpecParameterBindings;
private final ExpressionEvaluatingParameterBinder parameterBinder;
/**
* Creates a new {@link ReactiveStringBasedMongoQuery} for the given {@link MongoQueryMethod} and
@@ -83,10 +85,13 @@ public class ReactiveStringBasedMongoQuery extends AbstractReactiveMongoQuery {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(expressionParser, "SpelExpressionParser must not be null!");
this.query = query;
this.expressionParser = expressionParser;
this.evaluationContextProvider = evaluationContextProvider;
this.fieldSpec = method.getFieldSpecification();
this.queryParameterBindings = new ArrayList<ParameterBinding>();
this.query = BINDING_PARSER.parseAndCollectParameterBindingsFromQueryIntoBindings(query,
this.queryParameterBindings);
this.fieldSpecParameterBindings = new ArrayList<ParameterBinding>();
this.fieldSpec = BINDING_PARSER.parseAndCollectParameterBindingsFromQueryIntoBindings(
method.getFieldSpecification(), this.fieldSpecParameterBindings);
if (method.hasAnnotatedQuery()) {
@@ -106,6 +111,8 @@ public class ReactiveStringBasedMongoQuery extends AbstractReactiveMongoQuery {
this.isExistsQuery = false;
this.isDeleteQuery = false;
}
this.parameterBinder = new ExpressionEvaluatingParameterBinder(expressionParser, evaluationContextProvider);
}
/*
@@ -115,13 +122,12 @@ public class ReactiveStringBasedMongoQuery extends AbstractReactiveMongoQuery {
@Override
protected Query createQuery(ConvertingParameterAccessor accessor) {
ParameterBindingContext bindingContext = new ParameterBindingContext((accessor::getBindableValue), expressionParser,
evaluationContextProvider.getEvaluationContext(getQueryMethod().getParameters(), accessor.getValues()));
String queryString = parameterBinder.bind(this.query, accessor,
new BindingContext(getQueryMethod().getParameters(), queryParameterBindings));
String fieldsString = parameterBinder.bind(this.fieldSpec, accessor,
new BindingContext(getQueryMethod().getParameters(), fieldSpecParameterBindings));
Document queryObject = CODEC.decode(this.query, bindingContext);
Document fieldsObject = CODEC.decode(this.fieldSpec, bindingContext);
Query query = new BasicQuery(queryObject, fieldsObject).with(accessor.getSort());
Query query = new BasicQuery(queryString, fieldsString).with(accessor.getSort());
if (LOG.isDebugEnabled()) {
LOG.debug(String.format("Created query %s for %s fields.", query.getQueryObject(), query.getFieldsObject()));
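Note: with this change the annotated query string is no longer bound by string concatenation; the JSON is decoded directly by the codec while argument values are supplied on demand. A minimal sketch of that decoding step, mirroring the constructor call shown in this hunk — the values array is a hypothetical stand-in for the ConvertingParameterAccessor:

import org.bson.Document;
import org.springframework.data.mongodb.util.json.ParameterBindingContext;
import org.springframework.data.mongodb.util.json.ParameterBindingDocumentCodec;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.expression.spel.support.StandardEvaluationContext;

class BindingSketch {
    public static void main(String[] args) {
        Object[] values = { "Matthews" }; // hypothetical: stands in for the bindable method arguments
        ParameterBindingContext context = new ParameterBindingContext(index -> values[index],
                new SpelExpressionParser(), new StandardEvaluationContext());
        Document query = new ParameterBindingDocumentCodec().decode("{ 'lastname': ?0 }", context);
        System.out.println(query); // Document{{lastname=Matthews}}
    }
}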

View File

@@ -15,17 +15,31 @@
*/
package org.springframework.data.mongodb.repository.query;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.regex.PatternSyntaxException;
import org.bson.BSON;
import org.bson.Document;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.util.json.ParameterBindingContext;
import org.springframework.data.mongodb.util.json.ParameterBindingDocumentCodec;
import org.springframework.data.mongodb.repository.query.ExpressionEvaluatingParameterBinder.BindingContext;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
import com.mongodb.util.JSON;
import com.mongodb.util.JSONCallback;
/**
* Query to use a plain JSON String to create the {@link Query} to actually execute.
@@ -39,17 +53,16 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
private static final String COUNT_EXISTS_AND_DELETE = "Manually defined query for %s cannot be a count and exists or delete query at the same time!";
private static final Logger LOG = LoggerFactory.getLogger(StringBasedMongoQuery.class);
private static final ParameterBindingDocumentCodec CODEC = new ParameterBindingDocumentCodec();
private static final ParameterBindingParser BINDING_PARSER = ParameterBindingParser.INSTANCE;
private final String query;
private final String fieldSpec;
private final SpelExpressionParser expressionParser;
private final QueryMethodEvaluationContextProvider evaluationContextProvider;
private final boolean isCountQuery;
private final boolean isExistsQuery;
private final boolean isDeleteQuery;
private final List<ParameterBinding> queryParameterBindings;
private final List<ParameterBinding> fieldSpecParameterBindings;
private final ExpressionEvaluatingParameterBinder parameterBinder;
/**
* Creates a new {@link StringBasedMongoQuery} for the given {@link MongoQueryMethod}, {@link MongoOperations},
@@ -82,10 +95,15 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(expressionParser, "SpelExpressionParser must not be null!");
this.query = query;
this.expressionParser = expressionParser;
this.evaluationContextProvider = evaluationContextProvider;
this.fieldSpec = method.getFieldSpecification();
this.queryParameterBindings = new ArrayList<ParameterBinding>();
this.query = BINDING_PARSER.parseAndCollectParameterBindingsFromQueryIntoBindings(query,
this.queryParameterBindings);
this.fieldSpecParameterBindings = new ArrayList<ParameterBinding>();
this.fieldSpec = BINDING_PARSER.parseAndCollectParameterBindingsFromQueryIntoBindings(
method.getFieldSpecification(), this.fieldSpecParameterBindings);
this.parameterBinder = new ExpressionEvaluatingParameterBinder(expressionParser, evaluationContextProvider);
if (method.hasAnnotatedQuery()) {
@@ -114,13 +132,12 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
@Override
protected Query createQuery(ConvertingParameterAccessor accessor) {
ParameterBindingContext bindingContext = new ParameterBindingContext((accessor::getBindableValue), expressionParser,
evaluationContextProvider.getEvaluationContext(getQueryMethod().getParameters(), accessor.getValues()));
String queryString = parameterBinder.bind(this.query, accessor,
new BindingContext(getQueryMethod().getParameters(), queryParameterBindings));
String fieldsString = parameterBinder.bind(this.fieldSpec, accessor,
new BindingContext(getQueryMethod().getParameters(), fieldSpecParameterBindings));
Document queryObject = CODEC.decode(this.query, bindingContext);
Document fieldsObject = CODEC.decode(this.fieldSpec, bindingContext);
Query query = new BasicQuery(queryObject, fieldsObject).with(accessor.getSort());
Query query = new BasicQuery(queryString, fieldsString).with(accessor.getSort());
if (LOG.isDebugEnabled()) {
LOG.debug(String.format("Created query %s for %s fields.", query.getQueryObject(), query.getFieldsObject()));
@@ -169,4 +186,268 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
boolean isDeleteQuery) {
return BooleanUtil.countBooleanTrueValues(isCountQuery, isExistsQuery, isDeleteQuery) > 1;
}
/**
* A parser that extracts the parameter bindings from a given query string.
*
* @author Thomas Darimont
*/
enum ParameterBindingParser {
INSTANCE;
private static final String EXPRESSION_PARAM_QUOTE = "'";
private static final String EXPRESSION_PARAM_PREFIX = "?expr";
private static final String INDEX_BASED_EXPRESSION_PARAM_START = "?#{";
private static final String NAME_BASED_EXPRESSION_PARAM_START = ":#{";
private static final char CURRLY_BRACE_OPEN = '{';
private static final char CURRLY_BRACE_CLOSE = '}';
private static final String PARAMETER_PREFIX = "_param_";
private static final String PARSEABLE_PARAMETER = "\"" + PARAMETER_PREFIX + "$1\"";
private static final Pattern PARAMETER_BINDING_PATTERN = Pattern.compile("\\?(\\d+)");
private static final Pattern PARSEABLE_BINDING_PATTERN = Pattern.compile("\"?" + PARAMETER_PREFIX + "(\\d+)\"?");
private final static int PARAMETER_INDEX_GROUP = 1;
/**
* Parses the given {@code input}, collects every {@link ParameterBinding} found in it into {@code bindings} and
* returns the query with expression parameters replaced by synthetic placeholders.
*
* @param input can be empty.
* @param bindings must not be {@literal null}.
* @return the transformed query {@link String}.
*/
public String parseAndCollectParameterBindingsFromQueryIntoBindings(String input, List<ParameterBinding> bindings) {
if (!StringUtils.hasText(input)) {
return input;
}
Assert.notNull(bindings, "Parameter bindings must not be null!");
String transformedInput = transformQueryAndCollectExpressionParametersIntoBindings(input, bindings);
String parseableInput = makeParameterReferencesParseable(transformedInput);
collectParameterReferencesIntoBindings(bindings,
JSON.parse(parseableInput, new LenientPatternDecodingCallback()));
return transformedInput;
}
private static String transformQueryAndCollectExpressionParametersIntoBindings(String input,
List<ParameterBinding> bindings) {
StringBuilder result = new StringBuilder();
int startIndex = 0;
int currentPos = 0;
int exprIndex = 0;
while (currentPos < input.length()) {
int indexOfExpressionParameter = getIndexOfExpressionParameter(input, currentPos);
// no expression parameter found
if (indexOfExpressionParameter < 0) {
break;
}
int exprStart = indexOfExpressionParameter + 3;
currentPos = exprStart;
// eat parameter expression
int curlyBraceOpenCnt = 1;
while (curlyBraceOpenCnt > 0) {
switch (input.charAt(currentPos++)) {
case CURRLY_BRACE_OPEN:
curlyBraceOpenCnt++;
break;
case CURRLY_BRACE_CLOSE:
curlyBraceOpenCnt--;
break;
default:
}
}
result.append(input.subSequence(startIndex, indexOfExpressionParameter));
result.append(EXPRESSION_PARAM_QUOTE).append(EXPRESSION_PARAM_PREFIX);
result.append(exprIndex);
result.append(EXPRESSION_PARAM_QUOTE);
bindings.add(new ParameterBinding(exprIndex, true, input.substring(exprStart, currentPos - 1)));
startIndex = currentPos;
exprIndex++;
}
return result.append(input.subSequence(currentPos, input.length())).toString();
}
private static String makeParameterReferencesParseable(String input) {
Matcher matcher = PARAMETER_BINDING_PATTERN.matcher(input);
return matcher.replaceAll(PARSEABLE_PARAMETER);
}
private static void collectParameterReferencesIntoBindings(List<ParameterBinding> bindings, Object value) {
if (value instanceof String) {
String string = ((String) value).trim();
potentiallyAddBinding(string, bindings);
} else if (value instanceof Pattern) {
String string = value.toString().trim();
Matcher valueMatcher = PARSEABLE_BINDING_PATTERN.matcher(string);
while (valueMatcher.find()) {
int paramIndex = Integer.parseInt(valueMatcher.group(PARAMETER_INDEX_GROUP));
/*
* The pattern is used as a direct parameter replacement, e.g. 'field': ?1,
* therefore we treat it as not quoted to remain backwards compatible.
*/
boolean quoted = !string.equals(PARAMETER_PREFIX + paramIndex);
bindings.add(new ParameterBinding(paramIndex, quoted));
}
} else if (value instanceof DBRef) {
DBRef dbref = (DBRef) value;
potentiallyAddBinding(dbref.getCollectionName(), bindings);
potentiallyAddBinding(dbref.getId().toString(), bindings);
} else if (value instanceof Document) {
Document document = (Document) value;
for (String field : document.keySet()) {
collectParameterReferencesIntoBindings(bindings, field);
collectParameterReferencesIntoBindings(bindings, document.get(field));
}
} else if (value instanceof DBObject) {
DBObject dbo = (DBObject) value;
for (String field : dbo.keySet()) {
collectParameterReferencesIntoBindings(bindings, field);
collectParameterReferencesIntoBindings(bindings, dbo.get(field));
}
}
}
private static void potentiallyAddBinding(String source, List<ParameterBinding> bindings) {
Matcher valueMatcher = PARSEABLE_BINDING_PATTERN.matcher(source);
while (valueMatcher.find()) {
int paramIndex = Integer.parseInt(valueMatcher.group(PARAMETER_INDEX_GROUP));
boolean quoted = source.startsWith("'") || source.startsWith("\"");
bindings.add(new ParameterBinding(paramIndex, quoted));
}
}
private static int getIndexOfExpressionParameter(String input, int position) {
int indexOfExpressionParameter = input.indexOf(INDEX_BASED_EXPRESSION_PARAM_START, position);
return indexOfExpressionParameter < 0 ? input.indexOf(NAME_BASED_EXPRESSION_PARAM_START, position)
: indexOfExpressionParameter;
}
}
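For illustration, the heart of the rewrite performed above is a single regex transformation; a minimal sketch using the same pattern and replacement constants defined in this enum:

import java.util.regex.Pattern;

class PlaceholderRewriteSketch {
    public static void main(String[] args) {
        // makeParameterReferencesParseable(..): turn ?0 style placeholders into quoted
        // "_param_0" markers so the query can first be parsed as plain JSON.
        Pattern bindingPattern = Pattern.compile("\\?(\\d+)");
        String parseable = bindingPattern.matcher("{ 'lastname': ?0, 'age': ?1 }")
                .replaceAll("\"_param_$1\"");
        System.out.println(parseable); // { 'lastname': "_param_0", 'age': "_param_1" }
    }
}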
/**
* {@link JSONCallback} with lenient handling for {@link PatternSyntaxException} falling back to a placeholder
* {@link Pattern} for intermediate query document rendering.
*/
private static class LenientPatternDecodingCallback extends JSONCallback {
private static final Pattern EMPTY_MARKER = Pattern.compile("__Spring_Data_MongoDB_Bind_Marker__");
/*
* (non-Javadoc)
* @see com.mongodb.util.JSONCallback#objectDone()
*/
@Override
public Object objectDone() {
return exceptionSwallowingStackReducingObjectDone();
}
private Object exceptionSwallowingStackReducingObjectDone/*CauseWeJustNeedTheStructureNotTheActualValue*/() {
Object value;
try {
return super.objectDone();
} catch (PatternSyntaxException e) {
value = EMPTY_MARKER;
}
if (!isStackEmpty()) {
_put(curName(), value);
} else {
value = !BSON.hasDecodeHooks() ? value : BSON.applyDecodingHooks(value);
setRoot(value);
}
return value;
}
}
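The fallback exists because a still-unbound placeholder inside a regex literal is not a compilable pattern. A minimal sketch of the failure mode and the marker substitution:

import java.util.regex.Pattern;
import java.util.regex.PatternSyntaxException;

class LenientPatternSketch {
    public static void main(String[] args) {
        Pattern value;
        try {
            // '?0' alone is a dangling quantifier until the real value is bound
            value = Pattern.compile("?0");
        } catch (PatternSyntaxException e) {
            value = Pattern.compile("__Spring_Data_MongoDB_Bind_Marker__"); // structure-only placeholder
        }
        System.out.println(value);
    }
}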
/**
* A generic parameter binding with name or position information.
*
* @author Thomas Darimont
*/
static class ParameterBinding {
private final int parameterIndex;
private final boolean quoted;
private final @Nullable String expression;
/**
* Creates a new {@link ParameterBinding} with the given {@code parameterIndex} and {@code quoted} information.
*
* @param parameterIndex
* @param quoted whether or not the parameter is already quoted.
*/
public ParameterBinding(int parameterIndex, boolean quoted) {
this(parameterIndex, quoted, null);
}
public ParameterBinding(int parameterIndex, boolean quoted, @Nullable String expression) {
this.parameterIndex = parameterIndex;
this.quoted = quoted;
this.expression = expression;
}
public boolean isQuoted() {
return quoted;
}
public int getParameterIndex() {
return parameterIndex;
}
public String getParameter() {
return "?" + (isExpression() ? "expr" : "") + parameterIndex;
}
@Nullable
public String getExpression() {
return expression;
}
public boolean isExpression() {
return this.expression != null;
}
}
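The placeholder rendered back into the query depends on the binding kind; a quick sketch (same package assumed, since the class is package-private):

class ParameterBindingSketch {
    static void demo() {
        ParameterBinding positional = new ParameterBinding(0, false);
        ParameterBinding expression = new ParameterBinding(0, false, "#{#lastname}"); // hypothetical SpEL source
        System.out.println(positional.getParameter()); // ?0
        System.out.println(expression.getParameter()); // ?expr0
    }
}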
}

View File

@@ -16,7 +16,6 @@
package org.springframework.data.mongodb.repository.support;
import org.bson.types.ObjectId;
import org.springframework.data.mapping.PersistentPropertyAccessor;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
import org.springframework.data.repository.core.support.PersistentEntityInformation;
@@ -88,16 +87,14 @@ public class MappingMongoEntityInformation<T, ID> extends PersistentEntityInform
this.fallbackIdType = idType != null ? idType : (Class<ID>) ObjectId.class;
}
/*
* (non-Javadoc)
/* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.MongoEntityInformation#getCollectionName()
*/
public String getCollectionName() {
return customCollectionName == null ? entityMetadata.getCollection() : customCollectionName;
}
/*
* (non-Javadoc)
/* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.MongoEntityInformation#getIdAttribute()
*/
public String getIdAttribute() {
@@ -109,6 +106,7 @@ public class MappingMongoEntityInformation<T, ID> extends PersistentEntityInform
* @see org.springframework.data.repository.core.support.PersistentEntityInformation#getIdType()
*/
@Override
@SuppressWarnings("unchecked")
public Class<ID> getIdType() {
if (this.entityMetadata.hasIdProperty()) {
@@ -117,30 +115,4 @@ public class MappingMongoEntityInformation<T, ID> extends PersistentEntityInform
return fallbackIdType;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.MongoEntityInformation#isVersioned()
*/
@Override
public boolean isVersioned() {
return this.entityMetadata.hasVersionProperty();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.MongoEntityInformation#getVersion(Object)
*/
@Override
public Object getVersion(T entity) {
if (!isVersioned()) {
return null;
}
PersistentPropertyAccessor<T> accessor = this.entityMetadata.getPropertyAccessor(entity);
return accessor.getProperty(this.entityMetadata.getRequiredVersionProperty());
}
}
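For context, isVersioned()/getVersion(..) only come into play for entities mapped with a version property; an illustrative, hypothetical domain type:

import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.Version;

// Hypothetical entity: with this mapping isVersioned() yields true and
// getVersion(person) reads the @Version property through the property accessor.
class Person {
    @Id String id;
    @Version Long version;
    String lastname;
}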

View File

@@ -22,18 +22,23 @@ import org.springframework.dao.IncorrectResultSizeDataAccessException;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Order;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
import org.springframework.data.querydsl.EntityPathResolver;
import org.springframework.data.querydsl.QSort;
import org.springframework.data.querydsl.QuerydslPredicateExecutor;
import org.springframework.data.querydsl.SimpleEntityPathResolver;
import org.springframework.data.repository.core.EntityInformation;
import org.springframework.data.repository.support.PageableExecutionUtils;
import org.springframework.util.Assert;
import com.querydsl.core.NonUniqueResultException;
import com.querydsl.core.types.EntityPath;
import com.querydsl.core.types.Expression;
import com.querydsl.core.types.OrderSpecifier;
import com.querydsl.core.types.Predicate;
import com.querydsl.core.types.dsl.PathBuilder;
/**
* MongoDB-specific {@link QuerydslPredicateExecutor} that allows execution of {@link Predicate}s in various forms.
@@ -45,9 +50,10 @@ import com.querydsl.core.types.Predicate;
* @author Mark Paluch
* @since 2.0
*/
public class QuerydslMongoPredicateExecutor<T> extends QuerydslPredicateExecutorSupport<T>
implements QuerydslPredicateExecutor<T> {
public class QuerydslMongoPredicateExecutor<T> implements QuerydslPredicateExecutor<T> {
private final PathBuilder<T> builder;
private final EntityInformation<T, ?> entityInformation;
private final MongoOperations mongoOperations;
/**
@@ -74,8 +80,12 @@ public class QuerydslMongoPredicateExecutor<T> extends QuerydslPredicateExecutor
public QuerydslMongoPredicateExecutor(MongoEntityInformation<T, ?> entityInformation, MongoOperations mongoOperations,
EntityPathResolver resolver) {
super(mongoOperations.getConverter(), pathBuilderFor(resolver.createPath(entityInformation.getJavaType())),
entityInformation);
Assert.notNull(resolver, "EntityPathResolver must not be null!");
EntityPath<T> path = resolver.createPath(entityInformation.getJavaType());
this.builder = new PathBuilder<T>(path.getType(), path.getMetadata());
this.entityInformation = entityInformation;
this.mongoOperations = mongoOperations;
}
@@ -200,7 +210,7 @@ public class QuerydslMongoPredicateExecutor<T> extends QuerydslPredicateExecutor
* @return
*/
private SpringDataMongodbQuery<T> createQuery() {
return new SpringDataMongodbQuery<>(mongoOperations, typeInformation().getJavaType());
return new SpringDataMongodbQuery<>(mongoOperations, entityInformation.getJavaType());
}
/**
@@ -225,7 +235,32 @@ public class QuerydslMongoPredicateExecutor<T> extends QuerydslPredicateExecutor
*/
private SpringDataMongodbQuery<T> applySorting(SpringDataMongodbQuery<T> query, Sort sort) {
toOrderSpecifiers(sort).forEach(query::orderBy);
// TODO: find better solution than instanceof check
if (sort instanceof QSort) {
List<OrderSpecifier<?>> orderSpecifiers = ((QSort) sort).getOrderSpecifiers();
query.orderBy(orderSpecifiers.toArray(new OrderSpecifier<?>[orderSpecifiers.size()]));
return query;
}
sort.stream().map(this::toOrder).forEach(query::orderBy);
return query;
}
/**
* Transforms a plain {@link Order} into a Querydsl specific {@link OrderSpecifier}.
*
* @param order
* @return
*/
@SuppressWarnings({ "rawtypes", "unchecked" })
private OrderSpecifier<?> toOrder(Order order) {
Expression<Object> property = builder.get(order.getProperty());
return new OrderSpecifier(
order.isAscending() ? com.querydsl.core.types.Order.ASC : com.querydsl.core.types.Order.DESC, property);
}
}
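Usage sketch: a plain Sort is translated order-by-order through toOrder(..), while a QSort takes the shortcut shown in applySorting(..). QPerson and Person are hypothetical Querydsl-generated/domain types:

import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Sort;
import org.springframework.data.querydsl.QuerydslPredicateExecutor;

class SortUsageSketch {
    // QPerson/Person: hypothetical generated Querydsl type and entity
    Page<Person> firstPage(QuerydslPredicateExecutor<Person> executor) {
        return executor.findAll(QPerson.person.lastname.eq("Matthews"),
                PageRequest.of(0, 20, Sort.by(Sort.Direction.DESC, "firstname")));
    }
}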

View File

@@ -1,92 +0,0 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.support;
import java.util.List;
import java.util.stream.Collectors;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Order;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.querydsl.QSort;
import org.springframework.data.repository.core.EntityInformation;
import com.querydsl.core.types.EntityPath;
import com.querydsl.core.types.Expression;
import com.querydsl.core.types.OrderSpecifier;
import com.querydsl.core.types.dsl.PathBuilder;
/**
* @author Christoph Strobl
* @since 2.2
*/
abstract class QuerydslPredicateExecutorSupport<T> {
private final SpringDataMongodbSerializer serializer;
private final PathBuilder<T> builder;
private final EntityInformation<T, ?> entityInformation;
QuerydslPredicateExecutorSupport(MongoConverter converter, PathBuilder<T> builder,
EntityInformation<T, ?> entityInformation) {
this.serializer = new SpringDataMongodbSerializer(converter);
this.builder = builder;
this.entityInformation = entityInformation;
}
protected static <E> PathBuilder<E> pathBuilderFor(EntityPath<E> path) {
return new PathBuilder<>(path.getType(), path.getMetadata());
}
protected EntityInformation<T, ?> typeInformation() {
return entityInformation;
}
protected SpringDataMongodbSerializer mongodbSerializer() {
return serializer;
}
/**
* Transforms a plain {@link Order} into a Querydsl specific {@link OrderSpecifier}.
*
* @param order
* @return
*/
@SuppressWarnings({ "rawtypes", "unchecked" })
protected OrderSpecifier<?> toOrder(Order order) {
Expression<Object> property = builder.get(order.getProperty());
return new OrderSpecifier(
order.isAscending() ? com.querydsl.core.types.Order.ASC : com.querydsl.core.types.Order.DESC, property);
}
/**
* Converts the given {@link Sort} to {@link OrderSpecifier}.
*
* @param sort
* @return
*/
protected List<OrderSpecifier<?>> toOrderSpecifiers(Sort sort) {
if (sort instanceof QSort) {
return ((QSort) sort).getOrderSpecifiers();
}
return sort.stream().map(this::toOrder).collect(Collectors.toList());
}
}
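Complementary usage sketch (QPerson again hypothetical): passing a QSort short-circuits toOrderSpecifiers(..), because the specifiers are already typed:

import org.springframework.data.querydsl.QSort;
import org.springframework.data.querydsl.QuerydslPredicateExecutor;

class QSortUsageSketch {
    Iterable<Person> sorted(QuerydslPredicateExecutor<Person> executor) {
        return executor.findAll(QPerson.person.age.gt(21),
                new QSort(QPerson.person.lastname.asc(), QPerson.person.firstname.desc()));
    }
}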

View File

@@ -15,8 +15,6 @@
*/
package org.springframework.data.mongodb.repository.support;
import static org.springframework.data.querydsl.QuerydslUtils.*;
import lombok.AccessLevel;
import lombok.RequiredArgsConstructor;
@@ -34,13 +32,10 @@ import org.springframework.data.mongodb.repository.query.ReactiveMongoQueryMetho
import org.springframework.data.mongodb.repository.query.ReactivePartTreeMongoQuery;
import org.springframework.data.mongodb.repository.query.ReactiveStringBasedMongoQuery;
import org.springframework.data.projection.ProjectionFactory;
import org.springframework.data.querydsl.ReactiveQuerydslPredicateExecutor;
import org.springframework.data.repository.core.NamedQueries;
import org.springframework.data.repository.core.RepositoryInformation;
import org.springframework.data.repository.core.RepositoryMetadata;
import org.springframework.data.repository.core.support.ReactiveRepositoryFactorySupport;
import org.springframework.data.repository.core.support.RepositoryComposition.RepositoryFragments;
import org.springframework.data.repository.core.support.RepositoryFragment;
import org.springframework.data.repository.query.QueryLookupStrategy;
import org.springframework.data.repository.query.QueryLookupStrategy.Key;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
@@ -86,30 +81,6 @@ public class ReactiveMongoRepositoryFactory extends ReactiveRepositoryFactorySup
return SimpleReactiveMongoRepository.class;
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.core.support.RepositoryFactorySupport#getRepositoryFragments(org.springframework.data.repository.core.RepositoryMetadata)
*/
@Override
protected RepositoryFragments getRepositoryFragments(RepositoryMetadata metadata) {
RepositoryFragments fragments = RepositoryFragments.empty();
boolean isQueryDslRepository = QUERY_DSL_PRESENT
&& ReactiveQuerydslPredicateExecutor.class.isAssignableFrom(metadata.getRepositoryInterface());
if (isQueryDslRepository) {
MongoEntityInformation<?, Serializable> entityInformation = getEntityInformation(metadata.getDomainType(),
metadata);
fragments = fragments.append(RepositoryFragment.implemented(getTargetRepositoryViaReflection(
ReactiveQuerydslMongoPredicateExecutor.class, entityInformation, operations)));
}
return fragments;
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.core.support.RepositoryFactorySupport#getTargetRepository(org.springframework.data.repository.core.RepositoryInformation)
@@ -142,12 +113,12 @@ public class ReactiveMongoRepositoryFactory extends ReactiveRepositoryFactorySup
@SuppressWarnings("unchecked")
private <T, ID> MongoEntityInformation<T, ID> getEntityInformation(Class<T> domainClass,
@Nullable RepositoryMetadata metadata) {
@Nullable RepositoryInformation information) {
MongoPersistentEntity<?> entity = mappingContext.getRequiredPersistentEntity(domainClass);
return new MappingMongoEntityInformation<>((MongoPersistentEntity<T>) entity,
metadata != null ? (Class<ID>) metadata.getIdType() : null);
return new MappingMongoEntityInformation<T, ID>((MongoPersistentEntity<T>) entity,
information != null ? (Class<ID>) information.getIdType() : null);
}
/**

View File

@@ -1,197 +0,0 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.support;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
import org.springframework.data.querydsl.EntityPathResolver;
import org.springframework.data.querydsl.QuerydslPredicateExecutor;
import org.springframework.data.querydsl.ReactiveQuerydslPredicateExecutor;
import org.springframework.data.querydsl.SimpleEntityPathResolver;
import org.springframework.util.Assert;
import com.querydsl.core.types.EntityPath;
import com.querydsl.core.types.OrderSpecifier;
import com.querydsl.core.types.Predicate;
/**
* MongoDB-specific {@link QuerydslPredicateExecutor} that allows execution of {@link Predicate}s in various forms.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.2
*/
public class ReactiveQuerydslMongoPredicateExecutor<T> extends QuerydslPredicateExecutorSupport<T>
implements ReactiveQuerydslPredicateExecutor<T> {
private final ReactiveMongoOperations mongoOperations;
/**
* Creates a new {@link ReactiveQuerydslMongoPredicateExecutor} for the given {@link MongoEntityInformation} and
* {@link ReactiveMongoOperations}. Uses the {@link SimpleEntityPathResolver} to create an {@link EntityPath} for the
* given domain class.
*
* @param entityInformation must not be {@literal null}.
* @param mongoOperations must not be {@literal null}.
*/
public ReactiveQuerydslMongoPredicateExecutor(MongoEntityInformation<T, ?> entityInformation,
ReactiveMongoOperations mongoOperations) {
this(entityInformation, mongoOperations, SimpleEntityPathResolver.INSTANCE);
}
/**
* Creates a new {@link ReactiveQuerydslMongoPredicateExecutor} for the given {@link MongoEntityInformation},
* {@link ReactiveMongoOperations} and {@link EntityPathResolver}.
*
* @param entityInformation must not be {@literal null}.
* @param mongoOperations must not be {@literal null}.
* @param resolver must not be {@literal null}.
*/
public ReactiveQuerydslMongoPredicateExecutor(MongoEntityInformation<T, ?> entityInformation,
ReactiveMongoOperations mongoOperations, EntityPathResolver resolver) {
super(mongoOperations.getConverter(), pathBuilderFor(resolver.createPath(entityInformation.getJavaType())),
entityInformation);
this.mongoOperations = mongoOperations;
}
/*
* (non-Javadoc)
* @see org.springframework.data.querydsl.ReactiveQuerydslPredicateExecutor#findOne(com.querydsl.core.types.Predicate)
*/
@Override
public Mono<T> findOne(Predicate predicate) {
Assert.notNull(predicate, "Predicate must not be null!");
return createQueryFor(predicate).fetchOne();
}
/*
* (non-Javadoc)
* @see org.springframework.data.querydsl.ReactiveQuerydslPredicateExecutor#findAll(com.querydsl.core.types.Predicate)
*/
@Override
public Flux<T> findAll(Predicate predicate) {
Assert.notNull(predicate, "Predicate must not be null!");
return createQueryFor(predicate).fetch();
}
/*
* (non-Javadoc)
* @see org.springframework.data.querydsl.ReactiveQuerydslPredicateExecutor#findAll(com.querydsl.core.types.Predicate, com.querydsl.core.types.OrderSpecifier[])
*/
@Override
public Flux<T> findAll(Predicate predicate, OrderSpecifier<?>... orders) {
Assert.notNull(predicate, "Predicate must not be null!");
Assert.notNull(orders, "Order specifiers must not be null!");
return createQueryFor(predicate).orderBy(orders).fetch();
}
/*
* (non-Javadoc)
* @see org.springframework.data.querydsl.ReactiveQuerydslPredicateExecutor#findAll(com.querydsl.core.types.Predicate, org.springframework.data.domain.Sort)
*/
@Override
public Flux<T> findAll(Predicate predicate, Sort sort) {
Assert.notNull(predicate, "Predicate must not be null!");
Assert.notNull(sort, "Sort must not be null!");
return applySorting(createQueryFor(predicate), sort).fetch();
}
/*
* (non-Javadoc)
* @see org.springframework.data.querydsl.ReactiveQuerydslPredicateExecutor#findAll(com.querydsl.core.types.OrderSpecifier[])
*/
@Override
public Flux<T> findAll(OrderSpecifier<?>... orders) {
Assert.notNull(orders, "Order specifiers must not be null!");
return createQuery().orderBy(orders).fetch();
}
/*
* (non-Javadoc)
* @see org.springframework.data.querydsl.ReactiveQuerydslPredicateExecutor#count(com.querydsl.core.types.Predicate)
*/
@Override
public Mono<Long> count(Predicate predicate) {
Assert.notNull(predicate, "Predicate must not be null!");
return createQueryFor(predicate).fetchCount();
}
/*
* (non-Javadoc)
* @see org.springframework.data.querydsl.ReactiveQuerydslPredicateExecutor#exists(com.querydsl.core.types.Predicate)
*/
@Override
public Mono<Boolean> exists(Predicate predicate) {
Assert.notNull(predicate, "Predicate must not be null!");
return createQueryFor(predicate).fetchCount().map(it -> it != 0);
}
/**
* Creates a {@link ReactiveSpringDataMongodbQuery} for the given {@link Predicate}.
*
* @param predicate
* @return
*/
private ReactiveSpringDataMongodbQuery<T> createQueryFor(Predicate predicate) {
return createQuery().where(predicate);
}
/**
* Creates a {@link ReactiveSpringDataMongodbQuery}.
*
* @return
*/
private ReactiveSpringDataMongodbQuery<T> createQuery() {
Class<T> javaType = typeInformation().getJavaType();
return new ReactiveSpringDataMongodbQuery<>(mongodbSerializer(), mongoOperations, javaType,
mongoOperations.getCollectionName(javaType));
}
/**
* Applies the given {@link Sort} to the given {@link ReactiveSpringDataMongodbQuery}.
*
* @param query
* @param sort
* @return
*/
private ReactiveSpringDataMongodbQuery<T> applySorting(ReactiveSpringDataMongodbQuery<T> query, Sort sort) {
toOrderSpecifiers(sort).forEach(query::orderBy);
return query;
}
}
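Usage sketch for the reactive executor; QPerson/Person are hypothetical and the executor is assumed to be already constructed:

import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

class ReactiveExecutorUsageSketch {
    Flux<Person> findMatthews(ReactiveQuerydslMongoPredicateExecutor<Person> executor) {
        return executor.findAll(QPerson.person.lastname.eq("Matthews"), QPerson.person.firstname.asc());
    }

    Mono<Boolean> anyMatthews(ReactiveQuerydslMongoPredicateExecutor<Person> executor) {
        return executor.exists(QPerson.person.lastname.eq("Matthews"));
    }
}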

View File

@@ -1,282 +0,0 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.support;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
import org.springframework.data.mongodb.core.ReactiveFindOperation.FindWithProjection;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.lang.Nullable;
import org.springframework.util.LinkedMultiValueMap;
import org.springframework.util.MultiValueMap;
import org.springframework.util.StringUtils;
import com.querydsl.core.JoinExpression;
import com.querydsl.core.QueryMetadata;
import com.querydsl.core.QueryModifiers;
import com.querydsl.core.types.Expression;
import com.querydsl.core.types.ExpressionUtils;
import com.querydsl.core.types.Operation;
import com.querydsl.core.types.OrderSpecifier;
import com.querydsl.core.types.Path;
import com.querydsl.core.types.Predicate;
import com.querydsl.core.types.dsl.CollectionPathBase;
/**
* MongoDB query utilizing {@link ReactiveMongoOperations} for command execution.
*
* @param <K> result type
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.2
*/
class ReactiveSpringDataMongodbQuery<K> extends QuerydslAbstractMongodbQuery<K, ReactiveSpringDataMongodbQuery<K>> {
private final Class<K> entityClass;
private final ReactiveMongoOperations mongoOperations;
private final FindWithProjection<K> find;
ReactiveSpringDataMongodbQuery(ReactiveMongoOperations mongoOperations, Class<? extends K> entityClass) {
this(new SpringDataMongodbSerializer(mongoOperations.getConverter()), mongoOperations, entityClass, null);
}
ReactiveSpringDataMongodbQuery(MongodbDocumentSerializer serializer, ReactiveMongoOperations mongoOperations,
Class<? extends K> entityClass, @Nullable String collection) {
super(serializer);
this.entityClass = (Class<K>) entityClass;
this.mongoOperations = mongoOperations;
this.find = StringUtils.hasText(collection) ? mongoOperations.query(this.entityClass).inCollection(collection)
: mongoOperations.query(this.entityClass);
}
/**
* Fetch all matching query results.
*
* @return {@link Flux} emitting all query results or {@link Flux#empty()} if there are none.
*/
Flux<K> fetch() {
return createQuery().flatMapMany(it -> find.matching(it).all());
}
/**
* Fetch the first matching query result.
*
* @return {@link Mono} emitting the first query result or {@link Mono#empty()} if there are none.
* @throws org.springframework.dao.IncorrectResultSizeDataAccessException if more than one match found.
*/
Mono<K> fetchOne() {
return createQuery().flatMap(it -> find.matching(it).one());
}
/**
* Fetch the count of matching query results.
*
* @return {@link Mono} emitting the number of matching documents. Always emits a count, even if it is zero.
*/
Mono<Long> fetchCount() {
return createQuery().flatMap(it -> find.matching(it).count());
}
/**
* Define a join.
*
* @param ref reference
* @param target join target
* @return new instance of {@link QuerydslJoinBuilder}.
*/
<T> QuerydslJoinBuilder<ReactiveSpringDataMongodbQuery<K>, K, T> join(Path<T> ref, Path<T> target) {
return new QuerydslJoinBuilder<>(getQueryMixin(), ref, target);
}
/**
* Define a join.
*
* @param ref reference
* @param target join target
* @return new instance of {@link QuerydslJoinBuilder}.
*/
<T> QuerydslJoinBuilder<ReactiveSpringDataMongodbQuery<K>, K, T> join(CollectionPathBase<?, T, ?> ref,
Path<T> target) {
return new QuerydslJoinBuilder<>(getQueryMixin(), ref, target);
}
/**
* Define a constraint for an embedded object.
*
* @param collection collection must not be {@literal null}.
* @param target target must not be {@literal null}.
* @return new instance of {@link QuerydslAnyEmbeddedBuilder}.
*/
<T> QuerydslAnyEmbeddedBuilder<ReactiveSpringDataMongodbQuery<K>, K> anyEmbedded(
Path<? extends Collection<T>> collection, Path<T> target) {
return new QuerydslAnyEmbeddedBuilder<>(getQueryMixin(), collection);
}
protected Mono<Query> createQuery() {
QueryMetadata metadata = getQueryMixin().getMetadata();
return createQuery(createFilter(metadata), metadata.getProjection(), metadata.getModifiers(),
metadata.getOrderBy());
}
/**
* Creates a MongoDB query that is emitted through a {@link Mono}, given a {@link Mono} of {@link Predicate}.
*
* @param filter must not be {@literal null}.
* @param projection can be {@literal null} if no projection is given. Query requests all fields in such case.
* @param modifiers must not be {@literal null}.
* @param orderBy must not be {@literal null}.
* @return {@link Mono} emitting the {@link Query}.
*/
protected Mono<Query> createQuery(Mono<Predicate> filter, @Nullable Expression<?> projection,
QueryModifiers modifiers, List<OrderSpecifier<?>> orderBy) {
return filter.map(this::createQuery) //
.defaultIfEmpty(createQuery(null)) //
.map(it -> {
BasicQuery basicQuery = new BasicQuery(it, createProjection(projection));
Integer limit = modifiers.getLimitAsInteger();
Integer offset = modifiers.getOffsetAsInteger();
if (limit != null) {
basicQuery.limit(limit);
}
if (offset != null) {
basicQuery.skip(offset);
}
if (orderBy.size() > 0) {
basicQuery.setSortObject(createSort(orderBy));
}
return basicQuery;
});
}
protected Mono<Predicate> createFilter(QueryMetadata metadata) {
if (!metadata.getJoins().isEmpty()) {
return createJoinFilter(metadata).map(it -> ExpressionUtils.allOf(metadata.getWhere(), it))
.switchIfEmpty(Mono.justOrEmpty(metadata.getWhere()));
}
return Mono.justOrEmpty(metadata.getWhere());
}
/**
* Creates a join filter by querying {@link com.mongodb.DBRef references}.
*
* @param metadata
* @return
*/
@SuppressWarnings("unchecked")
protected Mono<Predicate> createJoinFilter(QueryMetadata metadata) {
MultiValueMap<Expression<?>, Mono<Predicate>> predicates = new LinkedMultiValueMap<>();
List<JoinExpression> joins = metadata.getJoins();
for (int i = joins.size() - 1; i >= 0; i--) {
JoinExpression join = joins.get(i);
Path<?> source = (Path) ((Operation<?>) join.getTarget()).getArg(0);
Path<?> target = (Path) ((Operation<?>) join.getTarget()).getArg(1);
Collection<Mono<Predicate>> extraFilters = predicates.get(target.getRoot());
Mono<Predicate> filter = allOf(extraFilters).map(it -> ExpressionUtils.allOf(join.getCondition(), it))
.switchIfEmpty(Mono.justOrEmpty(join.getCondition()));
Mono<Predicate> predicate = getIds(target.getType(), filter) //
.collectList() //
.handle((it, sink) -> {
if (it.isEmpty()) {
sink.error(new NoMatchException(source));
return;
}
Path<?> path = ExpressionUtils.path(String.class, source, "$id");
sink.next(ExpressionUtils.in((Path<Object>) path, it));
});
predicates.add(source.getRoot(), predicate);
}
Path<?> source = (Path) ((Operation) joins.get(0).getTarget()).getArg(0);
return allOf(predicates.get(source.getRoot())).onErrorResume(NoMatchException.class,
e -> Mono.just(ExpressionUtils.predicate(QuerydslMongoOps.NO_MATCH, e.source)));
}
private Mono<Predicate> allOf(@Nullable Collection<Mono<Predicate>> predicates) {
return predicates != null ? Flux.concat(predicates).collectList().map(ExpressionUtils::allOf) : Mono.empty();
}
/**
* Fetch the list of ids matching a given condition.
*
* @param targetType must not be {@literal null}.
* @param condition must not be {@literal null}.
* @return empty {@link Flux} if none found.
*/
protected Flux<Object> getIds(Class<?> targetType, Mono<Predicate> condition) {
return condition.flatMapMany(it -> getIds(targetType, it))
.switchIfEmpty(Flux.defer(() -> getIds(targetType, (Predicate) null)));
}
/**
* Fetch the list of ids matching a given condition.
*
* @param targetType must not be {@literal null}.
* @param condition can be {@literal null}.
* @return empty {@link Flux} if none found.
*/
protected Flux<Object> getIds(Class<?> targetType, @Nullable Predicate condition) {
return createQuery(Mono.justOrEmpty(condition), null, QueryModifiers.EMPTY, Collections.emptyList())
.flatMapMany(query -> mongoOperations.findDistinct(query, "_id", targetType, Object.class));
}
/**
* Marker exception to indicate no matches for a query using reference ids.
*/
static class NoMatchException extends RuntimeException {
final Path<?> source;
NoMatchException(Path<?> source) {
this.source = source;
}
@Override
public synchronized Throwable fillInStackTrace() {
return null;
}
}
}
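The join filter above resolves DBRef joins in two steps; a rough equivalent expressed with plain Query/Criteria — the "supervisor" property and Person type are hypothetical:

import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import reactor.core.publisher.Mono;

class JoinFilterSketch {
    // 1) collect the _ids of join targets matching the condition,
    // 2) restrict the source documents to DBRefs pointing at those ids.
    Mono<Query> joinFilter(ReactiveMongoOperations operations) {
        return operations
                .findDistinct(new Query(Criteria.where("lastname").is("Matthews")), "_id", Person.class, Object.class)
                .collectList()
                .map(ids -> new Query(Criteria.where("supervisor.$id").in(ids)));
    }
}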

View File

@@ -23,7 +23,6 @@ import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
import org.springframework.dao.OptimisticLockingFailureException;
import org.springframework.data.domain.Example;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageImpl;
@@ -41,8 +40,6 @@ import org.springframework.data.util.Streamable;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.client.result.DeleteResult;
/**
* Repository base implementation for Mongo.
*
@@ -164,14 +161,7 @@ public class SimpleMongoRepository<T, ID> implements MongoRepository<T, ID> {
Assert.notNull(entity, "The given entity must not be null!");
DeleteResult deleteResult = mongoOperations.remove(entity, entityInformation.getCollectionName());
if (entityInformation.isVersioned() && deleteResult.wasAcknowledged() && deleteResult.getDeletedCount() == 0) {
throw new OptimisticLockingFailureException(String.format(
"The entity with id %s with version %s in %s cannot be deleted! Was it modified or deleted in the meantime?",
entityInformation.getId(entity), entityInformation.getVersion(entity),
entityInformation.getCollectionName()));
}
deleteById(entityInformation.getRequiredId(entity));
}
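For context, the optimistic-locking guard in this hunk turns an acknowledged remove that deletes nothing into a failure for versioned entities; illustratively, with a hypothetical repository and the versioned Person entity from above:

import org.springframework.data.mongodb.repository.MongoRepository;

class VersionedDeleteSketch {
    void deleteStale(MongoRepository<Person, String> repository) {
        Person person = repository.findById("42").orElseThrow(IllegalStateException::new);
        // ... meanwhile another writer updates the document, bumping its @Version ...
        repository.delete(person); // acknowledged remove deletes nothing -> OptimisticLockingFailureException
    }
}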
/*

View File

@@ -28,7 +28,6 @@ import java.util.stream.Collectors;
import org.reactivestreams.Publisher;
import org.springframework.dao.IncorrectResultSizeDataAccessException;
import org.springframework.dao.OptimisticLockingFailureException;
import org.springframework.data.domain.Example;
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
@@ -40,8 +39,6 @@ import org.springframework.data.util.StreamUtils;
import org.springframework.data.util.Streamable;
import org.springframework.util.Assert;
import com.mongodb.client.result.DeleteResult;
/**
* Reactive repository base implementation for Mongo.
*
@@ -358,24 +355,7 @@ public class SimpleReactiveMongoRepository<T, ID extends Serializable> implement
Assert.notNull(entity, "The given entity must not be null!");
Mono<DeleteResult> remove = mongoOperations.remove(entity, entityInformation.getCollectionName());
if (entityInformation.isVersioned()) {
remove = remove.handle((deleteResult, sink) -> {
if (deleteResult.wasAcknowledged() && deleteResult.getDeletedCount() == 0) {
sink.error(new OptimisticLockingFailureException(String.format(
"The entity with id %s with version %s in %s cannot be deleted! Was it modified or deleted in the meantime?",
entityInformation.getId(entity), entityInformation.getVersion(entity),
entityInformation.getCollectionName())));
} else {
sink.next(deleteResult);
}
});
}
return remove.then();
return deleteById(entityInformation.getRequiredId(entity));
}
/*
@@ -387,7 +367,7 @@ public class SimpleReactiveMongoRepository<T, ID extends Serializable> implement
Assert.notNull(entities, "The given Iterable of entities must not be null!");
return Flux.fromIterable(entities).flatMap(this::delete).then();
return Flux.fromIterable(entities).flatMap(entity -> deleteById(entityInformation.getRequiredId(entity))).then();
}
/*

View File

@@ -120,9 +120,28 @@ class SpringDataMongodbSerializer extends MongodbDocumentSerializer {
value = value instanceof Optional ? ((Optional) value).orElse(null) : value;
if (ID_KEY.equals(key) || (key != null && key.endsWith("." + ID_KEY))) {
return convertId(key, value);
}
return super.asDocument(key, value instanceof Pattern ? value : converter.convertToMongoType(value));
}
/**
* Converts a value that is already known to be an {@literal id} (or a nested document id) into the corresponding id
* representation, following the conversion rules of {@link QueryMapper#convertId(Object)}.
*
* @param key the property path to the given value.
* @param idValue the raw {@literal id} value.
* @return the {@literal id} representation in the required format.
*/
private Document convertId(String key, Object idValue) {
Object convertedId = mapper.convertId(idValue);
return mapper.getMappedObject(super.asDocument(key, convertedId), Optional.empty());
}
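The id conversion ultimately hinges on recognizing well-formed ObjectId strings; a simplified sketch of that core rule (QueryMapper.convertId(..) applies further conversions; the literal is hypothetical):

import org.bson.types.ObjectId;

class IdConversionSketch {
    public static void main(String[] args) {
        String raw = "5c4ee8f23749ad4e2d26ac12"; // hypothetical 24-char hex id
        Object converted = ObjectId.isValid(raw) ? new ObjectId(raw) : raw;
        System.out.println(converted.getClass().getSimpleName()); // ObjectId
    }
}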
/*
* (non-Javadoc)
* @see com.querydsl.mongodb.MongodbSerializer#isReference(com.querydsl.core.types.Path)

View File

@@ -1,170 +0,0 @@
/*
* Copyright 2008-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.util.json;
import static java.time.format.DateTimeFormatter.*;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.time.format.DateTimeParseException;
import java.time.temporal.TemporalAccessor;
import java.time.temporal.TemporalQuery;
import java.util.Calendar;
import java.util.TimeZone;
/**
* DateTimeFormatter implementation borrowed from <a href=
* "https://github.com/mongodb/mongo-java-driver/blob/master/bson/src/main/org/bson/json/DateTimeFormatter.java">MongoDB
* Inc.</a> licensed under the Apache License, Version 2.0. <br />
* Formatted and modified.
*
* @author Jeff Yemin
* @author Ross Lawley
* @since 2.2
*/
class DateTimeFormatter {
private static final FormatterImpl FORMATTER_IMPL;
static {
FormatterImpl dateTimeHelper;
try {
dateTimeHelper = loadDateTimeFormatter("org.bson.json.DateTimeFormatter$Java8DateTimeFormatter");
} catch (LinkageError e) {
// this is expected if running on a release prior to Java 8: fallback to JAXB.
dateTimeHelper = loadDateTimeFormatter("org.bson.json.DateTimeFormatter$JaxbDateTimeFormatter");
}
FORMATTER_IMPL = dateTimeHelper;
}
private static FormatterImpl loadDateTimeFormatter(final String className) {
try {
return (FormatterImpl) Class.forName(className).getDeclaredConstructor().newInstance();
} catch (ClassNotFoundException e) {
// this is unexpected as it means the class itself is not found
throw new ExceptionInInitializerError(e);
} catch (InstantiationException e) {
// this is unexpected as it means the class can't be instantiated
throw new ExceptionInInitializerError(e);
} catch (IllegalAccessException e) {
// this is unexpected as it means the no-args constructor isn't accessible
throw new ExceptionInInitializerError(e);
} catch (NoSuchMethodException e) {
throw new ExceptionInInitializerError(e);
} catch (InvocationTargetException e) {
throw new ExceptionInInitializerError(e);
}
}
static long parse(final String dateTimeString) {
return FORMATTER_IMPL.parse(dateTimeString);
}
static String format(final long dateTime) {
return FORMATTER_IMPL.format(dateTime);
}
private interface FormatterImpl {
long parse(String dateTimeString);
String format(long dateTime);
}
// Reflective use of DatatypeConverter avoids a compile-time dependency on the java.xml.bind module in Java 9
static class JaxbDateTimeFormatter implements FormatterImpl {
private static final Method DATATYPE_CONVERTER_PARSE_DATE_TIME_METHOD;
private static final Method DATATYPE_CONVERTER_PRINT_DATE_TIME_METHOD;
static {
try {
DATATYPE_CONVERTER_PARSE_DATE_TIME_METHOD = Class.forName("javax.xml.bind.DatatypeConverter")
.getDeclaredMethod("parseDateTime", String.class);
DATATYPE_CONVERTER_PRINT_DATE_TIME_METHOD = Class.forName("javax.xml.bind.DatatypeConverter")
.getDeclaredMethod("printDateTime", Calendar.class);
} catch (NoSuchMethodException e) {
throw new ExceptionInInitializerError(e);
} catch (ClassNotFoundException e) {
throw new ExceptionInInitializerError(e);
}
}
@Override
public long parse(final String dateTimeString) {
try {
return ((Calendar) DATATYPE_CONVERTER_PARSE_DATE_TIME_METHOD.invoke(null, dateTimeString)).getTimeInMillis();
} catch (IllegalAccessException e) {
throw new IllegalStateException(e);
} catch (InvocationTargetException e) {
throw (RuntimeException) e.getCause();
}
}
@Override
public String format(final long dateTime) {
Calendar calendar = Calendar.getInstance();
calendar.setTimeInMillis(dateTime);
calendar.setTimeZone(TimeZone.getTimeZone("Z"));
try {
return (String) DATATYPE_CONVERTER_PRINT_DATE_TIME_METHOD.invoke(null, calendar);
} catch (IllegalAccessException e) {
throw new IllegalStateException(e);
} catch (InvocationTargetException e) {
throw (RuntimeException) e.getCause();
}
}
}
static class Java8DateTimeFormatter implements FormatterImpl {
// If running on Java 8 or above, java.time.format.DateTimeFormatter is available and initialization succeeds;
// otherwise it fails.
static {
try {
Class.forName("java.time.format.DateTimeFormatter");
} catch (ClassNotFoundException e) {
throw new ExceptionInInitializerError(e);
}
}
@Override
public long parse(final String dateTimeString) {
try {
return ISO_OFFSET_DATE_TIME.parse(dateTimeString, new TemporalQuery<Instant>() {
@Override
public Instant queryFrom(final TemporalAccessor temporal) {
return Instant.from(temporal);
}
}).toEpochMilli();
} catch (DateTimeParseException e) {
throw new IllegalArgumentException(e.getMessage());
}
}
@Override
public String format(final long dateTime) {
return ZonedDateTime.ofInstant(Instant.ofEpochMilli(dateTime), ZoneId.of("Z")).format(ISO_OFFSET_DATE_TIME);
}
}
private DateTimeFormatter() {}
}
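What the Java 8 code path above computes, condensed into a runnable sketch (sample timestamp chosen arbitrarily):

import static java.time.format.DateTimeFormatter.ISO_OFFSET_DATE_TIME;

import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

class DateTimeFormatterSketch {
    public static void main(String[] args) {
        long millis = ISO_OFFSET_DATE_TIME.parse("2019-02-13T09:56:38+01:00", Instant::from).toEpochMilli();
        String text = ZonedDateTime.ofInstant(Instant.ofEpochMilli(millis), ZoneId.of("Z"))
                .format(ISO_OFFSET_DATE_TIME);
        System.out.println(millis + " -> " + text); // 1550048198000 -> 2019-02-13T08:56:38Z
    }
}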

View File

@@ -1,73 +0,0 @@
/*
* Copyright 2008-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.util.json;
import org.bson.json.JsonParseException;
/**
* JsonBuffer implementation borrowed from <a href=
* "https://github.com/mongodb/mongo-java-driver/blob/master/bson/src/main/org/bson/json/JsonBuffer.java">MongoDB
* Inc.</a> licensed under the Apache License, Version 2.0. <br />
* Formatted and modified.
*
* @author Jeff Yemin
* @author Ross Lawley
* @since 2.2
*/
class JsonBuffer {
private final String buffer;
private int position;
private boolean eof;
JsonBuffer(final String buffer) {
this.buffer = buffer;
}
public int getPosition() {
return position;
}
public void setPosition(final int position) {
this.position = position;
}
public int read() {
if (eof) {
throw new JsonParseException("Trying to read past EOF.");
} else if (position >= buffer.length()) {
eof = true;
return -1;
} else {
return buffer.charAt(position++);
}
}
public void unread(final int c) {
eof = false;
if (c != -1 && buffer.charAt(position - 1) == c) {
position--;
}
}
public String substring(final int beginIndex) {
return buffer.substring(beginIndex);
}
public String substring(final int beginIndex, final int endIndex) {
return buffer.substring(beginIndex, endIndex);
}
}
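A short sketch of the buffer contract, using the package-private type from this diff (same package assumed):

class JsonBufferSketch {
    static void demo() {
        JsonBuffer buffer = new JsonBuffer("{}");
        int first = buffer.read();   // '{'
        int second = buffer.read();  // '}'
        buffer.unread(second);       // step back: the scanner uses this for one-character lookahead
        int again = buffer.read();   // '}' once more
        int eof = buffer.read();     // -1; one further read() would throw JsonParseException
    }
}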

View File

@@ -1,623 +0,0 @@
/*
* Copyright 2008-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.util.json;
import org.bson.BsonRegularExpression;
import org.bson.json.JsonParseException;
/**
* Parses the string representation of a JSON object into a set of {@link JsonToken}-derived objects. <br />
* JsonScanner implementation borrowed from <a href=
* "https://github.com/mongodb/mongo-java-driver/blob/master/bson/src/main/org/bson/json/JsonScanner.java">MongoDB
* Inc.</a> licensed under the Apache License, Version 2.0. <br />
* Formatted and modified to allow reading Spring Data specific placeholder values.
*
* @author Jeff Yemin
* @author Trisha Gee
* @author Robert Guo
* @author Ross Lawley
* @author Christoph Strobl
* @since 2.2
*/
class JsonScanner {
private final JsonBuffer buffer;
JsonScanner(final String json) {
this(new JsonBuffer(json));
}
JsonScanner(final JsonBuffer buffer) {
this.buffer = buffer;
}
/**
* @param newPosition the new position of the cursor position in the buffer
*/
public void setBufferPosition(final int newPosition) {
buffer.setPosition(newPosition);
}
/**
* @return the current location of the cursor in the buffer
*/
public int getBufferPosition() {
return buffer.getPosition();
}
/**
* Finds and returns the next complete token from this scanner. If the scanner has reached the end of the source, it
* returns a token of type {@code JsonTokenType.END_OF_FILE}.
*
* @return The next token.
* @throws JsonParseException if source is invalid.
*/
public JsonToken nextToken() {
int c = buffer.read();
while (c != -1 && Character.isWhitespace(c)) {
c = buffer.read();
}
if (c == -1) {
return new JsonToken(JsonTokenType.END_OF_FILE, "<eof>");
}
switch (c) {
case '{':
return new JsonToken(JsonTokenType.BEGIN_OBJECT, "{");
case '}':
return new JsonToken(JsonTokenType.END_OBJECT, "}");
case '[':
return new JsonToken(JsonTokenType.BEGIN_ARRAY, "[");
case ']':
return new JsonToken(JsonTokenType.END_ARRAY, "]");
case '(':
return new JsonToken(JsonTokenType.LEFT_PAREN, "(");
case ')':
return new JsonToken(JsonTokenType.RIGHT_PAREN, ")");
case ':':
c = buffer.read();
buffer.unread(c);
if (c == '#') { // for SQL-style bindings like ':#{#firstname}'
return scanBindString();
}
return new JsonToken(JsonTokenType.COLON, ":");
case ',':
return new JsonToken(JsonTokenType.COMMA, ",");
case '\'':
case '"':
return scanString((char) c);
case '/':
return scanRegularExpression();
default:
if (c == '-' || Character.isDigit(c)) {
return scanNumber((char) c);
} else if (c == '$' || c == '_' || Character.isLetter(c)) {
return scanUnquotedString();
} else if (c == '?') { // for binding parameters. Both simple and SpEL ones.
return scanBindString();
} else {
int position = buffer.getPosition();
buffer.unread(c);
throw new JsonParseException("Invalid JSON input. Position: %d. Character: '%c'.", position, c);
}
}
}
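Illustrative token walk for a query carrying a SpEL binding (same package assumed, since the scanner is package-private):

class JsonScannerSketch {
    static void demo() {
        JsonScanner scanner = new JsonScanner("{ 'lastname' : :#{#name} }");
        scanner.nextToken(); // BEGIN_OBJECT '{'
        scanner.nextToken(); // STRING 'lastname'
        scanner.nextToken(); // COLON ':' (the one-character lookahead sees a space, so it is a plain colon)
        // for the second ':' the lookahead sees '#' and delegates to scanBindString
        scanner.nextToken(); // UNQUOTED_STRING covering the ':#{#name}' binding
        scanner.nextToken(); // END_OBJECT '}'
        scanner.nextToken(); // END_OF_FILE
    }
}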
/**
* Reads {@code RegularExpressionToken} from source. The following variants of lexemes are possible:
*
* <pre>
* /pattern/
* /\(pattern\)/
* /pattern/ims
* </pre>
*
* Options can include 'i','m','x','s'
*
* @return The regular expression token.
* @throws JsonParseException if regular expression representation is not valid.
*/
private JsonToken scanRegularExpression() {
int start = buffer.getPosition() - 1;
int options = -1;
RegularExpressionState state = RegularExpressionState.IN_PATTERN;
while (true) {
int c = buffer.read();
switch (state) {
case IN_PATTERN:
switch (c) {
case -1:
state = RegularExpressionState.INVALID;
break;
case '/':
state = RegularExpressionState.IN_OPTIONS;
options = buffer.getPosition();
break;
case '\\':
state = RegularExpressionState.IN_ESCAPE_SEQUENCE;
break;
default:
state = RegularExpressionState.IN_PATTERN;
break;
}
break;
case IN_ESCAPE_SEQUENCE:
state = RegularExpressionState.IN_PATTERN;
break;
case IN_OPTIONS:
switch (c) {
case 'i':
case 'm':
case 'x':
case 's':
state = RegularExpressionState.IN_OPTIONS;
break;
case ',':
case '}':
case ']':
case ')':
case -1:
state = RegularExpressionState.DONE;
break;
default:
if (Character.isWhitespace(c)) {
state = RegularExpressionState.DONE;
} else {
state = RegularExpressionState.INVALID;
}
break;
}
break;
default:
break;
}
switch (state) {
case DONE:
buffer.unread(c);
int end = buffer.getPosition();
BsonRegularExpression regex = new BsonRegularExpression(buffer.substring(start + 1, options - 1),
buffer.substring(options, end));
return new JsonToken(JsonTokenType.REGULAR_EXPRESSION, regex);
case INVALID:
throw new JsonParseException("Invalid JSON regular expression. Position: %d.", buffer.getPosition());
default:
}
}
}
/**
* Reads {@code StringToken} from source.
*
* @return The string token.
*/
private JsonToken scanBindString() {
int start = buffer.getPosition() - 1;
int c = buffer.read();
int charCount = 0;
boolean isExpression = false;
int parenthesisCount = 0;
while (c == '$' || c == '_' || Character.isLetterOrDigit(c) || c == '#' || c == '{' || c == '[' || c == ']'
|| (isExpression && isExpressionAllowedChar(c))) {
if (charCount == 0 && c == '#') {
isExpression = true;
} else if (isExpression) {
if (c == '{') {
parenthesisCount++;
} else if (c == '}') {
parenthesisCount--;
if (parenthesisCount == 0) {
buffer.read();
break;
}
}
}
charCount++;
c = buffer.read();
}
buffer.unread(c);
String lexeme = buffer.substring(start, buffer.getPosition());
return new JsonToken(JsonTokenType.UNQUOTED_STRING, lexeme);
}
private static boolean isExpressionAllowedChar(int c) {
return (c == '+' || //
c == '-' || //
c == ':' || //
c == '.' || //
c == ',' || //
c == '*' || //
c == '/' || //
c == '%' || //
c == '(' || //
c == ')' || //
c == '[' || //
c == ']' || //
c == '#' || //
c == '{' || //
c == '}' || //
c == '@' || //
c == '^' || //
c == '!' || //
c == '=' || //
c == '&' || //
c == '|' || //
c == '?' || //
c == '$' || //
c == '>' || //
c == '<' || //
c == '"' || //
c == '\'' || //
c == ' ');
}
/**
* Reads an unquoted {@code StringToken} from source.
*
* @return The unquoted string token.
*/
private JsonToken scanUnquotedString() {
int start = buffer.getPosition() - 1;
int c = buffer.read();
while (c == '$' || c == '_' || Character.isLetterOrDigit(c)) {
c = buffer.read();
}
buffer.unread(c);
String lexeme = buffer.substring(start, buffer.getPosition());
return new JsonToken(JsonTokenType.UNQUOTED_STRING, lexeme);
}
/**
* Reads number token from source. The following variants of lexemes are possible:
*
* <pre>
* 12
* 123
* -0
* -345
* -0.0
* 0e1
* 0e-1
* -0e-1
* 1e12
* -Infinity
* </pre>
*
* @return The number token.
* @throws JsonParseException if number representation is invalid.
*/
// CHECKSTYLE:OFF
private JsonToken scanNumber(final char firstChar) {
int c = firstChar;
int start = buffer.getPosition() - 1;
NumberState state;
switch (c) {
case '-':
state = NumberState.SAW_LEADING_MINUS;
break;
case '0':
state = NumberState.SAW_LEADING_ZERO;
break;
default:
state = NumberState.SAW_INTEGER_DIGITS;
break;
}
JsonTokenType type = JsonTokenType.INT64;
while (true) {
c = buffer.read();
switch (state) {
case SAW_LEADING_MINUS:
switch (c) {
case '0':
state = NumberState.SAW_LEADING_ZERO;
break;
case 'I':
state = NumberState.SAW_MINUS_I;
break;
default:
if (Character.isDigit(c)) {
state = NumberState.SAW_INTEGER_DIGITS;
} else {
state = NumberState.INVALID;
}
break;
}
break;
case SAW_LEADING_ZERO:
switch (c) {
case '.':
state = NumberState.SAW_DECIMAL_POINT;
break;
case 'e':
case 'E':
state = NumberState.SAW_EXPONENT_LETTER;
break;
case ',':
case '}':
case ']':
case ')':
case -1:
state = NumberState.DONE;
break;
default:
if (Character.isDigit(c)) {
state = NumberState.SAW_INTEGER_DIGITS;
} else if (Character.isWhitespace(c)) {
state = NumberState.DONE;
} else {
state = NumberState.INVALID;
}
break;
}
break;
case SAW_INTEGER_DIGITS:
switch (c) {
case '.':
state = NumberState.SAW_DECIMAL_POINT;
break;
case 'e':
case 'E':
state = NumberState.SAW_EXPONENT_LETTER;
break;
case ',':
case '}':
case ']':
case ')':
case -1:
state = NumberState.DONE;
break;
default:
if (Character.isDigit(c)) {
state = NumberState.SAW_INTEGER_DIGITS;
} else if (Character.isWhitespace(c)) {
state = NumberState.DONE;
} else {
state = NumberState.INVALID;
}
break;
}
break;
case SAW_DECIMAL_POINT:
type = JsonTokenType.DOUBLE;
if (Character.isDigit(c)) {
state = NumberState.SAW_FRACTION_DIGITS;
} else {
state = NumberState.INVALID;
}
break;
case SAW_FRACTION_DIGITS:
switch (c) {
case 'e':
case 'E':
state = NumberState.SAW_EXPONENT_LETTER;
break;
case ',':
case '}':
case ']':
case ')':
case -1:
state = NumberState.DONE;
break;
default:
if (Character.isDigit(c)) {
state = NumberState.SAW_FRACTION_DIGITS;
} else if (Character.isWhitespace(c)) {
state = NumberState.DONE;
} else {
state = NumberState.INVALID;
}
break;
}
break;
case SAW_EXPONENT_LETTER:
type = JsonTokenType.DOUBLE;
switch (c) {
case '+':
case '-':
state = NumberState.SAW_EXPONENT_SIGN;
break;
default:
if (Character.isDigit(c)) {
state = NumberState.SAW_EXPONENT_DIGITS;
} else {
state = NumberState.INVALID;
}
break;
}
break;
case SAW_EXPONENT_SIGN:
if (Character.isDigit(c)) {
state = NumberState.SAW_EXPONENT_DIGITS;
} else {
state = NumberState.INVALID;
}
break;
case SAW_EXPONENT_DIGITS:
switch (c) {
case ',':
case '}':
case ']':
case ')':
state = NumberState.DONE;
break;
default:
if (Character.isDigit(c)) {
state = NumberState.SAW_EXPONENT_DIGITS;
} else if (Character.isWhitespace(c)) {
state = NumberState.DONE;
} else {
state = NumberState.INVALID;
}
break;
}
break;
case SAW_MINUS_I:
boolean sawMinusInfinity = true;
char[] nfinity = new char[] { 'n', 'f', 'i', 'n', 'i', 't', 'y' };
for (int i = 0; i < nfinity.length; i++) {
if (c != nfinity[i]) {
sawMinusInfinity = false;
break;
}
c = buffer.read();
}
if (sawMinusInfinity) {
type = JsonTokenType.DOUBLE;
switch (c) {
case ',':
case '}':
case ']':
case ')':
case -1:
state = NumberState.DONE;
break;
default:
if (Character.isWhitespace(c)) {
state = NumberState.DONE;
} else {
state = NumberState.INVALID;
}
break;
}
} else {
state = NumberState.INVALID;
}
break;
default:
}
switch (state) {
case INVALID:
throw new JsonParseException("Invalid JSON number");
case DONE:
buffer.unread(c);
String lexeme = buffer.substring(start, buffer.getPosition());
if (type == JsonTokenType.DOUBLE) {
return new JsonToken(JsonTokenType.DOUBLE, Double.parseDouble(lexeme));
} else {
long value = Long.parseLong(lexeme);
if (value < Integer.MIN_VALUE || value > Integer.MAX_VALUE) {
return new JsonToken(JsonTokenType.INT64, value);
} else {
return new JsonToken(JsonTokenType.INT32, (int) value);
}
}
default:
}
}
}
// CHECKSTYLE:ON
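// Illustrative sketch, not part of the original source: the state machine only ever
// widens the token type. "42" stays numeric until the final range check returns it as
// INT32, "5000000000" exceeds Integer.MAX_VALUE and is returned as INT64, and "-0.5",
// "1e3" and "-Infinity" all end up as DOUBLE.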
/**
* Reads {@code StringToken} from source.
*
* @return The string token.
*/
// CHECKSTYLE:OFF
private JsonToken scanString(final char quoteCharacter) {
StringBuilder sb = new StringBuilder();
while (true) {
int c = buffer.read();
switch (c) {
case '\\':
c = buffer.read();
switch (c) {
case '\'':
sb.append('\'');
break;
case '"':
sb.append('"');
break;
case '\\':
sb.append('\\');
break;
case '/':
sb.append('/');
break;
case 'b':
sb.append('\b');
break;
case 'f':
sb.append('\f');
break;
case 'n':
sb.append('\n');
break;
case 'r':
sb.append('\r');
break;
case 't':
sb.append('\t');
break;
case 'u':
int u1 = buffer.read();
int u2 = buffer.read();
int u3 = buffer.read();
int u4 = buffer.read();
if (u4 != -1) {
String hex = new String(new char[] { (char) u1, (char) u2, (char) u3, (char) u4 });
sb.append((char) Integer.parseInt(hex, 16));
}
break;
default:
throw new JsonParseException("Invalid escape sequence in JSON string '\\%c'.", c);
}
break;
default:
if (c == quoteCharacter) {
return new JsonToken(JsonTokenType.STRING, sb.toString());
}
if (c != -1) {
sb.append((char) c);
}
}
if (c == -1) {
throw new JsonParseException("End of file in JSON string.");
}
}
}
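// Illustrative sketch, not part of the original source: both quote styles are accepted,
// so "it's" and 'it\'s' decode to the same value, and a unicode escape such as "\u0041"
// is read as four hex digits and appended as the single character 'A'.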
private enum NumberState {
SAW_LEADING_MINUS, SAW_LEADING_ZERO, SAW_INTEGER_DIGITS, SAW_DECIMAL_POINT, SAW_FRACTION_DIGITS, SAW_EXPONENT_LETTER, SAW_EXPONENT_SIGN, SAW_EXPONENT_DIGITS, SAW_MINUS_I, DONE, INVALID
}
private enum RegularExpressionState {
IN_PATTERN, IN_ESCAPE_SEQUENCE, IN_OPTIONS, DONE, INVALID
}
}


@@ -1,86 +0,0 @@
/*
* Copyright 2008-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.util.json;
import static java.lang.String.*;
import org.bson.BsonDouble;
import org.bson.json.JsonParseException;
import org.bson.types.Decimal128;
/**
* JsonToken implementation borrowed from <a href=
* "https://github.com/mongodb/mongo-java-driver/blob/master/bson/src/main/org/bson/json/JsonToken.java">MongoDB
* Inc.</a> licensed under the Apache License, Version 2.0. <br />
*
* @author Jeff Yemin
* @author Ross Lawley
* @since 2.2
*/
class JsonToken {
private final Object value;
private final JsonTokenType type;
JsonToken(final JsonTokenType type, final Object value) {
this.value = value;
this.type = type;
}
Object getValue() {
return value;
}
<T> T getValue(final Class<T> clazz) {
try {
if (Long.class == clazz) {
if (value instanceof Integer) {
return clazz.cast(((Integer) value).longValue());
} else if (value instanceof String) {
return clazz.cast(Long.valueOf((String) value));
}
} else if (Integer.class == clazz) {
if (value instanceof String) {
return clazz.cast(Integer.valueOf((String) value));
}
} else if (Double.class == clazz) {
if (value instanceof String) {
return clazz.cast(Double.valueOf((String) value));
}
} else if (Decimal128.class == clazz) {
if (value instanceof Integer) {
return clazz.cast(new Decimal128((Integer) value));
} else if (value instanceof Long) {
return clazz.cast(new Decimal128((Long) value));
} else if (value instanceof Double) {
return clazz.cast(new BsonDouble((Double) value).decimal128Value());
} else if (value instanceof String) {
return clazz.cast(Decimal128.parse((String) value));
}
}
return clazz.cast(value);
} catch (Exception e) {
throw new JsonParseException(format("Exception converting value '%s' to type %s", value, clazz.getName()), e);
}
}
public JsonTokenType getType() {
return type;
}
}
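// Illustrative sketch, not part of the original source: getValue(Class) widens or parses
// on demand, e.g. a token created as new JsonToken(JsonTokenType.INT32, 42) yields 42L
// for getValue(Long.class) and a Decimal128 of 42 for getValue(Decimal128.class); any
// conversion failure surfaces as a JsonParseException carrying the offending value.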


@@ -1,107 +0,0 @@
/*
* Copyright 2008-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.util.json;
/**
* JsonTokenType implementation borrowed from <a href=
* "https://github.com/mongodb/mongo-java-driver/blob/master/bson/src/main/org/bson/json/JsonTokenType.java">MongoDB
* Inc.</a> licensed under the Apache License, Version 2.0. <br />
*
* @author Jeff Yemin
* @author Ross Lawley
* @since 2.2
*/
enum JsonTokenType {
/**
* An invalid token.
*/
INVALID,
/**
* A begin array token (a '[').
*/
BEGIN_ARRAY,
/**
* A begin object token (a '{').
*/
BEGIN_OBJECT,
/**
* An end array token (a ']').
*/
END_ARRAY,
/**
* A left parenthesis (a '(').
*/
LEFT_PAREN,
/**
* A right parenthesis (a ')').
*/
RIGHT_PAREN,
/**
* An end object token (a '}').
*/
END_OBJECT,
/**
* A colon token (a ':').
*/
COLON,
/**
* A comma token (a ',').
*/
COMMA,
/**
* A Double token.
*/
DOUBLE,
/**
* An Int32 token.
*/
INT32,
/**
* An Int64 token.
*/
INT64,
/**
* A regular expression token.
*/
REGULAR_EXPRESSION,
/**
* A string token.
*/
STRING,
/**
* An unquoted string token.
*/
UNQUOTED_STRING,
/**
* An end of file token.
*/
END_OF_FILE
}


@@ -1,52 +0,0 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.util.json;
import lombok.Getter;
import lombok.RequiredArgsConstructor;
import org.springframework.expression.EvaluationContext;
import org.springframework.expression.Expression;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.lang.Nullable;
/**
* Reusable context for binding parameters to a placeholder or a SpEL expression within a JSON structure. <br />
* To be used along with {@link ParameterBindingDocumentCodec#decode(String, ParameterBindingContext)}.
*
* @author Christoph Strobl
* @since 2.2
*/
@RequiredArgsConstructor
@Getter
public class ParameterBindingContext {
private final ValueProvider valueProvider;
private final SpelExpressionParser expressionParser;
private final EvaluationContext evaluationContext;
@Nullable
public Object bindableValueForIndex(int index) {
return valueProvider.getBindableValue(index);
}
@Nullable
public Object evaluateExpression(String expressionString) {
Expression expression = expressionParser.parseExpression(expressionString);
return expression.getValue(this.evaluationContext, Object.class);
}
}
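// Illustrative sketch, not part of the original source (args and evaluationContext are
// assumed to exist):
// ParameterBindingContext ctx = new ParameterBindingContext(index -> args[index],
// new SpelExpressionParser(), evaluationContext);
// ctx.bindableValueForIndex(0) then resolves a ?0 placeholder, while
// ctx.evaluateExpression("[0] + 1") parses and evaluates a SpEL binding.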


@@ -1,329 +0,0 @@
/*
* Copyright 2008-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.util.json;
import static java.util.Arrays.*;
import static org.bson.assertions.Assertions.*;
import static org.bson.codecs.configuration.CodecRegistries.*;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import java.util.Map;
import java.util.UUID;
import org.bson.AbstractBsonReader.State;
import org.bson.BsonBinarySubType;
import org.bson.BsonDocument;
import org.bson.BsonDocumentWriter;
import org.bson.BsonReader;
import org.bson.BsonType;
import org.bson.BsonValue;
import org.bson.BsonWriter;
import org.bson.Document;
import org.bson.Transformer;
import org.bson.codecs.BsonTypeClassMap;
import org.bson.codecs.BsonTypeCodecMap;
import org.bson.codecs.BsonValueCodecProvider;
import org.bson.codecs.Codec;
import org.bson.codecs.CollectibleCodec;
import org.bson.codecs.DecoderContext;
import org.bson.codecs.DocumentCodecProvider;
import org.bson.codecs.EncoderContext;
import org.bson.codecs.IdGenerator;
import org.bson.codecs.ObjectIdGenerator;
import org.bson.codecs.ValueCodecProvider;
import org.bson.codecs.configuration.CodecRegistry;
import org.springframework.data.spel.EvaluationContextProvider;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.lang.Nullable;
import org.springframework.util.StringUtils;
/**
* A {@link Codec} implementation that allows binding parameters to placeholders or SpEL expressions when decoding a
* JSON String. <br />
* Modified version of <a href=
* "https://github.com/mongodb/mongo-java-driver/blob/master/bson/src/main/org/bson/codecs/DocumentCodec.java">MongoDB
* Inc. DocumentCodec</a> licensed under the Apache License, Version 2.0. <br />
*
* @author Jeff Yemin
* @author Ross Lawley
* @author Ralph Schaer
* @author Christoph Strobl
* @since 2.2
*/
public class ParameterBindingDocumentCodec implements CollectibleCodec<Document> {
private static final String ID_FIELD_NAME = "_id";
private static final CodecRegistry DEFAULT_REGISTRY = fromProviders(
asList(new ValueCodecProvider(), new BsonValueCodecProvider(), new DocumentCodecProvider()));
private static final BsonTypeClassMap DEFAULT_BSON_TYPE_CLASS_MAP = new BsonTypeClassMap();
private final BsonTypeCodecMap bsonTypeCodecMap;
private final CodecRegistry registry;
private final IdGenerator idGenerator;
private final Transformer valueTransformer;
/**
* Construct a new instance with a default {@code CodecRegistry}.
*/
public ParameterBindingDocumentCodec() {
this(DEFAULT_REGISTRY);
}
/**
* Construct a new instance with the given registry.
*
* @param registry the registry
*/
public ParameterBindingDocumentCodec(final CodecRegistry registry) {
this(registry, DEFAULT_BSON_TYPE_CLASS_MAP);
}
/**
* Construct a new instance with the given registry and BSON type class map.
*
* @param registry the registry
* @param bsonTypeClassMap the BSON type class map
*/
public ParameterBindingDocumentCodec(final CodecRegistry registry, final BsonTypeClassMap bsonTypeClassMap) {
this(registry, bsonTypeClassMap, null);
}
/**
* Construct a new instance with the given registry and BSON type class map. The transformer is applied as a last step
* when decoding values, which allows users of this codec to control the decoding process. For example, a user of this
* class could substitute a value decoded as a Document with an instance of a special purpose class (e.g., one
* representing a DBRef in MongoDB).
*
* @param registry the registry
* @param bsonTypeClassMap the BSON type class map
* @param valueTransformer the value transformer to use as a final step when decoding the value of any field in the
* document
*/
public ParameterBindingDocumentCodec(final CodecRegistry registry, final BsonTypeClassMap bsonTypeClassMap,
final Transformer valueTransformer) {
this.registry = notNull("registry", registry);
this.bsonTypeCodecMap = new BsonTypeCodecMap(notNull("bsonTypeClassMap", bsonTypeClassMap), registry);
this.idGenerator = new ObjectIdGenerator();
this.valueTransformer = valueTransformer != null ? valueTransformer : new Transformer() {
@Override
public Object transform(final Object value) {
return value;
}
};
}
@Override
public boolean documentHasId(final Document document) {
return document.containsKey(ID_FIELD_NAME);
}
@Override
public BsonValue getDocumentId(final Document document) {
if (!documentHasId(document)) {
throw new IllegalStateException("The document does not contain an _id");
}
Object id = document.get(ID_FIELD_NAME);
if (id instanceof BsonValue) {
return (BsonValue) id;
}
BsonDocument idHoldingDocument = new BsonDocument();
BsonWriter writer = new BsonDocumentWriter(idHoldingDocument);
writer.writeStartDocument();
writer.writeName(ID_FIELD_NAME);
writeValue(writer, EncoderContext.builder().build(), id);
writer.writeEndDocument();
return idHoldingDocument.get(ID_FIELD_NAME);
}
@Override
public Document generateIdIfAbsentFromDocument(final Document document) {
if (!documentHasId(document)) {
document.put(ID_FIELD_NAME, idGenerator.generate());
}
return document;
}
@Override
public void encode(final BsonWriter writer, final Document document, final EncoderContext encoderContext) {
writeMap(writer, document, encoderContext);
}
// Spring Data Customization START
public Document decode(@Nullable String json, Object[] values) {
return decode(json, new ParameterBindingContext((index) -> values[index], new SpelExpressionParser(),
EvaluationContextProvider.DEFAULT.getEvaluationContext(values)));
}
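// Illustrative usage, not part of the original source (query string and values made up):
// decoding "{ 'name': ?0, 'age': { '$gt': ?1 } }" with new Object[] { "luke", 42 }
// yields a Document in which both placeholders are replaced by their bound values,
// i.e. { "name" : "luke", "age" : { "$gt" : 42 } }.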
public Document decode(@Nullable String json, ParameterBindingContext bindingContext) {
if (StringUtils.isEmpty(json)) {
return new Document();
}
ParameterBindingJsonReader reader = new ParameterBindingJsonReader(json, bindingContext);
return this.decode(reader, DecoderContext.builder().build());
}
@Override
public Document decode(final BsonReader reader, final DecoderContext decoderContext) {
if (reader instanceof ParameterBindingJsonReader) {
ParameterBindingJsonReader bindingReader = (ParameterBindingJsonReader) reader;
// Check if the reader has actually found something to replace at the top level and did so.
// This binds plain placeholder-only queries such as: @Query(?0)
if (bindingReader.currentValue instanceof org.bson.Document) {
return (Document) bindingReader.currentValue;
}
}
Document document = new Document();
reader.readStartDocument();
while (reader.readBsonType() != BsonType.END_OF_DOCUMENT) {
String fieldName = reader.readName();
document.put(fieldName, readValue(reader, decoderContext));
}
reader.readEndDocument();
return document;
}
// Spring Data Customization END
@Override
public Class<Document> getEncoderClass() {
return Document.class;
}
private void beforeFields(final BsonWriter bsonWriter, final EncoderContext encoderContext,
final Map<String, Object> document) {
if (encoderContext.isEncodingCollectibleDocument() && document.containsKey(ID_FIELD_NAME)) {
bsonWriter.writeName(ID_FIELD_NAME);
writeValue(bsonWriter, encoderContext, document.get(ID_FIELD_NAME));
}
}
private boolean skipField(final EncoderContext encoderContext, final String key) {
return encoderContext.isEncodingCollectibleDocument() && key.equals(ID_FIELD_NAME);
}
@SuppressWarnings({ "unchecked", "rawtypes" })
private void writeValue(final BsonWriter writer, final EncoderContext encoderContext, final Object value) {
if (value == null) {
writer.writeNull();
} else if (value instanceof Iterable) {
writeIterable(writer, (Iterable<Object>) value, encoderContext.getChildContext());
} else if (value instanceof Map) {
writeMap(writer, (Map<String, Object>) value, encoderContext.getChildContext());
} else {
Codec codec = registry.get(value.getClass());
encoderContext.encodeWithChildContext(codec, writer, value);
}
}
private void writeMap(final BsonWriter writer, final Map<String, Object> map, final EncoderContext encoderContext) {
writer.writeStartDocument();
beforeFields(writer, encoderContext, map);
for (final Map.Entry<String, Object> entry : map.entrySet()) {
if (skipField(encoderContext, entry.getKey())) {
continue;
}
writer.writeName(entry.getKey());
writeValue(writer, encoderContext, entry.getValue());
}
writer.writeEndDocument();
}
private void writeIterable(final BsonWriter writer, final Iterable<Object> list,
final EncoderContext encoderContext) {
writer.writeStartArray();
for (final Object value : list) {
writeValue(writer, encoderContext, value);
}
writer.writeEndArray();
}
private Object readValue(final BsonReader reader, final DecoderContext decoderContext) {
// Spring Data Customization START
if (reader instanceof ParameterBindingJsonReader) {
ParameterBindingJsonReader bindingReader = (ParameterBindingJsonReader) reader;
// Check if the reader has actually found something to replace and did so.
// Reset the reader state to move on past the replaced value and
// return the replacement value instead of decoding it.
if (bindingReader.currentValue != null) {
Object value = bindingReader.currentValue;
bindingReader.setState(State.TYPE);
bindingReader.currentValue = null;
return value;
}
}
// Spring Data Customization END
BsonType bsonType = reader.getCurrentBsonType();
if (bsonType == BsonType.NULL) {
reader.readNull();
return null;
} else if (bsonType == BsonType.ARRAY) {
return readList(reader, decoderContext);
} else if (bsonType == BsonType.BINARY && BsonBinarySubType.isUuid(reader.peekBinarySubType())
&& reader.peekBinarySize() == 16) {
return registry.get(UUID.class).decode(reader, decoderContext);
}
// Spring Data Customization START
// By default the registry uses DocumentCodec for parsing nested documents.
// We need to reroute that to this implementation, otherwise placeholders in nested documents are not bound.
Codec<?> codecToUse = bsonTypeCodecMap.get(bsonType);
if (codecToUse instanceof org.bson.codecs.DocumentCodec) {
codecToUse = this;
}
return valueTransformer.transform(codecToUse.decode(reader, decoderContext));
// Spring Data Customization END
}
private List<Object> readList(final BsonReader reader, final DecoderContext decoderContext) {
reader.readStartArray();
List<Object> list = new ArrayList<>();
while (reader.readBsonType() != BsonType.END_OF_DOCUMENT) {
// Spring Data Customization START
Object listValue = readValue(reader, decoderContext);
if (listValue instanceof Collection) {
list.addAll((Collection) listValue);
break;
}
list.add(listValue);
// Spring Data Customization END
}
reader.readEndArray();
return list;
}
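// Illustrative sketch, not part of the original source: when a placeholder inside an
// array binds to a whole Collection, e.g. "{ 'name': { '$in': ?0 } }" with a List
// argument, readValue returns that Collection and the branch above flattens it into
// the result list instead of nesting it as a single element.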
}


@@ -1,36 +0,0 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.util.json;
import org.springframework.lang.Nullable;
/**
* A value provider to retrieve bindable values by their parameter index.
*
* @author Christoph Strobl
* @since 2.2
*/
@FunctionalInterface
public interface ValueProvider {
/**
* @param index parameter index to use.
* @return can be {@literal null}.
* @throws RuntimeException if the requested element does not exist.
*/
@Nullable
Object getBindableValue(int index);
}


@@ -24,9 +24,8 @@ import kotlin.reflect.KClass
* @author Mark Paluch
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("aggregateAndReturn<T>()"))
fun <T : Any> ExecutableAggregationOperation.aggregateAndReturn(entityClass: KClass<T>): ExecutableAggregationOperation.ExecutableAggregation<T> =
aggregateAndReturn(entityClass.java)
/**
* Extension for [ExecutableAggregationOperation.aggregateAndReturn] leveraging reified type parameters.
@@ -36,4 +35,4 @@ fun <T : Any> ExecutableAggregationOperation.aggregateAndReturn(entityClass: KCl
* @since 2.0
*/
inline fun <reified T : Any> ExecutableAggregationOperation.aggregateAndReturn(): ExecutableAggregationOperation.ExecutableAggregation<T> =
aggregateAndReturn(T::class.java)


@@ -24,7 +24,6 @@ import kotlin.reflect.KClass
* @author Mark Paluch
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("query<T>()"))
fun <T : Any> ExecutableFindOperation.query(entityClass: KClass<T>): ExecutableFindOperation.ExecutableFind<T> =
query(entityClass.java)
@@ -45,7 +44,6 @@ inline fun <reified T : Any> ExecutableFindOperation.query(): ExecutableFindOper
* @author Mark Paluch
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("asType<T>()"))
fun <T : Any> ExecutableFindOperation.FindWithProjection<*>.asType(resultType: KClass<T>): ExecutableFindOperation.FindWithQuery<T> =
`as`(resultType.java)
@@ -65,7 +63,6 @@ inline fun <reified T : Any> ExecutableFindOperation.FindWithProjection<*>.asTyp
* @author Christoph Strobl
* @since 2.1
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("asType<T>()"))
fun <T : Any> ExecutableFindOperation.DistinctWithProjection.asType(resultType: KClass<T>): ExecutableFindOperation.TerminatingDistinct<T> =
`as`(resultType.java);


@@ -24,7 +24,6 @@ import kotlin.reflect.KClass
* @author Mark Paluch
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("insert<T>()"))
fun <T : Any> ExecutableInsertOperation.insert(entityClass: KClass<T>): ExecutableInsertOperation.ExecutableInsert<T> =
insert(entityClass.java)


@@ -23,7 +23,6 @@ import kotlin.reflect.KClass
* @author Christoph Strobl
* @since 2.1
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("mapReduce<T>()"))
fun <T : Any> ExecutableMapReduceOperation.mapReduce(entityClass: KClass<T>): ExecutableMapReduceOperation.MapReduceWithMapFunction<T> =
mapReduce(entityClass.java)
@@ -42,7 +41,6 @@ inline fun <reified T : Any> ExecutableMapReduceOperation.mapReduce(): Executabl
* @author Christoph Strobl
* @since 2.1
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("asType<T>()"))
fun <T : Any> ExecutableMapReduceOperation.MapReduceWithProjection<*>.asType(resultType: KClass<T>): ExecutableMapReduceOperation.MapReduceWithQuery<T> =
`as`(resultType.java)


@@ -24,7 +24,6 @@ import kotlin.reflect.KClass
* @author Mark Paluch
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("remove<T>()"))
fun <T : Any> ExecutableRemoveOperation.remove(entityClass: KClass<T>): ExecutableRemoveOperation.ExecutableRemove<T> =
remove(entityClass.java)


@@ -23,7 +23,6 @@ import kotlin.reflect.KClass
* @author Christoph Strobl
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("update<T>()"))
fun <T : Any> ExecutableUpdateOperation.update(entityClass: KClass<T>): ExecutableUpdateOperation.ExecutableUpdate<T> =
update(entityClass.java)


@@ -41,7 +41,6 @@ import kotlin.reflect.KClass
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("getCollectionName<T>()"))
fun <T : Any> MongoOperations.getCollectionName(entityClass: KClass<T>): String =
getCollectionName(entityClass.java)
@@ -88,7 +87,6 @@ inline fun <reified T : Any> MongoOperations.stream(query: Query, collectionName
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("createCollection<T>(collectionOptions)"))
fun <T : Any> MongoOperations.createCollection(entityClass: KClass<T>, collectionOptions: CollectionOptions? = null): MongoCollection<Document> =
if (collectionOptions != null) createCollection(entityClass.java, collectionOptions)
else createCollection(entityClass.java)
@@ -110,7 +108,6 @@ inline fun <reified T : Any> MongoOperations.createCollection(
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("collectionExists<T>()"))
fun <T : Any> MongoOperations.collectionExists(entityClass: KClass<T>): Boolean =
collectionExists(entityClass.java)
@@ -129,7 +126,6 @@ inline fun <reified T : Any> MongoOperations.collectionExists(): Boolean =
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("dropCollection<T>()"))
fun <T : Any> MongoOperations.dropCollection(entityClass: KClass<T>) {
dropCollection(entityClass.java)
}
@@ -150,7 +146,6 @@ inline fun <reified T : Any> MongoOperations.dropCollection() {
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("indexOps<T>()"))
fun <T : Any> MongoOperations.indexOps(entityClass: KClass<T>): IndexOperations =
indexOps(entityClass.java)
@@ -169,7 +164,6 @@ inline fun <reified T : Any> MongoOperations.indexOps(): IndexOperations =
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("bulkOps<T>(bulkMode, collectionName)"))
fun <T : Any> MongoOperations.bulkOps(bulkMode: BulkMode, entityClass: KClass<T>, collectionName: String? = null): BulkOperations =
if (collectionName != null) bulkOps(bulkMode, entityClass.java, collectionName)
else bulkOps(bulkMode, entityClass.java)
@@ -218,7 +212,6 @@ inline fun <reified T : Any> MongoOperations.group(criteria: Criteria, inputColl
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("aggregate<T>(aggregation)"))
inline fun <reified O : Any> MongoOperations.aggregate(aggregation: Aggregation, inputType: KClass<*>): AggregationResults<O> =
aggregate(aggregation, inputType.java, O::class.java)
@@ -237,7 +230,6 @@ inline fun <reified O : Any> MongoOperations.aggregate(aggregation: Aggregation,
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("aggregateStream<T>(aggregation)"))
inline fun <reified O : Any> MongoOperations.aggregateStream(aggregation: Aggregation, inputType: KClass<*>): CloseableIterator<O> =
aggregateStream(aggregation, inputType.java, O::class.java)
@@ -295,7 +287,6 @@ inline fun <reified T : Any> MongoOperations.findOne(query: Query, collectionNam
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("exists<T>(query, collectionName)"))
fun <T : Any> MongoOperations.exists(query: Query, entityClass: KClass<T>, collectionName: String? = null): Boolean =
if (collectionName != null) exists(query, entityClass.java, collectionName)
else exists(query, entityClass.java)
@@ -337,7 +328,6 @@ inline fun <reified T : Any> MongoOperations.findById(id: Any, collectionName: S
* @author Christoph Strobl
* @since 2.1
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("findDistinct<T, E>(field)"))
inline fun <reified T : Any> MongoOperations.findDistinct(field: String, entityClass: KClass<*>): List<T> =
findDistinct(field, entityClass.java, T::class.java);
@@ -347,7 +337,6 @@ inline fun <reified T : Any> MongoOperations.findDistinct(field: String, entityC
* @author Christoph Strobl
* @since 2.1
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("findDistinct<T, E>(query, field)"))
inline fun <reified T : Any> MongoOperations.findDistinct(query: Query, field: String, entityClass: KClass<*>): List<T> =
findDistinct(query, field, entityClass.java, T::class.java)
@@ -357,7 +346,6 @@ inline fun <reified T : Any> MongoOperations.findDistinct(query: Query, field: S
* @author Christoph Strobl
* @since 2.1
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("findDistinct<T, E>(query, field, collectionName)"))
inline fun <reified T : Any> MongoOperations.findDistinct(query: Query, field: String, collectionName: String, entityClass: KClass<*>): List<T> =
findDistinct(query, field, collectionName, entityClass.java, T::class.java)
@@ -398,7 +386,6 @@ inline fun <reified T : Any> MongoOperations.findAndRemove(query: Query, collect
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("count<T>(query, collectionName)"))
fun <T : Any> MongoOperations.count(query: Query = Query(), entityClass: KClass<T>, collectionName: String? = null): Long =
if (collectionName != null) count(query, entityClass.java, collectionName)
else count(query, entityClass.java)
@@ -419,27 +406,16 @@ inline fun <reified T : Any> MongoOperations.count(query: Query = Query(), colle
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("insert<T>(batchToSave)"))
fun <T : Any> MongoOperations.insert(batchToSave: Collection<T>, entityClass: KClass<T>) {
insert(batchToSave, entityClass.java)
}
/**
* Extension for [MongoOperations.insert] leveraging reified type parameters.
*
* @author Mark Paluch
* @since 2.2
*/
@Suppress("EXTENSION_SHADOWED_BY_MEMBER")
inline fun <reified T : Any> MongoOperations.insert(batchToSave: Collection<T>): Collection<T> = insert(batchToSave, T::class.java)
/**
* Extension for [MongoOperations.upsert] providing a [KClass] based variant.
*
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("upsert<T>(query, update, collectionName)"))
fun <T : Any> MongoOperations.upsert(query: Query, update: Update, entityClass: KClass<T>, collectionName: String? = null): UpdateResult =
if (collectionName != null) upsert(query, update, entityClass.java, collectionName)
else upsert(query, update, entityClass.java)
@@ -461,7 +437,6 @@ inline fun <reified T : Any> MongoOperations.upsert(query: Query, update: Update
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("updateFirst<T>(query, update, collectionName)"))
fun <T : Any> MongoOperations.updateFirst(query: Query, update: Update, entityClass: KClass<T>, collectionName: String? = null): UpdateResult =
if (collectionName != null) updateFirst(query, update, entityClass.java, collectionName)
else updateFirst(query, update, entityClass.java)
@@ -483,7 +458,6 @@ inline fun <reified T : Any> MongoOperations.updateFirst(query: Query, update: U
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("updateMulti<T>(query, update, collectionName)"))
fun <T : Any> MongoOperations.updateMulti(query: Query, update: Update, entityClass: KClass<T>, collectionName: String? = null): UpdateResult =
if (collectionName != null) updateMulti(query, update, entityClass.java, collectionName)
else updateMulti(query, update, entityClass.java)
@@ -505,7 +479,6 @@ inline fun <reified T : Any> MongoOperations.updateMulti(query: Query, update: U
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("remove<T>(query, collectionName)"))
fun <T : Any> MongoOperations.remove(query: Query, entityClass: KClass<T>, collectionName: String? = null): DeleteResult =
if (collectionName != null) remove(query, entityClass.java, collectionName)
else remove(query, entityClass.java)


@@ -23,7 +23,6 @@ import kotlin.reflect.KClass
* @author Mark Paluch
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("aggregateAndReturn<T>()"))
fun <T : Any> ReactiveAggregationOperation.aggregateAndReturn(entityClass: KClass<T>): ReactiveAggregationOperation.ReactiveAggregation<T> =
aggregateAndReturn(entityClass.java)


@@ -15,8 +15,6 @@
*/
package org.springframework.data.mongodb.core
import kotlinx.coroutines.reactive.awaitFirstOrNull
import kotlinx.coroutines.reactive.awaitSingle
import kotlin.reflect.KClass
/**
@@ -25,7 +23,6 @@ import kotlin.reflect.KClass
* @author Mark Paluch
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("query<T>()"))
fun <T : Any> ReactiveFindOperation.query(entityClass: KClass<T>): ReactiveFindOperation.ReactiveFind<T> =
query(entityClass.java)
@@ -44,7 +41,6 @@ inline fun <reified T : Any> ReactiveFindOperation.query(): ReactiveFindOperatio
* @author Mark Paluch
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("asType<T>()"))
fun <T : Any> ReactiveFindOperation.FindWithProjection<*>.asType(resultType: KClass<T>): ReactiveFindOperation.FindWithQuery<T> =
`as`(resultType.java)
@@ -58,56 +54,19 @@ inline fun <reified T : Any> ReactiveFindOperation.FindWithProjection<*>.asType(
`as`(T::class.java)
/**
* Extension for [ExecutableFindOperation.DistinctWithProjection.as] providing a [KClass] based variant.
*
* @author Christoph Strobl
* @since 2.1
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("asType<T>()"))
fun <T : Any> ReactiveFindOperation.DistinctWithProjection.asType(resultType: KClass<T>): ReactiveFindOperation.TerminatingDistinct<T> =
`as`(resultType.java);
/**
* Extension for [ReactiveFindOperation.DistinctWithProjection.as] leveraging reified type parameters.
*
* @author Christoph Strobl
* @since 2.1
*/
inline fun <reified T : Any> ReactiveFindOperation.DistinctWithProjection.asType(): ReactiveFindOperation.DistinctWithProjection =
`as`(T::class.java)
/**
* Coroutines variant of [ReactiveFindOperation.TerminatingFind.one].
*
* @author Sebastien Deleuze
* @since 2.2
*/
suspend inline fun <reified T : Any> ReactiveFindOperation.TerminatingFind<T>.awaitOne(): T? =
one().awaitFirstOrNull()
/**
* Coroutines variant of [ReactiveFindOperation.TerminatingFind.first].
*
* @author Sebastien Deleuze
* @since 2.2
*/
suspend inline fun <reified T : Any> ReactiveFindOperation.TerminatingFind<T>.awaitFirst(): T? =
first().awaitFirstOrNull()
/**
* Coroutines variant of [ReactiveFindOperation.TerminatingFind.count].
*
* @author Sebastien Deleuze
* @since 2.2
*/
suspend fun <T : Any> ReactiveFindOperation.TerminatingFind<T>.awaitCount(): Long =
count().awaitSingle()
/**
* Coroutines variant of [ReactiveFindOperation.TerminatingFind.exists].
*
* @author Sebastien Deleuze
* @since 2.2
*/
suspend fun <T : Any> ReactiveFindOperation.TerminatingFind<T>.awaitExists(): Boolean =
exists().awaitSingle()


@@ -15,7 +15,6 @@
*/
package org.springframework.data.mongodb.core
import kotlinx.coroutines.reactive.awaitSingle
import kotlin.reflect.KClass
/**
@@ -24,7 +23,6 @@ import kotlin.reflect.KClass
* @author Mark Paluch
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("insert<T>()"))
fun <T : Any> ReactiveInsertOperation.insert(entityClass: KClass<T>): ReactiveInsertOperation.ReactiveInsert<T> =
insert(entityClass.java)
@@ -36,12 +34,3 @@ fun <T : Any> ReactiveInsertOperation.insert(entityClass: KClass<T>): ReactiveIn
*/
inline fun <reified T : Any> ReactiveInsertOperation.insert(): ReactiveInsertOperation.ReactiveInsert<T> =
insert(T::class.java)
/**
* Coroutines variant of [ReactiveInsertOperation.TerminatingInsert.one].
*
* @author Sebastien Deleuze
* @since 2.2
*/
suspend inline fun <reified T: Any> ReactiveInsertOperation.TerminatingInsert<T>.oneAndAwait(o: T): T =
one(o).awaitSingle()


@@ -23,7 +23,6 @@ import kotlin.reflect.KClass
* @author Christoph Strobl
* @since 2.1
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("mapReduce<T>()"))
fun <T : Any> ReactiveMapReduceOperation.mapReduce(entityClass: KClass<T>): ReactiveMapReduceOperation.MapReduceWithMapFunction<T> =
mapReduce(entityClass.java)
@@ -42,7 +41,6 @@ inline fun <reified T : Any> ReactiveMapReduceOperation.mapReduce(): ReactiveMap
* @author Christoph Strobl
* @since 2.1
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("asType<T>()"))
fun <T : Any> ReactiveMapReduceOperation.MapReduceWithProjection<*>.asType(resultType: KClass<T>): ReactiveMapReduceOperation.MapReduceWithQuery<T> =
`as`(resultType.java)


@@ -34,7 +34,6 @@ import kotlin.reflect.KClass
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("indexOps<T>()"))
fun <T : Any> ReactiveMongoOperations.indexOps(entityClass: KClass<T>): ReactiveIndexOperations =
indexOps(entityClass.java)
@@ -62,7 +61,6 @@ inline fun <reified T : Any> ReactiveMongoOperations.execute(action: ReactiveCol
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("createCollection<T>(collectionOptions)"))
fun <T : Any> ReactiveMongoOperations.createCollection(entityClass: KClass<T>, collectionOptions: CollectionOptions? = null): Mono<MongoCollection<Document>> =
if (collectionOptions != null) createCollection(entityClass.java, collectionOptions) else createCollection(entityClass.java)
@@ -81,7 +79,6 @@ inline fun <reified T : Any> ReactiveMongoOperations.createCollection(collection
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("collectionExists<T>()"))
fun <T : Any> ReactiveMongoOperations.collectionExists(entityClass: KClass<T>): Mono<Boolean> =
collectionExists(entityClass.java)
@@ -100,7 +97,6 @@ inline fun <reified T : Any> ReactiveMongoOperations.collectionExists(): Mono<Bo
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("dropCollection<T>()"))
fun <T : Any> ReactiveMongoOperations.dropCollection(entityClass: KClass<T>): Mono<Void> =
dropCollection(entityClass.java)
@@ -137,7 +133,6 @@ inline fun <reified T : Any> ReactiveMongoOperations.findOne(query: Query, colle
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("exists<T>(query, collectionName)"))
fun <T : Any> ReactiveMongoOperations.exists(query: Query, entityClass: KClass<T>, collectionName: String? = null): Mono<Boolean> =
if (collectionName != null) exists(query, entityClass.java, collectionName) else exists(query, entityClass.java)
@@ -175,7 +170,6 @@ inline fun <reified T : Any> ReactiveMongoOperations.findById(id: Any, collectio
* @author Christoph Strobl
* @since 2.1
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("findDistinct<T, E>(field)"))
inline fun <reified T : Any> ReactiveMongoOperations.findDistinct(field: String, entityClass: KClass<*>): Flux<T> =
findDistinct(field, entityClass.java, T::class.java);
@@ -185,7 +179,6 @@ inline fun <reified T : Any> ReactiveMongoOperations.findDistinct(field: String,
* @author Christoph Strobl
* @since 2.1
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("findDistinct<T, E>(query, field)"))
inline fun <reified T : Any> ReactiveMongoOperations.findDistinct(query: Query, field: String, entityClass: KClass<*>): Flux<T> =
findDistinct(query, field, entityClass.java, T::class.java)
@@ -195,7 +188,6 @@ inline fun <reified T : Any> ReactiveMongoOperations.findDistinct(query: Query,
* @author Christoph Strobl
* @since 2.1
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("findDistinct<T, E>(query, field, collectionName)"))
inline fun <reified T : Any> ReactiveMongoOperations.findDistinct(query: Query, field: String, collectionName: String, entityClass: KClass<*>): Flux<T> =
findDistinct(query, field, collectionName, entityClass.java, T::class.java)
@@ -244,7 +236,6 @@ inline fun <reified T : Any> ReactiveMongoOperations.findAndRemove(query: Query,
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("count<T>(query, collectionName)"))
fun <T : Any> ReactiveMongoOperations.count(query: Query = Query(), entityClass: KClass<T>, collectionName: String? = null): Mono<Long> =
if (collectionName != null) count(query, entityClass.java, collectionName)
else count(query, entityClass.java)
@@ -266,26 +257,15 @@ inline fun <reified T : Any> ReactiveMongoOperations.count(query: Query = Query(
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("insert<T>(batchToSave)"))
fun <T : Any> ReactiveMongoOperations.insert(batchToSave: Collection<T>, entityClass: KClass<T>): Flux<T> =
insert(batchToSave, entityClass.java)
/**
* Extension for [ReactiveMongoOperations.insert] leveraging reified type parameters.
*
* @author Mark Paluch
* @since 2.2
*/
@Suppress("EXTENSION_SHADOWED_BY_MEMBER")
inline fun <reified T : Any> ReactiveMongoOperations.insert(batchToSave: Collection<T>): Flux<T> = insert(batchToSave, T::class.java)
/**
* Extension for [ReactiveMongoOperations.insertAll] providing a [KClass] based variant.
*
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("insertAll<T>(batchToSave)"))
fun <T : Any> ReactiveMongoOperations.insertAll(batchToSave: Mono<out Collection<T>>, entityClass: KClass<T>): Flux<T> =
insertAll(batchToSave, entityClass.java)
@@ -295,7 +275,6 @@ fun <T : Any> ReactiveMongoOperations.insertAll(batchToSave: Mono<out Collection
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("upsert<T>(query, update, collectionName)"))
fun <T : Any> ReactiveMongoOperations.upsert(query: Query, update: Update, entityClass: KClass<T>, collectionName: String? = null): Mono<UpdateResult> =
if (collectionName != null) upsert(query, update, entityClass.java, collectionName) else upsert(query, update, entityClass.java)
@@ -316,7 +295,6 @@ inline fun <reified T : Any> ReactiveMongoOperations.upsert(query: Query, update
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("updateFirst<T>(query, update, collectionName)"))
fun <T : Any> ReactiveMongoOperations.updateFirst(query: Query, update: Update, entityClass: KClass<T>, collectionName: String? = null): Mono<UpdateResult> =
if (collectionName != null) updateFirst(query, update, entityClass.java, collectionName)
else updateFirst(query, update, entityClass.java)
@@ -338,7 +316,6 @@ inline fun <reified T : Any> ReactiveMongoOperations.updateFirst(query: Query, u
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("updateMulti<T>(query, update, collectionName)"))
fun <T : Any> ReactiveMongoOperations.updateMulti(query: Query, update: Update, entityClass: KClass<T>, collectionName: String? = null): Mono<UpdateResult> =
if (collectionName != null) updateMulti(query, update, entityClass.java, collectionName)
else updateMulti(query, update, entityClass.java)
@@ -360,7 +337,6 @@ inline fun <reified T : Any> ReactiveMongoOperations.updateMulti(query: Query, u
* @author Sebastien Deleuze
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("remove<T>(query, collectionName)"))
fun <T : Any> ReactiveMongoOperations.remove(query: Query, entityClass: KClass<T>, collectionName: String? = null): Mono<DeleteResult> =
if (collectionName != null) remove(query, entityClass.java, collectionName)
else remove(query, entityClass.java)


@@ -15,8 +15,6 @@
*/
package org.springframework.data.mongodb.core
import com.mongodb.client.result.DeleteResult
import kotlinx.coroutines.reactive.awaitSingle
import kotlin.reflect.KClass
/**
@@ -25,7 +23,6 @@ import kotlin.reflect.KClass
* @author Mark Paluch
* @since 2.0
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("remove<T>()"))
fun <T : Any> ReactiveRemoveOperation.remove(entityClass: KClass<T>): ReactiveRemoveOperation.ReactiveRemove<T> =
remove(entityClass.java)
@@ -37,12 +34,3 @@ fun <T : Any> ReactiveRemoveOperation.remove(entityClass: KClass<T>): ReactiveRe
*/
inline fun <reified T : Any> ReactiveRemoveOperation.remove(): ReactiveRemoveOperation.ReactiveRemove<T> =
remove(T::class.java)
/**
* Coroutines variant of [ReactiveRemoveOperation.TerminatingRemove.all].
*
* @author Sebastien Deleuze
* @since 2.2
*/
suspend fun <T : Any> ReactiveRemoveOperation.TerminatingRemove<T>.allAndAwait(): DeleteResult =
all().awaitSingle()

Some files were not shown because too many files have changed in this diff.