Compare commits

...

10 Commits

Author SHA1 Message Date
Christoph Strobl
a7b36ba2c0 Hacking - Explore Entity Metadata DSL 2023-06-06 15:01:46 +02:00
Christoph Strobl
54dc3dd7f8 Prepare issue branch 2023-06-01 13:59:09 +02:00
Christoph Strobl
98795cb33e Convert BsonUndefined to null value.
Register a reading converter that returns null when attempting to read a value of type BsonUndefined.
Prior to this change, users faced a ConverterNotFoundException when source documents contained BsonUndefined.

Resolves: #2350
2023-06-01 09:24:29 +02:00
Christoph Strobl
fa63efcb24 Add tests using $slice on dbref field.
Closes: #2191
2023-05-31 16:11:40 +02:00
Christoph Strobl
5ffaa79f4e Fix code snippet in change streams reference documentation.
Closes: #4376
2023-05-30 13:01:57 +02:00
Christoph Strobl
f775d485c6 Update docker image build instructions.
Closes: #4372
2023-05-30 13:01:57 +02:00
Mark Paluch
370b4145d2 Polishing.
Add assertions and missing Override annotations. Avoid recursive self-call on getClassLoader. Extend documentation.

See #1627
Original pull request: #4389
2023-05-26 14:48:08 +02:00
Christoph Strobl
4b78ef6523 Extend GridFsTemplate and its reactive variant to accept a provided GridFSBucket instance.
Allow passing in a GridFSBucket from the outside to avoid recreating instances on every method call.

Closes: #1627
Original pull request: #4389
2023-05-26 14:48:08 +02:00
Christoph Strobl
5163e544ae After release cleanups.
See #4369
2023-05-12 14:18:53 +02:00
Christoph Strobl
431512a66c Prepare next development iteration.
See #4369
2023-05-12 14:18:51 +02:00
24 changed files with 662 additions and 88 deletions

View File

@@ -16,7 +16,7 @@ All of these use cases are great reasons to essentially run what the CI server d
IMPORTANT: To do this you must have Docker installed on your machine.
-1. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github springci/spring-data-openjdk8-with-mongodb-4.0:latest /bin/bash`
+1. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github springci/spring-data-openjdk17-with-mongodb-5.0.3:latest /bin/bash`

This will launch the Docker image and mount your source code at `spring-data-mongodb-github`.

View File

@@ -10,7 +10,7 @@ All of these use cases are great reasons to essentially run what Concourse does
IMPORTANT: To do this you must have Docker installed on your machine.
-1. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github springci/spring-data-8-jdk-with-mongodb /bin/bash`
+1. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github springci/spring-data-openjdk17-with-mongodb-5.0.3 /bin/bash`

This will launch the Docker image and mount your source code at `spring-data-mongodb-github`.
@@ -23,7 +23,7 @@ Since the container is binding to your source, you can make edits from your IDE
If you need to test the `build.sh` script, do this:
1. `mkdir /tmp/spring-data-mongodb-artifactory`
-2. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github --mount type=bind,source="/tmp/spring-data-mongodb-artifactory",target=/spring-data-mongodb-artifactory springci/spring-data-8-jdk-with-mongodb /bin/bash`
+2. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github --mount type=bind,source="/tmp/spring-data-mongodb-artifactory",target=/spring-data-mongodb-artifactory springci/spring-data-openjdk17-with-mongodb-5.0.3 /bin/bash`
This will launch the Docker image and mount your source code at `spring-data-mongodb-github` and the temporary
artifactory output directory at `spring-data-mongodb-artifactory`.
@@ -36,4 +36,4 @@ IMPORTANT: `build.sh` doesn't actually push to Artifactory so don't worry about
It just deploys to a local folder. That way, the `artifactory-resource` later in the pipeline can pick up these artifacts
and deliver them to artifactory.
-NOTE: Docker containers can eat up disk space fast! From time to time, run `docker system prune` to clean out old images.
+NOTE: Docker containers can eat up disk space fast! From time to time, run `docker system prune` to clean out old images.

pom.xml
View File

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
-<version>4.1.0</version>
+<version>4.2.x-3380-SNAPSHOT</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
-<version>3.1.0</version>
+<version>3.2.0-SNAPSHOT</version>
</parent>
<modules>
@@ -26,7 +26,7 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
-<springdata.commons>3.1.0</springdata.commons>
+<springdata.commons>3.2.0-SNAPSHOT</springdata.commons>
<mongo>4.9.1</mongo>
<mongo.reactivestreams>${mongo}</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>
@@ -145,8 +145,8 @@
<repositories>
<repository>
-<id>spring-libs-release</id>
-<url>https://repo.spring.io/libs-release</url>
+<id>spring-libs-snapshot</id>
+<url>https://repo.spring.io/libs-snapshot</url>
<snapshots>
<enabled>true</enabled>
</snapshots>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
-<version>4.1.0</version>
+<version>4.2.x-3380-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
-<version>4.1.0</version>
+<version>4.2.x-3380-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
-<version>4.1.0</version>
+<version>4.2.x-3380-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -19,6 +19,7 @@ import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import java.util.function.Consumer;
import org.bson.UuidRepresentation;
import org.springframework.beans.factory.config.BeanDefinition;
@@ -34,7 +35,10 @@ import org.springframework.data.mongodb.MongoManagedTypes;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions.MongoConverterConfigurationAdapter;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MappingConfig;
import org.springframework.data.mongodb.core.mapping.MappingConfig.MappingRuleCustomizer;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
@@ -87,10 +91,16 @@ public abstract class MongoConfigurationSupport {
mappingContext.setSimpleTypeHolder(customConversions.getSimpleTypeHolder());
mappingContext.setFieldNamingStrategy(fieldNamingStrategy());
mappingContext.setAutoIndexCreation(autoIndexCreation());
mappingContext.setMappingConfig(mappingConfig());
return mappingContext;
}
@Nullable
public MappingConfig mappingConfig() {
return null;
}
/**
* @return new instance of {@link MongoManagedTypes}.
* @throws ClassNotFoundException

View File

@@ -487,7 +487,22 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
&& instanceCreatorMetadata.hasParameters() ? getParameterProvider(context, entity, documentAccessor, evaluator)
: NoOpParameterValueProvider.INSTANCE;
-EntityInstantiator instantiator = instantiators.getInstantiatorFor(entity);
+EntityInstantiator instantiator = entity.getInstanceCreator();
if(instantiator != null) {
provider = new ParameterValueProvider() {
@Nullable
public Object getParameterValue(Parameter parameter) {
String name = parameter.getName();
if (name == null) {
throw new IllegalArgumentException(String.format("Parameter %s does not have a name", parameter));
} else {
return documentAccessor.get(entity.getRequiredPersistentProperty(name));
}
}
};
} else {
instantiator = instantiators.getInstantiatorFor(entity);
}
S instance = instantiator.createInstance(entity, provider);
if (entity.requiresPropertyPopulation()) {

View File

@@ -33,6 +33,7 @@ import java.util.concurrent.atomic.AtomicLong;
import org.bson.BsonReader;
import org.bson.BsonTimestamp;
import org.bson.BsonUndefined;
import org.bson.BsonWriter;
import org.bson.Document;
import org.bson.codecs.Codec;
@@ -104,6 +105,7 @@ abstract class MongoConverters {
converters.add(BinaryToByteArrayConverter.INSTANCE);
converters.add(BsonTimestampToInstantConverter.INSTANCE);
converters.add(reading(BsonUndefined.class, Object.class, it -> null));
converters.add(reading(String.class, URI.class, URI::create).andWriting(URI::toString));
return converters;
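
With this converter registered, reading a document that carries a `BsonUndefined` value yields `null` instead of raising a `ConverterNotFoundException`. A small sketch mirroring the unit test further down, which assumes the mapped `Address` type with a `city` property:

[source,java]
----
org.bson.Document source = new org.bson.Document("s", "hallway drive")
		.append("city", new BsonUndefined());

Address address = converter.read(Address.class, source);
// address.city is now null rather than an error
----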

View File

@@ -31,7 +31,9 @@ import org.springframework.data.mapping.AssociationHandler;
import org.springframework.data.mapping.MappingException;
import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mapping.model.BasicPersistentEntity;
import org.springframework.data.mapping.model.EntityInstantiator;
import org.springframework.data.mongodb.MongoCollectionUtils;
import org.springframework.data.mongodb.core.mapping.MappingConfig.EntityConfig;
import org.springframework.data.mongodb.util.encryption.EncryptionUtils;
import org.springframework.data.spel.ExpressionDependencies;
import org.springframework.data.util.Lazy;
@@ -72,6 +74,11 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
private final @Nullable Expression collationExpression;
private final ShardKey shardKey;
private EntityConfig entityConfig;
public BasicMongoPersistentEntity(TypeInformation<T> typeInformation) {
this(typeInformation, null);
}
/**
* Creates a new {@link BasicMongoPersistentEntity} with the given {@link TypeInformation}. Will default the
@@ -79,12 +86,18 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
*
* @param typeInformation must not be {@literal null}.
*/
-public BasicMongoPersistentEntity(TypeInformation<T> typeInformation) {
+public BasicMongoPersistentEntity(TypeInformation<T> typeInformation, EntityConfig<T> config) {
super(typeInformation, MongoPersistentPropertyComparator.INSTANCE);
this.entityConfig = config;
Class<?> rawType = typeInformation.getType();
String fallback = MongoCollectionUtils.getPreferredCollectionName(rawType);
if (config != null) {
fallback = config.collectionNameOrDefault(() -> MongoCollectionUtils.getPreferredCollectionName(rawType));
}
if (this.isAnnotationPresent(Document.class)) {
Document document = this.getRequiredAnnotation(Document.class);
@@ -249,6 +262,12 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
Assert.notNull(property, "MongoPersistentProperty must not be null");
if (entityConfig != null) {
if (entityConfig.isIdProperty(property)) {
return property;
}
}
if (!property.isIdProperty()) {
return null;
}
@@ -340,6 +359,11 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
}
}
@Override
public EntityInstantiator getInstanceCreator() {
return this.entityConfig != null ? this.entityConfig.getInstantiator() : null;
}
@Override
public Collection<Object> getEncryptionKeyIds() {
@@ -398,9 +422,9 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
if (persistentProperty.isDbReference() && persistentProperty.getDBRef().lazy()) {
if (persistentProperty.isArray() || Modifier.isFinal(persistentProperty.getActualType().getModifiers())) {
-throw new MappingException(String.format(
-    "Invalid lazy DBRef property for %s; Found %s which must not be an array nor a final class",
-    persistentProperty.getField(), persistentProperty.getActualType()));
+throw new MappingException(
+    String.format("Invalid lazy DBRef property for %s; Found %s which must not be an array nor a final class",
+        persistentProperty.getField(), persistentProperty.getActualType()));
}
}
}

View File

@@ -34,6 +34,7 @@ import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.Property;
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.data.mongodb.core.mapping.MappingConfig.PropertyConfig;
import org.springframework.data.mongodb.util.encryption.EncryptionUtils;
import org.springframework.data.util.Lazy;
import org.springframework.expression.EvaluationContext;
@@ -73,6 +74,12 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
}
private final FieldNamingStrategy fieldNamingStrategy;
PropertyConfig<?, ?> propertyConfig;
public BasicMongoPersistentProperty(Property property, MongoPersistentEntity<?> owner,
SimpleTypeHolder simpleTypeHolder, @Nullable FieldNamingStrategy fieldNamingStrategy) {
this(property, owner, simpleTypeHolder, fieldNamingStrategy, null);
}
/**
* Creates a new {@link BasicMongoPersistentProperty}.
@@ -83,11 +90,12 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
* @param fieldNamingStrategy can be {@literal null}.
*/
public BasicMongoPersistentProperty(Property property, MongoPersistentEntity<?> owner,
-SimpleTypeHolder simpleTypeHolder, @Nullable FieldNamingStrategy fieldNamingStrategy) {
+SimpleTypeHolder simpleTypeHolder, @Nullable FieldNamingStrategy fieldNamingStrategy, @Nullable PropertyConfig<?,?> propertyConfig) {
super(property, owner, simpleTypeHolder);
this.fieldNamingStrategy = fieldNamingStrategy == null ? PropertyNameFieldNamingStrategy.INSTANCE
: fieldNamingStrategy;
this.propertyConfig = propertyConfig;
if (isIdProperty() && hasExplicitFieldName()) {
@@ -115,6 +123,10 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
return true;
}
if(propertyConfig != null && propertyConfig.isId()) {
return true;
}
// We need to support a wider range of ID types than just the ones that can be converted to an ObjectId
// but still we need to check if there happens to be an explicit name set
return SUPPORTED_ID_PROPERTY_NAMES.contains(getName()) && !hasExplicitFieldName();
@@ -132,6 +144,10 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
*/
public String getFieldName() {
if(propertyConfig != null && StringUtils.hasText(propertyConfig.getTargetName())) {
return propertyConfig.getTargetName();
}
if (isIdProperty()) {
if (getOwner().getIdProperty() == null) {

View File

@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.core.mapping;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.Property;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.data.mongodb.core.mapping.MappingConfig.PropertyConfig;
import org.springframework.lang.Nullable;
/**
@@ -47,8 +48,8 @@ public class CachingMongoPersistentProperty extends BasicMongoPersistentProperty
* @param fieldNamingStrategy can be {@literal null}.
*/
public CachingMongoPersistentProperty(Property property, MongoPersistentEntity<?> owner,
-SimpleTypeHolder simpleTypeHolder, @Nullable FieldNamingStrategy fieldNamingStrategy) {
-super(property, owner, simpleTypeHolder, fieldNamingStrategy);
+SimpleTypeHolder simpleTypeHolder, @Nullable FieldNamingStrategy fieldNamingStrategy, PropertyConfig config) {
+super(property, owner, simpleTypeHolder, fieldNamingStrategy, config);
}
@Override

View File

@@ -0,0 +1,238 @@
/*
* Copyright 2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
import java.lang.annotation.Annotation;
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;
import java.util.function.Consumer;
import java.util.function.Function;
import java.util.function.Supplier;
import org.springframework.data.mapping.Parameter;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PersistentProperty;
import org.springframework.data.mapping.SimplePropertyHandler;
import org.springframework.data.mapping.model.EntityInstantiator;
import org.springframework.data.mapping.model.ParameterValueProvider;
import org.springframework.data.mapping.model.PropertyValueProvider;
import org.springframework.data.util.Lazy;
import org.springframework.data.util.MethodInvocationRecorder;
import org.springframework.data.util.TypeInformation;
import org.springframework.lang.Nullable;
/**
* @author Christoph Strobl
* @since 2023/06
*/
public class MappingConfig {
private final Map<Class, EntityConfig<?>> entityConfigMap;
MappingConfig(Map<Class, EntityConfig<?>> entityConfigMap) {
this.entityConfigMap = entityConfigMap;
}
public static MappingConfig none() {
return new MappingConfig(Collections.emptyMap());
}
public static MappingConfig mappingRules(Consumer<MappingRuleCustomizer> customizer) {
MappingConfig mappingConfig = new MappingConfig(new HashMap<>());
customizer.accept(new MappingRuleCustomizer() {
@Override
public <T> MappingRuleCustomizer add(Class<T> type, Consumer<EntityConfig<T>> cfg) {
EntityConfig<T> entityConfig = (EntityConfig<T>) mappingConfig.entityConfigMap.computeIfAbsent(type,
(it) -> EntityConfig.configure(it));
cfg.accept(entityConfig);
return this;
}
});
return mappingConfig;
}
public interface MappingRuleCustomizer {
<T> MappingRuleCustomizer add(Class<T> type, Consumer<EntityConfig<T>> cfg);
}
@Nullable
public <T> EntityConfig<T> getEntityConfig(Class<T> type) {
return (EntityConfig<T>) entityConfigMap.get(type);
}
public static class EntityConfig<T> {
private final Class<T> type;
@Nullable private Supplier<String> collectionName;
Map<String, PropertyConfig<T, ?>> propertyConfigMap = new HashMap<>();
EntityInstantiator instantiator;
public EntityConfig(Class<T> type) {
this.type = type;
}
public static <T, P> EntityConfig<T> configure(Class<T> type) {
return new EntityConfig<>(type);
}
public <P> EntityConfig<T> define(String name, Consumer<PropertyConfig<T, P>> cfg) {
PropertyConfig<T, P> config = (PropertyConfig<T, P>) propertyConfigMap.computeIfAbsent(name,
(key) -> new PropertyConfig<>(this.type, key));
cfg.accept(config);
return this;
}
public <P> EntityConfig<T> define(Function<T, P> property, Consumer<PropertyConfig<T, P>> cfg) {
String propertyName = MethodInvocationRecorder.forProxyOf(type).record(property).getPropertyPath()
.orElseThrow(() -> new IllegalArgumentException("Cannot obtain property name"));
return define(propertyName, cfg);
}
public EntityConfig<T> namespace(String name) {
return namespace(() -> name);
}
public EntityConfig<T> namespace(Supplier<String> name) {
this.collectionName = name;
return this;
}
boolean isIdProperty(PersistentProperty<?> property) {
PropertyConfig<T, ?> propertyConfig = propertyConfigMap.get(property.getName());
if (propertyConfig == null) {
return false;
}
return propertyConfig.isId();
}
String collectionNameOrDefault(Supplier<String> fallback) {
return collectionName != null ? collectionName.get() : fallback.get();
}
public EntityInstantiator getInstantiator() {
return instantiator;
}
public EntityConfig<T> entityCreator(Function<Arguments<T>, T> createFunction) {
instantiator = new EntityInstantiator() {
@Override
public <T, E extends PersistentEntity<? extends T, P>, P extends PersistentProperty<P>> T createInstance(
E entity, ParameterValueProvider<P> provider) {
Map<String, Object> targetMap = new HashMap<>();
PropertyValueProvider pvv = provider instanceof PropertyValueProvider pvp ? pvp : new PropertyValueProvider<P>() {
@Nullable
@Override
public <T> T getPropertyValue(P property) {
Parameter parameter = new Parameter<>(property.getName(), (TypeInformation) property.getTypeInformation(),
new Annotation[] {}, null);
return (T) provider.getParameterValue(parameter);
}
};
entity.doWithProperties((SimplePropertyHandler) property -> {
targetMap.put(property.getName(), pvv.getPropertyValue(property));
});
return (T) createFunction.apply(new Arguments() {
private Map<Function, String> resolvedName = new HashMap<>();
@Override
public Object get(String arg) {
return targetMap.get(arg);
}
@Override
public Class getType() {
return entity.getType();
}
@Override
public Object get(Function property) {
String name = resolvedName.computeIfAbsent(property, key -> (String) MethodInvocationRecorder.forProxyOf(getType()).record(property).getPropertyPath().orElse(""));
return get(name);
}
});
}
};
return this;
}
public interface Arguments<T> {
<V> V get(String arg);
default <V> V get(Function<T, V> property) {
String propertyName = MethodInvocationRecorder.forProxyOf(getType()).record(property).getPropertyPath()
.orElseThrow(() -> new IllegalArgumentException("Cannot obtain property name"));
return get(propertyName);
}
Class<T> getType();
}
}
public static class PropertyConfig<T, P> {
private final Class<T> owingType;
private final String propertyName;
private String fieldName;
private boolean isId;
private boolean isTransient;
public PropertyConfig(Class<T> owingType, String propertyName) {
this.owingType = owingType;
this.propertyName = propertyName;
}
public PropertyConfig<T, P> useAsId() {
this.isId = true;
return this;
}
public boolean isId() {
return isId;
}
public PropertyConfig<T, P> setTransient() {
this.isTransient = true;
return this;
}
public PropertyConfig<T, P> mappedName(String fieldName) {
this.fieldName = fieldName;
return this;
}
public String getTargetName() {
return this.fieldName;
}
}
}
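
Putting the DSL together: collection name, creator function, and per-property rules can all be declared without touching the domain type. A minimal sketch, assuming a hypothetical `Person` class exposing `getEmail()` and `getFullName()`:

[source,java]
----
MappingConfig config = MappingConfig.mappingRules(rules -> rules.add(Person.class, cfg -> cfg
		.namespace("people") // target collection
		.entityCreator(args -> new Person(args.get(Person::getEmail))) // how to instantiate
		.define(Person::getEmail, PropertyConfig::useAsId) // maps to _id
		.define(Person::getFullName, property -> property.mappedName("full_name"))));
----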

View File

@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.core.mapping;
import java.util.AbstractMap;
import java.util.function.Consumer;
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
@@ -26,6 +27,7 @@ import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.Property;
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.data.mongodb.core.mapping.MappingConfig.MappingRuleCustomizer;
import org.springframework.data.util.NullableWrapperConverters;
import org.springframework.data.util.TypeInformation;
import org.springframework.lang.Nullable;
@@ -45,6 +47,7 @@ public class MongoMappingContext extends AbstractMappingContext<MongoPersistentE
private FieldNamingStrategy fieldNamingStrategy = DEFAULT_NAMING_STRATEGY;
private boolean autoIndexCreation = false;
private MappingConfig mappingConfig;
@Nullable
private ApplicationContext applicationContext;
@@ -67,6 +70,14 @@ public class MongoMappingContext extends AbstractMappingContext<MongoPersistentE
this.fieldNamingStrategy = fieldNamingStrategy == null ? DEFAULT_NAMING_STRATEGY : fieldNamingStrategy;
}
public void setMappingConfig(MappingConfig mappingConfig) {
this.mappingConfig = mappingConfig;
}
public void mappingRules(Consumer<MappingRuleCustomizer> customizer) {
setMappingConfig(MappingConfig.mappingRules(customizer));
}
@Override
protected boolean shouldCreatePersistentEntityFor(TypeInformation<?> type) {
@@ -80,12 +91,12 @@ public class MongoMappingContext extends AbstractMappingContext<MongoPersistentE
@Override
public MongoPersistentProperty createPersistentProperty(Property property, MongoPersistentEntity<?> owner,
SimpleTypeHolder simpleTypeHolder) {
-return new CachingMongoPersistentProperty(property, owner, simpleTypeHolder, fieldNamingStrategy);
+return new CachingMongoPersistentProperty(property, owner, simpleTypeHolder, fieldNamingStrategy, mappingConfig != null ? mappingConfig.getEntityConfig(owner.getType()).propertyConfigMap.get(property.getName()) : null);
}
@Override
protected <T> BasicMongoPersistentEntity<T> createPersistentEntity(TypeInformation<T> typeInformation) {
-return new BasicMongoPersistentEntity<>(typeInformation);
+return new BasicMongoPersistentEntity<>(typeInformation, mappingConfig != null ? mappingConfig.getEntityConfig(typeInformation.getType()) : null);
}
@Override

View File

@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.core.mapping;
import java.util.Collection;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.model.EntityInstantiator;
import org.springframework.data.mapping.model.MutablePersistentEntity;
import org.springframework.lang.Nullable;
@@ -111,4 +112,8 @@ public interface MongoPersistentEntity<T> extends MutablePersistentEntity<T, Mon
*/
@Nullable
Collection<Object> getEncryptionKeyIds();
default EntityInstantiator getInstanceCreator() {
return null;
}
}

View File

@@ -22,6 +22,7 @@ import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;
import java.util.function.Supplier;
import org.bson.Document;
import org.bson.types.ObjectId;
@@ -30,6 +31,7 @@ import org.springframework.data.mongodb.MongoDatabaseFactory;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.util.BsonUtils;
import org.springframework.data.util.Lazy;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
@@ -56,12 +58,14 @@ import com.mongodb.client.gridfs.model.GridFSUploadOptions;
*/
public class GridFsTemplate extends GridFsOperationsSupport implements GridFsOperations, ResourcePatternResolver {
-private final MongoDatabaseFactory dbFactory;
-private final @Nullable String bucket;
+private final Supplier<GridFSBucket> bucketSupplier;
/**
* Creates a new {@link GridFsTemplate} using the given {@link MongoDatabaseFactory} and {@link MongoConverter}.
* <p>
* Note that the {@link GridFSBucket} is obtained only once from {@link MongoDatabaseFactory#getMongoDatabase()
* MongoDatabase}. Use {@link #GridFsTemplate(MongoConverter, Supplier)} if you want to use different buckets from the
* same Template instance.
*
* @param dbFactory must not be {@literal null}.
* @param converter must not be {@literal null}.
@@ -72,26 +76,44 @@ public class GridFsTemplate extends GridFsOperationsSupport implements GridFsOpe
/**
* Creates a new {@link GridFsTemplate} using the given {@link MongoDatabaseFactory} and {@link MongoConverter}.
* <p>
* Note that the {@link GridFSBucket} is obtained only once from {@link MongoDatabaseFactory#getMongoDatabase()
* MongoDatabase}. Use {@link #GridFsTemplate(MongoConverter, Supplier)} if you want to use different buckets from the
* same Template instance.
*
* @param dbFactory must not be {@literal null}.
* @param converter must not be {@literal null}.
* @param bucket can be {@literal null}.
*/
public GridFsTemplate(MongoDatabaseFactory dbFactory, MongoConverter converter, @Nullable String bucket) {
this(converter, Lazy.of(() -> getGridFs(dbFactory, bucket)));
}
/**
* Creates a new {@link GridFsTemplate} using the given {@link MongoConverter} and {@link Supplier} providing the
* required {@link GridFSBucket}.
*
* @param converter must not be {@literal null}.
* @param gridFSBucket must not be {@literal null}.
* @since 4.2
*/
public GridFsTemplate(MongoConverter converter, Supplier<GridFSBucket> gridFSBucket) {
super(converter);
-Assert.notNull(dbFactory, "MongoDbFactory must not be null");
+Assert.notNull(gridFSBucket, "GridFSBucket supplier must not be null");
-this.dbFactory = dbFactory;
-this.bucket = bucket;
+this.bucketSupplier = gridFSBucket;
}
@Override
public ObjectId store(InputStream content, @Nullable String filename, @Nullable String contentType,
@Nullable Object metadata) {
return store(content, filename, contentType, toDocument(metadata));
}
@Override
@SuppressWarnings("unchecked")
public <T> T store(GridFsObject<T, InputStream> upload) {
GridFSUploadOptions uploadOptions = computeUploadOptionsFor(upload.getOptions().getContentType(),
@@ -110,6 +132,7 @@ public class GridFsTemplate extends GridFsOperationsSupport implements GridFsOpe
return upload.getFileId();
}
@Override
public GridFSFindIterable find(Query query) {
Assert.notNull(query, "Query must not be null");
@@ -130,10 +153,12 @@ public class GridFsTemplate extends GridFsOperationsSupport implements GridFsOpe
return iterable;
}
@Override
public GridFSFile findOne(Query query) {
return find(query).first();
}
@Override
public void delete(Query query) {
for (GridFSFile gridFSFile : find(query)) {
@@ -141,10 +166,12 @@ public class GridFsTemplate extends GridFsOperationsSupport implements GridFsOpe
}
}
@Override
public ClassLoader getClassLoader() {
-return dbFactory.getClass().getClassLoader();
+return null;
}
@Override
public GridFsResource getResource(String location) {
return Optional.ofNullable(findOne(query(whereFilename().is(location)))) //
@@ -152,6 +179,7 @@ public class GridFsTemplate extends GridFsOperationsSupport implements GridFsOpe
.orElseGet(() -> GridFsResource.absent(location));
}
@Override
public GridFsResource getResource(GridFSFile file) {
Assert.notNull(file, "GridFSFile must not be null");
@@ -159,6 +187,7 @@ public class GridFsTemplate extends GridFsOperationsSupport implements GridFsOpe
return new GridFsResource(file, getGridFs().openDownloadStream(file.getId()));
}
@Override
public GridFsResource[] getResources(String locationPattern) {
if (!StringUtils.hasText(locationPattern)) {
@@ -183,6 +212,12 @@ public class GridFsTemplate extends GridFsOperationsSupport implements GridFsOpe
}
private GridFSBucket getGridFs() {
return this.bucketSupplier.get();
}
private static GridFSBucket getGridFs(MongoDatabaseFactory dbFactory, @Nullable String bucket) {
Assert.notNull(dbFactory, "MongoDatabaseFactory must not be null");
MongoDatabase db = dbFactory.getMongoDatabase();
return bucket == null ? GridFSBuckets.create(db) : GridFSBuckets.create(db, bucket);
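
The net effect: a fully configured bucket can now be supplied from the outside and is no longer recreated per call. A brief sketch, where `mongoClient` and `converter` stand in for your configured beans:

[source,java]
----
MongoDatabase database = mongoClient.getDatabase("files");

GridFsTemplate template = new GridFsTemplate(converter,
		() -> GridFSBuckets.create(database, "uploads"));
----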

View File

@@ -27,7 +27,6 @@ import org.bson.BsonValue;
import org.bson.Document;
import org.bson.types.ObjectId;
import org.reactivestreams.Publisher;
import org.springframework.core.io.buffer.DataBuffer;
import org.springframework.core.io.buffer.DataBufferFactory;
import org.springframework.core.io.buffer.DefaultDataBufferFactory;
@@ -37,6 +36,7 @@ import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.data.mongodb.util.BsonUtils;
import org.springframework.data.util.Lazy;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
@@ -61,13 +61,17 @@ import com.mongodb.reactivestreams.client.gridfs.GridFSUploadPublisher;
*/
public class ReactiveGridFsTemplate extends GridFsOperationsSupport implements ReactiveGridFsOperations {
-private final ReactiveMongoDatabaseFactory dbFactory;
private final DataBufferFactory dataBufferFactory;
-private final @Nullable String bucket;
+private final Mono<GridFSBucket> bucketSupplier;
/**
* Creates a new {@link ReactiveGridFsTemplate} using the given {@link ReactiveMongoDatabaseFactory} and
* {@link MongoConverter}.
* <p>
* Note that the {@link GridFSBucket} is obtained only once from
* {@link ReactiveMongoDatabaseFactory#getMongoDatabase() MongoDatabase}. Use
* {@link #ReactiveGridFsTemplate(MongoConverter, Mono, DataBufferFactory)} if you want to use different buckets from
* the same Template instance.
*
* @param dbFactory must not be {@literal null}.
* @param converter must not be {@literal null}.
@@ -79,10 +83,15 @@ public class ReactiveGridFsTemplate extends GridFsOperationsSupport implements R
/**
* Creates a new {@link ReactiveGridFsTemplate} using the given {@link ReactiveMongoDatabaseFactory} and
* {@link MongoConverter}.
* <p>
* Note that the {@link GridFSBucket} is obtained only once from
* {@link ReactiveMongoDatabaseFactory#getMongoDatabase() MongoDatabase}. Use
* {@link #ReactiveGridFsTemplate(MongoConverter, Mono, DataBufferFactory)} if you want to use different buckets from
* the same Template instance.
*
* @param dbFactory must not be {@literal null}.
* @param converter must not be {@literal null}.
-* @param bucket
+* @param bucket can be {@literal null}.
*/
public ReactiveGridFsTemplate(ReactiveMongoDatabaseFactory dbFactory, MongoConverter converter,
@Nullable String bucket) {
@@ -92,23 +101,41 @@ public class ReactiveGridFsTemplate extends GridFsOperationsSupport implements R
/**
* Creates a new {@link ReactiveGridFsTemplate} using the given {@link DataBufferFactory},
* {@link ReactiveMongoDatabaseFactory} and {@link MongoConverter}.
* <p>
* Note that the {@link GridFSBucket} is obtained only once from
* {@link ReactiveMongoDatabaseFactory#getMongoDatabase() MongoDatabase}. Use
* {@link #ReactiveGridFsTemplate(MongoConverter, Mono, DataBufferFactory)} if you want to use different buckets from
* the same Template instance.
*
* @param dataBufferFactory must not be {@literal null}.
* @param dbFactory must not be {@literal null}.
* @param converter must not be {@literal null}.
-* @param bucket
+* @param bucket can be {@literal null}.
*/
public ReactiveGridFsTemplate(DataBufferFactory dataBufferFactory, ReactiveMongoDatabaseFactory dbFactory,
MongoConverter converter, @Nullable String bucket) {
this(converter, Mono.defer(Lazy.of(() -> doGetBucket(dbFactory, bucket))), dataBufferFactory);
}
/**
* Creates a new {@link ReactiveGridFsTemplate} using the given {@link MongoConverter}, {@link Mono} emitting a
* {@link ReactiveMongoDatabaseFactory} and {@link DataBufferFactory}.
*
* @param converter must not be {@literal null}.
* @param gridFSBucket must not be {@literal null}.
* @param dataBufferFactory must not be {@literal null}.
* @since 4.2
*/
public ReactiveGridFsTemplate(MongoConverter converter, Mono<GridFSBucket> gridFSBucket,
DataBufferFactory dataBufferFactory) {
super(converter);
+Assert.notNull(gridFSBucket, "GridFSBucket Mono must not be null");
+Assert.notNull(dataBufferFactory, "DataBufferFactory must not be null");
-Assert.notNull(dbFactory, "ReactiveMongoDatabaseFactory must not be null");
+this.bucketSupplier = gridFSBucket;
this.dataBufferFactory = dataBufferFactory;
-this.dbFactory = dbFactory;
-this.bucket = bucket;
}
@Override
@@ -117,6 +144,8 @@ public class ReactiveGridFsTemplate extends GridFsOperationsSupport implements R
return store(content, filename, contentType, toDocument(metadata));
}
@Override
@SuppressWarnings("unchecked")
public <T> Mono<T> store(GridFsObject<T, Publisher<DataBuffer>> upload) {
GridFSUploadOptions uploadOptions = computeUploadOptionsFor(upload.getOptions().getContentType(),
@@ -248,6 +277,13 @@ public class ReactiveGridFsTemplate extends GridFsOperationsSupport implements R
}
protected Mono<GridFSBucket> doGetBucket() {
return bucketSupplier;
}
private static Mono<GridFSBucket> doGetBucket(ReactiveMongoDatabaseFactory dbFactory, @Nullable String bucket) {
Assert.notNull(dbFactory, "ReactiveMongoDatabaseFactory must not be null");
return dbFactory.getMongoDatabase()
.map(db -> bucket == null ? GridFSBuckets.create(db) : GridFSBuckets.create(db, bucket));
}
@@ -274,6 +310,7 @@ public class ReactiveGridFsTemplate extends GridFsOperationsSupport implements R
this.sortObject = sortObject;
}
@Override
public GridFSFindPublisher doInBucket(GridFSBucket bucket) {
GridFSFindPublisher findPublisher = bucket.find(queryObject).sort(sortObject);
@@ -311,21 +348,8 @@ public class ReactiveGridFsTemplate extends GridFsOperationsSupport implements R
}
}
private static class UploadCallback implements ReactiveBucketCallback<Void> {
private final BsonValue fileId;
private final String filename;
private final Publisher<ByteBuffer> source;
private final GridFSUploadOptions uploadOptions;
public UploadCallback(BsonValue fileId, String filename, Publisher<ByteBuffer> source,
GridFSUploadOptions uploadOptions) {
this.fileId = fileId;
this.filename = filename;
this.source = source;
this.uploadOptions = uploadOptions;
}
private record UploadCallback(BsonValue fileId, String filename, Publisher<ByteBuffer> source,
GridFSUploadOptions uploadOptions) implements ReactiveBucketCallback<Void> {
@Override
public GridFSUploadPublisher<Void> doInBucket(GridFSBucket bucket) {
@@ -333,19 +357,8 @@ public class ReactiveGridFsTemplate extends GridFsOperationsSupport implements R
}
}
private static class AutoIdCreatingUploadCallback implements ReactiveBucketCallback<ObjectId> {
private final String filename;
private final Publisher<ByteBuffer> source;
private final GridFSUploadOptions uploadOptions;
public AutoIdCreatingUploadCallback(String filename, Publisher<ByteBuffer> source,
GridFSUploadOptions uploadOptions) {
this.filename = filename;
this.source = source;
this.uploadOptions = uploadOptions;
}
private record AutoIdCreatingUploadCallback(String filename, Publisher<ByteBuffer> source,
GridFSUploadOptions uploadOptions) implements ReactiveBucketCallback<ObjectId> {
@Override
public GridFSUploadPublisher<ObjectId> doInBucket(GridFSBucket bucket) {
@@ -353,13 +366,7 @@ public class ReactiveGridFsTemplate extends GridFsOperationsSupport implements R
}
}
private static class DeleteCallback implements ReactiveBucketCallback<Void> {
private final BsonValue id;
public DeleteCallback(BsonValue id) {
this.id = id;
}
private record DeleteCallback(BsonValue id) implements ReactiveBucketCallback<Void> {
@Override
public Publisher<Void> doInBucket(GridFSBucket bucket) {
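
As with the imperative template, the reactive variant now accepts an externally provided bucket, here as a `Mono`. A brief sketch, where `reactiveDbFactory` and `converter` stand in for your configured beans:

[source,java]
----
Mono<GridFSBucket> bucket = reactiveDbFactory.getMongoDatabase()
		.map(db -> GridFSBuckets.create(db, "uploads"));

ReactiveGridFsTemplate template = new ReactiveGridFsTemplate(converter, bucket,
		new DefaultDataBufferFactory());
----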

View File

@@ -20,6 +20,11 @@ import static org.springframework.test.util.ReflectionTestUtils.*;
import javax.net.ssl.SSLSocketFactory;
import java.util.function.Supplier;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.gridfs.GridFSBucket;
import com.mongodb.client.gridfs.model.GridFSFile;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
@@ -168,8 +173,12 @@ public class MongoNamespaceTests {
assertThat(ctx.containsBean("gridFsTemplate")).isTrue();
GridFsOperations operations = (GridFsOperations) ctx.getBean("gridFsTemplate");
-MongoDatabaseFactory dbf = (MongoDatabaseFactory) getField(operations, "dbFactory");
-assertThat(getField(dbf, "databaseName")).isEqualTo("database");
+Supplier<GridFSBucket> gridFSBucketSupplier = (Supplier<GridFSBucket>) getField(operations, "bucketSupplier");
+GridFSBucket gfsBucket = gridFSBucketSupplier.get();
+assertThat(gfsBucket.getBucketName()).isEqualTo("fs"); // fs is the default
+MongoCollection<GridFSFile> filesCollection = (MongoCollection<GridFSFile>) getField(gfsBucket, "filesCollection");
+assertThat(filesCollection.getNamespace().getDatabaseName()).isEqualTo("database");
MongoConverter converter = (MongoConverter) getField(operations, "converter");
assertThat(converter).isNotNull();
@@ -181,9 +190,12 @@ public class MongoNamespaceTests {
assertThat(ctx.containsBean("secondGridFsTemplate")).isTrue();
GridFsOperations operations = (GridFsOperations) ctx.getBean("secondGridFsTemplate");
-MongoDatabaseFactory dbf = (MongoDatabaseFactory) getField(operations, "dbFactory");
-assertThat(getField(dbf, "databaseName")).isEqualTo("database");
-assertThat(getField(operations, "bucket")).isEqualTo(null);
+Supplier<GridFSBucket> gridFSBucketSupplier = (Supplier<GridFSBucket>) getField(operations, "bucketSupplier");
+GridFSBucket gfsBucket = gridFSBucketSupplier.get();
+assertThat(gfsBucket.getBucketName()).isEqualTo("fs"); // fs is the default
+MongoCollection<GridFSFile> filesCollection = (MongoCollection<GridFSFile>) getField(gfsBucket, "filesCollection");
+assertThat(filesCollection.getNamespace().getDatabaseName()).isEqualTo("database");
MongoConverter converter = (MongoConverter) getField(operations, "converter");
assertThat(converter).isNotNull();
@@ -195,9 +207,12 @@ public class MongoNamespaceTests {
assertThat(ctx.containsBean("thirdGridFsTemplate")).isTrue();
GridFsOperations operations = (GridFsOperations) ctx.getBean("thirdGridFsTemplate");
-MongoDatabaseFactory dbf = (MongoDatabaseFactory) getField(operations, "dbFactory");
-assertThat(getField(dbf, "databaseName")).isEqualTo("database");
-assertThat(getField(operations, "bucket")).isEqualTo("bucketString");
+Supplier<GridFSBucket> gridFSBucketSupplier = (Supplier<GridFSBucket>) getField(operations, "bucketSupplier");
+GridFSBucket gfsBucket = gridFSBucketSupplier.get();
+assertThat(gfsBucket.getBucketName()).isEqualTo("bucketString"); // custom bucket name
+MongoCollection<GridFSFile> filesCollection = (MongoCollection<GridFSFile>) getField(gfsBucket, "filesCollection");
+assertThat(filesCollection.getNamespace().getDatabaseName()).isEqualTo("database");
MongoConverter converter = (MongoConverter) getField(operations, "converter");
assertThat(converter).isNotNull();

View File

@@ -35,6 +35,7 @@ import org.springframework.data.mongodb.core.convert.LazyLoadingTestUtils;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoId;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.test.util.MongoTemplateExtension;
import org.springframework.data.mongodb.test.util.MongoTestTemplate;
import org.springframework.data.mongodb.test.util.Template;
@@ -232,6 +233,53 @@ public class MongoTemplateDbRefTests {
assertThat(target.getValue()).containsExactlyInAnyOrder(one, two);
}
@Test // GH-2191
void shouldAllowToSliceCollectionOfDbRefs() {
JustSomeType one = new JustSomeType();
one.value = "one";
JustSomeType two = new JustSomeType();
two.value = "two";
template.insertAll(Arrays.asList(one, two));
WithCollectionDbRef source = new WithCollectionDbRef();
source.refs = Arrays.asList(one, two);
template.save(source);
Query theQuery = query(where("id").is(source.id));
theQuery.fields().slice("refs", 1, 1);
WithCollectionDbRef target = template.findOne(theQuery, WithCollectionDbRef.class);
assertThat(target.getRefs()).containsExactly(two);
}
@Test // GH-2191
void shouldAllowToSliceCollectionOfLazyDbRefs() {
JustSomeType one = new JustSomeType();
one.value = "one";
JustSomeType two = new JustSomeType();
two.value = "two";
template.insertAll(Arrays.asList(one, two));
WithCollectionDbRef source = new WithCollectionDbRef();
source.lazyrefs = Arrays.asList(one, two);
template.save(source);
Query theQuery = query(where("id").is(source.id));
theQuery.fields().slice("lazyrefs", 1, 1);
WithCollectionDbRef target = template.findOne(theQuery, WithCollectionDbRef.class);
LazyLoadingTestUtils.assertProxyIsResolved(target.lazyrefs, false);
assertThat(target.getLazyrefs()).containsExactly(two);
}
@Data
@Document("cycle-with-different-type-root")
static class RefCycleLoadingIntoDifferentTypeRoot {
@@ -264,6 +312,16 @@ public class MongoTemplateDbRefTests {
String value;
}
@Data
static class WithCollectionDbRef {
@Id String id;
@DBRef List<JustSomeType> refs;
@DBRef(lazy = true) List<JustSomeType> lazyrefs;
}
@Data
static class WithDBRefOnRawStringId {

View File

@@ -0,0 +1,97 @@
/*
* Copyright 2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.assertj.core.api.Assertions.*;
import static org.springframework.data.mongodb.core.mapping.MappingConfig.*;
import lombok.Data;
import java.util.List;
import org.bson.Document;
import org.junit.jupiter.api.Test;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import com.mongodb.client.MongoClients;
/**
* @author Christoph Strobl
* @since 2023/06
*/
public class MongoTemplateMappingConfigTests {
@Test
void testProgrammaticMetadata() {
SimpleMongoClientDatabaseFactory dbFactory = new SimpleMongoClientDatabaseFactory(MongoClients.create(),
"test-manual-config");
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.mappingRules(rules -> {
rules.add(Sample.class, cfg -> {
cfg.namespace("my-sample");
cfg.entityCreator(args -> {
return new Sample(args.get(Sample::getName));
});
cfg.define(Sample::getName, PropertyConfig::useAsId);
cfg.define(Sample::getValue, property -> property.mappedName("va-l-ue"));
});
});
mappingContext.afterPropertiesSet();
MappingMongoConverter mappingMongoConverter = new MappingMongoConverter(dbFactory, mappingContext);
mappingMongoConverter.afterPropertiesSet();
MongoTemplate template = new MongoTemplate(dbFactory, mappingMongoConverter);
template.dropCollection(Sample.class);
Sample sample = new Sample("s1");
sample.value = "val";
template.save(sample);
Document dbValue = template.execute("my-sample", collection -> {
return collection.find(new Document()).first();
});
System.out.println("dbValue: " + dbValue);
assertThat(dbValue).containsEntry("_id", sample.name).containsEntry("va-l-ue", sample.value);
List<Sample> entries = template.find(Query.query(Criteria.where("name").is(sample.name)), Sample.class);
entries.forEach(System.out::println);
assertThat(entries).containsExactly(sample);
}
@Data
@org.springframework.data.mongodb.core.mapping.Document(collection = "my-sample")
static class Sample {
Sample(String name) {
this.name = name;
}
@Id final String name;
@Field(name = "va-l-ue") String value;
}
}

View File

@@ -34,6 +34,7 @@ import java.time.ZoneOffset;
import java.time.temporal.ChronoUnit;
import java.util.*;
import org.bson.BsonUndefined;
import org.bson.types.Binary;
import org.bson.types.Code;
import org.bson.types.Decimal128;
@@ -2843,6 +2844,13 @@ class MappingMongoConverterUnitTests {
assertThat(converter.read(Address.class, source).city).isEqualTo("Gotham,Metropolis");
}
@Test // GH-2350
void shouldConvertBsonUndefinedToNull() {
org.bson.Document source = new org.bson.Document("s", "hallway drive").append("city", new BsonUndefined());
assertThat(converter.read(Address.class, source).city).isNull();
}
static class GenericType<T> {
T content;
}

View File

@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.core.mapping;
import static org.assertj.core.api.Assertions.*;
import static org.mockito.Mockito.*;
import static org.springframework.data.mongodb.core.mapping.MappingConfig.EntityConfig.*;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
@@ -26,18 +27,21 @@ import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;
import lombok.Data;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.Mock;
import org.mockito.junit.jupiter.MockitoExtension;
import org.springframework.context.ApplicationContext;
import org.springframework.core.annotation.AliasFor;
import org.springframework.data.mapping.MappingException;
import org.springframework.data.mongodb.core.mapping.MappingConfig.EntityConfig;
import org.springframework.data.mongodb.core.mapping.MappingConfig.PropertyConfig;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.spel.ExtensionAwareEvaluationContextProvider;
import org.springframework.data.spel.spi.EvaluationContextExtension;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;
/**
* Unit tests for {@link BasicMongoPersistentEntity}.
@@ -301,6 +305,31 @@ public class BasicMongoPersistentEntityUnitTests {
return new BasicMongoPersistentEntity<>(ClassTypeInformation.from(type));
}
@Data
class Sample {
String name;
String value;
}
@Test
void testProgrammaticMetadata() {
doReturn("value").when(propertyMock).getName();
EntityConfig<Sample> entityConfig = configure(Sample.class) //
.namespace("my-collection") //
.define(Sample::getValue, PropertyConfig::useAsId)
.define(Sample::getName, property -> property.mappedName("n-a-m-e"));
BasicMongoPersistentEntity<Sample> entity = new BasicMongoPersistentEntity<>(TypeInformation.of(Sample.class), entityConfig);
entity.addPersistentProperty(propertyMock);
MongoPersistentProperty idProperty = entity.getIdProperty();
assertThat(idProperty).isSameAs(propertyMock);
assertThat(entity.getCollection()).isEqualTo("my-collection");
}
@Document("contacts")
class Contact {}

View File

@@ -25,16 +25,16 @@ The following example shows how to use Change Streams with `MessageListener` ins
[source,java]
----
MessageListenerContainer container = new DefaultMessageListenerContainer(template);
-container.start(); <1>
+container.start(); <1>
-MessageListener<ChangeStreamDocument<Document>, User> listener = System.out::println; <2>
-ChangeStreamRequestOptions options = new ChangeStreamRequestOptions("user", ChangeStreamOptions.empty()); <3>
+MessageListener<ChangeStreamDocument<Document>, User> listener = System.out::println; <2>
+ChangeStreamRequestOptions options = new ChangeStreamRequestOptions("db", "user", ChangeStreamOptions.empty()); <3>
-Subscription subscription = container.register(new ChangeStreamRequest<>(listener, options), User.class); <4>
+Subscription subscription = container.register(new ChangeStreamRequest<>(listener, options), User.class); <4>
// ...
-container.stop(); <5>
+container.stop(); <5>
----
<1> Starting the container initializes the resources and starts `Task` instances for already registered `SubscriptionRequest` instances. Requests added after startup are run immediately.
<2> Define the listener called when a `Message` is received. The `Message#getBody()` is converted to the requested domain type. Use `Document` to receive raw results without conversion.

View File

@@ -3,7 +3,6 @@
MongoDB supports storing binary files inside its filesystem, GridFS. Spring Data MongoDB provides a `GridFsOperations` interface as well as the corresponding implementation, `GridFsTemplate`, to let you interact with the filesystem. You can set up a `GridFsTemplate` instance by handing it a `MongoDatabaseFactory` as well as a `MongoConverter`, as the following example shows:
====
.Java
[source,java,role="primary"]
@@ -82,7 +81,7 @@ class GridFsClient {
@Test
public void findFilesInGridFs() {
-GridFSFindIterable result = operations.find(query(whereFilename().is("filename.txt")))
+GridFSFindIterable result = operations.find(query(whereFilename().is("filename.txt")));
}
}
----
@@ -110,3 +109,7 @@ class GridFsClient {
====
`GridFsOperations` extends `ResourcePatternResolver` and lets the `GridFsTemplate` (for example) be plugged into an `ApplicationContext` to read Spring Config files from a MongoDB database.
NOTE: By default, `GridFsTemplate` obtains the `GridFSBucket` once upon the first GridFS interaction.
After that, the Template instance reuses the cached bucket.
To use different buckets from the same Template instance, use the constructor accepting `Supplier<GridFSBucket>`.
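
A bucket supplied from the outside might look as follows (a sketch; `mongoClient` and `converter` stand in for your configured beans):

[source,java]
----
GridFsTemplate template = new GridFsTemplate(converter,
		() -> GridFSBuckets.create(mongoClient.getDatabase("archive"), "uploads"));
----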