
I am trying to write some integration tests for methods that need to extract data from MongoDB. In detail, I am using the embedded Mongo provided by the Spring Data project; the embedded Mongo itself is supplied by Flapdoodle.

I need to import some JSON files into the embedded Mongo. I have looked at the tests provided with Flapdoodle, but I am not able to understand how they integrate with the magic provided by Spring Data + Spring Boot.

Can anyone post some clarifying snippets?

3 Answers

You can create a JUnit rule (ExternalResource) which runs before and after each test. Check the MongoEmbeddedRule class below to get an idea of the implementation details.

Integration test:

@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = RANDOM_PORT)
public abstract class TestRunner {

    @Autowired
    protected MongoTemplate mongoTemplate;

    @Rule
    public MongoEmbeddedRule mongoEmbeddedRule = new MongoEmbeddedRule(this);
}

ExternalResource Rule:

public class MongoEmbeddedRule extends ExternalResource {

    private final Object testClassInstance;
    private final Map<String, Path> mongoCollectionDataPaths;
    private final String fieldName;
    private final String getterName;

    public MongoEmbeddedRule(final Object testClassInstance) {
        this(testClassInstance, "mongoTemplate", "getMongoTemplate");
    }

    protected MongoEmbeddedRule(final Object testClassInstance, final String fieldName, final String getterName) {
        this.fieldName = fieldName;
        this.getterName = getterName;
        this.testClassInstance = testClassInstance;
        this.mongoCollectionDataPaths = mongoExtendedJsonFilesLookup();
    }

    @Override
    protected void before() {
        dropCollections();
        createAndPopulateCollections();
    }

    @Override
    protected void after() {
    }

    protected Set<String> getMongoCollectionNames() {
        return mongoCollectionDataPaths.keySet();
    }

    public void dropCollections() {
        getMongoCollectionNames().forEach(collectionName -> getMongoTemplate().dropCollection(collectionName));
    }

    protected void createAndPopulateCollections() {
        mongoCollectionDataPaths.forEach((key, value) -> insertDocumentsFromMongoExtendedJsonFile(value, key));
    }

    protected MongoTemplate getMongoTemplate() {
        try {
            Object value = ReflectionTestUtils.getField(testClassInstance, fieldName);
            if (value instanceof MongoTemplate) {
                return (MongoTemplate) value;
            }
            value = ReflectionTestUtils.invokeGetterMethod(testClassInstance, getterName);
            if (value instanceof MongoTemplate) {
                return (MongoTemplate) value;
            }
        } catch (final IllegalArgumentException e) {
            // throw exception with dedicated message at the end
        }
        throw new IllegalArgumentException(
                String.format(
                        "%s expects either field '%s' or method '%s' in order to access the required MongoTemmplate",
                        this.getClass().getSimpleName(), fieldName, getterName));
    }

    private Map<String, Path> mongoExtendedJsonFilesLookup() {
        Map<String, Path> collections = new HashMap<>();
        try {
            Files.walk(Paths.get("src","test","resources","mongo"))
                    .filter(Files::isRegularFile)
                    .forEach(filePath -> collections.put(
                            filePath.getFileName().toString().replace(".json", ""),
                            filePath));
        } catch (IOException e) {
            e.printStackTrace();
        }
        return collections;
    }

    private void insertDocumentsFromMongoExtendedJsonFile(Path path, String collectionName) {
        try {
            List<Document> documents = new ArrayList<>();
            Files.readAllLines(path).forEach(l -> documents.add(Document.parse(l)));
            getMongoTemplate().getCollection(collectionName).insertMany(documents);
            System.out.println(documents.size() + " documents loaded for " + collectionName + " collection.");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

JSON file (names.json) with MongoDB Extended JSON, where every document is on one line and the collection name is the filename without the extension:

{ "_id" : ObjectId("594d324d5b49b78da8ce2f28"), "someId" : NumberLong(1), "name" : "Some Name 1", "lastModified" : ISODate("1970-01-01T00:00:00Z")}
{ "_id" : ObjectId("594d324d5b49b78da8ce2f29"), "someId" : NumberLong(2), "name" : "Some Name 2", "lastModified" : ISODate("1970-01-01T00:00:00Z")}

You can have a look at the following test class provided by Flapdoodle. The test shows how to import a JSON file containing the collection dataset: MongoImportExecutableTest.java

You could theoretically also restore a whole database dump (using mongorestore): MongoRestoreExecutableTest.java

2 Comments

Thanks. However, I asked for an example that also uses Spring Boot :)
Hi. I think you could run the data import once Spring Boot has started. See here for an example: stackoverflow.com/questions/27405713/…
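
To illustrate that comment: a rough sketch of loading data once Spring Boot has started could be a CommandLineRunner that reuses the auto-configured MongoTemplate. The configuration class, file path and collection name below are assumptions for illustration, and the class would still need to be pulled into the test context (e.g. via @Import):

import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

import org.bson.Document;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.test.context.TestConfiguration;
import org.springframework.context.annotation.Bean;
import org.springframework.data.mongodb.core.MongoTemplate;

@TestConfiguration
public class MongoTestDataLoader {

    // Runs after the context (and the embedded Mongo) is up and inserts
    // every line of the file as one document into the "names" collection.
    @Bean
    CommandLineRunner importNames(MongoTemplate mongoTemplate) {
        return args -> {
            List<Document> documents = new ArrayList<>();
            Files.readAllLines(Paths.get("src", "test", "resources", "mongo", "names.json"))
                 .forEach(line -> documents.add(Document.parse(line)));
            mongoTemplate.getCollection("names").insertMany(documents);
        };
    }
}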

You can create an abstract class with setup logic that starts the mongod and mongoimport processes.

AbstractMongoDBTest.java

public abstract class AbstractMongoDBTest {

    private MongodProcess mongodProcess;
    private MongoImportProcess mongoImportProcess;
    private MongoTemplate mongoTemplate;

    void setup(String dbName, String collection, String jsonFile) throws Exception {
        String ip = "localhost";
        int port = 12345;

        IMongodConfig mongodConfig = new MongodConfigBuilder().version(Version.Main.PRODUCTION)
                .net(new Net(ip, port, Network.localhostIsIPv6()))
                .build();

        MongodStarter starter = MongodStarter.getDefaultInstance();
        MongodExecutable mongodExecutable = starter.prepare(mongodConfig);

        File dataFile = new File(Thread.currentThread().getContextClassLoader().getResource(jsonFile).getFile());
        MongoImportExecutable mongoImportExecutable = mongoImportExecutable(port, dbName,
                collection, dataFile.getAbsolutePath(),
                true, true, true);

        // start mongod first, then run mongoimport against it
        mongodProcess = mongodExecutable.start();
        mongoImportProcess = mongoImportExecutable.start();

        mongoTemplate = new MongoTemplate(new MongoClient(ip, port), dbName);
    }

    private MongoImportExecutable mongoImportExecutable(int port, String dbName, String collection, String jsonFile,
                                                        Boolean jsonArray, Boolean upsert, Boolean drop) throws IOException {
        IMongoImportConfig mongoImportConfig = new MongoImportConfigBuilder()
                .version(Version.Main.PRODUCTION)
                .net(new Net(port, Network.localhostIsIPv6()))
                .db(dbName)
                .collection(collection)
                .upsert(upsert)
                .dropCollection(drop)
                .jsonArray(jsonArray)
                .importFile(jsonFile)
                .build();

        return MongoImportStarter.getDefaultInstance().prepare(mongoImportConfig);
    }

    @AfterEach
    void clean() {
        mongoImportProcess.stop();
        mongodProcess.stop();
    }

    public MongoTemplate getMongoTemplate() {
        return mongoTemplate;
    }
}

YourTestClass.java

public class YourTestClass extends AbstractMongoDBTest {

    @BeforeEach
    void setup() throws Exception {
        super.setup("db", "collection", "jsonfile");
    }

    @Test
    void test() throws Exception {

    }
}
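
As a rough illustration of what the empty test method might verify (the db, collection and file names are placeholders, not taken from the answer):

import static org.junit.jupiter.api.Assertions.assertFalse;

import java.util.List;

import org.bson.Document;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;

public class PersonImportIT extends AbstractMongoDBTest {

    @BeforeEach
    void setup() throws Exception {
        super.setup("testdb", "persons", "persons.json");
    }

    @Test
    void importedDocumentsAreQueryable() {
        // mongoimport ran during setup, so the collection should not be empty
        List<Document> docs = getMongoTemplate().findAll(Document.class, "persons");
        assertFalse(docs.isEmpty());
    }
}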

1 Comment

I really like your example. But I don't understand how I can autowire (@Autowired) a repository and load the data from the database. Could you add this to your solution?
