
When I perform AQL queries, I run into this error:

Query: AQL: query would use more memory than allowed (while executing)

My dataset: 9 billion vertices, 19 billion edges

My cluster has three nodes, running Centos 7 with 32 GB RAM

I just hope the query can run without this error being raised. Which configuration files can I modify to achieve this? I am new to ArangoDB.

  • Welcome to SO! Please read how to ask. Commented Dec 6, 2017 at 2:53

1 Answer


Some back-of-the-envelope calculations suggest you will need to use the RocksDB engine:

(100 bytes/record × 28 × 10^9 records) / (3 nodes × 32 × 10^9 bytes of RAM) ≈ 29.17, i.e. the dataset is roughly 29 times the cluster's combined RAM.

See https://www.arangodb.com/why-arangodb/comparing-rocksdb-mmfiles-storage-engines/
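The ratio above can be reproduced with a quick sketch. The 100 bytes/record figure is an assumed average record size, not a measured value, as noted in the comments below:

```python
# Back-of-the-envelope: dataset size vs. total cluster RAM.
vertices = 9 * 10**9
edges = 19 * 10**9
bytes_per_record = 100          # assumed average size of a vertex/edge record
nodes = 3
ram_per_node = 32 * 10**9       # 32 GB of RAM per node

dataset_bytes = bytes_per_record * (vertices + edges)
cluster_ram = nodes * ram_per_node

ratio = dataset_bytes / cluster_ram
print(f"dataset is {ratio:.2f}x the cluster RAM")  # ~29.17x
```

Since the data is nearly 30 times larger than available memory, a memory-mapped-files engine cannot hold the working set, which is why the disk-backed RocksDB engine is the practical choice here.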


4 Comments

I am already using RocksDB. Can you tell me how you calculated this? Thank you!
I just used your numbers and guessed an average for the number of bytes per (V+E). If you are already using RocksDB and have enough disk space, then perhaps the cause is that ArangoDB has some memory requirements regarding the indexes it maintains. What indexes have you defined? It sounds like maybe you will have to reach out to ArangoDB more directly, e.g. at github.com/arangodb/arangodb/issues
I don't have any custom indexes, only ArangoDB's default ones. Yes, my disk space is sufficient. I've already raised the question on GitHub, but the development team hasn't resolved it.
Maybe the default indices require a lot of RAM? Have you looked at github.com/arangodb/arangodb/issues/3806? What is your issue #?
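On the configuration-file part of the question: the storage engine and the per-query memory cap are both set in arangod.conf. A sketch, assuming ArangoDB 3.x option names (verify against the documentation for your version before applying):

```ini
# arangod.conf (typically /etc/arangodb3/arangod.conf)

[server]
# RocksDB keeps the working set on disk instead of memory-mapping everything
storage-engine = rocksdb

[query]
# Per-AQL-query memory cap in bytes; 0 disables the limit entirely
memory-limit = 0
```

Note that the storage engine is fixed when the server first initializes its database directory; switching an existing deployment requires a dump/restore migration.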
