New Participant
December 2, 2022
Solved

500 error in querybuilder API request.

  • December 2, 2022
  • 2 replies
  • 925 views

Hi

We are getting the below error in a QueryBuilder API request.

<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html>
<head>
<title>500 The query read or traversed more than 100000 nodes. To avoid affecting other tasks, processing was stopped.</title>
</head>
<body>
<h1>The query read or traversed more than 100000 nodes. To avoid affecting other tasks, processing was stopped.</h1>
<p>Cannot serve request to /bin/querybuilder.json on this server</p>
<hr>
<address>ApacheSling/2.6 (jetty/9.4.15.v20190215, OpenJDK 64-Bit Server VM 11.0.16, Linux 5.10.135-122.509.amzn2.x86_64 amd64)</address>
</body>
</html>

We have done some analysis and found that this issue relates to an "in memory" / read limit that can be configured through configMgr.

Please let us know how this limit works and what the maximum value is that we can safely set.
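For reference, this error message matches Oak's query read limit, which defaults to 100,000 nodes. In AEM it can be adjusted via the OSGi Config Manager under the QueryEngineSettings service. A sketch of what that configuration looks like (the values below are illustrative, not recommendations; raising these limits lets slow, traversing queries run longer and can degrade the whole instance, which is why fixing the query/index is the preferred route):

```
# Hypothetical OSGi config file, e.g.
# org.apache.jackrabbit.oak.query.QueryEngineSettingsService.cfg.json
{
  "queryLimitInMemory": 500000,   // max nodes held in memory for sorting/aggregation
  "queryLimitReads": 100000,      // max nodes a single query may read/traverse (the limit in the error)
  "failTraversal": true           // fail queries that fall back to full traversal
}
```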

Best answer by VeenaK


2 replies

VeenaK (Accepted solution)
New Participant
December 2, 2022

Hi @akshaybhujbale

The problem is with the query and the index it is using. Before proceeding further, as mentioned by @lokesh_vajrala, try the Explain Query tool and check which part of the query is slow. Add indexes for those predicate filters, and make sure proper node types are used in the query rather than generic types like "nt:unstructured".

Hope this helps!
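To illustrate the advice above, here is a small sketch of a QueryBuilder parameter set that follows it: a concrete node type (cq:Page) instead of nt:unstructured, a path restriction, a filter on a property you would index, and paging instead of fetching everything. The path, property, and value are hypothetical placeholders, not from the original post.

```python
from urllib.parse import urlencode

# Hypothetical QueryBuilder parameters (POSTed or appended to
# /bin/querybuilder.json). Names marked "hypothetical" are examples only.
params = {
    "path": "/content/my-site",                    # hypothetical content root
    "type": "cq:Page",                             # specific type, not nt:unstructured
    "property": "jcr:content/sling:resourceType",  # predicate you would index
    "property.value": "my-site/components/page",   # hypothetical value
    "p.limit": "20",                               # page results instead of reading all matches
    "p.guessTotal": "true",                        # avoid counting every result node
}

query_string = urlencode(params)
print(query_string)
```

Constraining by a specific type and an indexed property lets Oak pick a property/Lucene index instead of traversing the repository, which is exactly what trips the 100,000-node read limit.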

New Participant
December 5, 2022

Hi @lokesh_vajrala @veenak

Can we set up a call to discuss this further, so that I can get a clearer idea?

Let me know if that is possible.

Lokesh_Vajrala
New Participant
December 2, 2022

@akshaybhujbale It will help if you create a new index that matches the query, or optimize the current query to use existing indexes. Increasing the limits may negatively impact the system's performance.

You can use the Explain Query tool to understand the query performance, and the Oak Index Definition Generator to generate an index definition.
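For readers unfamiliar with what such a definition looks like, here is a rough sketch of a Lucene property index under /oak:index, similar in shape to what the Oak Index Definition Generator emits. The index name, node type, and property are hypothetical and should match the actual query's predicates:

```
/oak:index/mySiteIndex                 <- hypothetical index name
  - jcr:primaryType = oak:QueryIndexDefinition
  - type = "lucene"
  - async = "async"
  - compatVersion = 2
  + indexRules
    + cq:Page                          <- the specific node type used in the query
      + properties
        + resourceType
          - name = "jcr:content/sling:resourceType"   <- hypothetical indexed predicate
          - propertyIndex = true
```

After deploying an index like this, re-run the query through the Explain Query tool to confirm it is actually selected instead of a traversal.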