Is there a limit on how much data is returned by the API?

I’m issuing the following:

curl -k -s -u masteradm:nanortalinnguaq ''

{"results":[{"attrs":{"__name":" ...[VERY LONG STRING]..."check_comman

It returns a very long string that is truncated, as illustrated. I see the same in Python:

>>> r=requests.get(url,headers=hdr,auth=a)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Library/Python/2.7/site-packages/requests/", line 75, in get
    return request('get', url, params=params, **kwargs)
  File "/Library/Python/2.7/site-packages/requests/", line 60, in request
    return session.request(method=method, url=url, **kwargs)
  File "/Library/Python/2.7/site-packages/requests/", line 533, in request
    resp = self.send(prep, **send_kwargs)
  File "/Library/Python/2.7/site-packages/requests/", line 686, in send
  File "/Library/Python/2.7/site-packages/requests/", line 828, in content
    self._content = b''.join(self.iter_content(CONTENT_CHUNK_SIZE)) or b''
  File "/Library/Python/2.7/site-packages/requests/", line 753, in generate
    raise ChunkedEncodingError(e)
requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(8095 bytes read, 2145 more expected)', IncompleteRead(8095 bytes read, 2145 more expected))

But I'm reasonably certain this worked in the past (as recently as a few days ago). I do, admittedly, have a large number of service objects (~23,600) - is that enough to cause a problem?
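One way to keep the response size down with that many service objects is to request only the attributes you actually need. The sketch below is an assumption-laden illustration: the base URL and attribute names are placeholders (not from this thread), and it relies on the Icinga 2 API's `attrs` query parameter for `/v1/objects/services`.

```python
def build_query(base_url, attrs):
    """Build the URL and query params for an Icinga 2 services query
    limited to the given attributes (uses the API's 'attrs' parameter)."""
    params = [("attrs", a) for a in attrs]
    return base_url + "/v1/objects/services", params

# Placeholder host and attribute list, for illustration only.
url, params = build_query(
    "https://icinga.example.com:5665",
    ["display_name", "state", "last_check_result"],
)

# With the requests library installed, the call would then look like:
# resp = requests.get(url, params=params, auth=("masteradm", "..."),
#                     verify=False, headers={"Accept": "application/json"})
```

Trimming the attribute list won't fix a server-side disconnect, but it shrinks the payload substantially when you have tens of thousands of objects.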


Current installations have a bug that terminates the connection after an idle time. There's work underway in 2.11 to eliminate that problem completely - see
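Until that fix lands, a client-side retry can paper over the occasional dropped connection. A minimal sketch, with `fetch` standing in for whatever callable wraps your `requests.get(...)` call (in practice you'd narrow the `except` to `requests.exceptions.ChunkedEncodingError`):

```python
import time

def get_with_retry(fetch, attempts=3, delay=1.0):
    """Call fetch() and retry on connection-level failures.
    'fetch' is any zero-argument callable returning the response."""
    last_exc = None
    for _ in range(attempts):
        try:
            return fetch()
        except Exception as exc:  # narrow this to the specific error in real code
            last_exc = exc
            time.sleep(delay)
    raise last_exc
```

This is only a workaround; the connection can still drop mid-transfer, so the retry simply re-issues the whole request.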



OK - thanks for letting me know.