Data no longer visualised

WillemDH
Posts: 2320
Joined: Wed Mar 20, 2013 5:49 am
Location: Ghent

Data no longer visualised

Post by WillemDH »

Hello,

The graphs on our NLS nodes are no longer showing. The trend tables still seem to be working, but all the charts are down, even the 'Events over Time - last 1h' one on the homepage.
See the attached screenshot, taken with the browser inspector open.

Please advise how to solve this asap. I already tried rebooting a node, but that didn't help.

Tail of the httpd error log:

Code: Select all

 tail -n100 /var/log/httpd/error_log
[Sun Oct 09 03:11:02 2016] [notice] Digest: generating secret for digest authentication ...
[Sun Oct 09 03:11:02 2016] [notice] Digest: done
[Sun Oct 09 03:11:02 2016] [notice] Apache/2.2.15 (Unix) DAV/2 PHP/5.3.3 mod_ssl/2.2.15 OpenSSL/1.0.1e-fips configured -- resuming normal operations
[Tue Oct 11 15:13:20 2016] [notice] caught SIGTERM, shutting down
[Tue Oct 11 15:14:13 2016] [notice] suEXEC mechanism enabled (wrapper: /usr/sbin/suexec)
[Tue Oct 11 15:14:13 2016] [notice] Digest: generating secret for digest authentication ...
[Tue Oct 11 15:14:13 2016] [notice] Digest: done
[Tue Oct 11 15:14:13 2016] [notice] Apache/2.2.15 (Unix) DAV/2 PHP/5.3.3 mod_ssl/2.2.15 OpenSSL/1.0.1e-fips configured -- resuming normal operations
Tail of the Elasticsearch logs while refreshing the homepage:

Code: Select all

[2016-10-11 15:41:51,839][DEBUG][action.search.type       ] [c4d16075-9bc2-4095-9f00-e7de7f96930c] [logstash-2016.10.11][0], node[-zyhvoh3Tpi22pjDojF1xw], [P], s[STARTED]: Failed to execute [org.elasticsearch.action.search.SearchRequest@73549a7c]
org.elasticsearch.transport.RemoteTransportException: [95f9ab14-da22-4144-bb0b-6bbc5662115c][inet[/10.54.24.141:9300]][indices:data/read/search[phase/query]]
Caused by: org.elasticsearch.search.query.QueryPhaseExecutionException: [logstash-2016.10.11][0]: query[ConstantScore(*:*)],from[0],size[0]: Query Failed [Failed to execute global facets]
        at org.elasticsearch.search.facet.FacetPhase.execute(FacetPhase.java:193)
        at org.elasticsearch.search.query.QueryPhase.execute(QueryPhase.java:171)
        at org.elasticsearch.search.SearchService.loadOrExecuteQueryPhase(SearchService.java:289)
        at org.elasticsearch.search.SearchService.executeQueryPhase(SearchService.java:300)
        at org.elasticsearch.search.action.SearchServiceTransportAction$SearchQueryTransportHandler.messageReceived(SearchServiceTransportAction.java:776)
        at org.elasticsearch.search.action.SearchServiceTransportAction$SearchQueryTransportHandler.messageReceived(SearchServiceTransportAction.java:767)
        at org.elasticsearch.transport.netty.MessageChannelHandler$RequestHandler.doRun(MessageChannelHandler.java:279)
        at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:36)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.elasticsearch.ElasticsearchException: org.elasticsearch.common.breaker.CircuitBreakingException: [FIELDDATA] Data too large, data for [@timestamp] would be larger than limit of [6411072307/5.9gb]
        at org.elasticsearch.index.fielddata.plain.AbstractIndexFieldData.load(AbstractIndexFieldData.java:80)
        at org.elasticsearch.search.facet.datehistogram.CountDateHistogramFacetExecutor$Collector.setNextReader(CountDateHistogramFacetExecutor.java:88)
        at org.elasticsearch.common.lucene.search.FilteredCollector.setNextReader(FilteredCollector.java:67)
        at org.apache.lucene.search.IndexSearcher.search(IndexSearcher.java:612)
        at org.elasticsearch.search.internal.ContextIndexSearcher.search(ContextIndexSearcher.java:191)
        at org.apache.lucene.search.IndexSearcher.search(IndexSearcher.java:309)
        at org.elasticsearch.search.facet.FacetPhase.execute(FacetPhase.java:186)
        ... 10 more
Caused by: org.elasticsearch.common.util.concurrent.UncheckedExecutionException: org.elasticsearch.common.breaker.CircuitBreakingException: [FIELDDATA] Data too large, data for [@timestamp] would be larger than limit of [6411072307/5.9gb]
        at org.elasticsearch.common.cache.LocalCache$Segment.get(LocalCache.java:2203)
        at org.elasticsearch.common.cache.LocalCache.get(LocalCache.java:3937)
        at org.elasticsearch.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4739)
        at org.elasticsearch.indices.fielddata.cache.IndicesFieldDataCache$IndexFieldCache.load(IndicesFieldDataCache.java:167)
        at org.elasticsearch.index.fielddata.plain.AbstractIndexFieldData.load(AbstractIndexFieldData.java:74)
        ... 16 more
Caused by: org.elasticsearch.common.breaker.CircuitBreakingException: [FIELDDATA] Data too large, data for [@timestamp] would be larger than limit of [6411072307/5.9gb]
        at org.elasticsearch.common.breaker.ChildMemoryCircuitBreaker.circuitBreak(ChildMemoryCircuitBreaker.java:97)
        at org.elasticsearch.common.breaker.ChildMemoryCircuitBreaker.addEstimateBytesAndMaybeBreak(ChildMemoryCircuitBreaker.java:148)
        at org.elasticsearch.index.fielddata.RamAccountingTermsEnum.flush(RamAccountingTermsEnum.java:71)
        at org.elasticsearch.index.fielddata.RamAccountingTermsEnum.next(RamAccountingTermsEnum.java:89)
        at org.elasticsearch.index.fielddata.ordinals.OrdinalsBuilder$3.next(OrdinalsBuilder.java:472)
        at org.elasticsearch.index.fielddata.plain.PackedArrayIndexFieldData.loadDirect(PackedArrayIndexFieldData.java:109)
        at org.elasticsearch.index.fielddata.plain.PackedArrayIndexFieldData.loadDirect(PackedArrayIndexFieldData.java:49)
        at org.elasticsearch.indices.fielddata.cache.IndicesFieldDataCache$IndexFieldCache$1.call(IndicesFieldDataCache.java:180)
        at org.elasticsearch.indices.fielddata.cache.IndicesFieldDataCache$IndexFieldCache$1.call(IndicesFieldDataCache.java:167)
        at org.elasticsearch.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4742)
        at org.elasticsearch.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527)
        at org.elasticsearch.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319)
        at org.elasticsearch.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282)
        at org.elasticsearch.common.cache.LocalCache$Segment.get(LocalCache.java:2197)
        ... 20 more
[2016-10-11 15:41:51,839][DEBUG][action.search.type       ] [c4d16075-9bc2-4095-9f00-e7de7f96930c] All shards failed for phase: [query]
org.elasticsearch.transport.RemoteTransportException: [95f9ab14-da22-4144-bb0b-6bbc5662115c][inet[/10.54.24.141:9300]][indices:data/read/search[phase/query]]
Caused by: org.elasticsearch.search.query.QueryPhaseExecutionException: [logstash-2016.10.11][0]: query[ConstantScore(*:*)],from[0],size[0]: Query Failed [Failed to execute global facets]
        at org.elasticsearch.search.facet.FacetPhase.execute(FacetPhase.java:193)
        at org.elasticsearch.search.query.QueryPhase.execute(QueryPhase.java:171)
        at org.elasticsearch.search.SearchService.loadOrExecuteQueryPhase(SearchService.java:289)
        at org.elasticsearch.search.SearchService.executeQueryPhase(SearchService.java:300)
        at org.elasticsearch.search.action.SearchServiceTransportAction$SearchQueryTransportHandler.messageReceived(SearchServiceTransportAction.java:776)
        at org.elasticsearch.search.action.SearchServiceTransportAction$SearchQueryTransportHandler.messageReceived(SearchServiceTransportAction.java:767)
        at org.elasticsearch.transport.netty.MessageChannelHandler$RequestHandler.doRun(MessageChannelHandler.java:279)
        at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:36)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.elasticsearch.ElasticsearchException: org.elasticsearch.common.breaker.CircuitBreakingException: [FIELDDATA] Data too large, data for [@timestamp] would be larger than limit of [6411072307/5.9gb]
        at org.elasticsearch.index.fielddata.plain.AbstractIndexFieldData.load(AbstractIndexFieldData.java:80)
        at org.elasticsearch.search.facet.datehistogram.CountDateHistogramFacetExecutor$Collector.setNextReader(CountDateHistogramFacetExecutor.java:88)
        at org.elasticsearch.common.lucene.search.FilteredCollector.setNextReader(FilteredCollector.java:67)
        at org.apache.lucene.search.IndexSearcher.search(IndexSearcher.java:612)
        at org.elasticsearch.search.internal.ContextIndexSearcher.search(ContextIndexSearcher.java:191)
        at org.apache.lucene.search.IndexSearcher.search(IndexSearcher.java:309)
        at org.elasticsearch.search.facet.FacetPhase.execute(FacetPhase.java:186)
        ... 10 more
Caused by: org.elasticsearch.common.util.concurrent.UncheckedExecutionException: org.elasticsearch.common.breaker.CircuitBreakingException: [FIELDDATA] Data too large, data for [@timestamp] would be larger than limit of [6411072307/5.9gb]
        at org.elasticsearch.common.cache.LocalCache$Segment.get(LocalCache.java:2203)
        at org.elasticsearch.common.cache.LocalCache.get(LocalCache.java:3937)
        at org.elasticsearch.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4739)
        at org.elasticsearch.indices.fielddata.cache.IndicesFieldDataCache$IndexFieldCache.load(IndicesFieldDataCache.java:167)
        at org.elasticsearch.index.fielddata.plain.AbstractIndexFieldData.load(AbstractIndexFieldData.java:74)
        ... 16 more
Caused by: org.elasticsearch.common.breaker.CircuitBreakingException: [FIELDDATA] Data too large, data for [@timestamp] would be larger than limit of [6411072307/5.9gb]
        at org.elasticsearch.common.breaker.ChildMemoryCircuitBreaker.circuitBreak(ChildMemoryCircuitBreaker.java:97)
        at org.elasticsearch.common.breaker.ChildMemoryCircuitBreaker.addEstimateBytesAndMaybeBreak(ChildMemoryCircuitBreaker.java:148)
        at org.elasticsearch.index.fielddata.RamAccountingTermsEnum.flush(RamAccountingTermsEnum.java:71)
        at org.elasticsearch.index.fielddata.RamAccountingTermsEnum.next(RamAccountingTermsEnum.java:89)
        at org.elasticsearch.index.fielddata.ordinals.OrdinalsBuilder$3.next(OrdinalsBuilder.java:472)
        at org.elasticsearch.index.fielddata.plain.PackedArrayIndexFieldData.loadDirect(PackedArrayIndexFieldData.java:109)
        at org.elasticsearch.index.fielddata.plain.PackedArrayIndexFieldData.loadDirect(PackedArrayIndexFieldData.java:49)
        at org.elasticsearch.indices.fielddata.cache.IndicesFieldDataCache$IndexFieldCache$1.call(IndicesFieldDataCache.java:180)
        at org.elasticsearch.indices.fielddata.cache.IndicesFieldDataCache$IndexFieldCache$1.call(IndicesFieldDataCache.java:167)
        at org.elasticsearch.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4742)
        at org.elasticsearch.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527)
        at org.elasticsearch.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319)
        at org.elasticsearch.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282)
        at org.elasticsearch.common.cache.LocalCache$Segment.get(LocalCache.java:2197)
        ... 20 more
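The breaker trips because loading fielddata for @timestamp would push the node past the fielddata circuit-breaker limit (the 5.9gb in the message, which on this Elasticsearch version defaults to 60% of the JVM heap). For reference, the current per-node fielddata footprint can be checked with the node stats API; this is just a sketch assuming the default port 9200 on localhost:

Code: Select all

curl -XGET 'http://localhost:9200/_nodes/stats/indices/fielddata?fields=@timestamp&pretty'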
Willem
Nagios XI 5.8.1
https://outsideit.net
WillemDH
Posts: 2320
Joined: Wed Mar 20, 2013 5:49 am
Location: Ghent

Re: Data no longer visualised

Post by WillemDH »

Managed to solve the problem by clearing the fielddata cache with this command:

Code: Select all

curl -XPOST 'http://localhost:9200/_cache/clear' -d '{ "fielddata": "true" }'
The error itself points to a RAM shortage for fielddata. Let's hope it doesn't come back soon. Please leave this thread open for a few weeks. grtz
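To make a recurrence less likely, the fielddata cache can be capped so entries get evicted before the circuit breaker limit is reached. A sketch of the setting, assuming a standard elasticsearch.yml (the path may differ on a Nagios Log Server install, the 40% value is only an example, and the elasticsearch service needs a restart afterwards):

Code: Select all

# elasticsearch.yml -- path may differ on a Nagios Log Server install
# Evict fielddata entries once the cache reaches 40% of the JVM heap,
# so the [FIELDDATA] circuit breaker limit is not hit.
indices.fielddata.cache.size: 40%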
Last edited by WillemDH on Tue Oct 11, 2016 1:15 pm, edited 1 time in total.
Nagios XI 5.8.1
https://outsideit.net
tmcdonald
Posts: 9117
Joined: Mon Sep 23, 2013 8:40 am

Re: Data no longer visualised

Post by tmcdonald »

Can do, will do.
Former Nagios employee