Occasionally my Apache Solr instance will just go down, and this is spammed in the log files:
Sep 24, 2013 1:00:21 AM org.apache.solr.common.SolrException log
SEVERE: java.lang.OutOfMemoryError: Java heap space
The dedicated server my Solr install is on has plenty of free RAM (16 GB), so what could be causing this? I've tried some Google-fu but found no solid answer as to why it happens, or how to fix it beyond using -Xmx512m to raise the allowed heap size, which I don't think is the ideal approach. I could be wrong.
I'm using Jetty with my Solr install. Is there a way to tell Jetty to use more memory, or to tell it how to recover from this error should it happen again?
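For reference, my understanding is that the heap size isn't a Jetty setting at all; it's a JVM flag on the java command that launches start.jar. A minimal sketch of what that launch line would look like with a larger heap (the 1g value is only an illustration, not a size I've verified against my index):

java -Xms512m -Xmx1g -Dsolr.solr.home=/opt/solr/solr -Djetty.logs=/var/log/solr -Djetty.home=/root/wmv_solr -Djava.io.tmpdir=/tmp -jar /root/wmv_solr/start.jar /root/wmv_solr/etc/jetty-logging.xml /root/wmv_solr/etc/jetty.xml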
Extra info: the thing is, if I set any large -Xms or -Xmx values, even within the free memory left on the server, I get "There is insufficient memory for the Java Runtime Environment to continue."
# Out of Memory Error (allocation.inline.hpp:58), pid=10851, tid=47145744939328
#
# JRE version: 7.0_11-b21
# Java VM: Java HotSpot(TM) 64-Bit Server VM (23.6-b04 mixed mode linux-amd64 compressed oops)
# Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
VM Arguments:
jvm_args: -Dsolr.solr.home=/opt/solr/solr -Xmx64m -Djetty.logs=/var/log/solr -Djetty.home=/root/wmv_solr -Djava.io.tmpdir=/tmp -Xmx256m
java_command: /root/wmv_solr/start.jar /root/wmv_solr/etc/jetty-logging.xml /root/wmv_solr/etc/jetty.xml
Launcher Type: SUN_STANDARD
Environment Variables:
JAVA_HOME=/usr/java/jdk1.7.0_11
CLASSPATH=.:/usr/java/jdk1.7.0_11/lib/classes.zip
PATH=/usr/kerberos/sbin:/usr/kerberos/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/usr/java/jdk1.7.0_11/bin:/opt/ant/bin:/usr/local/bin:/usr/X11R6/bin:/root/bin
SHELL=/bin/bash
/proc/meminfo:
MemTotal: 4033216 kB
MemFree: 755528 kB
Buffers: 274004 kB
Cached: 1939244 kB
SwapCached: 168388 kB
Active: 1923800 kB
Inactive: 1052876 kB
HighTotal: 0 kB
HighFree: 0 kB
LowTotal: 4033216 kB
LowFree: 755528 kB
SwapTotal: 4980024 kB
SwapFree: 4784124 kB
Dirty: 2092 kB
Writeback: 0 kB
AnonPages: 762616 kB
Mapped: 39044 kB
Slab: 256200 kB
PageTables: 17496 kB
NFS_Unstable: 0 kB
Bounce: 0 kB
CommitLimit: 6996632 kB
Committed_AS: 1964148 kB
VmallocTotal: 34359738367 kB
VmallocUsed: 265936 kB
VmallocChunk: 34359471775 kB
HugePages_Total: 0
HugePages_Free: 0
HugePages_Rsvd: 0
Hugepagesize: 2048 kB
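For what it's worth, here is a sketch of the checks that should show whether a per-process limit or the kernel's overcommit accounting, rather than physical RAM, is what's rejecting the larger heap (all standard Linux commands; nothing here is specific to Solr):

free -m                              # physical RAM and swap actually available
ulimit -v                            # per-process virtual memory limit for this shell (in kB)
ulimit -d                            # per-process data segment limit (in kB)
cat /proc/sys/vm/overcommit_memory   # 2 = strict accounting against the CommitLimit shown above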
This is still an ongoing issue, and we've been seeing a new kind of error in the logs:
Oct 02, 2013 7:14:10 AM org.apache.solr.common.SolrException log
SEVERE: java.lang.OutOfMemoryError: GC overhead limit exceeded
We're looking at upgrading our server's RAM in the hope that this will resolve the issue, but it doesn't explain why Solr isn't using the otherwise free RAM.
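As I understand it, "GC overhead limit exceeded" means the JVM is spending nearly all its time in garbage collection while reclaiming almost nothing, which points to the heap being too small for the working set. If it helps with diagnosis, here's a sketch of how GC logging and a heap dump on OutOfMemoryError could be added to the same launch line (standard HotSpot flags on Java 7; the paths and heap sizes are illustrative only):

java -Xms512m -Xmx1g \
     -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCDateStamps -Xloggc:/var/log/solr/gc.log \
     -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/log/solr \
     -Dsolr.solr.home=/opt/solr/solr -jar /root/wmv_solr/start.jar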