
I've deployed it a few times. In some cases it randomly spikes to 100% CPU usage until you restart it.

Is there a way to know how much RAM you are going to need for your dataset? I was using it to look up restaurant names from a database of about 100,000 entries, and I wanted to factor in misspellings and partial matches.
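
For what it's worth, a lookup that tolerates misspellings and partial matches would typically combine a fuzzy `match` with a `match_phrase_prefix` in a `bool`/`should` query. Here is a rough sketch of what that query body might look like; the index name `restaurants` and the field name `name` are hypothetical:

```python
# Sketch of an Elasticsearch query body for fuzzy + partial matching.
# The "restaurants" index and "name" field are assumed for illustration.
def build_fuzzy_query(term: str) -> dict:
    return {
        "query": {
            "bool": {
                "should": [
                    # "fuzziness": "AUTO" tolerates misspellings
                    # (edit distance scaled to term length)
                    {"match": {"name": {"query": term,
                                        "fuzziness": "AUTO"}}},
                    # match_phrase_prefix handles partial/prefix matches
                    {"match_phrase_prefix": {"name": term}},
                ]
            }
        }
    }

# Example: a misspelled search term
query = build_fuzzy_query("restorant")
```

You would then pass that dict as the body of a search request. RAM usage depends heavily on which of these features you enable and how the field is analyzed, which is part of why sizing is hard to predict up front.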



Not really; it depends on the size of the dataset and the complexity of the operations. When did you use it? It sounds like you were on a dodgy build or setup.

I think Elasticsearch's policy on sizing is to pay them or a partner to have a look and give you a guesstimate, which is pretty standard.



