tmp cache limitation

Hi,

Is there a way to limit directory size of /pawtucket/app/tmp ?

Is there a parameter to change somewhere in a config file?

Thanks

Comments

  • You can periodically clear the directory, or you can run redis as your cache backend to avoid filling app/tmp
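
    If you go the Redis route, the switch is made in setup.php. A minimal sketch, assuming a local Redis instance; the constant names below are my reading of setup.php-dist and may differ in your version, and the host, port and database values are placeholders:

        // setup.php -- verify the constant names against your setup.php-dist
        define("__CA_CACHE_BACKEND__", "redis");   // instead of "file"
        define("__CA_REDIS_HOST__", "localhost");  // placeholder host
        define("__CA_REDIS_PORT__", 6379);         // default Redis port
        define("__CA_REDIS_DB__", 0);              // placeholder database number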

  • Hi Seth,

    We tried Redis, but with some large collections the system stops responding. For example, https://virtualcol.africamuseum.be/providence/pawtucket/index.php/Detail/collections/48 does not respond, while https://virtualcol.africamuseum.be/providence/pawtucket/index.php/Detail/collections/47 answers correctly.

    Is there a maximum number of records that CA can manage in a collection? Here we have more than 10,000 photos for Lepidoptera.

    Are there special parameters that should be applied for Redis? We installed it on CA 1.7.8.
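
    In case it matters, the only Redis-side knobs that seem relevant are the standard redis.conf memory directives; a sketch with purely illustrative values:

        # /etc/redis/redis.conf -- standard Redis directives, values are only examples
        maxmemory 2gb                  # cap Redis memory use
        maxmemory-policy allkeys-lru   # evict least-recently-used keys when the cap is hit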

    Thanks in advance for your help!

  • If one page returns and the other doesn't, it's probably not a cache issue. It's more likely the page trying to dump too much information into a single response. Changing the theme to do paging is likely the answer.

  • What do you mean by changing the theme to do paging? There is already paging for specimens: it loads only around 20 specimens, and when we scroll to the end of the page it loads the next 20, and so on.

    It also stops responding in the collection hierarchy: when we click on Insects, it only has to show the subcollections, not the specimens, but after a while it hangs or crashes (the page is https://virtualcol.africamuseum.be/providence/pawtucket/index.php/Detail/collections/3 and it crashes when you click on Insects).

  • Something is probably pulling too much data. Just a guess though. I'd have to look at the theme and data to be sure.

  • If you need something more, tell me.

    I didn't change much of the design of the collection hierarchy and detail pages.

  • What can I do to solve this problem?

    Because of these performance problems, my manager is ready to move to another system :-(

  • Is there a way to see the site? The links you posted return a 404 error. Also, would it be possible to update to a current version of CA?

  • Hi,

    I was able to avoid the page crash by upgrading the Ubuntu OS, but it's still slow.

    Would ElasticSearch be a solution to avoid this? Since a lot of the data is in the attribute tables, is it perhaps a problem when there is too much metadata?

    If it's better to go with ElasticSearch, is that still possible with an existing MySQL database?
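
    From what I can tell, ElasticSearch would only replace the search index and MySQL would stay as the primary database, so if I understand correctly the change would be something like this in app/conf/search.conf (key names as I read them from the stock file; the URL and index name are placeholders), followed by a full reindex with support/bin/caUtils rebuild-search-index:

        # app/conf/search.conf -- key names assumed from the stock file, verify locally
        search_engine_plugin = ElasticSearch
        elasticsearch_base_url = http://localhost:9200    # placeholder ES endpoint
        elasticsearch_index_name = collectiveaccess       # placeholder index name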

    Thanks

  • How does an Ubuntu upgrade avoid the crash?

    Using ElasticSearch might help, but tuning MySQL is where I'd start. The links you posted didn't work for me the other day, but I just tried them again and they all work now. The Lepidoptera page is noticeably slow, but not horribly so, to the point that I suspect tuning will resolve it.
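
    As a very rough starting point on the MySQL side, the generic InnoDB knobs below are usually where most of the gain is; none of this is CA-specific and the values are only illustrative, to be sized against the server's actual RAM:

        # /etc/mysql/my.cnf (or a conf.d/ file) -- generic InnoDB tuning sketch,
        # values are illustrative and should be sized to the machine
        [mysqld]
        innodb_buffer_pool_size = 2G    # biggest single win: keep the working set in memory
        innodb_log_file_size    = 256M  # a larger redo log smooths heavy writes
        tmp_table_size          = 64M   # allow larger in-memory temp tables
        max_heap_table_size     = 64M   # the in-memory limit is the smaller of the two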

    If you can share your theme and data with me (or let me see the actual system) I can help you with this. We are already a vendor for the Africa Museum, so hopefully this is possible. Contact me at support@collectiveaccess.org if you wish to work directly on this.

    Seth

  • Hi Jim,

    Two years ago I faced a similar issue, with the interface becoming increasingly unresponsive until it crashed. The contents of the cache directory were growing dramatically.

    I solved this by deleting the contents of Pawtucket's cache directory every hour using a crontab entry (and doing the same for Providence once a week).
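
    In crontab terms it was roughly the following; the Pawtucket path matches the one mentioned above, the Providence path is just an example, so adjust both to where your installs actually live:

        # crontab -e  (paths are examples; point them at your own app/tmp directories)
        # hourly: empty Pawtucket's file cache
        0 * * * *  find /pawtucket/app/tmp -mindepth 1 -delete
        # weekly, Sunday 03:00: empty Providence's file cache
        0 3 * * 0  find /providence/app/tmp -mindepth 1 -delete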
