The original file contains highly sensitive information (personally identifiable data from a longitudinal study of at-risk adolescents), so I used sed to replace the original contents with nonsense. I've reproduced the crash with that new, attached file (and with other large files, too). I was trying to replace the cells containing NA with empty cells (and vice versa).
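For reference, the NA-clearing step was roughly like the following sketch (the file name and column layout here are made up, not the real data; the real file is the attached one):

```shell
# Hypothetical two-column CSV standing in for the real, sensitive file
printf 'id,score\n1,NA\n2,7\n' > sample.csv

# Replace cells that contain only "NA" with empty cells.
# (^|,)NA(,|$) matches NA as a whole field, not NA inside a word.
sed -E 's/(^|,)NA(,|$)/\1\2/' sample.csv > cleaned.csv

cat cleaned.csv
```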
This may be nothing more than the patient trying to tell the doctor how to do his job, but it seems related to how RAM is used, and perhaps to the document history: larger files on which many complex operations are performed (e.g., pasting formulae that operate on several cells into _all_ cells in a column) will crash it. This seems to get a bit worse the longer the file has been worked on, and a bit better after LO is restarted.
Unrelatedly, I am interested in helping out with LO, but I don't have many useful skills. If you know of things an average-Joe end user can do (beta testing, language clean-up in online guides, etc.), please let me know. I've looked around the site for this but, probably through lack of sufficient effort, haven't found things I personally can do to help.