# general
s
bumping this as I really have no way forward at the moment, starting to feel like Pants may not be in the stars for me - https://pantsbuild.slack.com/archives/C046T6T9U/p1682444973353959 I tried uninstalling and reinstalling pants using homebrew, and now pants can't even load a help message. If I just run
pants
I get the same error:
Exception message: Could not initialize store for cache: "Error making env for store at \"/Users/zach/.cache/pants/lmdb_store/cache/8\": No space left on device"
I tried deleting everything in
/Users/zach/.cache/pants
, deleted everything in my docker cache (don't think this would matter), restarted my computer, and tried checking out a new copy of the repo I was working on. I also tried creating a new project with a minimal pants.toml setup; same error. Short of reformatting my computer, I'm really not sure what else to try.
c
there’s also a cache for the bootstrapped pants at
~/Library/Caches/nce/
that you may want to try to nuke first..
but no space left on device.. what does
df -h
say?
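For anyone following along, the cache-clearing steps discussed so far boil down to the sketch below. It only touches the two paths mentioned in this thread; any other Pants cache locations would be guesses.

```sh
# wipe the Pants workspace cache (this is where the lmdb_store from the error lives)
rm -rf ~/.cache/pants

# wipe the cache used by the bootstrapped pants launcher
rm -rf ~/Library/Caches/nce

# confirm the filesystem actually has free space, then retry
df -h ~
pants -V
```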
s
thanks, just tried nuking
~/Library/Caches/nce/
but no change
not super sure how to parse the output, but here's `df -h`:
Filesystem       Size   Used  Avail Capacity iused      ifree %iused  Mounted on
/dev/disk3s3s1  926Gi   12Gi  761Gi     2%  349475 4293746329    0%   /
devfs           210Ki  210Ki    0Bi   100%     726          0  100%   /dev
/dev/disk3s6    926Gi  4.0Gi  761Gi     1%       4 7981504800    0%   /System/Volumes/VM
/dev/disk3s4    926Gi  9.1Gi  761Gi     2%    1196 7981504800    0%   /System/Volumes/Preboot
/dev/disk3s2    926Gi  663Mi  761Gi     1%     275 7981504800    0%   /System/Volumes/Update
/dev/disk1s2    500Mi  6.0Mi  481Mi     2%       1    4921360    0%   /System/Volumes/xarts
/dev/disk1s1    500Mi  6.1Mi  481Mi     2%      36    4921360    0%   /System/Volumes/iSCPreboot
/dev/disk1s3    500Mi  2.5Mi  481Mi     1%      49    4921360    0%   /System/Volumes/Hardware
/dev/disk3s1    926Gi  138Gi  761Gi    16% 1883792 7981504800    0%   /System/Volumes/Data
map auto_home     0Bi    0Bi    0Bi   100%       0          0  100%   /System/Volumes/Data/home
/dev/disk2s1    5.0Gi  1.5Gi  3.4Gi    31%      60   36151200    0%   /System/Volumes/Update/SFR/mnt1
/dev/disk4s2    2.2Gi  2.2Gi   11Mi   100%    7369 4294959910    0%   /Volumes/PyCharm
/dev/disk5s1    783Mi  722Mi   61Mi    93%   17304 4294949975    0%   /Volumes/pgAdmin 4
/dev/disk3s3    926Gi   12Gi  761Gi     2%  356095 4293621667    0%   /System/Volumes/Update/mnt1
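Reading that output: on APFS all the volumes in one container share free space, which is why Avail shows the same 761Gi across them, so the disk itself isn't full. A quicker check is to point df at the path from the error message (assuming the store still lives under ~/.cache/pants):

```sh
# space and inode usage for the filesystem that holds the Pants store
df -h ~/.cache/pants
df -i ~/.cache/pants
```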
c
yea that looks ok to me.
just to get a lot more logging going, how far do you get with:
RUST_LOG=trace pants -ldebug -V
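Since the trace output gets large, a variant that captures everything to a file for attaching (the filename here is just an example):

```sh
# capture both stdout and stderr of the traced run to a file
RUST_LOG=trace pants -ldebug -V > pants-trace.log 2>&1
```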
s
quite a large output so I attached it here
c
this is really weird / not something I'm familiar with. I see you tried a new repo with a minimal pants.toml. does it also give the same error with a newer version of pants? say
2.16.0rc1
?
otherwise perhaps @witty-crayon-22786 recognizes the error message?
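For reference, trying a different release in the minimal repro is just a matter of the pants_version pin; a rough sketch, run in a scratch directory rather than the real repo:

```sh
# pin the suggested release in a scratch repo's pants.toml, then retry
cat > pants.toml <<'EOF'
[GLOBAL]
pants_version = "2.16.0rc1"
EOF
pants -V
```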
s
I just tried 2.16.0rc1 in both my normal repo and a new repo with a minimal pants.toml, and the error message is the same
c
all out of ideas 🤷
w
i’ve seen this once before when crash dumps are enabled… might you have enabled crash dumps on your machine?
do you have anything in
/cores/
?
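A quick way to check the crash-dump question from the shell; these are the stock macOS knobs (assuming the usual kern.coredump / kern.corefile sysctls), nothing Pants-specific:

```sh
# per-shell core-dump size limit (0 means cores are off for processes started here)
ulimit -c

# system-wide core dump settings
sysctl kern.coredump kern.corefile

# anything already written out?
ls -la /cores/
```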
s
I'm not familiar with crash dumps, certainly not something I enabled; this kind of just came out of the blue for me at the end of last week on this machine
the
/cores
folder is there but there isn't anything in it
w
slightly ridiculous, but: have you rebooted recently?
s
I think I rebooted last week when this first came up but let me give it another shot to make sure
w
k… if you’re still seeing it after a reboot, i think that you might want to try
Applications > Utilities > Disk Utility > First Aid
🙏 1
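For the terminal-inclined, a rough command-line counterpart to First Aid is diskutil's verify pass; targeting the Data volume here is an assumption about where ~/.cache lives, and actual repairs are still best done through the First Aid flow above:

```sh
# read-only consistency check of the volume that holds the home directory
diskutil verifyVolume /System/Volumes/Data
```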
s
I really appreciate y'all trying to help out, I'll keep you updated
👍 1
w
because while our usage of the disk is slightly unusual (we MMAP a lot of stuff), i’ve never seen it fail this way without an actual disk space issue
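For context on the mmap point: memory-mapped store files can report an apparent size that differs from the blocks actually allocated on disk, so comparing ls and du output on the store directory (path taken from the error message above) is a quick sanity check:

```sh
# apparent sizes (ls) vs. blocks actually allocated (du) for the Pants store
ls -lhR ~/.cache/pants/lmdb_store | head -40
du -sh ~/.cache/pants/lmdb_store
```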
s
wooo! a restart got it working again 🤷 I thought I had tried that, thanks for bearing with me on this one!
👍 1
h
Thanks for sticking with it! "Have you tried turning it off and on again" will remain good advice for centuries...
🤣 1