
helia-examples/examples/helia-101/201-storage hangs then crashes #294

Open
dentropy opened this issue Feb 16, 2024 · 4 comments
Comments

@dentropy

(base) ➜ helia-101 git:(main) ✗ node 201-storage.js
Added file: bafkreih7eug2oqxx7ft427q4fcce6lnl73q6kmtzaxeldllx266mxr6os4
Added file contents: Hello World 201
(node:2648575) MaxListenersExceededWarning: Possible EventTarget memory leak detected. 11 abort listeners added to [AbortSignal]. Use events.setMaxListeners() to increase limit
(Use node --trace-warnings ... to show where the warning was created)
(node:2648575) MaxListenersExceededWarning: Possible EventTarget memory leak detected. 11 abort listeners added to [AbortSignal]. Use events.setMaxListeners() to increase limit
[1] 2648575 segmentation fault (core dumped) node 201-storage.js
(base) ➜ helia-101 git:(main) ✗ node -v
v21.6.1
(base) ➜ helia-101 git:(main) ✗ neofetch
[ASCII logo omitted]
dentropy@pop-os
---------------
OS: Pop!_OS 22.04 LTS x86_64
Kernel: 6.5.6-76060506-generic
Uptime: 19 days, 5 hours, 18 mins
Packages: 2875 (dpkg), 48 (nix-default), 22 (flatpak), 8 (snap)
Shell: zsh 5.8.1
Resolution: 1920x1080, 3840x2160, 1920x1080
DE: Unity
WM: Mutter
WM Theme: Adwaita
Theme: Adwaita [GTK2/3]
Icons: Adwaita [GTK2/3]
Terminal: vscode
CPU: Intel i7-9700 (8) @ 4.700GHz
GPU: AMD ATI Radeon RX 5600 OEM/5600 XT / 5700/5700 XT
Memory: 8462MiB / 15787MiB

(base) ➜ helia-101 git:(main) ✗ npm -v
10.2.4
(base) ➜ helia-101 git:(main) ✗

@dentropy (Author)

(base) ➜  helia-101 git:(main) ✗ node --trace-warnings 201-storage.js 
Added file: bafkreih7eug2oqxx7ft427q4fcce6lnl73q6kmtzaxeldllx266mxr6os4
Added file contents: Hello World 201
(node:2650145) MaxListenersExceededWarning: Possible EventTarget memory leak detected. 11 abort listeners added to [AbortSignal]. Use events.setMaxListeners() to increase limit
    at [kNewListener] (node:internal/event_target:568:17)
    at [kNewListener] (node:internal/abort_controller:241:24)
    at EventTarget.addEventListener (node:internal/event_target:681:23)
    at anySignal (file:///home/dentropy/Projects/IPFS-MFS-CAR-TEST/helia-101/node_modules/any-signal/dist/src/index.js:21:20)
    at DialQueue.createDialAbortController (file:///home/dentropy/Projects/IPFS-MFS-CAR-TEST/helia-101/node_modules/libp2p/dist/src/connection-manager/dial-queue.js:200:24)
    at queue.add.peerId.peerId [as fn] (file:///home/dentropy/Projects/IPFS-MFS-CAR-TEST/helia-101/node_modules/libp2p/dist/src/connection-manager/dial-queue.js:126:33)
    at Job.run (file:///home/dentropy/Projects/IPFS-MFS-CAR-TEST/helia-101/node_modules/@libp2p/utils/dist/src/queue/job.js:56:50)
    at Queue.tryToStartAnother (file:///home/dentropy/Projects/IPFS-MFS-CAR-TEST/helia-101/node_modules/@libp2p/utils/dist/src/queue/index.js:79:17)
    at Queue.add (file:///home/dentropy/Projects/IPFS-MFS-CAR-TEST/helia-101/node_modules/@libp2p/utils/dist/src/queue/index.js:121:14)
    at DialQueue.dial (file:///home/dentropy/Projects/IPFS-MFS-CAR-TEST/helia-101/node_modules/libp2p/dist/src/connection-manager/dial-queue.js:123:27)
(node:2650145) MaxListenersExceededWarning: Possible EventTarget memory leak detected. 11 abort listeners added to [AbortSignal]. Use events.setMaxListeners() to increase limit
    at [kNewListener] (node:internal/event_target:568:17)
    at [kNewListener] (node:internal/abort_controller:241:24)
    at EventTarget.addEventListener (node:internal/event_target:681:23)
    at anySignal (file:///home/dentropy/Projects/IPFS-MFS-CAR-TEST/helia-101/node_modules/any-signal/dist/src/index.js:21:20)
    at DialQueue.createDialAbortController (file:///home/dentropy/Projects/IPFS-MFS-CAR-TEST/helia-101/node_modules/libp2p/dist/src/connection-manager/dial-queue.js:200:24)
    at queue.add.peerId.peerId [as fn] (file:///home/dentropy/Projects/IPFS-MFS-CAR-TEST/helia-101/node_modules/libp2p/dist/src/connection-manager/dial-queue.js:126:33)
    at Job.run (file:///home/dentropy/Projects/IPFS-MFS-CAR-TEST/helia-101/node_modules/@libp2p/utils/dist/src/queue/job.js:56:50)
    at Queue.tryToStartAnother (file:///home/dentropy/Projects/IPFS-MFS-CAR-TEST/helia-101/node_modules/@libp2p/utils/dist/src/queue/index.js:79:17)
    at Queue.add (file:///home/dentropy/Projects/IPFS-MFS-CAR-TEST/helia-101/node_modules/@libp2p/utils/dist/src/queue/index.js:121:14)
    at DialQueue.dial (file:///home/dentropy/Projects/IPFS-MFS-CAR-TEST/helia-101/node_modules/libp2p/dist/src/connection-manager/dial-queue.js:123:27)
[1]    2650145 segmentation fault (core dumped)  node --trace-warnings 201-storage.js
(base) ➜  helia-101 git:(main) ✗ 

@dentropy (Author)

{
  "name": "helia-101",
  "version": "1.0.0",
  "private": true,
  "type": "module",
  "description": "Getting started with Helia",
  "license": "MIT",
  "main": "index.js",
  "scripts": {
    "start": "node 101-basics.js",
    "serve": "npm run start",
    "test": "test-node-example test/*"
  },
  "dependencies": {
    "@chainsafe/libp2p-noise": "^15.0.0",
    "@chainsafe/libp2p-yamux": "^6.0.1",
    "@helia/unixfs": "^3.0.0",
    "@libp2p/bootstrap": "^10.0.12",
    "@libp2p/identify": "^1.0.14",
    "@libp2p/tcp": "^9.0.12",
    "blockstore-core": "^4.1.0",
    "datastore-core": "^9.1.1",
    "helia": "^4.0.1",
    "libp2p": "^1.2.0"
  },
  "devDependencies": {
    "test-ipfs-example": "^1.0.0"
  }
}

@dentropy (Author)

(base) ➜  helia-101 git:(main) ✗ npm run test

> helia-101@1.0.0 test
> test-node-example test/*

add event unixfs:importer:progress:file:read { bytesRead: 15n, chunkSize: 15n, path: undefined }
add event blocks:put:providers:notify CID(bafkreife2klsil6kaxqhvmhgldpsvk5yutzm4i5bgjoq6fydefwtihnesa)
add event bitswap:network:provide CID(bafkreife2klsil6kaxqhvmhgldpsvk5yutzm4i5bgjoq6fydefwtihnesa)
add event blocks:put:blockstore:put CID(bafkreife2klsil6kaxqhvmhgldpsvk5yutzm4i5bgjoq6fydefwtihnesa)
add event unixfs:importer:progress:file:write {
  bytesWritten: 15n,
  cid: CID(bafkreife2klsil6kaxqhvmhgldpsvk5yutzm4i5bgjoq6fydefwtihnesa),
  path: undefined
}
add event unixfs:importer:progress:file:layout {
  cid: CID(bafkreife2klsil6kaxqhvmhgldpsvk5yutzm4i5bgjoq6fydefwtihnesa),
  path: undefined
}
Added file: bafkreife2klsil6kaxqhvmhgldpsvk5yutzm4i5bgjoq6fydefwtihnesa
cat event blocks:get:blockstore:get CID(bafkreife2klsil6kaxqhvmhgldpsvk5yutzm4i5bgjoq6fydefwtihnesa)
cat event unixfs:exporter:progress:raw { bytesRead: 15n, totalBytes: 15n, fileSize: 15n }
Added file contents: Hello World 101
Added file: bafkreih7eug2oqxx7ft427q4fcce6lnl73q6kmtzaxeldllx266mxr6os4
Added file contents: Hello World 201
Added file: bafkreia7g3sdmf5f3s4uihaqyzzqb7mugg32yuqnm2q5b5jwtklh36ggkm
Fetched file contents: Hello World 301
(base) ➜  helia-101 git:(main) ✗ 

@achingbrain (Contributor)

The MaxListenersExceededWarning is annoying, but it's just a standard Node warning. It will be silenced by libp2p/js-libp2p#2417

I can't replicate this locally, though I'm on macOS.

Segfaults are normally caused by native modules. There aren't many of those in this stack; node-datachannel (used by the WebRTC transport) is one. Are you able to isolate where the segfault is coming from?
