
bug: when uploading to the newly reset network, many chunks have been carried over from the previous network #2111

Open
happybeing opened this issue Sep 15, 2024 · 6 comments

Comments

@happybeing
Contributor

I'm uploading lots of files that I previously uploaded, and after the recent reset most/all chunks are still there from the previous network.

@maqi
Member

maqi commented Sep 16, 2024

chunks are still there from the previous network.

The nodes of the previous network that are owned by us will continue running for a couple more days to allow some statistical data to be collected, and some other node owners may have forgotten to reset and are still on the previous network.
In that case, if your client tries to connect to the previous network, there is a chance the connection will be established and some chunks will be fetched back.
That by itself shall not be considered an issue.

However, if you mean your client is connected to the new network but can fetch content that belongs to the previous network, then that could be an issue.

Could you clarify the details of this reported issue? Thanks.

@happybeing
Contributor Author

happybeing commented Sep 16, 2024

However, if you mean your client is connected to the new network but can fetch content that belongs to the previous network, then that could be an issue.

Yes, this is what I mean. I've seen this effect both with my client built for the new network, and with safe files upload using safe -V -> 0.95.0. @aatonnomicc also reported this when he was uploading BegBlag to the new network.

@maqi
Member

maqi commented Sep 16, 2024

It looks like some node owners did an upgrade instead of a reset this time.
Upgrading lets the node connect to the new network while bringing its old records along.
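To make the mechanism concrete, here is a minimal, hypothetical Rust sketch (not the actual safe_network code): chunk addresses are derived from chunk content alone, with no network identifier involved, so the same file maps to the same addresses on any network, and an upgraded node that kept its record store simply keeps answering for those addresses.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Hypothetical stand-in for the network's content-addressing function; the
// real network uses a cryptographic hash, but the point is the same: the
// address depends only on the bytes of the chunk.
fn chunk_address(content: &[u8]) -> u64 {
    let mut hasher = DefaultHasher::new();
    content.hash(&mut hasher);
    hasher.finish()
}

fn main() {
    let chunk = b"same bytes before and after the reset";
    let old_network_addr = chunk_address(chunk);
    let new_network_addr = chunk_address(chunk);
    // Identical content, identical address: a node that kept its old records
    // serves them at exactly the addresses the new network's clients ask for.
    assert_eq!(old_network_addr, new_network_addr);
    println!("chunk address: {:016x}", old_network_addr);
}
```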

@rid-dim

rid-dim commented Sep 16, 2024

Logging to directory: "/home/riddim/.local/share/safe/client/logs/log_2024-09-16_16-01-48"
safe client built with git version: 08b0a49 / stable / 08b0a49 / 2024-09-09
Instantiating a SAFE client...
Connecting to the network with 25 peers
🔗 Connected to the Network
Chunking 1 files...
"BigBuckBunny_320x180.mp4" will be made public and linkable
Splitting and uploading "BigBuckBunny_320x180.mp4" into 125 chunks
**************************************
*          Uploaded Files            *
**************************************
Uploaded "BigBuckBunny_320x180.mp4" to address b0c551356ff21b023ddb01add3fa74a8eb7db3a6a940722989e98c611507ae4c
Among 125 chunks, found 61 already existed in network, uploaded the leftover 64 chunks in 8 minutes 2 seconds
**************************************
*          Payment Details           *
**************************************
Made payment of NanoTokens(74) for 64 chunks
Made payment of NanoTokens(74) for royalties fees
New wallet balance: 0.000001060
Completed with Ok(()) of execute "Files(Upload { file_path: \"BigBuckBunny_320x180.mp4\", batch_size: 16, make_data_public: true, retry_strategy: Quick })"

Half the file already existed on the current network - hard to believe someone uploaded half of the chunks for me in advance.

Oh, stupid me ... @maqi already diagnosed it ... but somehow this still feels strange, doesn't it? Should this really be possible? That data was not paid for on this network ...
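For context, the accounting in the log above ("found 61 already existed ... Made payment ... for 64 chunks") follows from the client checking each chunk's address against the network before paying, and only paying for chunks that are not already stored. A rough sketch of that logic, using hypothetical types rather than the real safe_network API:

```rust
use std::collections::HashSet;

// Hypothetical model: the real client queries the nodes holding each address
// rather than a local set, but the payment decision is the same.
fn upload_chunks(chunk_addresses: &[u64], stored_records: &HashSet<u64>) -> (usize, usize) {
    let mut already_existed = 0;
    let mut uploaded_and_paid = 0;
    for addr in chunk_addresses {
        if stored_records.contains(addr) {
            // Chunk is already held (e.g. carried over by an upgraded node),
            // so no payment is made for it.
            already_existed += 1;
        } else {
            // Only the missing chunks are paid for and uploaded.
            uploaded_and_paid += 1;
        }
    }
    (already_existed, uploaded_and_paid)
}

fn main() {
    // Mirror the numbers from the log: 125 chunks, 61 carried over.
    let chunk_addresses: Vec<u64> = (0..125).collect();
    let carried_over: HashSet<u64> = (0..61).collect();
    let (existed, paid) = upload_chunks(&chunk_addresses, &carried_over);
    println!("Among {} chunks, {existed} already existed, paid for {paid}", chunk_addresses.len());
}
```

That existence check is what keeps re-uploads cheap, but it is also why records carried over by upgraded nodes show up as content nobody paid for on the new network.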

@rid-dim

rid-dim commented Sep 17, 2024

But @maqi, you say this like it would be expected behavior ... isn't this highly problematic?

I could self-encrypt my data and just start a specialized uploader node to upload it instead of paying for uploads… (or an uploader process that mimics a node for just enough seconds to do the uploads)
...or I could generate data, upload it to the network, shut my node down again - and repeat ...

Bringing data into the network for free opens the gates to gaming the system and to attacks, doesn't it?

@maqi
Member

maqi commented Sep 17, 2024

it would be expected behavior

I don't mean it's expected behaviour, I'm just trying to explain what happened :)

just start a specialized uploader node to upload it instead of paying for uploads

That node must be close to the addresses of the generated data.

Meanwhile, I have raised this as a concern and we are discussing the options to prevent that.
