Rex Research - The Civilization Kit

mg0 posted on Feb 22, 2019
mg0 on Jul 19, 2019

Hmm, have you checked your global optional file size limit?

ssdifnskdjfnsdjk on Jul 19, 2019

current state: downloaded 5.5GB out of 6.7GB (300MB more than 2 months back)

mg0 on Jul 08, 2019

@ssdifnskdjfnsdjk: Hey, I checked the global optional file size limit and it was exceeded on my raspberry pi (and my main PC wasn't storing new ZeroMe content when I found out :/ ). I increased the limit and the pi started to download all the remaining files automatically! It's not done yet but hopefully will be soon; it was stuck at 4.3GB for a long time and is now reaching 6.0GB!! Hopefully it will help with the issues others are/were having.

Also, the zite says that the author is somewhere in a desert and won't update much for a while; a bit sad for curious minds, but it will help spread the content evenly across the net for the moment. Thanks for all the help so far!

mg0 on May 27, 2019

ssdifnskdjfnsdjk: On one computer (which should be running non-stop, always over Tor) I have downloaded 5.2 out of 6.7 GB. On another (running sometimes, with Tor available) 6.3 out of 6.7 GB. Both computers have downloading of all content files enabled, up to 2000MB and 1900MB of file size. This seems to be a longer-term (weeks?) state with no new data being downloaded; unsure why it hasn't downloaded all the data.

Hey there, just got back from a hiatus. I noticed that on my rpi too; it had 4.3 GB so far and was stuck there. I tried everything to trigger a download from its side, but it seemed like it couldn't reach my notebook (the main peer) for some reason. I resolved it by just batch-copying the whole zite to the pi again, but that won't work for the rest of the peers :/

I think it has to do with some peers having something updated but not everything and others not reaching the peers with updated content. The clearnet site probably got updated in my absence, so I'll update the zite and publish.

ssdifnskdjfnsdjk on May 27, 2019

On one computer (which should be running non-stop, always over Tor) I have downloaded 5.2 out of 6.7 GB. On another (running sometimes, with Tor available) 6.3 out of 6.7 GB. Both computers have downloading of all content files enabled, up to 2000MB and 1900MB of file size. This seems to be a longer-term (weeks?) state with no new data being downloaded; unsure why it hasn't downloaded all the data.

mg0 on May 13, 2019

@quantumworld: I updated the zite just now; my rpi wasn't downloading the optional files outright, but that was because I was trying out Tor Always Mode. I wonder if your issues with the zite are still ongoing and whether they are related to Tor Always Mode.

mg0 on May 06, 2019

quantumworld: I just deleted the site and tried to re-download it and I am getting JSON errors.

Try restarting ZeroNet; maybe then you can download the zite.

quantumworld on May 06, 2019

I just deleted the site and tried to re-download it and I am getting JSON errors.

quantumworld on May 06, 2019

mg0: It is up to date. No change. Odd...

mg0 on May 06, 2019

quantumworld: Strange, your link works fine. This one does not work on my end: http://127.0.0.1:43110/1D6QfsBSZHmYWbuCxKG5ya31BiTczoZiES/cannabis/cannabis.html

Did you check if your main page is up-to-date with ours?

quantumworld on May 06, 2019

ssdifnskdjfnsdjk: It works for me and also for my proxy, which is located in a different country.

Strange, your link works fine. This one does not work on my end: http://127.0.0.1:43110/1D6QfsBSZHmYWbuCxKG5ya31BiTczoZiES/cannabis/cannabis.html

ssdifnskdjfnsdjk on May 06, 2019

quantumworld: mg0: This link is dead from the main page link:
http://127.0.0.1:43110/1D6QfsBSZHmYWbuCxKG5ya31BiTczoZiES/cannabis/cannabis.html

It works for me and also for my proxy, which is located in a different country.

mg0 on May 06, 2019

quantumworld: mg0: Hopefully it can be fixed. That site needs to be put in an ISO image file. It would save space, though it's a bad idea for slow computers or low-bandwidth connections. For local storage it may be a good idea for archiving.

The clearnet site owner has a section where one can buy a DVD version of the whole site and probably some more stuff. I don't have money nor any means of internet banking, so I just mirrored the whole site; I'd like to leave a donation, but welp :/

quantumworld on May 06, 2019

mg0: Hopefully it can be fixed. That site needs to be put in an ISO image file. It would save space, though it's a bad idea for slow computers or low-bandwidth connections. For local storage it may be a good idea for archiving.

mg0 on May 06, 2019

quantumworld: mg0: It is the same update. This link is dead from the main page link:
http://127.0.0.1:43110/1D6QfsBSZHmYWbuCxKG5ya31BiTczoZiES/cannabis/cannabis.html

Works perfectly on my end (as expected). Just signed and published cannabis/content.json; hopefully it syncs quickly. If you don't mind, check that your cannabis/content.json has the value 1557110212 in the modified field.
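
(A quick way to do that check, sketched in Python; the data/<site address>/ path below assumes ZeroNet's default directory layout:)

    # Print the "modified" timestamp of the mirrored cannabis/content.json;
    # data/<address>/ is ZeroNet's default layout, adjust if yours differs.
    import json

    path = "data/1D6QfsBSZHmYWbuCxKG5ya31BiTczoZiES/cannabis/content.json"
    with open(path) as fh:
        print(json.load(fh)["modified"])  # should print 1557110212 per the post above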

quantumworld on May 06, 2019

mg0: It is the same update. This link is dead from the main page link:
http://127.0.0.1:43110/1D6QfsBSZHmYWbuCxKG5ya31BiTczoZiES/cannabis/cannabis.html

mg0 on May 06, 2019

quantumworld: mg0: It should be the same problem on your end. Just go on ZeroNet and open your seeded site from there. From the main page it will drop out about 2 or 3 levels down. It could be that the original site has bad links, or that your mirror does. I have zites that are far more complex and larger than yours that all sync up just fine.

My main client has the mirror up to date; in the mirroring process I link all missing files to a tiny plain text file so ZeroNet doesn't stall searching for files listed in content.json, and my rpi was synced directly instead of through ZeroNet for speed. I probably won't see any of the issues you are experiencing unless some links are actually borked by the original site and/or the mirroring process. This is why I wanted to know which links didn't work for you, so I can publish those and hope that fixes it. By the way, is the main page up to date on your end? The last update left the main zite page like this; check that yours matches.
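
(A minimal sketch of that placeholder trick, reconstructed from the description above; the mirror directory name is hypothetical and the link parsing is deliberately naive:)

    # Scan every mirrored HTML file for local links and create a tiny stub
    # for any target that wasn't downloaded, so ZeroNet never stalls on a
    # file that the pages reference but the mirror lacks.
    import os
    import re
    from urllib.parse import unquote, urlparse

    SITE_DIR = "rexresearch-mirror"  # hypothetical mirror directory
    HREF_RE = re.compile(r'(?:href|src)="([^"#?]+)', re.IGNORECASE)

    for root, _dirs, files in os.walk(SITE_DIR):
        for name in files:
            if not name.endswith((".htm", ".html")):
                continue
            with open(os.path.join(root, name), encoding="utf-8", errors="ignore") as fh:
                html = fh.read()
            for link in HREF_RE.findall(html):
                if urlparse(link).scheme or link.startswith("//"):
                    continue  # skip http://, mailto:, protocol-relative, etc.
                target = os.path.normpath(os.path.join(root, unquote(link)))
                if target.startswith(SITE_DIR) and not os.path.exists(target):
                    os.makedirs(os.path.dirname(target), exist_ok=True)
                    with open(target, "w") as stub:
                        stub.write("placeholder: not mirrored\n")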

quantumworld on May 06, 2019

mg0: It should be the same problem on your end. Just go on ZeroNet and open your seeded site from there. From the main page it will drop out about 2 or 3 levels down. It could be that the original site has bad links, or that your mirror does. I have zites that are far more complex and larger than yours that all sync up just fine.

mg0 on May 06, 2019

quantumworld: I have noticed that there are a lot of links that are not being seeded. Maybe you are offline.

Can you put together a list of two or three? I'd try publishing those content.json files to check whether they sync properly on your end. Otherwise I'll have to publish all of the content files (more than 1000), and each one takes a while.

Keep Zeroing~

quantumworld on May 04, 2019

I have noticed that there are a lot of links that are not being seeded. Maybe you are offline.

mg0 on May 04, 2019

ssdifnskdjfnsdjk: Yes, I can see in your site menu: Total: 6698.60MB. I have the "Help seed all" option ticked and so far about 4.7GB downloaded (no big change for maybe 1-3 days). ZeroNet did not seem to queue any more file parts (per the ZeroHello site list), so I just triggered the "Check files" and "Update" buttons ... and after a few minutes ZeroHello started showing queued file parts for download (around 800) and seems to be actively downloading, so it appears quite good.

Thanks, it's a lot of content and probably only my two systems have it all; the pi is on 24/7, so it should be handling most of the uploading. Hang in there, pi~

ssdifnskdjfnsdjk on May 04, 2019

mg0: showing that many files left to sync?

Yes, I can see in your site menu: Total: 6698.60MB. I have the "Help seed all" option ticked and so far about 4.7GB downloaded (no big change for maybe 1-3 days). ZeroNet did not seem to queue any more file parts (per the ZeroHello site list), so I just triggered the "Check files" and "Update" buttons ... and after a few minutes ZeroHello started showing queued file parts for download (around 800) and seems to be actively downloading, so it appears quite good.

mg0 on May 04, 2019

ssdifnskdjfnsdjk: 17099 file parts remaining to download for Rex Research (per ZeroHello site list)

Can you tell me if your end is still showing that many files left to sync? I'd like to know if it got synced. Also, the total size of the zite as of now is ~6800MB.

mg0 on Apr 29, 2019

ssdifnskdjfnsdjk: 17099 file parts remaining to download for Rex Research (per ZeroHello site list)

I don't know what to do with the zite. Yesterday I found that the many-content.json approach won't update optional files; my raspi wasn't updating a pdf that was missing before. I may have to go back to one huge content.json file, but I can't publish that properly. Any suggestions, or should the zite just be archived in 8GB zip files?

ssdifnskdjfnsdjk on Apr 28, 2019

17099 file parts remaining to download for Rex Research (per ZeroHello site list)

mg0 on Apr 14, 2019

ssdifnskdjfnsdjk: The way ZeroNet downloads files makes me unsure if or when the download of the zite's files will be complete. Sometimes ZeroHello shows many files downloading, but the count is stuck and not decreasing; then I see the process is paused (does not appear to be downloading). Then I press the "Check files" button and I see it is active again. Because of this I cannot say whether the site download works OK, but I think it is a ZeroNet problem. So I will wait and hope the site downloads completely. I will try to help distribute all its files.

Thanks for the support. So far I'm sure only my instance has all the files (the whole zite is almost 7GB as of now), which means you and others only get to download while I'm online, and my upload speed isn't superb. Also, the way it is now seems to leave the user confused about whether it's downloading or not; it goes off fetching files on its own (it was doing that the last time I checked on my rpi) and I think it was showing a count of files remaining. For now I'm happy the zite is finally publishing. I'll try to sync my rpi fully so it keeps seeding for others; hopefully I won't have to write another 7GB onto it, since I can reuse all the zite's files from before the fix (the file hashes are the same, even for the html files, since the fix didn't change the directory layout, yay!).

BTW, if anyone is hosting a clearnet proxy and would like to host my zite as well: the required files are the html ones, and those account for about 200MB, so that's the limit you'd want to set for the proxy.

Keep zeroing~

ssdifnskdjfnsdjk on Apr 14, 2019

The way ZeroNet downloads files makes me unsure if or when the download of the zite's files will be complete. Sometimes ZeroHello shows many files downloading, but the count is stuck and not decreasing; then I see the process is paused (does not appear to be downloading). Then I press the "Check files" button and I see it is active again.
Because of this I cannot say whether the site download works OK, but I think it is a ZeroNet problem.
So I will wait and hope the site downloads completely. I will try to help distribute all its files.

mg0 on Apr 13, 2019

The site is ready now for primetime! The issue that was stopping the zite from publishing is hopefully solved by now. Thanks for the help, y'all.

Keep Zeroing~

mg0 on Apr 08, 2019

ssdifnskdjfnsdjk: @mg0: robione in your linked topic said "all broken links", so I tried the last 3 links and all work. But when I tried to load a recent link via proxies:
https://z.hex3.cf/1D6QfsBSZHmYWbuCxKG5ya31BiTczoZiES/zurovcik/zurovcik.htm
https://43110.cf/1D6QfsBSZHmYWbuCxKG5ya31BiTczoZiES/zurovcik/zurovcik.htm
https://zn.amorgan.xyz/1D6QfsBSZHmYWbuCxKG5ya31BiTczoZiES/zurovcik/zurovcik.htm
then I see "Not Found". [...]

Yeah, the issue is that publishing almost always fails on my end and I don't know why. I removed a lot of files from the zite that were fetched as CGI-generated files from the original clearnet site, but the site is still huge, and I think the problem is the default size limit of 10MB. My first test of the zite only had a couple of files and published properly, but I've only seen the recent updates on my raspberry pi (it syncs properly through ZeroNet), and sometimes I hear your end does as well. And the zite is huge (just the html files go beyond 300MB; the whole zite is about 7GB so far).

I'll post more later, pretty busy now. Keep zeroing~

ssdifnskdjfnsdjk on Apr 08, 2019

@mg0: robione in your linked topic said "all broken links", so I tried the last 3 links and all work. But when I tried to load recent links via proxies:
https://z.hex3.cf/1D6QfsBSZHmYWbuCxKG5ya31BiTczoZiES/zurovcik/zurovcik.htm
https://43110.cf/1D6QfsBSZHmYWbuCxKG5ya31BiTczoZiES/zurovcik/zurovcik.htm
https://zn.amorgan.xyz/1D6QfsBSZHmYWbuCxKG5ya31BiTczoZiES/zurovcik/zurovcik.htm
then I see "Not Found".

Then I tried to load the site's main page on the proxies (example: https://z.hex3.cf/1D6QfsBSZHmYWbuCxKG5ya31BiTczoZiES)

It loads OK, but when I load the ZeroHello page on the proxies and search for "1D6QfsBSZHmYWbuCxKG5ya31BiTczoZiES", I see the last update of the site is Feb 24, which is the problem, as I guess @mg0 updated it more recently (my update time is Mar 19). I wonder if mg0's signing and publishing went OK and whether his signed content.json really contains all the subpages that @robione and the proxies are missing.. @mg0, you may also try describing the issue on ZeroTalk to get better exposure (update: I created it here). If your site is not big, consider packing it up so some interested person can take a look at the files.

mg0 on Apr 08, 2019

@ssdifnskdjfnsdjk: I added the entries to a post in another 0list thread, hope it works as desired.

mg0 on Mar 19, 2019

ssdifnskdjfnsdjk: if you will also be linking to particular articles, the "markdown" code is, for example: [...]

Looks neat, I'll follow that style instead, but not right now; I'm a bit tired.

ssdifnskdjfnsdjk on Mar 19, 2019

if you will also be linking to particular articles, the "markdown" code is, for example:

[James HOLM / Swaminathan RAMESH - Catalytic Thermal Depolymerisation - ( Plastic to Diesel )](/1D6QfsBSZHmYWbuCxKG5ya31BiTczoZiES/holmplasticdiesel/holmplasticdiesel.html)

and on the 0list/zerotalk it will look like:

James HOLM / Swaminathan RAMESH - Catalytic Thermal Depolymerisation - ( Plastic to Diesel )

mg0 on Mar 19, 2019

ssdifnskdjfnsdjk: Linking from ZeroTalk to a sitemap/indexing page located on another zite may help if someone develops a fulltext search engine (not yet developed), but the point is that the article names (maybe even with some short description containing keywords) are located inside the ZeroTalk database (so they are searchable - they can be found when searching ZeroNet-connected sites, which is done from within the ZeroHello page). [...]

I think I'll make another 0list topic just for listing the pages of the site, so I don't get this one messed up. As for the entries, the index already makes it a bit easier; I just have to make a tiny script to grab the URL of each page, and I'll fill the new topic with those. I'll use this as the format for entries:

[Last Name],[First Name]: [Name of invention] -- [Brief description of invention]

Fields with * may or may not appear depending on the difficulty of parsing their pages for info.
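
(A sketch of what that tiny script could look like, assuming the mirror's index page sits at a local index.htm; the path and the naive regex parsing are illustrative only:)

    # Pull every article link out of the mirror's index page and emit
    # 0list/ZeroTalk-style markdown entries; titles would still need to be
    # massaged into the [Last Name],[First Name] format by hand.
    import re

    SITE = "1D6QfsBSZHmYWbuCxKG5ya31BiTczoZiES"
    INDEX = "rexresearch-mirror/index.htm"  # hypothetical local path

    link_re = re.compile(r'<a[^>]+href="([^"#]+\.html?)"[^>]*>(.*?)</a>',
                         re.IGNORECASE | re.DOTALL)

    with open(INDEX, encoding="utf-8", errors="ignore") as fh:
        html = fh.read()

    for href, label in link_re.findall(html):
        title = re.sub(r"<[^>]+>|\s+", " ", label).strip()  # strip tags, squash spaces
        if title:
            print(f"[{title}](/{SITE}/{href.lstrip('/')})")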

Keep Zeroing~

ssdifnskdjfnsdjk on Mar 19, 2019

Linking from ZeroTalk to a sitemap/indexing page located on another zite may help if someone develops a fulltext search engine (not yet developed), but the point is that the article names (maybe even with some short description containing keywords) are located inside the ZeroTalk database (so they are searchable - they can be found when searching ZeroNet-connected sites, which is done from within the ZeroHello page).

But you are right regarding the ZeroTalk limit. You can also use this site (0List) / this topic to post that "sitemap", as:

  1. this site has a much bigger limit (at least, that's what I see under the comment field);
  2. this site is among the few sites that new ZeroNet members have listed on their ZeroHello, so it is likely they will open it and it will be included in their ZeroNet search.

mg0 on Mar 19, 2019

ssdifnskdjfnsdjk: @mg0: once you are sure the site structure (links) will remain as is, you may try to create a kind of sitemap with the names of your site's articles and links to them, and add it as a ZeroTalk topic or such, so ZeroNet's search feature (located on ZeroHello) can be used to find those topics. [...] It is like with Google: one has to link to his site from other (well-indexed) sites in order for the search engine to find it.

I get what you mean, but I'm a bit wary of a couple of things. I think I heard there's a limit to how much stuff you can put on ZeroTalk before you have to ask for your limit to be raised; I understand I can always ask for more, but this approach feels a bit spammy in my head. Now, the site has its own index, so I could just link to it directly in a new ZeroTalk topic, or parse that page and generate all the entries I need for the approach you mention. I just think it's a bit spammy, that's all. Please tell me otherwise, as I'd like more attention on the site and hopefully some help making it more 'sane' as a zite.

Keep Zeroing~

ssdifnskdjfnsdjk on Mar 19, 2019

@mg0: once you are sure the site structure (links) will remain as is, you may try to create a kind of sitemap with the names of your site's articles and links to them, and add it as a ZeroTalk topic or such, so ZeroNet's search feature (located on ZeroHello) can be used to find those topics, and people this way know they can find these articles on your site:
[Article name or its description](linkhere)
[Article name or its description](linkhere)
It is like with Google: one has to link to his site from other (well-indexed) sites in order for the search engine to find it.

mg0 on Mar 18, 2019

ssdifnskdjfnsdjk: Kaffie is alive, though unsure how often she is checking her mailbox. I see that your site started loading its subpages. On this subpage, when you hover the mouse over the article title, you see the link is wrong.

Cryptic! Yeah, you are right. Pretty sure it has to do with some of my scripting. Gonna fix it outright; thanks for the feedback, and glad you are getting the updates as well.

[UPDATE]: Just fixed it, but I can't keep seeding right now. I'll see if I can for a brief moment in an hour; otherwise you'll have to wait a couple of hours (not that this completely breaks the zite, though). Thanks for pointing it out again. Also, I just added the zite to the ZeroSites list, if you wanted to know. And yeah, it seems kaffie doesn't check mail that frequently.

Keep Zeroing~

ssdifnskdjfnsdjk on Mar 18, 2019

Kaffie is alive, though unsure how often she is checking her mailbox.
I see that your site started loading its subpages.
On this subpage, when you hover the mouse over the article title, you see the link is wrong.

mg0 on Mar 18, 2019

Update: OK, keeping the site up to date should be a breeze for me now; the mirror may always lag the live site by something like 1 to 3 days.

Now, I've been waiting for a reply from kaffie about the entry in her search engine; still nothing. Does any of you know if kaffie's alive? Will check out ZeroSites next.

Keep Zeroing~

quantumworld on Mar 09, 2019

I am 100% for free speech, and zeronet.io is the best place for it. I love the news I get from China and other countries that you would never get anywhere else. ZeroNet will pop one day, and it will be bigger than Facebook when it takes off.

mg0 on Mar 09, 2019

quantumworld: This is how I look at it. I don't care what you are into. It's free speech. End of argument. [...]

Wow, I saw your comment on my feed and got the wrong idea at first, but now I get it. So yeah, people are used to being babysat; others are just afraid of what could come their way just by being here. That's an issue with modern social individuals, but solving that alone won't make people come to ZeroNet. Say, if the clearnet went down, many would look for other ways of being connected (or free). Making a decentralized net makes sense on its own when there's so much censorship, but most people won't start doing stuff here as long as YouTube works (not implying we should take it down, but if that casually happened it would benefit ZeroNet).

Just keep spreading the word of the good, decentralized network. Users will come at their own pace, but 0net has to be ready for them when the time comes.

quantumworld on Mar 09, 2019

mg0: There's a whole lot of people out in the wild who may have what it takes to use and improve ZeroNet, but as a darknet we have the short end of the stick: [...]

This is how I look at it. I don't care what you are into. It's free speech. End of argument. I believe in the U.S. constitution. Share what you want to share with others. I may not like it, but it's free speech. ZeroNet is about what the individual user wants to share with others. That is what makes this place special compared to any other platform on the web that I have seen. People need to grow up and stop relying on someone else to censor what content they should or should not view. People have the block and ban options. Use them as you see fit. And I have no issues with anything being published. Call it a modern-day digital free press. If people don't value what the structure of ZeroNet actually is for humanity, they just don't get its design. (I do have issues with people who never post anything!) (You SILENT BASTARDS!!!!) (Note: Some sarcasm was used in this comment.) (Reader discretion is advised.)

mg0 on Mar 09, 2019

quantumworld: I am impressed with ZeroNet's responsiveness to updates. I am not impressed with the number of users on ZeroNet. It's just not enough, and something needs to be done about this. And the current user base seems uninterested in marketing ZeroNet on the open web. That is the only downside to ZeroNet.

There's a whole lot of people out in the wild who may have what it takes to use and improve ZeroNet, but as a darknet we have the short end of the stick:

We are not indexed by Google or other clearnet search engines;
we support decentralization, so the really big corporations won't give us much good attention (knowing that you can run into CP at any time here is a really big concern; blocklists are a thing, but they alone won't keep you clean);
there's still nothing really amazing, vastly useful, and exclusive to ZeroNet to help speed up adoption.

I've been loving eh anime recently; it makes the experience of watching anime a breeze when you previously only knew to hunt for it on the clearnet and store it yourself in your Downloads folder. There's KopyKate BIG too; it hasn't been very active recently, but it's a nice idea, yet it requires LOTS of storage, which is sad. I dream of vectorized video formats ...

On the other side, ZeroMe is a godsend to some oppressed people; but again, they are oppressed, so there's a chance they get caught and punished.

tl;dr: using ZeroNet comes with many risks and not enough valuable resources, depending on where you are from. This may be the issue to work around first, though I don't know for sure. Humans are complicated beings.

Keep Zeroing~

btw, this should be discussed in a more fitting thread; not discouraging you from doing it here, but you/we may get better feedback somewhere else.

quantumworld on Mar 09, 2019

I am impressed with ZeroNet's responsiveness to updates. I am not impressed with the number of users on ZeroNet. It's just not enough, and something needs to be done about this. And the current user base seems uninterested in marketing ZeroNet on the open web. That is the only downside to ZeroNet.

mg0 on Mar 09, 2019

quantumworld: The update worked fine. I got it when I turned on ZeroNet. I have been away from ZeroNet doing system upgrades.

Glad to hear that, I'll go ahead and add the site to some search engines.

quantumworld on Mar 09, 2019

The update worked fine. I got it when I turned on ZeroNet. I have been away from ZeroNet doing system upgrades.

mg0 on Mar 09, 2019

Ok, made another snapshot of the site. I found out that much of the site is still not being downloaded when I tried listing some directories, but I won't worry much about that now. Still, those non-mirrored files may prove useful in the future, so I'll try tweaking the mirroring script to fetch them one day.

Just signed and published it; hopefully everyone will be able to get the updated site as well.

quantumworld, ssdifnskdjfnsdjk: I'd like to know if you get the site updated successfully, so I can remove the beta tag, add the site to some search zites, and maybe even get a .bit domain (though I first have to understand how that works).

Keep Zeroing~

mg0 on Mar 08, 2019

The mirroring script is working; surprisingly, wget was badly converting HTML jump marks. That's pretty much fixed now. Just updated the zite on my end, but I have to go home now, so I will seed by tomorrow (say, after 16 hours or so, maybe). The zite's reaching the stable tag.

Keep Zeroing~

PS: I'll want to know how well it distributes and updates to other peers when I publish tomorrow.

mg0 on Mar 07, 2019

quantumworld: There has to be a way you can directly mirror that site with a script file. Wget used to be really good.

I'm getting to it; I'm about to finish the script for easily mirroring the site. As of last night I had it advanced enough to bring a vanilla mirror to the state the actual zite is in right now; I'm just missing how to sanitize the html jump marks, afaik.

Will be checking it out on zeronet and then publishing it. Keep zeroing~

quantumworld on Mar 07, 2019

There has to be a way you can directly mirror that site with a script file. Wget used to be really good.

mg0 on Mar 05, 2019

ssdifnskdjfnsdjk: I tried that, then Update and Check files; that did not fix/update it. So I deleted the site and visited it again, immediately setting the size to 400MB. No luck - subpages are "Not Found" and the last update time on ZeroHello appears as Feb 24. You may try removing some .json files if they block the inclusion of some subpages, or checking whether the content.json file includes all subpages.. you said that the site's signing and publishing to peers worked.

Publishing may not have been working recently. I prefer using the CLI, but I can't really tell whether it works or not, and the GUI stalls heavily while signing happens. I'll try sign & publish once again; hopefully it works this time. Thanks for going through all the trouble for this site, I appreciate the help and feedback.
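
(For reference, the CLI route is ZeroNet's siteSign/sitePublish actions; a sketch, wrapped in Python only to keep it scriptable, and assuming zeronet.py is run from its own directory:)

    # Sign the site's root content.json, then push it out to peers.
    # siteSign prompts for the site's private key if it isn't supplied.
    import subprocess

    SITE = "1D6QfsBSZHmYWbuCxKG5ya31BiTczoZiES"  # this zite's address

    subprocess.run(["python", "zeronet.py", "siteSign", SITE], check=True)
    subprocess.run(["python", "zeronet.py", "sitePublish", SITE], check=True)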

Keep 0'ing~

ssdifnskdjfnsdjk on Mar 05, 2019

mg0: can you raise the size limit to 400MB and try updating the site

I tried that, then Update and Check files; that did not fix/update it. So I deleted the site and visited it again, immediately setting the size to 400MB. No luck - subpages are "Not Found" and the last update time on ZeroHello appears as Feb 24. You may try removing some .json files if they block the inclusion of some subpages, or checking whether the content.json file includes all subpages.. you said that the site's signing and publishing to peers worked.

mg0 on Mar 05, 2019

I think I get why 0net.io won't update: I think I read that a public proxy will only share zites that fit below the default 10MB size limit. The version it has is probably from when I was testing whether I had to publish the site as empty first and then raise the size limit to make it work, since it wasn't getting signed properly because of the big size. If this is correct, then I'll just ignore that proxy, as there's nothing I can do about it.

On the other hand, ssdifnskdjfnsdjk, I'm concerned about you not being able to get the site updated. Can you raise the size limit to 400MB and try updating the site? Since you and 0net.io share the same version of the content.json file, maybe you could fix it by raising the limit.

Keep zeroing~

quantumworld on Mar 02, 2019

mg0: If you made a mirror of rexresearch.com, then I'd like to know the file count and the total size of the site, for comparison purposes. Also, as of now I'm seeing '42' peers for the site. Great! [...]

Updates of your site are automatic when you change it. All the modified files will be checked against your user login key and updated. You should not need to update it manually.

ssdifnskdjfnsdjk on Mar 02, 2019

I just did an Update and Recheck of the site, yet it shows the last update as Feb 24. Same on the https://0net.io pr0xy.

mg0 on Mar 02, 2019

quantumworld: You currently have 22 peers listed on my end. I am one of them. I have some downtime from ZeroNet within the next 40 hours. You must have access to the server side of that site. I ran across a full download-and-mirror program that worked perfectly. I do not remember its name; it was a GUI program with a really odd name.

If you made a mirror of rexresearch.com, then I'd like to know the file count and the total size of the site, for comparison purposes. Also, as of now I'm seeing '42' peers for the site. Great!

I just want all peers to be up-to-date.

quantumworld on Mar 02, 2019

mg0:

You currently have 22 peers listed on my end. I am one of them. I have some downtime from ZeroNet within the next 40 hours. You must have access to the server side of that site. I ran across a full download-and-mirror program that worked perfectly. I do not remember its name; it was a GUI program with a really odd name.

mg0 on Mar 02, 2019

quantumworld: You can use wget. That will auto-convert files, and it will mirror all of them. It works fairly well. There may be a better one out there. [...]

I do use it, but if I use --convert-links it won't convert all the links for some reason; also, I then won't have a verbatim copy of the site, so I won't be able to do an incremental download with it. I use btrfs as my main FS; I can make a copy of the whole site directory in seconds with no extra size cost, so I work with that.

I'm after the script approach because wget can't possibly handle all the weird fixing this site needs after each mirroring, but wget is important, indeed.

I won't be online 'till tomorrow; hopefully there's at least one properly updated peer. If things start working for ssdifnskdjfnsdjk and others, then I lost a lot of time by using cjdns... it seems I won't adopt it just yet, at least not until I can leave a device running cjdns and zeronet (and the pi lacks too much for that).

Keep zeroing~

quantumworld on Mar 02, 2019

mg0: I just have to apply all the dark magic I did to get it ready for zeronet. [...]

You can use wget. That will auto-convert files, and it will mirror all of them. It works fairly well. There may be a better one out there.

https://www.computerhope.com/unix/wget.htm

mg0 on Mar 02, 2019

quantumworld: Are you mirroring that site all the time, or did you do a snapshot and then re-code it? I just noticed that the site you are mirroring is different from the one on ZeroNet.

I'm still working on the snapshot I took around Feb 19th, trying to make it as usable as possible; then I'll write a script to ease subsequent snapshots. In case you wonder, I don't have to re-download everything every time: I keep a 'virgin' version of the snapshot, make CoW reflinks (BTRFS is awesome) of that snapshot, then mirror only the changed files based on it. I just have to apply all the dark magic I did to get it ready for zeronet.
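
(A minimal sketch of that snapshot-and-refresh workflow, with hypothetical directory names; cp --reflink needs a CoW filesystem like btrfs, and the wget flags shown are the standard mirroring ones, not necessarily the exact set used here:)

    # Clone the pristine snapshot via CoW reflinks (instant, no extra space),
    # then let wget re-fetch only files that changed on the clearnet site.
    # Assumes WORK does not exist yet from a previous run.
    import subprocess

    VIRGIN = "snapshots/rexresearch-virgin"  # untouched wget mirror
    WORK = "snapshots/rexresearch-work"      # copy that gets the "dark magic"

    subprocess.run(["cp", "-a", "--reflink=always", VIRGIN, WORK], check=True)

    # --mirror implies timestamping (-N), so unchanged files are skipped;
    # --convert-links is deliberately left out to keep the copy verbatim.
    subprocess.run(
        ["wget", "--mirror", "--page-requisites", "--adjust-extension",
         "--no-parent", "--directory-prefix", WORK,
         "http://www.rexresearch.com/"],
        check=True,
    )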

Keep Zeroing~

quantumworld on Mar 02, 2019

Are you mirroring that site all the time, or did you do a snapshot and then re-code it? I just noticed that the site you are mirroring is different from the one on ZeroNet.

mg0 on Mar 01, 2019

Ok, found out something. It seems zeronet over cjdns is still not widespread enough: I just disabled it and received a zerotalk post that had been posted 10 hours ago, with comments on it; I had been connected the whole day, so I was never going to get said post otherwise. Maybe the site has not been updating for some people because of this; there's the raspberry pi, but I'll assume it just lacks the horsepower to sign and publish such a big site on its own.

I'm re-signing/re-publishing right now, hoping to see the "Content published successfully to 5/5 peers" message this time.

mg0 on Mar 01, 2019

ssdifnskdjfnsdjk: Now that I think of it, my notebook's zeronet is configured to use cjdns; I think that is relevant, but the rpi is not using cjdns, so I don't know.

[UPDATE]: hmm, it says content publish failed. I'll try with the pi now.

mg0 on Feb 28, 2019

ssdifnskdjfnsdjk: it downloaded a content.json in which is: "modified": 1551030356. Did you check zeronet's debug.log (running zeronet with the "Everything" level of logging to file, set in /Config)? Did you really sign and publish the file?

Yeah, mine is set to log everything, and I was watching it while it was signing and publishing. Maybe there's some disconnect between us? Like my peers don't get to update yours, or, I don't know, man. Does anyone else know what could be happening here?

ssdifnskdjfnsdjk: I'll sign and publish one more time; the site ain't light, but it should be done in 5 minutes or less. Will take a pic and comment when I'm done.

ssdifnskdjfnsdjk on Feb 28, 2019

value of the "modified" field

it downloaded a content.json in which is: "modified": 1551030356.
Did you check zeronet's debug.log (running zeronet with the "Everything" level of logging to file, set in /Config)? Did you really sign and publish the file?

mg0 on Feb 28, 2019

ssdifnskdjfnsdjk: I tried deleting the site now, and after about 2 minutes I visited the mentioned subpage and I see the same: "Not Found". Maybe it will get fixed later. Though make sure you have that file, and that you published the right content.json file, via your site's (0) menu or the command line.

I have it, and I have been visiting that page since you mentioned it. Btw, if you can access the site's content.json file on your side, would you mind telling me the value of the "modified" field? The one I have is '1551380316', and the 0net.io one says '1551030356' instead.

ssdifnskdjfnsdjk on Feb 28, 2019

I tried deleting the site now, and after about 2 minutes I visited the mentioned subpage and I see the same: "Not Found". Maybe it will get fixed later. Though make sure you have that file, and that you published the right content.json file, via your site's (0) menu or the command line.

mg0 on Feb 28, 2019

ssdifnskdjfnsdjk: 0net.io seems unable to visit the mentioned page too

I was offline yesterday; now I checked the 0net.io content.json and it's outdated, like it's not getting any updates. I synced with the rpi again and won't change anything for a while. You should try deleting the site first if you still haven't done so. It seems that maybe some peers are not updating well.

On other notes, I sort of figured out why the jump marks weren't working and may fix them in the future. I'm eager to make the mirror fully functional so I can automate the mirroring/patching/witchcrafting stuff into a nice script.

Keep Zeroing~

PS: Will add the site to kaffiene search in a day or two (and with it, move on from the beta stage).

ssdifnskdjfnsdjk on Feb 26, 2019

0net.io seems unable to visit the mentioned page too

mg0 on Feb 26, 2019

ssdifnskdjfnsdjk: I am sorry, please forget it; I think this limit applies only to dynamic sites, in /data/users/content.json [...]

Man, I don't know what's happening then. Is it that you are getting it from a peer that is not up to date? Both the rpi and my notebook clients have it signed and updated, and I checked that my content.json contains said files. If you haven't deleted and redownloaded the site, that may help, but if you can wait a bit more, I'm about to sign and publish with the /alchemyarchives subdirectory included.

If you think it could be fixed by getting an updated content.json file from somewhere else, I can upload it to zeroUpload or something on the clearnet for you to fetch (also, if my content.json contains critical info like keys or hashes that should not be made public, please tell me beforehand).

I'll update in a while. Keep Zeroing~

ssdifnskdjfnsdjk on Feb 26, 2019

mg0: about signing a site size limit IN the content.json

I am sorry, please forget it; I think this limit applies only to dynamic sites, in /data/users/content.json.

The content.json updated, but it does not contain the file you are linking on the index page, so it ends up showing "Not found". Can anyone else access the file I am linking?

mg0 on Feb 26, 2019

ssdifnskdjfnsdjk: "Not found" still there on sub page/s, seems the content.json was not published to me for lack of seeder or because of too big size, please check here

It should have been working, I left the rpi serving the site since yesterday. Maybe you'll have to delete the site on your end and try again, also, I checked your post at zerome and I had to manually set the limit to 200MB once but that was the first time I created the site. I still have to check for broken links on the last downloaded load of files and then I'll sign and publish the content.json, I'll tell ya all when that happens and hopefully the site will be ready (no beta) after a couple days and fixing HTML jump marks.

ssdifnskdjfnsdjk: You mentioned something about signing a site size limit IN the content.json; I wasn't aware of that and now I'm setting the limit to 220MB (about 80% is already used with that limit but the site's mostly static so I that that's ok). Hopefully it fixes the issue for you and others.

Keep Zeroing~

quantumworld on Feb 26, 2019

I wish whoever created that site could organize it better, but making a backup on zeronet is a damn good idea. I have had a bunch of sites taken down. CNN has actually gone around the web censoring sites by any means necessary. They think they are the moral authority on all information.

ssdifnskdjfnsdjk on Feb 26, 2019

"Not found" still there on sub page/s, seems the content.json was not published to me for lack of seeder or because of too big size, please check here

mg0 on Feb 25, 2019

So I just found out that somehow the mirroring command didn't download everything; I just added about 1.5GB to the site's weight (now around 5GB). Man! There sure is some content here!

mg0 on Feb 25, 2019

ssdifnskdjfnsdjk: @mg0 somehow I keep seeing a "Not found" error, right on the first sub-page I tried. I assume you signed and published your content.json file/s.

It works on my end (ofc); can you try again now? I'm adding new files that weren't mirrored before, but the files relevant to the page you visited are already signed and published, afaik. You could also try grepping the site's content.json on your end.
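
(A small sketch of that check in Python rather than grep; the inner path is only an example, and ZeroNet's content.json keeps its file lists under "files" and, for optional files, "files_optional":)

    # Check whether a given inner path is actually listed in the site's
    # content.json, in either the "files" or the "files_optional" map.
    import json

    INNER_PATH = "alchemyarchives/index.html"  # example path, adjust as needed

    with open("data/1D6QfsBSZHmYWbuCxKG5ya31BiTczoZiES/content.json") as fh:
        content = json.load(fh)

    listed = {**content.get("files", {}), **content.get("files_optional", {})}
    if INNER_PATH in listed:
        print(INNER_PATH, "is listed:", listed[INNER_PATH])
    else:
        print(INNER_PATH, "is NOT in content.json")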

ssdifnskdjfnsdjk on Feb 25, 2019

@mg0 somehow I keep seeing a "Not found" error, right on the first sub-page I tried. I assume you signed and published your content.json file/s.

mg0 on Feb 25, 2019

So I found out what stalls the zeronet client: whenever it tries to access a file that is missing from the site directory, it freezes. So now I'm trying to sanitize the site of broken links. I want to keep the site as unchanged as possible, so I'll just make empty files as placeholders for the missing ones.

I haven't done any thorough tests on this but the site might actually be quite performant once all broken links are fixed.

Any ideas on this are appreciated, keep zeroing~

[UPDATE]

Fixed most if not all of the broken links and synced with my raspberry pi, which will serve as the main seed for the site. Somehow the site size went down below 200MB; maybe I had an error in the previous optional-files regex I was using. The only issues with the site I'm aware of as of now are the HTML jump marks (like index.htm#bottom); those just don't work, and I don't know enough webdev to fix them on my own. If you happen to know a workaround for said jump marks, it would be greatly appreciated.

Other than that, the site is pretty usable. I've seen that media files other than videos won't be opened by the browser: mp4 videos will show, but pdf files must be downloaded, which is a bummer. I may have to edit all the pdf links to allow in-browser viewing, but that's not a requirement now.

mg0 on Feb 24, 2019

ssdifnskdjfnsdjk: On the ZeroBlog page can be found "Full support fileGet, fileList, dirList calls on tar.gz/zip files." [...]

I'll look at it; not that I'm well versed in js, but it seems a must to achieve some performance; updating 24K files every time isn't pleasant. And yeah, I was trying to come up with a better optional-files regex, and my internet connection isn't really amazing on weekends. I'd like to let everybody know this is kind of a beta now (should have put that at the very beginning :/), since many changes are to come. I left an rpi serving the site, but zeronet on it seems to stall drastically when the site updates, so I'm not sure it's actually serving the zite at all.

Thanks for all the comments.

blurhy on Feb 24, 2019

ssdifnskdjfnsdjk: On the ZeroBlog page can be found "Full support fileGet, fileList, dirList calls on tar.gz/zip files." [...]

Yes. ZeroNet has .json.gz support. The usage is the same.

ssdifnskdjfnsdjk on Feb 24, 2019

On the ZeroBlog page can be found "Full support fileGet, fileList, dirList calls on tar.gz/zip files."
I read somewhere on zeronet (possibly zerotalk - it is searchable from ZeroHello) that there is some support for archiving important files to reduce the size. Search for tar.gz in this topic.
Another option is to convert the static site's texts into zerotalk or zeroblog posts, though I am unsure what .db size becomes too much nowadays. @nofish @blurhy possibly know.
btw: many (or all?) subpages of your site show "Not Found" atm.

mg0 on Feb 23, 2019

quantumworld: I know, right? I've loved this site ever since I learned about it, so I thought it would make sense to decentralize it; as a site it has many, many files. I plan to update the mirror monthly, so I need to achieve some performance in the site config and then come up with an automated script for it. Glad you like it.

Btw, I'm tweaking the site at the moment. The non-optional files account for 500MB; I think that's ok, but I need a way to make the content.json file smaller, which I think may lead to quicker update times. That may mean I have to reorganize all the optional files somewhere, or find a type of archive that can contain many files but still allows in-browser viewing... sounds unlikely if you ask me.

quantumworld on Feb 23, 2019

mg0: I'll be tweaking the zite to hopefully make it snappier. quantumworld: I saw the pyro site and it works pretty well while being big; this one still has two orders of magnitude more files :/

I just got your site to load. Looks good. You are missing a couple of pdf links. Looks good for an exact mirror of the original site. Make sure you post it on the search engines. The information looks interesting as well, and if it is accurate info, someone will try to take it down. It's a good idea to mirror a site like this.

mg0 on Feb 23, 2019

I'll be tweaking the zite to hopefully make it snappier. quantumworld: I saw the pyro site and it works pretty well while being big; this one still has two orders of magnitude more files :/

quantumworld on Feb 23, 2019

mg0: Yeah, I already knew that. The thing is, I was trying to get it up and running on a raspberry pi before going home, but it was becoming slower and slower, so I thought the site was just too big or not yet properly set up as a zite. Sorry for not commenting on that at the time. Are there any resources on how to deal with such a big site? [...]

You can shrink it down further. I am not an expert at this; I am basing my information on what I have seen other sites do. The pyrotechnics site is very large (700MB) and it does a lot.

mg0 on Feb 23, 2019

quantumworld: It won't load. Says it has 2 peers and then times out. You need to leave ZeroNet running to get the info onto other sites. This isn't the regular internet; this is torrent-based. And if you already knew this, I just wasted my time writing this.

Yeah, I already knew that. The thing is, I was trying to get it up and running on a raspberry pi before going home, but it was becoming slower and slower, so I thought the site was just too big or not yet properly set up as a zite. Sorry for not commenting on that at the time. Are there any resources on how to deal with such a big site?

This one has over 20K files, and the whole thing weighs more than 3.5GB; just the html files sum to around 400MB alone! I made every non-html file optional, but the site still feels sluggish when updates happen. Also, I've noticed that the main content.json file is over 3MB. Is it possible to make every folder of the site have a content.json that lists that subdirectory's files, so the main content.json doesn't end up so bulky?
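
(ZeroNet does support this through the "includes" map of the root content.json, which delegates a subtree to its own content.json; a minimal sketch with a hypothetical subdirectory, keeping to the commonly documented permission fields:)

    # Delegate a subdirectory to its own content.json via "includes",
    # so the root content.json no longer lists that subtree's files.
    # The real file gains hashes and signatures when ZeroNet signs it.
    import json

    root = {
        "address": "1D6QfsBSZHmYWbuCxKG5ya31BiTczoZiES",
        "includes": {
            "alchemyarchives/content.json": {  # hypothetical subtree
                "signers": [],                 # only the site key may sign
                "signers_required": 1,
            },
        },
        # ... the root "files" map then only lists top-level files ...
    }

    with open("content.json", "w") as fh:
        json.dump(root, fh, indent=1)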

Thanks for comments.

quantumworld on Feb 23, 2019

mg0: This is a mirror of www.rexresearch.com, which contains a huge amount of info about alternative energy, medicine, and other fields of knowledge. Many invention patents and discussions are stored on the site. This is my first zite, and I'll be trying to keep it up to date periodically. Also, I'd appreciate opinions on how to handle the big size of the site in an approach fit for zeronet. [...]

It won't load. Says it has 2 peers and then times out. You need to leave ZeroNet running to get the info onto other sites. This isn't the regular internet; this is torrent-based. And if you already knew this, I just wasted my time writing this.

gluek on Feb 23, 2019

No peers...

mg0 on Feb 22, 2019

This is a mirror of www.rexresearch.com, which contains a huge amount of info about alternative energy, medicine, and other fields of knowledge. Many invention patents and discussions are stored on the site. This is my first zite, and I'll be trying to keep it up to date periodically. Also, I'd appreciate opinions on how to handle the big size of the site in an approach fit for zeronet.

Keep zeroing~
