-
@ 8fb140b4:f948000c
2023-07-30 00:35:01
Test Bounty Note
-
@ 8fb140b4:f948000c
2023-07-22 09:39:48
Intro
This short tutorial will help you set up your own Nostr Wallet Connect (NWC) on an LND node that is not running Umbrel. If you are an Umbrel user, you should use their version of NWC.
Requirements
You need a working installation of LND with established channels and connectivity to the internet. NWC itself is fairly lightweight and will not consume many resources. You will also want to ensure that you have a working installation of Docker, since we will use a Docker image to run NWC.
- Working installation of LND (and all of its required components)
- Docker (with Docker compose)
Installation
For the purpose of this tutorial, we will assume that you have your lnd/bitcoind running under user bitcoin with home directory /home/bitcoin. We will also assume that you already have a running installation of Docker (or docker.io).
Prepare and verify
- git version - we will need git to get the latest version of NWC.
- docker version - should execute successfully and show the currently installed version of Docker.
- docker compose version - same as before, but the version will be different.
- ss -tupln | grep 10009 - should produce output similar to:
  tcp LISTEN 0 4096 0.0.0.0:10009 0.0.0.0:*
  tcp LISTEN 0 4096 [::]:10009 [::]:*
For things to work correctly, your Docker should be version 20.10.0 or later. If you have an older version, consider installing a new one using instructions here: https://docs.docker.com/engine/install/
Create folders & download NWC
- In the home directory of your LND/bitcoind user, create a new folder, e.g., "nwc": mkdir /home/bitcoin/nwc
- Change to that directory: cd /home/bitcoin/nwc
- Clone the NWC repository: git clone https://github.com/getAlby/nostr-wallet-connect.git
Creating the Docker image
In this step, we will create a Docker image that you will use to run NWC.
- Change directory to nostr-wallet-connect: cd nostr-wallet-connect
- Run the command to build the Docker image (there is a dot at the end): docker build -t nwc:$(date +'%Y%m%d%H%M') -t nwc:latest .
- The last line of the output (after a few minutes) should look like: => => naming to docker.io/library/nwc:latest
- nwc:latest is the name of the Docker image with a tag, which you should note for use later (a quick way to confirm the image exists is shown below).
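As an optional sanity check, you can list the image you just built. This is a standard Docker command and assumes the nwc tags used above; nothing in the NWC project itself requires it.

```sh
# Show the locally built nwc images; both the date-stamped tag and
# nwc:latest should point at the same image ID.
docker image ls nwc
```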
Creating docker-compose.yml and necessary data directories
- Create a directory that will hold your non-volatile data (DB): mkdir data
- In the docker-compose.yml file there are fields that you need to replace (marked with <> comments), and port "4321" must be free (check with ss -tupln | grep 4321, which should return nothing).
- Create the docker-compose.yml file with the following content, and make sure to update the fields that have a <> comment (a sketch for generating the two secrets follows the file):

```yaml
version: "3.8"
services:
  nwc:
    image: nwc:latest
    volumes:
      - ./data:/data
      - ~/.lnd:/lnd:ro
    ports:
      - "4321:8080"
    extra_hosts:
      - "localhost:host-gateway"
    environment:
      NOSTR_PRIVKEY: <use "openssl rand -hex 32" to generate a fresh key and place it inside "">
      LN_BACKEND_TYPE: "LND"
      LND_ADDRESS: localhost:10009
      LND_CERT_FILE: "/lnd/tls.cert"
      LND_MACAROON_FILE: "/lnd/data/chain/bitcoin/mainnet/admin.macaroon"
      DATABASE_URI: "/data/nostr-wallet-connect.db"
      COOKIE_SECRET: <use "openssl rand -hex 32" to generate fresh secret and place it inside "">
      PORT: 8080
    restart: always
    stop_grace_period: 1m
```
Starting and testing
Now that you have everything ready, it is time to start the container and test.
- While you are in the nwc directory (important), execute the following command and check the log output: docker compose up
- You should see the container logs while it is starting; if everything went well, it should not exit.
- At this point, you should be able to go to http://<ip of the host where nwc is running>:4321 and reach the NWC interface (see the command-line check below if you prefer to verify from a terminal).
- To stop the test run of NWC, simply press Ctrl-C and it will shut the container down.
- To start NWC permanently, execute docker compose up -d; "-d" tells Docker to detach from the session.
- To check the logs of the currently running NWC, execute docker compose logs; to run it in tail mode, add -f at the end.
- To stop the container, execute docker compose down
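If you would rather verify from the command line before opening a browser, a minimal check (assuming ss and curl are available on the host) is to confirm the published port is listening and answering HTTP:

```sh
# The published port should now show up as listening...
ss -tupln | grep 4321
# ...and the web interface should answer; any HTTP status code printed here
# simply proves NWC is reachable on port 4321.
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:4321
```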
That's all. Just follow the instructions in the web interface to get started.
Updating
As with any software, you should expect fixes and updates that you would need to perform periodically. You could automate this, but it falls outside of the scope of this tutorial. Since we already have all of the necessary configuration in place, the update execution is fairly simple.
- Change directory to the clone of the git repository: cd /home/bitcoin/nwc/nostr-wallet-connect
- Pull the latest source code: git pull (without this step the rebuild would reuse the code you already have)
- Run the command to build the Docker image (there is a dot at the end): docker build -t nwc:$(date +'%Y%m%d%H%M') -t nwc:latest .
- Change directory back one level: cd ..
- Restart (stop and start) the docker compose config: docker compose down && docker compose up -d
- Done! Optionally, you may want to check the logs: docker compose logs (a small helper script combining these steps is sketched below)
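If you update often, the steps above can be collected into a small helper script. This is only a sketch under the assumptions of this tutorial (the /home/bitcoin/nwc layout and the nwc image tags); it is not part of the upstream project.

```sh
#!/bin/sh
# Rebuild the NWC image from the latest source and restart the container.
set -e
cd /home/bitcoin/nwc/nostr-wallet-connect
git pull
docker build -t nwc:$(date +'%Y%m%d%H%M') -t nwc:latest .
cd ..
docker compose down && docker compose up -d
docker compose logs --tail=50
```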
-
@ d2e97f73:ea9a4d1b
2023-04-11 19:36:53
There’s a lot of conversation around the #TwitterFiles. Here’s my take, and thoughts on how to fix the issues identified.
I’ll start with the principles I’ve come to believe…based on everything I’ve learned and experienced through my past actions as a Twitter co-founder and lead:
- Social media must be resilient to corporate and government control.
- Only the original author may remove content they produce.
- Moderation is best implemented by algorithmic choice.
The Twitter when I led it and the Twitter of today do not meet any of these principles. This is my fault alone, as I completely gave up pushing for them when an activist entered our stock in 2020. I no longer had hope of achieving any of it as a public company with no defense mechanisms (lack of dual-class shares being a key one). I planned my exit at that moment knowing I was no longer right for the company.
The biggest mistake I made was continuing to invest in building tools for us to manage the public conversation, versus building tools for the people using Twitter to easily manage it for themselves. This burdened the company with too much power, and opened us to significant outside pressure (such as advertising budgets). I generally think companies have become far too powerful, and that became completely clear to me with our suspension of Trump’s account. As I’ve said before, we did the right thing for the public company business at the time, but the wrong thing for the internet and society. Much more about this here: https://twitter.com/jack/status/1349510769268850690
I continue to believe there was no ill intent or hidden agendas, and everyone acted according to the best information we had at the time. Of course mistakes were made. But if we had focused more on tools for the people using the service rather than tools for us, and moved much faster towards absolute transparency, we probably wouldn’t be in this situation of needing a fresh reset (which I am supportive of). Again, I own all of this and our actions, and all I can do is work to make it right.
Back to the principles. Of course governments want to shape and control the public conversation, and will use every method at their disposal to do so, including the media. And the power a corporation wields to do the same is only growing. It’s critical that the people have tools to resist this, and that those tools are ultimately owned by the people. Allowing a government or a few corporations to own the public conversation is a path towards centralized control.
I’m a strong believer that any content produced by someone for the internet should be permanent until the original author chooses to delete it. It should be always available and addressable. Content takedowns and suspensions should not be possible. Doing so complicates important context, learning, and enforcement of illegal activity. There are significant issues with this stance of course, but starting with this principle will allow for far better solutions than we have today. The internet is trending towards a world where storage is “free” and infinite, which places all the actual value on how to discover and see content.
Which brings me to the last principle: moderation. I don’t believe a centralized system can do content moderation globally. It can only be done through ranking and relevance algorithms, the more localized the better. But instead of a company or government building and controlling these solely, people should be able to build and choose from algorithms that best match their criteria, or not have to use any at all. A “follow” action should always deliver every bit of content from the corresponding account, and the algorithms should be able to comb through everything else through a relevance lens that an individual determines. There’s a default “G-rated” algorithm, and then there’s everything else one can imagine.
The only way I know of to truly live up to these 3 principles is a free and open protocol for social media, that is not owned by a single company or group of companies, and is resilient to corporate and government influence. The problem today is that we have companies who own both the protocol and discovery of content. Which ultimately puts one person in charge of what’s available and seen, or not. This is by definition a single point of failure, no matter how great the person, and over time will fracture the public conversation, and may lead to more control by governments and corporations around the world.
I believe many companies can build a phenomenal business off an open protocol. For proof, look at both the web and email. The biggest problem with these models however is that the discovery mechanisms are far too proprietary and fixed instead of open or extendable. Companies can build many profitable services that complement rather than lock down how we access this massive collection of conversation. There is no need to own or host it themselves.
Many of you won’t trust this solution just because it’s me stating it. I get it, but that’s exactly the point. Trusting any one individual with this comes with compromises, not to mention being way too heavy a burden for the individual. It has to be something akin to what bitcoin has shown to be possible. If you want proof of this, get out of the US and European bubble of the bitcoin price fluctuations and learn how real people are using it for censorship resistance in Africa and Central/South America.
I do still wish for Twitter, and every company, to become uncomfortably transparent in all their actions, and I wish I forced more of that years ago. I do believe absolute transparency builds trust. As for the files, I wish they were released Wikileaks-style, with many more eyes and interpretations to consider. And along with that, commitments of transparency for present and future actions. I’m hopeful all of this will happen. There’s nothing to hide…only a lot to learn from. The current attacks on my former colleagues could be dangerous and don’t solve anything. If you want to blame, direct it at me and my actions, or lack thereof.
As far as the free and open social media protocol goes, there are many competing projects: @bluesky is one with the AT Protocol, nostr another, Mastodon yet another, Matrix yet another…and there will be many more. One will have a chance at becoming a standard like HTTP or SMTP. This isn’t about a “decentralized Twitter.” This is a focused and urgent push for a foundational core technology standard to make social media a native part of the internet. I believe this is critical both to Twitter’s future, and the public conversation’s ability to truly serve the people, which helps hold governments and corporations accountable. And hopefully makes it all a lot more fun and informative again.
💸🛠️🌐 To accelerate open internet and protocol work, I’m going to open a new category of #startsmall grants: “open internet development.” It will start with a focus of giving cash and equity grants to engineering teams working on social media and private communication protocols, bitcoin, and a web-only mobile OS. I’ll make some grants next week, starting with $1mm/yr to Signal. Please let me know other great candidates for this money.
-
@ 32e18276:5c68e245
2023-07-30 21:19:40
Company Overview:
Damus is the pioneering iOS nostr client. Through its platform, Damus empowers billions on iOS with the tools for free speech and free speech money. If you're driven to bring freedom technology to the masses and ignite change, we invite you to join our mission.
Job Description
- Collaborate on iOS Damus in tandem with our core developer, Will Casarin, and the broader Damus team.
- Implement our vision as laid out in our product roadmap: https://github.com/orgs/damus-io/projects/3/views/1.
- Embrace the fun and critical mission of undermining totalitarian regimes across the globe.
Job Requirements
- A genuine passion for freedom technology.
- At least one year of collaborative development experience.
- Experience building SwiftUI iOS apps.
- Passionate about design and user experience.
- Eager to work in close coordination with Damus lead developer, Will, and a dedicated team spanning development, design, and product.
- Commitment to a full-time role, although we remain open to discussing alternative arrangements.
Bonus Qualifications
- Experience with Nostr development.
- Experience with C.
- Previous work in free and open source projects.
- A publicly shareable portfolio.
Job Structure
- A one-month paid probationary period to ensure a mutual fit.
- Upon successful completion of the trial, the opportunity for a six (6) month contractual engagement.
- The potential for contract renewal, contingent on funding.
Application Process:
Interested candidates should forward a motivational statement alongside their CV/portfolio to vanessa@damus.io.
-
@ 1bc70a01:24f6a411
2023-07-30 07:43:26
Originally published on October 15, 2022. Moving from my personal blog to Nostr 🙌
Don't trust, verify
Some people find this very annoying. But I've realized that most people are winging it in life. Human psychology and power dynamics influence how people think and what they say. People want to seem right so they copy what others say, and expect you to believe it.
One way people consume information is through media - news and articles. They'll read something and assume it's true.
Cited by a "study"? Must be true.
"Experts say"?
Gotta be true.
Except, if you check the sources and do some critical thinking, you might uncover that the study is flawed or that the experts are biased.
Simple things like statistical significance are thrown out the window in some studies.
People overlook personal biases.
This all leads to poor data and misrepresentation. This is how you get click-bait headlines.
The only way to know what's true is to try and verify it for yourself.
Even if you are not able to verify the facts, you can come away with a range of certainty about what you learned. Is it probably true? May be true. Certainly true? Or maybe there is a grain of truth to it? The level of confidence can vary. You don't need to be 100% confident in your newly acquired knowledge.
Treat new knowledge as data points
"It depends" is the most appropriate answer to all questions in life. The actual answer is very nuanced and depends on all sorts of things.
But, the way media presents information and the way our minds try to categorize it is by replacing past beliefs.
Once you learn something new, it's tempting to disregard your past knowledge.
I don't see information this way.
To me, all new information is a data point in my life-long accumulated collection of other data points.
Think of it like a toolbox. You have many tools in your toolbox and they are all useful for some specific task. You don't throw away a tool you bought because you acquired another.
Information is much the same. You source information from your mind to understand some subject. Consider existing beliefs, while looking at the new information. The goal is to connect the dots.
Sooner or later the data points (dots) start connecting, and you get a better picture of the subject.
Seek out incentives
Incentives drive human behavior.
Incentive to earn.
Incentive to love.
Incentive to feel good about yourself (generosity).
If you can figure out a person's incentives, you will start to understand their behavior.
People suck at understanding this. They blame the wrong things. They assign responsibility to people who are not responsible. They create made-up reasons for why someone did something. All the while ignoring incentives.
Incentives drive everything. Ask the following questions:
- What is the ultimate goal of this person by saying or doing this?
- What do they seek to gain?
- How does it make them feel?
- What will they do at the prospect of failure?
- Who are the people they care for?
- What do they stand to lose?
- What motivates their actions?
In almost all cases, the answer to these questions involves a personal gain or loss.
Give people the benefit of the doubt
When trying to decide whether something was said or done with good intent, assume that's the case.
With exceptions such as career politicians or criminals, it is best to assume that people mean well and people are default=nice.
Otherwise it becomes difficult to have faith in systems, processes and humanity.
I know the world can be an ugly place, but it can also be a beautiful place. I'd rather assume people are seeking to build a better future for themselves.
If they act against everyone's best interest, it is best to assume they do so out of personal gain rather than some sinister plot. Look at the incentives that drive their behavior and figure out how to modify them to best serve everyone.
Acknowledge that material wealth does not bring happiness and strive for what matters to you.
Money does not bring happiness no matter what anyone says. It does relieve you of despair and help you live a better, more peaceful and enjoyable life, but it will not make you happy.
Very few people in the world, if any, know true happiness because they confuse happiness (the process) with happiness (the end state). There is no end state for happiness.
When people ask you if you are happy, it's hard to answer, right?
Your instinct is to say "yes, of course". After all, your day is going fine, you have every comfort you desire and you had a pleasant lunch with a friend. You feel "happy".
But, this happiness fades. Tomorrow you get in a car accident, your car is damaged, insurance won't pay out, and you miss two weeks of work. You are "unhappy".
Some people confuse happiness with being content. You can be content with your place in life, but that doesn't mean you're happy.
The whole debate over happiness is a moot point. Since happiness is a process which is more akin to enjoyment, ultimately there is no such thing as happiness. The word itself should disappear from dictionaries.
Ok, I'm joking. But, you get the point.
When it comes to my life, I strive for satisfaction and contentment.
If I'm satisfied that day, that week, that month, that year, or the last decade, then I've met my "happiness" goals. If I am content with my life, then I'm "happy".
While material wealth can provide satisfaction, the contentment piece is often missing. With wealth, the goal post moves from one spot to another. Once you've achieved a satisfying moment, you have to move to the next, and the next. Each time you have to introduce higher states of satisfaction that require more effort and money.
Eventually, you plateau and satisfying moments from the physical world no longer matter. You seek out emotional attachment and relationships.
This is why so many wealthy people are depressed. No matter how much they spend, they can never reach that emotional attachment provided by love and relationships.
In fact, wealth only complicates those things. Instead of people seeing you for who you are as a person, a new variable is introduced and they see you for your status. With misaligned incentives, it becomes more difficult to form genuine relationships. Thus, contentment is hard to achieve.
I'm going to let you in on the biggest secret everyone is asking about - the secret to happiness.
The secret to happiness is charitable work. Doing things for others, without expecting anything in return.
Charitable acts are secretly selfish acts, whether that person acknowledges it or not.
All humans act in self-preservation. The need to feel good about ourselves. The need for approval. The need for comfort and safety. The need for recognition. All of these needs are driven by self-preservation.
Charity is no different. While we may feel that we do good things for others out of the goodness of our hearts, the evolutionary incentive is to feel satisfied by our deeds. There's nothing wrong with doing good things to feel good, as long as you understand why you're doing it, instead of performing mental gymnastics.
We are not designed for our digital world and should not strive to adapt to it
Evolution took hundreds of thousands of years to shape humans into what we are today. But technology accelerated the process by many magnitudes all within about 100 years. Except, it hasn't. The evolutionary clock is still catching up, out of breath and exhausted with your daily activity.
While we adapt rather well, humans are not capable of processing the amount of data and connections offered by the internet. This ultimately messes with our minds.
Think about your current physical world social circle. Small, yes?
Now think about your digital social circle. Much larger?
How many people do you think about out of your larger digital social circle? My guess is not that many.
While you may recognize the individuals you interact with, the meaningful connections formed in the digital world closely mirror those of the physical world. I hypothesize this is because we have innate evolutionary boundaries that limit our ability to expand beyond a certain natural limit.
Going beyond the invisible evolutionary boundaries places a burden on your mind. It feels unhealthy.
Remember the last time you thought to yourself "I need a break. I need to unplug"? That's your evolutionary boundary telling you it is not capable of processing any more data. You need a purge to get back to equilibrium.
We are not designed to be always-on, always-connected, always-responsive. At least not yet. It's possible the evolutionary process catches up and prepares us to handle all of it, but I am not confident it will be any time soon.
I try to remind myself that I am not meant to have 50,000 followers and expect to keep everyone happy. I am not meant to follow everyone's updates and be expected to respond. To stay healthy, I need to unplug.
Life is what you make of it, do what you wish to do and ignore the rest
There's no "right way" to do anything. Everything is a figment of human imagination. To do things "the right way" means doing it by someone else's definition of right.
I found it incredibly freeing to realize that the only thing that is pure in this world is the natural world and your mind.
The world is a canvas.
You are the paint brush.
Everyone is capable of everything and nothing at the same time.
Everything else is noise.
Yes, we have society. Yes, we have rules. These artificial, human-invented boundaries act to preserve harmony. But, ultimately, the only pure forms of existence are the human mind and the natural world. What you do is the only thing within your boundary of control (to an extent).
Since we experience the world with our minds, we are all that is true. Everything else is a projection of some entity or experience that seeks to change our behavior (intentionally or unintentionally).
With this in mind, I look at people's experiences as unique to them and do not assume that they will translate in the same way to my life if I apply them.
Only worry about things that are in my control
I can't control what people do. I can try to influence them, but ultimately they will do what they decide to do. Since I can only control myself, I don't worry about what others think or do.
Never regret
It's easy to say "I don't regret things" but in reality emotions take us there and we think about what we regret.
I would be lying if I said I don't have feelings of regret. But, I forget about them quickly so I can focus on the things I can control. What passed is now behind me and there's no way to change it back, so there's no point in regretting.
Reflection is not the same as regret. One can reflect by looking at the past and make changes for the future. Regrets are bitter feelings and no action.
Truth is relative
Even as I write this, my own truths are relative. What is true for me, or is true now, may not be true for you or true when you read it. It may be wrong.
People used to believe earth was the center of the universe. That the sun went around the earth. That to ward off bad spirits you had to offer sacrifices.
These were all truths to the people who held those beliefs. Now we know different.
Given what we know now, it is only logical to assume that our current truths may be false in the future. Even physics, the foundation upon which we base so many truths, may be proven wrong or incompatible with other parts of the universe.
Most things are not the best version of themselves; therefore, there is always room for improvement.
Since things get better all the time, it is only logical to assume that even the best thing we can think of is not the best version of that thing. The "best" is relative to time, technology and imagination.
Whenever someone says "but we already have X that works great", that doesn't mean Y cannot be better than X; it just means nobody has thought of it yet.
Stay curious, stay hungry and know that you can create something better.
Take experts with a grain of salt
Even the top experts in their fields are often incorrect.
For example - I know a lot about conversion optimization, but if you were to take every advice I dish out, some of it will inevitably flop.
When you get to an expert level and someone asks you something you don't know, it is tempting to give an answer anyway. When many people ask you for something, it's even more tempting to respond.
Saying "I don't know" is difficult even if you feel that you should say it. Before you can utter the words, your mind says "I should know this..."
The second factor that makes experts less credible than they may have been in the past is the speed at which technology moves.
Think about a specialized doctor. This doctor may be the top expert in their field, but it is impossible for them to read the thousands of medical journals published every day. So many breakthroughs could happen in a year without this doctor ever knowing about them.
By the time an expert weighs in, the "truth" of the matter may have already shifted.
Knowing this, you can assume that not everything an expert says is correct, especially if they step outside of their lane.
Elon Musk is a great example of this. While Elon is a successful entrepreneur and perhaps knows a thing or two about rocket engines, when he steps outside of his lane and talks about crypto, he is no more coherent than someone who spent a lot of time in the space. He uses his influence to prop up things like dogecoin for fun and games, while people lose their life savings following his doge tweets.
I'm not saying that he is responsible for people's personal decisions, but he does influence them.
-
@ 957966b6:2d4fe6b7
2023-07-29 22:45:52
Yahweh’s word which came to Zephaniah, the son of Cushi, the son of Gedaliah, the son of Amariah, the son of Hezekiah, in the days of Josiah, the son of Amon, king of Judah.
I will utterly sweep away everything from the surface of the earth, says Yahweh. I will sweep away man and animal. I will sweep away the birds of the sky, the fish of the sea, and the heaps of rubble with the wicked. I will cut off man from the surface of the earth, says Yahweh. I will stretch out my hand against Judah and against all the inhabitants of Jerusalem. I will cut off the remnant of Baal from this place—the name of the idolatrous and pagan priests, those who worship the army of the sky on the housetops, those who worship and swear by Yahweh and also swear by Malcam, those who have turned back from following Yahweh, and those who haven’t sought Yahweh nor inquired after him.
Be silent at the presence of the Lord Yahweh, for the day of Yahweh is at hand. For Yahweh has prepared a sacrifice. He has consecrated his guests. It will happen in the day of Yahweh’s sacrifice that I will punish the princes, the king’s sons, and all those who are clothed with foreign clothing. In that day, I will punish all those who leap over the threshold, who fill their master’s house with violence and deceit.
In that day, says Yahweh, there will be the noise of a cry from the fish gate, a wailing from the second quarter, and a great crashing from the hills. Wail, you inhabitants of Maktesh, for all the people of Canaan are undone! All those who were loaded with silver are cut off. It will happen at that time, that I will search Jerusalem with lamps, and I will punish the men who are settled on their dregs, who say in their heart, “Yahweh will not do good, neither will he do evil.” Their wealth will become a plunder, and their houses a desolation. Yes, they will build houses, but won’t inhabit them. They will plant vineyards, but won’t drink their wine.
The great day of Yahweh is near. It is near and hurries greatly, the voice of the day of Yahweh. The mighty man cries there bitterly. That day is a day of wrath, a day of distress and anguish, a day of trouble and ruin, a day of darkness and gloom, a day of clouds and blackness, a day of the trumpet and alarm against the fortified cities and against the high battlements. I will bring such distress on men that they will walk like blind men because they have sinned against Yahweh. Their blood will be poured out like dust and their flesh like dung. Neither their silver nor their gold will be able to deliver them in the day of Yahweh’s wrath, but the whole land will be devoured by the fire of his jealousy; for he will make an end, yes, a terrible end, of all those who dwell in the land.
Gather yourselves together, yes, gather together, you nation that has no shame, before the appointed time when the day passes as the chaff, before the fierce anger of Yahweh comes on you, before the day of Yahweh’s anger comes on you. Seek Yahweh, all you humble of the land, who have kept his ordinances. Seek righteousness. Seek humility. It may be that you will be hidden in the day of Yahweh’s anger. For Gaza will be forsaken, and Ashkelon a desolation. They will drive out Ashdod at noonday, and Ekron will be rooted up. Woe to the inhabitants of the sea coast, the nation of the Cherethites! Yahweh’s word is against you, Canaan, the land of the Philistines. I will destroy you until there is no inhabitant. The sea coast will be pastures, with cottages for shepherds and folds for flocks. The coast will be for the remnant of the house of Judah. They will find pasture. In the houses of Ashkelon, they will lie down in the evening, for Yahweh, their God, will visit them and restore them. I have heard the reproach of Moab and the insults of the children of Ammon, with which they have reproached my people and magnified themselves against their border. Therefore, as I live, says Yahweh of Armies, the God of Israel, surely Moab will be as Sodom, and the children of Ammon as Gomorrah, a possession of nettles and salt pits, and a perpetual desolation. The remnant of my people will plunder them, and the survivors of my nation will inherit them. This they will have for their pride, because they have reproached and magnified themselves against the people of Yahweh of Armies. Yahweh will be awesome to them, for he will famish all the gods of the land. Men will worship him, everyone from his place, even all the shores of the nations.
You Cushites also, you will be killed by my sword.
He will stretch out his hand against the north, destroy Assyria, and will make Nineveh a desolation, as dry as the wilderness. Herds will lie down in the middle of her, all kinds of animals. Both the pelican and the porcupine will lodge in its capitals. Their calls will echo through the windows. Desolation will be in the thresholds, for he has laid bare the cedar beams. This is the joyous city that lived carelessly, that said in her heart, “I am, and there is no one besides me.” How she has become a desolation, a place for animals to lie down in! Everyone who passes by her will hiss and shake their fists.
Woe to her who is rebellious and polluted, the oppressing city! She didn’t obey the voice. She didn’t receive correction. She didn’t trust in Yahweh. She didn’t draw near to her God.
Her princes within her are roaring lions. Her judges are evening wolves. They leave nothing until the next day. Her prophets are arrogant and treacherous people. Her priests have profaned the sanctuary. They have done violence to the law. Yahweh, within her, is righteous. He will do no wrong. Every morning he brings his justice to light. He doesn’t fail, but the unjust know no shame.
I have cut off nations. Their battlements are desolate. I have made their streets waste, so that no one passes by. Their cities are destroyed, so that there is no man, so that there is no inhabitant. I said, “Just fear me. Receive correction,” so that her dwelling won’t be cut off, according to all that I have appointed concerning her. But they rose early and corrupted all their doings.
“Therefore wait for me”, says Yahweh, “until the day that I rise up to the prey, for my determination is to gather the nations, that I may assemble the kingdoms to pour on them my indignation, even all my fierce anger, for all the earth will be devoured with the fire of my jealousy.
For then I will purify the lips of the peoples, that they may all call on Yahweh’s name, to serve him shoulder to shoulder. From beyond the rivers of Cush, my worshipers, even the daughter of my dispersed people, will bring my offering. In that day you will not be disappointed for all your doings in which you have transgressed against me; for then I will take away out from among you your proudly exulting ones, and you will no more be arrogant in my holy mountain. But I will leave among you an afflicted and poor people, and they will take refuge in Yahweh’s name. The remnant of Israel will not do iniquity nor speak lies, neither will a deceitful tongue be found in their mouth, for they will feed and lie down, and no one will make them afraid.”
Sing, daughter of Zion! Shout, Israel! Be glad and rejoice with all your heart, daughter of Jerusalem. Yahweh has taken away your judgments. He has thrown out your enemy. The King of Israel, Yahweh, is among you. You will not be afraid of evil any more. In that day, it will be said to Jerusalem, “Don’t be afraid, Zion. Don’t let your hands be weak.” Yahweh, your God, is among you, a mighty one who will save. He will rejoice over you with joy. He will calm you in his love. He will rejoice over you with singing. I will remove those who grieve about the appointed feasts from you. They are a burden and a reproach to you. Behold, at that time I will deal with all those who afflict you; and I will save those who are lame and gather those who were driven away. I will give them praise and honor, whose shame has been in all the earth. At that time I will bring you in, and at that time I will gather you; for I will give you honor and praise among all the peoples of the earth when I restore your fortunes before your eyes, says Yahweh.
The World English Bible is in the Public Domain. That means that it is not copyrighted. However, "World English Bible" is a Trademark of eBible.org.
-
@ 00000001:a21169a7
2023-07-29 19:50:18
August Sander (1876-1964), a prominent figure in documentary photography, played a pivotal role in challenging and resisting Nazi ideology through his work during the difficult years preceding and following Adolf Hitler's rise to power in Germany.
Sander's most famous project, "People of the 20th Century," stands as a testament to this silent resistance. This monumental body of work, portraying a broad cross-section of German society, presented a bold challenge to the Nazi notion of a "superior race".
Sander's strategy was to humanize all the people he photographed, regardless of their social status or race. His approach was neither to glorify nor denigrate, but simply to portray people as they were, a vision that was in stark contrast with the Nazis' racial ideology.
Furthermore, Sander sought to dismantle the Nazis' rigid racial and social categories. In "People of the 20th Century," the range of subjects is impressively broad: aristocrats, peasants, artists, manual workers, women, men, Jews, non-Jews. By juxtaposing these different categories, Sander underscored the diversity and complexity of German society, thus challenging the Nazi notion of a "pure race".
Despite the resistance he faced, Sander's courage never wavered. His photographs were confiscated and destroyed by the Nazis, and his book "Face of Our Time" was banned in 1936. Yet, Sander persisted, and while he was forced to switch his focus to landscape photography to avoid further confrontations, he continued to document his time's Germany until his death in 1964.
Sander's resistance to Nazi ideology was not frontal or violent, but subversive and silent. His camera became his weapon, and his photographs, his resistance. By capturing his subjects' humanity, he demonstrated that diversity, far from being a threat, is a source of richness and vitality in a society.
August Sander's work remains a powerful tool for understanding the social and political context of Germany in the first half of the 20th century. At the same time, his work continues to be a reminder of the importance of preserving and respecting human dignity, regardless of race, religion, or social status. In this sense, Sander's photography not only documents the reality of his era but also offers a powerful critique of Nazi ideology and a more inclusive and humanitarian vision of society.
-
@ 00000001:a21169a7
2023-07-29 19:42:42
If for Henri Cartier-Bresson the essence of photography was the decisive moment, for William Klein the defining factor is commitment. This assertion, made by Raphaëlle Stopin, curator of the 'William Klein. Manifesto' exhibition, couldn't be more accurate. At 91 years old and with multiple facets to his name - photographer, painter, filmmaker, publicist, writer, and activist - Klein has left an indelible mark on the landscape of art and photography with his irreverent commitment and radical approach.
Born in New York in 1928, Klein transformed mid-20th-century photography, creating an aesthetic language that evokes the rawness and emotions of a post-war society still to be rebuilt. Through his lens, we see a vibrant, dirty, mestizo, and ever-evolving New York. His unconventional approach, which mixed with the subjects he portrayed, broke with the standards of photography of his time, resulting in images that the white, Anglo-Saxon elite of the era found distasteful. However, his honest and committed vision ended up paving a new path in photography.
Klein began his creative journey in painting, where he already displayed a unique, free, and transgressive language. His entry into the world of photography happened fortuitously, by photographing his own paintings, and it was in 1956 when his vision exploded onto the scene, with the publication of 'Life is Good & Good for You in New York: Trance Witness Revels'. This work, the fruit of his immersion in the wild side of New York life, became a revolutionary manifesto that subverted the traditional principles of photography.
Klein's photographs were never just images. They were bold statements of freedom and experimentation, going against the conventions of his time. It was with his legendary book about New York that he became a disruptive figure in photography. Despite being published in France, Italy, and Japan, this book took 40 years to be published in the United States, a fact that evidences the rejection his raw and uninhibited gaze of the city caused.
But Klein was never content with just being a photographer. His enormous cultural baggage, inspired by figures such as Masaccio, Piero della Francesca, and the Bauhaus, propelled him to explore other forms of expression. In his work, we see the fusion of photography with painting, the 'painted contacts', where he applies paint to contact sheets with large brushes, creating a fusion of disciplines that is surprisingly innovative and evocative.
Today, at 91 years old, Klein continues to work, reinventing himself and reinterpreting his archive. Although his pace has changed, his commitment to his art and his language has not diminished. His legacy endures, and his influence continues to be visible in contemporary photography.
Is Klein a Leonardo of photography? Perhaps it's not an exact comparison. But, just like Da Vinci, Klein has shown a diversity of interests and skills that rivals any Renaissance polymath's. And, unlike Da Vinci, Klein is indisputably wilder and more committed.
In summary, William Klein is a visionary, a provocateur, and an iconoclast, whose work remains a source of inspiration and a challenge to today's artists and photographers. With his committed, free, and innovative gaze, Klein has demonstrated that photography, like any form of art, should always be an act of commitment and revolution.
-
@ 1bc70a01:24f6a411
2023-07-29 13:15:49
I’m going to think out loud about the UAP hearing a bit, feel free to tune me out.
Scenario 1
The one most of us are probably thinking: This entire thing is a distraction from one or many known or unknown issues currently underway. It must be so bad that it warrants national attention to the all-entertaining “UAP” subject.
For this to be true, we have to assume one of the following:
1. The people testifying are lying. Maybe paid actors, maybe not, who knows, but they are not telling the truth.
2. They are telling their “truth”, however they obtained it, perhaps even their reality. We do know UFOs are commonly reported worldwide, so perhaps they are just telling what they saw.
In terms of congressional involvement, scenario one would mean that some or all of the congress members are full of shit. This is an unlikely scenario in my mind (hard to get everyone to go along with a lie). The more likely scenario is that they are in it for the wild ride, totally unaware they are being used as pawns (or at least some aren’t).
What makes this scenario possible in my opinion is that Grush is allowed to speak about the events publicly but not discuss details (unless privately). This makes for good public theatre. Why would unsanctioned projects allow Grush to even go so far as to speak about the events when a slip and fall accident could occur at any moment? The answer would have to be either:
1. Because it’s all bullshit, or mostly bullshit.
2. Because the agencies involved are really THAT incompetent and hadn’t figured out what is happening.
3. Because they have no problem with him talking about it because they would like to slowly disclose alien technology and the fact that we’re not alone.
4. To serve the purpose of distraction.
Assuming scenario 1 is correct, the most likely scenario is that the people involved are telling their own version of truth and that some of the congressional members are unaware of the real reason they’re there.
This is an easy fallback for any time you want to divert attention from some other pressing matters. Just talk about UFOs!
The earlier story in NY Times would also make sense in this case - as the NYT has been known to act as the propaganda arm of the government. The NY Times story brought a lot of attention to the matter, but we have to also wonder why this was under the rug for so many years? Perhaps because it’s pure bullshit.
The “It’s real!” Scenario….
Let us assume that this is not a distraction. The people testifying are telling the truth. Everything is as they say it is. Ooof!
This would mean that everything Grush says is true - there are unsanctioned programs and or people with extreme privilege and clearance not granted to anyone else. People that are not even part of the government in any way.
It would also mean we have recovered alien craft and bodies and have attempted to reverse engineer their tech. Perhaps we already succeeded and what we see flying around today is this reverse engineered tech, or actual ETs.
What would make this scenario real?
Well, there is no shortage of UFO sightings. People swearing their life on it. For us to discount all of them would mean calling everyone a liar, or just confused about some other phenomena. I wouldn’t bet on all liars and delusional, but confused, maybe. Still, some accounts are just too out there to dismiss without some serious thought. The fact that people’s stories seem to match could mean that there’s truth to those reports.
Perhaps the aliens use pods (classic saucer shapes, “tic tacs”, or other shaped craft). Maybe the “cubes” in orbs are their versions of drones? Or not? Maybe the large cubes are spacecraft.
Yes, they cover vast amounts of space quickly and STILL crash on our planet. Rookies that failed the landing maneuver in flight school… heh.
The governments of the world are all cooperating in hiding the truth and or the US government has unprecedented access to other parts of the world to recover and cover up various incidents. WOW… ok. That means we have something like a one world alien-communication government that has unprecedented access beyond anyone’s control. How??
If this is the real deal, we should all be losing our minds right now. The prospect of another species with vastly superior technology with interest in our nuclear dealings should scare everyone. Forget everything else, this is priority #1.
THE PROBLEM with this scenario?
- Some foreign govt. would have already revealed what they know. Would they not? Are we to believe they are all collaborating on this with some inter-government agency that acts with ultimate authority? Why? How? Maybe they are already speaking out and I’m unaware?
- We STILL don’t have any definitive evidence. He said, she said, they said. “First account” from a “trusted source”. OK.
- Aliens possess mind-blown tech, but still crash once in a while. I guess you could say that there will always be a small chance of crash no matter who is flying what. I’ll give people that.
- “It’s classified”. No amount of talking seems to reveal the actual “classified” information. Even during the congressional meeting it’s always “I can tell you privately”, and “I’m not permitted to talk about this by law”. Which is it? Are you, or are you not allowed to speak on the matter? Why can you disclose it behind closed doors but not in public? Seems rather strange to me.
In the “It’s all real!” Scenario, we are all screwed. Either we are being prepped for eventual disclosure of alien life or aliens can do crap we can’t protect against. If we’re being prepared for disclosure, it may mean that we have no say in how things unfold on our planet and the best we can do is ease people into it without everyone losing their minds. In that case, a series of hearings on the matter might make sense. Keep talking about it louder and louder until people are no longer shocked to hear the truth. Buckle up everyone, things are about to get interesting!
Scenario 3: “Trust Us, We’re here to serve you”
From the hearing, we hear multiple times congress mention “distrust in the government” and “rebuild trust”. In line with Scenario 1, this would mean everything is just a smoke screen to “build trust”. “Oh look, congress cares about me😍!!”
It would make sense why nothing has been revealed concretely and no foreign government came forward yet. Didn’t you know, UFOs only crash in the states! Haha.
Congress is just “building back trust”, either fully aware of the lies, or using mentally unstable people for their gain. Not saying the witnesses are mentally unstable, but you never really know, do you? Tell yourself a story enough times, it becomes a reality.
My Personal Take?
I wish to remain open minded about this subject. There are too many questions unanswered for me and I don’t want to dismiss anything entirely. Could there be aliens visiting this planet? I don’t see why not. We don’t know what we don’t know. For me to dismiss the possibility of non-human life traveling to our planet would mean to believe we have learned everything there is to learn about physics and the universe, when in fact it is the opposite, we know little to nothing.
Could it be bullshit? Yes, of course. I also like to consider things from a practical perspective and think in terms of probabilities rather than right and wrong, yes or no. If I had to guess the odds of aliens traveling to our planet, I’d give it at 5% chance. So yeah, not much. I’m much more inclined to believe this is all just human manipulation to achieve some agenda. I won’t speculate what that agenda might be as it’s anyone’s guess really. There’s no way to prove anything.
Should all of this be real, things are going to get very interesting…
-
@ 00000001:a21169a7
2023-07-29 08:32:13
Robert Frank, born on November 9, 1924, in Zurich, Switzerland, was the child of Rosa (Zucker) and Hermann Frank, of Jewish origin. Though his family managed to remain safe during the Second World War in Switzerland, the looming threat of Nazism informed his understanding of oppression, invariably seeping into his subsequent work. Frank was drawn to photography partly as a means to escape from his business-oriented family and home. He was mentored by various photographers and graphic designers before he crafted his first handmade book of photographs, "40 Photos," in 1946.
At the age of 23, Frank migrated to the United States, seeking to broaden his photographic horizon. He found employment in New York as a fashion photographer for Harper's Bazaar. However, the rigidity and artifice of fashion photography failed to satiate his creative yearning. He soon quit his job and embarked on journeys through South America and Europe, creating a second handmade book of photographs taken in Peru.
Upon returning to the United States in 1950, Frank began to experience a transformation in his photographic perspective. Influenced by his disillusionment with the frantic pace of American life and its emphasis on money, he began to perceive America as a place often desolate and lonely. This vision is reflected in his subsequent photography.
The aesthetic context of the subjective photography of the 1950s, which Frank adopted, was marked by a paradigm shift in how photography was understood and practiced. It involved a move away from the apparent objectivity and neutrality that characterized documentary and journalistic photography, to adopt a more personal and subjective approach. Images ceased to be seen simply as faithful representations of reality to become the photographer's personal interpretations.
In this context, Robert Frank stood out for his direct, unadorned style. His photographs did not aim to beautify or idealize reality, but to capture it as it is, with all its imperfections and contradictions. For Frank, the camera was a tool to explore and question society, rather than merely documenting it.

His most emblematic work, "The Americans," first published in Paris in 1958, is a chronicle of his journey through 48 states of the United States. The book, composed of 83 images selected from 28,000 shots, offers an unconventional and critical view of American life. Through his images, Frank portrays a society marked by racial and class inequality, loneliness, and helplessness. The images in "The Americans" are often dark, blurry, and off-kilter, contributing to a sense of unease and instability.

Robert Frank continued to work intensely throughout his career. Here is a list of some of his most representative works, ordered by year:
- 1946: Published his first handmade book of photographs, "40 Photos."
- 1947: Moved to the United States and began working for Harper's Bazaar.
- 1950: Participated in the group exhibition "51 American Photographers" at the MoMA and published his second book of photographs from his journey through Peru.
- 1955: Received a Guggenheim Fellowship to travel and photograph the United States, a project that would result in "The Americans."
- 1958: Published "The Americans" in Paris.
- 1959: "The Americans" was published in the United States.
- 1961: Had his first solo exhibition at the Art Institute of Chicago.
- 1962: Exhibited at the Museum of Modern Art in New York.
- 1983: The French magazine "Les Cahiers de la photographie" devoted two special issues to his work.
- 2008: A new edition of "The Americans" was published to commemorate the 50th anniversary of its first publication.
- 2009: The exhibition "Looking In: Robert Frank's The Americans" was shown at the National Gallery of Art in Washington D.C., the San Francisco Museum of Modern Art, and the Metropolitan Museum of Art in New York.
Robert Frank's influence on contemporary photography is undeniable. His subjective and critical vision opened new pathways for understanding and practicing photography, shifting it away from conventionalism and closer to a more artistic and personal conception.
-
@ 20986fb8:cdac21b3
2023-07-29 06:45:23YakiHonne.com is continuously improving to offer a top-notch user experience. With weekly updates being rolled out, you are invited to test these updates and post your feedback and opinions as an article via YakiHonne.com.
As an incentive, participants can earn up to 100,000 SATs.
Round 2 will be from 27th to 30th July
How to participate:
- Pick one or multiple Updates below, test it (them)
- Write your feedback and opinion (pros and cons are all welcomed)
- Post your article on Nostr via YakiHonne.com
- Share your article on social media like Nostr and Twitter, and don't forget to @YakiHonne
- Share the link to our group: http://t.me/YakiHonne_Daily_Featured
- Be Zapped!
Requirements:
- No malicious speech such as discrimination, attack, incitement, etc.
- No Spam/Scam; no fully AI-generated articles
- No direct copy & paste from other posts on Relays
- Experience our updates in action, NO limit on the length of your post, share your REAL feedback and opinion
- The top 10 posts will be zapped during each round.
- The evaluation will be based on the article's depth, completeness, and constructiveness.
- Contact us for additional zaps if bugs are found.
Updates to be tested in Round 2:
- Comments: re-implemented, and comments can now be deleted
- NIP-25 support: users can now upvote and downvote
- Zap stats: Zaps sent and Zaps received can be seen on user profiles
- “Login with an extension” button: now grayed out rather than invisible
- Search: optimized the search results list and adjusted user search results to the NIP-21 URI scheme
- Tags: click on a tag in an article to view the content under that tag
- Share: share posts with one click
- NIP-05: verify your profile
If you missed Round 1, the updates below could be tested as additions:
- Comment function: more user-friendly
- Stats area: clearly displays the interaction status of each article
- Following function: generated-key users can also be followed
- Curation function: easily add or remove articles from an existing curation
- Tags: search and browse articles by tag
- Home feed scrolling pagination: optimized data fetching and faster loading times
- Article editing preview: preview the final version of the article while writing in Markdown
Don't miss this opportunity to participate in Round 2, test the updates, and provide valuable feedback. Head over to YakiHonne.com to share your thoughts and earn SATs for your valuable input. Act fast!
About YakiHonne:
YakiHonne is a Nostr-based decentralized content media protocol, which supports free curation, creation, publishing, and reporting by various media. Try YakiHonne.com Now!
Follow us
- Telegram: http://t.me/YakiHonne_Daily_Featured
- Twitter: @YakiHonne
- Nostr pubkey: npub1yzvxlwp7wawed5vgefwfmugvumtp8c8t0etk3g8sky4n0ndvyxesnxrf8q
-
@ 20986fb8:cdac21b3
2023-07-29 06:44:43A long-term Nostr Creation Grant, with a 17,500,000 SATs funding pool
Round 3 starts on 22 July till 5 Aug!
Creating for You and Your Fans through Nostr and ZAP.
Nostr is a simple, open, and censorship-resistant protocol. The number of users keeps growing, and more and more of them use zaps to tip content. Nostr's growth over the past six months has been amazing, which is a great encouragement for all nostrians, and for content creators in particular. Earn SATs by posting your creations on Nostr and let your readers tip your work, encouraging even better content creation.
Zaps provide a global solution for tipping content. Some posts on Nostr even got 89K+ SATs within one day, like those by Roya and Brianna.
On the other hand, while Apple's decision to take a 30% cut from fundraisers and humanitarian aid posts is criticized, Bitcoin emerges as a vital alternative for those suffering globally. Organizations like Oslo Freedom Forum and Qala Africa shed light on how Africans heavily rely on Bitcoin due to unreliable banking systems.
To this end, YakiHonne.com officially released the creation grant project, Creating for You and Your Fans through Nostr and ZAP. Join us on YakiHonne.com to share your long-form articles and curate content, experiencing the power of Nostr's censorship-resistance and ZAP features. Earn Sats rewards for publishing on Relay and Yakihonne clients. Don't forget to include your ZAP address and let's build Nostr's long content together!
What You Will Get From Your First 10 Posts in each round:
- 500 SATs, if you post on Relays through other clients
- 1000 SATs, if you are the first to post an article from another platform to Relays and it is curated or tweeted by YakiHonne
- 2000 SATs, for posting your own past articles on Relays through YakiHonne.com
- 3000 SATs, for posting your new original on Relays through YakiHonne.com
Zap Rules:
- No malicious speech such as discrimination, attack, incitement, etc.
- No Spam/Scam; no fully AI-generated articles
- No direct copy & paste from other posts on Relays
- Spread positive content like your knowledge/experience/insight/ideas, etc.
How to Get Zap:
- Join YakiHonne TG group: https://t.me/YakiHonne_Daily_Featured
- Share your post in the group
- Make sure your LN address is in your profile
- Based on the rules above, we will ZAP your post directly within 2 days
Join our group for more queries: https://t.me/YakiHonne_Daily_Featured
About YakiHonne:
YakiHonne is a Nostr-based decentralized content media protocol, which supports free curation, creation, publishing, and reporting by various media. Try YakiHonne.com Now!
Follow us
- Telegram: http://t.me/YakiHonne_Daily_Featured
- Twitter: @YakiHonne
- Nostr pubkey: npub1yzvxlwp7wawed5vgefwfmugvumtp8c8t0etk3g8sky4n0ndvyxesnxrf8q
-
@ ee11a5df:b76c4e49
2023-07-29 03:27:23Gossip: The HTTP Fetcher
Gossip is a desktop nostr client. This post is about the code that fetches HTTP resources.
Gossip fetches HTTP resources. This includes images, videos, nip05 json files, etc. The part of gossip that does this is called the fetcher.
We have had a fetcher for some time, but it was poorly designed and had problems. For example, it never expired items in the cache.
We've made a lot of improvements to the fetcher recently. It's pretty good now, but there is still room for improvement.
Caching
Our fetcher caches data. Each URL that is fetched is hashed, and the content is stored under a file in the cache named by that hash.
If a request is in the cache, we don't do an HTTP request, we serve it directly from the cache.
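To make the scheme concrete, here is a minimal sketch of that content-addressed cache lookup, assuming a SHA-256 hash and a hypothetical `cache_dir`; the hash function and directory layout Gossip actually uses may differ.

```rust
use sha2::{Digest, Sha256}; // assumption: SHA-256; Gossip may use a different hash
use std::path::{Path, PathBuf};

/// Map a URL to its cache file: hash the URL and use the hex digest as the file name.
fn cache_file_for_url(cache_dir: &Path, url: &str) -> PathBuf {
    let digest = Sha256::digest(url.as_bytes());
    let hex: String = digest.iter().map(|b| format!("{:02x}", b)).collect();
    cache_dir.join(hex)
}

/// Serve from cache when the file exists; otherwise the caller falls through to an HTTP fetch.
fn try_cache(cache_dir: &Path, url: &str) -> Option<Vec<u8>> {
    std::fs::read(cache_file_for_url(cache_dir, url)).ok()
}
```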
But cached data gets stale. Sometimes resources at a URL change. We generally check resources again after three days.
We save the server's ETag value for the content, and when we check the content again we supply an If-None-Match header with that ETag so the server can respond with 304 Not Modified, in which case we don't need to download the resource again; we just bump the filetime to now.
In the event that our cached data is stale but the server gives us an error, we serve up the stale data (stale is better than nothing).
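A sketch of that revalidation step, using the reqwest crate for illustration; the `CachedEntry` type and `revalidate` function are hypothetical names, not Gossip's actual API.

```rust
use reqwest::{header, Client, StatusCode};

/// What we know about a cached resource.
struct CachedEntry {
    body: Vec<u8>,
    etag: Option<String>,
}

/// Re-check a stale cache entry. Returns the fresh body, the cached body on
/// 304 Not Modified, or the stale body if the server errors out.
async fn revalidate(client: &Client, url: &str, cached: CachedEntry) -> Vec<u8> {
    let mut req = client.get(url);
    if let Some(etag) = &cached.etag {
        req = req.header(header::IF_NONE_MATCH, etag.clone());
    }
    match req.send().await {
        // Not modified: keep the cached body and just bump the filetime.
        Ok(resp) if resp.status() == StatusCode::NOT_MODIFIED => cached.body,
        // Fresh content: replace the cached body.
        Ok(resp) if resp.status().is_success() => {
            resp.bytes().await.map(|b| b.to_vec()).unwrap_or(cached.body)
        }
        // Server error or network failure: stale is better than nothing.
        _ => cached.body,
    }
}
```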
Queueing
We used to fire off HTTP GET requests as soon as we knew that we needed a resource. Servers and CDNs did not look kindly on this and started giving us either 403 Forbidden or 429 Too Many Requests.
So we moved into a queue system. The host is extracted from each URL, and each host is only given up to 3 requests at a time. If we want 29 images from the same host, we only ask for three, and the remaining 26 remain in the queue for next time. When one of those requests completes, we decrement the host load so we know that we can send it another request later.
We process the queue in an infinite loop where we wait 1200 milliseconds between passes. Passes take time themselves and sometimes must wait for a timeout. Each pass fetches potentially multiple HTTP resources in parallel, asynchronously. If we have 300 resources at 100 different hosts, three per host, we could get them all in a single pass. More likely a bunch of resources are at the same host, and we make multiple passes at it.
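A minimal sketch of the per-host queueing logic described above, with hypothetical names; the caller would run `next_batch` once per pass, sleeping roughly 1200 milliseconds in between, and call `request_done` as each fetch completes.

```rust
use std::collections::{HashMap, VecDeque};

const MAX_PER_HOST: usize = 3; // at most 3 in-flight requests per host

struct Fetcher {
    queue: VecDeque<(String, String)>, // (host, url) pairs waiting to be fetched
    host_load: HashMap<String, usize>, // in-flight request count per host
}

impl Fetcher {
    /// Pick the URLs to fetch on this pass; everything else stays queued for the next pass.
    fn next_batch(&mut self) -> Vec<String> {
        let mut batch = Vec::new();
        let mut remaining = VecDeque::new();
        while let Some((host, url)) = self.queue.pop_front() {
            let load = self.host_load.entry(host.clone()).or_insert(0);
            if *load < MAX_PER_HOST {
                *load += 1; // decremented again when the request completes
                batch.push(url);
            } else {
                remaining.push_back((host, url));
            }
        }
        self.queue = remaining;
        batch
    }

    /// Called when a request finishes so the host can be given another request later.
    fn request_done(&mut self, host: &str) {
        if let Some(load) = self.host_load.get_mut(host) {
            *load = load.saturating_sub(1);
        }
    }
}
```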
Timeouts
When we fetch URLs in parallel asynchronously, we wait until all of the fetches complete before waiting another 1200 ms and doing another loop. Sometimes one of the fetches times out. In order to keep things moving, we use short timeouts of 10 seconds for a connect, and 15 seconds for a response.
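As a sketch, those two timeouts might be configured on the HTTP client like this (shown with reqwest's builder; Gossip's actual setup may differ). Connection attempts over 10 seconds and responses over 15 seconds are abandoned so the pass can complete.

```rust
use std::time::Duration;

// Build an HTTP client with the short timeouts described above:
// 10 s to establish a connection, 15 s for the whole response.
fn build_client() -> reqwest::Result<reqwest::Client> {
    reqwest::Client::builder()
        .connect_timeout(Duration::from_secs(10))
        .timeout(Duration::from_secs(15))
        .build()
}
```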
Handling Errors
Some kinds of errors are more serious than others. When we encounter these, we sin bin the server for a period of time where we don't try fetching from it until a specified period elapses.
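A minimal sketch of such a sin bin, with hypothetical names and a caller-chosen penalty duration:

```rust
use std::collections::HashMap;
use std::time::{Duration, Instant};

/// Hosts that returned serious errors are skipped until their penalty expires.
#[derive(Default)]
struct SinBin {
    until: HashMap<String, Instant>,
}

impl SinBin {
    /// Record a serious error: don't fetch from this host until the penalty elapses.
    fn penalize(&mut self, host: &str, penalty: Duration) {
        self.until.insert(host.to_owned(), Instant::now() + penalty);
    }

    /// Should we skip this host for now?
    fn is_banned(&self, host: &str) -> bool {
        self.until
            .get(host)
            .map(|t| Instant::now() < *t)
            .unwrap_or(false)
    }
}
```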
-
@ ee11a5df:b76c4e49
2023-07-29 03:13:59Gossip: Switching to LMDB
Unlike a number of other nostr clients, Gossip has always cached events and related data in a local data store. Up until recently, SQLite3 has served this purpose.
SQLite3 offers a full ACID SQL relational database service.
Unfortunately however it has presented a number of downsides:
- It is not as parallel as you might think.
- It is not as fast as you might hope.
- If you want to preserve the benefit of using SQL and doing joins, then you must break your objects into columns, and map columns back into objects. The code that does this object-relational mapping (ORM) is not trivial and can be error-prone. It is especially tricky when working with different types (Rust types and SQLite3 types are not a 1:1 match).
- Because of the potential slowness, our UI has been forbidden from direct database access as that would make the UI unresponsive if a query took too long.
- Because of (4) we have been firing off separate threads to do the database actions, and storing the results into global variables that can be accessed by the interested code at a later time.
- Because of (4) we have been caching database data in memory, essentially coding for yet another storage layer that can (and often did) get out of sync with the database.
LMDB offers solutions:
- It is highly parallel.
- It is ridiculously fast when used appropriately.
- Because you cannot run arbitrary SQL, there is no need to represent the fields within your objects separately. You can serialize/deserialize entire objects into the database and the database doesn't care what is inside of the blob (yes, you can do that into an SQLite field, but if you did, you would lose the power of SQL).
- Because of the speed, the UI can look stuff up directly.
- We no longer need to fork separate threads for database actions.
- We no longer need in-memory caches of data. The LMDB data is already in-memory (it is memory mapped) so we just access it directly.
The one obvious downside is that we lose SQL. We lose the query planner. We cannot ask arbitrary questions and get answers. Instead, we have to pre-conceive of all the kinds of questions we want to ask, and we have to write code that answers them efficiently. Often this involves building and maintaining indices.
Indices
Let's say I want to look at fiatjaf's posts. How do I efficiently pull out just his recent feed-related events in reverse chronological order? It is easy if we first construct the following index:

key:   EventKind + PublicKey + ReverseTime
value: Event Id
In the above, '+' is just a concatenation operator, and ReverseTime is just some distant time minus the time so that it sorts backwards.
Now I just ask LMDB to start from (EventKind=1 + PublicKey=fiatjaf + now) and scan until either one of the first two fields changes, or, more likely, until the time field gets too old (e.g. one month ago). Then I do it again for the next event kind, etc.
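A sketch of how such a composite key could be built so that plain byte-wise ordering gives the desired sort, assuming 32-byte public keys and big-endian integer encoding (the exact encoding Gossip uses may differ):

```rust
/// Build the index key described above: kind + pubkey + reverse time,
/// all big-endian so that byte-wise ordering matches the desired sort order.
fn feed_index_key(kind: u32, pubkey: &[u8; 32], created_at: u64) -> Vec<u8> {
    let reverse_time = u64::MAX - created_at; // later events sort first
    let mut key = Vec::with_capacity(4 + 32 + 8);
    key.extend_from_slice(&kind.to_be_bytes());
    key.extend_from_slice(pubkey);
    key.extend_from_slice(&reverse_time.to_be_bytes());
    key
}

/// The scan starts at feed_index_key(kind, pubkey, now) and walks forward
/// while the key still begins with this prefix and the time is recent enough.
fn scan_prefix(kind: u32, pubkey: &[u8; 32]) -> Vec<u8> {
    let mut prefix = Vec::with_capacity(4 + 32);
    prefix.extend_from_slice(&kind.to_be_bytes());
    prefix.extend_from_slice(pubkey);
    prefix
}
```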
For a generalized feed, I have to scan a region for each person I follow.
Smarter indexes can be imagined. Since we often want only feed-related event kinds, that can be implicit in an index that only indexes those kinds of events.
You get the idea.
A Special Event Map
At first, I stored events in a K-V database under the Id of the event. Then I had indexes on events that output a set of Ids (as in the example above).
But when it comes to storing and retrieving events, we can go even faster than LMDB.
We can build an append-only memory map that is just a sequence of all the events we have, serialized, and in no particular order. Readers do not need a lock and multiple readers can read simultaneously. Writers will need to acquire a lock to append to the map and there may only be one writer at a time. However, readers can continue reading even while a writer is writing.
We can then have a K-V database that maps Id -> Offset. To get the event you just do a direct lookup in the event memory map at that offset.
The real benefit comes when we have other indexes that yield events, they can yield offsets instead of ids. Then we don't need to do a second lookup from the Id to the Event, we can just look directly at the offset.
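A simplified sketch of this arrangement, using an in-memory `Vec<u8>` as a stand-in for the append-only memory map and a plain `HashMap` as a stand-in for the LMDB Id -> Offset table; the length-prefix framing here is illustrative, not Gossip's actual on-disk format.

```rust
use std::collections::HashMap;

type Id = [u8; 32];

/// Events are appended as length-prefixed blobs; an Id -> offset table
/// lets us jump straight to any event, and other indexes can store offsets
/// directly to skip even that lookup.
#[derive(Default)]
struct EventMap {
    bytes: Vec<u8>,            // the real thing is a memory-mapped, append-only file
    offsets: HashMap<Id, u64>, // the real thing keeps this in an LMDB database
}

impl EventMap {
    /// Append a serialized event and remember where it starts.
    fn insert(&mut self, id: Id, serialized: &[u8]) -> u64 {
        let offset = self.bytes.len() as u64;
        self.bytes.extend_from_slice(&(serialized.len() as u32).to_le_bytes());
        self.bytes.extend_from_slice(serialized);
        self.offsets.insert(id, offset);
        offset
    }

    /// Direct lookup when an index already handed us the offset.
    fn get_by_offset(&self, offset: u64) -> &[u8] {
        let start = offset as usize;
        let len = u32::from_le_bytes(self.bytes[start..start + 4].try_into().unwrap()) as usize;
        &self.bytes[start + 4..start + 4 + len]
    }

    /// Lookup by Id: one table lookup, then a direct read at the offset.
    fn get_by_id(&self, id: &Id) -> Option<&[u8]> {
        self.offsets.get(id).map(|&off| self.get_by_offset(off))
    }
}
```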
Avoiding deserialization
Deserialization has a price. Sometimes it requires memory allocation (if the object is not already linear, e.g. variable-length data like strings and vectors are allocated on the heap), which can be very expensive if you are trying to scan 150,000 or so events.
We serialize events (and other objects where we can) with a serialization library called speedy. It does its best to preserve the data much like it is represented in memory, but linearized. Because events start with fixed-length fields, we know the offset into the serialized event where these first fields occur and we can directly extract the value of those fields without deserializing the data before it.
This comes in useful whenever we need to scan a large number of events. Search is the one situation where I know that we must do this. We can search by matching against the content of every feed-related event without fully deserializing any of them.
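A sketch of that trick, with a made-up field offset; the real offsets depend on how speedy lays out Gossip's event type.

```rust
/// Hypothetical layout: suppose the serialized event stores `created_at` as a
/// little-endian u64 at byte offset 32 (right after a 32-byte id). The actual
/// offsets are determined by speedy's serialization of the event struct.
const CREATED_AT_OFFSET: usize = 32;

/// Pull the timestamp out of a serialized event without deserializing the rest.
fn created_at_of(serialized: &[u8]) -> u64 {
    let bytes = &serialized[CREATED_AT_OFFSET..CREATED_AT_OFFSET + 8];
    u64::from_le_bytes(bytes.try_into().unwrap())
}
```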
-
@ ee11a5df:b76c4e49
2023-07-29 02:52:13Gossip: Zaps
Gossip is a desktop nostr client. This post is about the code that lets users send lightning zaps to each other (NIP-57).
Gossip first implemented zaps on the 20th of June, 2023.
Gossip maintains a state of where zapping is at, one of: None, CheckingLnurl, SeekingAmount, LoadingInvoice, and ReadyToPay.
When you click the zap lightning bolt icon, Gossip moves to the CheckingLnurl state while it looks up the LN URL of the user.
If this is successful, it moves to the SeekingAmount state and presents amount options to the user.
Once a user chooses an amount, it moves to the LoadingInvoice state where it interacts with the lightning node and receives and checks an invoice.
Once that is complete, it moves to the ReadyToPay state, where it presents the invoice as a QR code for the user to scan with their phone. There is also a copy button so they can pay it from their desktop computer too.
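A sketch of that state machine as a Rust enum, using the states named above; the event names and transition function are illustrative, not Gossip's actual code.

```rust
/// The zapping states described above.
enum ZapState {
    None,
    CheckingLnurl,
    SeekingAmount,
    LoadingInvoice,
    ReadyToPay { invoice: String },
}

/// User and network events that advance the state.
enum ZapEvent {
    BoltClicked,
    LnurlFound,
    AmountChosen(u64),
    InvoiceReceived(String),
}

/// One possible transition function; error paths back to `None` are omitted.
fn advance(state: ZapState, event: ZapEvent) -> ZapState {
    match (state, event) {
        (ZapState::None, ZapEvent::BoltClicked) => ZapState::CheckingLnurl,
        (ZapState::CheckingLnurl, ZapEvent::LnurlFound) => ZapState::SeekingAmount,
        (ZapState::SeekingAmount, ZapEvent::AmountChosen(_)) => ZapState::LoadingInvoice,
        (ZapState::LoadingInvoice, ZapEvent::InvoiceReceived(inv)) => {
            ZapState::ReadyToPay { invoice: inv }
        }
        // Ignore events that don't apply in the current state.
        (other, _) => other,
    }
}
```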
Gossip also loads zap receipt events and associates them with the event that was zapped, tallying a zap total on that event. Gossip is unfortunately not validating these receipts very well currently, so fake zap receipts can cause an incorrect total to show. This remains an open issue.
Another open issue is the implementation of NIP-46 Nostr Connect and NIP-47 Wallet Connect.
-
@ 3f770d65:7a745b24
The following is a collection of Tweets posted on Twitter that documented my entire heart surgery process, from finding out I had an issue, through the surgery, and finally throughout my recovery process. If Elon decides to remove old and unpaid content, I do not want this part of my life to vanish from the Internet. At the time, it was extremely important for my mental health to talk about this whole process, and it was therapeutic in my recovery to read all of the responses as all of Bitcoin Twitter was behind my success. Thank you all for your kind words, love, and support during this whole process. May my Tweets live on through nostr.
...
Nov 28, 2021 I had open heart surgery 4 days before my 3rd b-day. I've led a healthy & active life since then, zero issues. I had an echocardiogram last week. The results were not good. Heart valve replacement may be in my near future. I am freaking the fuck out. I'll know more on Dec 6th. 😫
Dec 6, 2021 Update: I had my cardiologist appointment today. He said to not worry for now and continue to exercise and live life. In 3 months get another echo done. He doesn't believe the previous other results since I have zero symptoms and wants to do his own interpretation.
Dec 6, 2021 He said if the other results were correct, I may need valve replacement in 6 months to 3 years. However, he doesn't believe the local hospital's results as I said above. He can't form an opinion just yet. He said not to worry over and over again. For now, I'm staying positive!
Mar 7, 2022 Well, it's been three months. I had my follow up this morning. I have severe pulmonic regurgitation. I now need to speak with a specialist and see what my options are for surgery. Neat. 😭
Apr 12, 2022 Well, it looks like I'll need full open heart surgery again to replace my pulmonic valve. Bonus: I'll be part pig. So, I have that going for me. I was assuming that if I had to have this done it would be much less invasive, so I'm not overly impressed at the moment. 🫤
May 11, 2022 Today I had to get a CT scan of my heart in preparation for the May 27th surgery. All went well. I have no other heart issues. It's looking like I'll be in the hospital for 4-7 days, depending on how fast I recover. I should be fully 100% recovered by the end of August. ❤️
May 20, 2022 My grandfather with me 39 years ago, days after my open heart surgery. He taught me to hunt, fish, golf, and I'm sure taught me a thing or two about drinking beer and partying. 😂 He was a great man. He won't physically be with me next Friday, but I'm sure he'll be watching over.
May 25, 2022 Two more sleeps. My mind is racing with an incredible amount of thoughts and emotions now. It's overwhelming. I love you all. Thanks for all of your replies and DMs over the last couple days, weeks, and months. I appreciate it immensely. ❤️❤️❤️
May 27, 2022 LET'S GO! I am alive and doing well. I was on a ventilator until 8pm. That was horrible. I will read all of the comments that you all posted on Katie's updates. Now I need to rest. They want to get me up and walk at 11pm. 🤯 I love you all and your support had helped so much 🧡🧡
May 28, 2022 The amount of love, compassion, caring, and appreciation from everyone blows my mind. Thanks for all of your comments and DMs. The positivity though all is this has helped me get through dark times and now it's helping me get through pain. You are helping me immensely. 🤯❤️🧡💪
May 28, 2022 Today has been a rough day. Lots of chest pain when breathing. But, I apparently am doing something right, because I have been upgraded to a regular room. No more ICU for this guy! My ICU nurse told my new nurse that I'm strong. ❤️💪🔥
May 29, 2022 Using this to document my journey. Last night was not good at all. I had tachycardia and AFib for hours. It was scary as fuck having my heart beat the way it was at 160bpm. They gave me new medication to bring it down and stop the AFib irregular heartbeat. It's now at 101.
May 29, 2022 I was very scared. Katie was able to come and stay the night with me and be my personal care nurse. That made me feel much better having her here with me. Hopefully the meds continue to do what they're supposed to. Fuck. Anyways, I may not Tweet much today. Love you all. ❤️
May 30, 2022 Today has mostly been a great day progress wise. I ate a lot. I've walked more today than I have previously. My doctor told me I might be going home tomorrow, it all depends on what happens with my last drainage tube. Fingers crossed that it's draining properly now. 💪❤️
May 31, 2022 Morning walk crushed. Breakfast crushed. Feeling stronger. My drainage tube is still draining so we'll see what the surgeon says, but I probably won't be coming home today according to my nurse. It may be another day. Better to be safe. I'm feeling good though. Let's go! 💪💪❤️❤️
Jun 1, 2022 Today's plans: Crush morning walk, crush breakfast, CRUSH MY LAST X-RAY AND HEAD THE HELL HOME! Fingers crossed. 🤞🤞❤️❤️💪💪
Jun 1, 2022 On my morning walk I went into a slight AFib. The nurse and PA said since I hadn't had my morning meds to control that yet, that that could be the cause. They're going to increase meds and monitor me for another 24 hours. That sucks, but again, I'd rather be safe. Ugh.
Jun 2, 2022 GM! I miss my kids. I miss wearing normal clothes. I miss my house. I miss my doggy. I better go home today or I guess I'll just keep working to get well enough to go home. 😂 I'm still progressing forward. I have a chest x-ray scheduled later this morning. Fingers crossed.
Jun 2, 2022 I JUST GOT CLEARED TO HEAD HOME AFTER LUNCH. FUCK YEAH. LET'S GOOOOOO💪💪💪
Jun 2, 2022 I am home! I have some family that needs some loving. Enjoy the rest of your day!
Jun 4, 2022 Last night I slept in bed thanks to a reclining pillow, the first night our living room chair. I was so happy to sleep in my own bed. I walked around our yard about 9 times yesterday. My goal is to do that plus a little more every day. I'm still in a lot of pain, but meds help.
Jun 4, 2022 I still have a long way to go recovery wise, but having Katie and the kids here helping me along the way makes it easier and gives me a reason to keep pushing forward through this. Thanks again for all of your past and future support. You all are fantastic.
Jun 6, 2022 My wife went back to work today. My son and daughter are in charge of taking care of me. My kids are fantastic. They made me breakfast already and helped me check all of my vitals. My daughter really shines here. She's such a little nurse and caretaker. ❤️❤️
Jun 6, 2022 I'm still in pain, but I'm not in as much pain as I was a couple days ago. I'm walking around a little better and a little more every day. I really hate just sitting around and not doing anything, but it's hard to do much else besides watch TV. I am enjoying my patio though. 💪
Jun 16, 2022 It's been a while. It's time to update this thread! I am doing great, IMO. I have lots of energy. I feel great. I can do a lot more than I previously could. I still have a limited range of motion due to my sternum being broken and still healing. i.e. I can't wash my back or legs.
Jun 16, 2022 I started back to work yesterday. I was cleared to do 20 hours this week by my doctor and I'm hoping to be cleared full time next week. I work from home, so if I can sit in front of a TV, I can sit in front of a computer, right?
Jun 16, 2022 My home nurses have been absolutely fantastic.😂 My kids make me breakfast every morning. And I could not have done any of this without my rock star wife. Words can't describe how much she's done for me throughout all of this. I am looking forward to continuing to improve. ♥️
Jun 16, 2022 You all have been absolutely wonderful through all of this too. I appreciate all of your love, support, and check-ins. Seriously. It means more than you know.
I have a check-up at the end of the month to make sure all is well. I'll update again in a couple weeks after that appt.
Jun 28, 2022 My doctor said everything looks great. He said it doesn't look like I had open heart surgery a month ago.💪 He said I have zero restrictions and that I can resume normal life. I can drive and I can go out on my boat! ❤️❤️❤️ I'm so happy right now! 🔥🚀
Jul 15, 2022 I started cardiac rehab this week. After 2 sessions the nurse said I'm on "Week 4" already. She doesn't believe with my initial intake stress test if I'll really be able to improve on it that much. 🤣 TL;DR I am a rock star and kicking ass. Feeling great. Life's great. 😍
Aug 28, 2022 Yesterday was 3 months since my open heart surgery. I'm going great! I'd guess essentially back to normal. My sternum is still not fully healed, that'll take more time, but energy and capability are basically back to what I was like last summer. I'm very happy with the results.
Aug 28, 2022 I'm very happy to be able to exercise daily, go boating on the weekends AND swim and paddleboard. I was annoyed that I couldn't do these things a month ago. I had a cardiologist appt. two weeks ago. He said he hopes the valve lasts me the rest of my life and to see him in a year.
Aug 28, 2022 As a final post to this thread, I want to thank each and every one of you again that commented, liked, shared and DMed me throughout all of this. The love and support from Twitter and the #Bitcoin community was unfathomable. You all made a difference in my life and my recovery.❤️
-
@ e6ce6154:275e3444
2023-07-27 14:12:49Este artigo foi censurado pelo estado e fomos obrigados a deletá-lo após ameaça de homens armados virem nos visitar e agredir nossa vida e propriedade.
Isto é mais uma prova que os autoproclamados antirracistas são piores que os racistas.
https://rothbardbrasil.com/pelo-direito-de-ser-racista-fascista-machista-e-homofobico
Segue artigo na íntegra. 👇
Sem dúvida, a escalada autoritária do totalitarismo cultural progressista nos últimos anos tem sido sumariamente deletéria e prejudicial para a liberdade de expressão. Como seria de se esperar, a cada dia que passa o autoritarismo progressista continua a se expandir de maneira irrefreável, prejudicando a liberdade dos indivíduos de formas cada vez mais deploráveis e contundentes.
Com a ascensão da tirania politicamente correta e sua invasão a todos os terrenos culturais, o autoritarismo progressista foi se alastrando e consolidando sua hegemonia em determinados segmentos. Com a eventual eclosão e a expansão da opressiva e despótica cultura do cancelamento — uma progênie inevitável do totalitarismo progressista —, todas as pessoas que manifestam opiniões, crenças ou posicionamentos que não estão alinhados com as pautas universitárias da moda tornam-se um alvo.
Há algumas semanas, vimos a enorme repercussão causada pelo caso envolvendo o jogador profissional de vôlei Maurício Sousa, que foi cancelado pelo simples fato de ter emitido sua opinião pessoal sobre um personagem de história em quadrinhos, Jon Kent, o novo Superman, que é bissexual. Maurício Sousa reprovou a conduta sexual do personagem, o que é um direito pessoal inalienável que ele tem. Ele não é obrigado a gostar ou aprovar a bissexualidade. Como qualquer pessoa, ele tem o direito pleno de criticar tudo aquilo que ele não gosta. No entanto, pelo simples fato de emitir a sua opinião pessoal, Maurício Sousa foi acusado de homofobia e teve seu contrato rescindido, sendo desligado do Minas Tênis Clube.
Lamentavelmente, Maurício Sousa não foi o primeiro e nem será o último indivíduo a sofrer com a opressiva e autoritária cultura do cancelamento. Como uma tirania cultural que está em plena ascensão e usufrui de um amplo apoio do establishment, essa nova forma de totalitarismo cultural colorido e festivo está se impondo de formas e maneiras bastante contundentes em praticamente todas as esferas da sociedade contemporânea. Sua intenção é relegar ao ostracismo todos aqueles que não se curvam ao totalitarismo progressista, criminalizando opiniões e crenças que divergem do culto à libertinagem hedonista pós-moderna. Oculto por trás de todo esse ativismo autoritário, o que temos de fato é uma profunda hostilidade por padrões morais tradicionalistas, cristãos e conservadores.
No entanto, é fundamental entendermos uma questão imperativa, que explica em partes o conflito aqui criado — todos os progressistas contemporâneos são crias oriundas do direito positivo. Por essa razão, eles jamais entenderão de forma pragmática e objetiva conceitos como criminalidade, direitos de propriedade, agressão e liberdade de expressão pela perspectiva do jusnaturalismo, que é manifestamente o direito em seu estado mais puro, correto, ético e equilibrado.
Pela ótica jusnaturalista, uma opinião é uma opinião. Ponto final. E absolutamente ninguém deve ser preso, cancelado, sabotado ou boicotado por expressar uma opinião particular sobre qualquer assunto. Palavras não agridem ninguém, portanto jamais poderiam ser consideradas um crime em si. Apenas deveriam ser tipificados como crimes agressões de caráter objetivo, como roubo, sequestro, fraude, extorsão, estupro e infrações similares, que representam uma ameaça direta à integridade física da vítima, ou que busquem subtrair alguma posse empregando a violência.
Infelizmente, a geração floquinho de neve — terrivelmente histérica, egocêntrica e sensível — fica profundamente ofendida e consternada sempre que alguém defende posicionamentos contrários à religião progressista. Por essa razão, os guerreiros da justiça social sinceramente acreditam que o papai-estado deve censurar todas as opiniões que eles não gostam de ouvir, assim como deve também criar leis para encarcerar todos aqueles que falam ou escrevem coisas que desagradam a militância.
Como a geração floquinho de neve foi criada para acreditar que todas as suas vontades pessoais e disposições ideológicas devem ser sumariamente atendidas pelo papai-estado, eles embarcaram em uma cruzada moral que pretende erradicar todas as coisas que são ofensivas à ideologia progressista; só assim eles poderão deflagrar na Terra o seu tão sonhado paraíso hedonista e igualitário, de inimaginável esplendor e felicidade.
Em virtude do seu comportamento intrinsecamente despótico, autoritário e egocêntrico, acaba sendo inevitável que militantes progressistas problematizem tudo aquilo que os desagrada.
Como são criaturas inúteis destituídas de ocupação real e verdadeiro sentido na vida, sendo oprimidas unicamente na sua própria imaginação, militantes progressistas precisam constantemente inventar novos vilões para serem combatidos.
Partindo dessa perspectiva, é natural para a militância que absolutamente tudo que exista no mundo e que não se enquadra com as regras autoritárias e restritivas da religião progressista seja encarado como um problema. Para a geração floquinho de neve, o capitalismo é um problema. O fascismo é um problema. A iniciativa privada é um problema. O homem branco, tradicionalista, conservador e heterossexual é um problema. A desigualdade é um problema. A liberdade é um problema. Monteiro Lobato é um problema (sim, até mesmo o renomado ícone da literatura brasileira, autor — entre outros títulos — de Urupês, foi vítima da cultura do cancelamento, acusado de ser racista e eugenista).
Para a esquerda, praticamente tudo é um problema. Na mentalidade da militância progressista, tudo é motivo para reclamação. Foi em função desse comportamento histérico, histriônico e infantil que o famoso pensador conservador-libertário americano P. J. O’Rourke afirmou que “o esquerdismo é uma filosofia de pirralhos chorões”. O que é uma verdade absoluta e irrefutável em todos os sentidos.
De fato, todas as filosofias de esquerda de forma geral são idealizações utópicas e infantis de um mundo perfeito. Enquanto o mundo não se transformar naquela colorida e vibrante utopia que é apresentada pela cartilha socialista padrão, militantes continuarão a reclamar contra tudo o que existe no mundo de forma agressiva, visceral e beligerante. Evidentemente, eles não vão fazer absolutamente nada de positivo ou construtivo para que o mundo se transforme no gracioso paraíso que eles tanto desejam ver consolidado, mas eles continuarão a berrar e vociferar muito em sua busca incessante pela utopia, marcando presença em passeatas inúteis ou combatendo o fascismo imaginário nas redes sociais.
Sem dúvida, estamos muito perto de ver leis absurdas e estúpidas sendo implementadas, para agradar a militância da terra colorida do assistencialismo eterno onde nada é escasso e tudo cai do céu. Em breve, você não poderá usar calças pretas, pois elas serão consideradas peças de vestuário excessivamente heterossexuais. Apenas calças amarelas ou coloridas serão permitidas. Você também terá que tingir de cor-de-rosa uma mecha do seu cabelo; pois preservar o seu cabelo na sua cor natural é heteronormativo demais da sua parte, sendo portanto um componente demasiadamente opressor da sociedade.
Você também não poderá ver filmes de guerra ou de ação, apenas comédias românticas, pois certos gêneros de filmes exaltam a violência do patriarcado e isso impede o mundo de se tornar uma graciosa festa colorida de fraternidades universitárias ungidas por pôneis resplandecentes, hedonismo infinito, vadiagem universitária e autogratificação psicodélica, que certamente são elementos indispensáveis para se produzir o paraíso na Terra.
Sabemos perfeitamente, no entanto, que dentre as atitudes “opressivas” que a militância progressista mais se empenha em combater, estão o racismo, o fascismo, o machismo e a homofobia. No entanto, é fundamental entender que ser racista, fascista, machista ou homofóbico não são crimes em si. Na prática, todos esses elementos são apenas traços de personalidade; e eles não podem ser pura e simplesmente criminalizados porque ideólogos e militantes progressistas iluminados não gostam deles.
Tanto pela ética quanto pela ótica jusnaturalista, é facilmente compreensível entender que esses traços de personalidade não podem ser criminalizados ou proibidos simplesmente porque integrantes de uma ideologia não tem nenhuma apreciação ou simpatia por eles. Da mesma forma, nenhum desses traços de personalidade representa em si um perigo para a sociedade, pelo simples fato de existir. Por incrível que pareça, até mesmo o machismo, o racismo, o fascismo e a homofobia merecem a devida apologia.
Mas vamos analisar cada um desses tópicos separadamente para entender isso melhor.
Racismo
Quando falamos no Japão, normalmente não fazemos nenhuma associação da sociedade japonesa com o racismo. No entanto, é incontestável o fato de que a sociedade japonesa pode ser considerada uma das sociedades mais racistas do mundo. E a verdade é que não há absolutamente nada de errado com isso.
Aproximadamente 97% da população do Japão é nativa; apenas 3% do componente populacional é constituído por estrangeiros (a população do Japão é estimada em aproximadamente 126 milhões de habitantes). Isso faz a sociedade japonesa ser uma das mais homogêneas do mundo. As autoridades japonesas reconhecidamente dificultam processos de seleção e aplicação a estrangeiros que desejam se tornar residentes. E a maioria dos japoneses aprova essa decisão.
Diversos estabelecimentos comerciais como hotéis, bares e restaurantes por todo o país tem placas na entrada que dizem “somente para japoneses” e a maioria destes estabelecimentos se recusa ostensivamente a atender ou aceitar clientes estrangeiros, não importa quão ricos ou abastados sejam.
Na Terra do Sol Nascente, a hostilidade e a desconfiança natural para com estrangeiros é tão grande que até mesmo indivíduos que nascem em algum outro país, mas são filhos de pais japoneses, não são considerados cidadãos plenamente japoneses.
Se estes indivíduos decidem sair do seu país de origem para se estabelecer no Japão — mesmo tendo descendência nipônica legítima e inquestionável —, eles enfrentarão uma discriminação social considerável, especialmente se não dominarem o idioma japonês de forma impecável. Esse fato mostra que a discriminação é uma parte tão indissociável quanto elementar da sociedade japonesa, e ela está tão profundamente arraigada à cultura nipônica que é praticamente impossível alterá-la ou atenuá-la por qualquer motivo.
A verdade é que — quando falamos de um país como o Japão — nem todos os discursos politicamente corretos do mundo, nem a histeria progressista ocidental mais inflamada poderão algum dia modificar, extirpar ou sequer atenuar o componente racista da cultura nipônica. E isso é consequência de uma questão tão simples quanto primordial: discriminar faz parte da natureza humana, sendo tanto um direito individual quanto um elemento cultural inerente à muitas nações do mundo. Os japoneses não tem problema algum em admitir ou institucionalizar o seu preconceito, justamente pelo fato de que a ideologia politicamente correta não tem no oriente a força e a presença que tem no ocidente.
E é fundamental enfatizar que, sendo de natureza pacífica — ou seja, não violando nem agredindo terceiros —, a discriminação é um recurso natural dos seres humanos, que está diretamente associada a questões como familiaridade e segurança.
Absolutamente ninguém deve ser forçado a apreciar ou integrar-se a raças, etnias, pessoas ou tribos que não lhe transmitem sentimentos de segurança ou familiaridade. Integração forçada é o verdadeiro crime, e isso diversos países europeus — principalmente os escandinavos (países que lideram o ranking de submissão à ideologia politicamente correta) — aprenderam da pior forma possível.
A integração forçada com imigrantes islâmicos resultou em ondas de assassinato, estupro e violência inimagináveis para diversos países europeus, até então civilizados, que a imprensa ocidental politicamente correta e a militância progressista estão permanentemente tentando esconder, porque não desejam que o ocidente descubra como a agenda “humanitária” de integração forçada dos povos muçulmanos em países do Velho Mundo resultou em algumas das piores chacinas e tragédias na história recente da Europa.
Ou seja, ao discriminarem estrangeiros, os japoneses estão apenas se protegendo e lutando para preservar sua nação como um ambiente cultural, étnico e social que lhe é seguro e familiar, assim se opondo a mudanças bruscas, indesejadas e antinaturais, que poderiam comprometer a estabilidade social do país.
A discriminação — sendo de natureza pacífica —, é benévola, salutar e indubitavelmente ajuda a manter a estabilidade social da comunidade. Toda e qualquer forma de integração forçada deve ser repudiada com veemência, pois, mais cedo ou mais tarde, ela irá subverter a ordem social vigente, e sempre será acompanhada de deploráveis e dramáticos resultados.
Para citar novamente os países escandinavos, a Suécia é um excelente exemplo do que não fazer. Tendo seguido o caminho contrário ao da discriminação racional praticada pela sociedade japonesa, atualmente a sociedade sueca — além de afundar de forma consistente na lama da libertinagem, da decadência e da deterioração progressista — sofre em demasia com os imigrantes muçulmanos, que foram deixados praticamente livres para matar, saquear, esquartejar e estuprar quem eles quiserem. Hoje, eles são praticamente intocáveis, visto que denunciá-los, desmoralizá-los ou acusá-los de qualquer crime é uma atitude politicamente incorreta e altamente reprovada pelo establishment progressista. A elite socialista sueca jamais se atreve a acusá-los de qualquer crime, pois temem ser classificados como xenófobos e intolerantes. Ou seja, a desgraça da Europa, sobretudo dos países escandinavos, foi não ter oferecido nenhuma resistência à ideologia progressista politicamente correta. Hoje, eles são totalmente submissos a ela.
O exemplo do Japão mostra, portanto — para além de qualquer dúvida —, a importância ética e prática da discriminação, que é perfeitamente aceitável e natural, sendo uma tendência inerente aos seres humanos, e portanto intrínseca a determinados comportamentos, sociedades e culturas.
Indo ainda mais longe nessa questão, devemos entender que na verdade todos nós discriminamos, e não existe absolutamente nada de errado nisso. Discriminar pessoas faz parte da natureza humana e quem se recusa a admitir esse fato é um hipócrita. Mulheres discriminam homens na hora de selecionar um parceiro; elas avaliam diversos quesitos, como altura, aparência, status social, condição financeira e carisma. E dentre suas opções, elas sempre escolherão o homem mais atraente, másculo e viril, em detrimento de todos os baixinhos, calvos, carentes, frágeis e inibidos que possam estar disponíveis. Da mesma forma, homens sempre terão preferência por mulheres jovens, atraentes e delicadas, em detrimento de todas as feministas de meia-idade, acima do peso, de cabelo pintado, que são mães solteiras e militantes socialistas. A própria militância progressista discrimina pessoas de forma virulenta e intransigente, como fica evidente no tratamento que dispensam a mulheres bolsonaristas e a negros de direita.
A verdade é que — não importa o nível de histeria da militância progressista — a discriminação é inerente à condição humana e um direito natural inalienável de todos. É parte indissociável da natureza humana e qualquer pessoa pode e deve exercer esse direito sempre que desejar. Não existe absolutamente nada de errado em discriminar pessoas. O problema real é a ideologia progressista e o autoritarismo politicamente correto, movimentos tirânicos que não respeitam o direito das pessoas de discriminar.
Fascismo
Quando falamos de fascismo, precisamos entender que, para a esquerda política, o fascismo é compreendido como um conceito completamente divorciado do seu significado original. Para um militante de esquerda, fascista é todo aquele que defende posicionamentos contrários ao progressismo, não se referindo necessariamente a um fascista clássico.
Mas, seja como for, é necessário entender que — como qualquer ideologia política — até mesmo o fascismo clássico tem o direito de existir e ocupar o seu devido lugar; portanto, fascistas não devem ser arbitrariamente censurados, apesar de defenderem conceitos que representam uma completa antítese de tudo aquilo que é valioso para os entusiastas da liberdade.
Em um país como o Brasil, onde socialistas e comunistas tem total liberdade para se expressar, defender suas ideologias e até mesmo formar partidos políticos, não faz absolutamente o menor sentido que fascistas — e até mesmo nazistas assumidos — sofram qualquer tipo de discriminação. Embora socialistas e comunistas se sintam moralmente superiores aos fascistas (ou a qualquer outra filosofia política ou escola de pensamento), sabemos perfeitamente que o seu senso de superioridade é fruto de uma pueril romantização universitária da sua própria ideologia. A história mostra efetivamente que o socialismo clássico e o comunismo causaram muito mais destruição do que o fascismo.
Portanto, se socialistas e comunistas tem total liberdade para se expressar, não existe a menor razão para que fascistas não usufruam dessa mesma liberdade.
É claro, nesse ponto, seremos invariavelmente confrontados por um oportuno dilema — o famoso paradoxo da intolerância, de Karl Popper. Até que ponto uma sociedade livre e tolerante deve tolerar a intolerância (inerente a ideologias totalitárias)?
As leis de propriedade privada resolveriam isso em uma sociedade livre. O mais importante a levarmos em consideração no atual contexto, no entanto — ao defender ou criticar uma determinada ideologia, filosofia ou escola de pensamento —, é entender que, seja ela qual for, ela tem o direito de existir. E todas as pessoas que a defendem tem o direito de defendê-la, da mesma maneira que todos os seus detratores tem o direito de criticá-la.
Essa é uma forte razão para jamais apoiarmos a censura. Muito pelo contrário, devemos repudiar com veemência e intransigência toda e qualquer forma de censura, especialmente a estatal.
Existem duas fortes razões para isso:
A primeira delas é a volatilidade da censura (especialmente a estatal). A censura oficial do governo, depois que é implementada, torna-se absolutamente incontrolável. Hoje, ela pode estar apontada para um grupo de pessoas cujas ideias divergem das suas. Mas amanhã, ela pode estar apontada justamente para as ideias que você defende. É fundamental, portanto, compreendermos que a censura estatal é incontrolável. Sob qualquer ponto de vista, é muito mais vantajoso que exista uma vasta pluralidade de ideias conflitantes na sociedade competindo entre si, do que o estado decidir que ideias podem ser difundidas ou não.
Além do mais, libertários e anarcocapitalistas não podem nunca esperar qualquer tipo de simpatia por parte das autoridades governamentais. Para o estado, seria infinitamente mais prático e vantajoso criminalizar o libertarianismo e o anarcocapitalismo — sob a alegação de que são filosofias perigosas difundidas por extremistas radicais que ameaçam o estado democrático de direito — do que o fascismo ou qualquer outra ideologia centralizada em governos burocráticos e onipotentes. Portanto, defender a censura, especialmente a estatal, representa sempre um perigo para o próprio indivíduo, que mais cedo ou mais tarde poderá ver a censura oficial do sistema se voltar contra ele.
Outra razão pela qual libertários jamais devem defender a censura, é porque — ao contrário dos estatistas — não é coerente que defensores da liberdade se comportem como se o estado fosse o seu papai e o governo fosse a sua mamãe. Não devemos terceirizar nossas próprias responsabilidades, tampouco devemos nos comportar como adultos infantilizados. Assumimos a responsabilidade de combater todas as ideologias e filosofias que agridem a liberdade e os seres humanos. Não procuramos políticos ou burocratas para executar essa tarefa por nós.
Portanto, se você ver um fascista sendo censurado nas redes sociais ou em qualquer outro lugar, assuma suas dores. Sinta-se compelido a defendê-lo, mostre aos seus detratores que ele tem todo direito de se expressar, como qualquer pessoa. Você não tem obrigação de concordar com ele ou apreciar as ideias que ele defende. Mas silenciar arbitrariamente qualquer pessoa não é uma pauta que honra a liberdade.
Se você não gosta de estado, planejamento central, burocracia, impostos, tarifas, políticas coletivistas, nacionalistas e desenvolvimentistas, mostre com argumentos coesos e convincentes porque a liberdade e o livre mercado são superiores a todos esses conceitos. Mas repudie a censura com intransigência e mordacidade.
Em primeiro lugar, porque você aprecia e defende a liberdade de expressão para todas as pessoas. E em segundo lugar, por entender perfeitamente que — se a censura eventualmente se tornar uma política de estado vigente entre a sociedade — é mais provável que ela atinja primeiro os defensores da liberdade do que os defensores do estado.
Machismo
Muitos elementos do comportamento masculino que hoje são atacados com virulência e considerados machistas pelo movimento progressista são na verdade manifestações naturais intrínsecas ao homem, que nossos avôs cultivaram ao longo de suas vidas sem serem recriminados por isso. Com a ascensão do feminismo, do progressismo e a eventual problematização do sexo masculino, o antagonismo militante dos principais líderes da revolução sexual da contracultura passou a naturalmente condenar todos os atributos genuinamente masculinos, por considerá-los símbolos de opressão e dominação social.
Apesar do Brasil ser uma sociedade liberal ultra-progressista, onde o estado protege mais as mulheres do que as crianças — afinal, a cada semana novas leis são implementadas concedendo inúmeros privilégios e benefícios às mulheres, aos quais elas jamais teriam direito em uma sociedade genuinamente machista e patriarcal —, a esquerda política persiste em tentar difundir a fantasia da opressão masculina e o mito de que vivemos em uma sociedade machista e patriarcal.
Como sempre, a realidade mostra um cenário muito diferente daquilo que é pregado pela militância da terra da fantasia. O Brasil atual não tem absolutamente nada de machista ou patriarcal. No Brasil, mulheres podem votar, podem ocupar posições de poder e autoridade tanto na esfera pública quanto em companhias privadas, podem se candidatar a cargos políticos, podem ser vereadoras, deputadas, governadoras, podem ser proprietárias do próprio negócio, podem se divorciar, podem dirigir, podem comprar armas, podem andar de biquíni nas praias, podem usar saias extremamente curtas, podem ver programas de televisão sobre sexo voltados única e exclusivamente para o público feminino, podem se casar com outras mulheres, podem ser promíscuas, podem consumir bebidas alcoólicas ao ponto da embriaguez, e podem fazer praticamente tudo aquilo que elas desejarem. No Brasil do século XXI, as mulheres são genuinamente livres para fazer as próprias escolhas em praticamente todos os aspectos de suas vidas. O que mostra efetivamente que a tal opressão do patriarcado não existe.
O liberalismo social extremo do qual as mulheres usufruem no Brasil atual — e que poderíamos estender a toda a sociedade contemporânea ocidental — é suficiente para desmantelar completamente a fábula feminista da sociedade patriarcal machista e opressora, que existe única e exclusivamente no mundinho de fantasias ideológicas da esquerda progressista.
Tão importante quanto, é fundamental compreender que nenhum homem é obrigado a levar o feminismo a sério ou considerá-lo um movimento social e político legítimo. Para um homem, ser considerado machista ou até mesmo assumir-se como um não deveria ser um problema. O progressismo e o feminismo — com o seu nefasto hábito de demonizar os homens, bem como todos os elementos inerentes ao comportamento e a cultura masculina — é que são o verdadeiro problema, conforme tentam modificar o homem para transformá-lo em algo que ele não é nem deveria ser: uma criatura dócil, passiva e submissa, que é comandada por ideologias hostis e antinaturais, que não respeitam a hierarquia de uma ordem social milenar e condições inerentes à própria natureza humana. Com o seu hábito de tentar modificar tudo através de leis e decretos, o feminismo e o progressismo mostram efetivamente que o seu real objetivo é criminalizar a masculinidade.
A verdade é que — usufruindo de um nível elevado de liberdades — não existe praticamente nada que a mulher brasileira do século XXI não possa fazer. Adicionalmente, o governo dá as mulheres uma quantidade tão avassaladora de vantagens, privilégios e benefícios, que está ficando cada vez mais difícil para elas encontrarem razões válidas para reclamarem da vida. Se o projeto de lei que pretende fornecer um auxílio mensal de mil e duzentos reais para mães solteiras for aprovado pelo senado, muitas mulheres que tem filhos não precisarão nem mesmo trabalhar para ter sustento. E tantas outras procurarão engravidar, para ter direito a receber uma mesada mensal do governo até o seu filho completar a maioridade.
O que a militância colorida da terra da fantasia convenientemente ignora — pois a realidade nunca corresponde ao seu conto de fadas ideológico — é que o mundo de uma forma geral continua sendo muito mais implacável com os homens do que é com as mulheres. No Brasil, a esmagadora maioria dos suicídios é praticada por homens, a maioria das vítimas de homicídio são homens e de cada quatro moradores de rua, três são homens. Mas é evidente que uma sociedade liberal ultra-progressista não se importa com os homens, pois ela não é influenciada por fatos concretos ou pela realidade. Seu objetivo é simplesmente atender as disposições de uma agenda ideológica, não importa quão divorciadas da realidade elas são.
O nível exacerbado de liberdades sociais e privilégios governamentais dos quais as mulheres brasileiras usufruem é suficiente para destruir a fantasiosa fábula da sociedade machista, opressora e patriarcal. Se as mulheres brasileiras não estão felizes, a culpa definitivamente não é dos homens. Se a vasta profusão de liberdades, privilégios e benefícios da sociedade ocidental não as deixa plenamente saciadas e satisfeitas, elas podem sempre mudar de ares e tentar uma vida mais abnegada e espartana em países como Irã, Paquistão ou Afeganistão. Quem sabe assim elas não se sentirão melhores e mais realizadas?
Homofobia
Quando falamos em homofobia, entramos em uma categoria muito parecida com a do racismo: o direito de discriminação é totalmente válido. Absolutamente ninguém deve ser obrigado a aceitar homossexuais ou considerar o homossexualismo como algo normal. Sendo cristão, não existe nem sequer a mais vaga possibilidade de que algum dia eu venha a aceitar o homossexualismo como algo natural. O homossexualismo se qualifica como um grave desvio de conduta e um pecado contra o Criador.
A Bíblia proíbe terminantemente conduta sexual imoral, o que — além do homossexualismo — inclui adultério, fornicação, incesto e bestialidade, entre outras formas igualmente pérfidas de degradação.
Segue abaixo três passagens bíblicas que proíbem terminantemente a conduta homossexual:
“Não te deitarás com um homem como se deita com uma mulher. Isso é abominável!” (Levítico 18:22 — King James Atualizada)
“Se um homem se deitar com outro homem, como se deita com mulher, ambos terão praticado abominação; certamente serão mortos; o seu sangue estará sobre eles.” (Levítico 20:13 — João Ferreira de Almeida Atualizada)
“O quê! Não sabeis que os injustos não herdarão o reino de Deus? Não sejais desencaminhados. Nem fornicadores, nem idólatras, nem adúlteros, nem homens mantidos para propósitos desnaturais, nem homens que se deitam com homens, nem ladrões, nem gananciosos, nem beberrões, nem injuriadores, nem extorsores herdarão o reino de Deus.” (1 Coríntios 6:9,10 —Tradução do Novo Mundo das Escrituras Sagradas com Referências)
Se você não é religioso, pode simplesmente levar em consideração o argumento do respeito pela ordem natural. A ordem natural é incondicional e incisiva com relação a uma questão: o complemento de tudo o que existe é o seu oposto, não o seu igual. O complemento do dia é a noite, o complemento da luz é a escuridão, o complemento da água, que é líquida, é a terra, que é sólida. E como sabemos o complemento do macho — de sua respectiva espécie — é a fêmea.
Portanto, o complemento do homem, o macho da espécie humana, é naturalmente a mulher, a fêmea da espécie humana. Um homem e uma mulher podem naturalmente se reproduzir, porque são um complemento biológico natural. Por outro lado, um homem e outro homem são incapazes de se reproduzir, assim como uma mulher e outra mulher.
Infelizmente, o mundo atual está longe de aceitar como plenamente estabelecida a ordem natural pelo simples fato dela existir, visto que tentam subvertê-la a qualquer custo, não importa o malabarismo intelectual que tenham que fazer para justificar os seus pontos de vista distorcidos e antinaturais. A libertinagem irrefreável e a imoralidade bestial do mundo contemporâneo pós-moderno não reconhecem nenhum tipo de limite. Quem tenta restabelecer princípios morais salutares é imediatamente considerado um vilão retrógrado e repressivo, sendo ativamente demonizado pela militância do hedonismo, da luxúria e da licenciosidade desenfreada e sem limites.
Definitivamente, fazer a apologia da moralidade, do autocontrole e do autodomínio não faz nenhum sucesso na Sodoma e Gomorra global dos dias atuais. O que faz sucesso é lacração, devassidão, promiscuidade e prazeres carnais vazios. O famoso escritor e filósofo francês Albert Camus expressou uma verdade contundente quando disse: “Uma só frase lhe bastará para definir o homem moderno — fornicava e lia jornais”.
Qualquer indivíduo tem o direito inalienável de discriminar ativamente homossexuais, pelo direito que ele julgar mais pertinente no seu caso. A objeção de consciência para qualquer situação é um direito natural dos indivíduos. Há alguns anos, um caso que aconteceu nos Estados Unidos ganhou enorme repercussão internacional, quando o confeiteiro Jack Phillips se recusou a fazer um bolo de casamento para o “casal” homossexual Dave Mullins e Charlie Craig.
Uma representação dos direitos civis do estado do Colorado abriu um inquérito contra o confeiteiro, alegando que ele deveria ser obrigado a atender todos os clientes, independente da orientação sexual, raça ou crença. Preste atenção nas palavras usadas — ele deveria ser obrigado a atender.
Como se recusou bravamente a ceder, o caso foi parar invariavelmente na Suprema Corte, que decidiu por sete a dois em favor de Jack Phillips, sob a alegação de que obrigar o confeiteiro a atender o “casal” homossexual era uma violação nefasta dos seus princípios religiosos. Felizmente, esse foi um caso em que a liberdade prevaleceu sobre a tirania progressista.
Evidentemente, homossexuais não devem ser agredidos, ofendidos, internados em clínicas contra a sua vontade, nem devem ser constrangidos em suas liberdades pelo fato de serem homossexuais. O que eles precisam entender é que a liberdade é uma via de mão dupla. Eles podem ter liberdade para adotar a conduta que desejarem e fazer o que quiserem (contanto que não agridam ninguém), mas da mesma forma, é fundamental respeitar e preservar a liberdade de terceiros que desejam rejeitá-los pacificamente, pelo motivo que for.
Afinal, ninguém tem a menor obrigação de aceitá-los, atendê-los ou sequer pensar que uma união estável entre duas pessoas do mesmo sexo — incapaz de gerar descendentes, e, portanto, antinatural — deva ser considerado um matrimônio de verdade. Absolutamente nenhuma pessoa, ideia, movimento, crença ou ideologia usufrui de plena unanimidade no mundo. Por que o homossexualismo deveria ter tal privilégio?
Homossexuais não são portadores de uma verdade definitiva, absoluta e indiscutível, que está acima da humanidade. São seres humanos comuns que — na melhor das hipóteses —, levam um estilo de vida que pode ser considerado “alternativo”, e absolutamente ninguém tem a obrigação de considerar esse estilo de vida normal ou aceitável. A única obrigação das pessoas é não interferir, e isso não implica uma obrigação em aceitar.
Discriminar homossexuais (assim como pessoas de qualquer outro grupo, raça, religião, nacionalidade ou etnia) é um direito natural por parte de todos aqueles que desejam exercer esse direito. E isso nem o direito positivo nem a militância progressista poderão algum dia alterar ou subverter. O direito natural e a inclinação inerente dos seres humanos em atender às suas próprias disposições é simplesmente imutável e faz parte do seu conjunto de necessidades.
Conclusão
A militância progressista é absurdamente autoritária, e todas as suas estratégias e disposições ideológicas mostram que ela está em uma guerra permanente contra a ordem natural, contra a liberdade e principalmente contra o homem branco, cristão, conservador e tradicionalista — possivelmente, aquilo que ela mais odeia e despreza.
Nós não podemos, no entanto, ceder ou dar espaço para a agenda progressista, tampouco pensar em considerar como sendo normais todas as pautas abusivas e tirânicas que a militância pretende estabelecer como sendo perfeitamente razoáveis e aceitáveis, quer a sociedade aceite isso ou não. Afinal, conforme formos cedendo, o progressismo tirânico e totalitário tende a ganhar cada vez mais espaço.
Quanto mais espaço o progressismo conquistar, mais corroída será a liberdade e mais impulso ganhará o totalitarismo. Com isso, a cultura do cancelamento vai acabar com carreiras, profissões e com o sustento de muitas pessoas, pelo simples fato de que elas discordam das pautas universitárias da moda.
A história mostra perfeitamente que quanto mais liberdade uma sociedade tem, mais progresso ela atinge. Por outro lado, quanto mais autoritária ela for, mais retrocessos ela sofrerá. O autoritarismo se combate com liberdade, desafiando as pautas de todos aqueles que persistem em implementar a tirania na sociedade. O politicamente correto é o nazismo dos costumes, que pretende subverter a moral através de uma cultura de vigilância policial despótica e autoritária, para que toda a sociedade seja subjugada pela agenda totalitária progressista.
Pois quanto a nós, precisamos continuar travando o bom combate em nome da liberdade. E isso inclui reconhecer que ideologias, hábitos e costumes de que não gostamos tem o direito de existir e até mesmo de serem defendidos.
-
@ fa0165a0:03397073
2023-07-24 10:19:27Below is an easy-to-read list of keyboard shortcuts and commands to navigate your Linux computer efficiently: (Note that some variations between systems may apply)
General Shortcuts:
- Open Terminal: Ctrl + Alt + T
- Close current application: Alt + F4
- Switch between open applications: Alt + Tab
- Logout from current session: Ctrl + Alt + Del

Navigating the File System:
- Open File Manager (Nautilus): Super (Windows key) + E
- Move back in directory: Alt + Left Arrow
- Move forward in directory: Alt + Right Arrow
- Go to Home directory: Ctrl + H
- Go to Desktop: Ctrl + D
- Open a folder or file: Enter
- Rename a file or folder: F2
- Copy selected item: Ctrl + C
- Cut selected item: Ctrl + X
- Paste copied/cut item: Ctrl + V
- Delete selected item: Delete
- Create a new folder: Ctrl + Shift + N

Navigating Applications:
- Switch between open windows of the same application: Alt + `
- Close the current window: Ctrl + W
- Minimize the current window: Ctrl + M
- Maximize/Restore the current window: Ctrl + Super + Up Arrow / Down Arrow

Navigating Web Browsers (e.g., Firefox, Chrome):
- Open a new tab: Ctrl + T
- Close the current tab: Ctrl + W
- Switch to the next tab: Ctrl + Tab
- Switch to the previous tab: Ctrl + Shift + Tab
- Open a link in a new tab: Ctrl + Left Click
- Go back in the browser history: Alt + Left Arrow
- Go forward in the browser history: Alt + Right Arrow

System Controls:
- Lock the screen: Ctrl + Alt + L
- Open the system menu (context menu): Menu key (or Right-click key) or Shift + F10
- Open the Run Command prompt: Alt + F2
These shortcuts may vary slightly depending on the Linux distribution and desktop environment you are using. Experiment with these shortcuts to navigate your Linux system faster and more efficiently without relying on the mouse.
Since websites are such an important interface for the information of today, I have here appended the list with some navigational hotkeys for web browsers (e.g., Firefox, Chrome) on Linux:
General Navigation:
- Scroll down: Spacebar
- Scroll up: Shift + Spacebar
- Scroll horizontally: Hold Shift and scroll with the mouse wheel or arrow keys
- Go to the top of the page: Home
- Go to the bottom of the page: End
- Refresh the page: F5 or Ctrl + R
- Stop loading the page: Esc

Link and Page Navigation:
- Move focus to the next link or interactive element: Tab
- Move focus to the previous link or interactive element: Shift + Tab
- Activate/follow a link or button: Enter
- Open link in a new tab: Ctrl + Enter (Cmd + Enter on macOS)
- Open link in a new background tab: Ctrl + Shift + Enter (Cmd + Shift + Enter on macOS)
- Open link in a new window: Shift + Enter
- Go back to the previous page: Backspace or Alt + Left Arrow
- Go forward to the next page: Alt + Right Arrow

Searching:
- Find text on the page: Ctrl + F
- Find next occurrence: Ctrl + G
- Find previous occurrence: Ctrl + Shift + G

Tab Management:
- Open a new tab: Ctrl + T
- Close the current tab: Ctrl + W
- Reopen the last closed tab: Ctrl + Shift + T
- Switch to the next tab: Ctrl + Tab
- Switch to the previous tab: Ctrl + Shift + Tab
- Switch to a specific tab (numbered from left to right): Ctrl + [1-8]
- Switch to the last tab: Ctrl + 9

Form Interaction:
- Move to the next form field: Tab
- Move to the previous form field: Shift + Tab
- Check/uncheck checkboxes and radio buttons: Spacebar
- Select an option from a dropdown menu: Enter, then arrow keys to navigate options

Miscellaneous:
- Open the browser's menu: Alt (sometimes F10)
- Open the address bar (omnibox): Ctrl + L or Alt + D
Remember, the accessibility of websites can vary, and some sites might have different keyboard navigation implementations. In some cases, you may need to enable keyboard navigation in the browser's settings or extensions. Additionally, browser updates might introduce changes to keyboard shortcuts, so it's always good to check the latest documentation or help resources for your specific browser version.
But I hope this helps as a TL;DR for getting started with navigating your laptop the way the pros do.
Version controlled over at github gist.
-
@ aa55a479:f7598935
2023-07-19 17:54:44Test
-
@ 32dc4f25:f95ddcce
2023-07-27 11:56:04In Nostr, "smart clients and dumb servers" refers to a communication model or architecture where the clients (smart clients) have higher logic capability and autonomy, while the servers (dumb servers) have lower logic capability and passive execution.
You are welcome to read more of my articles on the decentralized long-form content platform yakihonne.com.
Smart clients refer to client applications or devices that possess higher intelligence and processing power. They can make autonomous decisions and perform specific tasks, including handling business logic, processing data, communicating, and coordinating interactions with other services. In Nostr, smart clients play an active role, responsible for controlling the communication flow, processing data and protocols, and executing necessary business logic.
Dumb servers, on the other hand, refer to servers with lower intelligence and passive execution capability. They are designed to perform basic data storage and transmission functions without handling complex business logic or decision-making. In Nostr, dumb servers act as passive recipients, receiving and transmitting data based on instructions from smart clients but not participating in active decision-making or coordination.
The concept of smart clients and dumb servers in Nostr aims to achieve a distributed and decentralized communication model. Smart clients take charge of managing the communication flow and business logic, reducing reliance on centralized servers and the risk of single points of failure. At the same time, the passive execution capability of dumb servers makes the system more scalable and flexible, allowing more services and nodes to join the communication network.
In summary, the smart clients and dumb servers model in Nostr provides a flexible, scalable, and decentralized communication architecture by decentralizing decision-making and logic to the clients and restricting servers to basic data storage and transmission functions.
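As a minimal sketch of where that logic ends up, the snippet below assumes a browser-like runtime with a global WebSocket and relays that speak the plain NIP-01 wire protocol; the relay URLs are just examples. The relays only match events against the filter they are sent, while deduplication and ordering are handled entirely in the client.

```ts
// Sketch only: the relay is a dumb store-and-forward pipe, the client holds the smarts.
type NostrEvent = { id: string; kind: number; created_at: number; content: string };

const relays = ["wss://relay.damus.io", "wss://purplepag.es"]; // example relay URLs
const seen = new Map<string, NostrEvent>(); // client-side deduplication across relays

for (const url of relays) {
  const ws = new WebSocket(url);
  ws.onopen = () => {
    // NIP-01 subscription: the relay only returns events matching this filter.
    ws.send(JSON.stringify(["REQ", "feed", { kinds: [1], limit: 20 }]));
  };
  ws.onmessage = (msg) => {
    const data = JSON.parse(msg.data as string);
    if (data[0] !== "EVENT") return; // ignore EOSE, NOTICE, etc.
    const event = data[2] as NostrEvent;
    if (!seen.has(event.id)) seen.set(event.id, event); // the client decides what to keep
  };
}

// The client also decides how to present the data, e.g. newest first.
const timeline = () => [...seen.values()].sort((a, b) => b.created_at - a.created_at);
```

Nothing in this sketch depends on any particular relay implementation; relays can be swapped in and out without changing the client's behaviour, which is exactly the decoupling the model aims for.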
Smart clients and dumb servers are the core idea in microservices
The tech development space is dominated by mantras — sayings that make sense in context but are repeated away from their original source. These expressions may carry meaning for the initiated but are often seen as complicated jargon to others. One such concept is “smart endpoints and dumb pipes.” Accredited to Martin Fowler, the idea is helpful to consider when designing microservices architectures.
The basic premise of “smart endpoints and dumb pipes” is that microservices should carry their own communication logic (the “endpoints”), and the carrier superstructure that transmits these messages (the “pipes”) should have as light a structure as possible. Why is this so? What’s the justification, and what’s the background upon which this conversation on pipes and endpoints exists?
Below, we’ll attempt to clarify what the saying means. We’ll look at where it came from, how it applies to microservice development, and see why it’s a helpful expression for modern software architects to understand.
The Problem of the Monolith
We’ve talked at length about both monoliths and microservices, but it bears significant importance to this discussion, so we will again define them.
Understanding Monolith
The traditional approach to development is often referred to by a singular term — the monolith. Monolithic development adopts a centralized data storage and response. A holdover from the era of mainframes and clients, the basic idea is to create a vertical stack wherein everything is located and guided by a single, all-powerful system. This non-distributed approach meant that everything existed in one body. Software applications were self-contained and non-modular, with requests being handled by a singular entity.
Often, these solutions were purpose-built, with many organizations utilizing a monolith due to their startup phases being singularly focused. This resulted in APIs that were non-modular, specific in intent and purpose to the point of rejecting extensibility and scalability in favor of stability and a functional response.
There are some strengths to this kind of approach. First, it’s often cheaper to engage with at the start. Building something to do one thing — and only one thing — is often more affordable than building a system that allows for many things. In such a monolithic approach, the service is a one-to-one relationship — the server owns the data, how it’s transmitted, and how it’s serviced, making a clear communication pathway.
Unfortunately, this means that monoliths are heavily siloed, lacking a malleable underlying codebase. Alterations or replacements carry an enormous cost in both time and resources. This can result in a sort of code paralysis just due to the sheer weight of the system. It can also result in a high cost to develop, maintain, and run. As a monolith grows larger, it remains constrained to its original design, incurring ever-growing expenses to add new components. This results in late-stage development that carries massive deployment burdens and slower release schedules.
Understanding Microservices
Microservices were designed to solve this problem. In essence, the microservice paradigm represents a sea change from isolated singular monoliths to a collection of smaller, single-purpose services that work in concert. With each service doing a specific thing, the idea is that those services would then work together to facilitate the core function.
This shift from non-distributed to distributed, from centralized to decentralized, delivers significant benefits. First, the system becomes much more scalable. An application can slot in new services or change existing ones without adverse end-user experience changes. Iteration does not freeze new development, as you’re changing a small part of the greater subset. You can compare this to changing clothes — if your entire outfit was all sewn together, as in a monolith, you would be unable to change your jacket or put on a hat without a huge effort. A microservice allows you to, metaphorically speaking, change hats or jackets at will. This is possible because the individual parts are only connected by necessity, not enforcement.
This also means microservices do not carry with them the problem of siloing. With components architected as microservices, the overall system can change, mold itself, and adapt to circumstances without affecting any other part. Business logic can be represented on the node with the closest affinity to its function, rather than being centralized into a singular node of control.
This system does introduce additional complexity, as we’ll cover below, yet the benefits are so substantial that many software architects have adopted it whole-heartedly.
The Solution: Smart or Dumb Pipes?
While microservices fix many of the issues of the monolith, some unique problems arise. Microservices don’t isolate everything into a single vertical silo, which is great — systems can be spread across multiple nodes and instances, creating a lot of variability in resource location. This also, unfortunately, creates some communication complexity. Think about it this way — imagine you are trying to do a group project, but instead of everyone sitting at the same conference table, you are all several miles apart. This would introduce significant communication barriers, even while introducing more extensibility and scalability to your group.
Microservices face this exact problem when it comes to facilitating inter-microservice communication. How do you solve that communication problem? There are really only two options: either you create smart pipes, or you create smart endpoints.
You could solve the microservices communication problem by creating smart pipes, and many implementations have employed this solution. An excellent example of this type, the Enterprise Service Bus, allows for the logic and processing governing communication to be placed in a singular system, a facilitation center, to make communication stable and efficient. However, the problem with doing this is that you’ve created a vertical system of centralization — while the entities doing the work are decentralized, all communication is centralized to this ESB, resulting in a sort of quasi-monolithic approach.
While an ESB is not itself bad — and to put a finer point, not necessarily a monolithic approach — taking it too far can certainly lead to a lot of the same issues incurred with a monolithic strategy.
Smart Endpoints and Dumb Pipes
A great solution is to go the other way. Instead of making your pipes smart, make your endpoints smart! In this system, you’re shifting the paradigm away from the idea of a communications bus and closer towards self-sufficient nodes. In essence, you are taking that business logic, communication logic, and general governance away from the communication node and are instead placing it firmly in the entities doing the talking.
The core idea of “smart endpoints and dumb pipes” is that the microservices, when designed correctly, don’t need a bus to govern this communication. The services themselves can govern the logical breakout of communication without going through an intermediary. If the User Service needs to talk to the Authentication Service, for instance, all the logic for how that works can be held by those entities — including a third entity may increase governance and control, but it may be antithesis to the microservice solution.
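As a hedged illustration (the service name, URL and payload shape below are made up for the example, not taken from any real system), this is roughly what it looks like when the User Service owns its own communication logic and calls the Authentication Service directly over a plain HTTP endpoint instead of routing through a smart bus:

```ts
// Hypothetical endpoint-to-endpoint call: no ESB in the middle, just a dumb HTTP pipe.
type TokenCheck = { valid: boolean; userId?: string };

export async function verifyToken(token: string): Promise<TokenCheck> {
  // Timeout, fallback and retry decisions live here, in the calling endpoint.
  const res = await fetch("http://auth-service:8080/verify", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ token }),
  });
  if (!res.ok) return { valid: false }; // the caller chooses how to degrade
  return (await res.json()) as TokenCheck;
}
```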
Martin Fowler’s coined this concept in his commentary on microservices. In it, he stated as thus:
“The microservice community favours an alternative approach: smart endpoints and dumb pipes. Applications built from microservices aim to be as decoupled and as cohesive as possible – they own their own domain logic and act more as filters in the classical Unix sense – receiving a request, applying logic as appropriate and producing a response. These are choreographed using simple RESTish protocols rather than complex protocols such as WS-Choreography or BPEL or orchestration by a central tool.”
To be clear, this doesn’t mean that no logic is located within the communication systems. The point is where the bulk of that logic resides. It’s one thing to have the “pipes” holding onto basic logic for managing asynchronous communication. Still, it’s a whole secondary thing to have them hold the complete logic for all communication and governance internally. Not only is it highly inefficient, but it may also be ineffective in actually facilitating communication. Lightweight logic systems can, in fact, be useful. Again, from the same commentary from Martin Fowler:
“The second approach in common use is messaging over a lightweight message bus. The infrastructure chosen is typically dumb (dumb as in acts as a message router only) – simple implementations such as RabbitMQ or ZeroMQ don’t do much more than provide a reliable asynchronous fabric – the smarts still live in the endpoints that are producing and consuming messages; in the services.”
This is really a solution that is more concerned with removing blockers than demanding a particular approach. The beauty of microservices lies in the freedom to adapt and change for different circumstances — adopting the concept of “smart endpoints and dumb pipes” simply allows that change and adaptation to be more fluid and independent of the communication controls and protocols more typical of centralized systems.
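To keep the contrast concrete, here is a toy sketch in the same spirit; the queue below is a stand-in for a RabbitMQ- or ZeroMQ-style fabric rather than their real APIs. It does nothing but store and forward, while the consuming endpoint owns every decision about what the messages mean.

```ts
// Toy "dumb pipe": a bare FIFO that only stores and forwards messages.
type Message = { type: string; payload: unknown };

class DumbPipe {
  private queue: Message[] = [];
  publish(msg: Message) { this.queue.push(msg); }
  take(): Message | undefined { return this.queue.shift(); }
}

// "Smart endpoint": it decides which messages matter and what a failure means.
function consume(pipe: DumbPipe, handle: (m: Message) => boolean) {
  for (let msg = pipe.take(); msg !== undefined; msg = pipe.take()) {
    if (!handle(msg)) {
      console.warn("handling failed; the endpoint, not the pipe, decides what happens next", msg);
    }
  }
}

// Usage sketch
const pipe = new DumbPipe();
pipe.publish({ type: "order.created", payload: { id: 42 } });
consume(pipe, (m) => {
  if (m.type !== "order.created") return true; // ignore messages we don't handle
  console.log("reserving stock for order", (m.payload as { id: number }).id);
  return true;
});
```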
Conclusion
The best way to think about “smart endpoints and dumb pipes” is to think about where the logic exists. When you create dumb endpoints and smart pipes, you depend on the pipes for everything. You depend on the pipes to know where the data is coming from, where it’s going, and how it should be transmitted. You have centralized all communication power into a system of organization and governance. That’s not necessarily a bad thing in all cases — in secure data situations, you may very well need such a system — but it’s only truly appropriate in a handful of situations.
“Smart endpoints and dumb pipes” is an inversion of this principle. The microservices know what they’re doing, and if that’s true, why should you need to control their communication? If you’ve built out a robust microservice ecosystem, then you should be able to let them communicate over a system that is minimally controlling, with minimal logic built-in for core functions.
What do you think about this concept? Let us know below!
-
@ aac07d95:c5819a2f
2023-07-09 14:37:58More info about charge-lnd: https://github.com/accumulator/charge-lnd
Install with these commands line by line (the code comments start with #):

```
# change to the bitcoin user
sudo su - bitcoin

# download charge-lnd
git clone https://github.com/accumulator/charge-lnd.git

# create a dedicated macaroon
lncli bakemacaroon offchain:read offchain:write onchain:read info:read --save_to=~/.lnd/data/chain/bitcoin/mainnet/charge-lnd.macaroon

# change directory
cd charge-lnd

# install charge-lnd
pip install -U setuptools && pip install -r requirements.txt .

# leave the bitcoin user
exit
```
Paste this whole code block to create the example config at `/home/bitcoin/charge-lnd/charge.config`:

```
echo "
[default]
strategy = static
base_fee_msat = 1000
fee_ppm = 1000
time_lock_delta = 144

[exchanges-drain-sats]
node.id = 033d8656219478701227199cbd6f670335c8d408a92ae88b962c49d4dc0e83e025, 03cde60a6323f7122d5178255766e38114b4722ede08f7c9e0c5df9b912cc201d6,037f990e61acee8a7697966afd29dd88f3b1f8a7b14d625c4f8742bd952003a590,03cde60a6323f7122d5178255766e38114b4722ede08f7c9e0c5df9b912cc201d6,033d8656219478701227199cbd6f670335c8d408a92ae88b962c49d4dc0e83e025, 021c97a90a411ff2b10dc2a8e32de2f29d2fa49d41bfbb52bd416e460db0747d0d
strategy = static
base_fee_msat = 50000
fee_ppm = 2500
time_lock_delta = 144

[discourage-routing]
chan.max_ratio = 0.10
chan.min_capacity = 250000
strategy = static
base_fee_msat = 1000
fee_ppm = 2000
time_lock_delta = 144

[encourage-routing]
chan.min_ratio = 0.90
chan.min_capacity = 250000
strategy = static
base_fee_msat = 1000
fee_ppm = 10
time_lock_delta = 144
" | sudo -u bitcoin tee /home/bitcoin/charge-lnd/charge.config
```
Set up a cronjob with:

```
crontab -e
```

Paste this to the crontab to run it every 5 minutes (https://crontab.guru/#*/5_*_*_*_*):

```
*/5 * * * * sudo -u bitcoin /home/bitcoin/.local/bin/charge-lnd -c /home/bitcoin/charge-lnd/charge.config
```
gist with more example configurations:
https://gist.github.com/openoms/9d0c554f620f4584c17bec268d4519e8
Posted also on: https://www.lightningnode.info/hardware-deployment/raspiblitz/charge-lnd
-
@ 7e9c924a:188c865e
2023-07-31 12:44:20Introduction
Users of Nostr are most likely already familiar with the fact that there are multiple kinds of events on Nostr. It's not just for Twitter like experiences. In this series of posts we'll take a look at how to build a fully functional blog with the following features:
- Setting up and getting started with SvelteKit and Tailwind
- How to fetch our posts from Nostr
- Dynamic SSR or prerendering using SvelteKit
- Building a simple cache adapter for Cachified that uses SQLite as a storage
- Adding a contact us form that sends a DM to us using Nostr.
We will be adding features as we go. At some point in the future I want to add some kind of authentication (using NIP-07) as well as comments, reactions and analytics.
If you're just interested in checking out the code, you can find it on GitHub. Each part and section has a separate branch.
This post and the upcoming parts will also be available on my blog where you can see the end result.
Getting started with SvelteKit
Getting the project set up
SvelteKit is one of the simplest solutions out there if you want to build your own blog. It's more work than just using something like Ghost but you get the advantage of customizability.
I will assume that you have some experience using the terminal here. To get going, run the following commands somewhere on your computer:
```bash
npm create svelte@latest my-blog
```
Select the following options:
- Which template: Skeleton project
- Type Checking: Yes, using Typescript syntax
- Additional options: ESLint, Prettier
When the installation process is done, run the following commands:
```bash
cd my-blog
npm install
npm run dev -- --open
```
You'll also need an editor. I prefer VSCode (or in the spirit of Nostr, go for VSCodium). If you're new to Svelte, make sure to download the Svelte extension on the marketplace.
To make our website look somewhat decent, let's also add Tailwind to handle the styling. Run `npx svelte-add@latest tailwindcss` in the terminal.

If you are completely new to Svelte and SvelteKit I would recommend going through the learning material to get a basic understanding of what is going on.
Fetching posts from Nostr
Before we actually set things up to show only our own posts, let's start with just fetching the latest 5 events from a couple of different relays. We'll be using @
pablof7z
's excellent library@nostr-dev-kit/ndk
. We'll also need to polyfill the websocket functionality since node doesn't support this out of the box:bash npm install @nostr-dev-kit/ndk websocket-polyfill
Once it's installed, go ahead and create a file called
nostr.ts
in the following path:lib/server/
. This is where we will be interacting with Nostr.First off, we'll need a list of relays that support NIP-04 (long form content). I used
wss://purplepag.es
. Find more on nostr.watch. Let's write some code.```ts // lib/server/nostr.ts import NDK from "@nostr-dev-kit/ndk"; import 'websocket-polyfill'
const relays = ['wss://relay.damus.io', 'wss://purplepag.es']
class Nostr { private ndk: NDK
constructor(relays: string[]) { this.ndk = new NDK({ explicitRelayUrls: relays }) } public async init() { try { await this.ndk.connect() } catch (error) { console.error('Error connecting to NDK:', error) } }
}
export const ndk = new Nostr(relays); ```
We're creating a simple class and instantiating NDK inside of it. We're then passing in our list of relays and exporting an instance of this class that we can use in the rest of our application. We've also added an
init()
method that we'll run when we start our application that connects to the relays.Next we'll create a method to fetch our posts but first, let's add an interface so we get some type safety. In
src/app.d.ts
go ahead and add the following interface:ts ... interface Article { slug?: string, d?: string, title: string, summary: string, tags: string[] published_at: string, }
Add a method to our Nostr class in
nostr.ts
. This will be the method that fetches the latest 5 events from our relays.```ts ... import type { NDKEvent } from "@nostr-dev-kit/ndk";
class Nostr { ... public async getAllArticles(): Promise
{ let events: NDKEvent[]; try { events = [...await this.ndk.fetchEvents({ kinds: [ 30023 ], limit: 5 })] } catch (error) { console.error('Error fetching events:', error) events = [] } const articles = events.map(({tags}) => { const { summary, image, published_at, title, t, d } = mapTags(tags) return { slug: generateSlug(title as string), d, summary, image, published_at, title, tags: t || [] } }) if (articles.length === 0) { return [] } return articles as Article[]
} } ```
Let's try to understand what's going on here. First we're importing the type
NDKEvent
. This is what we'll get back from NDK when we fetch our events.
We're fetching our events here:
events = [...await this.ndk.fetchEvents({ kinds: [ 30023 ], limit: 5 })]
. NDK returns a set of NDKEvents. I like to work with arrays so I convert them to just that by spreading them into an empty array. The argument that we pass into the fetchEvents method is a filter object. It follows the NIP-01 specification, so you can put all sorts of things in here. You can read more about in the spec. In our case we're using the30023
kind which is the long-form content kind and we're limiting the number of results to5
.Nostr events have a few fields that we need to pay attention to. Often you'll find the actual body of a message in the
content
field, but other things that are specific to thekind
will be found in thet
field as an array of tuples so in order to more easily use this information we'll need to go through and clean it all up.The keen-eyed among you have probably already spotted the
mapTags()
andgenerateSlug()
functions here. Create a new filelib/server/utils.ts
and inside add the following:```ts import type { NDKTag } from "@nostr-dev-kit/ndk";
export function generateSlug(headline: string): string { const lowerCaseHeadline = headline.toLowerCase(); const slug = lowerCaseHeadline.replace(/[^\w\s]/g, '').replace(/\s+/g, '-'); return slug; }
export function mapTags(tuple_tags: NDKTag[]): Record
{ let tags = tuple_tags.reduce((acc, [key, value]) => { if (!acc[key]) acc[key] = value else if (typeof acc[key] === 'string') { acc[key] = [acc[key], value] as string[] } else { (acc[key] as string[]).push(value) } return acc }, {} as Record ) return tags } ```
You don't need to understand what's going on here other than the fact that the functions will generate a slug and map the
t
field into usable fields.Let's go back to our
lib/server/nostr.ts
file and import these.```ts import { generateSlug, mapTags } from './utils'
class Nostr { ... } ```
Ok. with that done we should be able to fetch events. Let's move on to the SvelteKit bits.
Showing posts on the front-end
SvelteKit apps are server-rendered by default for first load performance and SEO. This is important in order to get your posts ranked on search engines like Google or Bing.
First off, we need to make sure our
Nostr
client is initiated. Create a file called hooks:/hooks.server.ts
and add the following:```ts import type { Handle } from '@sveltejs/kit'; import { ndk } from '$lib/server/nostr'
await ndk.init()
export const handle = (async ({ event, resolve }) => { const response = await resolve(event); return response; }) satisfies Handle; ```
This file runs when the server starts. The handle hook runs on each request; we're not using it here, but we need to add it in order for the rest of the file to run.
Loading our events
Let's start with loading our events. Create a file
routes/+page.server.ts
. In here we'll call the get articles method on ourNostr
client.```ts import { ndk } from '$lib/server/api';
export const load = async () => { const articles = await ndk.getAllArticles()
return { articles };
} ```
A load function in SvelteKit is used to return data to the front-end. It runs whenever a request is sent to the corresponding route. In this case the root:
/
.Since we have already initiated our
Nostr
client inhooks.server.ts
we don't need to do it here. Simply callndk.getAllArticles()
and return the result.Building the front-end
Let's make something appear on the page. Open up
routes/+page.svelte
. Remove the contents of the file and add the following:```html
My blog
-
{#each data.articles as article}
- {article.title} {/each}
```
The things we returned in our
load
function will be available in ourdata
prop.To get a list of our posts we use an
{#each}
block and loop through our articles.Yay! Titles of posts! Amazing. Let's make it look a bit better. Create a new component:
routes/Summary.svelte
and add the following:```html
{article.title}
{@html article.summary}```
And finally import it and use it in
/routes/+page.svelte
:```html
My blog
-
{#each data.articles as article}
-
```
Last but not least we'll want to only fetch our own posts. To do that modify the filter we used in the
Nostr
class (lib/server/nostr.ts
):ts ... events = [...await this.ndk.fetchEvents({ kinds: [ 30023 ], authors: [YOUR_PUBKEY_GOES_HERE, SOME_OTHER_PUBKEY])] ...
Make sure that the pubkey you're using is in a hex format. If you only have your
npub1.....
style key you can use something like Nostr Check to convert it.You will probably notice that it takes quite a while to load your page at this point. In the next part we'll take a look at how we can fix this by using caching.
Until next time! 🤘
-
@ 7f5c2b4e:a818d75d
2023-07-05 13:49:49This translation of the nostr:naddr1qqxnzd3cxserxdpsxverzwp4qgs87hptfey2p607ef36g6cnekuzfz05qgpe34s2ypc2j6x24qvdwhgrqsqqqa28zcj37a was prepared by nostr:npub138s5hey76qrnm2pmv7p8nnffhfddsm8sqzm285dyc0wy4f8a6qkqtzx624
Habla es una plataforma basada en Nostr que te permite crear y gestionar notas de nostr de formato largo (long-form posts). Se podría comparar con Medium, pero Habla es mucho más que eso. Habla es superior a las plataformas tradicionales de blogs porque está construida sobre Nostr. Es interoperable con una plétora de otras aplicaciones Nostr, lo que hace que la experiencia del usuario sea fluida y más atractiva. Además, gracias a la Lightning Network, tu aportación -si los lectores la consideran valiosa- puede y será recompensada al instante con el mejor dinero que la humanidad ha visto jamás: bitcoin.
¿Qué es Nostr?
Nostr es una nueva forma de comunicarse online que ofrece un montón de ventajas a sus usuarios. Es gratis para todo el mundo; no necesitas un documento de identidad ni ningún otro tipo de verificación por parte de terceros para empezar a conectarte, relacionarte con personas afines y hacer crecer la comunidad que te rodea. Nostr suele confundirse con una plataforma de redes sociales, pero es mucho más que eso. Te animamos a que consultes los recursos de Nostr reunidos aquí para darte cuenta de la magnitud potencial de esta herramienta.
¿Cómo inicio sesión en Habla?
Para empezar a escribir en Habla, simplemente crea una cuenta Habla/Nostr e inicia sesión. Sigue estos unos sencillos pasos para registrarte, empezar a compartir valor y recibir valor de vuelta.
¿Cómo gano sats con Habla?
Habla te permite recibir valor directamente de tus lectores. No se requiere cuenta bancaria o identificación. Simplemente conecta tu dirección Lightning a tu cuenta Habla / Nostr y recibe fondos directamente en tu monedero - sin terceras partes, sin esperar retiros, sin agobios. Sigue estos sencillos pasos para hacerlo.
¿Por qué publicar en Habla es diferente?
El protocolo Nostr es superligero, lo que introduce algunas peculiaridades en cómo deben comportarse las aplicaciones basadas en Nostr. No entraremos en detalles técnicos, pero la diferencia más obvia que notarás como creador de contenido es que tendrás que usar un formato de texto diferente y, posiblemente, inusual mientras redactas tus posts. Pero no temas; Habla proporciona herramientas que hacen este proceso fácil e intuitivo. Aquí hay un video rápido de nostr:npub1wkljx5c6a8uccc5etws8ry0y3r4dgavh2dcav0tal4rtmcdl4z2sfu5u0t que explica lo básico de publicar con Habla (la guía fue hecha antes del rediseño, pero sigue siendo útil):
https://nostr.build/p/nb9474.mp4
Habla (y muchas otras aplicaciones Nostr) utiliza un formato bien establecido, que se llama Markdown. Existe desde hace casi una década y es compatible con la mayoría de las aplicaciones que utilizas a diario. La razón por la que puede que no hayas oído hablar de Markdown es porque las aplicaciones tradicionales suelen ocultarlo al usuario, y nosotros estamos trabajando para hacerlo también. Puedes encontrar más información sobre Markdown aquí.
¿Dónde se almacena mi contenido?
Las plataformas de blog tradicionales almacenan tus contenidos en sus propios servidores. Es un enfoque cómodo y (solía ser) sólido, pero conlleva riesgos críticos. Dejar los frutos de tu trabajo en manos de una sola parte significa que tienen el control total sobre tu contenido. Nostr lo soluciona. Cada vez que publicas algo, tu contenido se transmite a numerosos relés para su posterior almacenamiento y redistribución. Si algún operador de relé bloquea tu publicación o se niega a redistribuirla, tus lectores pueden recurrir a otros relés y acceder a tu contenido (no te preocupes si esto suena complicado; todo sucede bajo cuerda). Esto garantiza que nunca te silencien. Dicho esto, Habla no gestiona su propio relé; hemos decidido concentrarnos en lo que mejor sabemos hacer -construir una plataforma de blogs intuitiva, eficiente y fácil de usar que recompensa- y dejar el almacenamiento y distribución de contenidos a los profesionales en ese campo.
¿Cómo publico?
Habla te proporciona todas las herramientas necesarias para producir posts ricos y que destaquen. Prepara tu artículo, formatea tu texto con la ayuda de las herramientas designadas, añade medios y previsualiza los resultados antes de publicar. Todo lo que necesitas está al alcance de tu mano, y la plataforma es cada día mejor y más amigable.
¿Quién puede leer mis mensajes en Habla?
Cualquier persona en Internet puede leer tus posts. Sin embargo, si a tus lectores les gustaría interactuar con tu trabajo - ya sea siguiéndote / comentando / devolviéndote valor - deben crear una cuenta en Nostr. Te animamos a que ayudes a tus seguidores a introducirse en Nostr para hacer crecer una comunidad próspera y alcanzar nuevas cotas. Esta guía rápida te ayudará a ti y a tus compañeros a empezar.
Este FAQ es un trabajo en curso, y evolucionará a medida que Habla y Nostr se conviertan en herramientas aún más potentes. Por favor, dame tu opinión para que pueda mejorarla.
-
@ 3bf0c63f:aefa459d
2007-05-16 00:00:01Danilo acordou cedo
Danilo acordou cedo e saiu para pegar o metrô, trajava aquelas vestes que seus amigos chamavam de "roupa de comunista", uma calça velha de brim, bege, uma blusa branca com uma logomarca vermelha - que não tinha nada a ver com comunismo - velha sob um paletó azul surrado e chinelo de dedo. Suas roupas eram todas parecidas entre si e, combinadas com sua barba malfeita castanha e seu olhar fundo típico de pessoas alcoolizadas, davam-lhe, realmente, um aspecto notório de comunista.
Quando o metrô parou na estação, Danilo entrou com sua mochila. Não havia assentos livres, mas ele já estava acostumado, aliás, até gostava de ficar em pé, para sentir melhor no rosto o vento que só vinha das janelas superiores do veículo. Colocou a mochila no chão e se segurou em uma das barras de ferro do veículo. Seus cabelos, apesar de curtos, balançavam em intrépidos estandartizados movimentos, como se dançassem o som de "One of These Days".
-
@ a4a6b584:1e05b95b
2023-07-27 01:23:03"For it is written, As I live, saith the Lord, every knee shall bow to me, and every tongue shall confess to God. So then every one of us shall give account of himself to God." Romans 14:11-12
Though God has always had a people, the New Testament Church started with Christ and His disciples and was empowered at Pentecost. Historically, there has always been a remnant of people who adhered to the truth of God's Word for their faith and practice. I am convinced that the very same body of truth which was once delivered must be contended for in every generation.
In every generation from Christ and His disciples to this present moment, there have been holy adherents to the truth of God's Word. This witness of history is a witness in their blood. I am a Baptist by conviction. I was not "Baptist-born." I became and remain a Baptist because of what I have discovered in the Word of God.
The Lord Jesus Christ left His church doctrine and ordinances. The ordinances are baptism and the Lord's Supper. These are the things He ordered us to do. They picture His death and remind us of Him and His return. He also left us a body of doctrine. This involves our belief and teaching. Our doctrine distinguishes us. It is all from the Bible.
No one can be forced to become a Baptist. Our records show that baptism was always administered to professing believers. Hence, we refer to baptism as believers' baptism.
Baptists are gospel-preaching people. They preached the gospel to me. The Lord Jesus said to the first-century church, "But ye shall receive power, after that the Holy Ghost is come upon you: and ye shall be witnesses unto me both in Jerusalem, and in all Judea, and in Samaria, and unto the uttermost part of the earth ,, (Acts I :8). I am grateful to God to be a Baptist. I consider this to be New Testament Christianity. It is my joy to serve as the pastor of an independent Baptist church, preaching and teaching God's Word.
A Baptist recognizes the autonomy of the local church. There is no such thing as "The Baptist Church." The only Baptist headquarters that exists is in the local assembly of baptized believers. There are only local Baptist churches. \
When we say that the Bible is our sole authority, we are speaking of all the scriptures, the whole and its parts. We should preach the whole counsel of God. In the Bible we find the gospel-the death, burial, and resurrection of Jesus Christ. We should proclaim the gospel because Jesus Christ said that we are to take the gospel message to every creature. When we open the sixty-six books of the Bible, we find more than the gospel. Of course, that scarlet thread of redemption runs through all the Bible, but the whole counsel of God must be proclaimed.
The Bible says in Romans 14:11-12, ''For it is written, As I live, saith the Lord, every knee shall bow to me, and every tongue shall confess to God. So then every one of us shall give account of himself to God. " Each human being must answer personally to God.
In our nation we hear people talk about religious tolerance. Religious tolerance is something created by government. It is a "gift" from government. Religious tolerance is something man has made. Soul liberty is something God established when He created us. We find a clear teaching of this in His Word. Soul liberty is a gift from God! God's Word says in Galatians 5:1, "Stand fast therefore in the liberty wherewith Christ hath made us free, and be not entangled again with the yoke of bondage."
Soul liberty does not rest upon the legal documents of our nation-it is rooted in the Word of God. This individual freedom of the soul is inherent in man's nature as God created him. Man is responsible for his choices, but he is free to choose. This conviction is at the core of Baptist beliefs.
This powerful declaration about our Baptist position was made by J.D. Freeman in 1905:
Our demand has been not simply for religious toleration, but religious liberty; not sufferance merely, but freedom; and that not for ourselves alone, but for all men. We did not stumble upon this doctrine. It inheres in the very essence of our belief. Christ is Lord of all.. .. The conscience is the servant only of God, and is not subject to the will of man. This truth has indestructible life. Crucify it and the third day it will rise again. Bury it in the sepulcher and the stone will be rolled away, while the keepers become as dead men .... Steadfastly refusing to bend our necks under the yoke of bondage, we have scrupulously withheld our hands from imposing that yoke upon others .... Of martyr blood our hands are clean. We have never invoked the sword of temporal power to aid the sword of the Spirit. We have never passed an ordinance inflicting a civic disability on any man because of his religious views, be he Protestant or Papist, Jew, or Turk, or infidel. In this regard there is no blot on our escutcheon (family crest).
Remember that, when we are talking about individual soul liberty and the relationship of the church and the state, in America the Constitution does not place the church over the state or the state over the church. Most importantly, Scripture places them side by side, each operating independently of the other. This means there is freedom in the church and freedom in the state. Each is sovereign within the sphere of the authority God has given to each of them (Matthew 22:21).
Read carefully this statement made by Charles Spurgeon, the famous English preacher, concerning Baptist people:
We believe that the Baptists are the original Christians. We did not commence our existence at the Reformation, we were reformers before Luther or Calvin were born; we never came from the Church of Rome, for we were never in it, but we have an unbroken line up to the apostles themselves. We have always existed from the very days of Christ, and our principles, sometimes veiled and forgotten, like a river which may travel underground for a little season, have always had honest and holy adherents. Persecuted alike by Romanists and Protestants of almost every sect, yet there has never existed a government holding Baptist principles which persecuted others; nor, I believe, any body of Baptists ever held it to be right to put the consciences of others under the control of man. We have ever been ready to suffer, as our martyrologies will prove, but we are not ready to accept any help from the State, to prostitute the purity of the Bride of Christ to any alliance with government, and we will never make the Church, although the Queen, the despot over the consciences of men.
The New Park Street Pulpit, Volume VII · Page 225
This is a marvelous statement about Baptist beliefs. I am rather troubled when I see so many people who claim to be Baptists who do not understand why they are Baptists. We should be able to defend our position and do it biblically. If we are people who know and love the Lord and His Word and if the Bible is our sole authority for faith and practice, then we have no reason to be ashamed of the position we take. May God not only help us to take this position, but to take it with holy boldness and compassion. May He help us to be able to take His Word in hand and heart and defend what we believe to a lost and dying world.
So much of what we have to enjoy in our country can be credited to Baptist people. Any honest student of American history will agree that the Virginia Baptists were almost solely responsible for the First Amendment being added to our Constitution providing the freedom to worship God as our conscience dictates.
We have a country that has been so influenced that we do not believe it is right to exercise any control or coercion of any kind over the souls of men. Where did this conviction come from? We find it in the Bible, but someone imparted it to the Founding Fathers. It became the law of the land, and it should remain the law of the land. We need to understand it. It comes out of the clear teaching of God's Word concerning the subject of soul liberty.
There are many historic places there where people were martyred for their faith, giving their lives for what they believed. The religious persecution came as a result of the laws of the land. Although many Baptists have been martyred, you will never find Baptist people persecuting anyone anywhere for his faith, no matter what his faith may be.
There are many denominations that teach the Scriptures are the infallible Word of God, God is the creator of heaven and earth, man is a fallen creature and must be redeemed by the blood of Christ, salvation is the free offer of eternal Iife to those who receive it and eternal helI awaits those who reject it, the Lord's Day is a day of worship, and the only time for man to be made right with God is in this lifetime. There are certainly other commonly held teachings, but there are certain Baptist distinctive. Often an acrostic using the word BAPTISTS is used to represent these Baptist distinctive.
-
B is for biblical authority. The Bible is the sole authority for our faith and practice.
-
A for the autonomy of the local church. Every church we find in the New Testament was a self-governing church with only Christ as the head.
-
P represents the priesthood of believers and the access we have to God through Jesus Christ.
-
T for the two church officers-pastors and deacons. We find these officers in the New Testament.
-
I for individual soul liberty. Most people, when asked, say that the sole authority of the Scripture in our faith and practice is the single, most important distinctive of our faith. However, if we did not have individual soul liberty, we could not come to the convictions we have on all other matters.
-
S for a saved church membership.
Personal Accountability to God
Renowned Baptist leader, Dr. E.Y. Mullins, summarized our Baptist position in these words. He wrote:
The biblical significance of the Baptist is the right of private interpretation of and obedience to the Scriptures. The significance of the Baptist in relation to the individual is soul freedom. The ecclesiastical significance of the Baptist is a regenerated church membership and the equality and priesthood of believers. The political significance of the Baptist is the separation of church and state. But as comprehending all the above particulars, as a great and aggressive force in Christian history, as distinguished from all others and standing entirely alone, the doctrine of the soul's competency in religion under God is the distinctive significance of the Baptists.
We find this accountability in the opening verses of God's Word. When God created man, He created man capable of giving a personal account of himself to God. God did not create puppets; He created people. He gave man the right to choose. That is why we find the man Adam choosing to sin and to disobey God in Genesis chapter three. Of his own volition he chose to sin and disobey God. Genesis I :27 says, "So God created man in his own image, in the image of God created he him; male and female created he them. " We were made in God's image, and when God made us in His image, He made us with the ability to choose.
It is not right to try to force one's religion or belief upon another individual. This does not mean, however, that he can be a Christian by believing anything he wishes to believe, because Jesus Christ said that there is only one way to heaven. He said in John 14:6, "I am the way, the truth, and the life: no man cometh unto the Father, but by me." He is the only way to God. The only way of salvation is the Lord Jesus Christ.
In this age of tolerance, people say that nothing is really wrong. The same people who say that any way of believing is right will not accept the truth that one belief can be the only way that is right. You may believe anything you choose, but God has declared that there is only one way to Him and that is through His Son, Jesus Christ. He is the only way of salvation- that is why He suffered and died for our sins. The only way to know God is through His Son, the Lord Jesus Christ.
Someone is certain to ask, "Who are you to declare that everyone else's religion is wrong?" We are saying that everyone ~as ~ right to choose his own way, but God has clearly taught us in His Word that there is only one way to Him. The Lord Jesus says in John 10:9, "I am the door: by me if any man enter in, he shall be saved, and shall go in and out, and find pasture."
No human being is going to live on this earth without being sinned against by others. Many children are sinned against greatly by their own parents. However, we cannot go through life blaming others for the person we are, because God has made us in such a way that we have an individual accountability to God. This comes out of our soul liberty and our right to choose and respond to things in a way that God would have us espond to them. God has made us in His image. Again, He did not make us puppets or robots; He made us people, created in His image with the ability to choose our own way.
Remember, "For it is written, As I live, saith the Lord, every knee shall bow to me, and every tongue shall confess to God. So then every one of us shall give account of himself to God" (Romans 14:11- 12). We are responsible because we have direct access to God. God has given us the Word of God, the Holy Spirit, and access to the Throne by the merit of Jesus Christ. We, therefore? must answer personally to God at the judgment seat because God communicates to us directly.
People do not like to be held personally accountable for their actions. The truth of the Word of God is that every individual is personally accountable to God. In other words, you are going to meet God in judgment some day. I am going to meet God in judgment some day. All of us are going to stand before the Lord some day and answer to Him. We are individually accountable to God. Since the state cannot answer for us to God, it has no right to dictate our conscience.
We live in a country where there are many false religions. As Baptist people, we defend the right of anyone in our land to worship as he sees fit to worship. This is unheard of in most of the world. If a man is a Moslem, I do not agree with his Islamic religion, but I defend his right to worship as he sees fit to worship. The teaching of the Catholic Church declares that salvation comes through Mary, but this is not the teaching of the Bible. We zealously proclaim the truth, but we must also defend the right of people to worship as they choose to worship. Why? Because individual soul liberty is a gift from God to every human being.
Since the Bible teaches individual soul liberty and personal accountability to God, then it is a truth that will endure to all generations. To be sure, Baptists believe the Bible is the sole authority for faith and practice (II Timothy 3:16-17; Matthew 15:9; I John 2:20, 21 , 27) and the Bible clearly teaches that no church or government or power on earth has the right to bind a man's conscience. The individual is personally accountable to God. Hence, we reject the teaching of infant baptism and all doctrine that recognizes people as members of a church before they give evidence of personal repentance toward God and faith in the Lord Jesus Christ.
The famous Baptist, John Bunyan is the man who gave us Pilgrim's Progress. This wonderful book was planned during Bunyan's prison experience and written when he was released. The trial of John Bunyan took place on October 3, 1660. John Bunyan spent twelve years in jail for his convictions about individual soul liberty, failure to attend the Church of England, and for preaching the Word of God. During his trial, Bunyan stood before Judge Wingate who was interested in hearing John Bunyan state his case. Judge Wingate said, "In that case, then, this court would be profoundly interested in your response to them."
Part of John Bunyan's response follows:
Thank you, My 'lord. And may I say that I am grateful for the opportunity to respond. Firstly, the depositions speak the truth. I have never attended services in the Church of England, nor do I intend ever to do so. Secondly, it is no secret that I preach the Word of God whenever, wherever, and to whomever He pleases to grant me opportunity to do so. Having said that, My 'lord, there is a weightier issue that I am constrained to address. I have no choice but to acknowledge my awareness of the law which I am accused of transgressing. Likewise, I have no choice but to confess my auilt in my transgression of it. As true as these things are, I must affirm that I neiher regret breaking the law, nor repent of having broken it. Further, I must warn you that I have no intention in future of conforming to it. It is, on its face, an unjust law, a law against which honorable men cannot shrink from protesting. In truth, My 'lord, it violates an infinitely higher law- the right of every man to seek God in his own way, unhindered by any temporal power. That, My 'lord, is my response.
Remember that Bunyan was responding as to why he would not do all that he was doing for God within the confines of the Church of England. The transcription goes on to say:
Judge Wingate: This court would remind you, sir, that we are not here to debate the merits of the law. We are here to determine if you are, in fact, guilty of violating it.

John Bunyan: Perhaps, My lord, that is why you are here, but it is most certainly not why I am here. I am here because you compel me to be here. All I ask is to be left alone to preach and to teach as God directs me. As, however, I must be here, I cannot fail to use these circumstances to speak against what I know to be an unjust and odious edict.

Judge Wingate: Let me understand you. You are arguing that every man has a right, given him by Almighty God, to seek the Deity in his own way, even if he chooses without the benefit of the English Church?

John Bunyan: That is precisely what I am arguing, My lord. Or without benefit of any church.

Judge Wingate: Do you know what you are saying? What of Papists and Quakers? What of pagan Mohammedans? Have these the right to seek God in their own misguided way?

John Bunyan: Even these, My lord.

Judge Wingate: May I ask if you are particularly sympathetic to the views of these or other such deviant religious societies?

John Bunyan: I am not, My lord.

Judge Wingate: Yet, you affirm a God-given right to hold any alien religious doctrine that appeals to the warped minds of men?

John Bunyan: I do, My lord.

Judge Wingate: I find your views impossible of belief. And what of those who, if left to their own devices, would have no interest in things heavenly? Have they the right to be allowed to continue unmolested in their error?

John Bunyan: It is my fervent belief that they do, My lord.

Judge Wingate: And on what basis, might I ask, can you make such rash affirmations?

John Bunyan: On the basis, My lord, that a man's religious views - or lack of them - are matters between his conscience and his God, and are not the business of the Crown, the Parliament, or even, with all due respect, My lord, of this court. However much I may be in disagreement with another man's sincerely held religious beliefs, neither I nor any other may disallow his right to hold those beliefs. No man's rights in these affairs are secure if every other man's rights are not equally secure.
I do not know of anyone who could have expressed the whole idea of soul liberty in the words of man any more clearly than Bunyan stated it in 1660. Every man can seek God as he pleases. This means that we cannot force our religious faith or teaching on anyone. It means clearly that no one can be coerced into being a Baptist and believing what we believe. It means that we can do no arm-twisting, or anything of that sort, to make anyone believe what we believe. Every man has been created by God with the ability to choose to follow God or to follow some other god.
Personal accountability to God is a distinctive of our faith. It is something we believe, and out of this distinctive come other distinctives that we identify with as Baptist people.
The Priesthood of Every Believer
The priesthood of the believer means that every believer can go to God through the merit of Jesus Christ. Christ and Christ alone is the only way to God. All of us who have trusted Christ as Saviour enjoy the glorious privilege of the priesthood of the believer and can access God through the merits of our Lord and Saviour Jesus Christ.
The Bible says in I Timothy 2:1-6,
I exhort therefore, that, first of all, supplications, prayers, intercessions, and giving of thanks, be made for all men; for kings, and for all that are in authority; that we may lead a quiet and peaceable life in all godliness and honesty. For this is good and acceptable in the sight of God our Saviour; who will have all men to be saved, and to come unto the knowledge of the truth. For there is one God, and one mediator between God and men, the man Christ Jesus; who gave himself a ransom for all, to be testified in due time.
Take special note of verse five, "For there is one God, and one mediator between God and men, the man Christ Jesus."
Any man, anywhere in this world can go to God through the Lord Jesus Christ.
I Peter 2:9 says, "But ye are a chosen generation, a royal priesthood, an holy nation, a peculiar people; that ye should shew forth the praises of him who hath called you out of darkness into his marvellous light."
Christians have access to God. You can personally talk to God. You can take your needs to the Lord. Whatever your needs are, you can take those needs to the Lord. You, as an individual Christian, can go to God through the Lord Jesus Christ, your High Priest who "ever liveth to make intercession" for you (Hebrews 7:25).
We have no merit of our own. We do not accumulate merit. People may make reference to a time of meritorious service someone has rendered, but we cannot build up "good works" that get us through to God. Each day, we must come before God as needy sinners approaching Him through the finished work of Christ and Christ alone.
The Bible teaches the personal accountability of every human being to God. We cannot force our religion on anyone or make anyone a believer. We cannot force someone to be a Christian. Think of how wrong it is to take babies and allow them later in life to think they have become Christians by an act of infant baptism. Yes, they have a right to practice infant baptism, but we do not believe this is biblical because faith cannot be forced or coerced.
There are places in the world where the state is under a religion. There are places in the world where religion is under the state - the state exercises control over the faith of people. This is not taught in the Bible. Then, there are countries like ours where the church and the state operate side by side.
Throughout history, people who have identified with Baptist distinctives have stood as guardians of religious liberty. At the heart of this liberty is what we refer to as individual soul liberty. I am grateful to be a Baptist.
The Power of Influence

Where does this teaching of the priesthood of every believer and our personal accountability to God lead us? It leads us to realize the importance of the power of influence. This is the tool God has given us. I want to give you an Old Testament example to illustrate the matter of influence.
Judges 21:25 tells us, "In those days there was no king in Israel: every man did that which was right in his own eyes."
In the days of the judges, every man did what was right in his own eyes with no fixed point of reference.
God's Word continues to describe this time of judges in Ruth 1:1, "Now it came to pass in the days when the judges ruled, that there was a famine in the land." God begins to tell us about a man named Elimelech, his wife Naomi, and his sons. He brings us to the beautiful love story of Ruth and Boaz. God tells us that at the same time in which the judges ruled, when there was anarchy in the land, this beautiful love story of Boaz and Ruth took place.
This story gives us interesting insight on the responsibility of the Christian and the church. In the midst of everything that is going on, we are to share the beautiful love story of the Lord Jesus Christ and His bride. We need to tell people about the Saviour.
The same truth is found throughout the Word of God. Philippians 2:15 states, "That ye may be blameless and harmless, the sons of God, without rebuke, in the midst of a crooked and perverse nation, among whom ye shine as lights in the world. "
We are "in the midst ofa crooked and p erverse nation." This is why the Lord Jesus said in Matthew 5:16, "Let your light so shine before men, that they may see your good works, and glorify your Father which is in heaven. " Let your light shine!
The more a church becomes like the world, the less influence it is going to have in the world. Preaching ceases, and churches only have dialogue. Singing that is sacred is taken out, and the world's music comes in. All reverence is gone. What so many are attempting to do in order to build up their ministry is actually what will cause the demise of their ministry. We will never make a difference without being willing to be different. It is Christ who makes us different. Being different is not the goal. Christ is the goal, and He makes us different.
Remember, we cannot force people to become Christians or force our faith on people. It is not right to attempt to violate another man's will; he must choose of his own volition to trust Christ or reject Christ. When we understand this, then we understand the powerful tool of influence. We must live Holy Spirit-filled, godly lives and be what God wants us to be. We must be lights in a dark world as we live in the midst of a crooked generation. The only tool we have to use is influence, not force. As we separate ourselves to God and live godly lives, only then do we have a testimony.
Separation to God and from the world is not the enemy of evangelism; it is the essential of evangelism. There can be no evangelism without separation to God and from the world because we have no other tool to use. We cannot force people to believe what we believe to be the truth. They must choose of their own will. We must so identify with the Lord Jesus in His beauty, glory, and holiness that He will be lifted up, and people will come to Him.
The worse this world becomes, the more off-the-wall and ridiculous we will appear to an unbelieving world. The temptation will come again and again for us to simply cave in.
When one finds people with sincerely held Baptist distinctives, he finds those people have a passion for going in the power of the Holy Spirit, obeying Christ, and preaching the gospel to every creature. I am grateful to God to say, "I am a Baptist."
Baptists know it is because of what we find in the Bible about soul freedom, personal accountability, and the priesthood of every believer that we must use the power of Spirit-filled influence to win the lost to Christ. If we disobey Christ by conforming to the world, we lose our influence.
May the Lord help us to be unashamed to bear His reproach and be identified with our Lord Jesus Christ.
The Way of Salvation
Do you know for sure that if you died today you would go to Heaven?
1. Realize that God loves you
God loves you and has a plan for your life. "For God so loved the world, that he gave His only begotten Son, that whosoever believeth in him, should not perish but have everlasting life" (John 3:16).
2. The Bible says that all men are sinners
Our sins have separated us from God. "For all have sinned, and come short of the glory of God" (Romans 3:23). God made man in His own image. He gave man the ability to choose right from wrong. We choose to sin. Our sins separate us from God.
3. God's word also says that sin must be paid for
"For the wages of sin is death" (Romans 6:23 ). Wages means payment. The payment of our sin is death and hell, separation from God forever. If we continue in our sin, we shall die without Christ and be without God forever.
4. The good news is that Christ paid for our sins
All our sins were laid on Christ on the cross. He paid our sin debt for us. The Lord Jesus Christ died on the cross, and He arose from the dead. He is alive forevermore. "But God commendeth his love toward us, in that, while we were yet sinners, Christ died for us" (Romans 5:8).
5. We must personally pray and receive Christ by faith as our Saviour
The Bible says, "For whosoever shall call upon the name of the Lord shall be saved" (Romans 10:13 ).
Let's review for a moment
- Realize That God Loves You - John 3:16
- All Are Sinners - Romans 3:23
- Sin Must Be Paid For - Romans 6:23
- Christ paid for our sins - Romans 5:8
- We Must Personally Pray and Receive Christ as Our Saviour - Romans 10:13
- Pray and Receive Christ as Your Saviour
Lord, I know that I am a sinner. If I died today I would not go to heaven. Forgive my sins and be my Saviour. Help me live for You from this day forward. In Jesus' name, Amen.
The Bible says, "For whosoever shall call upon the name of the Lord shall be saved" (Romans 10:13 ).
~navluc-latmes
-
-
@ a4a6b584:1e05b95b
2023-07-25 23:44:42Introducing Box Chain - a decentralized logistics network enabling private, secure package delivery through Nostr identities, zero-knowledge proofs, and a dynamic driver marketplace.
Identity Verification
A cornerstone of Box Chain's functionality and security is its identity verification system. Drawing on the principles of decentralization and privacy, it eschews traditional identifiers for a unique, cryptographic solution built on the Nostr protocol.
When a new user wishes to join the Box Chain network, they begin by generating a unique cryptographic identity. This identity is derived through complex algorithms, creating a unique key pair consisting of a public and a private key. The public key serves as the user's identity within the network, available to all participants. The private key remains with the user, a secret piece of information used to sign transactions and interactions, proving their identity.
Unlike many centralized systems, Box Chain does not require any form of real-world identification for this process. This is crucial for a few reasons. First, it ensures the privacy of all participants by not requiring them to disclose sensitive personal information. Second, it allows Box Chain to operate independently of any jurisdiction, enhancing its universal applicability.
Once their identity is established, participants engage in the network, accepting and fulfilling delivery tasks. Each successful delivery, confirmed by the receiver, contributes to the participant's reputation score. This reputation score is publicly linked to their identity and serves as a measure of their reliability and performance.
A critical aspect of mitigating potential identity fraud is the stake requirement. Each participant, before they can accept a delivery task, is required to stake a certain amount of Bitcoin. This acts as a form of collateral, held in escrow until the successful completion of the delivery. The staked amount is forfeited in case of fraudulent activities or non-delivery, creating a strong financial disincentive against dishonest behavior.
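The note does not specify concrete data structures for any of this, but the interplay between reputation and the stake requirement is easy to sketch. The Python fragment below is an illustration only: the `Driver` record, the reputation formula, and `can_accept_task` are invented names, and a real implementation would derive these values from signed Nostr events and an actual Bitcoin escrow rather than in-memory objects.

```python
from dataclasses import dataclass

@dataclass
class Driver:
    """Hypothetical view of a Box Chain participant, keyed by their Nostr public key."""
    pubkey: str                  # hex- or npub-encoded Nostr public key (the identity)
    staked_sats: int = 0         # collateral currently locked in escrow
    deliveries_ok: int = 0       # successful, receiver-confirmed deliveries
    deliveries_failed: int = 0   # forfeited or disputed deliveries

    @property
    def reputation(self) -> float:
        """Naive reputation score: share of confirmed deliveries (illustrative only)."""
        total = self.deliveries_ok + self.deliveries_failed
        return self.deliveries_ok / total if total else 0.0

def can_accept_task(driver: Driver, required_stake_sats: int, min_reputation: float = 0.0) -> bool:
    """A driver may accept a task only if they have posted enough collateral
    and meet whatever reputation floor the sender asks for."""
    return driver.staked_sats >= required_stake_sats and driver.reputation >= min_reputation

# Example: a brand-new driver with no history can still take a job, but only by posting the stake.
rookie = Driver(pubkey="npub1rookie...", staked_sats=50_000)
print(can_accept_task(rookie, required_stake_sats=25_000))  # True
```

The point of the sketch is the gate: without collateral posted, no reputation score is high enough to accept a task.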
Overall, the identity verification system in Box Chain, built on unique cryptographic identities, a reputation score system, and the staking mechanism, provides a robust framework to ensure reliability and trust in a decentralized, borderless context. It significantly mitigates the risks of identity fraud, further strengthening Box Chain's promise of secure, efficient, and private package delivery.
Zero-Knowledge Proof Shipping Information
Box Chain's unique approach to preserving privacy while ensuring accurate delivery relies heavily on the concept of Zero-Knowledge Proofs (ZKPs). This cryptographic principle allows for the verification of information without revealing the information itself.
When a package is set for delivery, the sender provides the recipient's address. This address is immediately subjected to cryptographic transformation - the details are processed and turned into a form that is impossible to understand without a specific decryption key. This transformation process is designed to ensure maximum privacy - the original address remains known only to the sender and, eventually, the recipient.
In this transformed state, the address can still be verified for its validity without revealing its actual content. This is where ZKPs come into play. The sender, by leveraging ZKPs, can prove to the Box Chain system that they possess a valid address without needing to reveal the specifics of that address. This protects the privacy of the recipient by ensuring their address isn't openly visible on the system.
For the delivery process, the route is broken down into relay points, each assigned a particular driver. This relay-based system plays a significant role in preserving privacy and ensuring package safety. Instead of providing the entire route to each driver, they only receive the location of the next relay point. This is accomplished by giving them a cryptographic key which, when applied to the encrypted route data, reveals only the necessary information - the location of the next relay driver.
This way, the sender's address and recipient's final address remain as private as possible. Moreover, it enables the package to be securely handed off from one relay driver to the next, each having access to only the information necessary for their leg of the journey.
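The article attributes the next-hop-only property to zero-knowledge proofs; a simpler construction that achieves the same "each driver learns only their own handoff point" behavior is onion-style layered encryption, familiar from Tor. The sketch below uses symmetric Fernet keys from the `cryptography` package purely for brevity; the route, the key distribution, and the field names are assumptions for illustration, not part of any Box Chain specification.

```python
import json
from cryptography.fernet import Fernet  # pip install cryptography

# Hypothetical handoff points, one per leg, and one key per driver
# (a real system would encrypt to each driver's public key instead).
route = ["Relay point B", "Relay point C", "Final drop-off street"]
driver_keys = [Fernet.generate_key() for _ in route]

def wrap_route(route, keys):
    """Wrap the route from the inside out so each driver can decrypt only their own next hop."""
    blob = b""  # innermost layer: nothing beyond the final hop
    for hop, key in zip(reversed(route), reversed(keys)):
        layer = {"next_hop": hop, "inner": blob.decode()}
        blob = Fernet(key).encrypt(json.dumps(layer).encode())
    return blob

def peel_layer(blob, key):
    """What a single driver can learn: their next hop plus an opaque inner blob to pass along."""
    layer = json.loads(Fernet(key).decrypt(blob))
    return layer["next_hop"], layer["inner"].encode()

onion = wrap_route(route, driver_keys)
next_hop, inner = peel_layer(onion, driver_keys[0])
print(next_hop)  # "Relay point B" -- the rest of the route stays opaque to this driver
```

Each driver peels exactly one layer, hands the remaining blob to the next driver, and never sees the full route or the final address.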
It's important to note that this process, while extremely secure, can take a longer time due to its relay nature and the cryptographic processes involved. There may also be a need to introduce a timeout mechanism for each leg of the journey to ensure drivers complete their part in a reasonable timeframe.
By incorporating ZKPs and a relay-based delivery system, Box Chain is able to provide a level of privacy and security that is rare in the logistics world, striking a balance between efficiency and privacy.
Delivery Confirmation
One of the key aspects of ensuring the success and trustworthiness of Box Chain's decentralized package delivery service is the process of delivery confirmation. To make this process as secure, transparent, and reliable as possible, Box Chain implements a multi-signature approach that leverages the power of cryptography.
When a delivery is made, it is not enough to simply leave the package at the prescribed location. Confirmation of delivery is critical to ensure the transaction is closed and the delivery person can receive their payment. This is where the multi-signature approach comes into play.
In essence, a multi-signature approach means that more than one party needs to provide a digital signature to confirm the transaction. For Box Chain, this involves both the receiving party and the delivery person. After the delivery person leaves the package at the prescribed location, they would use their private cryptographic key to sign a confirmation of delivery. The recipient, upon receiving the package, would do the same.
These digital signatures are securely generated using each party's private key and are incredibly difficult, if not impossible, to forge. This means that when both signatures are provided, the system can be confident that the package was delivered and received successfully.
As a further layer of security and accuracy, Box Chain incorporates geolocation confirmation into its delivery confirmation process. This means that when the delivery person signs the confirmation of delivery, their geographic location is checked against the intended delivery location. If the two locations match within an acceptable margin of error, this acts as another layer of proof that the delivery was made successfully.
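A hedged sketch of this two-signature check follows. It uses Ed25519 keys from the `cryptography` package as a stand-in (Nostr identities are actually secp256k1 Schnorr keys), and the record fields, tolerance value, and function names are invented for illustration: both parties must sign the same delivery record, and the driver's reported position must fall within a small tolerance of the agreed drop-off.

```python
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Stand-in keys; real Box Chain identities would be Nostr keypairs.
driver_key = Ed25519PrivateKey.generate()
recipient_key = Ed25519PrivateKey.generate()

delivery_record = json.dumps({
    "package_id": "pkg-001",
    "dropoff": (40.7128, -74.0060),    # agreed delivery coordinates
    "reported": (40.7127, -74.0061),   # where the driver says they were when signing
}, sort_keys=True).encode()

driver_sig = driver_key.sign(delivery_record)
recipient_sig = recipient_key.sign(delivery_record)

def confirm_delivery(record: bytes, sigs_and_keys, max_offset_deg: float = 0.001) -> bool:
    """Delivery counts only if every required party signed the same record
    and the reported position is within a small tolerance of the drop-off point."""
    for sig, pubkey in sigs_and_keys:
        try:
            pubkey.verify(sig, record)
        except InvalidSignature:
            return False
    data = json.loads(record)
    (lat1, lon1), (lat2, lon2) = data["dropoff"], data["reported"]
    return abs(lat1 - lat2) <= max_offset_deg and abs(lon1 - lon2) <= max_offset_deg

print(confirm_delivery(delivery_record,
                       [(driver_sig, driver_key.public_key()),
                        (recipient_sig, recipient_key.public_key())]))  # True
```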
This process ensures that delivery confirmations are not only secure but are also indisputable. It provides a significant level of trust and reliability, both for the sender and receiver, but also for the delivery people who are looking to receive their payments.
Furthermore, it aligns with the principles of decentralization and privacy, leveraging the power of cryptography and blockchain technology to provide a secure, transparent, and effective service. This approach positions Box Chain as a pioneer in combining the gig economy with the blockchain technology, providing a unique and secure package delivery solution.
Package Security
Guaranteeing the security of packages throughout transit is fundamental to Box Chain's service offering. To accomplish this, Box Chain utilizes a unique stake system that incentivizes delivery drivers to handle packages with the utmost care.
When a sender initiates a delivery, they specify the value of the package being shipped. This value is used to determine the stake amount that the delivery person must deposit to accept the delivery task. The stake functions as a form of shipping insurance, held in escrow by the Box Chain system until successful completion of the delivery.
The stake amount set by the sender is commensurate with the value of the package, allowing the sender to decide how much insurance they deem appropriate for their shipment. Higher-value packages may necessitate a larger stake, serving to reassure the sender that the delivery person has a significant financial incentive to ensure the safe delivery of the package.
In the event that a package is lost or damaged, the stake acts as a safety net for the sender. The staked amount is forfeited by the delivery driver and is transferred to the sender as compensation for their loss.
This system not only provides reassurance to the sender, but also gives a powerful incentive for delivery drivers to handle the packages with care. By having their own funds at risk, delivery drivers are likely to go the extra mile to ensure packages are safely delivered to their destination.
Further enhancing the security of the package, Box Chain uses a multi-signature approach during the relay handoff process. The current delivery driver and the next relay driver must both provide a digital signature to confirm the handoff. This ensures that responsibility for the package is officially transferred from one party to the next in a secure and verifiable manner. This "chain" of signatures creates an auditable trail that adds another layer of security and trust to the process.
Through the combination of a stake system and multi-signature handoff confirmation, Box Chain provides a comprehensive security solution that ensures packages are not only delivered efficiently but are also safeguarded throughout their transit journey.
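To make the escrow mechanics concrete, here is a minimal state-machine sketch. The class and field names are invented, and a production system would lock the stake in an actual Bitcoin contract (for example a multisig output or an HTLC) rather than in application state; the sketch only shows who the sats flow to in each outcome.

```python
from enum import Enum, auto

class EscrowState(Enum):
    HELD = auto()       # stake locked while the package is in transit
    RELEASED = auto()   # delivery confirmed: stake returned to the driver, plus the fee
    FORFEITED = auto()  # package lost or damaged: stake paid out to the sender

class DeliveryEscrow:
    """Illustrative escrow for one leg of a Box Chain delivery."""

    def __init__(self, driver: str, sender: str, stake_sats: int, payment_sats: int):
        self.driver, self.sender = driver, sender
        self.stake_sats, self.payment_sats = stake_sats, payment_sats
        self.state = EscrowState.HELD

    def confirm_delivery(self) -> dict:
        """Both signatures received: driver gets their stake back plus the delivery fee."""
        self.state = EscrowState.RELEASED
        return {self.driver: self.stake_sats + self.payment_sats}

    def report_loss(self) -> dict:
        """Package lost or damaged: the stake is forfeited to the sender as insurance."""
        self.state = EscrowState.FORFEITED
        return {self.sender: self.stake_sats}

escrow = DeliveryEscrow(driver="npub1driver...", sender="npub1sender...",
                        stake_sats=100_000, payment_sats=10_000)
print(escrow.confirm_delivery())  # {'npub1driver...': 110000}
```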
Routing Algorithm and Pricing Mechanism
The efficiency and economic fairness of Box Chain's service hinge significantly on its innovative routing algorithm and pricing mechanism. This system is designed to ensure a smooth journey for each package, while also ensuring that delivery drivers are fairly compensated for their work.
The journey of a package begins with the sender specifying the destination and offering a starting payment for delivery. This payment information is propagated to the network, along with the package's destination, where potential delivery drivers can view it. However, unlike traditional package delivery services, the journey and price aren't fixed at the outset. Instead, Box Chain employs a dynamic, decentralized, and market-driven approach to optimize routing and pricing.
The total journey is divided into smaller legs, each of which can be undertaken by different delivery drivers. The starting payment offered by the sender is not a flat rate for the whole journey but is instead used as an initial bid for each leg of the journey. This bid is then doubled, and the network waits for a delivery driver to accept the offer. If no one accepts, the bid continues to increase in increments, creating an auction-like environment.
This system allows the real-time market dynamics to determine the cost of delivery. Factors such as distance, package weight, or even current traffic conditions can influence how much a delivery driver is willing to accept for a leg of the journey. If a leg of the journey is particularly difficult or inconvenient, the price will naturally rise until it reaches a level that a delivery driver deems worthwhile.
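A rough sketch of the escalation logic described above: the sender's starting bid is doubled once, then raised in fixed increments until some driver's reservation price is met or a ceiling is reached. The increment size, the ceiling, and the idea of modeling drivers as a list of private asks are all assumptions made for illustration; the article leaves these parameters unspecified.

```python
def run_leg_auction(starting_bid_sats: int,
                    driver_asks: list,
                    increment_sats: int = 1_000,
                    max_bid_sats: int = 1_000_000):
    """Raise the offer until at least one driver's minimum ask is met.
    driver_asks holds each available driver's private reservation price in sats."""
    bid = starting_bid_sats
    rounds = 0
    while bid <= max_bid_sats:
        takers = [ask for ask in driver_asks if ask <= bid]
        if takers:
            return bid, len(takers)      # winning price and how many drivers would take it
        # First escalation doubles the bid, later ones add a fixed increment.
        bid = bid * 2 if rounds == 0 else bid + increment_sats
        rounds += 1
    return None, 0                       # no driver willing at any acceptable price

price, takers = run_leg_auction(starting_bid_sats=2_000, driver_asks=[9_500, 6_000, 12_000])
print(price, takers)  # 6000 1 -- the bid climbed 2000 -> 4000 -> 5000 -> 6000 before one driver accepted
```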
By allowing the drivers themselves to choose which jobs to accept based on their own assessment of the work's value, Box Chain creates a fair, flexible, and dynamic market where compensation is closely tied to the effort and resources required to complete a delivery. This is akin to how the Tor network or the Lightning Network operates, but instead of data packets being routed and priced, Box Chain is doing this with physical packages in the real world.
Such a system not only incentivizes delivery drivers to participate but also ensures the service remains adaptable and resilient to changing conditions and demands. This represents a novel and intelligent use of blockchain technology to disrupt the traditional gig economy model, placing the power in the hands of the individual drivers and fostering a more equitable and efficient market.
Incentive for Participation
Box Chain's unique approach to package delivery provides a plethora of compelling incentives for individuals to participate in the network as delivery drivers. By offering the potential for higher earnings through a competitive and dynamic bidding system, Box Chain encourages active participation and healthy competition within its network.
Upon initiating a package delivery, the sender offers a starting bid for each leg of the journey. This bid is then escalated through the network, doubling with each round until a delivery driver accepts the task. This mechanism presents delivery drivers with the opportunity to earn more for their services, especially in cases where the journey is long or complex.
However, the competitive aspect of the bidding system also helps regulate the pricing. Although prices might be high initially, as more delivery drivers join the network and competition increases, the bid required to win a delivery job is likely to decrease. This dynamic balance ensures a fair market price for delivery services while also enabling delivery drivers to optimize their earnings.
Furthermore, Box Chain's reputation system, based on the Nostr protocol, provides an additional layer of incentive for delivery drivers. The reputation of each driver is tracked and publicly displayed, allowing senders to gauge the reliability and efficiency of their potential couriers. As drivers successfully complete deliveries and earn positive feedback, their reputation score increases.
This reputation score can play a pivotal role in the Box Chain economy. Drivers with higher reputation scores may demand higher bids for their services, reflecting their proven track record of successful deliveries. Alternatively, the system could grant priority in the bidding process to drivers with higher reputation scores. This not only incentivizes good performance and reliability but also helps to foster trust within the network.
Overall, Box Chain's combination of a competitive bidding system and a reputation-based incentive structure encourages active participation and ensures a high standard of service. By aligning economic incentives with high-quality service delivery, Box Chain empowers its participants while also ensuring a satisfying experience for its users.
Dispute Resolution and Failed Deliveries
For a decentralized network like Box Chain, dispute resolution and handling of failed deliveries present unique challenges. To address these in alignment with its principles, Box Chain implements an arbitration-based system and a dual-stake mechanism.
In case deliveries fail due to recipients being unavailable or refusing, both the sender and recipient are required to stake funds as collateral. If the delivery cannot be completed, the stakes are forfeited and awarded to the delivery driver as compensation for their time and effort. This creates a disincentive for recipients failing to receive packages and ensures drivers are paid for their work.
For disputes between senders, recipients and drivers, Box Chain leverages a decentralized arbitration system built on the Nostr protocol. Independent arbitrators stake their own funds and adjudicate disputes based on review of evidence from both parties. Their incentive is a small percentage of the transaction amount.
The arbitration process involves submission of dispute details, review of evidence, ruling based on policies, and appeals handled by additional arbitrators if required. A majority ruling wins the appeal. This system, relying on staked incentives and the wisdom of the crowd, enables fair dispute resolution aligned with Box Chain's ethos.
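The majority-ruling step itself is simple to sketch; the staking, evidence submission, and arbitrator selection around it are omitted here, and the tie-handling rule is an assumption rather than anything the proposal specifies.

```python
from collections import Counter

def resolve_dispute(rulings: dict) -> str:
    """rulings maps an arbitrator's pubkey to the party they found in favor of,
    e.g. 'sender', 'recipient', or 'driver'. The majority verdict wins; a tie
    escalates to an appeal round with additional arbitrators."""
    tally = Counter(rulings.values())
    (top_party, top_votes), *rest = tally.most_common()
    if rest and rest[0][1] == top_votes:
        return "escalate"  # tied: another round with more arbitrators
    return top_party

print(resolve_dispute({
    "npub1arb1...": "driver",
    "npub1arb2...": "driver",
    "npub1arb3...": "sender",
}))  # driver
```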
By combining reciprocal stakes and decentralized arbitration, Box Chain is able to provide robust recourse around failed deliveries and disputes while retaining its principles of decentralization, privacy, and aligned incentives. These mechanisms strengthen the system and instill further user trust and satisfaction.
Bitcoin + Nostr = <3
The combination of Bitcoin and Nostr represents a powerful and synergistic integration of decentralized technologies. Bitcoin provides a secure, transparent and decentralized means of value transfer, while Nostr offers a decentralized protocol for creating and managing digital identities and data. Together, they can enable truly decentralized and privacy-preserving applications, like Box Chain, that have the potential to disrupt traditional business models and empower individuals around the world. The future looks promising with such advanced and transformative technologies working hand in hand.
Read more from Adam Malin at habitus.blog.
Adam Malin
You can find me on Twitter or on Nostr at
npub15jnttpymeytm80hatjqcvhhqhzrhx6gxp8pq0wn93rhnu8s9h9dsha32lx
value4value Did you find any value from this article? Click here to send me a tip!
-
@ 78733875:4eb851f2
2023-07-14 22:25:21"The computer can be used as a tool to liberate and protect people, rather than to control them," as Hal Finney wrote so presciently 30 years ago.[^fn-hal]
The goal of OpenSats is to help build the tools that Hal alluded to. Tools that liberate and protect, rather than systems that control and oppress. Many tools still have to be built. Many tools still need to be improved. However, "the universe smiles on encryption," as Assange so aptly put it.[^fn-assange]
We believe that freedom tech is what carries this smile forward, which is why we are delighted to announce grants for over a dozen projects in the bitcoin & lightning ecosystem.
[^fn-hal]: Hal Finney: Why remailers... (November 1992)
[^fn-assange]: Julian Assange: A Call to Cryptographic Arms (October 2012)
The following open-source projects were selected by the OpenSats board for funding:
- Payjoin Dev Kit
- Bolt12 for LND
- Splicing
- Raspiblitz
- Labelbase
- BTCPay Server
- ZeroSync
- Mutiny Wallet
- next-auth Lightning Provider
- Cashu
- lnproxy
- Blixt Wallet
Let's take a closer look at each to understand their goal and how it aligns with the OpenSats mission.
Payjoin Dev Kit
Payjoin brings privacy to bitcoin without changing the way you're used to using it. Payjoin transactions look no different from normal activity on-chain, so they boost everyone's privacy, even those who don't payjoin, and foil chain surveillance.
Payjoin is easy to integrate and falls back to working defaults where it isn't supported, but it can only take off when senders and receivers include standard payjoin support in their software. Payjoin Dev Kit makes it easy for wallet developers to integrate BIP 78 standard payjoins everywhere, having working reference integrations for Bitcoin Core, LND, and BDK.
Repository: github.com/payjoin
License: MIT
Bolt12 for LND
Bolt12 brings a new invoice format, enabling static invoices (offers) as well as recurring payments. It adds support to receive payments in a lightning-native way without using a web server. It also uses Blinded Paths to disguise the destination of a node both when fetching the invoice and when paying. This improves privacy and, therefore, security for the receiver of the payment.
Consequently, Bolt12 makes it much easier to receive and send payments without any third-party infrastructure in a native-lightning way. Static invoices make donations and recurring payments much easier.
Repository: lightningnetwork/lnd
License: MIT
Splicing
Splicing is the ability to resize Lightning channels on-the-fly, giving users of the Lightning Network many additional benefits that were not intuitively obvious at first. Splicing scales Lightning by removing a fundamental limitation. Removing this limitation increases fungibility and lowers blockspace usage, an important step towards maturing the Lightning network and enabling the onboarding of millions, and ultimately billions, of people.
Repository: ddustin/splice
License: BSD-MIT
Raspiblitz
Raspiblitz is a do-it-yourself node stack that allows you to run a Lightning Node together with a Bitcoin Core full node on your Raspberry Pi. While the Raspberry Pi is the most common hardware running this particular software, it was developed to support multiple hardware platforms and can run on bare metal servers too.
The open-source project was started in 2018 as part of a Lightning hackathon in the German Bitcoin space. Since then, it has grown to over 150 contributors and 2000 stars on GitHub. The software integrates dozens of services and tools via its plugin system and sports advanced features like touchscreen support, channel autopilot, backup systems, DynDNS, SSH tunneling, and more.
Repository: raspiblitz/raspiblitz
License: MIT
Labelbase
Labelbase is a label management service for Bitcoin transactions and addresses. It provides features for adding labels, importing and exporting labels, and offers a public API for integration with wallets and existing workflows.
Labelbase supports BIP-329, a format for unifying label data. The goal of the project is to offer a convenient solution for managing labels associated with Bitcoin transactions and addresses across wallets and other tools. By providing a unified label management interface, Labelbase enhances the user experience, improves privacy, and promotes better organization and understanding of Bitcoin transactions.
Repository: Labelbase/Labelbase
License: MIT
BTCPay Server
BTCPay Server is a free, open-source & self-hosted bitcoin payment gateway that allows self-sovereign individuals and businesses to accept bitcoin payments online or in person without added fees.
At its core, BTCPay Server is an automated invoicing system. Merchants can integrate the software with their website or shop, so customers are presented with an invoice upon checkout. The status of the invoice will update according to settlement, so merchants can fulfill the order at the appropriate time. The software also takes care of payment refunding and bitcoin management alongside many other features.
Repository: btcpayserver/btcpayserver
License: MIT
ZeroSync
While ZeroSync is still at an early stage, its promise is to allow verification of Bitcoin's chain state in an instant. It offers compact cryptographic proofs to validate the entire history of transactions and everyone's current balances.
The first application is to "zerosync" Bitcoin Core in pruned mode. The long-term vision for ZeroSync is to become a toolbox for custom Bitcoin proofs.
Repository: zerosync/zerosync
License: MIT
Mutiny Wallet
Mutiny Wallet is a web-first wallet capable of running anywhere, providing instant onboarding and platform censorship resistance. It is self-custodial, privacy-focused, user-friendly, and open-sourced under the MIT license.
The wallet has a strong focus on privacy, scalability, and accessibility. In addition to features that you would expect a regular lightning wallet to have, the team is working to incorporate Nostr-related features into the wallet, such as a feed of friends' Zaps, native Zap sending and receiving, a lightning subscription specification for services such as nostr relays, and a P2P DLC marketplace. The team's goal is to provide users with a seamless experience, combining the power of Bitcoin and Lightning with social media in a way that matches the Bitcoin ethos.
Repository: MutinyWallet
License: MIT
next-auth Lightning Provider
The goal of this project is to implement an authentication provider for next-auth, an authentication provider for the popular open-source framework NextJS. The next-auth framework has nearly 500k weekly downloads and powers the authentication of many modern web, mobile, and desktop apps. Having a plug-and-play Provider for Lightning makes integration easier and more attractive for developers.
Repository: jowo-io/next-auth-lightning-provider
License: ISC
Cashu
Cashu is a Chaumian ecash system built for bitcoin that brings near-perfect privacy for users of custodial bitcoin applications. A Cashu ecash mint does not know who you are, what your balance is, or who you're transacting with. Users of a mint can exchange ecash privately, without anyone being able to know who the involved parties are.
Payments are executed without anyone able to censor specific users. There are multiple implementations of the Cashu protocol. Popular open-source wallets are Cashu Nutshell, Cashu.me, and Nutstash.
Repository: cashubtc/cashu
License: MIT
lnproxy
lnproxy is a simple privacy tool that empowers users of custodial Lightning wallets with better payment destination privacy and sovereign node runners with enhanced receiver privacy. lnproxy works like a "poor man's" rendezvous router, providing privacy for users without taking custody of their funds. The project encompasses an LNURL-style protocol specification and a collection of open-source implementations of lnproxy clients and a relay.
Repository: lnproxy/lnproxy
License: GPL 3.0 & MIT
Blixt Wallet
Blixt is a non-custodial wallet for bitcoiners who want to give Lightning a try. It runs on Android, iOS, and macOS. It is easy to use and straightforward to set up, making it a user-friendly option to get started with Lightning.
Blixt uses LND and Neutrino under the hood, directly on the phone, respecting your privacy. The wallet does not use any centralized servers for doing transactions. Channels are opened automatically on the user's behalf, making it easy to get up and running on Lightning.
Repository: hsjoberg/blixt-wallet
License: MIT
In addition to the software projects listed above, three educational initiatives were selected for funding:
- Bitcoin Education in Nigeria is an initiative started and led by Apata Johnson. Apata's project aims to educate youths on bitcoin and the opportunities it brings for the people living in the rural areas of Nigeria.
- 21 Ideas is a project that aims to bring quality Bitcoin education to Russian citizens. Tony and others have been working for many years on translations, original material, and hands-on tutorials for beginners. We believe that education is paramount to proper Bitcoin use, and localization is paramount for everyday citizens to properly grasp the importance as well as the novel concepts of bitcoin.
- CoreDev.tech is organizing recurring developer events, which are all about bringing devs together so that they can hack on Bitcoin Core and related software.
We received hundreds of applications in the last couple of months, which is a fantastic signal and something we are delighted about. Some applications are still being reviewed by the OpenSats board, as we try our best to assess feasibility, alignment, and potential impact of each project. We will announce additional grants as applications pass our grant selection process.
Unfortunately, we were unable to fund all of the proposals that were sent to us. Please don't hesitate to apply again in case your application was rejected this time around. The applicant pool was very competitive, which is a great thing to see in and of itself.
Grants for the projects above are funded by contributions to the Bitcoin General Fund. Our operations as well as our grant programs are made possible by generous donors like you. If you want to help fund the Bitcoin ecosystem, please donate to the Bitcoin General Fund.
Our team is screening applications constantly, and we will announce new grants and funding opportunities as they arise. If you are working on an open-source project in and around bitcoin, and you think your work is aligned with the OpenSats mission, please apply for funding.
-
@ 3bf0c63f:aefa459d
2007-05-16 00:00:00Danilo woke up early
Danilo woke up early and left to catch the subway, wearing those clothes his friends called his "communist outfit": an old pair of beige denim pants, a worn white shirt with a red logo - which had nothing to do with communism - under a shabby blue blazer, and flip-flops. His clothes all looked alike, and combined with his scruffy brown beard and the sunken gaze typical of heavy drinkers, they really did give him the notorious look of a communist.
When the subway stopped at the station, Danilo got on with his backpack. There were no free seats, but he was used to that; in fact, he even liked standing, the better to feel on his face the wind that came only through the car's upper windows. He set his backpack on the floor and held on to one of the car's iron bars. His hair, though short, swayed in bold, banner-like movements, as if dancing to the sound of "One of These Days".
-
@ 78733875:4eb851f2
2023-07-07 22:06:45The mission of OpenSats is to support and maintain a sustainable ecosystem of funding for free and open-source projects that help Bitcoin flourish. Nostr is such a project, which is why OpenSats introduced The Nostr Fund and built a team around the protocol's originator to help fund the growing nostr ecosystem. As an open, interoperable, and censorship-resistant protocol, nostr has the chance of doing social-native networking right.
After weeks of sorting through applications, we are excited to announce the first round of grants from The Nostr Fund. OpenSats is proud to support over a dozen projects, from clients to relay implementations to adjacent tools and design efforts.
In no particular order, here they are:
- NDK by @pablof7z
- Habla by @verbiricha
- Coracle by @hodlbod
- Iris by @mmalmi
- Damus by @jb55
- rust-nostr & nostr-sdk by @yukibtc
- Nostr Relay NestJS by @CodyTseng
- Soapbox by @alexgleason
- Code Collaboration over Nostr by @DanConwayDev
- Satellite by @lovvtide
- Amethyst by @vitorpamplona
- Pinstr by @sepehr-safari
- nostr.build by @nostr.build
- Gossip by @mikedilger
- Nostr SDK iOS by @bryanmontz
- Nostr Design by @karnage
The projects above have received grants of various durations and sizes, and we have more nostr-related applications in the pipeline. Donate to The Nostr Fund if you want to help fund the nostr ecosystem.
Without further ado, let's take a closer look at each project in turn.
NDK
NDK is a nostr development kit that makes the experience of building Nostr-related applications—whether they are relays, clients, or anything in between—better, more reliable, and overall more enjoyable to work with than existing solutions. The core goal of NDK is to improve the decentralization of Nostr via intelligent conventions and data discovery features without depending on any one central point of coordination, such as large relays or centralized search providers.
Repository: nostr-dev-kit/ndk
License: MIT
Habla
Habla is a website for reading, writing, curating, and monetizing long-form content on nostr. It uses NIP-23 to allow markdown-formatted articles and embedded nostr content such as notes, profiles, lists, relays, badges, and more. The goal of Habla is to give everyone an alternative to centralized publishing platforms such as Medium or Substack, which are by their very nature prone to censorship and deplatforming.
Repository: verbiricha/habla.news
License: GNU GPL v3.0
Coracle
Coracle is a nostr web client focusing on user experience, performance, and scaling of the nostr network beyond the "twitter clone" use-case. The end goal is to build marketplaces, groups, chat, and more on top of an emergent web of trust. Coracle is already one of the most mature and accessible clients for new users while also providing some novel features for more advanced nostriches.
Repository: coracle-social/coracle
License: MIT
Iris
Iris is a multi-platform nostr client that is available for web, mobile, and desktop. Iris' design goals are speed, reliability, and ease of use. The client features public as well as private messaging, customizable feeds, an offline mode, and speedy account creation.
Repository: irislib/iris-messenger
License: MIT
Damus
Damus is a cutting-edge nostr client for iOS. The goal of Damus is to integrate bitcoin with social media and to show the power, censorship resistance, and scalability of nostr in general. Damus includes picture and video uploading, is fully translated into 24 languages, supports automatic translation of notes, and includes all of the features you would expect from a Twitter-like client.
Repository: damus-io/damus
License: GNU GPL v3.0
rust-nostr & nostr-sdk
Rust-nostr is a Rust implementation of the nostr protocol. It is a high-level client library with the explicit goal to help developers build nostr apps for desktop, web, and mobile that are both fast and secure. Rust crates can be easily embedded inside other development environments like Swift, Kotlin, Python, and JavaScript, making rust-nostr a versatile base to build upon. While the project is in the early stages of development, over 35 NIPs are already supported, with more to come.
Repository: rust-nostr/nostr
License: MIT
Nostr Relay NestJS
Nostr-relay-nestjs is a Nostr relay with a clear structure that is easy to customize to your needs. This relay implementation is based on the NestJS framework and focuses on reliability and high test coverage.
Repository: CodyTseng/nostr-relay-nestjs
License: MIT
Soapbox
Soapbox started out as an alternative to Mastodon but has grown to encompass ActivityPub and nostr while being interoperable with both. In February 2023, the team launched the "Mostr" bridge, seamlessly connecting nostr to the ActivityPub Fediverse and enabling bidirectional communication between both protocols. This bridge exposes over 9.4M potential users in nostr's target audience to nostr, many of whom have already left the Fediverse completely in favor of nostr.
Repository: gitlab.com/soapbox-pub
License: GNU Affero General Public License v3.0
Code Collaboration over Nostr
This project is a proof-of-concept for a much-needed, often discussed, and permissionless, nostr-based GitHub alternative. The goal is to replace the traditional interactions using a centralized server or service with a nostr-based alternative centered around nostr events. Commits, branches, pull requests, and other actions are all modeled as nostr events, with permissions managed through groups so that multiple maintainers can manage a repository. This model reduces the barriers for clients to support repository collaboration and allows for interoperability between repository management tools.
Repository: DanConwayDev/ngit-cli
License: MIT
Satellite
satellite.earth is a web client for nostr that has a community focus and presents conversations as threaded comments, borrowing from the traditional Reddit interface.
Repository: lovvtide/satellite-web
License: MIT
Amethyst
Amethyst is one of the most popular nostr clients for Android. Amethyst comes with expected features such as account management, feeds, profiles, and direct messages. Amethyst also offers native image uploads, public chat groups, link previews, one-tap zaps, public and private bookmarks, as well as the ability to follow hashtags, and other novel features. You can install releases of Amethyst via F-Droid or Google Play.
Repository: vitorpamplona/amethyst
License: MIT
Pinstr
Pinstr allows users to easily organize and discover new ideas by creating public boards of pins. Users can star, comment, and zap other users' boards. Users can find curated boards of other users and create boards themselves. Default boards include users' bookmarked content, among other lists.
Repository: sepehr-safari/pinstr
License: MIT
nostr.build
Nostr.build is a free-to-use media hosting service that allows users to upload images, gifs, videos, and audio files to share them as nostr events. The team recently released their code under an MIT License so that anyone might use the software to offer a similar service.
Repository: nostrbuild/nostr.build
License: MIT
Gossip
Gossip is a fast and stable desktop nostr client focused on the Twitter-like micro-blogging aspect of nostr. Gossip follows people by downloading their events from whichever relays they post to (rather than relays you configure) and was the impetus for NIP-65. It does not use complex web technologies such as JavaScript or HTML rendering and stores your private key only in an encrypted format. Consequently, Gossip is considered more secure than other clients by some. The client is packaged and released for Linux, Windows, and MacOS.
Repository: mikedilger/gossip
License: MIT
Nostr SDK iOS
The nostr SDK for iOS is a native Swift library that will enable developers to quickly and easily build nostr-based apps for Apple devices. The library plans to implement all approved NIPs and will follow Apple's API patterns, so that iOS developers feel comfortable using it from the start. The SDK aims to be simple in its public interface, abstracting away as much complexity as possible so that developers can focus on what makes their specific application unique.
Repository: nostr-sdk/nostr-sdk-ios
License: MIT
Nostr Design
Nostr Design will be a comprehensive resource for designers and developers to build successful nostr products. Nostr introduces several new concepts that most people are not familiar with. Given its nature, the protocol presents some unique design challenges for developers and users alike. The Nostr Design efforts are led by Karnage, who has done stellar product design work around nostr in the past. We believe that this project has the potential to impact the entire nostr space, as it can act as a go-to source for developing quality products, addressing user needs, as well as providing concrete examples and building blocks for product designers and developers alike.
License: Public Domain, Creative Commons
We have received hundreds of applications in the last couple of weeks, many related to or exclusively focused on nostr. Most projects that applied focus on bitcoin and lightning. We will announce another wave of grants for these soon.
To all the nostr projects that applied and didn't make the cut this time around: don't be discouraged. Please apply for funding again in the future. We will announce new grants and funding opportunities quarterly, and there is always the possibility of being listed on the OpenSats website to receive pass-through donations for your project.
We are excited to support the projects above in building the tools we bitcoiners care so deeply about. The future is bright; we just have a lot of building to do.
-
@ 97c70a44:ad98e322
2023-07-25 18:30:24Recently David King had me on his NostrTalks podcast, and one of the things we talked about was how custom feeds might work. I was reeling from David challenging my assumptions about relay centralization, so I didn't do a good job of enumerating what our different options for custom feeds are, and how they differ.
Data Vending Machines
Pablo followed up here, asking my opinion on custom feeds in general, and his Data Vending Machines proposal in particular. I thought I'd write up my response in long form for posterity.
I like DVMs, although I ignored it for a while because I figured that Pablo was bouncing off the walls as usual (congrats on placing in the AI competition!) and I'd wait till he settled down. I read a bit of his nip this morning and the custom feeds use case caught my eye as an interesting application of the DVM idea.
Algorithm options
I don't know if vending machines are a good solution to custom feeds, but I'm open to being convinced. Below are a few different approaches we could take to the problem:
- Built into clients — closed or open source, not swappable except maybe using NIP 89
- Built into relays — swappable, composable, monetizable, allows for moderation/publishing models, can support "private" algorithms. Also highly consistent, you're not going to get partial data because events weren't where you were looking.
- Published by pubkeys — not parameterizable. We saw how this went with @Sherry's recommendations bot. Great idea, but lots of limitations.
- List- or label-based curations — lightweight, not parameterizable, but very clean, could support private use case via wrapping
- DVMs — parameterizable, monetizable, but very heavy, probably not suitable for general purpose custom feeds due to latency
- Custom algorithms published as NostrScript events — swappable, pluggable, very flexible, content-addressable.
I think all those options have their own benefits and trade-offs for different use cases. Relay-based feeds are going to be good for low-latency, dynamically-built feeds. Pubkeys are good for publishing-type use cases. NostrScript is good for shareable user-defined feeds (and other things).
I don't think DVMs make sense for "feeds" as we think of them, but could be good for generating more bespoke artifacts, like a "top highlights for this week" email/feed, or something that interleaves nostr events with some external data source.
My favorite of the bunch is lists and/or labels, which can be arbitrarily nested and used by clients/scripts/relays to build more sophisticated feeds (for example, by incorporating distributed moderation or web-of-trust-based recommendations subjective to the target user).
This has actually always been a core part of my vision for nostr and coracle since even before I found nostr. Unlike Amethyst's kind-1984 based content moderation approach, lists and labels can be interpreted using user-defined terminology.
In other words, the definition of "good music" depends on who you ask. Allowing the answer to be generated not by a single pubkey or relay, but by your entire network seems to best embrace the emergent behavior nostr enables. These content recommendations can be weighted as much as desired based on other social data including reactions, replies, mutual follows, how many lists someone appears on, etc.
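To make that concrete, here is a hedged sketch of how a client might score a note by aggregating list and label endorsements across the user's network, weighting mutual follows most heavily. The weights and the flat `(pubkey, note_id)` endorsement format are assumptions for illustration; a real client would extract this data from NIP-51 lists and NIP-32 labels fetched from relays.

```python
def score_note(note_id: str, endorsements, follows: set, mutual_follows: set) -> float:
    """endorsements is an iterable of (pubkey, note_id) pairs extracted from the
    lists/labels the client has already fetched. Endorsers you follow count more,
    mutual follows count the most, and strangers add a little background signal."""
    score = 0.0
    for pubkey, endorsed_id in endorsements:
        if endorsed_id != note_id:
            continue
        if pubkey in mutual_follows:
            score += 3.0
        elif pubkey in follows:
            score += 1.5
        else:
            score += 0.25
    return score

endorsements = [("alice", "note-abc"), ("bob", "note-abc"), ("mallory", "note-abc")]
print(score_note("note-abc", endorsements,
                 follows={"alice", "bob"}, mutual_follows={"alice"}))  # 4.75
```

The interesting design question is how much of this weighting lives in the client, the relay, or a shared script, which is exactly the trade-off the options above are wrestling with.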
Of course, this more analysis-heavy approach means that feeds are not defined only by lists and labels, but by an opinionated interpretation layered on top of those, which might best be provided by a relay, DVM, or client.
Do we even want feeds?
While we're on the topic, it's also worth asking if "feeds" are really what we want. The idea is very closely tied to the infinite-scroll pattern common on legacy social media, and the name evokes the process of fattening livestock for the kill. In some ways, I think infinite scroll is with us for better or for worse - pagination buttons at the bottom of a page are no solution; that would just be an inferior version of the same thing.
Perhaps, taking a leaf out of Twitter's book, brevity is the answer. Being forced to fit our thoughts into 280 characters helps us write more clearly and with more interest, what if "feeds" had the same constraints? Having no page bottom incentivises a never-ending list of diffuse non-information.
This is one reason I love email newsletters - at best they are carefully researched, edited, scheduled, and sent to your inbox. The email inbox is a hostile place, missives that land there are likely to suffer a swift demise. If an email is to survive, it must be something the recipient wants to read.
In a way, The Nostr Report is itself a custom feed in this sense, published via a shared pubkey. It would also be possible to write a "client" that links your pubkey and email, and sends you a daily/weekly digest, tailored to your interests and social graph.
Taking this idea a step further and combining it with list and label based feeds, instead of creating lists based on content, you might create a list based on timeframe. Then, in order to generate a digest for that day or week, you would pull all time-based curations from your network, analyze them for common threads, and re-publish as an email, note, or list.
At any rate, I think this is an exciting area of research, and that we've only scratched the surface of what's possible with an open social graph. I'm glad to see Pablo creating an entirely new category of solution with his proposal.
-
@ 78733875:4eb851f2
2023-07-07 22:04:12OpenSats is pleased to announce a new long-term support (LTS) program for Bitcoin Core developers and similar Load-Bearing Internet People.[^fn-lbip] This grant program is designed to provide financial support for developers who are working on critical infrastructure for the bitcoin network.
The LTS program is a new initiative from OpenSats and is distinct from our regular grant program, which is more expansive in scope. It is also distinct from OpenSats' website listings, which allow reviewed open-source projects to receive tax-deductible donations via OpenSats. The LTS program is specifically designed to provide long-term support for developers who are working on critical open-source infrastructure in and around bitcoin.
Having a longer time horizon than regular grants, the LTS program is geared towards long-term stability for grantees, with a minimum grant duration of 12 months and possible grant durations of two years or longer. This will allow developers to focus on their work without having to worry about financial constraints.
To be eligible for the LTS program, applicants must:
- have a track record of quality contributions
- be mission-driven and self-motivated
- be able to work in public
- be bitcoin-only
Applications for the LTS program are now open: https://opensats.org/apply/
The first recipient of an OpenSats LTS Grant is Marco Falke, a long-term maintainer and contributor of Bitcoin Core with thousands of contributions over many years. Marco will continue to focus on testing and quality assurance, as well as maintenance and review, helping to make sure that the Bitcoin Core software is as solid as it can be. You can read more about his contributions here.
We appreciate all the hard work that goes into building and maintaining critical open-source infrastructure. It is a hard and often thankless job. We hope that we can play a role in closing the gaps in bitcoin open-source funding, and we look forward to working with contributors in the future.
OpenSats aims to be an additional pillar of the increasingly solid funding landscape in and around bitcoin. We have learned a lot from the programs of the past and aim to join Brink, Spiral, Chaincode, HRF, and other successful grant programs to support those who build the tools that ensure the protection of individual liberties in our digital world.
We are committed to supporting the development of bitcoin. The LTS program is a new way for OpenSats to support long-term contributors who are building, maintaining, testing, researching, and reviewing critical software.
We encourage all qualified developers to apply for the LTS program. Together, we can build a stronger and more resilient bitcoin network.
[^fn-lbip]: "An LBIP is a person who maintains the software for a critical Internet service or library, and has to do it without organizational support or a budget backing him up." —Eric S. Raymond
-
@ dafdf2c1:cd561387
2023-07-31 11:10:07The digital landscape is ever-evolving, and at the forefront of this dynamic realm stands Yakihonne, a platform that continuously strives to amaze and delight its users. In this comprehensive review, we delve into the latest updates that have set the internet abuzz with excitement. Brace yourself as we explore Yakihonne's innovative additions, designed to revolutionize user engagement and make interactions a delightful experience.
Table of Contents
1. NIP-25 Support: Empowering Users with Voting!
2. Zap Stats: Showcasing Interactions on Profiles!
3. "Login with an Extension" Button: Now More Visible!
4. Enhanced Search: Making Discovery a Breeze!
5. Discover with Tags: Unleashing Creativity!
6. Conclusion
1. NIP-25 Support: Empowering Users with Voting!
Yakihonne's commitment to democracy shines brightly through the implementation of NIP-25 support. Users now have the power to actively participate in shaping the platform's content by both upvoting and downvoting. This revolutionary feature amplifies user voices, giving them the ability to express their preferences and celebrate the content they cherish the most.
2. Zap Stats: Showcasing Interactions on Profiles!
Zaps have become an integral part of the Yakihonne experience, and now, users can proudly display their interactions through Zap stats on their profiles. Witnessing the number of Zaps sent and received adds a personal touch to profiles, serving as a badge of honor that celebrates their contributions within the vibrant Yakihonne community.
3. "Login with an Extension" Button: Now More Visible!
Accessibility is a cornerstone of Yakihonne's ethos, and the team has made a thoughtful improvement by making the "login with an extension" button more visible. No longer lurking in the shadows, this button gracefully appears grayed out, ensuring users can explore seamless login options with ease.
4. Enhanced Search: Making Discovery a Breeze!
Discovering captivating content on Yakihonne has been taken to new heights with enhanced search functionality. The search list is now optimized for efficiency, guiding users effortlessly to the content they seek while also providing relevant and tailored content that sparks curiosity.
5. Discover with Tags: Unleashing Creativity!
Navigation through Yakihonne's diverse content is now a breeze with the introduction of tags. Users can click on tags within articles to explore related content under the same category. This intuitive feature opens up new avenues of exploration, making Yakihonne an endless reservoir of inspiration.
Conclusion: A Bright Future for Yakihonne!
In this exhilarating journey through Yakihonne's latest updates, we have witnessed the platform's unwavering commitment to empowering its users. The introduction of NIP-25 support, Zap stats on profiles, and improved search functionality have revolutionized the way we interact and explore within this creative haven.
Moreover, the visible "login with an extension" button and the innovative tag system demonstrate Yakihonne's dedication to accessibility and seamless navigation. The team's thoughtful implementation of these features has transformed our experience, making Yakihonne a cherished destination for creative souls and passionate creators alike.
As we bid adieu to this review, we are filled with excitement for the limitless potential that lies ahead for Yakihonne. With its constant pursuit of excellence and commitment to fostering a vibrant community, we eagerly await the next wave of innovations that will shape our artistic journey.
In the spirit of boundless creativity and artistic wonder, let us celebrate the bright future of Yakihonne, where every interaction is a masterpiece waiting to unfold.
Thank you, Yakihonne, for creating a platform where imagination knows no bounds!
With enthusiasm and admiration.
-
@ 138e0951:6a1d8320
2023-07-31 09:58:35originality 0% .... testing nostr
-
@ f6a5c82f:3aa6f2a0
2023-07-31 11:08:34Introduction Welcome, Yakihonne enthusiasts! Today, we embark on an exciting exploration of the latest updates that have breathed new life into our favorite platform. Yakihonne has always been a community that values user feedback, and these enhancements are a testament to the team's commitment to delivering a remarkable user experience. Let's dive into the world of Yakihonne's cutting-edge updates and uncover the magic that awaits us!
Table of Contents:
1. Revamped Comment Section: Unleashing the Power of Interaction
2. NIP-25 Support: Empowering User Opinions with Seamless Voting
3. Zap Stats on Profiles: Celebrating Interactions, One Zap at a Time
4. Enhanced Search Functionality: Your Gateway to a World of Content
5. Conclusion
1. Revamped Comment Section: Unleashing the Power of Interaction
The heart of any thriving community lies in its ability to communicate and interact. With the newly re-implemented comment section, Yakihonne has taken a leap forward in fostering engaging conversations. Notably, users now have the privilege to delete comments, offering greater control over discussions and promoting a friendlier atmosphere.
Hey, look here, I just made a comment 😊
I just deleted my comments, like nothing happened 😂
2. NIP-25 Support: Empowering User Opinions with Seamless Voting
The power of expressing our opinions just got a major boost with NIP-25 support! Upvoting and downvoting posts and comments has become a seamless affair, allowing us to actively shape the content we cherish. Witnessing the Yakihonne community embrace this empowering feature has been truly awe-inspiring.
Note: I noticed that users can upvote multiple times, and I think that's a bug. Last but not least, users can't upvote and downvote simultaneously, which is okay.
3. Zap Stats on Profiles: Celebrating Interactions, One Zap at a Time
Our profiles have become a canvas of interaction and impact with the addition of Zap stats! Now, we can proudly display the Zaps we've sent and received, a delightful way to celebrate our contributions within the Yakihonne ecosystem. These stats lend a personal touch to our profiles, strengthening our bond with fellow users.
I was going to calculate how much I have made from Yakihonne, but thanks to this new feature there won't be any need for that 😊
4. Enhanced Search Functionality: Your Gateway to a World of Content
The search experience on Yakihonne has reached new heights of efficiency and convenience. Say goodbye to tedious searches, as the team has optimized the search list, allowing us to find what we seek with remarkable ease. Moreover, the adjustment of user search results to the NIP-21 URI scheme ensures relevant and tailored content that sparks curiosity.
Goodbye to tedious searches
Conclusion
Yakihonne's latest updates have left me astounded and deeply appreciative of the team's commitment to creating a user-centric platform. Each enhancement has brought me closer to the community, empowering me to share, engage, and explore in ways that touch my heart.
As we embrace these cutting-edge features, let's celebrate the essence of Yakihonne - a platform that cherishes user interactions, encourages diverse opinions, and continually strives to make our journey exceptional.
Thank you, Yakihonne team, for ushering in this exciting era of growth and innovation. Here's to an extraordinary future filled with countless memorable moments on Yakihonne!
Happy Yakihonne-ing!
-
@ a012dc82:6458a70d
2023-07-31 04:04:52Table of Contents
- The Rise of River: A Game-Changer in the Bitcoin Space
- Factors Influencing Thiel's Decision
- Implications of Thiel's Investment
- Conclusion
- FAQ
Peter Thiel, the renowned venture capitalist and co-founder of PayPal, has recently made headlines with his bold move of investing $35 million in a Bitcoin startup called River. Thiel's investment in the cryptocurrency industry has sparked curiosity and debate among investors and industry experts. In this article, we will explore the reasons behind Thiel's decision to back River and delve into the potential implications of this move for the Bitcoin ecosystem.
Peter Thiel's Bold Move
The Rise of River: A Game-Changer in the Bitcoin Space
Understanding River's Mission
River is a Bitcoin startup founded in 2019 with a mission to make Bitcoin more accessible and user-friendly. The company aims to bridge the gap between traditional finance and the world of cryptocurrencies by providing institutional-grade Bitcoin financial services. River offers a suite of tools and services designed to simplify the process of buying, selling, and storing Bitcoin for both individual and institutional investors.
Thiel's Confidence in River's Vision
Peter Thiel's investment in River is a testament to his confidence in the startup's vision and potential. Thiel has a track record of backing disruptive and innovative companies, and his support for River indicates his belief in the transformative power of Bitcoin. By investing a substantial amount of capital, Thiel is betting on River's ability to drive mainstream adoption of Bitcoin and revolutionize the way people interact with the digital currency.
Factors Influencing Thiel's Decision
Growing Institutional Interest in Bitcoin
One of the key factors influencing Thiel's decision to invest in River is the increasing institutional interest in Bitcoin. In recent years, major financial institutions, including banks and asset management firms, have started to embrace cryptocurrencies and recognize their potential as an investment asset. Thiel, known for his keen insights into emerging trends, likely sees the institutional adoption of Bitcoin as a significant catalyst for its future growth.
Addressing Bitcoin's Accessibility Challenges
Bitcoin's mainstream adoption has been hindered by various challenges, including its complexity and lack of user-friendly infrastructure. River aims to tackle these hurdles by providing a seamless and secure platform for buying, selling, and storing Bitcoin. Thiel's investment in River indicates his belief that the startup's innovative solutions can address these accessibility challenges and make Bitcoin more user-friendly for a broader audience.
Potential for Bitcoin's Price Appreciation
Thiel's investment in River may also be driven by his belief in the long-term potential for Bitcoin's price appreciation. As an early advocate of Bitcoin, Thiel has consistently expressed his bullish stance on the digital currency. By investing in a Bitcoin startup like River, Thiel is positioning himself to benefit from the potential growth of the cryptocurrency market as a whole.
Peter Thiel's Bold Move: Time in Bitcoin
Implications of Thiel's Investment
Boosting Confidence in the Bitcoin Ecosystem
Thiel's investment in River not only provides the startup with substantial funding but also boosts confidence in the broader Bitcoin ecosystem. Thiel's track record as a successful investor and his reputation in the financial industry lend credibility to the potential of Bitcoin as a disruptive force in the world of finance. His endorsement of River may encourage other investors and institutions to take a closer look at Bitcoin and its underlying technology.
Accelerating Mainstream Adoption
Thiel's investment in River has the potential to accelerate the mainstream adoption of Bitcoin. River's focus on user-friendly financial services can attract a wider audience of investors who were previously hesitant to enter the cryptocurrency space. By providing a seamless and secure platform, River can help overcome the barriers that have prevented mainstream adoption, ultimately driving more people to embrace Bitcoin as a viable investment option.
Fostering Innovation in the Bitcoin Industry
Thiel's investment in River also fuels innovation in the Bitcoin industry. The injection of capital enables River to expand its operations, hire top talent, and further develop its suite of services. This infusion of resources can have a ripple effect, inspiring other entrepreneurs and startups to enter the Bitcoin ecosystem with their own innovative solutions. Thiel's backing of River may spark a wave of creativity and competition, ultimately driving the industry forward.
Peter Thiel's Bold Move
Conclusion
Peter Thiel's $35 million investment in the Bitcoin startup River is a bold move that highlights his confidence in the future of Bitcoin and the potential of innovative companies within the cryptocurrency space. Thiel's investment not only supports River's mission to make Bitcoin more accessible but also boosts confidence in the broader Bitcoin ecosystem. By addressing Bitcoin's accessibility challenges, River has the potential to drive mainstream adoption and accelerate the growth of the cryptocurrency market. Thiel's backing of River also fosters innovation within the industry, inspiring other entrepreneurs to develop their own groundbreaking solutions. With Thiel's strategic investment, River is well-positioned to make a significant impact on the Bitcoin landscape and shape the future of digital finance.
FAQ
Who is Peter Thiel?
Peter Thiel is a renowned venture capitalist and co-founder of PayPal. He is known for his investments in disruptive and innovative companies.
What is River?
River is a Bitcoin startup that aims to make Bitcoin more accessible and user-friendly by providing institutional-grade financial services for buying, selling, and storing Bitcoin.
Why did Peter Thiel invest in River?
Peter Thiel invested in River because he believes in the startup's vision and potential to drive mainstream adoption of Bitcoin. He also sees the growing institutional interest in Bitcoin and the need to address its accessibility challenges.
What are the implications of Thiel's investment in River?
Thiel's investment in River boosts confidence in the Bitcoin ecosystem, accelerates mainstream adoption, and fosters innovation within the industry.
That's all for today, see ya tomorrow
If you want more, be sure to follow us on:
NOSTR: croxroad@getalby.com
X: @croxroadnews
Instagram: @croxroadnews.co
Subscribe to CROX ROAD Bitcoin Only Daily Newsletter
https://www.croxroad.co/subscribe
DISCLAIMER: None of this is financial advice. This newsletter is strictly educational and is not investment advice or a solicitation to buy or sell any assets or to make any financial decisions. Please be careful and do your own research.
-
@ aa55a479:f7598935
2023-02-20 13:44:48Nostrica is the shit.
-
@ c8df6ae8:22293a06
2023-07-24 04:30:59The Kennedy administration will begin to back the US dollar with real, finite assets such as gold, silver, platinum and Bitcoin, which is the world's hardest liquid asset, to strengthen the US dollar and guarantee its continued success as the global reserve currency.
— Robert F. Kennedy Jr.
Welcome to the latest issue of the Bitcoin For Families newsletter. This issue covers the potential implications of a Bitcoin backed US dollar.
What are the implications of a Bitcoin backed US dollar?
On July 19th, presidential candidate Robert Kennedy Jr gave a speech at a PAC event where he declared the plans of his administration to, at least partially, back the US dollar with gold, silver, platinum and Bitcoin.
You can watch the whole 10 minutes speech here:
What could happen if RFK Jr. does get elected and goes forward with this promise? Let’s look at some scenarios.
What will the RFK administration do?
RFK Jr. said he would back 1% of the treasury bills (T-Bills) issued by the treasury with gold, silver, platinum and Bitcoin.
T-Bills are short-term securities issued by the government with maturities of 4, 8, 13, 17, 26 or 52 weeks. The interest rate is set by the bids submitted at auction. For example, a recent auction of 26-week T-Bills closed with an interest rate of 5.25%.
Let's assume that Bitcoin is priced at $32,000 and that you buy a 26-week T-Bill worth $100 and backed by Bitcoin. This would mean that the US government must hold 312,500 satoshi (0.00312500 BTC) to back this T-Bill, and you should have the option to exchange the T-Bill for that amount of Bitcoin at the end of the 26 weeks.
What would happen if after 26 weeks, the US dollar became weaker than Bitcoin and the exchange rate is now $34,000? Now you only need 294,118 satoshis to back the $100 T-Bill and since the US government had set aside 312,500 satoshi, then all is good.
What would happen if after 26 weeks, the US dollar became stronger than Bitcoin and the exchange rate is now $28,000? Now you need 357,143 satoshi to back the T-Bill and since the US government had set aside 312,500 satoshi, they would need to purchase 44,643 more satoshi to continue to back the T-Bill.
I doubt the US government will be willing to take this exchange rate risk with its debt obligations.
Therefore, the most likely scenario is that the US government sets a fixed exchange rate for the maturity life of the T-Bill. Maybe equal to the exchange rate at the time of the auction.
This means that after 26 weeks you could exchange your T-Bill for 312,500 satoshi no matter what the actual exchange rate is at that point.
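For readers who want to check the numbers, here is a small sketch that recomputes the figures used in the scenarios above. It is illustrative only; it says nothing about how the Treasury would actually account for such backing.

```python
SATS_PER_BTC = 100_000_000

def sats_to_back(usd_amount: float, usd_per_btc: float) -> int:
    """Satoshis needed to fully back a dollar amount at a given BTC price."""
    return round(usd_amount / usd_per_btc * SATS_PER_BTC)

face_value = 100            # a $100 T-Bill
issue_price = 32_000        # USD per BTC at auction
set_aside = sats_to_back(face_value, issue_price)
print(set_aside)                           # 312500 sats

# Dollar weakens against bitcoin: fewer sats are needed.
print(sats_to_back(face_value, 34_000))    # 294118 sats

# Dollar strengthens against bitcoin: more sats are needed.
needed = sats_to_back(face_value, 28_000)  # 357143 sats
print(needed - set_aside)                  # 44643 extra sats to buy
```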
What will this do to the price of Bitcoin or the US dollar?
-
The US dollar will get stronger when compared against other fiat currencies not backed by anything:
The US government will send a very strong signal to the world that Bitcoin is the new gold and a very valid store of value. The world’s hardest liquid asset.
Therefore, backing the US dollar with Bitcoin will make the US dollar stronger when compared with other fiat currencies.
-
The US dollar will get weaker when compared against Bitcoin:
The demand for Bitcoin by the US government will put upward pressure on the exchange rate for Bitcoin against all fiat currencies, including the US dollar.
-
A continued policy of expanding the amount of T-Bills backed by Bitcoin will only exacerbate this process, making the US dollar stronger against other countries' currencies and driving the price of Bitcoin up both against the US dollar and, by extension, other fiat currencies (which by definition are now weaker than the US dollar).
What will investors do?
Investors will demand either the same or maybe even lower interest rates for Bitcoin backed T-Bills with the assumption that the Bitcoin amount backing the T-Bill will be worth more after 26 weeks.
The scenario where investors demand higher interest rates because they believe that Bitcoin will get weaker is unlikely since, even if this happens, the T-Bills are denominated in US dollars and investors won’t be forced to redeem the T-Bill for Bitcoin at the end of the 26 weeks.
What will other countries do?
At first, other countries will just watch and see.
But the upward pressure on the price of Bitcoin will make a US T-Bill more attractive than its European Central Bank equivalent causing the Euro to lose value compared against the US dollar.
The same applies to any other fiat currency (not backed by anything).
Other countries will be forced to back some of their short-term debt with Bitcoin to avoid having to pay higher interest rates to counteract the higher attractiveness of the US T-Bill.
The consolidation of Bitcoin as store of value
If all countries start using Bitcoin to back their debt, that will consolidate Bitcoin as the best store of value.
Bitcoin has to first settle as a store of value before it can be used as a medium of exchange and unit of account (what is commonly known as hyperbitcoinization). This possible action by the US government might well be the linchpin that triggers the hyperbitcoinization.
If the other countries do nothing, the US dollar will cement itself as the global reserve currency and the hyperbitcoinization will still happen, just a bit slower since there will be no demand from other treasuries.
Notable notes
nostr:note1dgh2ga7q397huzfcf9ah06ptvcxxwcmfx5yhctw0rh4t6ylvud4s0nrn75
Recommendations
Max DeMarco
Max is the creator of a great documentary about social media and why we need Nostr.
Watch the documentary here:
You can follow Max here.
Bitcoin Therapy 🧠
Level up your Bitcoin knowledge in 3 minutes...every Sunday
Check it out here.
What did you think of today's newsletter?
Your feedback helps me create the best newsletter possible for you.
Please leave a comment and checkout comments from other subscribers and readers. I love hearing from the Bitcoin For Families community ❤️ 🙏🏻
Buy Bitcoin with Swan
If you want to buy Bitcoin, I highly recommend using Swan. They are a Bitcoin only business that is focused on self-custody and educating their users. It's where I buy my Bitcoin.
Use this link to receive $10 free to get you started.
See you again next week! — Alejandro
This newsletter is for educational purposes. It does not represent financial advice. Do your own research before buying Bitcoin.
-
@ 93e4df0d:49aca2f8
2023-07-31 10:50:38The digital universe has witnessed an extraordinary spectacle as Yakihonne, the realm of boundless creativity, embarks on a captivating journey of innovation. With a flourish of updates that leave us in awe, let us delve into the magic of Yakihonne's latest features, each designed to elevate our experience to unprecedented heights.
Table of Contents
1. Share: Uniting the World with One Click
2. Comments: A Tale of Interaction and Deletion!
3. NIP-25 Support: Power to the Users—Upvote and Downvote!
4. Zap Stats: A Symphony of Interactions on Profiles!
5. Conclusion
1. Share: Uniting the World with One Click
The essence of Yakihonne lies in fostering connections and celebrating creativity, and the introduction of the Share feature brings us closer than ever. With a single click, users can now share their favorite posts with the world, illuminating the digital landscape with brilliance and wonder. As the Yakihonne community becomes a global canvas of inspiration, sharing becomes the heartbeat that connects us all.
Note: I would suggest adding a proper share button so users can share to other social media platforms like WhatsApp, LinkedIn, etc.
2. Comments: A Tale of Interaction and Deletion!
The heartbeat of any creative community is the exchange of ideas and feedback, and Yakihonne cherishes this interaction with the reimplemented Comments feature. Users can now leave their mark on posts, engaging in vibrant conversations that spark inspiration and growth. Moreover, the power to delete comments adds a touch of finesse, allowing for meaningful curation of our artistic space.
Just by clicking on the small bin-like icon, the comment disappears.
3. NIP-25 Support: Power to the Users—Upvote and Downvote!
Yakihonne embraces democracy with open arms through the implementation of NIP-25 support. Users hold the reins of content appreciation, as they can now both upvote and downvote posts. This empowering feature amplifies our voices, giving us the ability to shape the platform's landscape and celebrate the content that touches our souls.
4. Zap Stats: A Symphony of Interactions on Profiles!
Zaps are the heartbeat of Yakihonne's enchanting experience, and now, they are beautifully showcased on our profiles through Zap stats. The number of Zaps sent and received serves as a testament to our artistic impact, adding a personal touch to our presence within this vibrant community. As we weave our creative symphony, Zap stats celebrate our contributions with pride.
5. Conclusion
Yakihonne's new features have sparked a tapestry of wonder, infusing our artistic journey with boundless excitement and endless possibilities. Share, NIP-05, Comments, NIP-25 Support, and Zap Stats are a testament to Yakihonne's commitment to empowering its users and fostering a community where creativity knows no bounds.
With each new feature, we embark on a journey of exploration, interaction, and discovery, where our artistic souls find solace and inspiration. As we immerse ourselves in the magic of Yakihonne, let us celebrate the brilliance of these updates that unite us as one creative force.
The Yakihonne experience is a testament to the power of human connection, imagination, and artistry. As we embrace the new features, we leave behind a trail of brilliance that lights the way for others to follow.
Step into this world of enchantment, where dreams are woven into reality, and the artist within us finds its true home. Welcome to Yakihonne, where magic meets creativity in a symphony of boundless wonder!
With an enchanting embrace,
-
@ 9ecbb0e7:06ab7c09
2023-07-31 03:59:01Amnesty International on Saturday called for the freedom of Cuban political prisoner José Daniel Ferrer, who turns 53 in terrible conditions in the Mar Verde prison in Santiago de Cuba, under the torture of being held almost permanently incommunicado.
"Today is the birthday of political leader José Daniel Ferrer. Today he should be celebrating with his family. José Daniel has been unjustly imprisoned for two years; he is a prisoner of conscience of Miguel Díaz-Canel," said Erika Guevara Rosas, Amnesty International's representative for Latin America, in a message on Twitter.
"We demand his immediate release! We will see you free very soon; that is my greatest wish, José Daniel," she concluded.
"My brave, steadfast, and determined brother José Daniel Ferrer García is turning 53. I would like to wish him a very happy birthday on such an important day and, above all, ask God to lay His hands once more and allow his immediate and unconditional release, as well as that of all political prisoners," wrote Ana Belkis Ferrer García, sister of the UNPACU national coordinator, on Facebook.
Opposition figure Rosa María Payá, for her part, stated on Twitter: "When all this is over and Cubans can be free in Cuba, it will be thanks to the effort of men (and women) like José Daniel, who today turns 53 in a damp, foul-smelling isolation cell, carrying the dignity of all Cubans. #NoMásPresosPolíticos #SOSCuba #PatriaYVida #AbajoLaDictadura."
This Saturday, Amnesty International relaunched an urgent action on the occasion of Ferrer's birthday, inviting everyone to write an email to Díaz-Canel demanding his release.
José Daniel Ferrer García's family has warned about the physical deterioration he is suffering because of the appalling prison conditions, and about the harassment of him and his relatives by State Security. During the visit they were able to make last week, thanks to the momentum of the campaign demanding "proof of life" for the Cuban dissident, Ortega, together with the son she has with Ferrer and another of the opposition leader's daughters, Fátima Victoria Ferrer, managed to see him in one of the corridors, near the cell where he is confined.
At the end of the visit they reported that Ferrer is in a "critical state of health," half-naked and held in an isolation cell at the Mar Verde prison.
Ferrer was arrested on July 11, 2021, in the context of the protests held across the island. According to his relatives, he has been held in a punishment cell since August 14, 2021. Since March 17, 2023, he has been held incommunicado, and before that date he was already in poor health. He is a prisoner of conscience and must therefore be released immediately and unconditionally, Amnesty International demanded on June 6.
-
@ 9ecbb0e7:06ab7c09
2023-07-23 23:02:57Nostr is a protocol that makes it possible to build censorship-free, decentralized social networks, including payments over Bitcoin's Lightning network.
This Nostr relay server installation assumes an Ubuntu/Debian server, either a personal machine or one running on any VPS.
Requirements:
- Buy an internet domain, or already have one. You can buy a domain using Namecheap.com or another service.
- A running Ubuntu/Debian installation.
- SSH access to your server.
- Install Cargo
- Install Rust
- A strong desire to learn.
We start by installing the necessary packages:
Install Cargo:
sudo apt install cargo
Then install Rust:
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
When prompted, choose option 1.
Once the Rust installation has finished, load its environment:
source /root/.cargo/env
and verify that both Rust and Cargo are installed correctly:
rustc --version
cargo --version
Next, we install Protobuf on the system. Protobuf is a structured-data serialization format developed by Google, used for communication between services and for data storage. To install it, follow these steps.
First, make sure the unzip package is installed; it will let us extract the protoc executable from a ZIP file:
sudo apt update
sudo apt install -y unzip
Get the latest version of protoc and assign it to a variable:
PROTOC_VERSION=$(curl -s "https://api.github.com/repos/protocolbuffers/protobuf/releases/latest" | grep -Po '"tag_name": "v\K[0-9.]+')
Download the ZIP file from the protoc repository's releases page:
curl -Lo protoc.zip "https://github.com/protocolbuffers/protobuf/releases/latest/download/protoc-${PROTOC_VERSION}-linux-x86_64.zip"
Then extract the executable from the ZIP file:
sudo unzip -q protoc.zip bin/protoc -d /usr/local
Next, set the execute permission:
sudo chmod a+x /usr/local/bin/protoc
At this point the protoc command is available to every user as a global system command. We can check the protoc version:
protoc --version
We no longer need the ZIP file, so delete it:
rm -rf protoc.zip
Now install all the dependencies required for our nostr relay server:
sudo apt-get install certbot build-essential sqlite3 libsqlite3-dev libssl-dev pkg-config nginx git -y
sudo apt-get install net-tools whois -y
Then build nostr-rs-relay from source:
cd /opt
sudo mkdir nostr-data
sudo git clone https://github.com/scsibug/nostr-rs-relay.git
cd nostr-rs-relay
sudo cargo build --release
**This process can take up to about 10 minutes. Be patient and let it finish completely.** Install the nostr relay binary into the bin folder:
sudo install target/release/nostr-rs-relay /usr/local/bin
If you have followed every step, your relay server is almost ready; now you can download the sample configuration file:
sudo wget https://raw.githubusercontent.com/scsibug/nostr-rs-relay/master/config.toml
Open the file with your preferred text editor, in my case nano:
sudo nano config.toml
Modify the following settings (a sketch of how the edited file might look follows the list):
- relay_url: nostr.domainname.com (⚠️ replace this with the name you want to give your relay server)
- name: the name your relay will have
- description: describe what your relay server is about; don't hold back, write as much as you want so users get to know you
- pubkey: your public key in hex format, so other users can contact you
- contact: correo@example.com (the administrative email address for this relay server)
- tracing: ⚠️ keep it commented out, otherwise the relay server may throw errors
- data_directory: /opt/nostr-data/ (where our relay server's data will be stored)
- address: 127.0.0.1 (we will use this since nginx will act as the reverse proxy)
- remote_ip_header: "x-forwarded-for", to allow logging of clients' real IPs
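For reference, the edited sections of config.toml might end up looking roughly like the sketch below. The exact section names and layout depend on the version of nostr-rs-relay you cloned, so treat this as an illustration only; keep the structure of the sample file you downloaded and change just the values discussed above.

```toml
# Hypothetical example values; adjust to your own domain, key, and paths.
[info]
relay_url = "wss://nostr.domainname.com/"
name = "My Nostr Relay"
description = "A small personal relay running on my own VPS."
pubkey = "<your public key in hex>"
contact = "correo@example.com"

[database]
data_directory = "/opt/nostr-data/"

[network]
address = "127.0.0.1"
port = 8080
remote_ip_header = "x-forwarded-for"

# Leave the tracing option commented out, as noted above.
```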
If you have followed the steps, we can now check whether our relay server runs; to do so, use the following command in the terminal:
sudo RUST_LOG=warn,nostr_rs_relay=info /usr/local/bin/nostr-rs-relay
If everything is working correctly, it will show the server output with the data it is processing.
Now let's create the service that will keep our relay server running even if the machine reboots.
To do this we will create the file /etc/systemd/system/nostr-relay.service
sudo nano /etc/systemd/system/nostr-relay.service
Inside the file, paste the following:
```
[Unit]
Description=Nostr Relay
After=network.target

[Service]
Type=simple
User=TuUsuario
WorkingDirectory=/home/TuUsuario
Environment=RUST_LOG=info,nostr_rs_relay=info
ExecStart=/usr/local/bin/nostr-rs-relay
Restart=on-failure

[Install]
WantedBy=multi-user.target
```
You must replace TuUsuario with the actual username you are using.
Now enable the service and start it so it stays running:
sudo systemctl daemon-reload
sudo systemctl enable nostr-relay.service
sudo systemctl start nostr-relay.service
Check whether the service is running on the system:
sudo systemctl status nostr-relay.service
If you have made it this far, you should see the service running without errors. To leave this view, press Ctrl + C.
Check that the service is listening on port 8080 using the following command:
sudo netstat -tnap | grep nostr
You should get a response similar to the following:
tcp 0 0 0.0.0.0:8080 0.0.0.0:* LISTEN 81180/nostr-rs-relay
Nginx reverse proxy configuration, so our relay service can be reached through the desired domain.
First, change into the folder /etc/nginx/sites-available and create the webroot directory:
cd /etc/nginx/sites-available
sudo mkdir -p /var/www/nostr/.well-known/acme-challenge/
sudo chown -R 33:33 /var/www/nostr
Create the configuration file nostr-relay.conf:
sudo nano nostr-relay.conf
Paste the following content:
```
map $http_upgrade $connection_upgrade {
    default upgrade;
    '' close;
}

upstream websocket {
    server 127.0.0.1:8080;
}

server {
    listen 80;
    server_name relay.example.com; ## <<=== Change this

    location /.well-known/acme-challenge/ {
        root /var/www/nostr;
        allow all;
    }

    location / {
        proxy_pass http://websocket;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection $connection_upgrade;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $remote_addr;
    }
}
```
In it we are basically exposing our relay through the domain we have chosen. Remember that you must change the server_name value to the real domain name you want to use. Up to this point we are only using HTTP, without an SSL certificate.
Once you have made the changes and saved the configuration file, enable the site and check that our relay server really is running correctly and listening on our domain.
sudo ln -s /etc/nginx/sites-available/nostr-relay.conf /etc/nginx/sites-enabled/.
sudo rm -f /etc/nginx/sites-enabled/default
sudo nginx -t
If everything is fine and you followed the steps, you will get the following message:
nginx: the configuration file /etc/nginx/nginx.conf syntax is ok
nginx: configuration file /etc/nginx/nginx.conf test is successful
Now tell nginx to apply the changes:
sudo nginx -s reload
Run an external test.
To check that everything is working, open a new terminal window on another computer and run:
wget relay.example.com
Here relay.example.com refers to the domain you assigned to your relay server, the same one used in the nginx configuration.
As a result you should get an index.html file, which you can cat to read its contents:
cat index.html
It should show the following message:
Please use a Nostr client to connect.
So far our server is working. Now let's add an SSL certificate.
For the next step we will use DHParams.
First, create the folder where we will place the certificate:
sudo mkdir /etc/nginx/ssl
and then generate the DHParams (it should only take a minute or so):
sudo openssl dhparam -out /etc/nginx/ssl/dhparam.pem 4096
Now request the certificate:
cd /var/www/nostr
sudo certbot certonly --webroot -w . -d relay.example.com --dry-run --agree-tos
You should get the following response:
Saving debug log to /var/log/letsencrypt/letsencrypt.log
Simulating a certificate request for relay.example.com
The dry run was successful.
Remember to change relay.example.com to your own domain. If this test was successful, you can go ahead and request the SSL certificate using the following command:
cd /var/www/nostr
sudo certbot certonly --webroot -w . -d relay.example.com
You will then get information such as the path where the SSL certificate for your domain was saved, and so on.
Now we are ready to replace the nginx configuration with one that uses our server's SSL certificate.
cd /etc/nginx/sites-available
sudo nano nostr-relay.conf
Update the file as follows:
```
map $http_upgrade $connection_upgrade {
    default upgrade;
    '' close;
}

upstream websocket {
    server 127.0.0.1:8080;
}

server {
    listen 80;
    server_name relay.example.com; ## <<=== CHANGE THIS

    location /.well-known/acme-challenge/ {
        root /var/www/nostr;
        allow all;
    }

    location / {
        return 301 https://relay.example.com;
    }
}

server {
    listen 443 ssl;
    server_name relay.example.com;

    location / {
        proxy_pass http://websocket;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection $connection_upgrade;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $remote_addr;
    }

    # SSL
    ssl_session_cache shared:SSL:10m;
    ssl_session_timeout 10m;
    ssl_protocols TLSv1.2 TLSv1.1 TLSv1;
    ssl_prefer_server_ciphers on;
    ssl_ciphers "EECDH+ECDSA+AESGCM EECDH+aRSA+AESGCM EECDH+ECDSA+SHA384 EECDH+ECDSA+SHA256 EECDH+aRSA+SHA384 EECDH+aRSA+SHA256 EECDH+aRSA+RC4 EECDH EDH+aRSA RC4 !aNULL !eNULL !LOW !3DES !MD5 !EXP !PSK !SRP !DSS";
    ssl_stapling on;
    ssl_stapling_verify on;
    ssl_dhparam ssl/dhparam.pem;
    ssl_ecdh_curve secp384r1;

    add_header Strict-Transport-Security "max-age=31536000; includeSubdomains";
    add_header X-Frame-Options DENY;
    add_header X-Content-Type-Options nosniff;
    add_header X-XSS-Protection "1; mode=block";
    add_header Referrer-Policy same-origin;
    add_header Feature-Policy "geolocation none;midi none;notifications none;push none;sync-xhr none;microphone none;camera none;magnetometer none;gyroscope none;speaker self;vibrate none;fullscreen self;payment none;";

    ssl_certificate /etc/letsencrypt/live/relay.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/relay.example.com/privkey.pem;
}
```
Remember to change every occurrence of relay.example.com to your real domain.
Check that the nginx configuration is correct:
sudo nginx -t
If everything is correct, run:
sudo nginx -s reload
Check that nginx (HTTP and HTTPS) and the relay are running correctly:
sudo netstat -tnap | grep 'nginx\|nostr'
If the response is the following:
tcp 0 0 0.0.0.0:443 0.0.0.0:* LISTEN 53252/nginx: master
tcp 0 0 0.0.0.0:80 0.0.0.0:* LISTEN 53252/nginx: master
tcp 0 0 0.0.0.0:8080 0.0.0.0:* LISTEN 81180/nostr-rs-rela
it means our server is working correctly.
If you ever want a status report from your relay server, you can run the following whenever you like:
journalctl -f | grep --line-buffered nostr_rs_relay | cut -d' ' -f 10,12-100
There you can get an error report if needed.
If you want to test the connection to the relay server, you can use the following link that I provide here.
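If you would rather test from a terminal on another machine, one quick check is to request the relay's NIP-11 information document over HTTPS. The short Python sketch below does exactly that; relay.example.com is a placeholder for your own domain, and it assumes your relay build serves the NIP-11 document. A full end-to-end test would open a websocket connection from a Nostr client.

```python
import json
import urllib.request

# NIP-11: a relay answers an HTTP GET on its root URL with a JSON
# information document when this Accept header is sent.
req = urllib.request.Request(
    "https://relay.example.com/",
    headers={"Accept": "application/nostr+json"},
)
with urllib.request.urlopen(req, timeout=10) as resp:
    info = json.load(resp)

print(info.get("name"), "-", info.get("description"))
print("Supported NIPs:", info.get("supported_nips"))
```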
In summary, Nostr is a platform that provides a free, decentralized, real-time news network. A Nostr relay server is a tool that lets us join this network and share news and events with the rest of the world. By installing and configuring a Nostr relay on an Ubuntu server, we can contribute to the growth and development of this exciting new communication platform.
Do you want to use my relay servers?
If so, you can connect to:
wss://relay.bitransfer.org
wss://relay.bitransfermedia.com
I hope this guide is helpful to you.
-
@ 9ecbb0e7:06ab7c09
2023-07-31 03:56:43A Cuban man denounced the police mistreatment he was subjected to at a checkpoint in Havana while driving toward a beach area, according to a post shared on social media.
Facebook user Jiosbel García reported on Saturday that he was stopped last Friday at around nine in the morning at the checkpoint in the municipality of Playa, on his way to the coastal town of Baracoa, and that he was fined for driving at 50 kilometers per hour even though a sign in the area allows traffic at 60.
Although García explained that he was traveling with his family and had no points on his driver's license, he was given a 12-point fine for speeding.
For him, the most detestable part was the mistreatment he received from the policewoman who stopped him. "That's why I made this video and filed the complaint, and in case there is somewhere else I can turn," he added.
In the shared video, he drives the same route as the day before and shows the sign indicating that traffic up to 60 km/h is allowed on that stretch.
"The camera caught me doing 50 km/h where there is a sign that says you can go 60 km/h. I asked the officer why she was going to fine me and she said it was for an 'operational license.' I explained that I was going with my family to enjoy the beach and she told me 'that's not her problem' and gave me the 12-point fine," García recounted, insisting that his biggest complaint "is the rude way the young woman treated me. She gave me the fine and told me she had nothing more to say to me, and that I could appeal if I didn't agree."
In the post, one user commented that "this is why the country is the way it is, you can't even go out anymore. They're not there for what they should be; they should go after the crime that is wrecking everything."
It is not the first time that reports of police mistreatment on Cuban roads have surfaced on social media.
-
@ 1bc70a01:24f6a411
2023-07-23 12:11:45This is an old but timeless post I wrote on October 26, 2022.
90% of startups fail. That's the statistic.
It feels true to me. Judging by startup Twitter it seems most startups disappear before the 2 year mark. Some, much sooner.
Most people imagine the startup journey to be like this:
People think you can go from idea to a shipped product fast. After launch it's all happy times and success. We fool ourselves into believing this because we must. Otherwise, it's hard to start.
In reality, it's something like this:
50% quit at the idea stage. I'm making rough calls, not an exact science.
Ideas are easy, but acting is hard. Most people will just stop at the idea. Some may look into it, register a domain (you know who you are 👀) and stop there.
Those who undertake the journey will build for 3-5 days and see some other "better" idea. This is called the shiny object syndrome. Sixty percent of people will quit at the shiny object and start working on something else. Of course, they'll register a domain first!
Ignored the shiny object? Good for you! This is where it starts getting tough.
80% will quit before the "project" is finished. The last 20% of the project feels so close, yet so far away that you wonder if you've made the wrong decision all along. Doubt creeps in. You start thinking - "why bother?" and "this is not going to work out, it was a dumb idea".
The heroic few will get through the grind stage and finish their project. This is where they will call themselves a startup - but they are still a project.
In great anticipation of the "launch", many founders fall from the pedestal soon after. Launches are a non-event for most projects. A lucky few gain customers and traction, but the vast majority will disappear in the coming weeks.
That leaves us with the 20% crusaders. The tough bastards who will not give up immediately after the launch. They'll keep pushing. And pushing... and pushing. Until they meet a wall of despair.
The wall of despair makes everything seem hopeless. You've tried. You keep trying. But, nothing happens.
This idea has no legs. This was all a giant mistake! Shut it all down. Go back to your regular job. This was never meant to be.
Sadly, this is just the point where things start getting interesting. But, our hero (or actually 90% of them) give up for good.
For most, this will be around the year 1 mark. For some year 2 or even 3 if they pivoted a few times. It's really hard to keep going at this point.
The glorious few - the supposed 10% (I actually think it's about 2%) start seeing traction. All the pain and the agony of getting to this point is finally worth it.
From here, things can take as long as 1 year, all the way up to 5 years+ to see meaningful growth that pays all the bills and then some.
Startups are a helluva ride. Only the toughest survive. But, it doesn't end there.
Even 50% of successful startups will cease to exist just before year 5. The remaining half may make it to year 9, but even the vast majority of those will disappear before the decade's end. Only about 30% of that successful 10% make it past one decade.
But none of this matters! If you are a new founder reading this, you know you're the exception. You will succeed! You were chosen.
I joke, but, don't give up. I believe in you. What's the downside in doing so?
-
@ b0cb78eb:0dbcb2d1
2023-07-31 10:26:38Are you a gaming enthusiast looking for cutting-edge Web3 game development companies in California? Look no further! In this article, we will introduce you to the top 10 Web3 game development companies based in California that are revolutionizing the gaming industry. From blockchain technology to virtual reality experiences, these companies are at the forefront of creating immersive and interactive gaming experiences. So, let's dive into the world of Web3 gaming!
Introduction
California has long been synonymous with technological innovation, and the gaming industry is no exception. With the rise of Web3 technology, game development companies have found new ways to engage players and introduce decentralized experiences. Web3 games leverage blockchain and other distributed technologies to create unique gaming ecosystems that empower players like never before.
Suffescom Solutions Inc - Web3 Game Development
Suffescom Solutions Inc is a prominent Web3 game development company based in California. With a team of skilled developers and designers, Suffescom has created a niche for itself in the gaming world. The company specializes in building blockchain-based games that offer players true ownership of in-game assets, enabling them to trade and monetize their virtual possessions.
RisingMax - Web3 Game Development
RisingMax is another noteworthy player in the Web3 game development landscape. The company's focus on virtual reality (VR) and augmented reality (AR) games sets it apart from the competition. By integrating blockchain technology into their VR and AR experiences, RisingMax ensures transparency, security, and seamless in-game transactions.
Best Web3 Development
As the name suggests, Best Web3 Development is committed to delivering cutting-edge Web3 games to its users. The company's dedication to creating immersive and innovative games has earned them a loyal following. Best Web3 Development's titles often feature play-to-earn mechanisms, allowing players to earn cryptocurrencies while enjoying the game.
Mythical Games
Mythical Games is a trailblazer in the Web3 gaming domain. The company's expertise lies in developing games that offer players true ownership of digital assets through the use of blockchain technology. This ownership enables players to monetize their in-game assets and foster a player-driven economy within the gaming world.
Forte
Forte takes a community-driven approach to Web3 game development. Collaborating with game developers and studios, Forte leverages blockchain technology to create games with play-to-earn features. Players can not only enjoy immersive gameplay but also earn valuable cryptocurrencies as they progress through the game.
Lucid Sight
Lucid Sight has earned a reputation for its innovation in blockchain-based gaming. The company has developed several hit games that provide players with the ability to buy, sell, and trade virtual assets securely on the blockchain. Lucid Sight's games blur the line between virtual and real-world economies.
Immutable
Immutable is renowned for its focus on decentralized gaming experiences. The company's blockchain-based games allow players to have true ownership of their in-game items, which can be freely traded and transferred on blockchain marketplaces. This unique approach has garnered significant attention in the gaming community.
Gala Games
Gala Games stands out for its player-centric approach to Web3 game development. The company aims to empower players by offering them ownership and governance over the virtual worlds they explore. Gala Games' titles boast engaging narratives and intricate gameplay mechanics.
Pixowl
Pixowl is a leading Web3 game development company that specializes in creating blockchain-powered mobile games. The company's games feature vibrant graphics, compelling storytelling, and seamless integration with cryptocurrencies, making them a favorite among gamers and blockchain enthusiasts alike.
SuperWorld
SuperWorld's innovative use of augmented reality has captured the imagination of gamers worldwide. The company allows players to own, trade, and interact with virtual real estate on the blockchain. This unique blend of AR and blockchain technology has earned SuperWorld a place in the Web3 gaming hall of fame.
Conclusion
Web3 technology is rapidly transforming the gaming landscape, and California is home to some of the most groundbreaking companies in this domain. Suffescom Solutions Inc, RisingMax, Best Web3 Development, Mythical Games, Forte, Lucid Sight, Immutable, Gala Games, Pixowl, and SuperWorld are spearheading the charge towards decentralized and player-centric gaming experiences.
As the demand for immersive, decentralized gaming grows, these companies are likely to push the boundaries of innovation even further. So, whether you are a gamer seeking novel experiences or an investor looking for the next big thing, keep an eye on these top Web3 game development companies in California.
-
@ 81870f53:29bef6a6
2023-07-31 10:14:40Imagine being able to make yourself invisible, just like in the Harry Potter movies.
Thanks to Hyperstealth Biotechnology's Quantum Stealth technology, that is now possible.
The Canadian company Hyperstealth has patented a technology that can make people and objects almost completely disappear.
Yes, you read that correctly. Invisible to the naked eye.
It's like magic.
This technology can be used to hide just about anything: boats, entire buildings, even spacecraft.
It works day or night, in any season and in any environment.
So how does it work?
The secret is in the material.
It is made up of lenticular lenses, tiny convex lenses that deflect light in every direction.
These lenticular lenses create a "camouflage" effect.
When light hits the material, it is refracted around the object behind it, making it invisible to the naked eye.
This technology has enormous potential.
Imagine soldiers disappearing on the battlefield, or a camping tent blending seamlessly into its natural surroundings.
Keep in mind, however, that every technology carries risks. In the wrong hands, it could be used for sinister purposes.
Invisibility is a double-edged superpower.
For now, this invisibility cloak cannot be bought on Amazon.
At least not yet.
So Quantum Stealth technology could change the way we see the world... or rather, the way we no longer see it.
And you, what would you do if you were invisible for a day?
I'm waiting for your craziest answers.
https://www.youtube.com/watch?v=VFp1KY5KqcI
-
@ 9ecbb0e7:06ab7c09
2023-07-31 03:49:32Broward County police arrested two people accused of stealing a watch valued at $150,000 from a jewelry store in downtown Fort Lauderdale.
Aurel Dobos, 48, and Adina Buchuc, 41, were detained at Miami International Airport last Wednesday and are being linked to similar crimes committed across the country, local media reported.
The incident took place at Robinson's Jewelry, located at 820 E Las Olas Blvd., last April, and came to light thanks to the store's surveillance video.
The piece in question is a Patek Philippe watch valued at more than $150,000.
"They looked fine; they were well dressed, they were wearing Rolexes. She had a big hat on, said they were from Italy; they seemed like tourists to me," said Andrew Robinson, the store's owner.
"Thank God they were able to catch these two and put them in jail, so they can't do it to someone else, because it hurts," added the distressed owner, who is glad his video systems helped identify the thieves.
Dobos and Buchuc were arrested just before boarding a flight; they are linked to a similar crime in California and remain in jail.
Their bail is set at $10,000, but they will have to prove that the money used for bail was earned legally.
In February, a thief stole five Rolex watches from a jewelry store in Aventura, Florida, worth approximately $177,000.
That robbery took place at the International Jewelers Exchange store, located at 19725 Biscayne Blvd., south of the Aventura Mall.
Investigators are not sure how the individual managed to open the locked drawer protecting the five luxury Rolex watches. The man walked off with the valuable haul despite being recorded by security cameras.
-
@ e97aaffa:2ebd765d
2023-07-23 10:15:15In Portugal, young people, especially Millennials, are called the most qualified generation ever; yet, despite having studied more than their parents, they are not finding a better quality of life. For many, the only solution was, and still is, to emigrate; for those who stay, precarious work is the order of the day, or jobs that pay very badly. They are also victims of the housing shortage and postpone starting a family year after year, creating a huge birth-rate problem in the country; to counter it, the government is attracting foreigners to work in Portugal.
Millennials are a battered generation that has already endured a string of devastating crises: Sócrates's austerity packages (the PECs), the subprime crisis, the sovereign debt crisis, the banking crisis, Covid, and inflation; and, worse still, it will not end there.
Politics
How do they expect to increase the country's productivity with such disastrous policies? Instead of solving the structural problems, politicians prefer the easier, faster route of mere palliatives, which solve nothing; they only postpone the problem and probably make it worse.
On productivity, policy was limited to creating universities and training graduates, but the rest never followed. We did not train business owners, we did not create more productive companies, and we did not even pass laws that make it easier to start a business.
We train a qualified workforce, but then the "only" jobs available are in call centers, supermarkets or restaurants, and those who manage to escape that are, in general, very poorly paid compared with the rest of Europe. This is no disrespect to those professions; any honest work is valid. The problem is that young people studied and became qualified, and now they can only find unqualified jobs.
What we have seen in Portugal over recent decades already looks like official policy: exporting qualified workers and importing cheap labour, bordering on slavery.
We are a country whose prime minister advises young people to emigrate; that says it all…
The Odemira case is a great disgrace for the country, with people who are extremely badly paid living in houses in appalling sanitary conditions, with dozens of people sharing the same house.
In the past, foreigners came mostly from the Portuguese-speaking African countries (the PALOPs), and most of them integrated well into the community. I may be wrong, but the new waves of immigrants (mostly Asian) seem to be having great difficulty integrating; this could become a huge social problem in the future, and more extremist movements will certainly grow.
Portugal urgently needs to renew its politicians: fewer career politicians, fewer party boys, more people with real knowledge and experience on the ground, people with new ideas who will actually carry out structural reforms.
Business owners
Some business owners also need to improve; they are part of the productivity problem. They need to modernize their companies and, above all, their mindset. The business model has to change: invest in products with more added value and abandon production based on cheap labour.
The Portuguese agricultural sector says it cannot pay workers more than the minimum wage, and then complains that there is a labour shortage and that it has to import workers. I ask: how is it possible that French employers who come to Portugal to hire people pay much higher wages and still manage to put their products on supermarket shelves at prices even lower than Portuguese products?
Portugal also has another problem: the basic economic principle of supply and demand does not work for wages, and that is obvious right now. Business owners constantly complain about the lack of labour, yet some of them refuse to pay more than the minimum wage; if they raised wages, more people would surely be interested in the job, I would say. But they prefer to earn less and turn down orders rather than pay higher wages, or they go abroad to hire cheap labour, perpetuating the problems of low productivity and low value-added products.
The liberal idea that without a minimum wage there would be more jobs does not work in Portugal. That is why I defend the national minimum wage; it is necessary in Portugal, and without it the level of poverty would be enormous.
Wages
"Through a survey of 2.2 million young people between the ages of 15 and 34 living in Portugal, the profile of Portuguese youth was drawn up in 2021 in a study by the Francisco Manuel dos Santos Foundation, 'Os Jovens em Portugal, Hoje', which includes the chart shown above. According to the document, with regard to net monthly salary for young people working as employees, 86% fall into the brackets up to 1,158 euros. The largest share, 30%, is in the bracket between 601 and 767 euros." in poligrafo.sapo.pt
Nearly three quarters (72%) earn no more than 950 euros net per month.
Besides low wages, we have outdated labour laws, under which those with permanent contracts are over-protected and have a thousand and one rights. Then there is the other side of the coin: those on precarious or fixed-term contracts, who have few rights, which allows employers to abuse the situation.
I will not even classify those companies disguised as apps, which circumvent the law and exploit their workers, paying them by the day with no rights at all, competing unfairly and harming the companies that do respect the country's laws.
Housing
"Having autonomy and more independence: the aspiration of any young person preparing to make the transition to adult life. But in Portugal, that path is getting harder and harder. From precarious jobs and wages to difficulties in accessing housing. For many young people, buying a house is a distant dream, and in some cases not even renting is viable.
The Eurostat figures leave no room for doubt: Portugal is the European Union (EU) country where young people leave their parents' home the latest, at 33.6 years old on average. And it is not out of apathy or lack of will. Most simply cannot afford a home." in Idealista
Now combine low wages, high housing prices and high inflation: it is a time bomb.
Originally published at http://www.rei-artur.com/uma-geracao-perdida/ on 10 June 2023
-
@ 81870f53:29bef6a6
2023-07-31 09:59:47Looking for a scientific breakthrough?
A year ago, DeepMind's AlphaFold AI changed the face of science, transforming our understanding of biology and medicine.
By July 2022, AlphaFold had predicted the structures of nearly all known proteins.
This greatly advanced our understanding of biology, accelerated drug discovery, and expanded the possibilities for treating disease.
Today, AlphaFold's protein structure database is used by more than 1.2 million researchers in over 190 countries.
Overall, adoption of AlphaFold is growing rapidly.
AlphaFold is also being used by major pharmaceutical companies to drive their drug discovery programs.
AI has already made it possible to:
- Discover new disease threats in Madagascar.
- Develop more effective malaria vaccines.
- Develop new drugs to treat cancer.
- Fight antibiotic resistance.
AlphaFold revolutionized science by solving a 50-year-old problem in biology:
the "protein folding challenge".
But the world of proteins still holds many challenges to solve.
For example, many proteins achieve their function by changing shape over time, so a deeper understanding of protein physics is essential.
So what do you think about AlphaFold's potential impact on science and medicine?
Do you think it could have an even greater impact in the future?
-
@ f4db5270:3c74e0d0
2023-07-23 09:10:31"Alba a Spotorno" (2023)
44x32cm, oil on chalkboard
(Available)
Here is a moment of the work in progress...
The beginning...
-
@ 9ecbb0e7:06ab7c09
2023-07-31 03:47:54Economist and former political prisoner Vladimiro Roca Antúnez, a leading figure of the opposition in Cuba, died this Sunday in Havana at the age of 80 as a result of a cerebrovascular condition.
Roca's death occurred around 6 p.m. in his apartment in Nuevo Vedado, where he was being cared for by his niece, Vivian Roca. He suffered cardiorespiratory arrest as a consequence of a stroke he had in 2020.
"He had been very ill for months; the small strokes kept recurring and he had lost his memory," Vivian Roca said by telephone to CiberCuba.
With the death of Vladimiro Roca disappears an emblematic personality of the dissidence that confronted Fidel Castro's regime, one who defended his differences from leftist positions forged within the family tradition.
Born in Havana on December 21, 1942, Vladimiro was the son of Blas Roca, a founding leader of the Popular Socialist Party (PSP) who remained in the upper ranks of revolutionary power until the end of his life. He did his early schooling at a primary school in the La Víbora neighborhood and, after finishing secondary school, joined the newspaper Hoy, the PSP's organ, as an apprentice typesetter.
At 18 he was part of the first cohort of an elite group of young people selected to train as fighter-bomber pilots in the Soviet Union. On his return, he remained in the Cuban Revolutionary Armed Forces (FAR) for 10 years.
He earned a degree in International Economic Relations from the Institute in 1987 and worked in government posts until his differences with the country's political direction became irreconcilable.
Vladimiro voiced his disagreements with the tenets of the socialist Constitution that his own father had decisively helped to establish in 1976 as president of the National Assembly of People's Power.
His active dissidence became evident in 1991, four years after his father's death.
Dismissed from his government job, Vladimiro began to express his opposition publicly. In 1996 he was among the founders of the Social Democratic Party of Cuba and, a year later, took part in creating the Internal Dissidence Working Group, which sought to analyze and propose solutions to the country's critical economic situation.
Together with three other thinkers estranged from the official line (Martha Beatriz Roque, Félix Bonne Carcassés and René Gómez Manzano), he drafted and signed the document La patria es de todos ("The Homeland Belongs to All"), a devastating examination of Cuba's economic and social collapse, with proposals for a democratic, multiparty reform. The publication of the text enraged Fidel and Raúl Castro and led to the arrest of its promoters in 1997.
Vladimiro and his companions were tried for crimes "against the national security of the Cuban state" and "sedition", and sentenced to long prison terms.
He remained in prison until 2002 and stayed active in political activism and the defense of human rights in Cuba.
In 2010, he was among the activists who managed to travel to Banes, in the eastern province of Holguín, to take part in the funeral of dissident Orlando Zapata Tamayo, who died in a heroic hunger strike demanding freedom for political prisoners.
By the family's decision, his body will be cremated in the coming hours. No funeral honors have been scheduled so far.
-
@ 393c8119:75e43710
2023-07-31 09:56:52What is decentralized science (DeSci)?
Decentralized science (DeSci) is a movement that aims to build public infrastructure for funding, creating, reviewing, crediting, storing, and disseminating scientific knowledge fairly and equitably using the Web3 stack.
DeSci aims to create an ecosystem where scientists are incentivized to openly share their research and receive credit for their work while allowing anyone to access and contribute to the research easily. DeSci works off the idea that scientific knowledge should be accessible to everyone and that the process of scientific research should be transparent. DeSci is creating a more decentralized and distributed scientific research model, making it more resistant to censorship and control by central authorities. DeSci hopes to create an environment where new and unconventional ideas can flourish by decentralizing access to funding, scientific tools, and communication channels.
Decentralized science allows for more diverse funding sources (from DAOs and quadratic donations to crowdfunding and more), makes data and methods more accessible, and provides incentives for reproducibility.
Juan Benet - The DeSci Movement https://youtu.be/5ORvbCIW39o
How DeSci improves science
An incomplete list of key problems in science and how decentralized science can help to address these issues
Ethereum and DeSci
A decentralized science system will require robust security, minimal monetary and transaction costs, and a rich ecosystem for application development. Ethereum provides everything needed for building a decentralized science stack.
DeSci use cases
DeSci is building the scientific toolset to onboard Web2 academia into the digital world. Below is a sampling of use cases that Web3 can offer to the scientific community.
Publishing
Science publishing is famously problematic because it is managed by publishing houses that rely upon free labor from scientists, reviewers, and editors to generate the papers but then charge exorbitant publishing fees. The public, who have usually indirectly paid for the work and the publication costs through taxation, can often not access that same work without paying the publisher again. The total fees for publishing individual science papers are often five figures ($USD), undermining the whole concept of scientific knowledge as a public good while generating enormous profits for a small group of publishers.
Free and open-access platforms exist in the form of pre-print servers, such as ArXiv. However, these platforms lack quality control, anti-sybil mechanisms, and do not generally track article-level metrics, meaning they are usually only used to publicize work before submission to a traditional publisher. SciHub also makes published papers free to access, but not legally, and only after the publishers have already taken their payment and wrapped the work in strict copyright legislation. This leaves a critical gap for accessible science papers and data with an embedded legitimacy mechanism and incentive model. The tools for building such a system exist in Web3.
Reproducibility and replicability
Reproducibility and replicability are the foundations of quality scientific discovery.
- Reproducible results can be achieved multiple times in a row by the same team using the same methodology.
- Replicable results can be achieved by a different group using the same experimental setup.
New Web3-native tools can ensure that reproducibility and replicability are the basis of discovery. We can weave quality science into the technological fabric of academia. Web3 offers the ability to create attestations for each analysis component: the raw data, the computational engine, and the application result. The beauty of consensus systems is that when a trusted network is created for maintaining these components, each network participant can be responsible for reproducing the calculation and validating each result.
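To make the attestation idea concrete, here is a minimal sketch in Python (standard library only) that hashes the three components named above, the raw data, the analysis code, and the result, into a single record that any network participant could recompute and compare. The file names are placeholders, and real Web3 attestation schemes would additionally sign such records and anchor them on-chain; this is only an illustration of the principle, not a reference implementation.

```python
import hashlib
import json

def digest(path: str) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def build_attestation(data_path: str, code_path: str, result_path: str) -> dict:
    """Bundle content digests of the three analysis components.

    A verifier who re-runs the analysis can recompute these digests and
    confirm that the data, code, and result all match the original claim.
    """
    return {
        "raw_data_sha256": digest(data_path),
        "analysis_code_sha256": digest(code_path),
        "result_sha256": digest(result_path),
    }

if __name__ == "__main__":
    # Placeholder paths for illustration only.
    record = build_attestation("data.csv", "analysis.py", "result.json")
    print(json.dumps(record, indent=2))
```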
Funding
The current standard model for funding science is that individuals or groups of scientists make written applications to a funding agency. A small panel of trusted individuals score the applications and then interview candidates before awarding funds to a small portion of applicants. Aside from creating bottlenecks that lead to sometimes years of waiting time between applying for and receiving a grant, this model is known to be highly vulnerable to the biases, self-interests and politics of the review panel.
Studies have shown that grant review panels do a poor job of selecting high-quality proposals as the same proposals given to different panels have wildly different outcomes. As funding has become more scarce, it has concentrated into a smaller pool of more senior researchers with more intellectually conservative projects. The effect has created a hyper-competitive funding landscape, entrenching perverse incentives and stifling innovation.
Web3 has the potential to disrupt this broken funding model by experimenting with different incentive models developed by DAOs and Web3 broadly. Retroactive public goods funding, quadratic funding, DAO governance and tokenized incentive structures are some of the Web3 tools that could revolutionize science funding.
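As a rough illustration of the quadratic funding mechanism mentioned above, the sketch below implements the standard matching rule (each project's match is proportional to the square of the sum of the square roots of its individual contributions), which systematically favors projects with many small donors over projects with a single large donor. This is a simplified, hypothetical example; production systems such as Gitcoin rounds add identity checks, pairwise caps, and other constraints that are omitted here.

```python
import math

def quadratic_match(contributions: dict[str, list[float]], matching_pool: float) -> dict[str, float]:
    """Split a matching pool across projects using the quadratic funding rule.

    `contributions` maps a project name to the list of donations it received.
    A project's raw match is (sum of sqrt(donation))^2 minus the donations
    themselves; raw matches are then scaled to exactly exhaust the pool.
    """
    raw = {}
    for project, donations in contributions.items():
        sum_sqrt = sum(math.sqrt(d) for d in donations)
        raw[project] = max(sum_sqrt ** 2 - sum(donations), 0.0)

    total_raw = sum(raw.values())
    if total_raw == 0:
        return {p: 0.0 for p in contributions}
    return {p: matching_pool * r / total_raw for p, r in raw.items()}

# Example: 100 donors giving 1 each attract far more matching
# than a single donor giving 100, even though the totals are equal.
pools = {
    "open-dataset": [1.0] * 100,
    "single-backer": [100.0],
}
print(quadratic_match(pools, matching_pool=1000.0))
```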
IP ownership and development
Intellectual property (IP) is a big problem in traditional science: from being stuck in universities or unused in biotechs, to being notoriously hard to value. However, ownership of digital assets (such as scientific data or articles) is something Web3 does exceptionally well using non-fungible tokens (NFTs).
In the same way that NFTs can pass revenue for future transactions back to the original creator, you can establish transparent value attribution chains to reward researchers, governing bodies (like DAOs), or even the subjects whose data is collected.
IP-NFTs can also function as a key to a decentralized data repository of the research experiments being undertaken, and plug into NFT and DeFi financialization (from fractionalization to lending pools and value appraisal). They also allow natively on-chain entities such as DAOs like VitaDAO to conduct research directly on-chain. The advent of non-transferable "soulbound" tokens may also play an important role in DeSci by allowing individuals to prove their experience and credentials linked to their Ethereum address.
Data storage, access and architecture
Scientific data can be made vastly more accessible using Web3 patterns, and distributed storage enables research to survive cataclysmic events.
The starting point must be a system accessible by any decentralized identity holding the proper verifiable credentials. This allows sensitive data to be securely replicated by trusted parties, enabling redundancy and censorship resistance, reproduction of results, and even the ability for multiple parties to collaborate and add new data to the dataset. Confidential computing methods like compute-to-data provide alternative access mechanisms to raw data replication, creating Trusted Research Environments for the most sensitive data. Trusted Research Environments have been cited by the NHS as a future-facing solution to data privacy and collaboration by creating an ecosystem where researchers can securely work with data on-site using standardized environments for sharing code and practices.
Flexible Web3 data solutions support the scenarios above and provide the foundation for truly Open Science, where researchers can create public goods without access permissions or fees. Web3 public data solutions such as IPFS, Arweave and Filecoin are optimized for decentralization. dClimate, for example, provides universal access to climate and weather data, including from weather stations and predictive climate models.
Get involved
Explore projects and join the DeSci community.
- DeSci.Global: global events and meetup calendar
- Blockchain for Science Telegram
- Molecule: fund and get funded for your research projects
- VitaDAO: receive funding through sponsored research agreements for longevity research
- ResearchHub: post a scientific result and engage in a conversation with peers
- LabDAO: fold a protein in-silico
- dClimate API: query climate data collected by a decentralized community
- DeSci Foundation: DeSci publishing tool builder
- DeSci.World: one-stop shop for users to view and engage with decentralized science
- Fleming Protocol: open-source data economy that fuels collaborative biomedical discovery
- OceanDAO: DAO-governed funding for data-related science
- Opscientia: open decentralized science workflows
- Bio.xyz: get funded for your biotech DAO or DeSci project
- Active Inference Lab
- CureDAO: community-owned precision health platform
- IdeaMarkets: enabling decentralized scientific credibility
- DeSci Labs

We welcome suggestions for new projects to list - please look at our listing policy to get started!
Further reading
- DeSci Wiki by Jocelynn Pearl and Ultrarare
- A guide to decentralized biotech by Jocelynn Pearl for a16z future
- The case for DeSci
- Guide to DeSci
- Decentralized science resources
- Molecule's Biopharma IP-NFTs - A Technical Description
- Building Trustless Systems of Science by Jon Starr
- The Emergence of Biotech DAOs
- Paul Kohlhaas - DeSci: The Future of Decentralized Science (podcast)
- An Active Inference Ontology for Decentralized Science: from Situated Sensemaking to the Epistemic Commons
- DeSci: The Future of Research by Samuel Akinosho
- Science Funding (Epilogue: DeSci and new crypto primitives) by Nadia
- Decentralisation is Disrupting Drug Development
Videos
- What's Decentralized Science?
- Conversation between Vitalik Buterin and the scientist Aubrey de Grey about the intersection of longevity research and crypto
- Scientific Publishing Is Broken. Can Web3 Fix It?
- Juan Benet - DeSci, Independent Labs, & Large Scale Data Science
- Sebastian Brunemeier - How DeSci Can Transform Biomedical Research & Venture Capital
Link:https://ethereum.org/en/desci/
-
@ 9ecbb0e7:06ab7c09
2023-07-31 03:45:24A Cuban citizen was detained in Cancún after allegedly stealing three bags of soup from a supermarket. The suspect said he had lost his job in the hotel zone ten days earlier and stated that he acted out of hunger. The situation triggered a large security deployment in supermanzana 512, at the corner of Nichupté and Chac Mool.
The officers, from the Quintana Roo state Secretariat of Citizen Security, mobilized quickly after receiving the alert about the theft at the Soriana store. Images shared on social media show that at least three officers and two patrol cars arrived at the scene.
Alerta Quintana Roo, a local news outlet, did not justify the theft, but used the incident to question how the police respond to other crimes in the city.
While cases of extreme violence and insecurity linked to drug trafficking are not properly attended to, in less serious offenses two or more patrol cars show up to detain a single person.
This particular case highlights the difficulties some Cuban migrants face outside their home country, and how the lack of employment and resources can lead to desperate acts.
It is important to note that this incident does not reflect the situation of all Cubans abroad, since many of them have found success and prosperity in their new homes.
Local authorities did not reveal the name of the detainee, nor did they provide information about his immigration status. If he lacks the proper documentation for legal residence in Mexico, he could be handed over to the immigration authorities, provided his legal case does not escalate into something more serious.
In the comments on the post, some Mexicans questioned the officers' attitude and expressed words of support for the desperate Cuban. "Shame on the police, and even more on whoever called them; with so many murderers on the streets, they rush to arrest a poor man who is hungry," one user said.
Another added: "They should have helped him by paying for the soups, or been more empathetic with him. Remember that the world turns; have you already forgotten everything we went through during the pandemic?" said internet user Ximena Pablo.
-
@ 1bc70a01:24f6a411
2023-07-21 12:03:38The concept of value for value is one where information yearns to flow freely, transactions should be voluntary, unlimited and direct. In the V4V model, people pay what something is worth to them.
Sounds great. On paper. There are some issues…
Free sucks
At least, that’s the perception. People don’t assign much value to free. Ask anyone who has ever run any business and has not suggested a value for a product or service, and they’ll tell you that they earned far less than when charging for the thing.
It’s true, some people will give a lot, some a little, and most none. Most - none. None.
Pricing is Signal
Pricing is a signal of desirability and quality. Of course, it is often incorrect and people manipulate pricing all the time. But for the most part, people don’t see much value in free. Unless a recommended price is offered, people will usually pay nothing. This is not a great model to thrive on if you spend years of your life acquiring knowledge and turning it into products that nobody ultimately buys. I have very personal experience with free. I’ve created and sold digital products and ran many pricing experiments myself. The highest priced products usually generated the most revenue. Surprise! The middle cost product (same product, just priced less) decimated the revenue stream. When set to 0 (even with a suggested minimum price), I generated almost no revenue at all.
None of this is surprising. Pricing acts as a psychological anchor. “You get what you pay for” is ingrained in our brains whether we think about it or not.
People are clueless
The issue with price is that most people don’t have a clue what anything is worth. The only time people have any rough idea of what they should pay for something is when they have already purchased that thing in the past. But, introduce something they have never before purchased and they won’t have a single clue about what to pay. Take for example a set of professional photos of you and your family. Unless you’ve been to a photo studio in the last 5 years, you probably won’t have a single clue what that package of photos is worth. Does that mean the product is worthless? Of course not, but people don’t know what to pay.
In a value for value model, the absence of price makes it super difficult to determine the value of anything. You may take some social cues from previous payments from other people, but this could backfire for the content creator.
Suppose I created a UI framework that saved developers hundreds of hours. In theory, I should be able to charge at least a few hours’ worth of value for this product. If the developer’s time is valued at $100/hour, a $200 price for a product that saves you $2000 worth of time seems very justifiable. Not only do you get to use it once, but you can re-use the product for ALL future projects and employment.
Now, remove the price and see what people pay. Absolutely nothing. You may have a few people who pay $200 voluntarily, but it’s highly unlikely. The vast majority will pay nothing, and some may “tip” in the $5-$60 range. Anything that approaches the $100 mark is seen as a purchase. Hey, I don’t make the rules, I just see what other founders figured out long ago and combine it with my own observations. Don’t kill the messenger.
Free is Expensive
If I am accurate in my assessment and recall my personal experiences accurately, then the majority of people who consume your value will do so for free. When that content is a product, you may end up spending a lot of time on supporting the thing that is not generating any revenue. You don’t want to be rude and ignore people so you’ll probably spend your valuable time answering questions and helping them troubleshoot issues. All of that time adds up. Startup founders who offer free tiers or near free tiers of services learn very quickly that free customers are the most painful and demanding. You are basically forced to charge just to avoid dealing with demanding people who expect everything for nothing.
Free is Noise
Price is not just a request for value; it acts as a feedback signal for future content. If you have no idea what people are paying for, it’s difficult to know whether what you create is worth anything. A situation where the vast majority of your content is consumed for free yields a lot of noise.
Well, why not focus on the people who pay? You certainly could, but it ends up being a tiny fraction of the sample size you could have had if you actually charged something up front.
Lack of forecasting
Businesses rely on predictable revenue. Forecasting is necessary for all sorts of decisions if you work with anyone but yourself. It helps with purchasing decisions (expenses) and with planning future products. Value for value makes it impossible to know what your revenue will be next month, as you have no idea whether everyone will pay nothing or a lot.
V4V could make you uncompetitive
In a model where one person charges a fixed price and the other relies on the goodwill of the people to “see the value” in their work, the person with predictable revenue will most likely win out in a competitive environment - enabling them to get ahead of you and your business. They will have an easier time planning further content / products and hiring people to scale the business even further.
It’s not all hopeless
That’s not to say that I don’t like the idea of value for value. Of course I only want people to pay if they find the thing useful. The issue is that people may not know the thing is useful until they’ve already acquired it. At that point who is going back to pay for the thing they already got for free? Few to none.
Value for value may work. For some.
I’m not saying value for value doesn’t work sometimes, for some people. It is entirely possible that a person earns a living on v4v transactions. However, I think for that to be true there may be other factors at play such as social standing, personal brand, influence, likability, status within a community. The vast majority of creators do not fall into this category and will just struggle.
I’m cautiously optimistic about V4V and hope it works out at scale. But as it stands, I have not seen much evidence that it actually pays the bills. Yes, there has been some support for podcasts on Fountain, but it is unclear whether it is as significant as, or more significant than, the traditional transaction model.
“Information is not scarce” is irrelevant
There’s some notion that information yearns to be free and cannot be scarce by nature. I think this may be a false argument from the start. When we purchase digital things, we are not paying for scarcity - it’s totally irrelevant. We pay for the experience and the feeling we get from that thing. In fact, the same is probably true for physical products (with the added benefit of personal sustenance). I don’t go into the grocery store to buy a dinner and fork over the money because it’s scarce. I pay because I’m hungry. There’s utility and there’s pleasure and fulfillment. If I’m having a dinner with friends, there’s also fun. Unless I am totally misunderstanding the argument, I’m not sure how it applies.
In Summary
- Value 4 value may work at scale, but remains to be seen
- It could be great fun money but not serious enough to pay the bills (for most of us)
- Sounds good on paper but we humans have our own ways of thinking about value and what it's worth
- May work well for people who build a personal brand or have status in a community
As always I look forward to your thoughts. Let me know if I’m overlooking something or should consider some point of view in more depth.
-
@ 57fe4c4a:c3a0271f
2023-07-30 22:54:53👥 Authors: Carla Kirk-Cohen ( nostr:npub17xugd458km0nm8edu8u2efuqmxzft3tmu92j3tyc0fa4gxdk9mkqmanw36 )
📅 Messages Date: 2023-07-27
✉️ Message Count: 1
📚 Total Characters in Messages: 698
Messages Summaries
✉️ Message by Carla Kirk-Cohen on 27/07/2023: Transcripts of specification meetings held every other Monday are now available at https://btctranscripts.com/lightning-specification, thanks to Gurwinder at Chaincode.
Follow nostr:npub1j3t00t9hv042ktszhk8xpnchma60x5kz4etemnslrhf9e9wavywqf94gll for full threads
-
@ 393c8119:75e43710
2023-07-31 09:35:44The corpus of scientific data is fragmented, access-controlled, and rapidly growing beyond the capacity of centralized services to maintain. Recent developments in peer-to-peer technology have made it possible to establish a permanent archive of scientific records that is open to all. In this article series, we dive deep into the cutting edge technology of decentralized file storage networks and offer potential paths forward for a collaborative decentralized science ecosystem. We also introduce OpSci Commons, an open-access knowledge commons sustainably designed to capture the value of scientific enterprise in a decentralized cloud services marketplace.
Knowledge, Who is it For?
The boundaries of knowledge have historically been limited by access to tools for observation and high quality data. The ability to make significant jumps forward in our understanding of the natural world used to belong to the privileged few.
Ptolemy had the Armillary Astrolabe and papyrus to record the Earthly boundaries of human understanding — boundaries that went unchallenged for over a millennium. Galileo had the Convex Objective and parchment to populate our universe with god-like spheres locked in cosmic coordination. Hubble used the power of the Hooker telescope to circumscribe an infinitely expanding horizon for all human knowledge, leaving behind a challenge for subsequent truth-seekers in a universe where anything was possible.
Distributed Knowledge, Anatomical Plate. Source: 1857 JG Heck
Until recently, only those that were part of an exclusive club of academics could obtain access to the instruments and troves of data required to take on outstanding challenges in science. Today, significant advancements in astronomy and physics are made possible by open collaboration and data sharing practices. The questions are too big, byzantine models too recursive, and engineering challenges too complex for even the most enlightened individual to single handedly solve. The horizon of our cumulative understanding of the universe only expands today because the doors to high quality datasets and the tools to work with them are open to everyone, everywhere.
Rich in Data, Poor in Wisdom
While the astronomy community has set the standard for collaborative open science practices, many fields are still rooted in the legacy practice of career advancement based on reputation and ego. It is difficult for many to see how we can move beyond such adversarially entrenched academic interests. However, the reality of the challenges facing modern science today will inevitably force a cultural revolution; this very paradigm shift is already occurring with the emergence of open access science data commons, journals, and free software. A digital explosion of data obtained from scientific observations of our natural world is generating more content than institutional infrastructure can maintain, store, and provide the tools to sift through.
Thousands of petabytes of valuable data and observations on human health, economic activity, social dynamics, and the universe and our impact on it are siloed in outdated storage systems. This data is inaccessible to search engines, stored in arcane schemas known only to a few, and likely never to be utilized. Over 80% of the raw scientific data collected in the 1990s is estimated to be lost forever due to deprecated technology and inadequate archival infrastructure (Wiener-Bronner 2013). Today, the likelihood of finding a dataset falls by 17% year-after-year, beginning three years after a paper is published (Vines et al. 2014). The practice of deliberately restricting access to scientific data limits our society’s rate of innovation precisely when we have never had so many problems that require scientific innovation to solve.
Decentralized file storage protocols offer solutions to this failing via content-addressable data, programmable incentives for data storage, provenance tracking, censorship resistance, and bandwidth that scales with global adoption. A peer-to-peer science data commons powered by these features may provide a resilient digital fabric that aligns a decentralized community of discovery around the most critical and challenging problems of today.
A Short History of Peer-to-Peer Content Networks
Peer-to-peer file sharing is as old as the internet. In fact, the predecessor to the internet as we know it, ARPANET, was strictly a peer-to-peer network when it was first booted up in 1969 (Paratii, 2017). Resilience to network degradation, high bi-directional bandwidth, information redundancy, aggregation of resources, and an intrinsic participatory nature are all merits that made distributed peer-to-peer networks a first-choice design amongst early internet architects and engineers. Many iterations of such direct information-sharing have appeared in the short history of the internet, some improvements, others dead-ends.
The emergence of public key cryptography in 1973 marked the beginning of identity protocols and verifiability of content through an ingenious key-pair signing system (Cocks 2001). For the first time, users on a network could trust a packet of information encrypted by a secret key if it could be uniquely decrypted by a key publicly posted by a known identity. Later, Ralph Merkle would invent the Merkle Tree in 1979 as a way of tracking the provenance of packets of information, paving the way for version control software such as git and svn (Merkle 1987). The synthesis of public key cryptography with Merkle Tree data structures would continue to drive innovation such as the emergence of blockchains, distributed computing, and consensus mechanisms that enhance resilience to attack and minimize fragmentation of information in distributed networks.
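For readers unfamiliar with the structure, here is a minimal Python sketch of a Merkle root: leaves are hashed, then paired and hashed again until a single digest remains, so changing any packet of data changes the root. This is a simplified illustration of the idea described above; real systems such as git or Bitcoin wrap the same principle in their own domain-specific encoding rules.

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Compute a Merkle root by repeatedly hashing pairs of nodes.

    Any change to any leaf changes the root, which is what lets a short
    digest commit to the provenance of an arbitrarily large set of records.
    """
    if not leaves:
        return sha256(b"")
    level = [sha256(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])  # duplicate the last node on odd levels
        level = [sha256(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

print(merkle_root([b"commit 1", b"commit 2", b"commit 3"]).hex())
```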
One of the most famous examples of distributed networks, Napster, connected peers through a centralized indexing server, which was later shut down by authorities following a lawsuit by Metallica for copyright infringement in 2001. The introduction of the Distributed Hash Table (DHT) revolutionized the design of peer-to-peer networks, unlocking higher tiers of decentralization and making the networks more resilient to content moderation and censorship. DHTs were initially used to help nodes on peer-to-peer networks remember each other’s locations. In the early-internet era, this allowed peer-to-peer networks to scale in a truly decentralized way because they did not need to rely on centralized servers like Napster did. The extremely popular peer-to-peer network BitTorrent was one of the first networks to utilize DHTs.
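A hedged sketch of the core DHT idea, loosely modeled on Kademlia's XOR metric: every node and every content key is mapped into the same numeric ID space, and a key is stored on, and looked up from, the nodes whose IDs are closest to it by XOR distance. Real DHTs add routing tables, iterative lookups, and replication; this toy version (with made-up peer names) only demonstrates the distance rule.

```python
import hashlib

def dht_id(name: str) -> int:
    """Map a node name or content key into a shared 160-bit ID space."""
    return int.from_bytes(hashlib.sha1(name.encode()).digest(), "big")

def closest_nodes(key: str, nodes: list[str], k: int = 2) -> list[str]:
    """Return the k nodes whose IDs are nearest to the key by XOR distance."""
    target = dht_id(key)
    return sorted(nodes, key=lambda n: dht_id(n) ^ target)[:k]

peers = ["peer-alpha", "peer-beta", "peer-gamma", "peer-delta"]
print(closest_nodes("some-shared-file", peers))
```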
The Bitcoin Codebase Fingerprint. Source: Amelia Wattenberger
In 2009, Bitcoin entered the scene (Nakamoto 2008). While the peer-to-peer networks prior to Bitcoin allowed users to easily and quickly transfer data to each other, they were not engineered to be tamper-proof records of cryptographically verifiable exchanges. Events can only be appended to the Bitcoin ledger if the node submitting the transactions proves that they have done a certain amount of computational work within a short time window. Bitcoin is the first instance of a peer-to-peer network with a single global state that defines truth for the purposes of the network at consensus-in this case, the transfer of a cryptographic token representing economic value.
The concept of cryptographic proof for verifying events in a distributed network paved the way for accelerated innovation in peer-to-peer technology. Interplanetary File System (IPFS), a peer-to-peer file sharing protocol, synthesizes key advancements in decentralized computation such as DHTs and Merkel Trees with cryptographic proof to provide a base layer for a permanent internet records archive. IPFS makes it possible, for the very first time, for information to truly belong to a web commons with intrinsic resistance to geographical censorship, attacks on data integrity through content revision, and bandwidth bottlenecks imposed by centralized service providers.
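The load-bearing idea in IPFS is content addressing: a file is referred to by a digest of its bytes rather than by a location, so any peer can verify that what it received is exactly what was asked for. The sketch below shows the principle with a plain SHA-256 digest; note that real IPFS CIDs wrap the digest in a multihash/multibase encoding and chunk large files into a DAG, so the hex string here is not an actual CID.

```python
import hashlib

def content_address(data: bytes) -> str:
    """Derive an address from the content itself (simplified; not a real IPFS CID)."""
    return hashlib.sha256(data).hexdigest()

def verify(address: str, data: bytes) -> bool:
    """Any peer can check that retrieved bytes match the requested address."""
    return content_address(data) == address

paper = b"raw observations, methods, and results of an experiment"
addr = content_address(paper)
print(addr)
print(verify(addr, paper))                  # True: content matches its address
print(verify(addr, paper + b" tampered"))   # False: any change breaks the link
```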
Current State of Cloud Storage
The early 2000s saw the emergence of centralized cloud service providers that would become the gate-keepers for content on the internet. Today, the cloud storage market is dominated by very few players. Amazon, Microsoft, and Google control over half the market, and Amazon alone controls a third of the market, according to a Canalysis (2020) estimate. Amazon reached its near-monopoly position by solving critical scalability problems of the early internet, but by reaching this position, it created a new set of problems, all of which stem from centralization. The main problems are inefficient resource allocation, data fragmentation across isolated repositories, lack of privacy and security, and unnecessarily high costs. Overall, the cloud service providers control the terms that govern the data they store, making them an arbiter for access to knowledge.
A Taxonomy of Control Schemas Employed by Big Tech Corporations. Source: Manu Cornet
Amazon has recently begun offering enticing data storage deals for scientists to further increase the size and depth of their content moat (Amazon, 2018). Analysts speculate that Amazon may increase the value of their services if they can compile massive amounts of high quality interoperable datasets from industry, academic, and government researchers (Goldfein and Nguyen, 2018). For example, the Allen Brain Observatory has struck an agreement with Amazon to store dozens of terabytes of valuable neuroimaging observations (Allen Brain Institute, 2018).
While Amazon offers free storage for data upload, egress from their servers often incurs a heavy fee, sometimes trapping data within their expansive computing centers and making Amazon the de facto owner of publicly funded research. Community backlash appears to have budged Amazon enough to consider a 15% waiver in monthly cloud storage costs for “eligible” research institutions. It appears that Amazon has taken a page from the science publishing industry and identified access to knowledge as another lucrative component of their increasingly sprawling cloud business model. Even so, a counter-current to the trend of centralization is building momentum and stands to disrupt the monolith of control that big tech companies have erected in the last two decades.
Looking Forward to An Open Web
As part of this counter-current, IPFS has led to the emergence of many additional technological innovations powering the decentralized web. In this article series, we cover the major decentralized data storage protocols and discuss their potential to serve as an underlying fabric for a decentralized science data commons. We begin with a deep dive into the history, mechanics, and popular applications behind IPFS.
Join the Decentralized Open Science Movement
Does the idea of a free, open, internet of science ring a resonant chord with you? Consider joining the Opscientia community to learn, connect, and collaborate with others building a commons for co-discovery.
Articles in This Series
- Decentralized Content Networks for a Permanent Science Data Commons: IPFS
- Engineering Incentives for Data Storage as a Commodity: Filecoin
- A Permanent Web of Linked Data: Arweave
- Peer-to-Peer Storage without a Blockchain: Storj
- One of the First Decentralized Cloud Storage Platforms: Sia
- The World Computer’s Hard Drive: Swarm
- Open, Free, and Automated Pipelines for Permanently Archiving Massive Scientific Datasets
- OpSci Commons: A Decentralized and Autonomous Knowledge Commons
References
Allen Brain Institute. (2018, August 9). Neuroscience Data Joins the Cloud. Retrieved November 21, 2021, from https://alleninstitute.org/what-we-do/brain-science/news-press/articles/neuroscience-data-joins-cloud
Amazon. (2018, July 12). New AWS Public Datasets Available from Allen Institute for Brain Science, NOAA, Hubble Space Telescope, and Others. Retrieved November 12, 2021.
Canalysis. (2020, April 29). Global cloud services market Q1 2021. Retrieved November 27, 2021, from https://www.canalys.com/newsroom/global-cloud-market-Q121
Cocks, C. (2001, December). An identity based encryption scheme based on quadratic residues. In IMA international conference on cryptography and coding (pp. 360–363). Springer, Berlin, Heidelberg.
Goldfein, J., & Nguyen, I. (2018, March 27). Data is not the new oil. TechCrunch. Retrieved November 20, 2021.
Merkle, R. C. (1987, August). A digital signature based on a conventional encryption function. In Conference on the theory and application of cryptographic techniques (pp. 369–378). Springer, Berlin, Heidelberg.
Paratii. (2017, October 25). A Brief History of P2P Content Distribution, in 10 Major Steps. Medium. Retrieved November 20, 2021.
Nakamoto, S. (2008). Bitcoin: A peer-to-peer electronic cash system. Decentralized Business Review, 21260.
Vines, T. H., et. al. (2014). The availability of research data declines rapidly with article age. Current biology, 24(1), 94–97.
Wiener-Bronner, D. (2013, December 23). Most Scientific Research Data From the 1990s Is Lost Forever. The Atlantic. Retrieved November 13, 2021.
By:Shady El Damaty, Ph.D. Link:https://pulse.opsci.io/rich-in-data-poor-in-wisdom-science-needs-a-decentralized-data-commons-98c7ffdb56a1
-
@ a4a6b584:1e05b95b
2023-07-21 01:51:34Is light really a constant? In this blog post by Adam Malin, we theorize about redshift caused not by expanding space, but by changes in the zero point energy field over cosmic time.
I was inspired to write this post after reading a paper written in 2010 by Barry Setterfield called Zero Point Energy and the Redshift. If you want a very deep dive into this concept, I recommend you check it out. Here is the link.
I recently read an intriguing paper that puts forth an alternative explanation for the redshift we observe from distant galaxies. This paper, published in 2010 by Barry Setterfield, proposes that redshift may be caused by changes over time in the zero point energy (ZPE) field permeating space, rather than cosmic expansion. In this post, I'll summarize the key points of this theory and how it challenges the conventional framework.
An important distinction arises between Stochastic Electrodynamics (SED) and the more mainstream Quantum Electrodynamics (QED) framework. SED models the zero point field as a real, random electromagnetic field that interacts with matter, possessing observable physical effects. In contrast, QED considers virtual particles and zero point energy as mathematical constructs that do not directly impact physical systems. The zero point energy discussed in this proposed mechanism builds upon the SED perspective of modeling the quantum vacuum as a dynamic background field that can exchange energy with matter. SED provides a means to quantitatively analyze the redshift effects hypothetically caused by changes in the zero point field over cosmic time.
The standard model of cosmology attributes redshift to the Doppler effect - light from distant galaxies is stretched to longer wavelengths due to those galaxies receding away from us as space expands. Setterfield's paper argues that the data actually better supports a model where the speed of light was higher in the early universe and has decayed over time. This would cause older light from farther galaxies to be progressively more redshifted as it travels through space and time.
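For reference, the observable being discussed is the standard dimensionless redshift, defined from the observed and emitted wavelengths; under the conventional Doppler interpretation it is approximately proportional to the recession velocity for speeds well below c. Setterfield's proposal keeps the same measured z but attributes its growth with distance to a changing ZPE rather than to recession:

```latex
z = \frac{\lambda_{\mathrm{obs}} - \lambda_{\mathrm{emit}}}{\lambda_{\mathrm{emit}}},
\qquad
z \approx \frac{v}{c} \quad \text{for } v \ll c \ \text{(non-relativistic Doppler)}
```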
Setterfield cites historical measurements of the speed of light by scientists like R.T. Birge that showed systematic decreases over time, contrary to our modern assumption of constancy. He argues that while experimental improvements have reduced uncertainties, residual trends remain even in recent high-precision measurements.
A key part of Setterfield's proposed mechanism is that the ZPE interacts with subatomic particles to give them mass and stabilize atomic orbits. As the ZPE has increased in strength over cosmic history, it has caused a contraction in electron orbits within atoms. This results in electron transitions emitting bluer light over time. Thus, looking out into space is looking back to earlier epochs with weaker ZPE and redder light.
This theory raises some thought-provoking questions. For instance, have we misinterpreted redshift as definitive evidence for an expanding universe? Might a static universe with slowing clocks account for observations we attribute to dark matter and dark energy? However, changing existing scientific paradigms is extremely challenging. Let's examine some potential counterarguments:
- The constancy of the speed of light is a fundamental pillar of modern physics backed by extensive experimental verification. This theory would require overturning tremendous empirical support. Setterfield argues that Lorentz and others kept an open mind to variations in c early on. While unexpected, new evidence could prompt another evolution in perspective.
- The current Lambda-CDM cosmological model based on general relativity matches a wide array of observations and predicts phenomena like the cosmic microwave background. But it also has issues like the need for speculative dark matter and dark energy. An alternate cosmology with varying c may provide a simpler unifying explanation.
- Astrophysical observations like supernova brightness curves seem to confirm expanding space. But these interpretations assume constancy of c and other principles that this hypothesis challenges. The cosmic microwave background, for instance, could potentially be re-interpreted as a cosmological redshift of earlier light.
Read more from Adam Malin at habitus.blog.
Adam Malin
You can find me on Twitter or on Nostr at
npub15jnttpymeytm80hatjqcvhhqhzrhx6gxp8pq0wn93rhnu8s9h9dsha32lx
value4value Did you find any value from this article? Click here to send me a tip!
-
@ a4a6b584:1e05b95b
2023-07-21 01:44:57The French sociologist and philosopher Jean Baudrillard offered profound critiques of modern society, technology, media and consumer culture through his work. Often associated with postmodernism and post-structuralism, Baudrillard challenged established notions of truth, reality and the power dynamics between humans and the systems they create.
In this blog post, we'll explore some of Baudrillard's most notable concepts and theories to understand his influential perspectives on the contemporary world.
Simulacra and Simulation
One of Baudrillard's most well-known works is "Simulacra and Simulation." In the book, Baudrillard argues that our society has replaced reality and meaning with symbols, images and signs that he calls “simulacra.”
He believed human experience is now a simulation of reality rather than reality itself. Unlike simple imitations or distortions of reality, simulacra have no connection to any tangible reality. They are mere representations that come to define our perception of existence.
To illustrate this concept, Baudrillard outlined three “orders” of simulacra:
- First Order: Copies or imitations that maintain a clear link to the original, like photographs.
- Second Order: Distortions of reality that mask the absence of an original, like heavily edited advertising images.
- Third Order: Simulations with no original referent that become “hyperreal,” like video game worlds.
Per Baudrillard, much of postmodern culture is made up of third-order simulacra. Media representations and simulations hold more meaning than reality, creating a “hyperreality” driven by consumerism and spectacle.
Hyperreality
Building on his theory of simulacra, Baudrillard introduced the concept of “hyperreality” to describe how representations and simulations have come to replace and blur boundaries with reality.
In hyperreality, the lines between real life and fictional worlds become seamless. Media, technology and advertising dominate our perception of reality, constructing a simulated world that appears more real and appealing than actual reality.
For example, the carefully curated lives presented on social media often appear more significant than people’s daily lived experiences. Additionally, idealized representations of reality presented in movies, TV and advertising shape our expectations in ways that real life cannot match.
According to Baudrillard, we increasingly interact with these fabricated representations and hyperreal signs over direct experiences of reality. The simulations and models come to define, mediate and construct our understanding of the world.
Consumer Society
In “The Consumer Society,” Baudrillard argues that traditional institutions like family, work and religion are losing significance to the prevailing values and rituals of consumerism.
Rather than simply meeting needs, consumption has become a defining way of life, shaping identities and social belonging. Non-essential consumer goods and experiences drive the quest for happiness, novelty and status.
Baudrillard notes that consumer society fosters a constant pressure to consume more through an endless pursuit of new products, trends and experiences. This “treadmill of consumption” fuels perpetual dissatisfaction and desire.
Additionally, he highlights how objects take on symbolic value and meaning within consumer culture. A luxury car, for example, denotes wealth and status beyond its functional utility.
Overall, Baudrillard presents a critical perspective on consumerism, showing how it has come to dominate modern society on psychological and cultural levels beyond simple economic exchange.
Symbolic Exchange
In contrast to the materialistic values of consumer society, Baudrillard proposes the concept of “symbolic exchange” based on the exchange of meanings rather than commodities.
He suggests symbolic exchange has more power than material transactions in structuring society. However, modern culture has lost touch with the traditional symbolic exchanges around fundamental existential themes like mortality.
By marginalizing death and pursuing endless progress, Baudrillard argues that society loses balance and restraint. This denial leads to excessive consumption and constant pursuit of unattainable fulfillment.
Fatal Strategies
Baudrillard’s theory of “fatal strategies” contends that various systems like technology and consumerism can take on lives of their own and turn against their creators.
Through proliferation and exaggeration, these systems exceed human control and impose their own logic and consequences. For instance, while meant to be tools serving human needs, technologies can shape behavior and exert control in unanticipated ways.
Baudrillard saw this reversal, where the object dominates the subject who created it, as an impending “fatal” danger of various systems reaching a state of autonomous excess.
This provides a cautionary perspective on modern society’s faith in perpetual progress through technology and constant economic growth.
Conclusion
Through groundbreaking theories like simulation, hyperreality and symbolic exchange, Jean Baudrillard provided deep critiques of modern society, consumer culture, media and technology. His work dismantles assumptions about reality, history and human agency vs. systemic control.
Baudrillard prompts critical reflection about the cultural, psychological and philosophical implications of postmodern society. His lasting influence encourages re-examining our relationships with consumerism, technology and media to navigate the complex intricacies of the contemporary world.
Read more from Adam Malin at habitus.blog.
Adam Malin
You can find me on Twitter or on Nostr at
npub15jnttpymeytm80hatjqcvhhqhzrhx6gxp8pq0wn93rhnu8s9h9dsha32lx
value4value Did you find any value from this article? Click here to send me a tip!
-
@ a4a6b584:1e05b95b
2023-07-21 01:22:42A comprehensive exploration of Metamodernism, its principles and influences across art, design, music, politics, and technology. This piece delves into how Metamodernism paves the way for a future that harmonizes grand narratives and individual nuances, and how it could shape the cultural, economic, and digital landscape of the future.
Introduction
Welcome to the metamodern era. A time where we find ourselves caught in the flux of digital revolution, cultural shifts, and a rekindling of grand narratives, all while still carrying the skepticism and irony of the postmodern era.
As an artist, I find myself standing at the crossroads of these cultural and technological shifts, with Metamodernism serving as my compass. I'd like to share my thoughts, experiences, and observations on what Metamodernism is, and how I believe it will shape our future.
This journey began as an exploration into the potential future of graphic design, taking cues from an evolving cultural paradigm. I quickly realized that Metamodernism was not just about creating compelling visual narratives, but it had potential to influence every aspect of our lives, from politics and technology to our individual experiences and collective narratives.
Metamodernism, in essence, is about balancing the best of what came before – the grand narratives and optimism of modernism, and the skepticism and relativism of postmodernism – while forging ahead to create a new, coherent cultural reality.
So let's embark on this journey together to understand the metamodern era and its impact on our culture, our technology, and our art. Let's delve into Metamodernism.
Understanding Postmodernism and Metamodernism
To appreciate the metamodern, we must first unpack the concept of postmodernism. Rooted in skepticism, postmodernism came into being as a reaction to modernism’s perceived failures. Where modernism sought universality, believing in grand narratives and objective truths, postmodernism reveled in fragmentation and the subjective nature of reality. It questioned our institutions, our ideologies, and our grand narratives, challenging the very structure upon which our society was built.
However, as we moved deeper into the postmodern era, a palpable sense of fatigue began to set in. The endless questioning, the constant fragmentation, and the cynical deconstruction of everything began to take a toll. While postmodernism provided valuable insights into the limitations of modernist thinking, it also left us feeling disconnected and adrift in a sea of relativism and irony.
This is where Metamodernism steps in. As the cultural pendulum swings back from the fragmentation of postmodernism, it does not return us to the naive grand narratives of modernism. Instead, Metamodernism synthesizes these seemingly contradictory ideologies, embracing both the skepticism of the postmodern and the optimism of the modern.
In essence, Metamodernism is a search for meaning and unity that still acknowledges and respects the complexity of individual experience. It recognizes the value in both grand narratives and personal stories, aspiring to create a more cohesive cultural discourse.
The metamodern era is a new dawn that challenges us to be both skeptical and hopeful, to engage in dialogue and debate, and to harness the opportunities that lie ahead. It's not about choosing between the grand narrative and the individual, but rather finding a way to harmoniously integrate both.
Metamodernism in Politics
In recent years, we've seen political movements around the world that embody the elements of Metamodernism. On one hand, there's a call for a return to grand narratives and nostalgia for perceived better times, while on the other, there's a desire to dissolve hierarchical structures and traditional norms in favor of individual freedom and recognition.
A case in point is the political era marked by the rise of Donald Trump in the United States. Trump's slogan, "Make America Great Again," was a nod to the modernist ideal of a grand narrative - a return to American exceptionalism. It was an appeal to a past time when things were, as perceived by some, better.
Meanwhile, reactions on the left have taken a different trajectory. Movements to decentralize power, break down traditional norms, and encourage more individual subjectivity echo postmodern sentiments.
Metamodernism enables us to interpret these political movements from a fresh perspective. It does not discard the grand narrative nor does it plunge into fragmentation. Instead, it presents a narrative of balance and synthesis, oscillating between the modernist and postmodernist perspectives, and offering a way forward that is nuanced, respectful of individual experiences, and yet oriented toward a shared goal for the culture and people.
In the realm of politics, the metamodern era isn't about swinging to one extreme or another. Instead, it suggests a way to reconcile the polarity and move forward, synthesizing the best of both perspectives into a more nuanced, inclusive future. This is the metamodern political landscape, complex and dynamic, where grand narratives and individual stories coexist and inform one another.
The Metamodernist Canvas in Graphic Design
Now, let's look at the impact of Metamodernism on graphic design, a realm where I live and breathe every day. Here, Metamodernism offers a fresh perspective, providing a way to express the complexity of the world we live in and creating a narrative that is both universal and individual.
Traditional graphic design was about simplicity and clarity. It was built on the modernist principles of functionalism and minimalism, where form follows function. Postmodern design, however, sought to question these principles, embracing complexity, contradiction, and the power of the image.
As a graphic designer in the metamodern era, I find myself torn between these two extremes. On one hand, I appreciate the clarity and simplicity of modernist design. On the other, I am captivated by the dynamism and complexity of postmodern aesthetics.
The solution, I believe, lies in the synthesis offered by Metamodernism. Metamodernist design does not reject the past, but rather builds on it. It blends the simplicity of modern design with the vibrancy of postmodern aesthetics, creating something that is both familiar and fresh.
The Metamodernist canvas is a space where contrasting ideas can coexist and inform each other. It is a space where the universal and the individual intersect, creating narratives that resonate on multiple levels. It is a space where design can play a role in building a more cohesive and well integrated society.
The challenge for designers in the metamodern era is to create designs that reflect this complexity and nuance, designs that speak to both the individual and the collective, designs that challenge, inspire, and unite. It's a tall order, but it's a challenge that, as designers, we are ready and excited to embrace.
Liminal Spaces and the Visual Language of Metamodernism
A pivotal concept within the Metamodernist philosophy is that of the "liminal space" - an in-between space, where transformation occurs. These spaces, often associated with uncertainty, dislocation, and transition, have become particularly poignant in recent times as we grappled with the global impact of COVID-19.
Within this context, we've all had a shared experience of liminality. Offices, parks, and public spaces - once bustling with activity - suddenly became eerily quiet and deserted. These images have since been ingrained in our collective memory, symbolizing a profound shift in our way of life.
From a visual perspective, these liminal spaces offer a unique canvas to create Metamodernist narratives. Picture a 3D render of an empty office space, serving as a backdrop for a fusion of past and future aesthetics, where classical works of art - subtly altered - coexist with modern elements. Consider the emotional impact of a low-resolution Mona Lisa or a melting clock a la Salvador Dalí set against the familiar concrete reality of the modern workspace.
This use of liminal space is not just a stylistic choice. It's a nod to our shared experiences, an acknowledgment of the transitions we are going through as a society. It's a way of showing that while we live in an era of immense change and uncertainty, we are also capable of creating new narratives, of finding beauty in the unfamiliar, and of moving forward together.
The challenge in Metamodernist design is to create a visual language that resonates with our collective experiences, that brings together the past and the future, the familiar and the strange, and that stimulates thought, dialogue, and connection. And that, I believe, is where the true power of Metamodernist design lies.
Metamodernism in Art, Music, and Memes
Just as in politics and graphic design, Metamodernism manifests itself in various facets of culture, from art and music to internet memes. This wide-ranging influence attests to the universality of Metamodernist thinking and its ability to encompass and unify diverse aspects of human experience.
In visual arts, consider Banksy's elusive street art, which often blends irony and sincerity, public space and private sentiment, modern graffiti techniques and traditional painting styles. In music, take the example of Kanye West's album "Jesus is King," which fuses gospel traditions with hip-hop sensibilities, blurring the line between secular and religious, the mainstream and the fringe.
Meanwhile, the internet meme culture, characterized by its oscillation between irony and sincerity, absurdity and poignancy, chaos and order, is perhaps one of the most profound expressions of Metamodernism. Memes like "This is fine," a dog calmly sitting in a burning room, epitomize the Metamodernist spirit by acknowledging the complexities and contradictions of modern life while also seeking to find humor and connection within them.
Even the trend of remixing adult rap with kids shows can be seen as Metamodernist. It juxtaposes the mature themes of rap music with the innocence of children's entertainment, resulting in a work that is both familiar and disorienting, humorous and thought-provoking.
In all these instances, Metamodernist works draw from the past and present, high culture and popular culture, the sacred and the profane, to create experiences that are multilayered, dynamic, and rich in meaning. They acknowledge the complexity and diversity of human experience, yet also aspire to forge connections, provoke thought, and inspire change.
The Rise of Bitcoin and Metamodernist Economics
The rise of Bitcoin - the world's first decentralized digital currency - is a prime example of Metamodernist influence in economics. Bitcoin incorporates elements from both modernist and postmodernist economic theories, yet transcends them by creating a novel economic system that has never been seen before.
On one hand, Bitcoin harks back to the modernist ideal of hard money. It revives the principles of scarcity and predictability that underpinned the gold standard, a system that many believe led to stable, prosperous economies. On the other hand, Bitcoin's design is rooted in postmodern principles of decentralization and disintermediation, disrupting traditional economic hierarchies and structures.
But Bitcoin isn't just a fusion of modern and postmodern economics. It goes a step further by incorporating elements of Metamodernist thinking. Bitcoin's design encourages a sincere, cooperative approach to economic interaction. Its transparent, tamper-proof ledger (blockchain) promotes trust and collaboration, discourages deceit, and enables all participants, no matter how big or small, to verify transactions independently.
Moreover, Bitcoin is a grand narrative in itself - a vision of a world where economic power is not concentrated in the hands of a few, but distributed among many. At the same time, it acknowledges the individuality and diversity of its participants. Each Bitcoin user has a unique address and can transact freely with anyone in the world, without the need for a middleman.
Bitcoin's rise offers a glimpse into what a Metamodernist economic system might look like - one that combines the best aspects of modern and postmodern economics, while also adding a new layer of trust, cooperation, and individual freedom.
The Impact of Urbit and Metamodernist Computing
Urbit symbolizes a compelling manifestation of Metamodernist ideology within the realm of technology. This unique operating system revolutionizes the individual's interaction with the digital world, intrinsically mirroring the principles of Metamodernism.
In contrast to the postmodern complexities that plague the current internet – a web characterized by surveillance capitalism, privacy invasion, and data centralization – Urbit leans towards a modernist vision. It champions the idea of the internet as a streamlined, intuitive tool, but concurrently it envisions something unprecedented: a digital landscape where each user not only owns their digital identity and data but also personally runs the infrastructure behind them.
The design philosophy of Urbit embodies a characteristic Metamodernist oscillation, as it traverses between elements of the past and the future, the familiar and the uncharted. It embraces the modernist simplicity reminiscent of early computing while concurrently advancing a futuristic concept of a personal server for every individual, in which they possess full sovereignty over their digital existence.
Urbit’s operating system and its programming languages, Nock and Hoon, employ a Kelvin versioning system. This system is designed to decrement towards zero instead of incrementing upwards, epitomizing the modernist pursuit of perfection and simplicity. Once the protocol reaches zero, an ideal state has been achieved and no further changes will be required. This, in essence, represents the modernist grand narrative embedded within Urbit’s design.
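As a rough illustration of the idea (not Urbit's actual code), Kelvin versioning can be thought of as an ordering in which lower numbers are newer and zero is final:

```python
# A minimal sketch of Kelvin versioning as described above: releases count
# *down* toward zero, so a "newer" release has a smaller number, and version 0
# means the specification is frozen forever. Purely illustrative.
def is_newer(candidate: int, current: int) -> bool:
    """In Kelvin versioning, lower numbers are newer; 0 is final."""
    return candidate < current

assert is_newer(candidate=411, current=412)       # 411K supersedes 412K
assert not is_newer(candidate=412, current=411)   # you never count back up
```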
In the wider narrative of Metamodernism, Urbit symbolizes a decentralized, user-centric digital future. It recognizes the individuality of each user and emphasizes their control over their digital persona and interactions.
The promise of a completely decentralized internet is a vision still in progress. Regardless, it offers crucial insights into how Metamodernist principles could potentially shape our digital future. It paints a picture of an equilibrium between grand narratives and individual nuances, encapsulating a collective digital aspiration, as well as personal digital realities.
Moving Towards a Metamodernist Future
The common thread running through the Metamodernist era is the simultaneous embrace of grand narratives and personal experiences, the oscillation between modernist and postmodernist ideals, and the sincere pursuit of a better future.
However, moving towards a Metamodernist future is not without challenges. The danger lies in taking Metamodernist principles to an extreme, where the balance between irony and sincerity, grand narratives and individual nuances, can be lost. It's vital to avoid the pitfalls of dogmatism and extremism that plagued previous cultural eras.
For instance, an overemphasis on grand narratives can lead to totalitarian tendencies, while an excessive focus on individual nuances can result in cultural fragmentation. Metamodernism's strength lies in its ability to reconcile these extremes, fostering a sense of shared purpose while acknowledging individual experiences and perspectives.
Similarly, the interplay of irony and sincerity, often seen in Metamodernist works, should not tip over into either pure cynicism or naive earnestness. The goal should be to create a dialectic, a conversation, a fusion that creates a new, more complex understanding.
As we move towards this future, we can use the tools at our disposal – from graphic design and art, to music, memes, Bitcoin, and Urbit – to explore and shape this Metamodernist narrative. By consciously adopting Metamodernist principles, we can construct a culture that is at once reflective of our individual experiences and representative of our collective aspirations. In doing so, we can pave the way for a future that truly encapsulates the complexity, diversity, and richness of the human experience.
A Future Infused with Metamodernism
Metamodernism offers a comprehensive cultural lens through which we can understand, critique, and navigate our world. It provides a potential pathway to a future where our grand collective narratives coexist harmoniously with our nuanced individual experiences.
In the creative world, artists and designers can become the torchbearers of this movement, integrating Metamodernist principles into their work. They can leverage the power of nostalgia, sincerity, irony, and innovation to create works that resonate with the complexities of the human condition, reflecting both shared experiences and personal journeys.
In the world of technology and economics, Metamodernist principles illuminate the path to a more decentralized, user-centric digital future, as embodied by Bitcoin and Urbit. These platforms highlight the value of individual autonomy within a collective system, creating a new narrative of economic and digital empowerment.
In the political realm, Metamodernism can help create a dialogue that is both encompassing and respectful of diverse perspectives. It advocates for a new kind of political discourse that eschews extreme polarization in favor of a nuanced conversation that acknowledges the complexities of our world.
In essence, the potential of Metamodernism lies in its capacity to weave a compelling tapestry of our collective human experience – one that is vibrant, complex, and teeming with diverse narratives. By understanding and embracing the principles of Metamodernism, we can co-create a future that truly reflects the dynamic interplay of our shared narratives and individual nuances.
In this promising future, we can all become active participants in the Metamodernist narrative, shaping a world that values both the grandeur of our collective dreams and the authenticity of our individual experiences. The future is not just something we move towards; it is something we actively create. And Metamodernism provides a powerful blueprint for this creation.
Read more from Adam Malin at habitus.blog.
Adam Malin
You can find me on Twitter or on Nostr at
npub15jnttpymeytm80hatjqcvhhqhzrhx6gxp8pq0wn93rhnu8s9h9dsha32lx
value4value Did you find any value from this article? Click here to send me a tip!
-
@ 57fe4c4a:c3a0271f
2023-07-30 22:54:53📝 Summary: The convo discusses the concept of multipath keysend, which allows for splitting payments into multiple parts. This adds complexity to keysend but can be useful in certain situations. ZmnSCPxj proposes a multipath payment protocol called "keysend" that enables this functionality. The protocol allows the receiver to claim the payment once all parts have arrived.
👥 Authors: • Olaoluwa Osuntokun ( nostr:npub19helcfnqgk2jrwzjex2aflq6jwfc8zd9uzzkwlgwhve7lykv23mq5zkvn4 ) • Thomas HUET ( nostr:npub1tcwr7j30p5q4sypnsw4arca9s2433s7wdcpt2z2sk7pkqfsntjds0pu5xp ) • Matt Morehouse ( nostr:npub1zgyy2j829vn4zuvhkgza7qe37knzdls4kuwt2k8cehwpjxn9duwqv5j4mf ) • Matt Corallo ( nostr:npub1e46n428mcyfwznl7nlsf6d3s7rhlwm9x3cmkuqzt3emmdpadmkaqqjxmcu ) • ZmnSCPxj ( nostr:npub1g5zswf6y48f7fy90jf3tlcuwdmjn8znhzaa4vkmtxaeskca8hpss23ms3l )
📅 Messages Date Range: 2023-07-27 to 2023-07-29
✉️ Message Count: 5
📚 Total Characters in Messages: 13928
Messages Summaries
✉️ Message by ZmnSCPxj on 27/07/2023: A scheme for creating a `keysend` protocol that allows for multipath payments is proposed, where the receiver can claim the payment once all parts have arrived.
✉️ Message by Thomas HUET on 28/07/2023: The motivation for multipath keysend is to allow for splitting payments into multiple parts, ensuring the receiver can only claim the payment once all parts have arrived. This adds complexity to keysend, but may be useful for certain use cases.
✉️ Message by Matt Morehouse on 28/07/2023: ZmnSCPxj proposes a scheme for a multipath payment protocol called "keysend" that allows for splitting payments into multiple parts.
✉️ Message by Olaoluwa Osuntokun on 29/07/2023: The email discusses a scheme for creating a multipath payment protocol using keysend, allowing the receiver to claim the payment once all parts have arrived.
✉️ Message by Matt Corallo on 29/07/2023: The author suggests implementing a multipath payment protocol for keysend, allowing the receiver to claim the payment once all parts have arrived.
Follow nostr:npub1j3t00t9hv042ktszhk8xpnchma60x5kz4etemnslrhf9e9wavywqf94gll for full threads
-
@ 57fe4c4a:c3a0271f
2023-07-30 22:52:21👥 Authors: aymeric at peersm.com ( nostr:npub1tm2gdrkzrts6uhjx0snassk2txhcjnrenq6nptw66jfdmu3x4e7s3r4u2v )
📅 Messages Date: 2023-07-27
✉️ Message Count: 1
📚 Total Characters in Messages: 2674
Messages Summaries
✉️ Message by aymeric at peersm.com on 27/07/2023: A concerned individual has raised a bug regarding "inscriptions" on the Bitcoin blockchain, suggesting they are spam and causing issues. They request a feature to reject inscriptions.
Follow nostr:npub15g7m7mrveqlpfnpa7njke3ccghmpryyqsn87vg8g8eqvqmxd60gqmx08lk for full threads
-
@ c80b5248:6b30d720
2023-07-20 04:06:46Why this isn't a PR
I didn't know exactly where to post this... I have been thinking a lot over the past few days about how we can use nostr and moderated communities (NIP-172) to free the power of git from its centralized overlord, GitHub.
I know there are already a few open pull requests that inspired this. PR #223 has lots of great discussion about the benefits of leaning on existing git servers for a nostr-git implementation. PR #324 establishes the usage of a `"c"`
tag to make commits available for queries on relays. I think this aspect will be critical for fast and effecient git content discovery over nostr.Given that I have limited experience with submitting PRs to open project on GitHub and was not sure where to share these ideas. I took it as an opportunity to write a sample NIP! I would love to get feedback from others, especially folks like nostr:npub1melv683fw6n2mvhl5h6dhqd8mqfv3wmxnz4qph83ua4dk4006ezsrt5c24 and nostr:npub160t5zfxalddaccdc7xx30sentwa5lrr3rq4rtm38x99ynf8t0vwsvzyjc9, who have both been working on their PRs longer than I have been thinking about this.
A few highlights
- Git remotes get their own kind and they are replaceable. This allows events that reference the remote repository to use a static address (just like a longform article) that will continue to work if the author changes the location of the remote - especially important to maintain censorship resistance.
- The git remote kind can provide indexed `"c"` tags that allow remote discovery via relays. If multiple servers have instances of the Nostr git repository, then all of those relays can quickly be surfaced by querying on the correct commit hash (see the sketch after this list).
- Because these coordination events effectively suggest changes that a user might pull into their existing git repositories, it makes sense to use these events within moderated communities (NIP-172) to establish approval mechanisms that indicate when a git-enabled client should merge a PR or whether or not a commit should be trusted.
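To make the discovery idea above concrete, here is a minimal sketch (not part of the proposal itself) of how a client might look up remote-repository events by commit hash over a plain NIP-01 subscription. The relay URL and commit hash are placeholders.

```python
# Sketch: discover kind-34617 git-remote events that index a given commit hash
# via their "c" tags, using a raw NIP-01 REQ over a websocket.
import asyncio
import json
import websockets

async def find_remotes_for_commit(relay_url: str, commit_hash: str) -> list[dict]:
    remotes = []
    async with websockets.connect(relay_url) as ws:
        # NIP-01 subscription filtering on kind 34617 and the indexed "c" tag.
        await ws.send(json.dumps(
            ["REQ", "git-discovery", {"kinds": [34617], "#c": [commit_hash]}]
        ))
        while True:
            msg = json.loads(await ws.recv())
            if msg[0] == "EVENT":
                remotes.append(msg[2])   # event JSON; its content holds the remote URL
            elif msg[0] == "EOSE":       # relay has sent all stored matches
                break
    return remotes

# asyncio.run(find_remotes_for_commit("wss://relay.example.com", "<commit hash>"))
```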
The last thing I will note is that I have framed the usage of the new event kinds assuming that only key commits will be posted to Nostr. This should reduce data usage on relays if only commits that require discussion or action need to be posted and referenced. Git-enabled clients will still be able to access all other commits from individual remote git servers. However, there is nothing outside of data size stopping this framework from being used to track every single commit on a repository if that ended up being desirable.
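Purely as an illustration of what posting one of those key commits might involve, here is a hypothetical sketch of assembling an unsigned checkpoint event of the kind proposed below. The helper name and all field values are made up, and signing and publishing are left to whatever nostr tooling a client already uses.

```python
# Sketch of assembling an unsigned "commit checkpoint" event template that
# follows the tag scheme in the draft below. All values are placeholders.
import time

def build_commit_checkpoint(head_hash: str, output_hash: str, branch: str,
                            remote_addr: str, description: str) -> dict:
    return {
        "kind": 4617,
        "created_at": int(time.time()),
        "tags": [
            ["c", head_hash, branch, "head"],      # current head of the branch
            ["c", output_hash, branch, "output"],  # commit this checkpoint records
            ["a", remote_addr],                    # e.g. "34617:<pubkey>:<remote name>"
        ],
        "content": description,
    }
```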
Enough of the preamble...
NIP-XXX
Git Remote Index and Commit Checkpoints
draft
optional
author:armstrys
Git Kinds
The goal of this NIP is to introduce mechanisms for git remote repository discovery and commit checkpointing into the Nostr protocol. Git is already decentralized by nature. Centralized services like GitHub serve two primary purposes:
- They establish one central remote repository as the source of truth.
- They provide a platform for tracking non-git metadata, including issues and comments.
A decentralized implementation on Nostr should replace the central remote repository with discovery mechanisms that take advantage of the already decentralized nature of git and also provide Nostr-native representations of commits that need to be referenced by external metadata not handled by git.
This NIP introduces two new kinds to achieve this:
- A "git repository" parameterized replaceable event (`kind:34617`) that provides an updatable connection point to an existing git server, easily discoverable and referenced by the second new kind...
- A "git checkpoint" event (`kind:4617`) which has the sole purpose of providing a coordination reference point for key commits in a git repository and should follow a reply structure similar to the chain of commits in git.

With these two kinds it is possible to represent censorship-resistant personal repositories, forked repositories, and even moderated repositories by integrating them with NIP-172.
Note: This implementation assumes nothing about how the client will interact with git because it only aims to coordinate a layer above git to track metadata and discover existing git server locations. Clients would need to integrate authentication via other providers until git server implementations with Nostr authentication are available.
Git Tags
Additionally, we introduce two application-specific tags that should be used in conjunction with the new git event kinds:
1. A `"c"` tag that takes a commit, a branch name, and a marker (`''`, `"head"`, `"compare"`, `"output"`) like so: `["c", "<commit hash>", "<branch name>", "<marker>"]`.
2. A `"git-history"` tag that provides a plain-text history of the commands a user ran to generate a merge output (e.g. `"git checkout <base hash>\ngit merge <compare hash> --no-ff"`).
3. An `"auth-required"` tag that allows an author to publicize whether a remote requires authentication to access. Clients SHOULD assume that remotes with no `"auth-required"` tag default to `true` - the equivalent of `["auth-required", true]`.

Git Remote Definition

`kind:34617` defines a replaceable event (NIP-33) that provides a URL to a remote repository as its `.content`. Using a replaceable event allows other events (like commit checkpoints) to reference this remote via an `"a"` tag without concern that the links will break should the author need to change the location of the remote repository. The event SHOULD contain one or more `"c"` tags for any commit that defines the `"head"` commit of each branch that the author wants to make discoverable in the repository. The author can include other commits on each branch for key commits like releases.

```json
{
  "id": "<32-bytes lowercase hex-encoded SHA-256 of the serialized event data>",
  "pubkey": "<32-bytes lowercase hex-encoded public key of the event creator>",
  "created_at": "<Unix timestamp in seconds>",
  "kind": 34617,
  "tags": [
    ["auth-required", false],
    ["a", "34550:<Community event author pubkey>:<d-identifier of the community>", "<Optional relay url>"],
    ["d", "<git remote name>"],
    ["c", "<commit hash>", "<branch name>", "head"],
    ["c", "<commit hash>", "<branch name>", ""],
    ["c", "<commit hash>", "<other branch name>", "head"]
  ],
  "content": "<address to remote>"
}
```

Git Checkpoint Definition

The usage of marked `"c"` tags helps distinguish the different git-related events, all using `kind:4617`:
1. An event with two `"c"` tags marked `"head"` and `"output"` should be interpreted as a standard commit checkpoint.
2. An event with three `"c"` tags marked `"head"`, `"compare"`, and `"output"` should be interpreted as a merge checkpoint.
3. An event with a `"c"` tag marked `"compare"`, but with no `"output"` commit, should be interpreted as a pull/merge request.
4. An event with at least one `"c"` tag but without a `"compare"` or `"output"` marker should be interpreted as a release/tag and should reply to the appropriate commit.
5. A bare `kind:4617` event with no `"c"` tag should be interpreted as a comment if it is a reply to another event, or as an issue if it is not.

A commit checkpoint SHOULD include at least one `"a"` tag to a `kind:34617` remote repository where the tagged git commit in question can be found. Clients may also query for matching `"c"` tags to discover other relevant remotes as needed.

Clients should interpret any `"a"` tag that includes `"34617:*"` as the first place to search for commits referenced in a `kind:4617` event.

Clients MUST use marked event tags (NIP-10) to chain checkpoints onto the last available commit checkpoint on the same branch. Identifying the proper `"root"` and `"reply"` events allows other clients to follow and discover events in the same chain and its forks.

Both `kind:34617` and `kind:4617` events MAY include a NIP-172 style `"a"` tag to establish a moderated repository. This may also help with repository remote discovery and organization, as the `kind:34550` community event could suggest default remotes for the community.

An example of a standard commit checkpoint:

```json
{
  "id": "<32-bytes lowercase hex-encoded SHA-256 of the serialized event data>",
  "pubkey": "<32-bytes lowercase hex-encoded public key of the event creator>",
  "created_at": "<Unix timestamp in seconds>",
  "kind": 4617,
  "tags": [
    ["c", "<current head hash>", "<optional branch name>", "head"],
    ["c", "<expected output hash>", "<branch name>", "output"],
    ["a", "34617:<compare remote event author pubkey>:<compare remote name>"],
    ["e", "<event id of first checkpoint with output on checkpoint chain>", "<optional relay url>", "root"],
    ["e", "<event id of previous checkpoint with output on checkpoint chain>", "<optional relay url>", "reply"],
    ["a", "34550:<Community event author pubkey>:<d-identifier of the community>", "<Optional relay url>"]
  ],
  "content": "<description of commit>"
}
```

An example of a merge/pull request checkpoint:

```json
{
  "id": "<32-bytes lowercase hex-encoded SHA-256 of the serialized event data>",
  "pubkey": "<32-bytes lowercase hex-encoded public key of the event creator>",
  "created_at": "<Unix timestamp in seconds>",
  "kind": 4617,
  "tags": [
    ["c", "<current head hash>", "<branch name>", "head"],
    ["c", "<compare output hash>", "<branch name>", "compare"],
    ["a", "34617:<compare remote event author pubkey>:<compare remote name>"],
    ["e", "<event id of first checkpoint with output on checkpoint chain>", "<optional relay url>", "root"],
    ["e", "<event id of previous checkpoint with output on checkpoint chain>", "<optional relay url>", "reply"],
    ["a", "34550:<Community event author pubkey>:<d-identifier of the community>", "<Optional relay url>"]
  ],
  "content": "<description of merge/pull request>"
}
```

An example of a merge checkpoint:

```json
{
  "id": "<32-bytes lowercase hex-encoded SHA-256 of the serialized event data>",
  "pubkey": "<32-bytes lowercase hex-encoded public key of the event creator>",
  "created_at": "<Unix timestamp in seconds>",
  "kind": 4617,
  "tags": [
    ["git-history", "<plain text commands used to execute merge>"],
    ["c", "<current head hash>", "<branch name>", "head"],
    ["c", "<compare hash>", "<optional branch name>", "compare"],
    ["c", "<expected output hash>", "<branch name>", "output"],
    ["a", "34617:<compare remote event author pubkey>:<compare remote name>"],
    ["e", "<event id of first checkpoint with output on checkpoint chain>", "<optional relay url>", "root"],
    ["e", "<event id of previous checkpoint with output on checkpoint chain>", "<optional relay url>", "reply"],
    ["e", "<event id of merge/pull request>", "<optional relay url>", "mention"],
    ["a", "34550:<Community event author pubkey>:<d-identifier of the community>", "<Optional relay url>"],
    ["p", "<hex pubkey of merge/pull request author>"]
  ],
  "content": "<description of changes since last checkpoint>"
}
```

When establishing a merge checkpoint that references a merge/pull request, the client should include the merge/pull request author as a `"p"` tag, similar to the usage of replies in NIP-10.

-
@ 393c8119:75e43710
2023-07-31 09:31:06The existing scientific journal system is broken. We discuss possible solutions, including what journals should select for and new possibilities that web3 technologies offer.
Science philosopher David Deutsch has stated that the purpose of science is the discovery of explanatory knowledge about the world that is both true (i.e. replicable and universal) and “hard to vary” (i.e. producing non-arbitrary explanations that are empirically falsifiable and don’t rely on appealing to authority and doctrine).¹
Falsification, criticism and the proposal of new explanations and discoveries are mediated by scientific journals. Ultimately — for the vast majority of fields — being published in prestigious scientific journals confers legitimacy to scientific work, attracts the attention of researchers worldwide, secures grants from funders for future research, and is essential for scientists to find jobs and to get promoted.
In other words, prestigious scientific journals have emerged as the gatekeepers of scientific legitimacy. But is the process of getting accepted in a top journal really the same thing as doing good science? And if not, is there a better solution that is both realistic and viable?
We believe there is. In this series, we look at the current system of scientific production and describe the structural problems that are arising from it, while also exploring how web3 technologies enable us to build a new system to address these problems.
Scientific journals as ranking and curating devices
Under the current paradigm of scientific production, scientists need to constantly provide evidence of their “productivity” in order to advance their careers (i.e. get hired or promoted) and to obtain funding for their future research plans, because this is how they are evaluated by their employers and by funding agencies.
One obstacle in this evaluation process is that evaluators hardly ever have the time to engage fully with the body of research each scientist has produced. Thoroughly studying all previous work of just one scientist would potentially require days, weeks, or even months.
This is an unrealistic demand on evaluators, even the most diligent and well-intentioned ones. Instead, evaluators are forced to rely on heuristics that make it easier to assess a scientist’s body of work, such as how many peer-reviewed publications a scientist has produced, and whether or not they were featured in top journals.
Publication in scientific journals is therefore the current key performance indicator for scientists. This metric has become the gold standard by which science is curated and ranked across many fields. Some journals are considered to be much more prestigious (i.e. harder to get into) than others, and these are weighted more heavily by evaluators of a scientists’ career, while also adding perceived legitimacy to the findings themselves.
The editors of scientific journals therefore exercise a great deal of influence in the scientific world; it is they who decide which submissions fall within their scope and are “good enough” to be evaluated in detail, and they are the ones who make a final decision about whether to accept or reject a submission based on the peer reviews they receive.²
Here is a good summary of the current state of the peer review system conducted in 2018 by Publon. Notably, most peer reviews are anonymous (i.e. the authors don’t know who their referees were) and they are not shared publicly, even if an article gets accepted for publication. This implies a lack of accountability for what happened during the review process, exposing this critical part of the scientific production function to turf wars, sloppiness, arbitrariness, and conflicts of interest that can be easily hidden. Furthermore, the review process of journals is typically slow, taking months or years until publication, and it is often riddled with journal-specific, arbitrary submission requirements such as formatting instructions that waste scientists’ time as they bounce their submissions around from journal to journal until they finally find a publication outlet.³
Thus, scientific journals play a key gatekeeping role in scientists’ careers, but the way in which articles get selected or rejected by journals is typically intransparent to the public, inefficient, and unaccountable. Furthermore, journals decide under which conditions the wider public can access the articles they accepted for publication. For the vast majority of journals, published articles are either hidden behind paywalls or, if they’re open-access, require substantial publication fees (thousands of dollars) which have to be paid by the authors or their employers. We’ll come back to the business model of journals further below.
Citations and impact
One popular proxy for the importance and quality of a scientific publication (its “impact”) is the number of citations it receives. The more citations an article receives, the more important it is perceived to be for the scientific discourse in a particular field. Citations are easy to count and to compare and have thus become popular quantitative heuristics for judging how successful scientists are. This gives scientists a powerful incentive to increase their citations as an end in itself.
But getting a lot of citations is not synonymous with conducting good science. One of many problems with using citations as a proxy for “impact”⁴ ⁵ is that scientific work takes time to disseminate and to accrue citations. On average, scientific papers reach their citation peak 2–5 years after publication.⁶ ⁷ This makes it almost impossible to use citation counts to evaluate the impact of scientists’ most recent work. Because funders and institutions need to make allocative decisions prior to the completion of a discovery’s citation lifecycle (i.e. “long term credit”), a more immediate cue is used (“short term credit”): the prestige of journals is used to judge the impact of recent work by scientists instead of the number of citations, which take a few years to accumulate. In many fields, it is almost impossible for a scientist to get hired or promoted without having at least one or several recent publications in “top journals”, i.e. those that are perceived as most prestigious and most difficult to get into.
The most salient proxy for the prestige of a journal is its impact factor,⁸ which measures the yearly mean number of citations of articles published in the last two years. The prestige of journals and their impact factor are metrics that by design pool reputation across all the papers published in a journal, irrespective of their actual individual quality and impact. But the distribution of citations within journals is typically highly skewed — about half the citable papers in a journal tend to account for 85% of a journals’ total citations.⁵ Because of these dramatic differences in citation patterns between articles published in the same journal, the impact factor of a journal is only a crude proxy for the quality and importance of papers published within a journal.⁹ Furthermore, the impact factor of small journals can be highly sensitive to the inclusion of one or a few articles that amass a high number of citations quickly.
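For readers unfamiliar with the metric, the arithmetic is simple; the numbers below are invented purely for illustration:

```python
# Illustrative arithmetic only (figures invented): a journal's year-Y impact
# factor is the citations received in Y to items it published in Y-1 and Y-2,
# divided by the number of citable items it published in those two years.
citations_in_2023_to_2021_2022_items = 1200
citable_items_2021_2022 = 400
impact_factor_2023 = citations_in_2023_to_2021_2022_items / citable_items_2021_2022
print(impact_factor_2023)  # 3.0
```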
Journal impact factors also vary substantially across fields, partly as a function of the prevailing citation culture and the absolute size of an academic discipline, but also as a function of journal size and which types of publications are counted (e.g. letters, editorials, news items, reviews).⁶ Thus, the impact factor of a journal is partly driven by aspects that are unrelated to the quality of the articles it publishes.
The impact factor metric was not originally intended for its current usage as a proxy for journal quality. Instead, it was first devised by Eugene Garfield and librarians started adopting it to help decide which journals to subscribe to.⁸ Since it has become an important part of journals’ reputations, for-profit subscription-based journals have since learned to to optimize their impact factor using a wide variety of tactics to game the system.¹⁰
When a metric is optimized for as a target, it often ceases to be a good metric of the underlying object of interest (i.e. the quality and importance of scientific publications).¹⁰ Scientists have been on the receiving end of the adoption of the impact factor and have adopted the norm — even while often decrying it — as a result of institutional demand for immediate proxies of scientific productivity. Problems associated with optimizing over this crude yardstick have been well documented,¹⁰ and despite repeated calls to abandon journal impact factors as a measure of scientific productivity of academics and institutions, it remains the most widely used metric for that purpose, partly due to a lack of agreement about what alternative measure should be used instead.¹¹ ¹²
Responding to the incentives created by this state of affairs, prestigious journals have learned to manage their article portfolio as one would diversify a portfolio of uncertain market bets. Essentially, editors are placing bets on papers according to the number of expected future citations a given article will generate; the more citations a journal’s portfolio generates, the better the impact factor, which in turn drives revenue.
But prestigious journals, once they achieve that prestige, also become market movers: because they have a large “market share” in the attention economy of scholars and journalists, articles published in their outlets are likely to garner more citations, creating a flywheel effect which consolidates the gains of the incumbent journals and makes them extremely hard to displace. A journal with a high impact factor is therefore likely to gather more citations than another journal that publishes an article of the same quality level, moving the impact factor even further away from being a useful metric.
Under the current incentive structure, novelty beats replicability
Independent replication of empirical results is critical to the scientific quest for better explanations for how the world works.¹³ ¹⁴ Without replicability, novel findings can be based on error or fabrication, and we are essentially relying on someone’s authority instead of objective proof. Unfortunately, replications do not score nearly as high in the prestige hierarchy of scientific publications as novel and surprising results. For example, only 3% of all journals in psychology explicitly encourage the submission of replication studies, while many journals explicitly state that they do not publish replications.¹⁵
Thus, scientists have little or no incentive to produce replicable research results. Instead, they face a “publish-or-perish” or even an “impact-or-perish” culture based on novelty and impact that shapes their success in the academy.¹⁰ One of the core issues surrounding the use of the citations and impact factors as metrics for scientific productivity is that they do not account for reproducibility of the published discoveries. Novel, surprising and provocative results are more likely to receive attention and citations, and are therefore sought after by editors and journals — even though novel and surprising findings are also less likely to be true.
The decoupling of replicability from commonly-used performance indicators has contributed to a raging replication crisis in many fields of science.¹³ ¹⁶ ¹⁷ ¹⁸ ¹⁹ ²⁰ The incentives for scientists to produce novel, attention-grabbing results are so strong that many cases of downright data manipulation and fraud have been reported.²¹ ²² ²³ Furthermore, poor research designs and data analysis as well as many researcher degrees of freedom in the analysis of data encourage false-positive findings.¹³ ¹⁶ ²⁴ As a result, recent large-scale replication studies of high-impact papers in the social sciences found that only ~60% of the original results could be replicated.¹⁷ ¹⁹ More than 70% of researchers have tried and failed to reproduce another scientist’s experiments, and more than half have failed to reproduce their own experiments.²⁵ This means widely circulated results are about as likely to be right as wrong.
To make matters worse, non-replicable studies tend to be cited more than replicable studies,²⁶ and citation patterns of papers after strong contradictory replication results adjust only modestly.²⁷ As a result of this bias in favor of novelty and against replicability, the scientific endeavor is not self-correcting efficiently. Because citations in published articles are only looking backwards in time (i.e. they only reflect what parts of previously published literature were cited), it’s nearly impossible for readers of an article to ascertain whether a study’s novel findings are replicable and trustworthy or not. Journals also have an incentive not to facilitate replications because successful replications are not novel enough to garner a lot of attention (i.e. impact and citations), while unsuccessful replications undermine journals’ claims of quality assurance.
In the technical appendix, we explore in more detail the incentives of journal editors to select for novelty and against research that replicates existing results. In contrast, we consider what an ‘ideal’ criterion might look like that maximizes the value of the overall research enterprise. Replications, particularly the first few, would receive significantly more weight in an ideal system of how science is evaluated.
The current separation of replicability from impact, the lack of incentives to replicate existing work, and the lack of incentives to provide “forward-looking” visibility of replication outcomes all contribute to the precarious state of many scientific fields today.16 Fundamentally, there is a disconnect between the current practice of rewarding scientists for publishing as many “high impact” findings as possible and the goal of the scientific endeavour — developing reliable explanations.
Yet, despite its inherent flaws, prestigious journals and academic institutions continue to operate under this paradigm, and scientists have little choice but to play along because their professional future largely depends on it.
The business model of scientific journals
Traditional scientific journals require authors to transfer their copyrights to the publisher. Copyrights are a type of intellectual property that gives its owner the exclusive right to make copies of creative work, thereby creating monopoly power for the copyright owner to monetize the work. The market for scientific publications is largely dominated by five large for-profit companies (Elsevier, Black & Wiley, Taylor & Francis, Springer Nature and SAGE), which together control more than 50% of the market between them.²⁸ Worldwide sales of access rights to scientific papers amount to more than USD 19 billion, which puts the scientific publication industry between the music industry and the film industry in terms of revenue.
The two leading business models of publication companies are “pay-for-access” and “pay-for-publication”. Both of these models rely on the unpaid labour of scientists to conduct peer-review which amounts to a multi-billion dollar donation of scientists to the publication industry, boosting the profits of publishing houses mostly with public funds or researcher’s private time and denying scientists fair rewards for performing high-quality referee work.²⁹
In the pay-for-access model, journals charge subscription fees to individuals and institutions such as university libraries. Each individual journal usually charges hundreds of dollars for an annual subscription and access to individual articles typically costs between $20 and $100.
Institutional subscribers such as universities, libraries, and governments are presented with bundle “deals”, which often contain not only the most highly ranked journals of a publisher but also a large number of niche or low impact journals which the subscriber might not pay for if not included in a bundle. This practice of exploiting a dominant market position by bundling goods is a powerful anti-competitive strategy to consolidate that market position.³⁰ ³¹ ³² By essentially taking up a large chunk of a library’s budget in one deal, an incumbent can protect its market against competition from newcomers.
Journal subscriptions under this model are a huge burden on public funds.³³ For example, the UK spent $52.3 million for annual journal subscriptions in 2014,³⁴ and the Netherlands was paying over $14 million in 2018 for subscriptions by their public universities to the journals of just one large publishing house (Elsevier). Despite the substantial expenditure of public funds on journal subscription fees, the tax-paying public, which funds most of the research and the journal subscription fees, does not have access to the science its taxes pay for.
In the “pay-for-publication” model, authors pay a fee for each article they publish. In contrast to the “pay-for-access” model, these articles are published under an open-access agreement and are typically accessible to the public online. Publication charges vary across journals and article types, with typical publication fees ranging between $2,000 and $11,000.35 Scientists either have to pay these fees out of their research budgets, or out of their own pockets, or they rely on their employers (e.g. universities) to cover the cost. The total number and the market share of “pay-for-access” journals continue to grow each year.³⁶ ³⁷
There is a perverse incentive at the heart of the “pay-for-publication” model: authors of an article pay only upon acceptance of their manuscript. This means that, for every rejected manuscript, a journal loses money. Thus, open-access journals need to be less restrictive in their selection to sustain their business models. While open-access journals have lowered the barriers to accessibility of knowledge, and many are well-meaning and high-quality journals, the model as a whole has led to a worldwide epidemic of predatory journals, a lowering of standards, and has opened the floodgates for research of little to no value.³⁸ ³⁹ ⁴⁰ ⁴¹
Our science arbitration system is therefore stuck between a rock and a hard place: on one side, subscription-based publishers control the distribution channels and have proven resilient, immovable, and powerful forces of capital extraction from taxpayer funds. Their highly selective flagship journals allow for profitable bundle deals. Meanwhile, on the other side, the open-access model thrives on volume and has enabled the rise of predatory publishers worldwide, flooding the scientific literature with an onslaught of fraudulent, unsound, or even plagiarized reports masquerading as science.¹⁰
Finally, both the “pay-for-access” and the “pay-for-publication” models exclude the vast majority of scientists from low-prestige institutions and people from developing countries from their ability to participate in science, thereby exaggerating inequality and restricting opportunities for progress and development.
In recent years, we have witnessed a rise in free alternatives: preprint platforms such as bioRxiv, medRxiv, or SSRN which allow scientists to post early versions of their manuscripts online. These preprint platforms follow the lead of physicists, which principally rely on Arxiv for disseminating work in their communities. In a similar vein, economists rely on working paper platforms such as NBER, mostly due to the fact that it often takes multiple years to be published in a reputable economic journal. However, preprints and working papers are not peer-reviewed and often differ substantially from the final version of the published manuscript or never get accepted for publication in a peer-reviewed journal at all. Thus, it is difficult or impossible for laymen readers to evaluate if they can trust the results reported in these outlets. As we have seen during the COVID epidemic, preprint platforms, especially in the medical field, can be misused to spread misinformation and unsound scientific results.⁴²
In summary, the current scientific publication ecosystem is highly exploitative and unfair; it restricts scientific progress and opportunities for development; and it primarily benefits the current oligopoly of scientific publishing houses and their shareholders at the expense of the public. While preprint platforms exist as an alternative to academic journals, they lack the rigor of peer-review and are more prone to be the source of incorrect information.
How Web3 technologies offer hope for the future
Technological innovations have historically enabled vast improvements in our ability to produce and share knowledge. Examples include the invention of printing (which made storing and distributing knowledge possible at scale), the development and improvement of scientific instruments, the Internet (which enabled immediate, worldwide access to computer programs, databases, and publications), and supercomputers that now permit fast processing of massive amounts of data.
The latest wave of innovation concerns human coordination at scale using web3 technologies which enable a decentralized version of the Internet that is based on peer-to-peer networks of a growing list of publicly available, tamper-proof records. Web3 is a powerful departure from the centralized, intransparent, data-hoarding principles of web2 which underlies the attention economy, the success of companies such as Facebook, Google, and the proprietary, vertically-integrated platforms of oligopolistic scientific publishers.
In contrast to these, the core premise of web3 is the widespread distribution of ownership to users and the trustless, censorship-resistant execution of code orchestrated through distributed ledger technology. As web3 adoption gains steam and viable applications continue to be built, one intriguing question is whether elite journals could be restructured as a scientific cooperative on web3.
The potential benefit of restructuring the current scientific publishing paradigm on web3 is that it would enable scientists to earn a stake in the multi-million dollar business of scientific publishing based on the soundness of their contributions. If this could be done successfully, it would materially address some of the challenges and problems that have arisen under the current, centralized model as outlined above. But while technologically feasible, it would likely be opposed by incumbents: leading publishers have firmly opposed ownership as a red line not to be crossed, preferring mass resignation of their editors over setting such a menacing precedent to their bottom line.⁴³ The world’s best scientists create immense value both to the world and for publishers, and web3 offers a new paradigm for this value to be recognized.
Beyond returning the value created by scientists to scientists, web3 offers technological capabilities for new modes of cooperation, incentive systems, and remunerating instruments. As we have seen with DEFI, the finance industry is under pressure by the rise of programmable monies (“money legos”). DAOs — decentralized autonomous organisations — are emerging at an increasing pace, ranging from financial services providers (e.g. MakerDao) to digital art investment collectives (e.g. PleasrDAO). Web3 is burgeoning with radical experimentation, such as new modes of capital allocation for public good through quadratic funding (e.g. GITCOIN), decentralized identity management, decentralized storage solutions (e.g. IPFS, ARWEAVE, Filecoin), self-custodial collective wallets (e.g. Gnosis), and a blossoming DAO toolkit ecosystem (e.g. Aragon, Commons Stack).
Furthermore, the possibility of pseudonymous identities tied to scientific reputation offers a new horizon for keeping the identity of referees protected even in a completely open, transparent scientific evaluation system.⁴⁴ In web3, we can tie pseudonymous identities tied to real, highly valued contributions to scientific endeavours in a tamper-proof and auditable way. By combining such a “proof-of-skill” system with pseudonymity, we can create a scientific ecosystem that simultaneously promotes open debate and reduces bias.
At the heart of the web3 ethos is the dream of decentralizing the world towards a more merit-based distribution of value and ownership, and returning the sovereignty of the individual over his finances, data, contributions and identity. Now that the building blocks are out there, there is much to be said about the promises of Scientific Journals as DAO collectives, channeling the value they create back to their communities.
Steps have already been taken by a few pioneers in the space of application of Web3 to science. There is a niche ecosystem out there already: VitaDAO is an example of a Web3 project which brings together some of the world’s great laboratories in longevity research with an emphasis on funding their effort and having a stake in the IP which results from it. Other projects such as ResearchHub are attempting to crowd-source curation of scientific work through Reddit-like social mechanisms.
The scale of the problem we face is truly global, and much of the future of humanity depends on our scientific engine’s ability to self-correct, falsify, criticize, and converge closer to the truth. In his book, David Deutsch supports that as long as these core properties are maintained, humanity has set course towards the beginning of an infinity of progress.1 Unfortunately, there is empirical evidence demonstrating that scientific progress has been steadily decelerating in the past few decades, with each Dollar invested into science yielding smaller social returns over time.⁴⁵ One possible explanation of this worrying trend is that ideas are getting harder to find.⁴⁵ But the replication crisis and the open floodgates of bad science also point to the faulty functioning of our scientific validation apparatus as a source of decreasing returns to science.
Combined in the right way, web3 technologies could disrupt and substantially improve our scientific-legitimacy conferring engine all while returning the value created by scientists to scientists.
Technical appendix
Scientific journals as agent-based black boxes that predict the value of manuscripts
To improve the current publication system, it would be useful to define an objective function that describes what journals should select for to maximize the contribution of publications to the creation of knowledge. Based on such an objective function, different selection mechanisms could be compared and ranked in their ability to contribute to the creation of knowledge. This is what we attempt to do here.
As a first step, we can conceptualize journals as prediction pipelines designed to sort and classify scientific work according to its expected value. Each participant in the evaluation process of a journal has a model of the world, or more precisely — of what constitutes valuable science. Participants may or may not agree on what they view as valuable science. And, typically, neither referees nor editors are explicit about what their personal evaluation criteria are. Let us call these potentially heterogeneous models of the world “black boxes”.
At each stage of the scientific publication process, these black boxes produce signals which are combined into a final prediction rendered by the editor. Provided the expected scientific value exceeds a journal-set standard, the work is accepted for publication. If it misses the mark, the work is rejected, or the authors are invited to resubmit provided the referees’ requests can be thoroughly addressed.
Machine-learning framework: Scientific journals as ensemble learning
The evaluation process at most current scientific journals can be thought of as a three-stage predictive process that combines predictions from different black-box algorithms. In machine learning, this is known as ensemble learning: the practice of combining different predictive algorithms to increase predictive accuracy.⁴⁶ ⁴⁷ The editor, generally a senior scientist, performs an initial prediction (“the desk”), which constitutes the initial filter on expected scientific impact. Passing the desk brings a paper into the next stage, in which the submission is sent out to peer reviewers. The reviewers perform their own predictions of the expected scientific value of the work. In the final stage, the editor weighs and aggregates these signals with their own to form the final prediction.
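For illustration, a minimal Python sketch of this three-stage pipeline follows; the scores, weights, and thresholds are invented for the example and are not drawn from any actual journal.

```python
# Toy sketch of a journal as a three-stage ensemble predictor.
# All weights and thresholds are illustrative assumptions.

def desk_decision(editor_score, desk_threshold=0.4):
    """Stage 1: the editor's initial filter on expected impact."""
    return editor_score >= desk_threshold

def aggregate(editor_score, referee_scores, editor_weight=0.4):
    """Stage 3: the editor combines the referee signals with their own."""
    referee_avg = sum(referee_scores) / len(referee_scores)
    return editor_weight * editor_score + (1 - editor_weight) * referee_avg

def journal_decision(editor_score, referee_scores, accept_threshold=0.7):
    if not desk_decision(editor_score):
        return "desk reject"
    combined = aggregate(editor_score, referee_scores)
    return "accept" if combined >= accept_threshold else "reject or revise"

# Example: a paper the editor likes (0.8) with mixed reviews.
print(journal_decision(0.8, [0.9, 0.5, 0.6]))
```

The point of the sketch is only that the final decision is a weighted combination of heterogeneous black-box signals gated by thresholds, which is exactly the structure of an ensemble.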
Agent-based framework: Effort and truth are necessary to prevent noise, collusion and sabotage
In an ideal world, every black box involved in the process a) expends maximum effort and b) truthfully reports its prediction. The former is required because these models of the world are costly to apply: the detailed and minute work required to evaluate the soundness of the methodology and the justification for the conclusions is time-consuming. Every submission is a high-dimensional input that needs to be broken down and evaluated on multiple dimensions to determine its expected scientific impact. When insufficient effort is expended, the prediction turns into noise.
By not reporting the truth, we run the risk of unwarranted gatekeeping. Likewise, there is a threat of collusion between authors and peer reviewers to provide each other with inflated reviews. Noise, sabotage and collusion are three failure modes of modern scientific journals’ peer-review process and can only be averted through effort and honesty. This is a particularly acute problem because peer reviewers (and often editors) work pro bono for the publishing house, and there is little to no benefit in providing effortful reviews.¹⁰ ²⁹
Formalizing the scientific journal
In an abstract sense, we can think of a research work as determining the truth of a hypothesis, by offering new evidence that is, ideally, very convincing (but may in fact not be so). A hypothesis has the form that condition X leads to outcome Y. The quality of the research contribution (Q) depends on how much we learn (L), i.e. how much the information increases our confidence in the hypothesis, and how important the hypothesis is to the scientific enterprise overall (V). That is, let Q=V∙L.
The value of new knowledge depends on its implications, given our existing knowledge base, and on the potential proceeds from those implications, for example new inventions. These things are difficult to observe. Even similarly qualified referees and editors may disagree to an extent on what V is, because of their subjective understanding of current knowledge, their skill and imagination in envisioning future impact, and their perception as to which problems are most important to solve. We just take it as given here that there is a meaningful true V, and that readers of scientific work “guess” at it. Greater ability tends to produce better guesses.
How much we learn can be understood with reference to Bayes’ rule, P(Y|X)=P(Y)∙P(X|Y)/P(X), where P(Y) is the prior likelihood that outcome Y occurs, and P(Y|X) is the posterior likelihood (when condition X holds in the data). P(Y|X) measures the strength of the inference that X entails Y. We denote this by R. P(X|Y)/P(X) measures how much more likely it is that condition X is observed when the outcome is Y. In other words, P(X|Y)/P(X) captures the information contained in X about Y. We define P(X|Y)/P(X)=1+I, so that I=0 reflects that X is as likely to occur with Y as without Y, and therefore nothing was learned from studying condition X. If I is different from 0, then X changes our expectation of Y. We can write P(Y)=R/(1+I), and therefore L≡P(Y|X)-P(Y)=R-R/(1+I). (Here we assume that positive relationships between X and Y are being tested, i.e. I≥0. There is no loss of generality, since Y can always be relabeled as the opposite outcome to make a negative relationship positive.)
The quality of a contribution can now be expressed as Q=V∙(R-R/(1+I)), where V is the (projected) value of being able to predict outcome Y, R is the degree to which Y depends on condition X, and I captures how our beliefs about Y changed due to this research. Note that R and I both affect Q positively, and Q≤V. When nothing new was learned (I=0), or when the condition does not predict the outcome (R=0), or when predicting the outcome is irrelevant (V=0), then Q=0. Note that a replication of a prior result can be a quality contribution, since it might significantly increase support for a hypothesis, especially when it is one of the first replications.¹³ ¹⁴ A negative result (where Y does not occur under condition X) can also be a quality contribution if it corrects the current prior.
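These definitions translate directly into a few lines of Python, which is useful for checking the limiting cases noted above; the numerical values in the example are made up for illustration.

```python
def learning(R, I):
    """L = P(Y|X) - P(Y) = R - R/(1+I), with R = P(Y|X) and 1+I = P(X|Y)/P(X)."""
    return R - R / (1 + I)

def quality(V, R, I):
    """Q = V * L: the value of predicting Y, scaled by how much was learned."""
    return V * learning(R, I)

# Limiting cases from the text: Q = 0 when I = 0, R = 0, or V = 0.
assert quality(1.0, 0.8, 0.0) == 0.0
assert quality(1.0, 0.0, 0.5) == 0.0
assert quality(0.0, 0.8, 0.5) == 0.0

# A hypothetical strong, informative result on an important question.
print(quality(V=10.0, R=0.9, I=1.5))   # ~5.4; note Q <= V always holds
```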
An interesting, and probably common, case arises if a paper reports surprising results that are potentially paradigm-shifting, but the results turn out to be false. Intuitively, Q might be smaller than zero in this case, because an influential result that is false could do substantial damage, both in terms of time and effort wasted by scientists and in terms of the welfare consequences for society. For example, irreproducible pre-clinical trials create indirect costs for patients and society.⁴⁸ Furthermore, future research that builds on the false discovery may not only waste resources, it may also derail scientific progress into further false discoveries.
When an error is made in the Bayesian model, the evidence does not justify the conclusions. Suppose the hypothesis is misspecified, and the relationship between condition and outcome is actually negative (I<0), but mistakenly reported as positive. Then L=R-R/(1+I)<0, which makes the quality of the contribution Q negative.
If we think of scientific progress as a linear process, a positive Q value implies that the new discovery makes some kind of positive contribution to scientific advance. A false discovery may not only not contribute to our knowledge, it may actually add confusion and entropy, resulting in scientific regress. Nevertheless, an editor might publish such a paper, misjudging Q.
The stated purpose of scientific journals is to publish contributions that advance knowledge (Q > 0). It is useful at this point to differentiate between what journals should be evaluating in order to advance knowledge (i.e. the normative case) and what journals actually do in practice (i.e. the descriptive case).
In the normative case (i.e. an ideal world), the predictive algorithm of journals should try to identify papers that have high Q values. This is complicated by the fact that the true value of a contribution is inherently difficult to assess and influenced by subjective insight and preferences. In addition, referees and editors need to exert effort to confirm the objective validity of the analysis, but they are not rewarded for doing so.
We shall denote the predicted quality of the contribution by Q’=f(V’,R’,I’), where primes indicate estimated quantities. Referees and editors will not necessarily evaluate Q according to the Bayesian model, but may assign subjective weights to each component. V’ is to a large degree subjective; R’ and I’ can in principle be determined more objectively, but getting them right is effort-intensive, so the task is left mostly to referees. The referees make a report m, the accuracy of which depends on effort e∈[0,1]. In general, m(e)=t+ρ∙(1-e), where t is the true value and ρ is a random variable that is symmetrically (e.g. normally) distributed around zero. Note that the larger the effort, the smaller the potential error ρ∙(1-e).
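The report model can be simulated directly. The sketch below draws the error term ρ from a normal distribution, which is one admissible symmetric choice; the noise scale and the true value t are arbitrary assumptions for the example.

```python
import random

def referee_report(true_value, effort, noise_scale=0.3):
    """m(e) = t + rho * (1 - e), with rho symmetric around zero (here: normal)."""
    rho = random.gauss(0.0, noise_scale)
    return true_value + rho * (1.0 - effort)

random.seed(7)
t = 0.6  # hypothetical true quality of a submission
for e in (0.0, 0.5, 1.0):
    reports = [referee_report(t, e) for _ in range(1000)]
    spread = max(reports) - min(reports)
    print(f"effort={e:.1f}  mean report={sum(reports)/len(reports):.3f}  spread={spread:.3f}")
# At full effort (e = 1) the report equals the true value exactly; as effort
# falls, the report degrades into noise, as described above.
```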
In a typical process, the editor (agent i=1) performs a first scan of the submission. The submission is sent out for formal review if the editor believes it passes some minimum threshold, which is influenced by the editor’s relative preference for novelty, replicability, and so on. If the paper is sent out for formal review, the referees evaluate it in a similar way, again giving potentially different weights to different criteria.
The editor then summarizes the evaluations to arrive at a final decision. If the estimated quality exceeds the journal-specific threshold, the paper is published or receives a revise-and-resubmit, in which case the process is repeated; otherwise the paper is rejected.
Some adverse incentives journal editors face tend to bias these decisions toward novelty and against replications. Controversial, or otherwise attention-grabbing, results will tend to garner citations as researchers try to verify them. If maximizing reputation through citations is a goal, then it is rational for journals not to incentivize and reward replication efforts, although they are a crucial component of the scientific enterprise. Replications also suffer from the dilemma that they are, provocatively put, “not interesting” or “not credible.” If a replication study confirms the original result, or negates a result that was published recently and is not yet widely known, it may not be viewed as noteworthy. If it fails to confirm a well-known result, it will likely face doubt. Moreover, if only negative replications are “novel” enough to be publishable in a well-regarded journal, researchers face substantial risk (as well as bias) in attempting such a study, given that it might yield a positive result.
These aspects suggest that the “estimated quality” of an article will be based on weights that do not correspond to the Bayesian learning framework and may reflect differences in priorities between the editor and the referees, who are less motivated to generate future citations for the journal. Ultimately, referee judgments may be reflected in the final decision to a lesser extent than appears, and this would further reduce the referees’ incentive to commit effort.
To summarize the above points:
- Editors and referees will not necessarily evaluate articles according to consistently weighted criteria, and their judgments may well deviate from the best possible prediction of true quality.
- In particular, editors have incentives to weight novelty more strongly than replicability, and referees have incentives to limit their efforts to verify scientific accuracy. This can lead to a published literature with many low-quality papers (even if referees exert maximum effort due to intrinsic motivations).
Given the small number of referees and editors that evaluate each paper for each journal and their potential heterogeneity, the distribution of realized quality of publications will have a high variance across journals, and each submission of a paper to a different journal is akin to a lottery draw. Since journals require that the papers they evaluate are not under consideration at a different journal at the same time, this implies a substantial loss of time between the moment of first submission to a journal and the point where an article actually gets published. It also implies substantial costs for the authors of the submission, given that many journals have different formatting requirements etc. Thus, the current practice of curating and evaluating scientific contributions is inefficient and a waste of (public) resources.
If replicability were over-emphasized, the literature would be dominated by true findings, but there would be little advance in what we reliably know, since little that is novel would be published.
In an ideal world where journals achieve their stated objective of publishing papers of the highest possible quality:
(a) A logically derived rule is employed for predicting quality from the estimated strength of evidence and novelty of research work.
(b) Referees are given extrinsic incentives to put effort into verification and report truthfully.
If (a) and (b) are fulfilled, progress in the scientific literature would be faster if journals were to allow the simultaneous submission of papers to different publication outlets, and if more researchers were involved in the evaluation process.
Authors: Philipp Koellinger, Christian Roessler, Christopher Hill
Philipp Koellinger: DeSci Foundation, Geneva, Switzerland; University of Wisconsin-Madison, La Follette School of Public Affairs, Madison, WI, USA; Vrije Universiteit Amsterdam, School of Business and Economics, Department of Economics, Amsterdam, The Netherlands
Christian Roessler: Cal State East Bay, Hayward, CA, USA
Christopher Hill: DeSci Foundation, Geneva, Switzerland
References
- Deutsch, D. The Beginning of Infinity: Explanations That Transform the World. (Penguin Books, 2012).
- Goldbeck-Wood, S. Evidence on peer review — scientific quality control or smokescreen? BMJ 318, 44–45 (1999).
- Huisman, J. & Smits, J. Duration and quality of the peer review process: the author’s perspective. Scientometrics 113, 633–650 (2017).
- MacRoberts, M. H. & MacRoberts, B. R. Problems of citation analysis. Scientometrics 36, 435–444 (1996).
- Adam, D. The counting house. Nature 415, 726–729 (2002).
- Amin, M. & Mabe, M. A. Impact factors: use and abuse. Medicina 63, 347–354 (2003).
- Min, C., Bu, Y., Wu, D., Ding, Y. & Zhang, Y. Identifying citation patterns of scientific breakthroughs: A perspective of dynamic citation process. Inf. Process. Manag. 58, 102428 (2021).
- Garfield, E. The history and meaning of the journal impact factor. JAMA vol. 295 90 (2006).
- Aistleitner, M., Kapeller, J. & Steinerberger, S. Citation patterns in economics and beyond. Sci. Context 32, 361–380 (2019).
- Biagioli, M. & Lippman, A. Gaming the Metrics: Misconduct and Manipulation in Academic Research. (MIT Press, 2020).
- Seglen, P. O. Why the impact factor of journals should not be used for evaluating research. BMJ 314, 498–502 (1997).
- Moed, H. F. Citation analysis of scientific journals and journal impact measures. Curr. Sci. 89, 1990–1996 (2005).
- Ioannidis, J. P. A. Why most published research findings are false. PLoS Med. 2, e124 (2005).
- Moonesinghe, R., Khoury, M. J. & Janssens, A. C. J. W. Most published research findings are false — But a little replication goes a long way. PLoS Med. 4, e28 (2007).
- Martin, G. N. & Clarke, R. M. Are psychology journals anti-replication? A snapshot of editorial practices. Front. Psychol. 8, 523 (2017).
- Smaldino, P. E. & McElreath, R. The natural selection of bad science. R Soc Open Sci 3, 160384 (2016).
- Camerer, C. F. et al. Evaluating replicability of laboratory experiments in economics. Science 351, 1433–1436 (2016).
- Open Science Collaboration. PSYCHOLOGY. Estimating the reproducibility of psychological science. Science 349, aac4716 (2015).
- Camerer, C. F. et al. Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015. Nat Hum Behav 2, 637–644 (2018).
- Dreber, A. et al. Using prediction markets to estimate the reproducibility of scientific research. Proc. Natl. Acad. Sci. U. S. A. 112, 15343–15347 (2015).
- Verfaellie, M. & McGwin, J. The case of Diederik Stapel. American Psychological Association https://www.apa.org/science/about/psa/2011/12/diederik-stapel (2011).
- Grieneisen, M. L. & Zhang, M. A comprehensive survey of retracted articles from the scholarly literature. PLoS One 7, e44118 (2012).
- Callaway, E. Report finds massive fraud at Dutch universities. Nature 479, 15 (2011).
- Schweinsberg, M. et al. Same data, different conclusions: Radical dispersion in empirical results when independent analysts operationalize and test the same hypothesis. Organ. Behav. Hum. Decis. Process. 165, 228–249 (2021).
- Baker, M. 1,500 scientists lift the lid on reproducibility. Nature 533, 452–454 (2016).
- Serra-Garcia, M. & Gneezy, U. Nonreplicable publications are cited more than replicable ones. Sci Adv 7, (2021).
- Hardwicke, T. E. et al. Citation patterns following a strongly contradictory replication result: Four case studies from psychology. Adv. Methods Pract. Psychol. Sci. 4, 251524592110408 (2021).
- Hagve, M. The money behind academic publishing. Tidsskr. Nor. Laegeforen. 140, (2020).
- Aczel, B., Szaszi, B. & Holcombe, A. O. A billion-dollar donation: estimating the cost of researchers’ time spent on peer review. Res Integr Peer Rev 6, 14 (2021).
- Adams, W. J. & Yellen, J. L. Commodity bundling and the burden of monopoly. Q. J. Econ. 90, 475–498 (1976).
- Greenlee, P., Reitman, D. & Sibley, D. S. An antitrust analysis of bundled loyalty discounts. Int. J. Ind Organiz 26, 1132–1152 (2008).
- Peitz, M. Bundling may blockade entry. Int. J. Ind Organiz 26, 41–58 (2008).
- Bergstrom, C. T. & Bergstrom, T. C. The costs and benefits of library site licenses to academic journals. Proc. Natl. Acad. Sci. U. S. A. 101, 897–902 (2004).
- Lawson, S., Gray, J. & Mauri, M. Opening the black box of scholarly communication funding: A public data infrastructure for financial flows in academic publishing. Open Library of Humanities 2, (2016).
- Else, H. Nature journals reveal terms of landmark open-access option. Nature 588, 19–20 (2020).
- Laakso, M. & Björk, B.-C. Anatomy of open-access publishing: a study of longitudinal development and internal structure. BMC Med. 10, 124 (2012).
- Solomon, D. J., Laakso, M. & Björk, B.-C. A longitudinal comparison of citation rates and growth among open-access journals. J. Informetr. 7, 642–650 (2013).
- Clark, J. & Smith, R. Firm action needed on predatory journals. BMJ 350, h210 (2015).
- Grudniewicz, A. et al. Predatory journals: no definition, no defence. Nature 576, 210–212 (2019).
- Richtig, G., Berger, M., Lange-Asschenfeldt, B., Aberer, W. & Richtig, E. Problems and challenges of predatory journals. J. Eur. Acad. Dermatol. Venereol. 32, 1441–1449 (2018).
- Demir, S. B. Predatory journals: Who publishes in them and why? J. Informetr. 12, 1296–1311 (2018).
- Brierley, L. Lessons from the influx of preprints during the early COVID-19 pandemic. Lancet Planet Health 5, e115–e117 (2021).
- Singh Chawla, D. Open-access row prompts editorial board of Elsevier journal to resign. Nature (2019) doi:10.1038/d41586-019-00135-8.
- Increasing Politicization and Homogeneity in Scientific Funding: An Analysis of NSF Grants, 1990–2020 — CSPI Center. https://cspicenter.org/reports/increasing-politicization-and-homogeneity-in-scientific-funding-an-analysis-of-nsf-grants-1990-2020/ (2021).
- Bloom, N., Jones, C. I., Van Reenen, J. & Webb, M. Are Ideas Getting Harder to Find? Am. Econ. Rev. 110, 1104–1144 (2020).
- Polikar, R. Ensemble Learning. in Ensemble Machine Learning: Methods and Applications (eds. Zhang, C. & Ma, Y.) 1–34 (Springer US, 2012).
- Sagi, O. & Rokach, L. Ensemble learning: A survey. Wiley Interdiscip. Rev. Data Min. Knowl. Discov. 8, e1249 (2018).
- Begley, C. G. & Ellis, L. M. Raise standards for preclinical cancer research. Nature 483, 531–533 (2012).
By: DeSci Foundation. Link: https://desci.medium.com/why-we-need-to-fundamentally-rethink-scientific-publishing-43f2ae39af76
-
@ 57fe4c4a:c3a0271f
2023-07-30 22:52:21👥 Authors: Peter Todd ( nostr:npub1m230cem2yh3mtdzkg32qhj73uytgkyg5ylxsu083n3tpjnajxx4qqa2np2 )
📅 Messages Date: 2023-07-30
✉️ Message Count: 1
📚 Total Characters in Messages: 3752
Messages Summaries
✉️ Message by Peter Todd on 30/07/2023: A pull request has been submitted to enable full-RBF by default in Bitcoin Core, as approximately 40% of Bitcoin hash power is already using it.
Follow nostr:npub15g7m7mrveqlpfnpa7njke3ccghmpryyqsn87vg8g8eqvqmxd60gqmx08lk for full threads
-
@ 57fe4c4a:c3a0271f
2023-07-30 22:52:20📝 Summary: The conversation revolves around the issue of inscriptions in the Bitcoin network, which are seen as a spam attack. One person suggests rejecting inscriptions in the mempool as a solution, while another argues that blocking spam could be seen as censorship. Standardization rules are proposed as a way to address the issue, but there are concerns about the impact on regular transactions. The conversation also touches on the problems of initial blockchain download time and UTXO set growth. Overall, there is a debate about how to handle inscriptions and the ethical implications of spam in the Bitcoin network.
👥 Authors: • Léo Haf ( nostr:npub1llvyx59pm062s9wtv23tz70djmgxxz807evjcjzyt8v5dxugysas9etyr0 ) • leohaf at orangepill.ovh ( nostr:npub1slp5wtupc9zgq044phx04yh53qqktjuwmhy5ayecdh9p5s48qc7qpexe6g ) • vjudeu at gazeta.pl ( nostr:npub1357006afyypkgz03lmq8fnuvlkyjt0rukx8rt56ck8xv396jaceqmnssga ) • rot13maxi ( nostr:npub12aguh8y7hacsxc3c7fsjzkqh95nu9cuh4ec6htx9y3sz75722npqmllglz )
📅 Messages Date Range: 2023-07-25 to 2023-07-30
✉️ Message Count: 6
📚 Total Characters in Messages: 32106
Messages Summaries
✉️ Message by leohaf at orangepill.ovh on 25/07/2023: The writer is concerned about a bug in recent software versions that causes inscriptions to take up a large amount of space on the blockchain, impacting the UTXO set. They request options to reject inscriptions in the mempool and raise ethical questions about NFTs and Tokens.
✉️ Message by vjudeu at gazeta.pl on 26/07/2023: The problem of inscriptions in Bitcoin has not been addressed seriously because there is no good solution and it would lead to other serious problems like initial blockchain download time and UTXO set growth. Rejecting inscriptions in the mempool would result in a never-ending chase and the creation of different inscriptions. The Bitcoin community has consistently rejected concepts like NFTs and Tokens, but some unstoppable concepts like soft-forks still exist. Inscription creators have created a non-enforced soft-fork with their rules.
✉️ Message by leohaf at orangepill.ovh on 26/07/2023: Inscriptions are a major spam attack in the Bitcoin network, and not taking action against them could encourage more similar attacks in the future. Adding a standardization option could be a solution.
✉️ Message by vjudeu at gazeta.pl on 27/07/2023: Not taking action against spam could be seen as acceptance. Some argue blocking spam is censorship and could lead to blocking regular transactions.
✉️ Message by Léo Haf on 27/07/2023: Standardization rules were introduced to address issues like the OP_RETURN limit, maxancestorcount, minrelayfee, and the dust limit. Bitcoin defenders can detect and standardize spam transactions more easily than spammers can create new types of spam. Default policy can be a weakness or a strength, depending on integration into Bitcoin Core. Using a pre-segwit node is not a solution as it cannot initiate new ones. Satoshi discussed spam, and some consider Ordinals to be spam. Blocking Ordinals is seen by some as censorship that could lead to blocking regular transactions. If the Bitcoin network tolerates spam, spammers could perceive this as acceptance. The IBD problem and the UTXO set growth problem need to be solved. People can still use Taproot to upload data and turn off the witness to become a pre-Segwit node. Blocking certain ways of pushing data may lead to data being pushed into legacy parts.
✉️ Message by rot13maxi on 30/07/2023: Bitcoin defenders can win the cat and mouse game against spam transactions by detecting and standardizing them, making it harder for inscriptions to reach miners. Appeals to Satoshi are not convincing arguments.
Follow nostr:npub15g7m7mrveqlpfnpa7njke3ccghmpryyqsn87vg8g8eqvqmxd60gqmx08lk for full threads
-
@ 32e18276:5c68e245
2023-07-19 02:56:47I’m so lazy I’m thinking of running the damus merch store via stateless and serverless lightning payment links. All data is collected and stored in the lightning invoice descriptions which are fetched from your node. You can do this without having to run any server code except a lightning node!
This is the same tech we used when selling merch at Bitcoin Miami. It was extremely reliable. I love these things, they are so easy. Integrating with the legacy fiat system is such a pita. It may just be a lightning-only store for now because of how simple this is. Here's what a lightning payment link looks like:
http://lnlink.org/?d=ASED88EIzNU2uFJoQfClxYISu55lhKHrSTCA58HMNPgtrXECMjQuODQuMTUyLjE4Nzo4MzI0AANgB6Cj2QCeZAFOZ1nS6qGuRe4Vf6qzwJyQ5Qo3b0HRt_w9MTIwJm1ldGhvZD1pbnZvaWNlfG1ldGhvZD13YWl0aW52b2ljZSZwbmFtZWxhYmVsXmxubGluay0mcmF0ZT04BERlYXRoIFN0YXIABQAAAGQGQW4gb2JqZWN0IG9mIHVuZmF0aG9tYWJsZSBwb3dlcgAHEwhodHRwczovL3VwbG9hZC53aWtpbWVkaWEub3JnL3dpa2lwZWRpYS9lbi9mL2Y5L0RlYXRoX3N0YXIxLnBuZwA=
How it works
The entire product page is stored as data in the URL. When a customer clicks the link, the product info is decoded and rendered as a webpage (a rough sketch of the idea follows the list below). The data in the URL includes:
- The product name
- Description
- Price in sats
- Product image url
- Fields to collect data from the user
- Lightning node address
- Lightning node rune for fetching and waiting for invoice payments
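Here's a rough Python sketch of the idea of packing a product page into a URL. The real lnlink format is a compact binary encoding, not JSON, so treat this purely as illustration; the field names are assumptions.

```python
# Illustrative only: lnlink uses its own compact binary encoding, not JSON.
# This just shows the general idea of a self-contained, serverless product page.
import base64
import json

product = {
    "name": "Death Star",
    "description": "An object of unfathomable power",
    "price_sats": 100,                                       # assumed price
    "image": "https://upload.wikimedia.org/wikipedia/en/f/f9/Death_star1.png",
    "node_address": "24.84.152.187:8324",
    "rune": "<restricted-rune-for-invoice-and-waitinvoice>",  # placeholder
}

def encode_page(data: dict) -> str:
    raw = json.dumps(data).encode()
    return base64.urlsafe_b64encode(raw).decode()

def decode_page(blob: str) -> dict:
    return json.loads(base64.urlsafe_b64decode(blob))

url = "https://example.org/?d=" + encode_page(product)
assert decode_page(url.split("d=", 1)[1]) == product
print(url[:80] + "...")
```

The nice property is that the "server" holds no state at all: everything needed to render the page and fetch an invoice rides along in the link itself.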
This works thanks to a javascript library I created called "lnsocket". It allows you to connect to your CLN node over websockets. Once the user fills out all of the info, a new lightning invoice is fetched with this information in the description, by connecting directly to your node. This connection is end-to-end encrypted thanks to the lightning protocol itself.
To your lightning node, it looks like another lightning node is connecting to it, but in reality it's just a dumb client asking for things.
At this point, custom lightning packets called "commando" packets are sent to your node, which ask your node to run certain commands. CLN authenticates these packets using the rune and then returns a response. This is pretty much the same as calling these commands directly on your lightning node, except now someone is doing it from a browser in a secure way!
Why not just run btcpayserver?
btcpayserver is cool and is more powerful, but I like exploring simpler ways to do things that don't require running lots of software which can be challenging for many non-technical people. You shouldn't have to become a server administrator to start accepting payments. It should be as simple as running a bitcoin and lightning node, pushing all of the application logic to the clients.
This is a similar philosophy to what we have in the nostr space. Let's make it easier for people to use self-sovereign tools. Everyone deserves freedom tech.
Anyways, I'm still working on https://lnlink.org. I just added images and nostr address support! You can make your own payment links here! Try it out:
http://lnlink.org/?d=ASED88EIzNU2uFJoQfClxYISu55lhKHrSTCA58HMNPgtrXECMjQuODQuMTUyLjE4Nzo4MzI0AANgB6Cj2QCeZAFOZ1nS6qGuRe4Vf6qzwJyQ5Qo3b0HRt_w9MTIwJm1ldGhvZD1pbnZvaWNlfG1ldGhvZD13YWl0aW52b2ljZSZwbmFtZWxhYmVsXmxubGluay0mcmF0ZT04BERlYXRoIFN0YXIABQAAAGQGQW4gb2JqZWN0IG9mIHVuZmF0aG9tYWJsZSBwb3dlcgAHEwhodHRwczovL3VwbG9hZC53aWtpbWVkaWEub3JnL3dpa2lwZWRpYS9lbi9mL2Y5L0RlYXRoX3N0YXIxLnBuZwA=&edit=1
-
@ 393c8119:75e43710
2023-07-31 08:08:08Public Goods, Integration, and the Bigger Picture
Photo by Robynne Hu on Unsplash
This is Part 3 of the Science Token Engineering blog series. In this blog post, we’ll put together everything we’ve discussed so far and we’ll try to envision what an open science community could look like.
By looking at the science value flow, we have been able to identify its weaknesses and design a system that aims to solve them. Incidentally, we have also been looking for a way to maximize the creation of new knowledge and to ensure the value of this new knowledge is correctly distributed amongst those involved.
Throughout the last two posts, we have essentially been treating knowledge as any other private good, which actually isn’t a perfect analogy. The value of knowledge increases with its integration. In other words, new research is not going to bring much value to the world unless it is adopted by other people.
So how do we increase the adoption of new knowledge?
Knowledge as a Public Good
A reasonable approach might be to incentivize public research, i.e. research that produces public knowledge assets which do not belong to any specific person (beyond the accreditation of their creators). For example, if funding is to be allocated to a biomedical research project with the potential to save millions of lives, there must be no room for malicious behavior by any party involved, so it might be a better idea to award this project more funding on the condition that everything it produces becomes a public good.
Thinking back to Part 2 and the decentralized model of science value flow, the DAO Treasury, curated by its community, can be restricted to funding public research projects which maximize their integration within the scientific community. This in turn ensures that the incentives of the open science ecosystem are aligned so as to not only create a fair distribution of value, but to also maximize the utility of the newly created knowledge assets.
Figure 1. Schema of the public funding, profit-sharing model
Figure 1 shows a possible model for an open science ecosystem that aligns the incentives of the DAO towards public research funding, but which also utilizes the effectiveness of the decentralized knowledge market in unlocking previously hidden value from the private sector. The top loop is almost identical to the profit sharing model discussed in Part 2, but this time it specifies that funding is exclusively allocated to public research projects. Furthermore, we make the distinction between different types of researchers depending on the knowledge assets they produce.
- A data provider is somebody who runs experiments and collects data.
- An algorithm provider is someone who uses data to create new insights.
- A compute provider is an entity with significant amounts of data who participates in the market to receive rewards from that data.
Global Potential of the Profit-Sharing Model
This model builds upon the realization that collaboration and data sharing are not restricted to researchers, but can expand to private research companies, entire labs, and universities. This notion is represented in the lower part of the schema in Figure 1, where private entities make use of the decentralized knowledge market to accelerate their own research.
The paradigm shift of the science value flow requires time and resources, as outlined in Part 2, but by providing the right incentives, centralized agencies can unlock previously unattainable value of their IP. On top of that, they can access new knowledge resources either from the public research sector or from other private entities, thus greatly accelerating research and development.
Put simply, the Web3 model enables collaboration on an unprecedented scale and quick integration of new knowledge to a potentially wide range of areas, both of which increase scientific output, reduce the need for intermediaries in scientific funding and knowledge dissemination, and overall increase value that science brings into the world.
Conclusion
In a nutshell, science is broken, but not beyond repair. Web3 has the tools we need to build a better science ecosystem that is fair to its participants, sustainable in the long run, and, most importantly, best utilizes the enormous potential of science to create a better world.
Science Token Engineering Blog Series
- Science Token Engineering Part 1: The Problem with Science
- Science Token Engineering Part 2: The Profit Sharing Vision
By: Jakub Smékal. Link: https://pulse.opsci.io/simulations-for-science-token-engineering-part-3-bb59192d1e71
-
@ 32e18276:5c68e245
2023-07-17 21:55:39Hey guys!
Another day, another TestFlight build. This fixes many mention bugs and includes bandwidth improvements.
During nostrica, jack told everyone to open their phones and say GM to fiatjaf. This actually brought down nostrica's wifi! Damus is really dumb when it first opens and makes many requests. Sometimes hundreds (nip05 validation, etc). This build fixes all those issues. Damus will no longer:
- Make hundreds of nostr address validation requests.
- Make tons of duplicate lnurl requests when validating zaps
nostr address validation only happens when you open someone's profile now.
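For reference, a single nostr address (NIP-05) check is just a well-known JSON lookup. Here's a rough Python sketch of the idea; it is not the actual Swift code in Damus, and the timeout is an arbitrary choice.

```python
# Rough sketch of a single nostr address (NIP-05) check, not Damus's Swift code.
import requests

def verify_nostr_address(address: str, expected_pubkey_hex: str) -> bool:
    name, domain = address.split("@", 1)
    url = f"https://{domain}/.well-known/nostr.json"
    resp = requests.get(url, params={"name": name}, timeout=5)
    names = resp.json().get("names", {})
    return names.get(name) == expected_pubkey_hex

# Doing this once per profile view is cheap; doing it for every note in a
# busy timeline is how you take down a conference's wifi.
```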
This build also fixes some annoying mention issues. If you forget a space when mentioning someone, it will automatically add it.
I've also removed the restriction where you were not allowed to login to "deleted" accounts. This was way too confusing for people, and logging into a deleted account will allow you to reset the profile information and get it going again. You're welcome NVK.
Another thing that was added in this build is support for `_` usernames in nostr addresses. This will hide your full nostr address username when used. Damus will also hide your username if it matches your profile username. Damus always did this before, but it was incorrect. Now it will show your full nostr address (nip05) with its proper username. You can stop bugging me about this now, Semisol.

Last but not least, there are some small tweaks to longform note padding. Nothing too crazy, but it does make notes like this look less cramped.
Until next time!
Added
- Show nostr address username and support abbreviated _ usernames (William Casarin)
- Re-add nip05 badges to profiles (William Casarin)
- Add space when tagging users in posts if needed (William Casarin)
- Added padding under word count on longform account (William Casarin)
Fixed
- Don't spam lnurls when validating zaps (William Casarin)
- Eliminate nostr address validation bandwidth on startup (William Casarin)
- Allow user to login to deleted profile (William Casarin)
- Fix issue where typing cc@bob would produce a broken ccnostr:bob mention (William Casarin)
-
@ fa984bd7:58018f52
2023-07-30 22:04:26Today I stumbled upon a team that has built a client and a few service providers using NIP-90: Data Vending Machines.
They detailed two milestones on the project's page that are a perfect example of the Before and After pictures, so I can't help but point them out here.
The project is aiming at creating an AI system that generates images and logos.
Before
First take: using L402, the project hits an endpoint that requires payment to complete. This means the client is integrating with a specific endpoint, paying, and receiving a result.
After
At the end of the first milestone, they mention they are interested in exploring what this would look like using NIP-90: Data Vending Machines.
What happens?
Instead of tightly coupling the client to a specific endpoint, one that could change its price, lower its quality, go out of business, or simply stop being the best possible provider for this type of job, they use NIP-90 to communicate with an infinite market of service providers who try to get the job done for them.
No endpoint need apply.
Guess what? PhotoBolt completely migrated over to DVM!
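To give a feel for what this looks like on the wire, here's a hypothetical NIP-90 job request sketched in Python. The kind number, tag values, and bid are assumptions for illustration only; check NIP-90 for the current job kinds and tag layout.

```python
import json
import time

# Hypothetical NIP-90 job request for image generation. Signing and
# publishing to relays are omitted; the kind number is an assumption.
job_request = {
    "kind": 5100,                     # assumed text-to-image job kind
    "created_at": int(time.time()),
    "content": "",
    "tags": [
        ["i", "a minimalist logo for a coffee shop", "text"],  # job input
        ["output", "image/png"],                               # desired output
        ["bid", "50000"],             # max the client will pay, in msats
        ["relays", "wss://relay.example.com"],
    ],
}
print(json.dumps(job_request, indent=2))
# Any service provider watching for this kind can pick the job up, do the
# work, and publish a result event; the client pays whichever result it likes.
```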
X vs Y axes / Free-market vs centrally-planned designs
If it's not abundantly obvious by now, the first approach demands from the service provider a complete output that is acceptable, and hopefully, the best in the market.
Thus, the business behind the endpoint needs to vertically integrate, which means, they need to try and be the best at each step required to deliver a complete result.
If it's not the best in the market, the client will have to, at some point, shop around until they find a new endpoint to integrate with and buy this type of job from.
Rinse and repeat.
The new alternative that DVMs unlock is that there is no need to acquire services from a vertically-integrated service provider, which allows service providers to specialize in being the best possible at something very specific and lets Nostr and Lightning be the glue that ties everything together.
PS. Here are milestone 1 and milestone 2
-
@ 6d5f85c4:a3f7e400
2023-07-30 21:58:47About me
Good day nostr! Was tagged by for introductions so here it goes.
Not gonna give away too much of the spice, but here are some of the things that have taken up a majority of my life.
- Wildland Firefighting 🔥
- Military 🎖️
- Skiing ⛷️
- Photography & Cinematography 📽️
- Forestry 🌲
- Exploring the World 🛫
Slowly adding freedom technology to the list but it is rather newish in my life. Looking forward to seeing #nostr grow as a community and meeting more of you all. Have a good day everyone!
-
@ 393c8119:75e43710
2023-07-31 08:01:02Photo by Ousa Chea on Unsplash
This is part 2 of the blog series on Science Token Engineering. If you haven’t already, read part 1 of this series. By now, we have established the major limitations of current science value flow: flow linearity, centralization of value, and dependence on centralized agencies for this value flow to even exist. Today, let’s take a look at an alternative system which solves all of these issues.
One project in the Web3 space that has completely transformed a traditional value flow is Ocean Protocol. Very briefly, Ocean Protocol allows you to take full ownership of your data and sell it as an asset in a decentralized marketplace so that other people can run compute jobs on your data without actually seeing it. There’s more to Ocean Protocol than that, but it is this specific feature that incentivizes complete data sharing, as you can get value for your data without actually losing the intellectual property those data hold.
The data economy spans a large number of fields, including science, hence the obvious question: Can we apply the concept of a decentralized market to scientific research? In this post, we’ll take a closer look at a DeSci ecosystem centered around decentralized science marketplaces and describe how this system improves on the inefficiencies of academia described in Part 1 of this series, namely, linearity of the value flow, centralization of value, and an overall dependence on centralized agencies.
Decentralized Science Marketplaces
Let’s assume we can, and that such a marketplace exists. Essentially, a decentralized science marketplace (or DeSciMart for short) replaces traditional centralized knowledge curators such as scientific journals, and since each seller can choose the level of privacy on the assets they’re selling, a DeSciMart incentivizes complete sharing of new scientific knowledge. In other words, researchers suddenly have a space where they can share all the data they collect, all the new algorithms or programs they develop, and choose the licenses they attribute to the research papers they publish. What’s more, researchers can also now reap the benefits of their knowledge assets over time using a DeSciMart. For instance, imagine a research project that publishes a private dataset and a research paper to the decentralized marketplace. Not only do the researchers have complete ownership of that research paper, but also if a new researcher wants to use the existing dataset in their own research, they can pay to run compute jobs on it, meaning that a research project conducted years ago can be rewarded continuously based on its actual value within the wider scientific community.
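To make the compute-to-data idea concrete, here is a toy sketch in Python. It is purely conceptual, not Ocean Protocol's actual interface, and the class, prices, and job are invented for the example.

```python
# Conceptual sketch of compute-to-data: buyers pay to run a job on a private
# dataset and only see the result, never the raw data. Not Ocean Protocol's API.
class PrivateDataset:
    def __init__(self, owner, data, price_tokens):
        self.owner = owner
        self._data = data            # never exposed directly
        self.price = price_tokens
        self.earnings = 0

    def run_job(self, buyer_wallet, job):
        """Charge the buyer, run their job on the hidden data, return only the output."""
        buyer_wallet["balance"] -= self.price
        self.earnings += self.price
        return job(self._data)

dataset = PrivateDataset("alice", [2.1, 2.4, 1.9, 2.2], price_tokens=10)
buyer = {"balance": 50}
mean = dataset.run_job(buyer, lambda xs: sum(xs) / len(xs))
print(mean, dataset.earnings, buyer["balance"])   # 2.15 10 40
```

The researcher keeps ownership of the asset and earns every time it is used, while the buyer only ever sees the aggregate output of their job.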
A DeSciMart offers a clear solution to the centralization of the final value from scientific research (the end of the value flow in Part 1), but how do we solve the initial centralization of funding? At the heart of the Web3 movement are DAOs (Decentralized Autonomous Organizations), which are entirely community run enterprises with a common goal, distributed governance, and so much more. DAOs share a number of characteristics with traditional companies, e.g. a treasury that is curated by members of the respective group. Similarly to how a centralized agency provides funding for research grants, a decentralized organization can fulfill this role, only with more flexibility in the design of its funding mechanisms. This is actually not a brand new concept as decentralized funding has been playing a huge part in the development of many projects.
Putting everything together, a possible value flow for decentralized science might look something like this:
Figure 1. Schema of the Web3 profit sharing model
Let’s take a look at how this model solves the problems with the current status quo of scientific research.
1. Flow Linearity
As in the traditional system, value in DeSci starts in some treasury. Researchers then apply for grants and the best proposals (note: we’ll discuss the notion of a best proposal in a different post) get funded. This funding is once again used to buy the necessary resources for the research project (data, equipment, etc.). However this time, instead of using funding to publish a limited subset of the knowledge that is then submitted to a centralized scientific journal, all knowledge assets are published to the decentralized knowledge market. The flow linearity is broken in this model in two ways. First, since researchers retain ownership of anything they publish, they will receive continuous rewards as other members of their community pay to gain access to their knowledge assets, thus value flows back to the researchers and is not locked within a centralized entity. Second, the decentralized knowledge market can collect transaction fees which are fed back into the DAO treasury, thus increasing the sustainability of the system. This model is largely influenced by the Web3 Sustainability Loop proposed by Trent McConaghy and its primary mechanism for achieving a fair value distribution is a circular flow of value.
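A back-of-the-envelope sketch of this circular flow is given below; the rates are arbitrary assumptions and this is not a DARC-SPICE simulation.

```python
# Toy simulation of the circular value flow: treasury -> grants -> market
# activity -> fees back to the treasury. All rates are arbitrary assumptions.
def simulate(treasury=1_000_000, grant_rate=0.10, market_multiple=3.0,
             fee_rate=0.02, years=10):
    for year in range(1, years + 1):
        grants = treasury * grant_rate            # funding sent to researchers
        market_volume = grants * market_multiple  # downstream knowledge-market activity
        fees = market_volume * fee_rate           # fees routed back to the treasury
        treasury += fees - grants
        print(f"year {year:2d}: treasury={treasury:,.0f} grants={grants:,.0f} fees={fees:,.0f}")

simulate()
# With these particular numbers the treasury shrinks slowly; sustainability
# depends on fee income eventually matching the grant outflow.
```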
2. Centralization of Value
Inevitably, as more people adopt the Web3 model for funding, conducting, and sharing scientific achievement, the DAO Treasury and the Knowledge Market will store more and more value; however, don’t confuse value with centralization. The DAO Treasury is, by its definition, community-run, therefore there is no centralized entity that can operate without a standardized decision-making process that includes everyone in the community. Similarly, the Knowledge Market doesn’t belong to anyone in particular. Yes, some specific people have worked to implement it and perhaps set the transaction fees (although these can be set dynamically by an algorithm), but once the market is deployed, there is no off-switch, and there is no key that gives any potentially malicious actor access to its contents.
3. Dependence on Centralized Agencies
The points above should be enough to convince you of the basic idea behind an open science ecosystem; however, it’s a good idea to also discuss the limitations of this model, which are partly tied to the dependence on centralized agencies. Part 1 of this blog series outlined the problems of current science value flow. Part 2 showed how we can improve the systems that are currently in place to solve these problems. While it would be truly incredible if migration to a new system was entirely independent of its previous versions, that is often not realistic and sometimes not even desirable. A gradual shift towards the Web3 science ecosystem will require support from the centralized agencies that can identify the greater good that open science can bring into the world. Furthermore, reaching the point of sustainability will require a tremendous reallocation of value currently locked in the centralized agencies discussed previously. Consequently, the Web3 open science model is not entirely free from the dependence on centralized agencies, but it does give us the ability to design the ecosystem so as to reach sustainability in a minimal period of time.
Conclusion
In this post, we outlined an alternative model for scientific value flow in which profit is shared fairly between researchers contributing to the expansion of human knowledge. This model solves the flow linearity problem and the centralization of value problem outlined in Part 1 of this series, but is limited by the transition period that any change of systems entails.
So what’s next? The model described in this post provides a high-level overview of what we want to focus on; however, it is very broad in the sense that if you took away the words “science,” “knowledge,” and “researcher,” you can describe almost any field’s Web3 ecosystem. With this in mind, we can develop a higher resolution model that specifically considers the role of science in society and tries to develop the right mechanisms to incentivize the maximization of scientific value. See you in Part 3!
Science Token Engineering Blog Series
- Science Token Engineering Part 1: The Problem with Science (previous)
- Science Token Engineering Part 2: The Profit Sharing Vision (current)

By: Jakub Smékal. Link: https://pulse.opsci.io/science-token-engineering-part-2-the-profit-sharing-vision-bfc7c4f69f69
-
@ 32e18276:5c68e245
2023-07-16 22:47:17Hey guys, I just pushed a new Damus update to TestFlight. This should drastically improve longform event rendering. Let me know if you find any bugs!
Full Changelog
Added
- New markdown renderer (William Casarin)
- Added feedback when user adds a relay that is already on the list (Daniel D'Aquino)
Changed
- Hide nsec when logging in (cr0bar)
- Remove nip05 on events (William Casarin)
- Rename NIP05 to "nostr address" (William Casarin)
Fixed
- Fixed issue where hashtags were leaking in DMs (William Casarin)
- Fix issue with emojis next to hashtags and urls (William Casarin)
- relay detail view is not immediately available after adding new relay (Bryan Montz)
- Fix nostr:nostr:... bugs (William Casarin)
-
@ 393c8119:75e43710
2023-07-31 07:56:20Photo by Shubham Dhage on Unsplash
This is the first article in a series of blog posts discussing science token engineering, the focus of my Open Web Fellowship at OpSci.
A Synthesis of Computer Science & Behavioral Economics
First, what is science token engineering? I highly recommend this excellent blog post by Trent McConaghy as an overview of this exciting, rapidly growing field in the Web3 space.
Essentially, science token engineering is a practice of applying token engineering principles specifically to scientific systems. More specifically, we look at science from the perspective of value flows (Where does the value originate? What happens to it in the system? Where does it end up?). This can lead us to identify inefficiencies in the system that we can directly address with alternative value flows, which are later verified in a simulation. The software used for simulating scientific value flows is called DARC-SPICE and if you’re interested, I highly encourage you to check out this technical excursion of the different simulations.
This will all make more sense down the line, so let’s begin with the main question:
What does the current science value flow look like?
The Science Value Flow
To answer this question, imagine you’re a research scientist applying for funding. You submit a proposal for a research project in the hopes of receiving a grant either from the institution you work at or from an external agency. Once you receive that funding, you’ll use it to cover the costs of getting all the necessary resources for your research (data, equipment, personnel, etc.), but part of that funding will inevitably go towards publishing your results in a scientific journal to disseminate your findings to the larger scientific community.
Going back to the original question of value flow, we can see:
- all value originated in some grant funding agency (fully monetary value),
- which was then transformed by the researchers into new knowledge (intellectual value),
- which was finally captured within a knowledge curator, in this case a scientific journal (both intellectual and monetary value).
Omitting the inevitable leakage of value caused by uncontrollable variables, it’s clear that the science value flow is incredibly linear; it starts in one centralized place (a grant funding agency) and ends up in another (a scientific journal).
Figure 1. A schema of the science value flow model
Figure 1 shows this linear value flow. Now, you might wonder whether there is anything wrong with this model. After all, this is how scientific research has been conducted for almost three centuries. I outline the problems with this model below.
Problems with the Baseline Science Model
The centralization of value, both at the level of funding agencies and knowledge curators, can (and does) introduce a number of inefficiencies. For instance, if I am a researcher and have spent years collecting valuable data, that data probably doesn’t belong to me, so I have little to no control over what happens to it [1,2]. Furthermore, with the little control I do have, I will not want to share that data since it represents my potential competitive advantage in getting future grants for additional research and for getting recognition within the scientific community when utilizing this data to justify/support/falsify evidence-based claims.
And what if somebody wants to do research on data that has already been collected, but isn’t available to use? It means the data needs to be collected again, which requires resources that could have been used on processing the existing data. Essentially, the current flow of value does not incentivize collaboration and data sharing, which is inefficient.
In summary, we have so far identified the following problems with the current science value flow:
- linear flow of value,
- value is centralized, and
- research is dependent on centralized agencies.
Conclusion
These three points outline the motivation behind science token engineering, which seeks to solve these issues by designing a new community where the incentives of all participants are aligned to maximize efficiency of scientific research and fairness of value distribution. Thanks to the incredible world of Web3, scientists can be free of the dependence on centralized agencies, they can retain ownership of the work they do and receive fair rewards based on their contributions. Together, we’ll explore how we can reach this goal. Stay tuned for Part 2.
By: Jakub Smékal. Link: https://pulse.opsci.io/science-token-engineering-part-1-the-problem-with-science-ab6a6a33fa39
-
@ 4d444439:7ed2458b
2023-07-30 13:38:17By Daniel Kang, Kobi Gurkan, and Anna Rose
Feel free to read all my articles on the anti-censorship long content platform yakihonne.com.
AI-generated audio is becoming increasingly indistinguishable from human-produced sound. This emerging technology, while impressive, is unfortunately increasingly misused. We’re witnessing instances where this convincingly replicated audio is being manipulated to conduct scams, perpetrate identity theft, and misused in other ways. How can we safeguard ourselves and effectively combat the misuse of this advanced technology?
In an environment where AI-generated audio can mimic human voices flawlessly, we need a reliable chain of trust stretching from the initial capture of audio to its final playback. This chain of trust can be established using cryptographic technologies: attested microphones for capturing the audio and carried through to the final playback via ZK-SNARKs.
In the remainder of the blog post, we’ll describe how to use these tools to fight AI-generated audio. We’ll also describe how the open-source framework zkml can generate computational proofs of audio edits, like noise reduction. To demonstrate this end-to-end process, we’ve simulated the pipeline from capturing audio to performing verified edits. We’ll describe how we did this below!
Cryptographic tools for fighting AI-generated audio
Establishing a chain of trust from the audio capture to final playback requires trusting how the audio is captured and how the audio is edited. We will use cryptographic tools to establish this chain of trust.
Attested microphones for trusted audio capture
The first tool we will use are called attested microphones. Attested microphones have a hardware unit that cryptographically signs the audio signal as soon as it is captured. This cryptographic signature is unforgeable, even with AI tools. With this signature, anyone can verify that the audio came from a specific microphone. In order to verify that audio came from a specific individual, that person can publish the public key of the attested microphone.
Unfortunately, there’s two limitations of attested microphones. The first (which we will address below) is that attested microphones don’t allow you to perform edits on the audio, including edits like noise reduction or cutting out sensitive information. The second is that these attested microphones currently don’t exist, even though the technology is here. We hope that hardware manufacturers consider building attested microphones to combat AI-generated audio!
ZK-SNARKs for verified edits
Once we have the raw audio, there are many circumstances where we want to privately edit the original audio. For example, intelligence agencies can use background noise to identify your location, which compromises privacy. To preserve privacy, we may want to perform edits like removing the background noise or cutting out parts of an interview that might contain sensitive information.
In order to perform these edits, we can use ZK-SNARKs. ZK-SNARKs provide computational integrity. For audio, ZK-SNARKs allow the producer of the audio to privately edit the audio without revealing the original. Similar to cryptographic signatures, ZK-SNARKs are unforgeable, allowing us to extend the chain of trust to edits.
Demonstrating the technology
To showcase the power of attested microphones and ZK-SNARKs, we’ve constructed an end-to-end demonstration of the chain of trust for audio. In our demonstration, each of us (Anna, Daniel, and Kobi) recorded a short 30-second clip on our own microphone. In other words, there are three 30-second clips.
Because attested microphones don’t exist yet, we simulated the attested microphone by signing the individual audio clips with Ethereum wallets. These wallets contain private keys that would be similar to the secure hardware elements in the attested microphone. The signatures we’ve produced are also unforgeable, assuming our wallets aren’t compromised.
During the recording process, Daniel’s microphone picked up some background echo, so we wanted to cut it out and combine the clips into one. We produced a ZK-SNARK that verifies these edits were done honestly from the original audio clips. Furthermore, the ZK-SNARK hides the input audio, so you won’t be able to extract the background noise in Daniel’s clip! This helps preserve privacy.
In the following demo, the final audio file is presented coupled with a proof and a set of signatures. The verification program verifies both, ensuring we know the exact chain of operations performed on the input audio files resulting in the audio you can hear.
Technical deep dive
To understand how our demonstration works at a deeper level, we’ve done a technical deep dive below. You can skip to the conclusion without missing anything!
We’ve outlined the overall architecture below:
Overall architecture of trusted audio
As we can see, the first step (after capturing the audio) is to produce the signatures. Since we don’t have attested microphones, we used Ethereum wallets, whose addresses are publicly associated with us (Anna, Daniel, and Kobi), to sign hashes of the original audio. Ethereum uses ECDSA, which allows anyone to verify the signatures we produced with our public key. The private key must remain hidden. In hardware, this can be done using trusted enclaves. The hardware manufacturer can destroy the private key after it is placed on the device; once that is done, the private key is inaccessible!
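To make this step concrete, here is a minimal sketch, assuming the eth_account Python library, of how one clip could be hashed and signed, and how anyone could recover the signer's address to check it against a publicly known one. The file name and freshly generated key are placeholders for illustration, not the exact setup used in the demo.

```python
import hashlib
from eth_account import Account
from eth_account.messages import encode_defunct

# Placeholder key, standing in for the secure key an attested microphone
# (or one of our wallets) would hold.
acct = Account.create()

# Hash the raw audio bytes; this hash is what gets signed and later
# revealed alongside the ZK-SNARK proof.
audio_bytes = open("clip_daniel.wav", "rb").read()  # hypothetical file
audio_hash = hashlib.sha256(audio_bytes).digest()

# Sign the hash as an Ethereum-style message.
message = encode_defunct(primitive=audio_hash)
signed = acct.sign_message(message)

# Anyone can recover the signing address from the signature and compare
# it to the address the speaker has published.
recovered = Account.recover_message(message, signature=signed.signature)
assert recovered == acct.address
print("audio hash:", audio_hash.hex())
print("signed by :", recovered)
```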
Given the signed input audio clips, we want to be able to edit them with computational integrity while preserving the privacy of the original audio. Under the random oracle model of hashing, the hashes reveal nothing about the input. We can combine the hashes with ZK-SNARKs to preserve privacy.
ZK-SNARKs allow a prover to produce a proof that a function executed honestly while keeping parts of the input hidden (and selectively revealing certain inputs or outputs). In our setting, the function computes the hashes of the input clips and produces the edited audio from them. By revealing the hashes, we can be assured that the inputs match the recorded audio! We’ve shown what happens within the ZK-SNARK below:
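To make the statement concrete, here is a plain-Python sketch of the relation a proof would attest to, under the simplifying assumption that the edit is a trim-and-concatenate rather than a full noise-reduction model. This is only an illustration, not the zkml circuit itself: the hidden clips must hash to the public hashes, and the public output must equal the declared edit of those clips.

```python
import hashlib

def edit_relation(input_clips, public_hashes, public_output, trim_bytes=0):
    """Sketch of the statement a ZK-SNARK would prove.

    Private witness: input_clips, the raw audio bytes from each microphone.
    Public inputs:   public_hashes (one hex digest per clip) and
                     public_output (the edited audio everyone hears).
    """
    # 1. The hidden inputs must match the signed, publicly revealed hashes.
    for clip, expected in zip(input_clips, public_hashes):
        if hashlib.sha256(clip).hexdigest() != expected:
            return False
    # 2. The published audio must equal the declared edit of the inputs
    #    (here: trim some bytes from the first clip, then concatenate).
    edited = input_clips[0][trim_bytes:] + b"".join(input_clips[1:])
    return edited == public_output

# Toy usage with byte strings standing in for audio clips.
clips = [b"anna-audio", b"daniel-audio-with-echo", b"kobi-audio"]
hashes = [hashlib.sha256(c).hexdigest() for c in clips]
output = clips[0][4:] + clips[1] + clips[2]
print(edit_relation(clips, hashes, output, trim_bytes=4))  # True: relation holds
```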
Conclusions
As we’ve seen, attested microphones and ZK-SNARKs can provide a chain of trust for audio while preserving privacy. With the rise of AI-generated audio, we’re seeing an increasing need to establish this chain of trust. We hope that our demonstration will spur hardware manufacturers to consider building attested microphones.
Stay tuned for more posts on this topic as we delve deeper into other tools to fight malicious AI-generated content. And if you’d like to discuss your idea or brainstorm with us, fill out this form and join our Telegram group. Follow me on Twitter for the latest updates as well!
Important note: the code for this demonstration has not been audited and should not be used in production.
By:Daniel Kang Link:https://medium.com/@danieldkang/fighting-ai-generated-audio-with-attested-microphones-and-zk-snarks-the-attested-audio-experiment-d6ea0fc296ac
-
@ e8ed3798:67dd345a
2023-07-16 02:49:48This article has been translated into Japanese here: https://yakihonne.com/article/naddr1qq257w2t8qeksc6tdg6njnekdc6x55j0w56nvq3qarkn0xxxll4llgy9qxkrncn3vc4l69s0dz8ef3zadykcwe7ax3dqxpqqqp65wu2llgg
This article has been translated into French here: https://yakihonne.com/article/naddr1qq2h23jjwck4zajsv4485h68f5mj6c66vfmxuq3qarkn0xxxll4llgy9qxkrncn3vc4l69s0dz8ef3zadykcwe7ax3dqxpqqqp65wvsexdg
In this article we are going to explore the conceptual origins of the original metaverse called "cyberspace" and see how nostr finally enables it to exist as it was predicted in early science fiction. Then we will explore what cyberspace might be able to do for humanity and how you can contribute to this exciting new open-source metaverse project.
A Concept Obscured by Time
What is a "metaverse"? Ask 10 different people and you will get 10 different answers. Some will say that it is an online game where you can use and transfer crypto assets. Some will say it's a virtual reality experience with extrasensory input for things like smell and touch. Some will say that a metaverse is anywhere you can connect and express yourself digitally. And some may tell you the metaverse doesn't exist yet because we don't have the technology to make it happen.
It's hard to define what a metaverse is because nobody has convincingly built it yet. This is demonstrable by simply asking anyone to show you a metaverse. They may show you VR Chat, or Meta (Facebook), or the HoloLens or Apple Vision, or someone's Discord server or NFT ecosystem, or say "it's not real, and if it is it's probably stupid."
But how did we get here? Why do we all know about the metaverse but we can't define it and we don't even know what it looks like? Where did the idea of the metaverse come from? This answer will give us the conceptual background we need to untangle the question of "what is a metaverse" and see how cyberspace can exist today.
The word "metaverse" was first popularized in Neal Stephenson's 1992 book "Snow Crash", and he was in turn inspired by William Gibson's earlier 1984 book "Neuromancer" (and 2 other books in a trilogy called "The Sprawl"), which popularized the word "cyberspace" and "matrix" in reference to digital 3D spaces.
Accordingly, cyberspace preceded "the metaverse" by 8 years and serves as the foundation for our exploration. Gibson's cyberspace was a digital 3D world one could connect to via a cyberspace deck — a machine/brain interface — and interact with all the data in the entire world. Artificial Intelligences guarded data constructs and kept out intruders with lethal feedback programs that would fry human operators' brains. In the books, cyberspace is described as being used for pretty much everything: entertainment, education, communication, commerce, data storage, and crime, and it is used daily by billions of people throughout the world, including millions of people in orbital colonies.
Mysterious Properties
In Gibson's work, cyberspace has many fascinating qualities that, until nostr, remained very mysterious and seemed to be impossible. Here are some examples.
1. Cyberspace is Permissionless
Everyone can use cyberspace but nobody has full control over it; cyberspace seems to exist outside of every jurisdiction and system. It never goes down for maintenance or has connection issues or suffers security breaches. It seems to exist everywhere at once, even though it is explicitly stated that cyberspace was created by humanity. How can humanity create an uninterruptable, omnipresent digital system that nobody can control but everyone can use?
2. Power is Wielded Without Privilege
Certain people, corporations, and AI wielded greater levels of power in cyberspace than others, but the mechanism of that power was not derived from permissions or access levels or privileged administrator capabilities; rather, the power came from some mysterious other source that could not be granted, governed, or revoked by any law or system or authority. This power seemed also to be heavily influenced by...
3. Hardware and Skill
The hardware that people used to connect to cyberspace had a direct impact on their capabilities in cyberspace. There is a specific example where a character uses a premium cyberspace deck and describes the speed and smoothness of their ability to fly through cyberspace. A short while later, this operator is captured and immobilized by another dangerously skilled operator, demonstrating that while the cyberspace hardware was top-tier, the operator's skill level was also a big factor governing interactions in cyberspace. This also demonstrates that conflict is possible in cyberspace. How is it possible that hardware directly correlates to your capacity for virtual action? How can someone else influence your actions against your will in this digital reality?
4. Space is Scarce
Bitcoin was the first scarce digital resource to ever exist, and having only been created 14 years ago, scarce digital resources are still a novel concept to humanity in 2023. In Gibson's cyberspace, territory was conquered, captured, and fought over, indicating that the space in cyberspace may be scarce or valuable in some way. How could it be possible to fight for digital territory or acquire it without any governing intermediary to keep track of who owns what?
5. Construction Has a Cost
Constructs, or cyberspace "buildings", required some kind of effort or cost to create, but it was not clear to whom this cost was paid or how the effort was expended.
The metaverse of Stephenson's "Snow Crash" shares many properties with Gibson's cyberspace, although Stephenson is somewhat opinionated with the literal shape of the metaverse. He depicts it as a street that wraps around a black planet, and the land off the street is where people build things and claim territory.
This metaverse similarly does not seem to be owned or controlled by any single entity, yet it is available to everyone in the world to connect, interact, and build in 24/7. It is not described as an application or piece of software, but rather a place that is the sum of other software created by many different parties with different interests and motivations.
Here are some mysterious traits of Stephenson's metaverse:
6. Rules without Rulers
The rules of the metaverse were never broken — not even by hackers or bad guys. How are rules enforced by a system nobody controls?
7. No Teleportation, Localized Rules
The metaverse had rules about where you could spawn in and how you could move. Teleportation was not an option. Specialized vehicle software could make traveling easier. Rules for travel were enforced by the metaverse itself. Experiences in the metaverse were localized to constructs that implemented their own rules. For example, sword fighting was allowed in The Black Sun hacker haven because it was programmed to exist there; other places did not have such activities. How can certain places in cyberspace have unique rules separate from the rest of cyberspace? How is locality enforced in a digital system controlled by no one?
8. Customize Without Compromising Everything
People were able to customize how their avatars and constructs looked. How do you govern this so people don't abuse it? Without enforceable rules, one person could make their avatar be the size of the universe and ruin the metaverse for everyone.
9. Everything is Connected
Digital systems in the real world had a presence in the metaverse too, although they were far from commonly populated areas. How do real world systems relate to the metaverse?
I enumerate these points in order to provide some context for where the idea for "the metaverse" began. These two books were instrumental in originally defining what the metaverse was, even though they raised many specific questions that naturally were never addressed. As I have said: if the authors knew how to build the things they wrote about, they might not need to sell any books at all! This is fiction after all. Specifics are not required, but I have been focused on uncovering these specifics in the context of new technologies such as bitcoin and nostr, and I believe they may now be defined.
To summarize, here are the mysterious properties of cyberspace/the metaverse:
- Everyone can use it
- Nobody controls it
- It exists everywhere
- An individual's power is not granted by any system or authority but comes from their hardware and skill
- Individuals can wield their power against each other
- Territory is scarce and may be captured
- Constructs may be built on territory but have some kind of cost
- The system enforces rules on everyone, or somehow incentivises everyone to follow the rules — even bad actors
- Locality is enforced and travel requires time
- Certain localities have unique rules
- Freedom to customize your avatar and constructs is bounded in some way to prevent total corruption and abuse of digital space
- There is some form of connection between real-world digital systems and cyberspace
Nothing Like It
I want to make it very clear that no metaverse in existence today exhibits all of the above properties or even a few of them, and very few digital systems exhibit even one of these properties.
Before I discovered Neuromancer several months ago, my perception of "cyberspace" was that of a quaint, naive concept of what interconnected computer systems would become that never came to pass. I used to ironically refer to the internet as "cyberspace" when talking with other developers as a kind of humorous or self-important overemphasis. I thought that cyberspace was a cultural relic and a failed prediction of what the internet would be. The glossy, 3D wireframe neon world never materialized, and for this I've always felt a sense of loss of what could have been.
Why didn't cyberspace ever come into existence? As computers became more advanced and consumer-focused, they did not attempt to emulate a 3D space, but rather did the practical and sensible thing of emulating paper documents so people could get work done digitally.
As the internet became a household utility, centralized systems and applications were the first and only way that humanity knew to grow our collective capabilities online. Websites grew as they collected users and became behemoths. Ebay, Google, Amazon, PayPal. Although the internet itself was technically decentralized, decentralized systems and applications would come later when the problems with centralization became obvious as these internet giants started abusing their power.
As the internet continued to grow and resemble less and less the fictional worlds of cyberspace and the metaverse, these lofty ideas of digital 3D worlds turned into toys for most people — unnecessary but entertaining fluff. The notion of cyberspace faded into 80's retro culture and the word cyberspace was painfully repurposed to simply refer to the internet or networked systems collectively. This, however, is a terribly inaccurate use of the word. There is no space to speak of on the internet. But, sadly, the internet was the closest thing we have ever had to Gibson's vision of cyberspace, and it really wasn't close by any measure except that people all over the world use it for pretty much everything. All the other magical properties of cyberspace were simply forgotten. They were fiction. They weren't necessary. They weren't possible. There was no way to make it work and no reason to do so.
However, when reading these books from nearly 4 decades past, there is something poignant to me in the fact that across these various works by various authors, the metaverse (or cyberspace) possesses consistent properties that no author is willing to alter. What is so inviolate about a fictional concept that would make it so consistent across works? Perhaps there is a kernel of truth in what is imagined, and that truth is so compelling that it must be repeated until it crosses the veil between idea and reality.
Reality And Cyberspace
I have always been fascinated by technology's ability to improve our lives, and I have strived to create consequential software my entire life, like video games played via fitness watches and augmented reality navigation web apps, but it wasn't until I read Softwar that I realized something profound.
In reality, I am able to do anything I want as long as I have the thermodynamic energy to do it.
It may not be legal or socially acceptable, but if I have the energy to do something, I can. Likewise, if someone wants to stop me from doing something, they can't unless they also expend enough energy to stop me.
This means that reality is permissionless, because nobody can disallow my actions. It also means reality is thermodynamic, because every action has a cost that must be paid to the universe in the form of entropy.
Conversely, in digital systems, the amount of energy you have is irrelevant. The things you can do are only the things you are allowed to do by the permissioned system, or, the things you can trick the system into letting you do. This means that within a digital system, you always have a severely limited subset of available actions, and your ability to execute those actions has nothing to do with your thermodynamic potential and everything to do with the permissions you are granted by others. No matter how strong my muscles are or how clever I am, I can't do more than a digital system allows me to do, and even if I hack it, I still can't do anything I want — only more than I'm supposed to.
Almost all digital systems and software are permissioned and non-thermodynamic.
This is why, fundamentally, no metaverse that has ever been built actually matters. This is why no video game keeps you interested in it forever, because the actions that you can do and the extent to which you can do them are arbitrarily disconnected from your ability to act in reality. It's "just a game" — a limited subset of reality.
This isn't fundamentally a bad thing. Obviously, software has done a lot for humanity. And software-based rules have been used in many contexts to keep things fair for everybody. Except, unfortunately, digital systems are never truly fair. There's always a programmer who wrote the rules. There's always an admin above you. There's always a hacker. And there's always someone with more permissions than you who can restrict your potential. Compared to reality, this seems arbitrary and capricious. At least in reality, the universe to which you pay your entropy costs is truly, terribly impartial and unbiased. This is why thermodynamic systems are fundamentally fair; even though it may not seem fair to you it is truly fair to everyone.
Reality is a permissionless, thermodynamic protocol, and almost nothing in the digital world resembles this whatsoever.
Nostr, however, is permissionless. Until now you may not have considered this to be a fundamental property of our universe, but it is, and the fact that nostr exhibits this same property is quite compelling. In fact, any system that is truly permissionless (of which there are very few) seems to have the magic ability to capture people's imaginations in ways that no other digital system can. Bitcoin, perhaps the most famous permissionless digital system, has such broad and profound effects on people who grasp it that they have been known in some cases (such as Andreas M. Antonopoulos') to stop eating, stop sleeping, research it obsessively, and completely change their entire way of life. How's that for consequential software?
Like bitcoin, nostr can also be thermodynamic via proof-of-work (NIP-13). And with the combination of these two properties, which are shared with reality, we are suddenly able to dispel every single mystery we've encountered regarding cyberspace.
Proof-of-work is the secret ingredient to dissolve the fiction in the science fiction of cyberspace.
A Mystery Solved About Cyberspace and Bitcoin
The reason that the properties of cyberspace and the metaverse are so mysterious is because they are actually properties of reality, but inside a digital system.
In reality, it is no surprise that your hardware and skill affects your ability. No surprise that your work determines your power. No surprise that you can't teleport and must travel using energy. No surprise that building constructs or customizing yourself takes effort and cost. No surprise that territory is scarce and must be defended. No surprise that conflict happens between people. All of this is so completely normal that it's easy to overlook.
The reason these things in Neuromancer and Snow Crash and other works about the metaverse seemed so mysterious is because they weren't possible to model in a digital space. These authors took properties of reality and put them in a digital space, and it seemed amazing because nobody knew how it could actually work.
This is how it is possible: you design a digital system that has the same fundamental properties as reality. It must be permissionless. It must be thermodynamic. Then you have a system wherein cyberspace can exist.
Nostr is the fulfillment of these requirements. Like bitcoin, nostr has captured the imaginations of thousands of early adopters and developers. The magic is there. But few may realize why it feels so magical. The answer is that when you make a digital system that can model reality's own properties, you have created an extension of reality itself. This is one of the most significant discoveries in human history, because a digital extension of reality can allow humanity to connect, collaborate, and grow in a place where their physical distance does not matter. Barring any major advancements in spacefaring technology, cyberspace will be the most significant departure from earth you may experience in your lifetime.
I posit that a virtual action which is permissionless and thermodynamic is as real as a physical action, except the consequences of that action happen in the digital space rather than the physical one. It's not quite reality, but it is like a mirror. It is a true extension of reality. Nothing in it is disconnected from the real world. And therefore, unlike any other digital system, it can be truly consequential.
Healthy Skepticism
Wait a minute, you may say. Cyberspace is still just a digital way of communicating. Isn't email and texting and video chat good enough? Why do we need cyberspace? How could it possibly be so important to humanity?
This is a great question.
One may ask similar questions about bitcoin. Don't we already have digital money? Why do we need absolutely scarce, decentralized censorship-resistant money?
Or about nostr: why do we need another way to transmit text? What good is it to be decentralized and censorship resistant?
Like bitcoin, cyberspace makes something digital into a scarce resource, but instead of enabling the capture of time (on a timechain) cyberspace enables the capture of scarce space (in a construct or your movement chain).
In the physical world, our movement can easily be censored. Our expression of power is censored. Our capture of space is censored. Not so in cyberspace. The only thing that matters in cyberspace is the thermodynamic energy you exert — just like in reality, but minus the permissioned (artificial) power structures of governments and laws.
Just as bitcoin doesn't care who you are or who you are transmitting value to, cyberspace doesn't care who you are or where you are moving to or where you are claiming space.
Consequences of Proof-of-Work in Detail
I'd like to enumerate the specific ways in which proof-of-work in nostr demystifies the mysterious properties of cyberspace mentioned earlier.
- Everyone can use cyberspace but nobody has full control over it.
Just replace "cyberspace" with "nostr" and the above statement is already true. If we build cyberspace on nostr, then cyberspace will inherit these properties.
- Certain people, corporations, and AI wielded greater levels of power in cyberspace than others
If we base an entity's power on their thermodynamic output via proof-of-work, then we have a permissionless way for cyberspace participants to enact their real thermodynamic potential in cyberspace to whatever degree they choose.
- The hardware that people used to connect to cyberspace had a direct impact on their capabilities in cyberspace.
If all actions in cyberspace are thermodynamic like in reality, then hardware capable of greater amounts of proof-of-work will enable more powerful actions. Mobile phone, desktop computer, or ASIC farm: take your pick.
- Territory in cyberspace was conquered, captured, and fought over, indicating that it may be scarce or valuable in some way. Constructing had a cost.
The territory in cyberspace is the maximum 3-dimensional coordinate space that can be represented by 256 bits. In cryptography, nostr, and bitcoin, 256-bit numbers are very commonly used along with mathematical functions like SHA-256 that process data in blocks of 256, so it is a good "round" amount of information to work with.
You can divide a 256-bit number into X, Y, and Z coordinates that are each 85 bits long. This leaves 1 extra least-significant bit from the 256 bits that is ignored.
This means that each axis of cyberspace is 2^85 units long.
Space can be claimed by publishing a construct event whose event ID is the coordinate. The event ID is obtained from hashing the event (standard process for all events in the nostr protocol). This means the event ID can be mined to obtain the coordinates you desire (or at least something close by).
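As a rough sketch of how a 256-bit event ID could map onto a coordinate, the snippet below takes the highest 85 bits as X, the next 85 as Y, the next 85 as Z, and ignores the lowest bit. The exact bit layout is an assumption for illustration; the cyberspace spec may define a different mapping (for example, interleaved bits), so check the spec linked later in this article for the canonical rule.

```python
def id_to_coordinates(event_id_hex: str):
    """Split a 256-bit nostr event ID into three 85-bit axes.

    Bit layout (high to low: X, Y, Z, one ignored bit) is an assumption
    made for this sketch; see the cyberspace spec for the real mapping.
    """
    n = int(event_id_hex, 16)      # the ID as a 256-bit integer
    n >>= 1                        # drop the ignored least-significant bit
    mask = (1 << 85) - 1
    z = n & mask
    y = (n >> 85) & mask
    x = (n >> 170) & mask
    return x, y, z

# Example with an arbitrary 64-hex-character event ID.
x, y, z = id_to_coordinates("8f" * 32)
print(x, y, z)   # each axis value lies in the range [0, 2**85)
```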
If two constructs overlap, the one with the most proof-of-work wins. This is how territory can be contested in a permissionless way in cyberspace, as alluded to in Gibson's works.
The cost of construction is the proof-of-work, and the maintenance of that territory by proof-of-work is the digital analogy of either paying taxes to a government who will protect your land, or, protecting your land with your own thermodynamic energy. Notice how money, also known as time, is inextricably linked to the ownership of property in reality and now also in cyberspace. If property were free to own and maintain, would it be worth anything at all?
I have been presented with the argument that it does not cost you anything to hold bitcoin, so constructs or cyberspace real estate should be no different. To this I say that for you to hold your bitcoin, a tremendous amount of cost is expended by all the miners in the world. If not for them, your bitcoin would be double-spent or stolen by invalid transactions. The validity of your unmoving bitcoin requires the continual, perpetual operation of the largest computer network in the world. So, continual energy expenditure to secure your property is true with bitcoin, cyberspace, and all of your things in reality as well.
Unlike bitcoin, whose value lies in the entire network, constructs are valuable only to their owner. Therefore, it is the responsibility of the construct's owner to defend it.
More complex symbiotic relationships in construct defense may be borne out of the perpetual fight for survival inherent in any thermodynamic system. But this is only speculation.
Enforcing rules universally
As with any protocol, deviation excludes you from it and submission grants a share in its value. Forking the cyberspace meta-protocol, whether by disobeying its rules or rewriting them, results in an irreconcilable forking of digital reality. The value of cyberspace as a protocol depends on network effects like any protocol, and the first-mover has a strong advantage in any protocol war. Additionally, as all actions require proof-of-work and must be chained together, the sum of your history in cyberspace is put at risk of invalidation should you deviate from the protocol everyone else is following.
Ultimately I believe that the purpose of cyberspace should be to imbue humanity with new capabilities and opportunities, and I think that it will, simply by virtue of the fact that it is built on a protocol and interoperates with protocols that likewise imbue humanity with new capabilities and opportunities (nostr, bitcoin, and TCP/IP). I desire that cyberspace will be flexible and capable enough to support as many use-cases as possible as long as the fundamental properties are preserved as axiomatic non-negotiables. Cyberspace itself must be both permissionless and thermodynamic. Without these properties, cyberspace becomes just another digital illusion apart from reality.
Flexibility, locality, and customization
With these axioms in mind, I think it would be very smart to create a method of defining construct-level rules that exist only within certain spaces. This would allow for custom interactions and systems to exist in the fabric of cyberspace, making it very flexible and local. Any such system or customization, even if cosmetic, must impose a fundamental thermodynamic cost. The details of how this could work are yet to be developed, but the blueprints of fiction, our axioms, and our ingenuity will lead us to it.
The metaverse of things
Because nostr is a web-friendly protocol, it is trivial to connect anything to cyberspace. I love to tell the tale of how I witnessed an early 2022 conversation between someone and (if I remember correctly) Will Casarin's smart dishwasher that was posting kind 1 status updates to nostr. If someone can talk to a smart dishwasher over nostr, then the Metaverse of Things already exists. However, the degree to which these things expose themselves to the metaverse should follow the same security model as for the internet at large. Unlike in fiction, I doubt anyone will allow sufficient proof-of-work to be the only prerequisite for commandeering a real-world system. That's ok. Ultimately, the relationship between reality and cyberspace isn't meant to be a 1:1 map. Digital systems only really ought to participate in cyberspace if they share the same properties as cyberspace. Most digital systems are permissioned and non-thermodynamic, and therefore do not have a compelling reason to exist in cyberspace. But anything is indeed possible.
Chasing a Ghost or Following a Blueprint?
I'd like to make the distinction that the purpose of a construct (and of most things) in the real-world cyberspace is not necessarily the same as in Gibson's literary cyberspace nor in Stephenson's literary metaverse. But the fundamental properties as depicted remain the same, and it is enabled by proof-of-work.
Remember, the remarkable thing about Gibson's cyberspace is that it is a digital world that functions like reality — there is egalitarian conflict resolution, scarce space, and universally enforced rules. The cyberspace meta-protocol likewise enables a digital system to function like space in reality. The motivations and reasons behind this digital system may be completely different than in the books, but that doesn't mean the mechanisms are any less accurate.
My goal is not to reproduce Gibson's and Stephenson's work in reality. The properties of this fiction are compelling, and the implementation and usage of cyberspace will completely depend on free market forces — exactly as it should be. It is not for me to decide. These works opened the conceptual pathway to creating this new thermodynamic digital reality. The human motivations and actions that shape cyberspace will undoubtedly cause it to look plenty different than depicted in the books, while the fundamental properties remain identical.
Throughout the process of designing the cyberspace meta-protocol I have tried to keep it as simple and fundamental as possible, using these books as my guidepost. Whether these authors realized it or not, their depictions of cyberspace and the metaverse were extremely consistent and coherent, which makes not only for great fiction and believability but also for a great guide to follow in developing a real system.
Step Into Cyberspace
I'd like to provide some concrete examples of how cyberspace works so that the concepts presented herein are not without application.
NOSTR in 3 minutes
To interact with cyberspace, one must simply publish certain "kinds" of nostr events. If you are not familiar with nostr, here is a short explanation. Nostr is made up of people running clients, which are just apps like on your phone or desktop, and people running relays, which are like servers that store events. Clients download streams of events from relays in real time. Clients can also publish events to relays. Publishing an event is like sending a tweet. Clients normally publish the same event to many relays at once. As long as you send the tweet to at least one relay that your friend is connected to, they will see your tweet. Anyone can run a relay or build a client and connect to whichever relays they want. In this way, nostr is permissionless and decentralized.
A "kind 1" event is essentially a tweet, but there are other kinds of events, each represented by a number. A kind 0 event is what you publish when you update your profile with a new bio or screen name. Anyone may make up a new kind of event and assign it any number, except it would be poorly supported if you used a number that is already accepted as part of the nostr protocol for another purpose than you are using it for. Luckily, there are a lot of numbers to go around.
A private/public keypair is an anonymous cryptographic identity, and it can be used for secure communication, storage of bitcoin, and other various things. The keys themselves are just unfathomably large, unguessable numbers represented in the hexadecimal number system, which includes numbers 0-9 and a-f (base 16 instead of base 10). When you publish an event on nostr, it is signed by your private key, and the event contains your public key. This allows anyone to verify that the event is legitimately from that public key, which presumably only a certain person controls. In this way, nobody can forge or tamper with events without invalidating them, because the signature would not match the public key.
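For readers who want to see what this looks like in practice, here is a minimal sketch of a nostr event and how its ID is derived per NIP-01, namely the SHA-256 of a canonical JSON serialization. The signing step, a Schnorr signature over secp256k1, is left out here and would normally be handled by a nostr library; the pubkey below is a placeholder.

```python
import hashlib, json, time

def event_id(pubkey: str, created_at: int, kind: int, tags, content: str) -> str:
    """NIP-01: the event ID is the SHA-256 of this canonical JSON array."""
    serialized = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode()).hexdigest()

# A bare-bones kind 1 ("tweet") event with a placeholder pubkey.
pubkey = "ab" * 32
created_at = int(time.time())
content = "hello nostr"
event = {
    "pubkey": pubkey,
    "created_at": created_at,
    "kind": 1,
    "tags": [],
    "content": content,
    "id": event_id(pubkey, created_at, 1, [], content),
    # "sig" would be a 64-byte Schnorr signature over the ID.
}
print(event["id"])
```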
Meta-protocol
This is why I refer to cyberspace as a meta-protocol because it is simply a specific way of publishing and interpreting specific event kinds over nostr and visualizing them in a 3D space.
Drift
To move in cyberspace, you must publish a kind 333 event, referred to as a Drift event. This event contains your 3D cyberspace coordinates, your direction, your existing velocity, a reference to your previous Drift event, and proof-of-work to add velocity by your direction. The amount of proof-of-work on the drift event determines your acceleration. Proof-of-work can be added to any nostr event by choosing an amount of work, represented by the number of leading binary zeroes on the event's ID, and hashing the event with a different nonce until the target amount of work is reached. This process is specified in NIP-13.
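Below is a rough sketch of how NIP-13 mining could look: keep changing a nonce tag and recomputing the event ID until the ID has at least the target number of leading zero bits. The serialization is the same NIP-01 rule as in the sketch above; the kind 333 content shown here (coordinates and velocity in JSON) is a placeholder guess at the drift payload, not the finalized cyberspace format.

```python
import hashlib, json, time

def event_id(pubkey, created_at, kind, tags, content):
    # Same NIP-01 serialization as in the earlier sketch.
    s = json.dumps([0, pubkey, created_at, kind, tags, content],
                   separators=(",", ":"), ensure_ascii=False)
    return hashlib.sha256(s.encode()).hexdigest()

def leading_zero_bits(hex_id: str) -> int:
    # NIP-13 measures difficulty as the count of leading zero bits of the ID.
    value = int(hex_id, 16)
    return 256 if value == 0 else 256 - value.bit_length()

def mine(pubkey, kind, tags, content, target_bits):
    created_at = int(time.time())
    nonce = 0
    while True:
        # NIP-13 nonce tag: ["nonce", "<nonce>", "<target difficulty>"]
        work_tags = tags + [["nonce", str(nonce), str(target_bits)]]
        eid = event_id(pubkey, created_at, kind, work_tags, content)
        if leading_zero_bits(eid) >= target_bits:
            return eid, work_tags
        nonce += 1

# Hypothetical drift event payload; more work buys more acceleration.
payload = json.dumps({"coords": [1, 2, 3], "velocity": [0, 0, 0]})
eid, tags = mine("ab" * 32, 333, [], payload, target_bits=16)
print("mined id:", eid)
print("nonce tag:", tags[-1])
```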
Each Drift event may be validated by running the coordinates and velocity through a standardized cyberspace algorithm (currently being developed) to verify that the value changes from one drift event to the next are within a tolerable range of error. It is in effect a way of simulating the movement within a physics system in order to validate that the movements did not break the rules of cyberspace physics. In this way, every participant in cyberspace is a validator of everyone else they are physically near.
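As a toy illustration of this idea, and only that, the sketch below checks two consecutive drift events for physical consistency: the new position should follow from the old position and velocity, and the velocity change should stay within whatever acceleration the event's proof-of-work allows. The field names, tolerance, and the work-to-acceleration constant are all assumptions, since the standardized validation algorithm is still being developed.

```python
def validate_drift(prev, curr, accel_per_work_bit=0.01, tolerance=1.0):
    """Toy consistency check between two consecutive drift events.

    `prev` and `curr` are dicts with assumed fields: created_at (seconds),
    position (x, y, z), velocity (x, y, z), and pow_bits (leading zero
    bits of the event ID). The real algorithm will differ in detail.
    """
    dt = curr["created_at"] - prev["created_at"]
    if dt <= 0:
        return False
    # Position must follow from the previous position and velocity.
    expected = [p + v * dt for p, v in zip(prev["position"], prev["velocity"])]
    if any(abs(e - c) > tolerance for e, c in zip(expected, curr["position"])):
        return False
    # The velocity change is bounded by the acceleration that the event's
    # proof-of-work entitles the operator to over this interval.
    allowed = curr["pow_bits"] * accel_per_work_bit * dt
    dv = sum((c - p) ** 2 for p, c in zip(prev["velocity"], curr["velocity"])) ** 0.5
    return dv <= allowed
```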
When a drift event is signed, the reference to the previous drift event is included in the signature. This creates, in effect, a personal verifiable hash chain history of your movements and actions in cyberspace that anyone else can verify.
Dishonesty and Punishment
In order to encourage people to be honest about their movement chains, anyone who finds an invalid — or "broken" — movement chain may punish its owner by publishing a Derezz event on it, which will invalidate all movement chains and proof-of-work owned by the victim and teleport them back to their home coordinate where they originally spawned when they first used cyberspace. This is effectively a respawn. You start from scratch, but you can keep your constructs.
One can easily lie about their movement chains and teleport anywhere at any time. But on nostr, for the most part, events cannot be deleted. Therefore, a cheater will leave a bright trail by which others may cyber-kill them via Derezz. A broken movement chain is like a ghost copy of the cheater that can't move. For deeper protocol reasons that the adventurous may explore in the spec, this makes the ghost copy extra vulnerable to Derezz.
A cheater may choose to ignore the Derezz attack and continue to teleport where desired. Nothing in nostr or cyberspace can stop this. But to everyone else who follows the protocol, this type of behavior can easily be ignored. The habitual cheater may as well be a ghost, as their thermodynamic actions will be ignored by everyone else who has chosen to obey the protocol. The cheater might redeem themselves by publishing a very long valid chain of events, but this probationary period may be too demanding for habitual protocol breakers. Other aggressive actions may be leveled against cheaters, making their operation in cyberspace unproductive, difficult, and dangerous.
A cheater in cyberspace is an easy target. As the punishment of cheaters is a noble act of justice rather than an evil act of predation, I expect cheaters in cyberspace to be punished with great swiftness and mirth.
Incentives for honest movement legitimize the spatial aspect of cyberspace. In cyberspace, space is real and consequential. Traversing it has a cost. No two places are fundamentally the same because there is a real cost to visit them. And all space is scarce, because of the hard limit of the 2^85 coordinate system, which was chosen to be compatible with the most popular mining algorithm in the world, SHA-256.
Other actions
To read all about the different kinds of events one can publish to interact with cyberspace, check out (and contribute!) to the official cyberspace specification here: https://github.com/arkin0x/cyberspace
In this specification you will find technical implementation details for clients to interact with the cyberspace meta-protocol, including definitions of other actions that may be taken by operators, including:
- manipulating "gravity" to affect other operators
- creating proof-of-work armor against Derezz
- cloaking one's location with stealth
- and more!
Ubiquity and Omnipresence of Cyberspace
A very interesting recent development is an open-source project called nostrmesh by lnbits that enables anyone to host a nostr relay on a small mesh-networked device, such as an Arduino. Imagine a network of billions of these devices, scattered across the globe, running on batteries and solar panels, each contributing to the infrastructure of cyberspace — a decentralized, omnipresent digital cosmos, accessible from virtually anywhere. This level of ubiquity brings us one step closer to the vision of Gibson's work, where cyberspace becomes an integral part of our daily lives and can be found everywhere — even in outer space!

With such a ubiquitous and omnipresent network, the spatial limitations of cyberspace extend far beyond conventional digital boundaries, intertwining with our physical world in a way that was once the domain of science fiction. The implications of this development are enormous and lay the foundation for the potential uses and influence of cyberspace, which we will explore in the following section.
Consequences of Cyberspace
It is difficult to predict whether cyberspace will find its place in the daily lives of billions or be forgotten once again. However, the foundations of cyberspace are inextricably linked to technologies that have been developed and adopted for the sake of human freedom and personal rights: public key cryptography, hashing, proof-of-work, bitcoin, and nostr.
As one who takes science fiction seriously enough to remove the fiction from it entirely, I find speculation to be invaluable. One cannot go where the imagination does not first lead. Our entire perception of the world is parsed from an abstraction created by our minds. We automatically assign meaning to inherently meaningless things — symbols, patterns, etc. Fiction is our reality, and our perception of the world is the sum of logical patterns within this fiction. Therefore, let us do what we do best and create fiction, or speculate, without hesitation.
I personally envision cyberspace to be a place of commerce and social organization. Constructs enable people to claim cyber land. They can use this land to design interactive experiences that are governed by localized rules and thermodynamics. The Lightning network enables instant transfer of value through cyberspace, facilitating the transaction of information, services, cyber experiences, and digital and physical goods.
Use cases include shopping, gaming, gambling, competitions, live cyber events, virtual-presence social gatherings, virtual protests, collaborative spaces, advertising, education, tourism, development of cyberspace-based applications, data visualization, research, social networking, and even more that we haven't imagined yet.
Conclusion
As we venture into the vast digital landscapes of cyberspace, we are not simply traversing through lifeless data, but immersing ourselves in a consequential world that reflects the order and complexity of our physical reality. This revolutionary approach to cyberspace isn't merely a mirage of science fiction, but a tangible exploration of its key principles, built on the bedrock of public key cryptography, hashing, proof-of-work, bitcoin, and nostr.
Through the implementation of a meta-protocol layered over nostr, cyberspace opens a myriad of opportunities – drifting through the boundless expanses of the digital cosmos, crafting personal domains, or engaging in vibrant social interactions. Just as actions in the physical world carry costs and consequences, so too does cyberspace enforce its own unique set of rules and repercussions, fostering a sense of shared responsibility, fairness, and cooperation among its denizens.
Envisioned as a playground for creativity, commerce, and social organization, cyberspace, in its current formative state, already shows tremendous potential for an expansive array of use cases. Its inherent thermodynamic properties and permissionless nature offer a groundbreaking amalgamation of digital and physical realities, poised to profoundly augment our capabilities, experiences, and opportunities in a dynamic, inclusive, global, and consequential realm.
However, the most exciting aspect of cyberspace is not merely what it currently offers, but its potential to continuously evolve and redefine itself. As more people engage with and contribute to its development, it's bound to expand and morph in ways we can hardly fathom today.
As we stand on the cusp of this digital frontier, we are not chasing after phantoms of fiction. Instead, we are architects and pioneers of a revolutionary new realm, where digital existence converges with physical principles. This remarkable blend of technology and human imagination sets the stage for an unprecedented era of exploration and innovation, signaling a future that is as exciting as it is unpredictable.
Join me on this grand endeavor to shape cyberspace, to mold this new frontier into a shared and diverse digital world that reflects the best of human spirit and ingenuity. Together, let's step into cyberspace, the frontier of the future.
Build with me
Cyberspace is for humanity, and therefore I desire as many humans to be involved in its construction as possible. I have created the following resources for anyone who wants to learn more or get involved:
Join the ONOSENDAI Telegram group: https://t.me/ONOSENDAITECH
I love answering questions! Please hop in and ask away!
Pull requests welcome on the spec: https://github.com/arkin0x/cyberspace
Check out the first cyberspace client, ONOSENDAI: https://onosendai.tech (thermodynamics still in development)
Pull requests welcome on ONOSENDAI: https://github.com/arkin0x/ONOSENDAI
whoami
My name is Nick. I go by arkinox. I have been making websites since I was 11 and designing games since I was 4. I've been the director of web for a midwest marketing firm for 10 years and the co-founder and senior vice president of an augmented reality company, innovatar.io, for 4 years.
Support FOSS
If you would like to support development of this protocol I would love for you to get involved. Also, I am accepting donations and looking for opportunities to pursue it full time.
Follow me on nostr: npub1arkn0xxxll4llgy9qxkrncn3vc4l69s0dz8ef3zadykcwe7ax3dqrrh43w
Zap me with Bitcoin Lightning: arkinox@getalby.com
Soli Deo gloria
-
@ 4d444439:7ed2458b
2023-07-30 13:37:16This is a piece written by Amber Case, governance board member of Superset DAO, Former Research Fellow at MIT Media Lab and Harvard BKC, author of Calm Technology, and Co-founder of DAO DAO. Case is working with BlockScience team members on the design of a first-of-its-kind data trust DAO. This is the first article outlining the motivation for its creation and the initial structure.
Feel free to read all my articles on the anti-censorship long content platform yakihonne.com.
I’m thrilled to announce I’ve joined the governance board of Superset, a new DAO with a challenging but important mission:
To give people more control and better benefits from their own user data.
We all know the problem. When we sign up for an online service, we quickly click Agree on the Terms of Service window without really knowing what that entails. We do know how often companies abuse our data, feeding it into algorithms that exploit our emotions and invade our privacy, while also commercializing it to make billions of dollars.
And because all of the major services seem problematic, we feel as if we have no choice but to cave in, giving them unrestricted use of our data. Government regulation may be an important part of the solution, but it’s a blunt instrument; it’s slow and cumbersome to implement, and may only make the problem worse. Thanks to EU laws, we now click “Accept All Cookies” on every new website we visit, but that added inconvenience may outweigh any benefit.
At the moment, the current Terms of Service form that users “Agree” to is an ultimatum. Enlightened organizations like Mozilla offer a better version. We believe it’s time to build off their innovative work and completely transform the ToS concept.
Unless we actively resist this process, we have no power over what happens to our data.
Superset follows a DAO governance model that gives its members governance over their data, coalesces and channels their voice, acting as an advocate and protector. While government regulation provides a one size fits none solution, DAOs provide more granular oversight power.
Through this DAO, we have the ability to revoke the consent to use all members’ data in one shot. This mechanism ensures the DAO can provide a check on Delphia’s power over its users in practice. Specifically, the ability to say, “If this concern is not addressed, everyone’s consents will be revoked” gives us real bargaining power.
This is also not a new idea. For roughly the last 15 years, as social networks and advertising platforms grew in reach and power, there have been many attempts to create a kind of coalition that acts in the best interests of the users and their data. There have been proposals about more ethical uses of data, as well as data trusts and data DAOs. But so far, none of them have caught on. Success is not achieved solely from what is built, but also how it fits into an ecosystem. Data governance and data commercialization need to be pursued symbiotically, rather than antagonistically.
Superset has an important, if somewhat ironic, advantage over previous attempts to create a data coalition: it’s a project whose launch funding came from Delphia, a Y Combinator-funded, data-driven investment service. Importantly, Superset is independent of Delphia, with a legally enshrined purpose of constraining Delphia’s power over its users’ data. This gives us both the independence and financing needed to deploy this DAO, which has a symbiotic relationship with an established tech company.
Delphia is not simply doing this to be a good corporate citizen. Our DAO’s hypothesis is that respecting people’s data and making them first-class citizens in economic data systems will actually lead to higher-quality data and better commercial opportunities for Delphia. When we prove this through Superset, we believe other companies will adopt a similar pattern.
Very roughly summarized, here’s how it works:
Superset unlocks the Voice option for Users
When users sign-up with Delphia, the service leverages their data to power the company’s machine-learning model. But the company does not get unlimited, uncompensated access to that data — the users still maintain oversight of the terms of that data usage through Superset.
This is why Delphia’s sign-up process includes an onboarding process starting with a screen that looks like this, different from any that has come before:
Users of the Delphia App are invited to join Superset directly from within the App.
Clicking Agree starts the process for the user to join Superset, which is legally contracted with both Delphia and data contributors, in order to serve the interests of those who share their data with the company, according to Superset’s purpose.
As that suggests, the DAO is not only a blockchain-based group; it is also legally formed as a Guernsey purpose trust. This gives us real, actionable protection over user data, which we informally call a “killswitch”. If DAO members collectively decide that Delphia is unfairly exploiting their data (per the agreed upon terms of service), they can collectively opt out in bulk.
Superset provides a model of user-centric oversight that any company can use to ethically manage its users’ data. Proving that it is not only possible but beneficial to govern algorithmic systems paves the way for consumers and regulators to demand change.
As controversies around Twitter, TikTok, and other leading social networks continue, new competitors are clamoring to be seen as better and more ethical alternatives to the market leaders. A would-be Twitter competitor would want to adopt the Superset DAO model as a way of attracting early adopters and differentiating itself from other platforms.
In the tradition of open source software, Superset is an open source organization. Superset’s animating purpose, formative documents, governance model and technical tooling are being shared publicly. While no one has made a data trust quite like this before, there is no silver bullet for governance; we expect Superset to evolve over time. Forks are encouraged!
Zooming in on Superset to see how it empowers Users to hold Delphia accountable.
Another one of Superset’s summoners is Dr. Michael Zargham, CEO of BlockScience, whose work on cybernetics I’ve admired for some time. We finally got to meet in person at last summer’s Decentralized Web Conference. That’s when we quickly realized there was a lot of overlap between our interests in systems, cyborgs and cybernetics! It’s been a privilege to work together on Superset, and on this post.
Reflecting on my work in Calm Technology, I believe Superset offers a calmer experience for our data — giving people tools that allow us to manage our data, but without demanding constant engagement. Successful data governance provides peace of mind.
This is an exciting project and I can’t wait to talk more about it soon. As with my new series on cybernetics, it reflects a life goal I formulated over the New Year: I’d like to take what I’ve learned about complex systems in the first half of my life and apply it in the real world.
Beginning with Superset.
For more info, please see the FAQ below, or check out the Superset site.
FAQ
How can people join Superset?
As of right now, simply sign up for Delphia through its mobile app and click Agree to join Superset during onboarding. Delphia is collecting and commercially leveraging user data, and has contracted with Superset to negotiate over the ways in which they collect, store, use and compensate users for that data.
What is Delphia?
Delphia is an investment app that leverages user data in its prediction algorithm for publicly traded companies.
Who is on Superset’s Governing Board?
Eric Alston is a constitutional lawyer, Research Associate for the Comparative Constitutions Project at the University of Chicago Law School, and is Faculty Director of the Hernando De Soto Capital Markets Program at the University of Colorado at Boulder.
Ben Bartlett is a technology lawyer and Vice Mayor for the City of Berkeley.
Amber Case is author of Calm Technology and has consulted for Microsoft, IDEO, Deloitte, Virgin, Warner Brothers, Fedex, and Esri, among many other organizations. She was previously a fellow at MIT’s Center for Civic Media and Harvard’s Berkman Klein Center for Internet & Society.
Andrew Peek is CEO of Delphia. Superset identifies Andrew as an interested trustee, limiting his power to impact Superset’s decisions. Andrew has helped shape Superset’s mission of serving data contributors and has advocated to Delphia’s board for Superset’s formation.
Dr. Michael Zargham is the Founder & Chief Engineer of BlockScience, a Board Member at the Metagovernance Project and holds a PhD in Electrical Engineering from the University of Pennsylvania. BlockScience has worked with Delphia to design and implement its systems to be technically, economically and ethically sound. Due to BlockScience’s role as a contractor to Delphia, Zargham is also identified as an interested trustee.
What are the limitations imposed on “interested” trustees?
Interested trustees are treated as having a conflict of interest. They recuse themselves from voting on board resolutions, unless the independent (non-interested) trustees determine that they are not conflicted. Practically, independent trustees are by default assumed to be unconflicted and interested trustees are by default assumed to be conflicted.
How is the board held accountable to Superset’s animating purpose?
Working in tandem with our board, there is a third-party algorithmic auditor. The algorithmic auditor monitors how Delphia utilizes the contributed data, and what revenues are produced from commercial applications of that data.
The algorithmic auditor also fulfills the critical role of Trust Enforcer, as well as providing concrete reporting to Superset members so they can be informed participants in governance.
How will people’s lives be different if Superset succeeds?
If Superset’s governance model is widely adopted across the Internet, consumers’ daily lives will not substantially change; however, they will experience long-term benefits from a shift toward a balance of power between corporations and the users of their products.
Who can participate in the Superset DAO?
Anyone who signs up for Delphia has the option to become a member of Superset, and enjoy the right to raise any question or concern about their data with the DAO as desired. As members, they’ll also receive members benefits from Delphia and any other entities Superset chooses to work with in the future. We might occasionally hold referendums or votes, but only as needed; participation is voluntary and optional.
What should members expect when joining Superset?
Members will engage with other early adopters who are passionate about democratic governance and consumer rights, as well as help define and refine the governance practices within Superset.
Am I giving my data to Delphia when I sign up? Can I take it back?
By joining Superset, you’re entering into a relationship with Delphia where you give the company consent to store your data and leverage it in designated ways in exchange for democratic oversight through Superset, along with other benefits. You retain your right to revoke data usage rights and even to have it deleted from all Delphia servers.
What can Superset do if you have an issue with how your data is being used by Delphia?
Individual members can revoke consent over their data and order its deletion by request at any time. Our collective “circuit breaker” is triggered by a majority vote and bulk revokes consents over members’ data. Our third party auditor will then confirm that this data is no longer being used. If usage continues after the circuit breaker is triggered, Delphia would be in violation of its contractual obligations to Superset’s legal entity, enforceable in a US court of law.
How can other companies use the structure outlined in this post to govern their own data trusts through a DAO?
Our design is based on first principles which can be carried forward to other orgs with similar animating purposes. Our governance work, smart contract code, documentation and other artifacts will be shared publicly under a creative commons license. Organizations looking for support should contact BlockScience.
What’s the roadmap for Superset?
We are soft launching Superset now, and will expand slowly as new members join. The issuance of the membership cards is our first major milestone. Early members will help determine our governance infrastructures such as deliberative processes, proposal making, and voting mechanisms. We are also developing a public documentation site.
Is Delphia offering a cryptocurrency?
Delphia is exploring the use of cryptocurrency for its compensation models, but given the current regulatory environment it is not yet clear whether this pilot will continue.
Is there a blockchain?
Superset’s membership records will be kept on a blockchain. When you join Superset, you get a membership card, which is effectively an NFT that also grants access to our digital community spaces, such as a members-only forum, along with other future benefits.
Does Delphia put my user data on a blockchain?
No, not at all. User data is kept private. Privacy preserving technologies, such as zero knowledge proofs, are being explored for technical enforcement of agreed upon terms.
Why is Superset being structured as a DAO? Wouldn’t it work better as a registered non-profit? What specifically about a DAO affords more surface area for governance between parties?
Superset is a new kind of organization and the non-profit structure can be restrictive. To preserve flexibility, the DAO is utilizing a Guernsey purpose trust with fiduciary duties to the purposes of the Superset — to ensure that members receive a fair share of the returns resulting from the use of their data and to determine how their data may be collected, processed, stored, and used. The DAO is a decentralized governance model enabling members to participate in governance through a variety of technical tools.
What happens if Delphia is purchased?
This DAO will continue to exist and operate even when or if Delphia is purchased. The purchaser will still be bound by any legal agreements Delphia has with Superset and its members.
Where can I read more about how Superset operates?
We’ll be releasing more documentation soon, but for now you can check out the https://www.supersetdao.com/ site, and sign up for https://delphia.com/.
By: Amber Case Link: https://medium.com/block-science/we-need-more-control-over-our-own-user-data-43f267a817f5
-
@ cc33d933:0347fdcc
2023-07-31 07:42:16Welcome to Nostr
A few steps that make getting started with Nostr easier
https://nostr.build/p/nb5759.png
Diese Einführung ist verfügbar in: * Englisch nostr:naddr1qqxnzd3cxy6rjv3hx5cnyde5qgs87hptfey2p607ef36g6cnekuzfz05qgpe34s2ypc2j6x24qvdwhgrqsqqqa28lxc9p6 thanks to nostr:npub10awzknjg5r5lajnr53438ndcyjylgqsrnrtq5grs495v42qc6awsj45ys7 * Deutsch nostr:naddr1qqxnzd3c8yerwve4x56n2wpeqgsvcv7exvwqytdxjzn3fkevldtux6n6p8dmer2395fh2jp7qdrlmnqrqsqqqa283gyc96 thanks to nostr:npub1eseajvcuqgk6dy98zndje76hcd485zwmhjx4ztgnw4yruq68lhxq45cqvg * French: nostr:naddr1qqxnzd3cxyunqvfhxy6rvwfjqyghwumn8ghj7mn0wd68ytnhd9hx2tcpzamhxue69uhhyetvv9ujumn0wd68ytnzv9hxgtcpvemhxue69uhkv6tvw3jhytnwdaehgu3wwa5kuef0dec82c33xpshw7ntde4xwdtjx4kxz6nwwg6nxdpn8phxgcmedfukcem3wdexuun5wy6kwunnxsun2a35xfckxdnpwaek5dp409enw0mzwfhkzerrv9ehg0t5wf6k2qgawaehxw309a6ku6tkv4e8xefwdehhxarjd93kstnvv9hxgtczyzd9w67evpranzz2jw4m9wcygcyjhxsmcae6g5s58el5vhjnsa6lgqcyqqq823cmvvp6c thanks to nostr:npub1nftkhktqglvcsj5n4wetkpzxpy4e5x78wwj9y9p70ar9u5u8wh6qsxmzqs * Chinese: nostr:naddr1qqxnzd3cx5urvwfe8qcr2wfhqyxhwumn8ghj7mn0wvhxcmmvqy28wumn8ghj7un9d3shjtnyv9kh2uewd9hszrrhwden5te0vfexytnfduq35amnwvaz7tmwdaehgu3wdaexzmn8v4cxjmrv9ejx2aspzamhxue69uhhyetvv9ujucm4wfex2mn59en8j6gpzpmhxue69uhkummnw3ezuamfdejszxrhwden5te0wfjkccte9eekummjwsh8xmmrd9skcqg4waehxw309ajkgetw9ehx7um5wghxcctwvsq35amnwvaz7tmjv4kxz7fwdehhxarjvaexzurg9ehx2aqpr9mhxue69uhhqatjv9mxjerp9ehx7um5wghxcctwvsq3jamnwvaz7tmwdaehgu3w0fjkyetyv4jjucmvda6kgqgjwaehxw309ac82unsd3jhqct89ejhxqgkwaehxw309ashgmrpwvhxummnw3ezumrpdejqz8rhwden5te0dehhxarj9ekh2arfdeuhwctvd3jhgtnrdakszpmrdaexzcmvv5pzpnydquh0mnr8dl96c98ke45ztmwr2ah9t6mcdg4fwhhqxjn2qfktqvzqqqr4gu086qme thanks to nostr:npub1ejxswthae3nkljavznmv66p9ahp4wmj4adux525htmsrff4qym9sz2t3tv * Swedish: nostr:naddr1qqxnzd3cxcerjvekxy6nydpeqyvhwumn8ghj7un9d3shjtnwdaehgunfvd5zumrpdejqzxthwden5te0wp6hyctkd9jxztnwdaehgu3wd3skueqpz4mhxue69uhkummnw3ezu6twdaehgcfwvd3sz9thwden5te0dehhxarj9ekkjmr0w5hxcmmvqyt8wumn8ghj7un9d3shjtnwdaehgu3wvfskueqpzpmhxue69uhkummnw3ezuamfdejszenhwden5te0ve5kcar9wghxummnw3ezuamfdejj7mnsw43rzvrpwaaxkmn2vu6hydtvv94xuu34xv6rxwrwv33hj6ned3nhzumjdee8guf4vae8xdpex4mrgvn3vvmxzamndg6r27tnxulkyun0v9jxxctnws7hgun4v5q3vamnwvaz7tmzd96xxmmfdejhytnnda3kjctvqyd8wumn8ghj7un9d3shjtn0wfskuem9wp5kcmpwv3jhvqg6waehxw309aex2mrp0yhxummnw3e8qmr9vfejucm0d5q3camnwvaz7tm4de5hvetjwdjjumn0wd68y6trdqhxcctwvsq3camnwvaz7tmwdaehgu3wd46hg6tw09mkzmrvv46zucm0d5q32amnwvaz7tm9v3jkutnwdaehgu3wd3skueqprpmhxue69uhhyetvv9ujumn0wd68yct5dyhxxmmdqgszet26fp26yvp8ya49zz3dznt7ungehy2lx3r6388jar0apd9wamqrqsqqqa28jcf869 thanks to nostr:npub19jk45jz45gczwfm22y9z69xhaex3nwg47dz84zw096xl6z62amkqj99rv7 * Spanish: nostr:naddr1qqfxy6t9demx2mnfv3hj6cfddehhxarjqyvhwumn8ghj7un9d3shjtnwdaehgunfvd5zumrpdejqzxthwden5te0wp6hyctkd9jxztnwdaehgu3wd3skueqpz4mhxue69uhkummnw3ezu6twdaehgcfwvd3sz9thwden5te0dehhxarj9ekkjmr0w5hxcmmvqyt8wumn8ghj7un9d3shjtnwdaehgu3wvfskueqpzpmhxue69uhkummnw3ezuamfdejszenhwden5te0ve5kcar9wghxummnw3ezuamfdejj7mnsw43rzvrpwaaxkmn2vu6hydtvv94xuu34xv6rxwrwv33hj6ned3nhzumjdee8guf4vae8xdpex4mrgvn3vvmxzamndg6r27tnxulkyun0v9jxxctnws7hgun4v5q3vamnwvaz7tmzd96xxmmfdejhytnnda3kjctvqyd8wumn8ghj7un9d3shjtn0wfskuem9wp5kcmpwv3jhvqg6waehxw309aex2mrp0yhxummnw3e8qmr9vfejucm0d5q3camnwvaz7tm4de5hvetjwdjjumn0wd68y6trdqhxcctwvsq3camnwvaz7tmwdaehgu3wd46hg6tw09mkzmrvv46zucm0d5q32amnwvaz7tm9v3jkutnwdaehgu3wd3skueqprpmhxue69uhhyetvv9ujumn0wd68yct5dyhxxmmdqgs87hptfey2p607ef36g6cnekuzfz05qgpe34s2ypc2j6x24qvdwhgrqsqqqa28ldvk6q thanks to nostr:npub138s5hey76qrnm2pmv7p8nnffhfddsm8sqzm285dyc0wy4f8a6qkqtzx624 * Dutch: 
nostr:naddr1qqxnzd3c8q6rzd3jxgmngdfsqyvhwumn8ghj7mn0wd68ytn6v43x2er9v5hxxmr0w4jqz9rhwden5te0wfjkccte9ejxzmt4wvhxjmcpp4mhxue69uhkummn9ekx7mqprfmhxue69uhhyetvv9ujumn0wd68yemjv9cxstnwv46qzyrhwden5te0dehhxarj9emkjmn9qyvhwumn8ghj7ur4wfshv6tyvyhxummnw3ezumrpdejqzxrhwden5te0wfjkccte9eekummjwsh8xmmrd9skcqgkwaehxw309ashgmrpwvhxummnw3ezumrpdejqzxnhwden5te0dehhxarj9ehhyctwvajhq6tvdshxgetkqy08wumn8ghj7mn0wd68ytfsxyhxgmmjv9nxzcm5dae8jtn0wfnsz9thwden5te0v4jx2m3wdehhxarj9ekxzmnyqyt8wumn8ghj7un9d3shjtnwdaehgu3wvfskueqpy9mhxue69uhk27rsv4h8x6tkv5khyetvv9ujuenfv96x5ctx9e3k7mgprdmhxue69uhkummnw3ez6v3w0fjkyetyv4jjucmvda6kgqg8vdhhyctrd3jsygxg8q7crhfygpn5td5ypxlyp4njrscpq22xgpnle3g2yhwljyu4fypsgqqqw4rsyfw2mx thanks to nostr:npub1equrmqway3qxw3dkssymusxkwgwrqypfgeqx0lx9pgjam7gnj4ysaqhkj6 * Arabic: nostr:naddr1qqxnzd3c8q6rywfnxucrgvp3qyvhwumn8ghj7un9d3shjtnwdaehgunfvd5zumrpdejqzxthwden5te0wp6hyctkd9jxztnwdaehgu3wd3skueqpz4mhxue69uhkummnw3ezu6twdaehgcfwvd3sz9thwden5te0dehhxarj9ekkjmr0w5hxcmmvqyt8wumn8ghj7un9d3shjtnwdaehgu3wvfskueqpzpmhxue69uhkummnw3ezuamfdejszenhwden5te0ve5kcar9wghxummnw3ezuamfdejj7mnsw43rzvrpwaaxkmn2vu6hydtvv94xuu34xv6rxwrwv33hj6ned3nhzumjdee8guf4vae8xdpex4mrgvn3vvmxzamndg6r27tnxulkyun0v9jxxctnws7hgun4v5q3vamnwvaz7tmzd96xxmmfdejhytnnda3kjctvqyd8wumn8ghj7un9d3shjtn0wfskuem9wp5kcmpwv3jhvqg6waehxw309aex2mrp0yhxummnw3e8qmr9vfejucm0d5q3camnwvaz7tm4de5hvetjwdjjumn0wd68y6trdqhxcctwvsq3camnwvaz7tmwdaehgu3wd46hg6tw09mkzmrvv46zucm0d5q32amnwvaz7tm9v3jkutnwdaehgu3wd3skueqprpmhxue69uhhyetvv9ujumn0wd68yct5dyhxxmmdqgsfev65tsmfgrv69mux65x4c7504wgrzrxgnrzrgj70cnyz9l68hjsrqsqqqa28582e8s thanks to nostr:npub1nje4ghpkjsxe5thcd4gdt3agl2usxyxv3xxyx39ul3xgytl5009q87l02j * Russian: nostr:naddr1qqxnzd3cxg6nyvehxgurxdfkqyvhwumn8ghj7un9d3shjtnwdaehgunfvd5zumrpdejqzxthwden5te0wp6hyctkd9jxztnwdaehgu3wd3skueqpz4mhxue69uhkummnw3ezu6twdaehgcfwvd3sz9thwden5te0dehhxarj9ekkjmr0w5hxcmmvqyt8wumn8ghj7un9d3shjtnwdaehgu3wvfskueqpzpmhxue69uhkummnw3ezuamfdejszenhwden5te0ve5kcar9wghxummnw3ezuamfdejj7mnsw43rzvrpwaaxkmn2vu6hydtvv94xuu34xv6rxwrwv33hj6ned3nhzumjdee8guf4vae8xdpex4mrgvn3vvmxzamndg6r27tnxulkyun0v9jxxctnws7hgun4v5q3vamnwvaz7tmzd96xxmmfdejhytnnda3kjctvqyd8wumn8ghj7un9d3shjtn0wfskuem9wp5kcmpwv3jhvqg6waehxw309aex2mrp0yhxummnw3e8qmr9vfejucm0d5q3camnwvaz7tm4de5hvetjwdjjumn0wd68y6trdqhxcctwvsq3camnwvaz7tmwdaehgu3wd46hg6tw09mkzmrvv46zucm0d5q32amnwvaz7tm9v3jkutnwdaehgu3wd3skueqprpmhxue69uhhyetvv9ujumn0wd68yct5dyhxxmmdqgs87hptfey2p607ef36g6cnekuzfz05qgpe34s2ypc2j6x24qvdwhgrqsqqqa286qva9x by nostr:npub10awzknjg5r5lajnr53438ndcyjylgqsrnrtq5grs495v42qc6awsj45ys7
Hello, fellow Nostrich!
Nostr is a brand-new phenomenon. A few steps will make joining it much easier and the experience much smoother.
👋 Welcome
Since you are reading this, it is safe to assume that you have already joined Nostr by downloading one of the apps (e.g., Damus, Amethyst, Plebstr) for your mobile device, or by opening a Nostr web client (e.g., snort.social, Nostrgram, Iris). As a newcomer, it is best to follow the steps suggested by the app of your choice – its onboarding flow covers all the necessary basics, and you do not need to adjust anything else unless you really want to. If you have stumbled upon this post but do not have a Nostr “account” yet, you can follow this simple step-by-step guide by nostr:npub1cly0v30agkcfq40mdsndzjrn0tt76ykaan0q6ny80wy034qedpjsqwamhz .
🤙 Have fun
Nostr is designed so that people can connect, be heard, and have fun along the way. That is the whole point (of course there is a wealth of other use cases, such as being a tool for freedom fighters and whistleblowers, but that deserves its own article). If something does not work, just reach out to more experienced Nostriches, who will be happy to help. Using Nostr is easy, but there are a few differences compared to traditional platforms, so asking questions is natural and even encouraged.
Here is an unofficial list of Nostr ambassadors who will gladly help you get started:
nostr:naddr1qqg5ummnw3ezqstdvfshxumpv3hhyuczypl4c26wfzswnlk2vwjxky7dhqjgnaqzqwvdvz3qwz5k3j4grrt46qcyqqq82vgwv96yu
All of the Nostriches on that list have been awarded a Nostr Ambassador badge, which makes it easy to find, verify, and follow them.
⚡️ Enable Zaps
Zaps are one of the first differences you will notice after joining Nostr. They let Nostr users instantly send rewards and thereby support the creation of useful and entertaining content. This is made possible by Bitcoin and the Lightning Network. These decentralized payment protocols let you send a few sats (the smallest unit in the Bitcoin network) as easily as you would give a like on a traditional social platform. We call this model “value for value.” You can find more about this ultimate monetization model here: https://dergigi.com/value/
This post by nostr:npub18ams6ewn5aj2n3wt2qawzglx9mr4nzksxhvrdc4gzrecw7n5tvjqctp424 is a good introduction to zaps.
You should enable zaps even if you do not consider yourself a content creator – others will find some of your posts valuable and may want to send you a few sats. The easiest way to start receiving value on Nostr takes only a few steps (a short sketch of what happens behind a Lightning address follows after this list):
- Download Wallet of Satoshi for your mobile device – probably the best choice for anyone new to Bitcoin and Lightning.[^1] There are, of course, many other wallets, so feel free to pick one you like better. Also, do not forget to back up your wallet!
- Tap “Receive”.
- Tap the Lightning address shown on screen (the text that looks like an email address) to copy it.
- Paste the address into the corresponding field in your Nostr client’s settings (the field may be labeled “Bitcoin Lightning Address,” “LN Address,” or something similar, depending on the app you are using).
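As promised above, here is a rough sketch of what a Lightning address does behind the scenes. By convention (the LNURL-pay “lightning address” scheme), user@domain is resolved by fetching https://domain/.well-known/lnurlp/user, which returns a JSON payment descriptor. The Python code below is only an illustration; the address in the example is a placeholder, and the exact response fields can vary between wallet providers.

```python
# Sketch of how a Nostr client typically resolves a Lightning address
# (user@domain) using the LNURL-pay well-known endpoint. The address
# below is a placeholder, and field names may vary by provider.
import json
import urllib.request

def resolve_lightning_address(address: str) -> dict:
    name, domain = address.split("@", 1)
    url = f"https://{domain}/.well-known/lnurlp/{name}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

if __name__ == "__main__":
    info = resolve_lightning_address("someuser@walletofsatoshi.com")
    # A pay request usually includes a callback URL and sendable limits.
    print(info.get("tag"), info.get("callback"))
    print(info.get("minSendable"), info.get("maxSendable"))
```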
📫 Get a Nostr address
A Nostr address, often called a “NIP-05 identifier” by Nostr veterans, looks like an email address and:
🔍 Helps you share your Nostr account and helps others find you
✔️ Serves as proof that you are a human (and not a bot)
Here is an example of a Nostr address: Tony@nostr.21ideas.org
It is easy to remember and to type into any Nostr app in order to find the user.
To get a Nostr address you can use a free service such as Nostr Check (by nostr:npub138s5hey76qrnm2pmv7p8nnffhfddsm8sqzm285dyc0wy4f8a6qkqtzx624) or a paid one such as Nostr Plebs (by nostr:npub18ams6ewn5aj2n3wt2qawzglx9mr4nzksxhvrdc4gzrecw7n5tvjqctp424). Both offer different perks, so simply pick the one you prefer. Another way to get a Nostr address is via a browser extension; you can read more about that approach here.
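For the technically curious, a NIP-05 identifier is verified by fetching a small JSON file from the domain in the address and comparing the listed hex public key with the key in the user's profile. The sketch below shows the general idea in Python; the address and public key passed to the function are placeholders, and real clients add caching and error handling.

```python
# Minimal sketch of NIP-05 verification: fetch the domain's
# /.well-known/nostr.json and compare the listed hex pubkey with the
# pubkey claimed in the user's profile. Address and pubkey are placeholders.
import json
import urllib.request

def verify_nip05(address: str, expected_hex_pubkey: str) -> bool:
    name, domain = address.split("@", 1)
    url = f"https://{domain}/.well-known/nostr.json?name={name}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        data = json.load(resp)
    return data.get("names", {}).get(name) == expected_hex_pubkey

# Example (placeholder values):
# verify_nip05("tony@nostr.21ideas.org", "<64-char hex pubkey>")
```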
🙇♀️ Learn the basics
Under the hood, Nostr is very different from traditional social platforms, so it pays off for a newcomer to gain a basic understanding of how it works. That does not mean learning a programming language or the details of a network protocol. It does help, though, to see the bigger picture and to understand how Nostr differs from Twitter/Medium/Reddit. For example, there are no passwords and logins; instead there are private and public keys. There are several in-depth resources that will help you understand Nostr. All of the ones worth your attention have been collected with much 💜 on this neatly organized landing page by nostr:npub12gu8c6uee3p243gez6cgk76362admlqe72aq3kp2fppjsjwmm7eqj9fle6 .
The information in those resources will also help you secure your Nostr keys (that is, your account), so it is important to have a look.
🤝 Connect with others
The ability to connect with interesting [^3] people is what makes Nostr special. Everyone can be heard and no one is excluded. Here are a few simple ways to find others on Nostr:
- Find the people you follow on Twitter: https://www.nostr.directory/ is helpful for that.
- Follow whom those you follow are following: visit the profile of someone who shares your interests, look at the list of accounts they follow, and connect with the ones you like.
- Browse the global feed: every Nostr client offers a way to open the global feed, which contains the posts of all Nostr users. Simply follow the people you find interesting (but be patient – there is also quite a bit of spam in there).
- Use #hashtags: hashtags are a great way to filter for topics you care about. Just tap a #hashtag you find interesting and you will see more posts on that subject. You can also search for hashtags in your app. Add hashtags to your own posts so they are easier to discover.
https://nostr.build/i/0df18c4a9b38f1d9dcb49a5df3e552963156927632458390a9393d6fee286631.jpg Screenshot of https://nostrgraph.net/ dashboard by nostr:npub1ktw5qzt7f5ztrft0kwm9lsw34tef9xknplvy936ddzuepp6yf9dsjrmrvj
🗺️ Explore Nostr
These five steps are a great start to a successful Nostr journey, but there is so much more to discover! Nostr is not just a Twitter replacement; its possibilities are limited only by our imagination.
Here is a list of fun and useful Nostr projects:
- https://nostrapps.com/ a list of Nostr apps
- https://nostrplebs.com/ – get your NIP-05 and other Nostr perks (paid)
- https://nostrcheck.me/ – Nostr address, media uploads, relay
- https://nostr.build/ – media upload and management (and more)
- https://nostr.band/ – Nostr network and user statistics
- https://zaplife.lol/ – zapping statistics
- https://nostrit.com/ – schedule posts to be sent at a specific time
- https://nostrnests.com/ – Twitter Spaces 2.0
- https://nostryfied.online/ - back up your Nostr data
- https://www.wavman.app/ - Nostr music player
📻 Relays
Now that you are familiar with Nostr, be sure to check out my quick guide to Nostr relays: https://lnshort.it/nostr-relays. You do not need to worry about this at the very beginning, but it is worth a look later on.
📱 Nostr on mobile
A smooth Nostr experience on mobile devices is possible. This guide will help you seamlessly log in, post, zap, and more within Nostr web applications on your smartphone: https://lnshort.it/nostr-mobile
Thanks for reading, and see you on the other side!
nostr:npub1eseajvcuqgk6dy98zndje76hcd485zwmhjx4ztgnw4yruq68lhxq45cqvg
[^1]: do not forget
-
@ 4d444439:7ed2458b
2023-07-30 13:35:56A ground-breaking study by Integrated Biosciences, in collaboration with researchers from the Massachusetts Institute of Technology (MIT) and the Broad Institute of MIT and Harvard, has unveiled a promising approach to anti-aging drug discovery by employing artificial intelligence (AI). The research, detailed in the paper “Discovering small-molecule senolytics with deep neural networks,” involved the AI-guided screening of more than 800,000 compounds, resulting in the identification of three highly potent drug candidates with improved medicinal chemistry properties compared to existing senolytics under investigation[1].
Feel free to read all my articles on the anti-censorship long content platform yakihonne.com.
“This research result is a significant milestone for both longevity research and the application of artificial intelligence to drug discovery,” said Felix Wong, Ph.D., co-founder of Integrated Biosciences and first author of the publication[2]. “These data demonstrate that we can explore chemical space in silico and emerge with multiple candidate anti-aging compounds that are more likely to succeed in the clinic, compared to even the most promising examples of their kind being studied today.”
Senolytics are compounds that selectively induce apoptosis (programmed cell death) in senescent cells, which are non-dividing cells that accumulate with age and contribute to a wide range of age-related diseases and conditions, such as cancer, diabetes, cardiovascular disease, and Alzheimer’s disease[3]. Although some senolytic compounds have shown promising clinical results, most of them have been hindered by poor bioavailability and adverse side effects[4]. Integrated Biosciences, founded in 2022, aims to tackle these obstacles and advance anti-aging drug development by leveraging artificial intelligence, synthetic biology, and other cutting-edge tools.
“One of the most promising routes to treat age-related diseases is to identify therapeutic interventions that selectively remove these cells from the body similarly to how antibiotics kill bacteria without harming host cells. The compounds we discovered display high selectivity, as well as the favorable medicinal chemistry properties needed to yield a successful drug,” said Satotaka Omori, Ph.D., Head of Aging Biology at Integrated Biosciences and joint first author of the publication[5]. “We believe that the compounds discovered using our platform will have improved prospects in clinical trials and will eventually help restore health to aging individuals.”
In this innovative study, Integrated Biosciences researchers trained deep neural networks on experimentally generated data to predict the senolytic activity of any molecule. Using this AI model, they discovered three highly selective and potent senolytic compounds from a chemical space of over 800,000 molecules[6]. All three candidates exhibited chemical properties indicative of high oral bioavailability and favorable toxicity profiles in hemolysis and genotoxicity tests.
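The study itself describes a purpose-built deep-learning pipeline trained on the authors' experimental screening data; those details are in the paper and are not reproduced here. Purely to illustrate the general shape of such a virtual screen, the sketch below trains a small neural network on molecular fingerprints and ranks a toy compound library. It is not the authors' model: the training data is randomly generated, the SMILES strings are arbitrary, and the libraries used (RDKit, scikit-learn) are assumptions rather than tools named in the paper.

```python
# Rough illustration of fingerprint-based activity prediction, in the spirit
# of AI-guided senolytic screening. NOT the authors' model: the "training
# data" below is randomly generated and the SMILES strings are arbitrary.
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem
from sklearn.neural_network import MLPClassifier

def fingerprint(smiles: str, n_bits: int = 2048) -> np.ndarray:
    mol = Chem.MolFromSmiles(smiles)
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, radius=2, nBits=n_bits)
    return np.array(fp)

# Placeholder training set: random fingerprints with random labels.
rng = np.random.default_rng(0)
X_train = rng.integers(0, 2, size=(200, 2048))
y_train = rng.integers(0, 2, size=200)          # 1 = "senolytic" (toy label)

model = MLPClassifier(hidden_layer_sizes=(256, 64), max_iter=300, random_state=0)
model.fit(X_train, y_train)

# Score a (tiny) virtual library; in the study this was >800,000 compounds.
library = ["CCO", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O"]   # arbitrary molecules
scores = model.predict_proba([fingerprint(s) for s in library])[:, 1]
for smiles, score in sorted(zip(library, scores), key=lambda t: -t[1]):
    print(f"{smiles}\tpredicted senolytic probability: {score:.2f}")
```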
Structural and biochemical analyses revealed that all three compounds bind Bcl-2, a protein that regulates apoptosis and is also a chemotherapy target[7]. Experiments testing one of the compounds in 80-week-old mice, roughly corresponding to 80-year-old humans, demonstrated that it cleared senescent cells and reduced the expression of senescence-associated genes in the kidneys[8].
“This work illustrates how AI can be used to bring medicine a step closer to therapies that address aging, one of the fundamental challenges in biology,” said James J. Collins, Ph.D., Termeer Professor of Medical Engineering and Science at MIT and founding chair of the Integrated Biosciences Scientific Advisory Board[9]. Dr. Collins, who is senior author on the Nature Aging paper, led the team that discovered the first antibiotic identified by machine learning in 2020.
“Integrated Biosciences is building on the basic research that my academic lab has done for the last decade or so, showing that we can target cellular stress responses using systems and synthetic biology. This experimental tour de force and the stellar platform that produced it make this work stand out in the field of drug discovery and will drive substantial progress in longevity research[10].”
Conclusion
The discovery of novel senolytic compounds with the aid of AI-guided deep neural networks represents a potentially groundbreaking advancement in the field of longevity research. The three identified drug candidates display high selectivity and improved medicinal chemistry properties compared to existing senolytics under investigation, offering hope for more effective therapies that target the root causes of age-related diseases and conditions.
The application of artificial intelligence to drug discovery has the potential to revolutionize the way researchers explore chemical space and identify promising compounds. In this study, AI was able to sift through over 800,000 molecules and pinpoint three highly selective and potent senolytic compounds, which showed favorable toxicity profiles and high oral bioavailability. As AI continues to improve and evolve, its contributions to drug discovery and longevity research are likely to become even more significant.
This advancement highlights the potential of AI-guided drug discovery to address one of the most fundamental challenges in biology: aging. By developing therapies that selectively remove senescent cells from the body, researchers can target the underlying causes of age-related diseases and work towards restoring health to aging individuals. The success of Integrated Biosciences’ platform in identifying these promising drug candidates is a testament to the power of artificial intelligence, synthetic biology, and other next-generation tools in advancing our understanding of aging and driving progress in longevity research.
Glossary
- Senolytics: Compounds that selectively induce apoptosis (programmed cell death) in senescent cells, which are non-dividing cells that accumulate with age and contribute to various age-related diseases and conditions.
- Apoptosis: A form of programmed cell death that occurs in multicellular organisms, helping to maintain cellular homeostasis and eliminate damaged or unwanted cells.
- Senescent cells: Non-dividing cells that accumulate with age, leading to a decline in tissue function and contributing to age-related diseases and conditions.
- Bioavailability: The proportion of a drug or other substance that enters the circulation when introduced into the body, allowing it to have an active effect.
- Haemolysis: The rupture or destruction of red blood cells, which can lead to the release of haemoglobin into the surrounding fluid.
- Genotoxicity: The destructive effect of certain substances on a cell’s genetic material, potentially leading to mutations or cancer.
- Bcl-2: A protein that plays a key role in regulating apoptosis by inhibiting cell death and promoting cell survival.
- Deep neural networks: A type of artificial neural network with multiple layers between the input and output layers, allowing the network to learn complex patterns and representations from large amounts of data.
References
- Integrated Biosciences, Massachusetts Institute of Technology (MIT), and Broad Institute of MIT and Harvard. (2023). Discovering small-molecule senolytics with deep neural networks. Nature Aging.
- Wong, F. (2023). Integrated Biosciences Co-founder.
- López-Otín, C., Blasco, M. A., Partridge, L., Serrano, M., & Kroemer, G. (2013). The hallmarks of aging. Cell, 153(6), 1194–1217.
- Zhu, Y., Tchkonia, T., Pirtskhalava, T., Gower, A. C., Ding, H., Giorgadze, N., … & O’Hara, S. P. (2015). The Achilles’ heel of senescent cells: from transcriptome to senolytic drugs. Aging Cell, 14(4), 644–658.
- Omori, S. (2023). Head of Aging Biology at Integrated Biosciences.
- Integrated Biosciences. (2023). AI-guided Screening Results.
- Oltersdorf, T., Elmore, S. W., Shoemaker, A. R., Armstrong, R. C., Augeri, D. J., Belli, B. A., … & Phillips, D. C. (2005). An inhibitor of Bcl-2 family proteins induces regression of solid tumours. Nature, 435(7042), 677–681.
- Integrated Biosciences. (2023). Senolytic Compound Experiment Results.
- Collins, J. J. (2023). Termeer Professor of Medical Engineering and Science at MIT.
- Integrated Biosciences Scientific Advisory Board. (2023). Contributions to Drug Discovery and Longevity Research.
By: Tom Martin Link: https://medium.com/@thomasjmartin/ai-guided-discovery-of-novel-senolytics-a-leap-forward-in-longevity-research-60a150be4961
-
@ d266da53:7224d834
2023-07-30 13:25:10On a sunny morning last December, Iyus Ruswandi, a 35-year-old furniture maker in the village of Gunungguruh, Indonesia, was woken up early by his mother. A technology company was holding some kind of “social assistance giveaway” at the local Islamic elementary school, she said, and she urged him to go.
Ruswandi joined a long line of residents, mostly women, some of whom had been waiting since 6 a.m. In the pandemic-battered economy, any kind of assistance was welcome.
At the front of the line, representatives of Worldcoin Indonesia were collecting emails and phone numbers, or aiming a futuristic metal orb at villagers’ faces to scan their irises and other biometric data. Village officials were also on site, passing out numbered tickets to the waiting residents to help keep order.
Ruswandi asked a Worldcoin representative what charity this was but learned nothing new: as his mother said, they were giving away money.
Gunungguruh was not alone in receiving a visit from Worldcoin. In villages across West Java, Indonesia—as well as college campuses, metro stops, markets, and urban centers in two dozen countries, most of them in the developing world—Worldcoin representatives were showing up for a day or two and collecting biometric data. In return they were known to offer everything from free cash (often local currency as well as Worldcoin tokens) to Airpods to promises of future wealth. In some cases they also made payments to local government officials. What they were not providing was much information on their real intentions.
This left many, including Ruswandi, perplexed: What was Worldcoin doing with all these iris scans?
To answer that question, and better understand Worldcoin’s registration and distribution process, MIT Technology Review interviewed over 35 individuals in six countries—Indonesia, Kenya, Sudan, Ghana, Chile, and Norway—who either worked for or on behalf of Worldcoin, had been scanned, or were unsuccessfully recruited to participate. We observed scans at a registration event in Indonesia, read conversations on social media and in mobile chat groups, and consulted reviews of Worldcoin’s wallet in the Google Play and Apple stores. We interviewed Worldcoin CEO Alex Blania, and submitted to the company a detailed list of reporting findings and questions for comment.
Our investigation revealed wide gaps between Worldcoin’s public messaging, which focused on protecting privacy, and what users experienced. We found that the company’s representatives used deceptive marketing practices, collected more personal data than it acknowledged, and failed to obtain meaningful informed consent. These practices may violate the European Union’s General Data Protection Regulations (GDPR)—a likelihood that the company’s own data consent policy acknowledged and asked users to accept—as well as local laws.
Stay updated on the latest developments in crypto; follow me on Yakihonne.com.
In a video interview conducted in early March from Erlangen, Germany, where the company manufactures its orbs, Blania acknowledged that there was some “friction,” which he attributed to the fact that the company was still in its startup phase.
“I'm not sure if you're aware of this,” he said, “but you looked at the testing operation of a Series A company. It’s a few people trying to make something work. It’s not like an Uber, with like hundreds of people that did this many, many times.”
Proof of personhood
Two months before Worldcoin appeared in Ruswandi’s village, the San Francisco–based company called Tools for Humanity emerged from stealth mode. Worldcoin was its product.
The company’s website described Worldcoin as an Ethereum-based “new, collectively owned global currency that will be distributed fairly to as many people as possible.” Everyone in the world would get a free share, the company suggested—if they agreed to an iris scan with a specially designed device that resembles a decapitated robot head, which the company refers to as the “chrome orb.”
The orb was necessary, the website continued, because of Worldcoin’s commitment to fairness: each person should get his or her allotted share of the digital currency—and no more. To ensure there was no double-dipping, the chrome orb would scan participants’ irises and several other biometric data points and then, using a proprietary algorithm that the company was still developing, cryptographically confirm that they were human and unique in Worldcoin’s database.
“I’ve been very interested in things like universal basic income and what’s going to happen to global wealth redistribution,” Sam Altman, Worldcoin’s cofounder and the former President of Silicon Valley accelerator Y Combinator, told Bloomberg, which first reported on the company last summer. Worldcoin was intended, he explained, to answer the question “Is there a way we can use technology to do that at a global scale?”
The company was just getting started—its aim is to garner a billion sign-ups by 2023.
In the same article the then 27-year-old Blania, who joined Worldcoin straight out of a physics masters program at Caltech, added that “many people around the world don’t have access to financial systems yet. Crypto has the opportunity to get us there." (Blania and others have used “Worldcoin” to refer to the company as well as the currency; we do the same here.)
But beyond these do-gooder intentions, Worldcoin would also solve key technical problems for Web3, the much-hyped, blockchain-powered third iteration of the internet, where data and content could be decentralized and controlled by individuals and groups rather than a handful of tech companies.
Giving “ownership in this new protocol to everyone” would be the “fastest” and “biggest onboarding into crypto and Web3” to date, Blania told MIT Technology Review in an interview, addressing one of Web3’s major challenges: a relative dearth of users.
Additionally, by biometrically confirming that an individual is human, Worldcoin would solve another “very fundamental problem” in decentralized technologies, according to Blania: the risk of so-called Sybil attacks, which occur when one entity in a network creates and controls multiple fake accounts. This is particularly dangerous in decentralized networks where pseudonyms are expected. Coming up with a truly Sybil-resistant proof of personhood has thus far been difficult, and this is seen as another barrier for mass Web3 adoption.
Worldcoin has done field testing in 24 countries; (from left to right) these promotional images were taken in Sudan, Indonesia, Chile, and Kenya.
With these two solutions, Worldcoin could become “an open platform that everyone can use [for] both the proof-of-person part and the distribution part,” Blania said. Therein lies Worldcoin’s promise: if it succeeds, this protocol could become the universal authentication method for a whole new generation of the internet. If that happens, the currency itself could become far more valuable. “Investors hope that the Worldcoin project brings value to the world and, as a result, that this equity and/or these tokens will appreciate in value,” the company said in an emailed statement.
This may be why some of Silicon Valley’s biggest names, in addition to Altman, are pouring money into it; Andreessen Horowitz recently led a $100 million investment round that tripled the startup’s valuation, from an already heady $1 billion to $3 billion.
Look into the orb
By the time we spoke to Blania in March, Worldcoin had already scanned 450,000 eyes, faces, and bodies in 24 countries. Of those, 14 are developing nations, according to the World Bank. Eight are located in Africa. But the company was just getting started—its aim is to garner a billion sign-ups by 2023.
Central to Worldcoin’s distribution was the high-tech orb itself, armed with advanced cameras and sensors that not only scanned irises but took high-resolution images of “users’ body, face, and eyes, including users’ irises,” according to the company’s descriptions in a blog post. Additionally, its data consent form notes that the company also conducts “contactless doppler radar detection of your heartbeat, breathing, and other vital signs.” In response to our questions, Worldcoin said it never implemented vital sign detection techniques, and that it will remove this language from its data consent form. (As of press time, the language remains.)
The biometric information is used to generate an “IrisHash,” a code that is stored locally on the orb. The code is never shared, according to Worldcoin, but rather is used to check whether that IrisHash already exists in Worldcoin’s database. To do this, the company says, it uses a novel privacy-protecting cryptographic method known as a zero-knowledge proof. If the algorithm finds a match, this indicates that a person has already tried to sign up. If it does not, the individual has passed the uniqueness check and can continue registration with an email address, phone number, or QR code to access a Worldcoin wallet. All of this is meant to occur in seconds.
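Worldcoin has not published the code behind this check, so the following is only a toy illustration of the general pattern it describes: derive a code from the biometric input, look that code up in a set of previously registered codes, and accept or reject the new sign-up accordingly. It is not the IrisHash algorithm or a zero-knowledge proof, and real biometric matching must also tolerate noisy inputs, which a plain hash comparison cannot.

```python
# Toy illustration of checking "have we seen this code before?" without
# storing the raw input. This is NOT Worldcoin's IrisHash or its
# zero-knowledge protocol; real biometric matching must also tolerate noise,
# which a plain hash comparison cannot.
import hashlib

seen_codes: set[str] = set()

def derive_code(biometric_template: bytes) -> str:
    # Stand-in for a proprietary iris-encoding algorithm.
    return hashlib.sha256(biometric_template).hexdigest()

def is_unique_and_register(biometric_template: bytes) -> bool:
    code = derive_code(biometric_template)
    if code in seen_codes:
        return False          # duplicate: this person already signed up
    seen_codes.add(code)
    return True               # new, unique registrant

assert is_unique_and_register(b"alice-template")
assert not is_unique_and_register(b"alice-template")
assert is_unique_and_register(b"bob-template")
```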
Worldcoin says that biometric information remains on the orb and is deleted once uploaded—or at least it will be one day, once the company has finished training its AI neural network to recognize irises and detect fraud. Until then, beyond vague descriptions like “personal data…sent via secure, encrypted channels,” it’s unclear how this data is being handled. “During our field-testing phase, we are collecting and securely storing more data than we will upon its completion,” the blog post states. “We will delete all the biometric data we have collected during field testing once our algorithms are fully-trained.”
In response to our questions just before this article went to press, Worldcoin said the public version of their system would soon eliminate the need for new users to share any biometric data with the company—though it hasn’t explained how this will work.
A useless IOU
But we do know how onboarding works. To get Worldcoin into the smartphones of new users, the company contracts with local ”orb operators” to manage signups for their countries or regions.
Operators apply for the job and are interviewed and approved by the Worldcoin team, though Anastasia Golovina, a company spokesperson, emphasized in an email that operators “are independent contractors, not Worldcoin employees.” As such, they work without contracts or guarantee of payment, instead receiving commission for each person’s biometric data that they collect. However, Golovina added, they must “comply with local laws and regulations, including local labor laws.”
These country-level operators receive their commission in the stablecoin Tether. Stablecoins are a type of cryptocurrency whose value is pegged to a traditional currency, often the US dollar. They determine the rates they pay their subcontractors (typically in local currency), as well as the working conditions (full-time, part-time, or temporary gig work.) Both country-level and subcontracted orb operators are incentivized by commission-based payment structures to register as many people as quickly as possible.
On the other side, new users currently earn at least $15 worth of Worldcoin for submitting to the biometric scan, and $5 more when they log in to their Worldcoin wallet, though the total amount available has since changed to $25 for later recruits. Some users receive the sum all at once, for others it vests at a rate of $2.50 per week. Blania says that differences are meant to test out the most effective incentives. Either way, Worldcoin isn’t a stablecoin, and since the currency has not yet launched, the company “do[es] not yet know how many WLD tokens would be equivalent to USD $20,” it noted in a written statement.
To understand user incentives, some people were given the option to receive $20 worth of Bitcoin instead, effectively allowing them to cash out. Worldcoin said that it found its “most engaged users elected to hold on to their WLD,” though most of our interviewees said the opposite.
But with the ability to cash out ending last fall, for now the promise of $20 or $25 worth of Worldcoin amounts to an IOU from the company. Any tokens users may have in their digital wallets are, for all intents and purposes, worthless.
Taking a chance
Worldcoin’s users joined for a myriad of reasons.
“Out of curiosity” was a common refrain. Because the orb operator “seemed nice”—or happened to be their brother, cousin, or classmate—was another. Some hoped to get in early on what could become the next Bitcoin. Others had lost jobs or income during the pandemic. Some became desperate as civil war threatened to reignite around them. Most just wanted the free money—at least one only wanted to buy lunch. Many suspected it was a scam, though few could risk passing it up in case it was not.
Ruswandi fit into several of these categories. He had lost much of his work as a furniture maker during the pandemic and spent his free time trading stocks and cryptocurrencies and frequenting crypto-related message boards and exchanges.
“I was curious and thought it wouldn’t hurt to try,” he recalled, adding that the money was attractive given his reduced income.
But he quickly had doubts. Neither the company representatives on site nor the village officials could answer even basic questions about Worldcoin. After doing more research online and coming up empty, he came to conclude it was a scam. He believed the mysterious giveaway was a mass data collection effort disguised as some kind of secret, offline airdrop—a tactic in which cryptocurrency projects release free tokens to encourage adoption.
After all, many of his neighbors’ understanding of the internet was limited to the Facebook app pre-installed on their smartphones, so before prospective users were even able to receive the new currency, Worldcoin representatives “first had to help many residents in setting up emails [and] logging in to the web,” Ruswandi recalled. If it was about attracting users to a new cryptocurrency, he wondered, “why did Worldcoin target lower-income communities in the first place, instead of crypto enthusiasts or communities?”
The biometrics question
When Worldcoin made its “We’re here!” announcement last October, it encountered immediate backlash.
As NSA whistleblower Edward Snowden put it in a tweet thread, “Don’t catalogue eyeballs. Don’t use biometrics for anti-fraud. In fact, don’t use biometrics for anything. The human body is not a ticket-punch.”
Iyus Ruswandi, pictured in front of the Worldcoin recruitment site in Gunungguruh, West Java, had many questions about why the company needed an iris scan—none of which were answered. MUHAMMAD FADLI
Many doubted Worldcoin’s privacy protocols, especially since the company had yet to issue a white paper or open its code for outside evaluation. “This looks like it produces a global (hash) database of people's iris scans (for ‘fairness’), and waves away the implications by saying ‘we deleted the scans!’ Yeah, but you save the hashes produced by the scans. Hashes that match future scans,” Snowden tweeted.
There were also questions about hardware security. Jeremy Clark, an associate professor at the Concordia Institute for Information Systems Engineering that focuses on applied cryptography, questions the security of the orb: “The machine itself will have some security protections,” he says, “but none of that technology is perfectly secure. So it's usually an economic question…if this project is as successful as they want it to be, then it's going to become more profitable to try and tackle.”
Others took issue with the company’s purported focus on fairness given that 20% of the coins had already been allocated: 10% to Worldcoin’s full-time employees, and another 10% to investors, like Andreessen Horowitz.
Additionally, many in the blockchain field disagreed with the underlying premise of what Worldcoin was trying to build: creating one identity across Web3 was anathema to a movement that had turned to blockchain, decentralized finance, and DAOs (“decentralized autonomous organizations”) for the express purpose of not being known.
Others remain unconvinced that Worldcoin can actually reach everyone in the world—and instead, serves as a distraction from ongoing work to create new identity paradigms. Identity expert Kaliya Young, while declining to comment on Worldcoin specifically, says that “it’s common for companies to claim that ‘if everyone in the world was in our system, everything would be fine.’ Newsflash: everybody is not going to be in your system, so let’s move on and talk about how we solve problems” in online identity.
For Blania and his team, the criticism misses the mark. “Big parts of our team have had backgrounds in crypto…so we care about this [privacy] a lot,” he told MIT Technology Review. “I fully understand the concern,” he said, but he thinks it’s more “emotional gut reaction” than “objective criticism.” What the critics were missing, he added, was just how good Worldcoin’s protocol would be at protecting privacy once complete.
Stephanie Schuckers, the director of the Center for Identification Technology Research at Clarkson University, says that’s not outside the realm of possibility, as biometric technology has made a number of recent advances. One of the newest trends is “template security,” which uses cryptography to make a transformation of your biometric data. “When you store it, if it were stolen, it can’t be reverse-engineered back to your original biometrics,” she says.
But the reason that it has yet to be commercialized, she adds, is that cryptographic transformation often leads to “performance degradation.” Instead of matching the new biometric data to an existing biometric sample, template security matches a computer algorithm’s interpretation of the data, via some kind of hash or code, to another stored code. This adds room for error, Schuckers says, making it “more difficult to match biometrics in this encrypted space,” though she adds that recent advances in template security have addressed some of those shortcomings.
Template security sounded like a possibility for what Worldcoin was doing—though Schuckers cautioned that without seeing their code, or more detail beyond Worldcoin’s blog posts, it was hard to say for sure. Worldcoin has promised to open source its code, and since we first contacted the company in February it has repeatedly told MIT Technology Review that this would occur “within the next few weeks.”
Besides, the company added in a statement, “It is important to emphasize that we collect data not for the purpose of profiting from it or surveilling our users, like many other tech companies out there. Rather, our goal is to use the data for the sole purpose of developing our algorithms to minimize fraud and enhance user privacy.”
Reeling them in
Representatives of Worldcoin used a range of questionable tactics and enticements to bring in new users, according to many of the people MIT Technology Review spoke to.
When operations began in Sudan last March, the operators found it hard to “explain the concept of digital currencies to people who don’t even have emails”, according to Mohammad Ahmed Abdalbagee, one of Sudan’s four former orb operators. So instead they ran an AirPod giveaway contest to encourage registration that resulted in some 20,000 sign-ups.
At an Islamic high school in Indonesia’s West Java province, Worldcoin applied to teach a cryptocurrency workshop. The school’s student activity coordinator, Muhammad Hilham Zein, read the application and recommended it for approval on the understanding that it was “to share knowledge on crypto…not to encourage students to invest in digital currency.”
"Why did Worldcoin target lower-income communities in the first place, instead of crypto enthusiasts or communities?"
But attendees—at least one of whom was 15, which violates Worldcoin’s own terms of use—as well as our reporter’s first-hand observations tell a different story. During the 45-minute sessions, Worldcoin staff were too busy registering the dozen or so students, helping them download the app and sign up for emails, and finally scanning their biometrics, to provide information on cryptocurrency, Worldcoin itself, or how participants could give or take away consent. (Students did, at least, receive their allotment of Worldcoin, which would vest weekly.)
More recently, in roughly 20 villages in West Java that hosted recruitment events, many new users, like Iyus Ruswandi, were attracted by giveaways.
“It was held during the pandemic, where the government usually handed out social assistance packages,” explained Ece Mulyana, the principal of an elementary school madrasa who was informed, the night before, that his school was to be used as a Worldcoin registration site. Because the instructions came from a higher-level official—Ade Irma, the sub-district governance head, who was helping Worldcoin coordinate the village registration drives, “I couldn’t refuse the request,” Mulyana said.
Mulyana says that Irma paid him a fee of 2,000 IDR (around 14 US cents, at the time of writing) for each person successfully scanned. Mulyana estimates that 170 made the cut, for a total of 340,000 IDR (roughly $23.80, just under 10% of the average monthly pay of a government worker).
Heni Mulyani, the sub-district leader who approved the events and Irma’s boss, said the money was provided “for coffee and cigarettes,” a euphemism for gratuities given to government officials to facilitate desired actions. She said none of the money paid went towards site rental—but, she added, “we assure you it’s not coming from the village fund or budget.”
A view of Gunungguruh, one of roughly 20 villages that Worldcoin visited for recruitment. MUHAMMAD FADLI
Instead, the money came from a company called PT Sandina Abadi Nusantara, cofounded by a man named Muhammad Reza Ichsan, who happens to be Worldcoin’s “best-performing operator” (according to Worldcoin’s launch blog post), and his mother. The company was the legal entity through which Worldcoin Indonesia conducted its activities; it was Ichsan’s mother’s job to reach out to local government officials to coordinate recruitment.
Ichsan has told MIT Technology Review that “we don’t pay the village, but we have an operational fund for people who helped us assemble the public in the field.”
Even if Mulyani had not misused village funds, these gratuities are—with rare exceptions— illegal under Indonesia’s anti-corruption and anti-bribery laws, with potential criminal penalties for both the giver and receiver.
In response to questions about payments to village officials, Worldcoin representatives said they were unaware of the incident, called it “isolated,” and that they have launched an investigation to learn more. While they could not yet draw conclusions, Golovina wrote, “It appears possible that some or all of these payments may have been for bona fide operating expenses, for example, fees required to set up operations in a school or other facility, or to pay for permits or licenses required to operate in certain locations.” This stands in contradiction to both the official’s and their orb operator’s descriptions.
Worldcoin also called the other examples we put to them, including the AirPod giveaway in Sudan and the deception of school officials in Indonesia “independent and isolated efforts by local Orb Operators,” and added that “we are wholly focused on incentivizing Operators to sign up engaged users who are excited about using Worldcoin.”
For their part, villagers were not told that at least some of their officials were being paid to promote Worldcoin; in fact, many thought the event was run by the government itself, as Mulyana, the school principal, recalled. “We have to explain to them that it was not a government program,” he said—that “Worldcoin is a foreign company who came and needed help from the village staff.”
Some villagers now doubt that they will receive any money at all now that late January, the time when they were told Worldcoin representatives would return to the village to hand out funds, has come and gone. Nor has the ability to trade Worldcoin from the wallet appeared, for those digitally savvy enough to navigate the app.
Operating blind
The mixed messages and misinformation weren’t necessarily intentional. The orb operators we spoke to often mentioned how little information they received from the Worldcoin representatives who recruited them, even as they were made acutely aware that their payment was tied to the number of people they could sign up. (Worldcoin said that it provides its country-level orb operators with a code of conduct, which sub-operators must also abide by, and that it is moving away from commissions based on number of sign-ups.)
Bryan Mtembei was one such operator. A civil engineer who recently graduated from college in Nakuru, Kenya’s fourth-largest city, Mtembei freelanced for Worldcoin after he was scanned on campus last September.
He wishes that he had received “a brief training or basics about Worldcoin.” Instead the only instruction he got was to “bring more people in to get yourself more money,” he said. “The rest was up to my social marketing skills.”
So he did his best to answer new users’ questions, with the most frequent being about privacy: Mtembei estimates that roughly 40% of the individuals he approached had concerns about sharing their biometric data. When he initially expressed similar concerns, he was assured by a representative that all his questions were addressed in the Worldcoin “white paper.” No such document exists. According to the company, this is by design—people would be unlikely to read “a long, highly technical academic-style paper,” it said, and its shorter blog posts could be thought of as white papers. Ultimately, Mtembei's need for money overrode his concerns; he says that he signed up between 150 and 200 people, at 50 KS (44 US cents) per scan.
Bryan Mtembei first met Worldcoin representatives on his college campus in Nakuru, Kenya. He was scanned and later worked as an orb operator. BRIAN OTIENO
And he wasn't alone. Willis Okach, a college student in Nairobi recruited, like Mtembei, to become an orb operator after his own scan, also got involved because of the money. “You don't have any and someone is offering you some,” he explained, adding that he thinks Worldcoin “feels that students don’t have a lot of money so they will sign up.” For his two days of work, Okach signed up 50 people and earned 100 KS (USD 0.88) for each set of biometric data that he brought in.
According to Golovina, the Worldcoin spokesperson, “all users who sign up during field testing are provided full disclosure about what is being collected and how that data is used and are required to provide their consent before they’re allowed to sign up. Any individual who does consent to our collection and use of their biometric data may revoke their consent at any time and this data will be deleted.”
But of the people we interviewed, none were explicitly told—or, in the case of orb operators, told others—that they were “test users,” that photographs and videos of their faces, and 3D body maps were captured and being used to train the orb’s “anti-fraud algorithm” to “differentiate between people,” that their data was treated differently from the way others’ would be handled later, or that they could ask for their data to be deleted.
Ángel Rodriguez, a security guard for the Santiago Metro in Chile, recalled checking a box in the Worldcoin app agreeing to the terms of service, but said the instructions were in English, a language that he does not read. In addition, the app, with its link to the data consent forms, was not available until “late 2021,” according to Worldcoin, at which point field testing had been going on for at least a year.
Sometimes, new users were asked to provide additional personal data, which Worldcoin claims it never requests. Almost all of the people we spoke to were asked to provide email addresses to log into their wallets (even after Worldcoin introduced a QR code for sign-ins). Some were asked for phone numbers as well.
Golovina has denied in multiple email statements that emails or phone numbers were required for sign-up, though “we do make certain features available to users who choose to provide their phone number or email address, like the ability to send and receive Worldcoin. But things like this will always be optional.” Worldcoin did not explain what else users could do with the token without the ability to send or receive it.
In Nairobi, meanwhile, several students said that orb operators took a photo of their national ID cards to confirm, as Okach recalled, that he was “not…a robot.” Worldcoin said that it has never requested national identification documents from users, though they do request it from their orb operators.
When we shared these comments with interviewees, they did not recognize their own experiences. Mtembei emphasized that personal details were never optional, and there was no way to sign up at his orb without both email and phone. “That CEO is lying,” he said (mistakenly attributing Golovina’s statement to Blania.)
Mohammad Ahmed Abdalbagee, one of the four orb operators hired in Sudan, added that it was his team’s efforts that convinced Worldcoin to add phone numbers as a sign-in method in the first place. “Before they started in Sudan, they used the email as the main identifier, but we told them that this wouldn’t work in Sudan. Many college students don’t even have emails, they use their phones to register in social media,” he said.
Crypto-colonialism
Researchers that study the tech sector’s relationship with the global south were concerned—but not surprised—by Worldcoin’s behavior.
“It's a race to see who gets the most data in this AI-driven economy,” says Payal Arora, a digital anthropologist and author of The Next Billion Users: Digital Life Beyond the West. Stronger data protection laws in Europe and the United States mean that the most ambitious entrepreneurs in those regions can’t get all the training data that they need from their own populations, she says, so they have to look to the developing world.
In fact, according to its launch blog post, Worldcoin is unavailable in either the United States or China due to regulatory constraints, while Bloomberg reported that it has also shut down field tests in other countries, including Turkey and Sudan, for similar reasons. Worldcoin has, however, signed up a number of users in the US at demos held at cryptocurrency conferences, though the company does not consider its US activities to be a form of field testing.
It’s just cheaper and easier to run this kind of data collection operation in places where people have little money and few legal protections.
Pete Howson, a senior lecturer at Northumbria University who researches cryptocurrency in international development, categorizes Worldcoin’s actions as a sort of crypto-colonialism, where “blockchain and cryptocurrency experiments are being imposed on vulnerable communities essentially because…these people can’t push back,” he told MIT Technology Review in an email.
What makes the crypto version even more harmful than other forms of data colonialism is that decentralization, the core tenet of blockchain, makes for “very limited accountability…when things go wrong,” Howson explained. “You’ll often hear this phrase ‘Do Your Own Research’, or DYOR, because these guys don’t care much for rules and regulations.”
But inequities in information and internet access make that “do your own research” ethos all but impossible for many people in developing regions. Similarly, huge economic disparity means that in Kenya, say, the promise of just under half a US dollar could be a compelling incentive for someone to give up their biometric data, whereas in Norway or the US, such an offer wouldn’t go far.
Simply put, it’s just cheaper and easier to run this kind of data collection operation in places where people have little money and few legal protections.
Data lapses and policy holes
Although much of Worldcoin’s field testing has been happening in developing countries, the company stressed that it is also active in developed countries, including several in Europe. “Worldcoin has always tried to conduct field tests in a sample of countries around the globe that would be representative of the world as a whole,” the company told us.
This presents its own challenges. In collecting, controlling, and processing the personal data of EU-defined “data subjects”—that is, any person within the European Union, including citizens, residents, and potentially visitors whose data is being collected—Worldcoin is subject to the European Union’s General Data Protection Regulation (GDPR).
Enacted in 2018, the GDPR requires that data subjects be fully informed about why their data is collected, how it will be used, who will be processing it, where it will be transferred, how they can erase it, and how they can stop its processing. Failing to sufficiently safeguard data can lead to fines of up to 4% of global revenue or 20 million euros, depending on the severity of the infraction. Further, GDPR applies outside of Europe if a company collects or processes the personal data of European data subjects. So a company registered in Delaware and headquartered in San Francisco, like Worldcoin, is not necessarily exempt.
That is, however, exactly what Worldcoin has claimed in its data consent form, which—until MIT Technology Review submitted its list of questions—asked users to accept the following statements:
- “we [Worldcoin] voluntarily comply with the GDPR as a matter of policy”
- “we have not adopted a board-approved data privacy and security policy describing the means and the methods by which we plan to protect your Data to meet the standards prevalent in the GDPR”
- “there is a possibility that our policies and procedures will not be sufficient to meet GDPR requirements”
- “it may be more difficult to assert your privacy rights in court in the United States if we do not comply”
This policy tries to create “carve-outs,” says Marietje Schaake, the international policy director at Stanford University’s Cyber Policy Center and a former Member of the European Parliament, who reviewed the document. Exceptions, she adds, are not possible under the GDPR—and besides, the fact that Worldcoin has a German subsidiary already subjects it to the GDPR.
“As an EU citizen, you have the right to challenge it,” Schaake says, referring to any potential violation. Those challenges would be reviewed by European data protection authorities and eventually argued in European courts rather than American ones, as Worldcoin’s policy suggests.
Worldcoin said that it is fully compliant with the GDPR, and has registered with the Bavarian Data Protection Authority. It added that it employs a data protection officer, and that it has conducted a data privacy impact assessment—though it has declined to make either the officer or the assessment available for public scrutiny. Worldcoin added that the statements in their consent policy “were previously included in an abundance of caution…They no longer appear in the latest version of our Data Consent Form.” As of publication, however, the language still remains online.
For Aida Ponce del Castillo, a researcher at the European Trade Union Institute, who studies regulations for emerging technology and serves as her organization’s data protection officer, this lack of transparency is unjustified. “DPIA are not confidential business information,” she told MIT Technology Review—and while publication is not mandatory, she pointed to European Commission recommendations that companies “consider publishing at least parts, such as a summary or a conclusion.”
The Bavarian Data Protection Authority has yet to respond to MIT Technology Review’s request to confirm the company’s registration.
"That's manipulation"
Beyond the ethical questions, though, lie more practical ones, like: how well does Worldcoin actually work?
For some test users and orb operators on the ground, the answer has been: not well at all.
Sometimes, this was due to issues with the orb. In Sudan, local orb operator Abdalbagee says that it would take as many as six attempts for the orb to recognize someone’s face. “Actually it took my friend an entire week for the device to recognize his iris,” he adds.
Orbs were also prone to malfunctions, slowing down recruitment processes and requiring repair in Germany. When Buzzfeed News found similar orb malfunctions in a recent investigation, Worldcoin used language that it has repeated with us: calling one particularly egregious case an “isolated outlier.”
Meanwhile, the transition from a web-based wallet to an app-based wallet has caused a number of users to appear to lose either their entire accounts or all of their coins. For others, the app has proved buggy, draining battery life or leading them into a spiral of loading and reloading.
Rodriguez, the Chilean security guard, has been trying to resolve his wallet issues since shortly after he was scanned. After he signed up in February, inputting his email and phone number and using a QR code, the app caused such performance issues on his phone that he deleted it entirely. When he tried to re-download the app, he found that his username no longer existed.
To fix it, he was told by a local orb operator, he would have to find the orb and re-scan his biometric data. But if Worldcoin works as the company claims, re-scanning his iris would simply result in the orb linking his iris with his old iris hash. In other words—and as Worldcoin has subsequently confirmed— there’s no way to recover an account once it’s lost.
Then there are the instances of identity spoofing that the orb has been unable to detect. In mid-2021, one businessman in Indonesia was able to register and access the wallets of over 200 users after they had been scanned and verified as human, and transfer out their contents—held in Bitcoin at the time. Worldcoin says that this occurred when the wallet was still accessible via a web log-in, rather than a mobile app, and that “since transitioning…we have not detected this type of fraud.”
Meanwhile, those who fear that the whole thing may have been a scam want to know what they’ve lost. “50 KS is not enough to give an eyeball away,” says Okach, the university student in Nairobi who spent a weekend recruiting others to Worldcoin. “That’s manipulation, taking advantage of students without clear clarification about what it is they are doing or what they want.”
Forget all those people
When we began reporting this story we noticed that three of the five countries initially cited as case studies for successful field testing—Indonesia, Sudan, and Kenya—were classified as low or lower-middle income by the World Bank. The power and economic differentials seemed ethically fraught, so we began digging.
We wanted to know: what was it like to serve as an early user in this global crypto experiment? What did the participants actually understand—or what were they told—about cryptocurrency, Worldcoin, and the ramifications of giving up their biometric data? Did they provide informed consent—and what would that even look like in this context? And, ultimately—sharing the same question voiced by many of our interviewees—what were the iris scans really for?
Photos: Muhammad Fadli. Left to right: Ruswandi’s neighbors Sadili, Solihin (a community leader), and Eli were among the 170 villagers scanned.
In the end, it was something that Blania said, in passing, during our interview in early March that helped us finally begin to understand Worldcoin.
“We will let privacy experts take our systems apart, over and over, before we actually deploy them on a large scale,” he said, responding to a question about the privacy-related backlash last fall.
Blania had just shared how his company had onboarded 450,000 individuals to Worldcoin—meaning that its orbs had scanned 450,000 sets of eyes, faces, and bodies, and stored all that data to train its neural network. The company recognized this data collection as problematic and aimed to stop doing it. Yet it did not provide these early users with the same privacy protections. We were perplexed by this seeming contradiction: were we the ones lacking in vision and the ability to see the bigger picture? After all, compared with the company’s stated goal of signing up one billion users, perhaps 450,000 is small.
But each one of those 450,000 is a person, with his or her own hopes, lives, and rights that have nothing to do with the ambitions of a Silicon Valley startup.
Speaking to Blania clarified something we had struggled to make sense of: how a company could speak so passionately about its privacy-protecting protocols while clearly violating the privacy of so many. Our interview helped us see that, for Worldcoin, these legions of test users were not, for the most part, its intended end users. Rather, their eyes, bodies, and very patterns of life were simply grist for Worldcoin’s neural networks. The lower-level orb operators, meanwhile, were paid pennies to feed the algorithm, often grappling privately with their own moral qualms. The massive effort to teach Worldcoin’s AI to recognize who or what was human was, ironically, dehumanizing to those involved.
When we put seven pages of reporting findings and questions to Worldcoin, the company’s response was that nearly everything negative we uncovered was simply an “isolated incident” that ultimately wouldn’t matter anyway, because the next (public) iteration would be better. “We believe that rights to privacy and anonymity are fundamental, which is why, within the next few weeks, everyone signing up for Worldcoin will be able to do so without sharing any of their biometric data with us,” the company wrote. That nearly half a million people had already been subject to their testing seemed of little import.
Rather, what really matters are the results: that Worldcoin will have an attractive user number to bolster its sales pitch as Web3’s preferred identity solution. And whenever the real, monetizable products—whether it’s the orbs, the Web3 passport, the currency itself, or all of the above—launch for its intended users, everything will be ready, with no messy signs of the labor or the human body parts behind it.
Additional reporting by Lujain Alsedeg and Antoaneta Roussi
Correction: This story has been corrected to note that GDPR fines can be up to 4% of global revenue; a previous version misstated the percentage. It has also been updated to clarify that GDPR regulations apply to the protection of the personal data of European data subjects.
By: Eileen Guo & Adi Renaldi Link:https://www.technologyreview.com/2022/04/06/1048981/worldcoin-cryptocurrency-biometrics-web3/
-
@ 2e2cf253:737f1193
2023-07-12 05:46:05Gm gm Dueters,
We are glad to tell you that we are planning something big!
As the Hong Kong regulatory environment for blockchain and cryptocurrency is becoming more favorable, we think it’s the best time to seize the opportunity and expand our market for RWA investors. Come and hear us out!
Hong Kong’s Progressive Crypto Regulations and Supportive Environment for Web3 Businesses
In February, Hong Kong introduced a set of progressive regulations aimed at fostering a favorable environment for crypto-related activities. As crypto regulations in the United States become more stringent, Hong Kong sees an opportunity to emerge as the next prominent crypto hub by offering a more favorable regulatory landscape.
In mid-April, a large crowd eagerly awaited entry into the Hong Kong Convention Center, where the city’s first web3 festival was taking place. They had gathered to explore the opportunities Hong Kong had in store for crypto ventures.
Despite its geographical proximity to China, known for its restrictive stance on cryptocurrencies, Hong Kong is actively striving to establish a regulatory environment that appeals to digital asset companies globally.
Duet Protocol Seizes Opportunities in Hong Kong and Launches Airdrops to Reward Loyalty
To take advantage of the favorable regulatory environment in Hong Kong, with a growing number of users and customers switching from Web2 to Web3 and numerous education studios offering blockchain education to traditional investors, Duet Protocol will collaborate with select education studios, establish our exchange, issue more synthetic assets, and serve as their primary marketplace for trading.
To familiarize more and more people with our trading platform, Duet Pro, we encourage purchases of Duet and trading on Duet Pro, and we would like to reward active users in our campaign “DuetBoost”, running from July 11th to November 11th, 2023 (UTC+0) and containing a series of airdrops.
In the “DuetBoost” campaign, the selection and rewarding of users for the airdrop will be based on several criteria to determine the most loyal participants. The following factors will be considered:
1. User’s Recent Extra Holding of Duet Token: The campaign will take into account the amount of Duet tokens newly purchased by the user, starting from July 11th, 2023, UTC+0.
2. User’s Trading Activities on Duet Pro: The campaign will consider the user’s trading activities on the Duet Pro exchange. The frequency and volume of trades made by the user will be taken into account, demonstrating their engagement and participation in the platform’s trading activities.
3. User’s Credibility as a True User: The campaign will assess the credibility of users to ensure they are genuine participants. Factors such as account verification, adherence to platform guidelines, and overall trustworthiness will be considered to determine the authenticity and reliability of users.
In addition, we have partnered with a data analytics lab to develop a personalized airdrop strategy and effectively predict user behavior, specifically focusing on identifying instances of free riding.
Your participation in the survey and inclusion in the whitelist will not only increase your eligibility for the airdrop but also contribute to the ongoing refinement of our platform’s services and offerings.
Please take a moment to provide us with your information through the survey link below, and join the airdrop whitelist to benefit from this personalized initiative and improve your overall experience with our platform.
https://644ce16a.sibforms.com/serve/MUIFAN_x7BokYRahpOLBn4NMzvgDQJrJOwqerbf9DGmotnczY89oui8efwO0A4ROFGvcvNFvfWE1eEh1POlaZ_38Gj4t1d4mqtyUAgVi39XHTnGDVQQ-D0KXAWs5nPezriwEg9Kw-yKJj-yS4A7tbRD4AgvKRq1izgdv7tUqfTOdOq-sjQGKLsuuyajHL6ZugF7yQs22ChByzh1Y
Scan the QR code to get access to the form.
Follow Duet Protocol’s social medias to learn more. https://link3.to/duetprotocol
-
-
@ 3cea4806:10fe3f40
2023-07-30 11:27:00Previously, I had written a post on Stacker News titled "The ICANN Domain Problem, Solved With Nostr?", in which I presented the issue of how ICANN is the central controlling authority of internet domain names, how we don't genuinely own them, and how that issue spills over into the solution that NIP-05 is trying to provide on Nostr.
The purpose of that post, and this one, is to showcase the issue of NIP-05, with the bigger issue being ICANN itself, and to hopefully open up the discussion so that people can figure out a solution, all the while presenting a potential solution myself. People raised valid criticisms of the idea and pointed out holes in what I had presented. After some time, another solution presented itself to me; it might rub Bitcoiners the wrong way, but considering the floodgates opened by Ordinals, I thought we might as well make use of it.
What's The ICANN Problem?
Among the many reasons people love Bitcoin is that it's not centrally controlled by anyone or by a group of colluding individuals. The same goes for Nostr, which is seeing continuous growth in user adoption as users migrate to it from centralized social platforms and other services. ICANN, by contrast, is exactly such a central authority: current domain holders can completely lose access to a domain registered with ICANN, whether through an entity within ICANN or through external influence.
In short, the ICANN problem is that you are not truly the owner of any domain you've purchased under their rule. With that said, this leads to the issue that Nostr is trying to currently solve, which is simple, human-readable handle names, and a solution was presented called NIP-05 handles, but because of the ICANN, it is a bandaid solution.
The NIP-05 Problem
Nostr's NIP-05 handle solution attempts to address the issue of creating human-readable names for decentralized domains. Instead of using NPUB addresses directly, the idea is to associate handles with domains to improve user accessibility and convenience. However, upon closer examination, it becomes apparent that this approach poses several significant challenges and fails to achieve the desired decentralization and censorship resistance.
One of the primary concerns with the NIP-05 handle solution is its reliance on domain names. As we know, the domain name system is currently managed by ICANN, a centralized authority. This centralization introduces a critical vulnerability, as it means that a controlling entity could potentially take control of a domain associated with NIP-05 handles.
In such a scenario, if an authoritative entity seizes control of a domain, all the handles associated with that domain would be rendered useless, causing users to lose their hard-earned and recognizable identities. This loss could lead to a severe setback for user outreach and branding, as individuals and businesses may have built their online presence around these handles. Such an outcome significantly undermines the purpose of decentralized domains, as it introduces a single point of failure and compromises the integrity of the entire system.
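To make that dependency concrete, here is a minimal sketch (Python, using the common requests library) of the lookup that NIP-05 defines: the client fetches `/.well-known/nostr.json` from the domain in the handle and checks that the returned key matches the user's public key. The handle and key below are made-up placeholders; the point is that every verification round-trips through DNS and a web server that ICANN and the registrar ultimately sit above.

```python
import requests

def verify_nip05(handle: str, expected_pubkey_hex: str) -> bool:
    """Check a NIP-05 handle (name@domain) against an expected hex pubkey."""
    name, domain = handle.split("@")
    url = f"https://{domain}/.well-known/nostr.json"
    resp = requests.get(url, params={"name": name}, timeout=10)
    resp.raise_for_status()
    names = resp.json().get("names", {})
    # If the domain is seized or repointed, this lookup fails outright or
    # starts returning whatever key the new domain holder chooses to serve.
    return names.get(name) == expected_pubkey_hex

# Hypothetical values, not a real handle or key:
# verify_nip05("alice@example.com", "3bf0c63f...")
```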
The Failed Nostr Solution
The previous idea that I had presented to solve the ICANN problem, in gist, is to have a file that all relays would hold and update with domain names and connect them with users' NPUBs, but that had its issues:
1. Consensus on File State
One of the major issues with the Nostr idea was the lack of a robust mechanism for relays or entities referencing the file to reach a consensus on its current state. In a decentralized domain system, it's crucial to ensure that all participants are on the same page regarding the associations between names and their corresponding public addresses. Without a consensus protocol, discrepancies and inconsistencies would arise, leading to fragmented and unreliable domain resolution.
2. Ease of Squatting
Another pressing concern was the ease of domain squatting, where individuals could register and hold multiple names, limiting others' access to those names at little to no cost. The potential for abuse could hinder fair access to domain names. It would be a first-come-first-served process with no reasonable financial barriers.
New Solution: Utilize Bitcoin
Here's my newest attempt at providing a solution to the ICANN problem, which is to make Bitcoin itself the new home for domains instead of ICANN. While others have presented the idea of other chains, attached or anchored to Bitcoin, that have already done this, I'd view that as still an issue, since those chains are not as decentralized or powerful as Bitcoin itself and can be susceptible to failure.
Secondary Block Rewards: Domain Name Tokens (DNTs)
At the moment, the Bitcoin protocol rewards Miners that discover a block with an amount of BTC. The idea here is to add another kind of reward alongside the current one: Domain Name Tokens (DNTs).
Let's say this idea has been implemented into Bitcoin. An individual has successfully mined a Bitcoin block, and that person was rewarded with the current usual 6.25 BTC. Alongside that, they are also rewarded with, for example, 1,000 DNTs, which can then be used, given, sold, or traded with other people in the world. The price of a DNT would be determined by supply and demand.
Once an individual has acquired a DNT (originally created through POW, and then typically obtained at some financial cost before use), they would update the DNT that they own by attaching a name to it (permanent) and other relevant information, like an IP address or a Nostr NPUB or note ID (which can be changed later), and then send it (to yourself? to a contract? not sure how this part would work) to have it confirmed in the Bitcoin network. If the name is available (ideally, there'd be a system that checks the network for name availability before you send), the DNT would be recorded in the Bitcoin network with the associated name that no one can take, along with the data that connects it to a server so a browser can display your website.
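To make the proposed flow easier to picture, here is a rough Python sketch of what a DNT record and its one-time name binding might look like. Everything here is hypothetical: the field names, the in-memory registry standing in for on-chain state, and the `register` helper are illustrative assumptions about a protocol that doesn't exist yet, not a description of how Bitcoin works today.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DomainNameToken:
    """Hypothetical DNT: a mined token that can be bound to a name exactly once."""
    token_id: str                          # identifier of the mined/awarded token
    name: Optional[str] = None             # permanent once set, e.g. "freakoverse"
    server_address: Optional[str] = None   # updatable: IP or host serving the website
    nostr_pointer: Optional[str] = None    # updatable: an NPUB or note ID
    btc_address: Optional[str] = None      # updatable: optional payment address

# Toy in-memory registry standing in for "names already confirmed on-chain".
registry: dict[str, DomainNameToken] = {}

def register(dnt: DomainNameToken, name: str, **updatable_fields) -> bool:
    """Bind a name to an unused DNT if the name is still available."""
    if name in registry or dnt.name is not None:
        return False               # name already taken, or token already bound
    dnt.name = name                # permanent binding
    for field_name, value in updatable_fields.items():
        setattr(dnt, field_name, value)
    registry[name] = dnt           # in the real proposal this would be a Bitcoin transaction
    return True

register(DomainNameToken("dnt-001"), "freakoverse", nostr_pointer="npub1exampleonly")
```

A real implementation would also have to settle the open questions above, particularly which transaction actually carries the binding and how nodes agree that a name is taken.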
Third Block Reward Type: Human Readable Bitcoin Address Tokens (HRBATs)
Accidentally thought of this when I was writing down the main idea above for the domain name tokens on Bitcoin. If users can obtain tokens that let them register/record a word and attach data to it, specifically an IP address or a Nostr NPUB or note ID, then another type of token, pretty much the same as the domain one but specifically for Bitcoin addresses, could be introduced as well.
This would follow the same idea as above, where miners would be rewarded with Y amount of HRBATs, and would be given/sold to the masses. A person with an HRBAT can then sign and send it with a name, have it confirmed and recorded in the Bitcoin blockchain, and then attach a Bitcoin address of their choosing to it. The results? We would now have human-readable Bitcoin addresses to transact with.
- Before: "Hey John, you can send me that 0.05 BTC to bc1qar0srrr7x...5l643ly...9re59gtzz...mdq"
- After: "Hey John, you can send me that 0.05 BTC to HummusMan9K"
End Thoughts & Thanks For Reading!
Here are a few extra thoughts I had while thinking about these issues and solutions:
- I don't think either of these types of rewards should be case-sensitive, as case sensitivity would lead to various malicious issues.
- I'm not sure if this requires a new BIP or not; if it does and the solution is sound, the remaining hurdles would be approval from the Bitcoin network and the approval time (as well as sound development/code, of course).
- Other data can be attached to a DNT, such as a Bitcoin address or a Bitcoin LN address.
- Even though a DNT is pretty much the same thing as an HRBAT, the reason I'd suggest two different types is better organization and a push for separate uses. A DNT would have "Domain Name", "Server Address", "Bitcoin Address", and "Bitcoin LN Address" fields, while an HRBAT would have "Bitcoin Address Name", "Bitcoin Address", and "Bitcoin LN Address" fields.
- Internet browsers would need to add support for DNTs, and an agreed method of accessing them via the URL field needs to be figured out. Perhaps something like "Visit my website on bd:freakoverse", where "bd" stands for "Bitcoin Domain" and points toward the Bitcoin network. Browsers could offer a quick toggle next to the URL field to switch between ICANN domains and Bitcoin domains, and a user could set whichever is the default.
That's about it. What do you think of this idea to potentially solve the ICANN problem, if it isn't good or if there are holes in it, please mention it, and let's delve into a discussion to find a better solution. Thanks for reading!
-
@ eea716ed:ec1e1eda
2023-07-12 03:12:11Water navigating complexity | Logging accomplishments | All-product survey | Mountains | Ice water
Career
Practices
Logging Accomplishments
Started a new practice of logging daily accomplishments. Learned about it here.
The idea is to build an internal proof of work. Accomplishments, successes, and victories get lost as we push past the daily slog.
I write my accomplishments separately for each portfolio/project. That way I can quickly skim through at the end of the week to write up a weekly log of successes.
Template for Work
Created a lightweight daily work template in Logseq that I really enjoy: https://cdn.nostr.build/p/4a84.png
Government of Alberta’s DDD (Digital Design & Delivery)
Beginning to find a groove. I’m part of a newly forming team called Design HQ.
The vision for Design HQ has boundaries - there is a collective will for its creation. At the same time, the definite idea of how the vision looks is yet to be cocreated. And it will likely continue to be shaped and refined for the next while.
Table stakes for Design HQ are to support design practitioners: UXers and service designers. It’s a wildly challenging and exciting task to build resources, tools, and supports for and with these practitioners… using service design approaches. It’s as meta as you can get.
There are many brilliant humans across the DDD. Navigating this space reminds me of the importance of staying cognitively flexible. Building in complexity is like being like water - as Bruce Lee put it.
https://cdn.nostr.build/p/80Zg.png Link here
I’m still continuing to do spaced repetition to internalize the org’s mental models, practices, and philosophies.
Chaincode
User Feedback
Created a tiny but efficient all-product survey. https://cdn.nostr.build/p/lMdP.png
The design goal was to use one data collection tool across several products. This decreases overhead, especially while products move from alpha-beta-live.
Insights will of course be paired (and triangulated) with user interviews.
Product Development
Most products are super close to alpha release. It’s going to be like popcorn soon.
Here’s a sneak peek with the Bitcoin Knowledge Project (name is still in progress): https://cdn.nostr.build/p/PxZ0.jpg
Lifestuff
- Been helping Brittany a bunch to prepare Bolli Imports for Shambhala https://cdn.nostr.build/p/Wgdk.webp
- Went camping in Alberta’s crownland. We’re so far north that we could still see the silhouette outline of the mountains at midnight. (Image not at midnight lol). Did a bunch of cold dips in the ice cold creek. https://cdn.nostr.build/p/6KZ2.webp Image link
- Hit up the Calgary Stampede. Ate nasty food. Saw big majestic horses. https://cdn.nostr.build/p/BmZ0.webp
- Wrote part of this with my feet in a bucket of ice water. This one hurt like hell for some reason. Usually not this bad. https://cdn.nostr.build/p/QEwP.webp
-
-
@ 32c7ac5a:7bc8d77d
2023-07-31 03:43:35From the moment Hashrate Asset Group (HAG) unveiled plans to launch a Security Token Offering (STO), our community’s interest spiked, leading to a wealth of questions about HAG and the STO process. Recognizing this curiosity and the need for clear, understandable information, we have compiled this in-depth FAQ. This FAQ is designed to provide insights into HAG’s operations and investments. We aim to demystify this intricate, yet captivating industry, helping you gain a deeper understanding of HAG’s functioning and the potential our STO tokens hold. So, let’s explore the dynamic operations of HAG and the promising prospects of our STO tokens.
1. How does HAG manage the mining equipment backing the STO tokens? What are the steps taken to ensure efficient and continuous operation?
We will utilize 30% of the net yield from Bitcoin mining as resources to maintain or update HAG mining equipment. Currently, HAG collaborates with mining farms, and they handle the onsite maintenance. Our management team has this maintenance capability as well, so we will review the work or physically participate if needed. In fact, the maintenance standard is specified in the contract between HAG and the mining farm.
2. How does HAG calculate the net yield for STO token holders?
The net yield will be defined in the Master Token Agreement.
“Net Yield” is the gross Bitcoin yielded from mining operations minus the following:
- electricity used in mining;
- gas fees;
- mining pool commission fees;
- mining facility operating costs and overhead;
- other direct costs of mining;
- taxes incurred by Company;
- listing fees on the INX ATS and any other trading forum;
- [Anticipated Paying Agent fees and costs for Periodic Distributions].
In computing Net Yield:
- neither management fees (if any) nor the Company’s general administrative costs will be subtracted from gross Bitcoin yield; and
- income tax of HAG Holders that the Company is required to withhold will be netted from individual distributions and not subtracted from Gross Yield.
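As a rough illustration of how these definitions fit together, here is a small Python sketch that computes Net Yield from gross mined BTC and the deductions listed above, then applies the split implied by question 1 (30% retained for maintenance and working capital, with the remaining 70% distributed to holders pro rata, as described in HAG's other materials). All figures are made-up placeholders, not projections; the binding definitions live in the Master Token Agreement and PPM.

```python
def net_yield(gross_btc: float, deductions_btc: float) -> float:
    """Gross BTC mined minus the deductions listed above (electricity, fees, taxes, ...)."""
    return max(gross_btc - deductions_btc, 0.0)

def monthly_distribution(gross_btc: float, deductions_btc: float,
                         holdings: dict[str, int]) -> dict[str, float]:
    """Distribute 70% of Net Yield pro rata across holders; 30% stays with the company."""
    distributable = 0.70 * net_yield(gross_btc, deductions_btc)
    total_tokens = sum(holdings.values())
    return {holder: distributable * tokens / total_tokens
            for holder, tokens in holdings.items()}

# Placeholder numbers purely for illustration:
payout = monthly_distribution(gross_btc=12.0, deductions_btc=4.5,
                              holdings={"alice": 10_000, "bob": 2_500})
print(payout)  # -> roughly {'alice': 4.2, 'bob': 1.05}
```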
3. How does HAG manage electricity costs to maximize the profitability of Bitcoin mining?
The HAG management team were early investors in Bitmain, hence they have the know-how of the mining industry. First of all, HAG collaborates with top mining farms that provide reasonable electricity prices along with proven records of equipment management. In fact, compared with electricity costs, the losses from poor maintenance cost much more, and HAG will renew the mining equipment before its expected service life ends. Of course, HAG will get involved in the maintenance duties to ensure the rights of HAG holders. Secondly, the mining equipment is planned to be set up in the US, with much more transparency on electricity policy.
In addition, Biden’s proposal to charge an additional tax on Bitcoin mining was rejected by Congress.
4. What is the estimated return on investment for HAG STO tokens? How does the dividend payout process work and how often are dividends paid out?
For legal compliance reasons, we cannot predict the return of any financial asset, including HAG. However, feel free to join our community, where there might be some discussion about the potential returns. Dividends will be paid on a monthly basis. The detailed process for dividend distribution will be disclosed in our PPM, which will be released before the launch of HAG. Of course, we will ensure every process is safe, legitimate, and correct.
5. How does HAG ensure the security of its tokens and the underlying assets?
The token itself will be distributed by INX’s Transfer Agent, which holds the required licenses. After distribution, holders are responsible for keeping their tokens safe. Of course, holders can contact INX or HAG, and we will do our best to provide available assistance.
6. Can HAG tokens be traded freely on the secondary market? If so, how to ensure the liquidity?
HAG can be traded freely on the secondary market. Because of compliance requirements, this secondary market is currently separated from the crypto market. Security tokens are the future because of their compliance, so their liquidity has huge room to grow. Furthermore, compared with investing directly in a mining pool, or with the typical liquidity of security tokens backed by real-world assets, the liquidity of a security token like HAG is much better. Besides, the HAG security token is based on ERC-1404, which is compatible with ERC-20. Therefore, future application scenarios, especially ERC-20 DeFi, will be promising.
7. What are the future plans for HAG?
The short-term plans:
a) Get the mining equipment online and let holders start receiving dividends.
b) Make sure the mining farm’s operations are carried out beyond satisfaction and gradually achieve economies of scale.
c) Keep communicating with holders and potential holders and make sure our practices are transparent at a reasonable level.
The longer-term plans:
a) Increase partnerships with other mining farms to better achieve economies of scale and diversification.
b) Collaborate with institutions from different countries/regions to find out what potential they could have with HAG. The pool will get deeper and deeper.
c) Start talking with DeFi projects and look for potential collaborations.
8. Financial Reporting:
a) Will HAG disclose the financial report? Yes, we will.
b) How often does HAG release financial reports, and what information is included in these reports? Since HAG is an SEC-filed company, we will follow the SEC’s rules on financial information disclosure that apply to HAG.
c) Are HAG’s financial reports audited by an independent party? If so, who conducts these audits? Yes, HAG’s financial reports will be audited, and we are in the process of interviewing candidates to conduct the audits required by the SEC in the future.
d) How can investors access HAG’s financial reports? They will be available on the HAG website.
e) How does HAG ensure transparency in its financial reporting? HAG is an SEC-filed company, so we follow the US SEC’s rules. Audits of the financial information will be conducted, and the results will be published on the HAG website.
About Hashrate Asset Group
Hashrate Asset Group aspires to build the world’s first sustainable, compliant and transparent Bitcoin standard arithmetic operating model. HAG Token allows investors to join the ecosystem and receive a real-time return on your investment. HAG mining farm is located in the United States, and the team is composed of industry-leading professionals from Bitmain, Goldman Sachs, and TSMC.
Website: https://www.hagsto.com/
Twitter: https://twitter.com/HashrateAsset
Discord: https://discord.gg/9mkYSr23cz
Telegram: https://t.me/HashrateAsset
-
-
@ 32c7ac5a:7bc8d77d
2023-07-31 03:38:36In recent years, Bitcoin mining and real-world assets (RWAs) such as real estate, equipment, and intellectual property have gained significant attention. The combination of these two concepts creates a unique opportunity for investors to participate in the growing cryptocurrency market with the security of tangible assets. Hashrate Asset Group (HAG) achieves this by launching security token offering (STO) and integrating RWAs with the BTC DeFi system.
HAG STO: A Compliant and Secure Tokenization Model
HAG’s STO token, backed by the intrinsic value of hash power on Bitcoin mining equipment, offers investors confidence in its real-world value. HAG purchases the latest Bitcoin mining equipment, such as the S19 XP, to mine Bitcoin. The value of the hash power on the mining equipment is used to back the HAG STO token, creating a secure investment opportunity for investors.
Each HAG STO token represents one perpetual terahash power, which is a fractional ownership of hash power on the mining equipment and the Bitcoin mined by that equipment. Investors can purchase HAG STO tokens, knowing that they are backed by the value of the perpetual mining power of mining equipment and the Bitcoin it mines. This model ensures that the value of the HAG STO token is tied to the underlying mining power and Bitcoin mining equipment, the Bitcoin it mines, and BTC dividend payouts.
As the mining equipment operates and generates new Bitcoin, there are dividend payouts to token holders in the form of wBTC. These payouts provide an additional stream of income for investors, enhancing the value proposition of the HAG STO token. Hashrate Asset Group distributes 70% of the net yield to HAG STO token holders monthly on a pro rata basis. The remaining 30% is utilized as a reserve to maintain the targeted hash rate per HAG security token and as general working capital. This sustainable model allows investors to benefit from a steady return on their investment, and it firmly anchors the HAG STO within the BTC DeFi ecosystem.
Furthermore, the HAG STO token can be traded on security token exchanges, giving investors exposure to the growing cryptocurrency market without the volatility and risk associated with other cryptocurrencies. As the Bitcoin market grows and the value of the mined Bitcoin increases, so too would the value of the HAG STO token.
Integrating RWA with BTC DeFi through HAG: A Compliant and Secure Tokenization Model
DeFi (Decentralized Finance) represents a flourishing domain that leverages blockchain technology to craft a more transparent, decentralized financial system. The DeFi ecosystem hosts an array of financial applications and protocols running on blockchain networks, offering users access to a multitude of financial services and products, including trading, lending, borrowing, and yield farming.
DeFi’s features suggest it can serve as an efficient conduit for RWA integration, allowing these assets to gain exposure among investors. An illustrative example is the tokenization of a real estate property into a digital token on a blockchain. This token could then serve as collateral for a loan on a DeFi platform, allowing the token owner to access liquidity without having to liquidate the underlying asset.
However, uniting DeFi with RWA presents challenges, including regulatory compliance, security, and scalability. HAG offers a compliant, secure, and robust vehicle to intertwine RWA with the BTC DeFi system, overcoming these hurdles.
HAG’s recent STO presents a distinctive opportunity to blend RWA with the BTC DeFi ecosystem. The tokenization of RWA enables fractional ownership and the transferability of assets, which were previously arduous to transact in smaller increments. HAG’s STO elevates this concept by tokenizing hashrate ownership in the Bitcoin mining process. This strategy allows investors to partake in Bitcoin mining without the complexities of mining equipment setup, electricity negotiation, or hash power maintenance. Moreover, HAG tokens can be utilized for yield farming, providing investors with high yields.
Potential Challenges and the Role of HAG
While the combination of DeFi with RWA represents an exciting opportunity for investors to access previously inaccessible markets and assets, some challenges remain, including regulatory compliance, security, and scalability. HAG addresses these challenges by providing a compliant, secure, and effective vehicle to combine RWA with the BTC DeFi system.
HAG tokens are fully compliant with the U.S. Securities Act, ensuring that investors have the necessary protections and regulatory oversight to invest in these types of assets. This compliance not only provides peace of mind for investors but also sets a new standard for digital asset investment, paving the way for the entire cryptocurrency market and Bitcoin mining industry.
Each HAG Token represents one terahash per second of perpetual mining hashrate, and token holders receive monthly dividend payments in the form of WBTC net of expenses based on the number of HAG Tokens they hold. This model provides investors with a sustainable and consistent return on their investment through a broad range of pricing assumptions of the underlying digital asset.
In addition, HAG’s asset class is a security, so buying HAG is similar to buying stocks in the traditional market. HAG’s underlying asset lies in the Bitcoin ecosystem, but it also provides substantial application scenarios in the Ethereum ecosystem. As an ERC-1404-based security token, HAG has potential in DeFi built on ERC-20. The two standards are compatible, and ERC-1404 tokens are applicable to ERC-20 DeFi projects.
Conclusion
In conclusion, the HAG model combines Bitcoin mining, RWA, and DeFi, creating new opportunities for investors to participate in the cryptocurrency market while minimizing risk. The potential for dividend payouts in the form of newly mined Bitcoin provides an additional income stream for investors, further enhancing the value proposition of the STO token.
As the popularity of cryptocurrency continues to grow, this new type of STO token could become a valuable addition to any investor’s portfolio. By offering a unique investment opportunity that combines the benefits of Bitcoin mining, real-world assets, and DeFi integration, Hashrate Asset Group is at the forefront of innovative tokenization models in the world of digital assets.
About Hashrate Asset Group
Hashrate Asset Group aspires to build the world’s first sustainable, compliant and transparent Bitcoin standard arithmetic operating model. HAG Token allows investors to join the ecosystem and receive a real-time return on your investment. HAG mining farm is located in the United States, and the team is composed of industry-leading professionals from Bitmain, Goldman Sachs, and TSMC.
Website: https://www.hagsto.com/
Twitter: https://twitter.com/HashrateAsset
Discord: https://discord.gg/9mkYSr23cz
Telegram: https://t.me/HashrateAsset
-
@ 32e18276:5c68e245
2023-07-11 21:23:37You can use github PRs to submit code but it is not encouraged. Damus is a decentralized social media protocol and we prefer to use decentralized techniques during the code submission process.
[Email patches][git-send-email] to patches@damus.io are preferred, but we accept PRs on GitHub as well. Patches sent via email may include a bolt11 lightning invoice, choosing the price you think the patch is worth, and we will pay it once the patch is accepted and if I think the price isn't unreasonable. You can also send an any-amount invoice and I will pay what I think it's worth if you prefer not to choose. You can include the bolt11 in the commit body or email so that it can be paid once it is applied.
Recommended settings when submitting code via email:
```
$ git config sendemail.to "patches@damus.io"
$ git config format.subjectPrefix "PATCH damus"
$ git config format.signOff yes
```
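With those settings in place, a typical way to send your latest commit as a patch might look like the following (hypothetical invocation; the recipient and subject prefix come from the config above, and `--annotate` just opens the mail for a final look before it goes out):

```
$ git send-email --annotate -1
```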
You can subscribe to the [patches mailing list][patches-ml] to help review code.
Submitting patches
Most of this comes from the linux kernel guidelines for submitting patches, we follow many of the same guidelines. These are very important! If you want your code to be accepted, please read this carefully
Describe your problem. Whether your patch is a one-line bug fix or 5000 lines of a new feature, there must be an underlying problem that motivated you to do this work. Convince the reviewer that there is a problem worth fixing and that it makes sense for them to read past the first paragraph.
Once the problem is established, describe what you are actually doing about it in technical detail. It's important to describe the change in plain English for the reviewer to verify that the code is behaving as you intend it to.
The maintainer will thank you if you write your patch description in a form which can be easily pulled into Damus's source code tree.
Solve only one problem per patch. If your description starts to get long, that's a sign that you probably need to split up your patch. See the dedicated "Separate your changes" section because this is very important.

When you submit or resubmit a patch or patch series, include the complete patch description and justification for it (-v2,v3,vn... option on git-send-email). Don't just say that this is version N of the patch (series). Don't expect the reviewer to refer back to earlier patch versions or referenced URLs to find the patch description and put that into the patch. I.e., the patch (series) and its description should be self-contained. This benefits both the maintainers and reviewers. Some reviewers probably didn't even receive earlier versions of the patch.
Describe your changes in imperative mood, e.g. "make xyzzy do frotz" instead of "[This patch] makes xyzzy do frotz" or "[I] changed xyzzy to do frotz", as if you are giving orders to the codebase to change its behaviour.
If your patch fixes a bug, use the 'Closes:' tag with a URL referencing the report in the mailing list archives or a public bug tracker. For example:
Closes: https://github.com/damus-io/damus/issues/1234
Some bug trackers have the ability to close issues automatically when a commit with such a tag is applied. Some bots monitoring mailing lists can also track such tags and take certain actions. Private bug trackers and invalid URLs are forbidden.
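As a purely hypothetical illustration of the points above, here is what a complete patch description might look like; the change described, the invoice string, and the sign-off are placeholders, and the issue URL is the example one used above.

```
nav: fix crash when opening a profile from search

Opening a profile from the search view crashes when the profile has no
banner set, because the banner URL is force-unwrapped. Fall back to the
default banner instead.

bolt11 for this patch (placeholder): lnbc1...

Closes: https://github.com/damus-io/damus/issues/1234
Signed-off-by: Jane Doe <jane@example.com>
```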
If your patch fixes a bug in a specific commit, e.g. you found an issue using `git bisect`, please use the 'Fixes:' tag with the first 12 characters of the SHA-1 ID, and the one line summary. Do not split the tag across multiple lines, tags are exempt from the "wrap at 75 columns" rule in order to simplify parsing scripts. For example:

```
Fixes: 54a4f0239f2e ("Fix crash in navigation")
```

The following `git config` settings can be used to add a pretty format for outputting the above style in the `git log` or `git show` commands:

```
[core]
    abbrev = 12
[pretty]
    fixes = Fixes: %h (\"%s\")
```

An example call:

```
$ git log -1 --pretty=fixes 54a4f0239f2e
Fixes: 54a4f0239f2e ("Fix crash in navigation")
```
Separate your changes
Separate each logical change into a separate patch.
For example, if your changes include both bug fixes and performance enhancements for a particular feature, separate those changes into two or more patches. If your changes include an API update, and a new feature which uses that new API, separate those into two patches.
On the other hand, if you make a single change to numerous files, group those changes into a single patch. Thus a single logical change is contained within a single patch.
The point to remember is that each patch should make an easily understood change that can be verified by reviewers. Each patch should be justifiable on its own merits.
If one patch depends on another patch in order for a change to be complete, that is OK. Simply note "this patch depends on patch X" in your patch description.
When dividing your change into a series of patches, take special care to ensure that Damus builds and runs properly after each patch in the series. Developers using `git bisect` to track down a problem can end up splitting your patch series at any point; they will not thank you if you introduce bugs in the middle.
-
@ e6817453:b0ac3c39
2023-07-30 11:00:33In 2005, Microsoft’s Chief Identity Architect, Kim Cameron, wrote an influential paper called The Laws of Identity.
Laws of identity
Kim Cameron's Laws of Identity
1. Law of Control: Users must have control over their own digital identities. They should be able to decide and control how their identities are used and shared.
2. Law of Minimal Disclosure: The solution which discloses the least identifying information and best limits its use is the most stable, long-term solution.
3. Law of Justifiable Parties: Digital identity systems must limit disclosure of identifying information to parties having a necessary and justifiable place in a given identity relationship.
4. Law of Directed Identity: A universal identity system must support both "omnidirectional" identifiers for use by public entities and "unidirectional" identifiers for use by private entities, thus facilitating discovery while preventing unnecessary release of correlation handles.
5. Law of Pluralism: A universal identity system must channel and enable the interworking of multiple identity technologies run by multiple identity providers.
6. Law of Human Integration: The identity system must define the human user as a component integrated through protected and unambiguous human-machine communications.
7. Law of Consistent Experience Across Contexts: The unifying identity metasystem must provide a simple, consistent experience while enabling separation of contexts through multiple operators and technologies.
These principles are designed to ensure user control, privacy, and security in digital identity systems, and they have influenced a wide range of subsequent work in digital identity.
It defines the core principles of a meta-identity system that allow next generations of identity systems to be built.
Let's look at how the Self-Sovereign Identity principles support these laws and enable a meta-identity system to be built on top of them.
SSI principles
Principles of Self-Sovereign Identity (SSI)
1. Existence: Users must have an independent existence. Their identities should exist even outside of the digital realm.
2. Control: Users must control their identities. They should have the ability to access, manage, and control the data and information that is associated with their identity.
3. Access: Users must have access to their own data. They should be able to retrieve, move, and store their data as they wish.
4. Transparency: Systems and algorithms must be transparent. The systems used to administer and operate a user's identity must be open, both in how they function and in how they are managed and governed.
5. Persistence: Identities must be long-lived. Ideally, they should last forever, or at least for as long as the user wishes.
6. Portability: Information and services about identity must be transportable. They should not be held by a singular third-party entity, even if it's a trusted entity.
7. Interoperability: Identities should be as widely usable as possible. They should function in all the places where identity information is required.
8. Consent: Users must agree to the use of their identity. Their consent should be a requirement for any identity transactions in which their data is utilized.
9. Minimization: Disclosure of claims must be minimized. When data is disclosed, the user should provide the minimal amount of data necessary for the transaction.
10. Protection: The rights of users must be protected. Whether through legal means or through the architecture of the identity system itself, a user's rights, including their right to privacy, should be protected.
SSI puts the individual at the center of digital identity management, providing a strong framework for privacy, security, and user control.
Correlation of digital identity laws and SSI principles (© A Comprehensive Guide to Self Sovereign Identity). In the picture, we see a mapping of the laws to the principles. Only one item is missing: Directed Identity, which is covered by DIDs. Entities should have publicly resolvable identifiers as well as peer-to-peer private ones that allow pseudo-anonymous relations to be built.
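To make that point concrete, here is a tiny illustrative snippet; the DID strings are invented placeholders, but they show the two identifier styles the Directed Identity law asks for: an omnidirectional identifier anyone can resolve, and pairwise identifiers that avoid creating correlation handles between relationships.

```python
# Illustrative only: these DID strings are invented placeholders.
public_did = "did:web:example.com"          # omnidirectional: published, resolvable by anyone
pairwise_did = "did:peer:2.Ez...example"    # unidirectional: generated per relationship

relationships = {
    "public website": public_did,    # one identifier for everyone, discoverable on purpose
    "alice <-> bob":   pairwise_did,  # unique per peer, so contexts cannot be correlated
}

for context, did in relationships.items():
    print(f"{context}: {did}")
```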
You can get more details in the book.
Learn digital identity
-
@ e6817453:b0ac3c39
2023-07-30 09:22:47In the digital age, our identities are fragmented across various platforms, each holding a piece of our data. This fragmentation poses a significant challenge, as it prevents us from having a unified view of our digital identities. However, the concept of a holistic identity, digital twins, and autonomous agents can solve this problem, providing a more comprehensive and self-sovereign approach to digital identity.
Holistic Identity: A Unified View of Self
Holistic identity is not just a technical term; it's a philosophical concept that aims to solve the problem of data fragmentation. Unlike traditional identity systems that are authoritative and siloed, a holistic identity provides a unified view of an individual's data across various platforms.
A holistic identity is a snapshot of all data points about you, including your behavior, activities, and posts. It's not just about an identifier or login password; it's about aggregating all the data points that identify you, providing a more comprehensive view of your digital self.
Digital Twins: Your Digital Copy
A digital twin is a digital copy of you and all your data that you control. It's a continuation of your holistic identity, aggregating all the data points about you and all the data produced by you.
The concept of a digital twin goes beyond just storing information; it's about getting benefits out of it. With a digital twin, you can interact with your data, gain insights, and even sell your data. It opens up a world of possibilities, from personalization to automation.
Autonomous Agents: Your Digital Assistants
Autonomous agents are the next step from digital twins. They are essentially digital assistants that can perform tasks on your behalf. These agents can have access to a portion of your data and can perform various operations, from booking tables and buying tickets to trading operations and data trading on data exchanges.
Autonomous agents can analyze data from your digital twin and perform actions based on it. They can optimize routine tasks, cooperate with each other, create trust networks, and even make micro-payments.
The Future of Self-Sovereign Identity
The concepts of holistic identity, digital twins, and autonomous agents are interlinked and form the cornerstone of a self-sovereign identity. They provide a way to have a sovereign persona and proof that the data belongs to you.
These concepts are not just theoretical; they have practical applications that can revolutionize various domains, from healthcare and finance to personalization and automation. They represent the future of digital identity, a future where we have more control over our data and where our digital identities are unified, comprehensive, and self-sovereign.
In conclusion, the era of self-sovereign identity is upon us. It's an era where we can control our data, gain insights from it, and use it to our advantage. It's an era where our digital identities are not fragmented across various platforms but are unified and comprehensive. It's an era where we can have digital twins and autonomous agents that can perform tasks on our behalf. It's an era of holistic identity, digital twins, and autonomous agents.
-
@ 32c7ac5a:7bc8d77d
2023-07-31 03:28:30Summary
Hashrate Asset Group is committed to establishing the world’s first sustainable, compliant, and transparent Bitcoin standard arithmetic operating model. HAG token is a security token issued by Hashrate Asset Group in the form of STO. Holders may participate in the ecosystem and realize a real-time return on their investment. HAG mining farm is located in the United States, and the team is composed of industry-leading professionals from Bitmain, Goldman Sachs, and TSMC.
The value of HAG is mainly reflected in two aspects. Firstly, its inherent value is highly correlated with Bitcoin, and the long-term value of Bitcoin will bring returns to investors. Secondly, investors can receive 70% of net profit of mining income every month as WBTC dividends.
For those interested in investing in HAG, we have established a strategic partnership with INX Limited, one of the top three STO exchanges in the US and the owner of a blockchain-based trading platform for digital securities and cryptocurrencies that has successfully completed a security token offering that was filed with the SEC and went through a full prospectus. Through the INX platform (https://one.inx.co/), investors will have the opportunity to invest in the first SEC-filed security token focused on Bitcoin mining.
Furthermore, HAG’s presale is set to commence soon, and both individual and institutional investors may participate by subscribing in advance at a unit price of $48. We will be releasing detailed tutorials on pre-sale and STO investments to provide investors with clearer and more specific guidance. We encourage interested parties to stay tuned for further updates.
Introduction

The cryptocurrency market is flooded with altcoins, which leaves investors with an overwhelming number of options for investing in digital assets. However, consistent returns in this space require significant due diligence and a high level of systematic risk tolerance. For those looking for investment exposure to the cryptocurrency space without the extreme volatility of the altcoin market, Bitcoin is a crowd favorite. Due to its brand recognition, security, limited supply, and legacy as the original cryptocurrency, Bitcoin has yielded the highest returns compared to traditional asset classes for nine out of the past eleven years.
Investors can purchase Bitcoin directly with fiat through various exchanges or mine it by setting up specific computer hardware to solve blocks of algorithmic equations. Mining and holding can remove the challenge of figuring out the ideal time and price point to purchase Bitcoin, allowing investors to accumulate BTC based on their hashing power. However, setting up and maintaining a profitable mining operation can be challenging and often out of reach for individual investors or investors without related mining operation experience.
It’s important to note that this is an extremely competitive space where having the right hardware setup, security, maintenance, and access to cheap electricity are all prerequisites for a profitable mining operation. This can be difficult, if not outright infeasible, for individual investors.
Hashrate Asset Group
HAG (Hashrate Asset Group) offers access to Bitcoin investment through hashrate ownership, with the security and compliance typically reserved for institutional investors. HAG Tokens are security tokens that are fully compliant with the U.S. Securities Act. Issued through a Security Token Offering (STO), HAG is a digital token built on the ERC-1404 standard, which has all of the features and benefits of an ERC-20 token contract plus a few enhancements that allow issuers to enforce regulatory restrictions.
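To make the ERC-1404 mechanics more concrete, here is a minimal sketch, in Python, of the restriction-check pattern that the standard layers on top of ERC-20. This is illustrative only: the real standard is a Solidity interface (detectTransferRestriction / messageForTransferRestriction), the actual HAG contract logic is not published in this post, and the whitelist rule below is an assumed example of a regulatory restriction.

```python
# Minimal sketch of the ERC-1404 restriction pattern, in Python.
# Illustrative only: the real standard is a Solidity interface, and the
# actual HAG contract rules are not described in this post. The whitelist
# rule here is an assumed example of an issuer-enforced restriction.

SUCCESS = 0
NOT_WHITELISTED = 1

RESTRICTION_MESSAGES = {
    SUCCESS: "No restriction",
    NOT_WHITELISTED: "Receiver is not an approved (KYC'd) investor",
}


class RestrictedToken:
    def __init__(self):
        self.balances = {}      # address -> token balance
        self.whitelist = set()  # addresses cleared by the issuer (assumed rule)

    def detect_transfer_restriction(self, sender, receiver, amount):
        """Return a restriction code, mirroring ERC-1404's detectTransferRestriction."""
        if receiver not in self.whitelist:
            return NOT_WHITELISTED
        return SUCCESS

    def message_for_transfer_restriction(self, code):
        return RESTRICTION_MESSAGES[code]

    def transfer(self, sender, receiver, amount):
        # The key ERC-1404 addition: every transfer is checked first.
        code = self.detect_transfer_restriction(sender, receiver, amount)
        if code != SUCCESS:
            raise ValueError(self.message_for_transfer_restriction(code))
        if self.balances.get(sender, 0) < amount:
            raise ValueError("Insufficient balance")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount


# Example: the issuer whitelists an investor before tokens can reach them.
token = RestrictedToken()
token.balances["issuer"] = 1_000_000
token.whitelist.update({"issuer", "alice"})
token.transfer("issuer", "alice", 100)   # allowed
# token.transfer("alice", "bob", 10)     # would raise: bob is not whitelisted
```

The key point is simply that every transfer runs through a check the issuer controls, which is how transfer restrictions required by securities law can be enforced on-chain.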
Each HAG Token represents 1 terahash of perpetual mining hashrate for the underlying Bitcoin digital asset. Token holders receive monthly dividend payments in the form of WBTC, net of expenses, based on how many HAG Tokens they hold. The nature of the underlying asset, regular yield payments, and compliance with extensive regulations result in significantly lower risk for investors compared to the broader cryptocurrency space.
Compared to traditional methods of directly buying Bitcoin or attempting to set up your own hardware with all the necessary prerequisites, HAG provides a sustainable and consistent ROI for investors across a broad range of price assumptions for the underlying digital asset (Bitcoin).

STO stands for security token offering. A security token is a digital asset that represents ownership or other rights and transfers value from an asset or bundle of assets to a token.
**HAG STO Highlights**

HAG Token is a security token based on blockchain technology, issued in accordance with US securities laws through an STO. It is the world’s first security token that permanently anchors BTC mining power and distributes WBTC as dividends on a monthly basis. HAG creatively combines digital assets with compliant issuance, especially in the context of increasing global compliance and regulation of digital assets. The issuance of HAG Token will serve as a model for the entire cryptocurrency market and Bitcoin mining industry.
Compared to direct investment in Bitcoin and stocks of Bitcoin mining companies, HAG Token has significant advantages:
- Transparency and immediate distribution of profits: The Bitcoins produced each month are directly distributed to investors’ wallets through smart contracts.
- Compliance: Fully filed with the US SEC and issued in strict accordance with securities laws.
- Perpetual and constant computing power: A well-designed structure ensures the feasibility of creating long-term value for assets.

HAG Tokenomics
Initial fundraising size: $10~20 million
Minimum Ticket Size: $5,000
Token price: Floating but target at $48 /HAG
1 HAG = 1 Terahash/s
Planned initial miner: 1,479 ~ 3,076 units

Miner series: S19 XP, 21.5 J/TH, 140 TH/s
Deployment: 2 months after fully called
Miner depreciation:
- 30% in 1st year
- 60% in 2nd year
- 90% in 3rd year

1 HAG monthly BTC mined (30 days) ~ 0.0000277 BTC (assume miner uptime 95% & pool fee 1%)
Mining cost for 1 BTC ~ USD 15,000 (assume USD 7.5 cent / kWh for electricity)
Static result as of Mar. 14th, 2023. Please refer to the Private Placement Memorandum for final details.
The Company will distribute 70% of the net yield to HAG holders monthly, on a pro rata basis. The remaining 30% will be utilised as a reserve to maintain the targeted hash rate per HAG Token and as general working capital.
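As a rough illustration of how figures like the monthly BTC yield per terahash, the ~USD 15,000 mining cost per BTC, and the 70% holder distribution fit together, here is a back-of-the-envelope sketch in Python. The network hashrate and BTC price below are assumptions added for illustration (they are not stated in the post), so the outputs only approximate the memorandum's figures; the miner efficiency, electricity price, uptime, pool fee, and 70% split come from the post itself.

```python
# Rough, illustrative mining economics for 1 TH/s (one HAG token).
# ASSUMPTIONS not stated in the post: network hashrate and BTC price
# (values roughly from March 2023). The memorandum's exact inputs are
# unknown, so these results only approximate its figures.

BLOCK_REWARD_BTC = 6.25          # pre-2024-halving subsidy
BLOCKS_PER_DAY = 144
NETWORK_HASHRATE_THS = 330e6     # ~330 EH/s expressed in TH/s (assumption)
BTC_PRICE_USD = 24_500           # assumption

EFFICIENCY_J_PER_TH = 21.5       # S19 XP, from the post
ELECTRICITY_USD_PER_KWH = 0.075  # from the post
UPTIME = 0.95                    # from the post
POOL_FEE = 0.01                  # from the post

# Expected BTC earned by 1 TH/s over 30 days
btc_per_th_per_day = BLOCK_REWARD_BTC * BLOCKS_PER_DAY / NETWORK_HASHRATE_THS
gross_monthly_btc = btc_per_th_per_day * 30 * UPTIME * (1 - POOL_FEE)

# Electricity: 21.5 J/TH at 1 TH/s is 21.5 W, i.e. about 0.516 kWh/day
kwh_per_day = EFFICIENCY_J_PER_TH * 86_400 / 3_600_000
electricity_usd_per_month = kwh_per_day * 30 * ELECTRICITY_USD_PER_KWH
usd_cost_per_btc = electricity_usd_per_month / gross_monthly_btc

# Net yield after electricity, then the 70% holder distribution
net_monthly_btc = gross_monthly_btc - electricity_usd_per_month / BTC_PRICE_USD
holder_share_btc = 0.70 * net_monthly_btc

print(f"Gross BTC / TH / month : {gross_monthly_btc:.7f}")
print(f"Electricity cost / BTC : ${usd_cost_per_btc:,.0f}")
print(f"Net BTC / TH / month   : {net_monthly_btc:.7f}")
print(f"70% paid to holder     : {holder_share_btc:.7f} (as WBTC)")
```

Under these assumed inputs the electricity cost per BTC lands near the quoted ~USD 15,000, while the exact monthly yield depends heavily on the assumed network hashrate, which is why the memorandum labels its own numbers a static snapshot.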
The value of HAG is reflected in two main aspects. First, its intrinsic value is highly correlated with Bitcoin, and the long-term value of Bitcoin will bring returns to investors. Second, investors receive 70% of the net mining profit every month as WBTC dividends.
How to invest?
HAG has partnered with INX Limited (“INX”), the owner of blockchain-based trading platforms for digital securities and cryptocurrencies and the first to complete a security token offering that was filed with the SEC and went through a full prospectus. The HAG and INX strategic partnership will allow investors on the INX platform (https://one.inx.co/) to invest in the first SEC-filed security token focusing on Bitcoin mining.
Note that the HAG presale is about to begin, and both individual and institutional investors may subscribe in advance at a price of $48 per unit. We will soon release separate tutorials on pre-sales and STO investments in order to provide investors with additional guidance. Please stay tuned for further updates. The Private Placement Memorandum will also be available with the final details. The final price of HAG will be determined based on the market price of Bitcoin and hashrate, and it will be confirmed with the exchange before the STO.
About Hashrate Asset Group
Hashrate Asset Group aspires to build the world’s first sustainable, compliant, and transparent Bitcoin-standard hashrate operating model. HAG Token allows investors to join the ecosystem and receive a real-time return on their investment. The HAG mining farm is located in the United States, and the team is composed of industry-leading professionals from Bitmain, Goldman Sachs, and TSMC.
Website: https://www.hagsto.com/
Twitter: https://twitter.com/HashrateAsset
Discord: https://discord.gg/9mkYSr23cz
Telegram: https://t.me/HashrateAsset
-
@ 9ecbb0e7:06ab7c09
2023-07-30 04:58:33The authorities of the Agüica prison, in Colón, Matanzas, warned political prisoner Félix Navarro that he will be sanctioned if he continues bringing to light the abuses committed against inmates, the opposition leader reported in an audio recording made by phone to which Martí Noticias had access.

Lieutenant Colonel Emilio Cruz Rodríguez, head of the prison, had Navarro brought to his office, where he told him that he would take measures if he did not stop his complaints about violations of the rights of those deprived of liberty, and told him that he may not speak with anyone other than his family, explained the activist, who was sentenced to nine years in prison for taking part in the protests of July 11, 2021.

"On Wednesday the 12th they took me over there and we had that discussion. He told me he did not care about the nonsense I was talking," recounted Navarro, who also served a sentence on political charges during the Black Spring of 2003.

Navarro replied to the officer that the "nonsense" was on his side, which the officer described as a "lack of respect."

Nevertheless, the political prisoner considered the conversation "respectful," except that arbitrary rules are being imposed on him, because telephone calls "are for family and friends, not only for family."

He stated that he is not intimidated by the promised reprisals: "this is our struggle, the same in the street as in here," and he pointed again to the tolerance and apathy with which the Military Prosecutor's Office acts in the face of complaints about the mistreatment of inmates by prison authorities.

"The leadership that rules Cuba and its Military Prosecutor's Office have not responded to the complaints we published on January 5 and 19 and on February 1, 2023. Silence as an answer is a method they use often when reason prevails," he declared in the call.

He also referred to "the tolerance in the Camagüey region of the illegalities and abuses against Cuban prisoners, raised formally in complaint in police report 76 of 2022 and in no. 1 of 2021," which have not been answered by the justice body.

This mistreatment "enjoys total impunity and the backing of Lieutenant Colonel Yuri Rodríguez and Colonel Mario Viltre, heads of the prosecutor's offices of the Camagüey region and the eastern territory, respectively," said the coordinator of the Pedro Luis Boitel Party for Democracy, from the prison where he is being held.

Previously, Navarro has accused the Cuban Military Prosecutor's Office of being "the backbone of the tolerance and impunity surrounding the illegalities and abuses of all kinds suffered by prisoners," and has denounced that the Agüica prison applies cruel and inhuman treatment to the people serving sentences in that penitentiary.

"Inmate Juan Carlos Garrote Molina, from Jovellanos, Matanzas, is still affected," noted Navarro, who had previously, on January 5, made public the situation of Garrote Molina, who is ill and "deprived of medication and medical diets."

"If one cannot count on one of the most important bodies in matters of legality, and it turns out to be a fraud, what other complaint would be needed to show that this leadership is nothing but a false, manipulative, and dictatorial system?" the dissident leader stressed.

According to Navarro, about 14 common prisoners are on hunger strike demanding transfers to prisons close to their homes, located in Holguín, nearly 600 kilometers from Agüica, where they are being held.
-
@ 9ecbb0e7:06ab7c09
2023-07-30 04:55:00Several Cubans reported on social media that the island's ATMs have begun dispensing bills printed on only one side, supposedly due to a lack of ink.

Images of the "new bills" are circulating online, and although the regime has not commented on the matter, a teacher said in a Facebook post by the influencer Edmundo Dantés that at his school he was also paid with bills that have one side blank.

"And then, unexpectedly, the ATMs start giving you invisible money, like the achievements of the Revolution," Dantés said in his post.

For his part, Felix Yasser Castillo Pelayo said that the State "robs you even at the ATM" while showing a 100-peso bill with one part blank.

Faced with the disbelief of several users of the social network, a Cuban from Santiago de Cuba said it has happened in that city and that some people have withdrawn as many as 10 notes bearing only the metallic seal in the middle.

Others said these misprinted bills could have numismatic value. Antonio Planas Ampudia explained: "Both in the numismatic sector in Cuba and abroad, people tend to collect Cuban bills and coins from the colonial era, the republic, and the present day, and they have a value depending on the buyer... That bill has a good error, and there are those who collect errors or series; it depends on taste..."

The crisis in Cuba is so widespread that it reaches almost every sector, including one as sensitive as banking. For several months Cubans have complained that the country's ATMs start the day with no cash, but they never imagined there would also be no ink for the bills.

It is unknown whether banks or businesses will accept these half-printed, half-blank notes. Some angrily claim it is yet another scam by the regime against the country's population.
-
@ 9ecbb0e7:06ab7c09
2023-07-30 04:50:26As of Saturday, July 29, Cuban citizens need an airport transit visa if they wish to pass through a German airport (international transit zone) to continue on to a third country (outside the Schengen area), Germany's embassy in Havana announced in a statement published on its official channels.

The diplomatic mission, however, clarified that if certain conditions are met, an airport transit visa is not necessary.

Cubans who hold a valid Schengen visa, a national visa for a long-term stay, or a residence permit issued by one of the member countries will not have to apply for the document.

Nor will those who have a valid visa for Japan, Canada, or the United States, or a residence permit issued by Andorra, Canada, Japan, Monaco, San Marino, or the United States that guarantees the holder's unconditional readmission, have to request transit permission.

Germany will not require a transit visa from those who hold a Cuban diplomatic passport, are family members of European Union citizens, or are aircraft crew members who are citizens of a state that has signed the Chicago Convention on international civil aviation.

The German embassy's statement stresses that "the holder of an airport transit visa may not leave the transit zone of the German airport" and "strongly" recommends "finding out in advance exactly how the trip will proceed, and asking the travel agency or airline whether it is necessary to change terminals."

"If during the trip it becomes necessary to enter the jurisdiction of the Schengen states (for example, to move to another terminal building or to make a connection for an internal Schengen flight), Cuban citizens need a Schengen visa," it states.

Appointments to apply for a visa are booked exclusively through the Embassy's website, within the appointment booking system set up for that purpose.

The statement adds that it should be kept in mind that demand for these appointments is very high. "That is why you must take care to make a reservation in good time. The application can be submitted as early as six months before the planned travel date. In any case, processing will only take place if an appointment has been obtained beforehand."

Attached to the statement is all the documentation required for Cubans to apply for the transit visa through Germany, at a cost of 80 euros payable in cash. The fee for children between six and eight years of age is 40 euros.

Last week it emerged that the number of Cubans applying for asylum in Germany increased eightfold during the first half of 2023 compared with the same period of 2022, from 73 to 607.

According to the German newspaper Bild, which cited a spokesperson for the German Interior Ministry, Cubans were using a scheme that consisted of buying a plane ticket to a destination for which they do not need a visa, for example Belgrade (Serbia) or Dubai (United Arab Emirates), with a layover in the German city of Frankfurt. There, where transit passengers do not need a visa, they would present themselves to the police and request asylum.

According to the spokesperson, in 2022, 302 Cubans were identified who used this transit privilege primarily to request asylum. He added that "not even half" of these Cubans followed the regular route, that is, they did not show up at the corresponding migration office center "after expressing their wish for asylum to the Federal Police" at the airport and after their data was registered. The newspaper noted that around 300 Cubans disappeared in this way.
-
@ bcea2b98:7ccef3c9
2023-07-29 00:48:14I remember the first time I caught the neon glow of the blockchain data-stream pulsating through the old fiber-optic cables in the urban labyrinth of Neo-Tokyo. That was the year Satoshi’s dream, Bitcoin, had become the last refuge of economic freedom, our Prometheus unbound from the chains of tyrannical central banking. That was the year the world ended and started anew, all at once, like the flicker of an old digital display struggling to hold onto a single pixel of hope.
Cyber-Sentries, autonomous quantum-resistant algorithms birthed in the tech womb of Bitcoin Core, roamed the noospheric network, relentless in their hunt for anomalies that could threaten our crypto-republic. Their code was hardened against the threat of quantum decryption, a terror from the old-world that now lurked in the shadows of the darknet. Armed with Q-Diffusers, inventive tech developed to obscure any coherent superposition of qubits, the Sentries were the silent, digital paladins of our new world.
The Cyber-Sentries were the silent guardians of our cryptoverse, digital watchdogs built on an intricate framework of quantum-resistant algorithms. Brought to life in the crucible of Bitcoin Core, these Sentries were the vanguards of Satoshi's legacy, standing watch against potential threats.
Imagine a predator of the old-world, a wolf maybe, that would tirelessly patrol its territory, sniffing out any intruder that dared to violate its space. A Cyber-Sentry was much like that, except its territory was the vast terrain of the Bitcoin network, and its sense of smell was replaced by a near-infallible ability to detect irregularities and potential threats in the data streams.
These Sentries were powered by an advanced AI system, coded to learn and adapt to evolving threat landscapes. Every anomaly detected was analyzed, its signature extracted and stored in a vast neural repository, a cerebral vault of threat patterns. The Sentries learned from each encounter, evolving their defenses with every battle fought.
Each Sentry was armed with Q-Diffusers, state-of-the-art technology designed to disperse any coherent superposition of qubits. These Q-Diffusers would essentially introduce a deluge of quantum noise into the system of any would-be attacker, rendering their quantum computing capabilities useless. This was their primary defense against the quantum decryption threat, their shield against the spectral specter of the old-world quantum terror.
In action, the Cyber-Sentries were like specters, a ghost in the machine. Unseen, unheard, they roamed the neural pathways of the network, ceaselessly vigilant. When a threat was detected, they would swarm, a lightning-fast, coordinated response that left adversaries disoriented and defeated. They were not just defenders; they were also deterrents, their formidable reputation alone enough to ward off many would-be attackers.
Ever-present, though unseen, was the hum of the Lightning Network. Its evolution had exceeded even the wildest predictions of the cypherpunk prophets. Beyond mere instantaneous transactions, it now powered our communications, our commerce, our very lives. Its channels were our veins, the satoshis our lifeblood, coursing through the body of the new age, offering near-infinite scalability, an economic pulse for every living soul.
In this cryptic symphony, a new tech marvel played the sweetest tune. The Merkle Sanctum, an advanced shield technology, made the Bitcoin network impervious to attacks. At its core was the labyrinthine, self-regenerating maze of hash-based proof functions. Each function was an arcane spell in itself, casting away would-be invaders into oblivion. To attempt a breach was to confront infinity itself, a Sisyphean task for even the most formidable foe.
Merkle Sanctum was the magnum opus of cryptographic defense in the Bitcoin network, a technological marvel that made it nigh-impregnable to attacks. Named in honor of Ralph Merkle, one of the godfathers of modern cryptography, the Sanctum was more than just a defensive system. It was a monument to the boundless ingenuity of the human mind, a labyrinthine fortress constructed in the abstract plane of mathematical probabilities.
At its core, the Merkle Sanctum functioned as a self-regenerating maze of hash-based proof functions. These functions were cryptographic constructs, sequences of mathematical operations designed to protect the network. They were like invisible walls in the Sanctum, each one unique and insurmountable.
The true genius of the Sanctum was in its dynamic architecture. Each hash-based proof function was not static; it evolved with every transaction, every second that passed in the Bitcoin network. This made the Sanctum a dynamic, ever-changing labyrinth. Its walls shifted, its passages morphed, its structure regenerated in the blink of an eye. To an attacker, it presented a puzzle that was infinitely complex and perpetually changing.
This dynamic nature was powered by the pulse of the Bitcoin network itself. Each transaction, each data packet, each heartbeat of the network served as a seed of change, a catalyst for the regeneration of the Sanctum. It was a system in symbiosis with the network, drawing strength from the very activity it sought to protect.
Yet, despite its complexity, the Sanctum was not impenetrable. After all, the aim was not to exclude but to protect. Legitimate network participants could navigate through the labyrinth with the aid of cryptographic keys, their unique digital signatures acting as Ariadne's thread in the shifting maze. These keys allowed them to bypass the proof functions, moving through the Sanctum as if through open doors.
To invaders, however, the Sanctum presented a challenge of Herculean proportions. With its dynamic, ever-changing architecture, and near-infinite complexity, attempting to breach the Sanctum was like trying to capture a waterfall in a sieve. It was a task beyond the reach of even the most sophisticated quantum computer, a futile attempt to conquer a fortress that was as mercurial as it was formidable.
Humans huddled in the neon halo of their digital screens, their eyes reflecting the dancing symbols of hope and freedom. Pseudonymous, untraceable, they whispered to each other across the globe. Deals were brokered, wisdom shared, love declared - all in the cryptic tongue of Bitcoin. In this world on the edge of time, humanity had reclaimed its voice, and with it, the power to dream again.
The year 2140 dawned in the neural cortex of the global hivemind. In the alleyways of the silicon city, miners with their rigs, now relics of a different era, gathered for the last hurrah. These were the custodians of the cryptoverse, their ASICs humming lullabies of complex mathematical problems, their veins humming with electricity. The air was thick with hope and a sense of closure as they prepared to decipher the final hash.
The clock struck the hour, the Ledger of Eternity, a monumental quantum blockchain register, started the countdown. The world held its breath as the final block reward, a mere satoshi, the last of the 21 million, was offered up for claim. The blockchain was a spiderweb of transactions and mathematical riddles, intricate, indecipherable.
A hush fell over the data streams as the ASICs began their dance. The miners’ eyes glowed with the reflected light of the complex computations running on their screens. They could feel the pulse of the network in their blood, in the electrified air that buzzed around their rigs.
Then, in an almost anticlimactic moment, the final hash was deciphered, the block reward claimed. The last satoshi nestled itself in a digital wallet. The entire cryptoverse erupted in silent celebration, a solitary firework blossoming in the inky black of the datastream.
For the first time since its inception, Bitcoin existed without the promise of a block reward. Yet, it did not falter. It stood resilient in its quiet dignity, an economic lodestar in the chaotic quantum sea. For what had begun as an incentive mechanism was now a symbol of freedom, a testament to a world unshackled from the chains of traditional economic structures.
As 2140 rolled on, Bitcoin stood at the precipice of a new age. Without the block reward, it had transformed, yet again, to embody its core tenet of resilience. It was no longer a system driven by the promise of reward but one that thrived on the principles of decentralization and autonomy, the beacon of economic freedom in a brave, new digital world.
-
@ 57fe4c4a:c3a0271f
2023-07-30 22:52:21👥 Authors: Salvatore Ingala ( nostr:npub1lpkmxpl2zhk0w30vtdz7s64ml9644k785eggmjsjgs7wman3szzqac74n4 )
📅 Messages Date: 2023-07-30
✉️ Message Count: 1
📚 Total Characters in Messages: 4292
Messages Summaries
✉️ Message by Salvatore Ingala on 30/07/2023: A complete proposal for the core opcodes of MATT has been put together, with improved implementation and including OP_CHECKTEMPLATEVERIFY.
Follow nostr:npub15g7m7mrveqlpfnpa7njke3ccghmpryyqsn87vg8g8eqvqmxd60gqmx08lk for full threads
-
@ bcea2b98:7ccef3c9
2023-07-28 13:48:57The book "Softwar: A Novel Theory on Power Projection and the National Strategic Significance of Bitcoin" by Jason P. Lowery offers an extensive exploration of Bitcoin through the lens of strategic and cybersecurity, as opposed to a purely fiscal one. It presents a novel theoretical paradigm for assessing the potential strategic national implications of Bitcoin, viewed as an electro-cyber defense technology as opposed to a decentralized electronic cash system.
The following are some pivotal arguments and possible inaccuracies from the standpoint of a Bitcoin developer:
Proof-of-work
The proposition that Bitcoin, via its proof-of-work protocol, can be perceived as an electro-cyber security framework allowing for the exertion of tangible power in the digital domain. Lowery posits that Bitcoin equips individuals with the capacity to enforce severe physical consequences (represented in wattage) on adversarial entities attempting to exploit them via software. This viewpoint conceptualizes Bitcoin as a mechanism for power projection, enabling individuals to establish and maintain an untrusting, permissionless, egalitarian, and decentralized dominion over bits of information, provided they have the will and capability to wield physical power to safeguard it (Page 245).
-
The critique here is that, while this is an intriguing perspective, the primary purpose of Bitcoin is to operate as a decentralized digital currency. The power projection the author describes is an ancillary outcome of Bitcoin's decentralization and inherent security, rather than its primary intent. The function of Bitcoin's proof-of-work algorithm is to fortify the network against double-spending and other forms of attack, not to project power in a geopolitical context.
-
The book also proposes that Bitcoin could upheave human power-based dominance hierarchies from their foundational roots (Page 374). While it's undeniable that Bitcoin has the potential to disrupt conventional financial systems and power structures, it was not designed to be an instrument for power projection in the same way that military or economic resources are. Bitcoin is a tool for financial sovereignty and privacy, not a weapon for geopolitical power struggles.
-
The book seems to equate the physical power utilized in Bitcoin's proof-of-work algorithm (i.e., the energy expended to mine Bitcoin) with the notion of power projection in a geopolitical environment. While it's true that Bitcoin mining demands substantial energy resources, it is distinctly different from the concept of power projection, which usually involves employing military or economic force to influence the actions and behavior of other nations.
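Since the critiques in this section hinge on what proof-of-work actually does, a minimal toy sketch may help: miners repeatedly hash a candidate block header with different nonces until the hash falls below a difficulty target, and anyone can verify the winning nonce with a single hash. The sketch below is a simplification in Python (a toy header string and a very easy target); real Bitcoin hashes an 80-byte binary header against a far harder, dynamically adjusted target.

```python
import hashlib

# Toy proof-of-work in the spirit of Bitcoin's: find a nonce such that the
# double-SHA-256 of a (simplified) header is below a target. Real Bitcoin
# hashes an 80-byte binary header and uses a much harder target; this only
# illustrates why producing the work is costly and checking it is cheap.

def double_sha256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header_prefix: bytes, target: int) -> tuple[int, bytes]:
    nonce = 0
    while True:
        digest = double_sha256(header_prefix + nonce.to_bytes(8, "little"))
        if int.from_bytes(digest, "big") < target:
            return nonce, digest
        nonce += 1

# Very easy toy target (top 16 bits must be zero) so this finishes quickly.
header = b"prev_block_hash|merkle_root|timestamp|"
target = 1 << (256 - 16)
nonce, digest = mine(header, target)
print(f"nonce={nonce}, hash={digest.hex()}")

# Verification is a single hash: anyone can check the claim instantly.
assert int.from_bytes(
    double_sha256(header + nonce.to_bytes(8, "little")), "big"
) < target
```

That asymmetry (expensive to produce, trivial to verify) is what makes rewriting transaction history costly, which is the security property the proof-of-work algorithm is actually designed to provide.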
Bits and Value
The book asserts that the bits within Bitcoin can symbolize any kind of valuable data that individuals desire it to represent, including but not confined to financial information (Page 375). It further implies that once we have deduced how to keep financial bits of data physically secure against attack, we have in essence figured out how to protect all types of information from physical attack (Page 34).
-
While it is indeed factual that Bitcoin's blockchain can be employed to store and transmit non-financial data, it's crucial to understand that this is not its primary function. Bitcoin was developed as a peer-to-peer electronic cash system, with the primary objective of facilitating financial transactions without necessitating a trusted third party. The data that Bitcoin primarily processes is transaction data, i.e., detailing who transferred how much Bitcoin to whom.
-
The capability to store non-financial data on the Bitcoin blockchain is more an unintended consequence of its design than a fundamental feature. This is accomplished by using specific fields in the transaction data to store arbitrary data, but the practice is generally discouraged because it can result in blockchain bloat. The Bitcoin blockchain is not intended to function as a general-purpose data storage system, and employing it as such can induce inefficiencies.
-
Storing non-financial data on the Bitcoin blockchain does not offer any additional security advantages over storing financial data. The security of the Bitcoin blockchain is derived from its decentralized, proof-of-work consensus mechanism, which ensures that no single entity can control or manipulate the transaction history. This security extends equally to all data housed on the blockchain, whether it's financial or non-financial.
Strangler Pattern
The book suggests that Bitcoin could serve as a "strangler pattern" replacement of our legacy internet infrastructure with a modern architecture that isn’t as clearly susceptible to systemic exploitation and abuse (Page 376). The book further intimates that Bitcoin could operate as the preeminent mechanism employed by digital-age societies to map or anchor elements of the old internet version to the revamped version of the internet (Page 322).
-
While Bitcoin indeed introduces a new, decentralized paradigm for financial transactions and data storage, it isn't designed to supplant the entire internet infrastructure. The internet is an extensive and multifaceted system supporting a wide variety of applications and services, from email and web browsing to video streaming and cloud computing. In contrast, Bitcoin is a specific application that operates on the internet, designed to enable peer-to-peer financial transactions.
-
The claim made by the book appears to conflate the concept of a decentralized financial system (which Bitcoin offers) with the idea of a decentralized internet (which is a much broader and more intricate concept). While there are projects and technologies aiming to foster a more decentralized internet (such as InterPlanetary File System (IPFS) and various blockchain-based "Web 3.0" initiatives), Bitcoin is not one of them.
-
The book's suggestion that Bitcoin could serve as the dominant mechanism for tying the legacy internet to a new version of the internet is speculative and isn't substantiated by current technological realities. While it's true that blockchain technology (the underlying technology of Bitcoin) has potential applications in areas like decentralized identity systems and domain name systems, these are still largely experimental and have not been widely adopted yet.
Kinetic Power
The book proposes that Bitcoin could significantly transform the way humans vie for control over valued resources, effectively resetting the global balance of power in a manner akin to the profound changes brought about by full-scale kinetic world wars (Page 376). The author further proposes that Bitcoin could revolutionize how digital-age society perceives physical confrontation (Page 370). Moreover, it's implied that global adoption of an electro-cyber form of physical power competition, facilitated by proof-of-work technologies like Bitcoin, could lead to the genesis of a completely novel form of polity (Page 385).
-
It is undeniable that Bitcoin holds the potential to disrupt conventional financial systems and power structures. However, juxtaposing its impact to the scale and consequences of world wars is hyperbolic and speculative. World wars resulted in massive loss of human lives, extensive destruction, and substantial shifts in geopolitical boundaries and power structures. In contrast, Bitcoin is a digital currency that operates on a decentralized network. While it can indeed influence financial systems and potentially disrupt some power structures, particularly those related to finance and banking, it is incapable of causing physical destruction or loss of life.
-
The book's suggestion that Bitcoin could spawn a completely new form of polity is speculative and isn't supported by current realities. Although Bitcoin and other cryptocurrencies can indeed influence how people transact and store value, they don't intrinsically alter the fundamental nature of societal organization or governance. They may contribute to shifts in power dynamics, particularly in the financial sector, but they do not supplant the need for traditional forms of governance and societal organization.
-
The book also seems to conflate the concept of power competition in a geopolitical context with the concept of power competition in the context of Bitcoin mining. Bitcoin mining does involve a form of competition, as miners compete to solve complex mathematical problems to add new blocks to the blockchain and receive Bitcoin rewards. However, this is far from the concept of power competition in a geopolitical context, which involves nations competing for resources, influence, and dominance on the global stage.
Projecting Power
The book asserts that Bitcoin, through its proof-of-work technology, could help society secure itself against systemic exploitation of computer networks (Page 378). The author argues that Bitcoin could empower individuals to physically secure their digital information, by projecting physical power in, from, and through cyberspace to impose severe physical costs on those who exploit our computing systems (Page 347). The author further speculates that national adoption of Bitcoin could be conceptualized as the employment of an electro-cyber militia to safeguard and defend a nation’s valuable digital information (Page 360).
-
While it's true that Bitcoin and other cryptocurrencies can provide a level of security and privacy not available in traditional financial systems, it's crucial to underscore that they also introduce new vulnerabilities and potential for misuse. Bitcoin transactions are irreversible, which can make them appealing for fraudulent activities. Bitcoin wallets can be lost or stolen if not adequately secured. The security of the Bitcoin network itself hinges on its decentralized nature and the computational power required to alter the blockchain, but individual users are still susceptible to scams and hacking.
-
The book's suggestion that Bitcoin could serve as a form of "electro-cyber militia" is speculative and not universally accepted within the Bitcoin community. Although Bitcoin does offer a method for individuals to secure their financial transactions against interference from third parties, it is not designed to shield against all forms of cyber exploitation or to serve as a form of national defense.
-
In conclusion, Softwar presents an unconventional viewpoint on Bitcoin's potential influence on national strategic security, proffering that Bitcoin could serve as an effective tool for power projection, system defense, and even societal reorganization. However, these propositions, while intriguing, often veer towards the speculative and seem to be somewhat misaligned with the predominant understanding of Bitcoin's primary purpose and functionality within the field of computer science.
-
While this book undoubtedly offers a unique and thought-provoking perspective on Bitcoin's wider implications, it is crucial to approach these theories with a discerning and critical eye. It is always beneficial to juxtapose such theories with the broader consensus within the Bitcoin development and computer science communities, to gain a more rounded understanding of this remarkable technology. Ultimately, Bitcoin continues to evolve and its full potential, while yet to be fully realized, remains a subject of intense debate and exploration within the computer science community.
-
-
@ 9ecbb0e7:06ab7c09
2023-07-30 04:47:06Nicolás Petro, the eldest son of Colombian President Gustavo Petro and a deputy in the Assembly of the Department of Atlántico, was detained this Saturday by the Attorney General's Office, which is investigating him for possible money laundering and illicit enrichment, the institution announced, EFE reported.

In addition to Petro's son, his ex-wife Daysuris Vásquez was arrested; at the beginning of the year she accused him of receiving a large sum of money from a drug trafficker for the campaign of the now-president, and of keeping that money.

The Attorney General's Office reported that the arrests took place "today, July 29, 2023, at around 06:00 hours, in compliance with the order issued by the 16th Municipal Criminal Court with Guarantees Control Function of Bogotá."

The institution added in a statement that Nicolás Fernando Petro Burgos was detained "for the crimes of money laundering and illicit enrichment, and Daysuris del Carmen Vásquez Castro for the offenses of money laundering and violation of personal data, for acts occurring from 2022 to date."

"Those arrested will be brought before a municipal criminal judge with Guarantees Control Function, who will be asked to validate the search, arrest, and seizure of material evidence," it added.

Also, "charges will be filed for the crimes already cited and a measure restricting liberty will be requested," according to the Attorney General's Office.

On March 21, the Attorney General's Office announced that it was criminally investigating Nicolás Petro over his alleged meetings with drug traffickers in prison and for "possible money laundering."

At the beginning of that month, the ex-wife of Petro's son stated in an interview with the magazine Semana that the drug trafficker Samuel Santander Lopesierra, alias "El hombre Marlboro," gave Nicolás Petro "more than 600 million pesos (about 153,000 dollars today) for his father's campaign."

"That never legally reached the campaign because he kept that money, and other sums as well," the woman added, mentioning that Nicolás Petro also received 200 million pesos (about 51,000 dollars) from the controversial businessman Alfonso "Turco" Hilsaca, which likewise never made it to the campaign.

For his part, President Gustavo Petro stated this Saturday that he "will not intervene in or pressure" the decisions of the Attorney General's Office following his son's arrest.

"As a person and a father it pains me greatly to see so much self-destruction and to see one of my children go through prison; as President of the Republic I will ensure that the Attorney General's Office has every guarantee from me to proceed according to the law," the president wrote on his Twitter account.

"I wish my son luck and strength. May these events forge his character and allow him to reflect on his own mistakes. As I stated to the attorney general, I will not intervene in or pressure his decisions; let the law freely guide the process," he added.
-
@ c80b5248:6b30d720
2023-07-07 00:17:30A few months ago I made a simple post that deserves a longer explanation: nostr:note1fdu9uu2rjjd0zusgpkpyv8c6qvjpx78z974qkhj7v3mwlac2st3sxywf8h
What I really mean
Python has been wildly successful over the last 13 years. Check out its rise in usage shown by this chart from a Medium article by Vahid Vaezian.
https://miro.medium.com/v2/resize:fit:4800/format:webp/1*XBOqPd5_4tKF2hf8iAV0vQ.png
Obviously Nostr isn’t literally Python. Nostr is a language-agnostic protocol and Python is a programming language. However, Nostr and Python embody a similar spirit and a similar culture that I think is bound to lead to the long-term success of Nostr in the same way we have seen Python explode over the last two decades. What makes me say this?
3 key traits of Python (and Nostr)
Before continuing: If you are familiar with Python, but new to Nostr consider reading the nostr.how - What is Nostr and nostr.how - The Nostr Protocol for a relatively quick introduction to the protocol.
1. Open by default and adaptable
Both Nostr and Python are open by default.
Python is a completely open-source language and many of the most powerful packages built within the ecosystem are open-source, even when development is funded by large companies. Like many programming languages, the download and usage of Python is not permissioned.
Nostr is a protocol. Additions to the protocol are openly debated and decided upon by the community on the Nostr GitHub repository. What constitutes the definition of the protocol is entirely determined by the community and their usage.
The protocol needs to be cohesive enough so that the bulk of clients and relays can reliably interact, but there is room for flexibility and varied implementation within the ecosystem. In a few instances, there have been disagreements on different implementations of Nostr Improvement Proposals (NIPs). Because usage of the protocol is open and does not require consensus, clients and relays are free to implement contentious additions. This type of fluid consensus and openness to change is something that cannot exist in other decentralized systems that require rigid consensus between nodes (e.g. Bitcoin).
2. Built to be flexible via composability
Both Python and Nostr are designed with flexibility in mind.
Python’s design choices allow it to excel in a wide variety of use-cases from data science to web development. The diversity of the community and the availability of new packages make using Python even more desirable, as you can do nearly everything (including running code from other languages when needed). Over time the Python ecosystem has become a robust set of classes within the standard library and countless other packages that can be composed by users into a variety of programming styles and projects.
Nostr is often mistaken as only a social media “platform”, which couldn’t be further from the truth. The “os” in Nostr, which stands for “other stuff”, is a key distinction between the Nostr protocol and many of its counterparts. Over its first couple of years of existence the “other stuff” has been mostly metadata related to Twitter-like platforms, but more and more we are starting to see new kinds of events being transmitted over the Nostr protocol. A few notable examples include zaps, highlights, location data, marketplace data, and ephemeral encrypted data.
Nothing about the Nostr spec dictates how a client will piece together these types of events to create a user experience. The earliest Nostr client implementations have primarily mimicked existing social media platforms because of the simplicity of the medium - a twitter-clone only requires short text notes and follow lists to be functional. Similarities to Twitter have driven the mainstream narrative, but the Nostr ecosystem is beginning to explode.
Community-based approaches (like Reddit) are beginning to take off, spurred on by Reddit’s failures to meet developer and user needs. Even more interesting applications that don’t exist in the centralized landscape are emerging as well. Projects like stemstr, Highlighter, and Nostrocket show some of the different kinds of tools that can be built on the Nostr spec. These are all new ways of sharing information socially.
The new project nsecBunker by Pablo uses Nostr in a completely different way to build a new sort of “password management” system for cryptographic signatures in Nostr. This is not a social application, but a completely new security model achieved by composing existing metadata events in a new workflow that clients can adopt.
Just as the flexibility of Python and the diversity of Python packages leads to even more diversity, the flexibility of composition allowed by Nostr events and the dumb relay, smart client model are going to lead to an explosion of Nostr client implementation styles.
3. Accessibility over performance
Nostr and Python both favor developer accessibility over raw performance.
One of the best compliments a Python developer can receive is to have their code referred to as "Pythonic". Executing `import this` in a Python interpreter returns the core of what it means for code to be pythonic:

```python
'''
The Zen of Python, by Tim Peters

Beautiful is better than ugly.
Explicit is better than implicit.
Simple is better than complex.
Complex is better than complicated.
Flat is better than nested.
Sparse is better than dense.
Readability counts.
Special cases aren't special enough to break the rules.
Although practicality beats purity.
Errors should never pass silently.
Unless explicitly silenced.
In the face of ambiguity, refuse the temptation to guess.
There should be one-- and preferably only one --obvious way to do it.
Although that way may not be obvious at first unless you're Dutch.
Now is better than never.
Although never is often better than *right* now.
If the implementation is hard to explain, it's a bad idea.
If the implementation is easy to explain, it may be a good idea.
Namespaces are one honking great idea -- let's do more of those!
'''
```
These principles are almost certainly a key part of why Python has remained one of the most accessible languages to aspiring coders (including myself) and has experienced such rapid growth over the past decade. What Python lacks in raw computational efficiency it has made up for by remaining simple, readable, beautiful, and practical. This brought in a vast community of people who are passionate about solving problems and are willing to learn new tools and experiment to do so.
I see very similar qualities in the Nostr development community. Though the Nostr specification does get more complicated with the addition new NIPs, it cannot change the core properties of the protocol that make it easy to understand. I think Will (nostr:npub1xtscya34g58tk0z605fvr788k263gsu6cy9x0mhnm87echrgufzsevkk5s) said it best when he tweeted
nostr is just schnorr-signed json blobs of different kinds (profile, contact list, post, like, retweet) that you can store locally
https://twitter.com/jb55/status/1594000865568129025?s=46
Fundamentally, Nostr is just entities (clients and relays) passing around these JSON blobs over a websocket, which you can think of as effectively just a bidirectional pipe of data between two entities.
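To make that concrete, here is a minimal sketch, in Python, of how a NIP-01 kind-1 text note is assembled: the id is the SHA-256 hash of a canonical JSON array, and the event travels to a relay inside an ["EVENT", ...] frame over the websocket. The pubkey and signature below are placeholders, since producing a real BIP-340 Schnorr signature requires a secp256k1 library that is omitted here.

```python
import hashlib
import json
import time

# Minimal sketch of a NIP-01 text note ("kind 1") event. The id is the
# SHA-256 of a canonical JSON array; the sig field requires a BIP-340
# Schnorr signature over that id, which is left as a placeholder here.

pubkey = "a" * 64            # hex-encoded x-only public key (placeholder)
created_at = int(time.time())
kind = 1
tags = []                    # e.g. [["e", "<id of the note being replied to>"]]
content = "nostr is the new python"

serialized = json.dumps(
    [0, pubkey, created_at, kind, tags, content],
    separators=(",", ":"),
    ensure_ascii=False,
)
event_id = hashlib.sha256(serialized.encode()).hexdigest()

event = {
    "id": event_id,
    "pubkey": pubkey,
    "created_at": created_at,
    "kind": kind,
    "tags": tags,
    "content": content,
    "sig": "<64-byte schnorr signature of id, hex>",  # placeholder
}

# A client publishes the event to a relay as a JSON array over the websocket:
publish_frame = json.dumps(["EVENT", event])
# ...and asks for notes back with a subscription request:
subscribe_frame = json.dumps(["REQ", "my-sub", {"kinds": [1], "limit": 10}])
print(publish_frame)
print(subscribe_frame)
```

Everything a relay stores and serves is some variation of this one structure, which is a big part of what keeps the protocol easy to implement.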
The composition of these JSON blobs and their formulation by clients can range in complexity, but the simplest implementation truly is quite simple and easy to understand. Compare this to the specifications for a product like BlueSky (which doesn’t extend beyond being a twitter clone) and it is not hard to understand why they don’t have a trove of would-be developers knocking on their door, as Jack (nostr:npub1sg6plzptd64u62a878hep2kev88swjh3tw00gjsfl8f237lmu63q0uf63m) lamented in a now infamous post:
nostr:nevent1qqsgxtjp8yftcqvx69e2dgk0mk4lmafskpn8wn7lqwpdw5jr07ylsdca2qxrl
Why you should care
The success of Python is not a fluke - the three characteristics listed above have been key drivers in the massive adoption rate of Python by new programmers. Other programming languages like Rust and Julia have attempted to solve some of the performance limitations inherent in Python, but will always struggle to gain the same level of traction due to the vast network effect of the Python package ecosystem. This is not to say languages like Rust won’t be successful - in many cases, Rust may be the better choice for certain projects in 2023. However, Python’s dominance shows the power and impact of new developers consistently entering the ecosystem due to its broad utility.
Just within the past few months a new project, Mojo, has garnered lots of excitement because it promises to massively improve on Python’s performance while also being able to capitalize on the network effects associated with Python, since it is a superset of Python and should ultimately be able to run any existing Python code. Will Mojo overtake Python? I don’t know, but I believe the only way it has a chance is if it maintains all of the above characteristics of Python and delivers the increased performance it promises.
Nostr is positioned similarly to Python in relation to competing protocols. Until another decentralized data protocol comes around that is open, flexible, accessible, and more efficient, Nostr is bound to win.
-
@ d030bd23:96435da9
2023-07-30 03:25:07Test
-
@ 75bf2353:e1bfa895
2023-07-06 19:46:08The first meetup was a success. I met one person who lives near me over coffee. He even runs a node and a Nostr relay. What are the odds? I assume they're really low. According to Clark Moody's dashboard, there are about 17,000 available nodes. I haven't verified this number, but this is the first person I've met in the wild who runs a bitcoin node.
This gives me hope. I know starting a meetup on a little-known Internet protocol has a snowball's chance in a pizza oven of working, but I'm rationally optimistic. I saw a lot of plebs post their npubs on Saturday because the Twitter chief pissed them off. I don't know why, but it's indicative of the importance of permission-less computer networking. I used to rub my peach fuzz and think the Internet would give people power back from the corporations. TV was just a way to sell soap, drugs, and cars. The corporations didn't even let you use the word "ass." Then they had experts like Colin Powell tell us Iraq had weapons of mass destruction. I didn't believe him because I watched the videos of trucks and warehouses, but evidence of trucks and warehouses is not evidence of weapons. Who was the conspiracy theorist? Me, or Colin Powell? I thought the Internet would fix this. I was active on MySpace and there were no ads.
Then Facebook happened. Then Instagram, which had no ads at first, until it got bought by Facebook. Then Facebook experimented with affecting people's moods. Things got strange. In 2016, the Donkey team accused Russians of influencing the elections. [Cambridge Analytica did some fuckery](https://en.wikipedia.org/wiki/Cambridge_Analytica#Elections) I don't fully understand, allegedly influencing elections in the US, Australia, India, Kenya, Malta, Mexico, and the UK. In 2020, the Elephant team accused the elections of being rigged. So for about 8 years, a significant portion of the United States has not trusted the results of an election. What do you think the odds are of distrust in this upcoming election? I wouldn't bet many sats on it going smoothly.
"A third point is the confirmation of the central role that former spies played in October 2020 in framing the Hunter Biden story in a way that made it easier for Twitter and Facebook to justify their censorship." (Wall Street Journal)
When you zoom out, you realize this is not just about American politics. Similar things are happening all over the world. One day, you see riots in Sri Lanka. A few months later, riots in France. Many people blame free speech, but what is free about the algorithms? Facebook also has the right to write code. Code is speech, and cypherpunks write code. In my opinion, Facebook is bad speech. The answer to bad speech is more speech. Nostr is more speech, and better speech in my opinion.
"Cryptography is a surprisingly political technology. In recent years, it has become more so, with the controversy surrounding the Government's Clipper chip, the FBI wiretap legislation, export controls on cryptographic software, and the balance of power between a government and its people. Historically, cryptography has been used mainly by governments for diplomatic and military traffic. But with the coming of the information age, ubiquitous personal computers, modems, and fax machines, this is changing. With an emerging global economy depending more and more on digital communication, ordinary people and companies need cryptography to protect their everyday communications. Law enforcement and intelligence agencies want access to all of our communications, to catch people who break the law, and detect threats to National Security. Civil libertarians want to keep the Government out of our private communications, to protect our privacy and maintain a healthy democracy."
--Phil Zimmermann
Nostr Fixes This
I don't like talking about politics because so many people are programmed to get triggered when you mention certain hot-button topics, but it's possible to use cryptography in a way that preserves freedom. The Twitter files revealed government censorship on the blue bird app. This is relevant to us because of public key cryptography: there are no Twitter files on Nostr. The government would need to censor all the relays, not just a single corporation that is too afraid to pay for advertising on shows that use the word "shit." It's time we start thinking of using technology in terms of the Sovereign Individual. Normies debate whether or not a web developer or graphic designer should be compelled to write code or create art they don't agree with. Meanwhile, I used free and open source software to make my own webpage, and if you don't know HTML, ChatGPT does. I made most of the photos using Stable Diffusion. Apple can ban zaps from their apps, but can they ban SamSamskies from writing code? https://notazap.lol/
Until nostr, we didn't have freedom on corporate-controlled, government-censored social media. This is not a blog about politics. It's about communication. I don't want the red team or the blue team to control what the public may or may not see. I do not wish to be booted from Meetup.com again because I do not identify the color of my skin, my email address, or the name my mother gave me. I'm not joining an apple pod bitcoin meetup. I want free speech and freedom from undue influence. It's not that I'm mad at Meetup.com. They are a company and can do anything they want, but so am I. There is not a real person named Meetup.com, just like my mother did not name me Blogging Bitcoin. I'm amazed that anybody showed up to this meetup at all. It was an awesome experience. It's nice to have freedom on the Internet once again.
Fix the Internet, fix the world wide web.
Snowden Is Back
Edward Snowden was exiled to Russia by the United States Government. He wrote a book titled *Permanent Record*. The US government confiscated the proceeds of that book. Now he gets zapped on a censorship-resistant communications protocol named nostr.
Some hate him. Some think he's a hero. Regardless of what you think, Snowden did not make a dime from his book, but he gets zapped for notes.
If Edward Snowden can get zapped, they can't stop us.
I would be lying if I said I didn't like NGU, but that's the least interesting thing about bitcoin.
I'm not flying to Washington. I won't write my congressman or bribe my senator to overturn the laws that allow the government to censor the proceeds of books in the United States, but I have the option to zap a man who was exiled to Russia. I'm not a drug dealer, and you probably aren't one either. Maybe it's not such a good idea to use your real name, address, and credit cards to sign up for ubiquitous websites in case they deem sending one penny to someone illegal. It might be a worse crime than growing marijuana flowers. Be careful. Don't accidentally zap the wrong citizen of the world if you have a blue check.
Says the Elephant Skeleton, "Ban the abortion pill!"
Says the Donkey Skeleton, "Guns will kill!"
"Yes, [we will not find a solution to political problems in cryptography,] but we can win a major battle in the arms race and gain a new territory of freedom for several years. Governments are good at cutting off the heads of centrally controlled networks like Napster, but pure P2P networks like Gnutella and Tor seem to be holding their own." ~ (Satoshi Nakamoto)
Bitcoin is freedom money, but freedom isn't free. You have to fight for it. You can't just send your bitcoin to BlockFi or Bitconnect 2.0 and wave your bitcoin goodbye. 👋 You must insist on self custody. Use collaborative custody if you need hand-holding. You can't expect to be an expert chart doodler and digital art curator if you do not know the difference between these two jpegs.
That's why my meetup is bitcoin only.
Nostr makes it easy to teach people proper key management with lower stakes. I think creating a nostr key and joining our meetup chatroom is good practice for holding your own bitcoin keys. If you screw it up, at least you won't lose your life savings.
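If you want to see just how low the stakes are, here is a rough sketch of generating a nostr key pair. It assumes the third-party `coincurve` secp256k1 library; real clients usually bech32-encode the results as nsec/npub strings (NIP-19), but raw hex keeps the example short.

```python
# A rough sketch of generating a Nostr key pair, assuming the third-party
# `coincurve` secp256k1 library is installed (pip install coincurve).
import secrets

from coincurve import PrivateKey

secret = secrets.token_bytes(32)  # 32 random bytes = the private key
private_key = PrivateKey(secret)

# Nostr uses x-only public keys: drop the 1-byte prefix from the
# 33-byte compressed secp256k1 public key.
public_key_hex = private_key.public_key.format(compressed=True)[1:].hex()

print("private key (hex):", secret.hex())
print("public key  (hex):", public_key_hex)
```

Lose the private key and you lose the account, same as losing a seed phrase, except here the worst case is a dead npub instead of an empty wallet.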
Eventually, I will probably capitulate and join Meetup.com like Neo went back into the Matrix. If so, I still want to use Nostr to post the location of meetups. Even cottage cheese sushi swappers should learn how to use Nostr. They can get a taste of freedom before making their next trade on rug-pull-my-crypto.exchange. They'll come back when rug-pull-my-etf.com screws them over.
Blogging Bitcoin Block Height: 797,471
**This blog is ad-free and is written using the Value4Value model.**
Bookmark my habla.news page to follow my work because email subscriptions suck.
sauces
Author's preface to the book: "PGP Source Code and Internals"
[The New Libertarian Manifesto](https://mises.org/library/new-liberty-libertarian-manifesto/html/p/468)
-
@ c4165d34:33efc5bb
2023-07-30 17:42:23YakiHonne is a Nostr-based decentralized content media protocol, which supports free curation, creation, publishing, and reporting by various media.
Introduction
Yakihonne enthusiasts were thrilled to learn about the platform's recent updates, designed to take the user experience to new heights. Eager to explore the enhancements, I embarked on a thorough test to evaluate the changes in action. In this article, I share my experiences and insights as I navigated through the revamped Login UI screen, interacted with the NIP-25 support for voting, enjoyed the convenience of the Tags feature, and more.
1. Revamped Login UI Screen:
The first noticeable update was the Login UI screen, where I was excited to find that the "login with the extension" option was cleverly greyed out. This subtle but effective change turned out to be a brilliant move for user experience. Greyed out buttons often indicate that a feature is not available under certain circumstances, preventing confusion and frustration. It was a welcoming sign that Yakihonne's team had put thought into making the platform more user-friendly.
The login screen
2. NIP-25 Support for Upvoting and Downvoting:
Next, I explored the new NIP-25 support, which enabled users to upvote and downvote articles. This feature added a significant layer of interactivity to the platform. However, during my test on the web version, I encountered a slight delay in the response when attempting to upvote or downvote an article. A quick refresh resolved the issue, but this observation could be valuable feedback for the development team to fine-tune the system for seamless voting interactions.
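For readers unfamiliar with NIP-25, a reaction is just another small event: kind 7, with "+" or "-" as the content and tags pointing at the note being reacted to. The sketch below is illustrative only, with placeholder ids rather than real values.

```python
# Illustrative shape of a NIP-25 reaction: a kind-7 event whose content is
# "+" for an upvote/like or "-" for a downvote. The ids below are placeholders;
# a real client adds pubkey, created_at, id, and sig when it signs the event.
reaction = {
    "kind": 7,
    "content": "+",  # "+" = upvote, "-" = downvote
    "tags": [
        ["e", "<id of the article event being reacted to>"],
        ["p", "<pubkey of the article's author>"],
    ],
}
```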
3. The Intuitive Tags Feature:
As a content explorer, the Tags feature quickly became my favorite. Clicking on a tag displayed all articles associated with it, allowing for effortless content discovery. Whether it was diving into technology, literature, or travel, the Tags feature offered a smooth and efficient way to find articles of interest. This enhancement fostered a sense of community, connecting users with shared interests and expanding their horizons on Yakihonne.
Tags shown in the brand colour
Immediately, all articles associated with the tag are shown
4. Verifying Profiles with NIP-05 Addresses:
One intriguing update was the option to verify profiles using NIP-05 addresses. While I was eager to give it a try, I faced some uncertainty about how to proceed with the verification process. Clearly, this feature required further guidance from the Yakihonne team, as it holds the potential to enhance user trust and authenticity on the platform. Clear instructions from the team would be invaluable in utilizing this exciting verification feature.
NIP-05 address
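For context on what the verification actually checks: per NIP-05, an identifier like name@domain is confirmed by fetching a small JSON file from that domain and making sure it maps the name to your public key. A rough sketch of that check, using only the Python standard library and placeholder values, might look like this.

```python
# A rough sketch of the NIP-05 check a client performs for an identifier
# like "alice@example.com" (name and pubkey here are placeholders).
# Per NIP-05, the domain serves /.well-known/nostr.json?name=<local-part>.
import json
from urllib.request import urlopen


def verify_nip05(identifier: str, expected_pubkey_hex: str) -> bool:
    name, domain = identifier.split("@")
    url = f"https://{domain}/.well-known/nostr.json?name={name}"
    with urlopen(url) as response:
        data = json.load(response)
    # The file maps names to hex public keys, e.g. {"names": {"alice": "3bf0..."}}
    return data.get("names", {}).get(name) == expected_pubkey_hex


# Hypothetical usage:
# verify_nip05("alice@example.com", "<your hex pubkey>")
```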
5. One-Click Sharing to Twitter:
The final feature I put to the test was the one-click sharing to Twitter. Yakihonne's team had truly hit the mark with this addition, simplifying the process of sharing articles with a wider audience. This seamless integration with Twitter provided a powerful tool for users to promote content, effectively giving Yakihonne a boost and expanding its reach beyond the platform.
6. Search Feature
Yakihonne's search functionality stands out with its exceptional speed and precision, delivering an effortless content discovery experience. The platform's efficient search algorithm quickly produces relevant results, saving valuable time for users. With its well-organized presentation of articles and support for advanced search options, Yakihonne ensures that users can easily find and access the content that aligns with their interests. This fast and intuitive search feature enhances the overall user experience, making Yakihonne a top-notch platform for seamless content exploration.
Conclusion:
Yakihonne's recent updates showcased the platform's dedication to enhancing user experience and fostering an interactive community. The greyed-out Login UI, NIP-25 voting support, intuitive Tags feature, and Twitter sharing functionality all contributed to a more enjoyable and engaging content-sharing experience. While the NIP-05 profile verification feature showed great promise, additional guidance from the team would be beneficial to fully utilize its potential. Overall, Yakihonne's updates have positioned the platform for even greater success, delighting users and cultivating an enthusiastic community of content creators.
-
@ b3590d02:0f2b1ae4
2023-07-06 18:44:20Verification: tweet
Badge implications:
- an account that joined Twitter in 2010 or earlier
- an account with fewer than 10,000 followers
Twitter user tweetious profile information:
- Joined in 2008
- Follower count: 699
- Following count: 529