-
@ 0176967e:1e6f471e
2024-07-22 19:57:47What has a nomad family, on the run from control for three years now, learned about control itself? What actually is freedom? Can it coexist with fear? With conflict? Let's forget about taxes, the police, and the state for a moment and look at freedom beyond the borders of social ideologies. Instead of searching for more answers, let's try to find out whether new questions are still hiding somewhere. It might get a little esoteric.
For more than three years, Karel has been living a minimalist life in a camper van with his wife, two children, and one dog. On the road, they started the YouTube channel "Karel od Martiny" about freedom, nomadism, anarchy, parenting, drugs, and other normal things.
You can also find him on Nostr.
-
@ eed76ece:afa98124
2024-07-22 19:10:22Changed this territory name to Import_Export today, as postings are for actual Importers, Exporters, Traders, and Wholesalers who might, someday in a few years, run into one of these posts on specifics for this or other merchandise.
Beach Sarongs, Full Size, seven prints in stock minimum of 800 pieces per print in stock. Individually packed in clear resealable plastic bags, 12 per master pack. https://globalmerchant.io/collections/sarongs
MOQ: 72 pieces, can be assorted with other goods. C.I.F. (Cost, Insurance, Freight) / D.A.P. (Delivered at Place), duty paid by Consignee. Shipped via DHL, FedEx, or UPS International Economy Air
================================
Unit Price Each in SATS = 6,575
Unit Price Each in USD = $4.44
Unit Price Each in Thai Baht = 161
================================
Country of Origin: India. Material: 100% Terivoile (Polyester).
================================
T/T, CC, BTC On Chain, Lightning. If you're coming in person to Waikiki, we can discuss alternate terms.
LINE @globalmerchant
Ask for Rich
Agents wanted especially in Bangkok, Pattaya, Phuket, Da Nang, Hanoi.
Website: globalmerchant.io
originally posted at https://stacker.news/items/618381
-
@ 6f170f27:711e26dd
2024-07-22 19:06:24https://www.journals.uchicago.edu/doi/abs/10.1086/716290
Meta-analytically, women are strongly biased in favor of female candidates for election, whereas men are slightly biased in favor of... female candidates.
originally posted at https://stacker.news/items/618382
-
@ 8e7462a8:c723a97b
2024-07-22 18:17:51```js
import React, { createContext, useContext, useRef } from 'react';
import { Toast } from 'primereact/toast';

const ToastContext = createContext();

export const useToast = () => useContext(ToastContext);

export const ToastProvider = ({ children }) => {
  const toast = useRef(null);

  // Show a toast through the single shared Toast instance
  const showToast = (severity, summary, detail) => {
    toast.current.show({ severity, summary, detail });
  };

  return (
    <ToastContext.Provider value={{ showToast }}>
      <Toast ref={toast} />
      {children}
    </ToastContext.Provider>
  );
};
```
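A minimal consumer sketch of the provider above, assuming it is exported from a `ToastContext.js` module; the component name and messages are illustrative:

```js
// Illustrative consumer: any component rendered below ToastProvider
// can trigger toasts through the shared hook.
import React from 'react';
import { useToast } from './ToastContext'; // hypothetical module path

const SaveButton = () => {
  const { showToast } = useToast();

  return (
    <button
      // severity is a PrimeReact level: 'success' | 'info' | 'warn' | 'error'
      onClick={() => showToast('success', 'Saved', 'Your changes were saved.')}
    >
      Save
    </button>
  );
};

export default SaveButton;

// The provider must wrap every consumer, e.g.:
// <ToastProvider><SaveButton /></ToastProvider>
```

Note that `useToast()` returns `undefined` when called outside a `ToastProvider`, so destructuring would throw; keep the provider near the app root.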
-
@ be39043c:4a573ca3
2024-07-22 18:14:55Growing up in Japan, we always had a rice cooker at home and didn't know any family that didn't have one. When I left home for college, I bought one.
When I moved to the US, I naturally thought I must have a rice cooker, and bought a cheap one.
At one point, the rice cooker my husband and I shared broke. I looked up how to cook rice in a pan. To my surprise, the rice tasted better cooked in a pan over the gas stove. We have been without a rice cooker since, for a good ten years.
I recently read that someone in Japan cooked rice on the stove for the first time because an earthquake had cut off his electricity. He said, "What's the point of a rice cooker if we can cook better without it?"
Well, a rice cooker keeps the rice warm. But it is not the truly necessary device that most of us Japanese believe it to be.
Companies like Panasonic and Zojirushi make a lot of money selling rice cookers to us, and many people are employed there. Maybe this has been the point of the rice cooker, and of the 20th century: growth, growth.
I am not saying it was all bad. Still, I prefer to live in a way that unveils any false belief that a certain product is necessary to do something, and to make my choices accordingly.
-
@ 8e7462a8:c723a97b
2024-07-22 18:14:33 -
@ cc05487a:8eddd6a8
2024-07-22 17:51:32I've shared this story a couple of times, and I often have second thoughts about sharing it again. But if sharing it here can save ONE life, it's worth it. This issue is too important to stay silent about, so I want to raise awareness.
In November 2018, I was diagnosed with stage T2b intermediate prostate cancer. To make matters worse, it was a genetically aggressive form, meaning there was a high likelihood of metastasis if I didn't take action.
Hearing the doctor deliver this news was devastating. I was stunned, caught off guard, and I broke down. Given the context of the preceding 18 months, it felt like a knockout blow:
- My mother passed away in June 2017.
- My father passed away in April 2018.
- My uncle (my mom's older brother) passed away in August 2018.
With all the grief I'd been dealing with, my first thought upon receiving the diagnosis was, "Am I next?"
Reflecting on it, I realized I experienced a microcosm of the stages of grief. I initially tried to deny the diagnosis, but there it was, staring me in the face. I got angry and sought a second opinion, hoping the first doctor was wrong. But the facts were undeniable. Facts don't care about your fears. As anyone close to me can tell you, I became deeply depressed. I was no fun to be around during the holidays. But then I realized that wallowing in fear and sadness was hurting me and those around me. I was still scared, but I knew I had to take action.
My wife was instrumental in helping me through this difficult time. I couldn’t have done it without her.
In the weeks following my diagnosis, I consulted with my urologist to devise a plan of action. He strongly recommended surgery over radiation. After doing some research, I understood why: given the aggressive nature of my cancer, radiation would leave a lingering uncertainty about recurrence. And if the cancer returned, I would have no options left. Despite the fear, surgery was the best option. I had to keep reminding myself of one thing:
I was going to have the final word on this cancer. I was determined to remove it by any means necessary and move on with my life. Was I still scared? Absolutely. But the sooner I got treatment, the better.
My urologist referred me to Swedish Urology for treatment, saying, "I'm good, but they're better, and I think you need to see them for treatment." More blood tests and planning ensued throughout December. In January, I met with my surgeon and got on the schedule.
A few months later, I underwent surgery to remove the cancer. If you're curious about the procedure, look up "da Vinci robotic surgery" online. My wife and I had to drive through the final (and worst) snow of the year on the day of the surgery, but everything went as planned. My wife stayed with me during pre-op, chatting with the anesthesiologist and nurses while they completed the final checklists. I wasn't too nervous at first, but seeing her walk away to the waiting area made everything suddenly real. In hindsight, I probably should have taken the valium they offered.
The surgery went well overall, despite a few hiccups that extended my hospital stay by a night. The pain was intense; Oxycodone helped, but I didn't enjoy the hallucinations it caused (like letters on the dry erase board melting down the board), so I minimized its use.
My wife never left my side, sleeping on an uncomfortable cot next to me. She's incredibly strong and courageous. Maybe I should get her something special for her birthday... perhaps those diamonds she was eyeing the other day.
A week later I had my first follow-up and reviewed the final pathology report. The results were good: no metastasis. I was still sore and recovering, but getting better every day. Was I out of the woods yet? Mostly, yes. There's some concern that some cancer cells might have been left behind. The cancer had advanced between my initial biopsy and final pathology report, from a Gleason score of 4+3=7 to 4+5=9. Opting for surgery was the right call. If follow-up radiation is ever needed, it will be minimal and focused.
So why am I sharing all of this? It's not about seeking attention. I'm sharing this to shake a few people out of complacency. If you're reading this and have concerns, don't ignore them. Looking back, I realize there were warning signs I dismissed as mere aging. Those ads on TV promising miracle supplements? Ignore them. They should be against the law...! They almost cost me dearly. Ask yourself:
- Are you over 40?
- Do you have a family history of reproductive or pancreatic cancers?
- Have you noticed any changes or discomfort when sitting, using the restroom, or riding a bicycle?
- Any unusual discharge in your underwear?
If you answered 'yes' to any of these, see a doctor. Go now. Get a PSA test. If your doctor dismisses the need for it, find another doctor. PSA tests do work. My PSA was only 7 (normal is 1 to 4), and my first urologist didn't think they would find anything. But here I am, post-prostatectomy. PSA tests are easy: a simple blood draw, and the results come back in a few days. If it's below 4, check it again in six months to ensure it hasn't risen. And if you're over 50, definitely get a direct exam.
Did you know the percentage of males with prostate cancer increases linearly with age? It does: 30% of males in their 30s have prostate cancer, 40% in their 40s, and so on. Recent studies have found this, and urologists are taking these findings seriously. Ignoring early warning signs is not worth your life.
Ignore those TV ads for miracle supplements. They’re a dangerous distraction. Many contain Saw Palmetto, which can suppress PSA levels and create a false sense of security. Instead of relying on these supplements, see a doctor.
Go get checked. Talk to a doctor. Do it now.
Finally, I want to thank my wife. I know this has been tough, but I'm glad you're here. I couldn't have done this without you. I love you.
-
@ 1408bad0:4971f2ca
2024-07-22 15:56:41https://image.nostr.build/bde258ff45b8017e3ce76aa733718beb302bea7a4b7fe9aff7ff3af1e788baab.png
Pretty much all of the alternative crypto tokens to Bitcoin seem to have DAOs, or insiders who have managed to pre-mine themselves some extra tokens to give themselves an advantage.
The bad thing about DAOs is that they are used to enrich insiders at the expense of token holders. Many may claim you can vote on how funds are spent based on your stake or some other form of democracy, but in reality, token distribution or other such mechanisms have been put in place to rig what looks like a fair distribution.
We can see even with OpenSats that many supported projects stop a year later, without any lasting passion and commitment from the devs. But at least these projects are not funded at the expense of existing Bitcoin holders; they rely on an honest funding mechanism.
It seems that at some point or another, the DAOs will get drained, and in the name of the community. Even if you, as part of the community, disagree with the spending, there is not much you can do to stop it.
Some chains do scheduled unlocks for insiders, gradually draining value away from token holders. You may ask yourself: why should they get this reward at your expense? The answer given is often that they were early investors who took on the risk.
This isn't exactly true, because the real early adopters of this tech were on Bitcoin, and they have already made bank. The new tokens are just fresh money grabs, often without providing any utility.
Why store any value in these tokens if they are just money grabs and will not hold value over the long term against Bitcoin? High-time-preference people are usually the ones chasing short-term gains instead of thinking longer term.
-
@ 82100056:e198089f
2024-07-22 15:44:39Was the Trump ‘assassination attempt’ a staged event that he was in on? Was it an inside job by the secret service seeking to eliminate the number one enemy of the deep state? Or was Thomas Crooks simply competent (and lucky) enough to pull off a lone-wolf attack on the 45th president of the USA?
Whatever the truth is, which will surely come out over time, the ‘shooting’ has dramatically altered the course of American politics.
The most obvious narrative shift has been Trump’s new-found God-like status among the American right. The image of Trump holding his fist in the air, right ear bloodied, while urging Americans to “fight”, symbolises this. His long-time friend and UFC chairman Dana White said in the aftermath of the ‘shooting’: “He [Trump] is one of the toughest, most resilient human beings that I have ever met in my entire life….This guy is the legitimate ultimate American badass of all-time!” In addition, a section of Trump supporters turned up to the recent Republican National Convention wearing fake bandages on their ears in a humorous yet sincere show of solidarity with their leader. These are just two examples among many of the relentless outpouring of adulation that Trump is receiving. He has become a martyr without having to die.
The Trump fist raise image and resulting strongman narrative is not what I will focus on, but it does provide crucial context for what I am going to say. Instead, I will look at the farcical performance of the secret service, which did not go unnoticed by social media users. Specifically, that of the female agents, and the resulting backlash against the perceived failures of DEI (diversity, equity and inclusion) within the US government. I will also look at why this narrative is being pushed by powerful players of the alternative-media industrial complex.
First, footage emerged of one female agent who struggled to holster her firearm while looking completely disorientated, as Trump, who could still have been in danger, fled the scene in a blacked-out SUV. It was reminiscent of Fredo Corleone’s bumbling efforts to save his father Vito, who faced a failed assassination attempt of his own in The Godfather Part II.
Second, there is the ridiculous still image of the female secret service agent standing in front of Trump by the podium in order to provide cover for the former president amid a possible active shooter situation. The image is ludicrous because the female agent is not nearly as tall as Trump, and so his head, including his ear that was just ‘clipped’ by a 'bullet', remains completely exposed.
Third, the US secret service director Kimberly Cheatle, a woman, is ultimately responsible for the ‘near-assassination’ of Donald Trump and the seemingly unlimited ‘mistakes’ made by the agency on the day. For example, Cheatle did not place a secret service gunman on the roof used by Thomas Crooks because it was “too sloped”, an explanation that was shown to be comically bad in the immediate aftermath of the ‘shooting’, when photos emerged of cleaners standing on the roof to clear Crooks’ ‘blood’ away.
The backlash against the female agents in question has been pretty relentless from the American right. And not just from anonymous MAGA social media users: some of the key drivers of political narratives on the right have homed in on this issue, and more generally on the perceived failings of DEI.
“There should not be any women in the Secret Service. These are supposed to be the very best, and none of the very best at this job are women,” said right-leaning activist Matt Walsh in a direct response to the footage of the female secret service agents. Walsh works for The Daily Wire and rose to global prominence on the back of his documentary “What Is a Woman?”. In other words, he is a key figure within the US culture war with a keen interest in discussions around gender.
Then there is Andrew Tate. One could have predicted where the former kickboxer would stand on this issue. Tate posted a video to social media in which he lambasted Kimberly Cheatle and the other female secret service agents in a visibly heated manner. “There’s not a female alive who’s ever going to jump in front of a bullet for anybody. She’s gonna piss her panties and hide.” Tate carries on the rest of the video in the same vein, including a statement which in my view is a big clue about the narrative he is trying to push.
“Society as a whole will be better off if we return to our [gender] roles.”
Tate wants the secret service and the military to go back to being made up of men who are selected solely on the basis of merit. After years of military aged males being 'disenfranchised' by the institutions they once felt proud to represent - the pendulum may be about to swing back.
At this point, it is worth mentioning that I believe both Matt Walsh and Andrew Tate are intelligence assets who push agendas to the wider public on behalf of the deep state, as Miri AF explains here. It is also worth considering that these women are akin to actors playing the role of the incompetent agent, further driving Walsh and Tate's narrative. Their incompetence is not necessarily because they are women - they might just be acting incapable because they are women. On the face of it, it sounds overly conspiratorial - but in my view, it is plausible.
Okay, let’s continue down this ‘conspiracy theory’ rabbit hole for a little longer. It is widely understood that intelligence agencies use terror attacks to further their insidious agendas. A clear example of this is how George Bush’s government, in conjunction with the media and the intelligence apparatus, weaponised the fear brought about by 9/11 in order to invade Iraq.
So, what is the goal of Tate/Walsh and the deep state (and possibly Trump too) in pushing back against DEI and making the secret service/military a place that ‘respects’ male competence again? The same military whose soon-to-be commander-in-chief has God-like status, is the "ultimate American badass of all time", and who ‘literally’ just ‘took a bullet’ for his country. The same military that is currently in escalating proxy wars against Russia and Iran - neither of which looks likely to end anytime soon. Are American military-aged males being influenced to go and fight for their country in WW3 after years of being gaslighted?
I want to state that I do not claim to know this for sure. I am stating a theory. I am asking questions that I believe need to be asked.
But I will leave you with this quote by a World War Two veteran that has been doing the rounds on social media recently.
“If president Trump was commander-in-chief I would go back to re-enlist today.” Sgt. Bill Peril, 99, WW2 veteran.
-
@ dd664d5e:5633d319
2024-07-21 10:46:33My suggestion for scaling up and staying humble
The original protocol design was "good enough for now"
When Nostr was invented and got started with developing implementations, the Original Devs (ODs) were convinced this was going to be big... maybe... someday... hopefully.
But, whatever they did at the moment should definitely scale up a bit and be a bit flexible, to attract innovators and keep up the momentum. So, they designed the protocol to be open and left the specifications a bit vague and very high-level, so that nobody was forced into a particular implementation, in order to adhere to the spec. And they put the specs into a GitHub repository and managed them by a committee of collaborators, who were generally open to changes to the specs, including breaking changes.
That was smart. And it was "good enough for now"... back then. After all, Nostr (and the associated wiki and modular article specs) hadn't been invented, yet, so they couldn't write the protocol in the protocol before the protocol existed. They're good, but not that good.
What they specifically wrote, into the Nostr Protocol was:
To promote interoperability, we standards (sic) that everybody can follow, and we need them to define a single way of doing each thing without ever hurting backwards-compatibility, and for that purpose there is no way around getting everybody to agree on the same thing and keep a centralized index of these standards...
Standards may emerge in two ways: the first way is that someone starts doing something, then others copy it; the second way is that someone has an idea of a new standard that could benefit multiple clients and the protocol in general without breaking backwards-compatibility and the principle of having a single way of doing things, then they write that idea and submit it to this repository, other interested parties read it and give their feedback, then once most people reasonably agree we codify that in a NIP which client and relay developers that are interested in the feature can proceed to implement.
I disagree with this statement.
I don't disagree with what they meant, or what they wanted, I disagree with what they specifically wrote.
Standards (defined as prose specifications) are not the only -- or even best -- way to ensure interoperability or to check for backwards-compatibility. And, as they later note, basing a protocol off of implementations is arguably worse (but faster) than using specifications: implementations have a life of their own and are sometimes simply shoddy or buggy, their content eventually diverges from what ends up in the final standard, or there are soon multiple implementations covering the same spec "in theory", but not in practice, so that their events are incompatible.
And then the inevitable, heated discussion begins:
- Which implementation is the Real Standard™?
- Who controls the Real Standard™?
- How is the Real Standard™ spec supposed to be written?
- Does everything have to be in the same file type or markup language? If not, how can we ensure compatibility?
- What is realistic content for the data files?
- Is the Real Standard™ including all of the information needed to ensure interoperability, but not anything more, without reducing innovation and artificially forcing consensus by encouraging copy-paste or forking of product code?
There is a third way: write the test first
We actually do not need standards to define a single way of doing each thing. A test is another way, and I think it is the best (i.e. the most-efficient and most-effective) way.
Specifically, I think we can borrow the simple behavior-driven design (BDD) language called Gherkin (or something similar), which is used to write dynamic specifications: i.e. implementations that test adherence to a set of rules, rather than an implementation that uses the rules to perform some task for an end user.
Gherkin simply allows you to create standard scenarios and test data and write the tests up in a way that can be performed manually or through automation.
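As an illustration, here is a minimal Gherkin sketch of what a Nostr scenario could look like; the feature name, event kind, and steps are my own assumptions for illustration, not taken from any actual NIP:

```gherkin
# Illustrative only: the feature, event kind, and steps are hypothetical
Feature: Publishing a text note
  A client creates a kind-1 event and a relay accepts it

  Scenario: A relay accepts a well-formed text note
    Given a client with a valid keypair
    And a text note event of kind 1 with a non-empty "content" field
    When the client signs the event and sends it to the relay
    Then the relay responds with an OK message
    And the event can be queried back by its id
```

Each step maps to a reusable step definition, so the same scenario can be run manually as a checklist or automated against any client or relay implementation.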
(For a concrete example of such a TDD Protocol for Nostr, please see the nostr-voliere repo from nostr:npub1axy65mspxl2j5sgweky6uk0h4klmp00vj7rtjxquxure2j6vlf5smh6ukq .)
This really is better
This TDD Protocol design would have some downsides and some upsides, of course, like any change.
Downsides
- You can't write a TDD spec by yourself unless you understand basic software functionality, how to define an acceptance test, and can formulate a use case or user story.
- The specs will be more informative and agnostic, but also longer and more detailed.
- Someone will have to propose concrete test data (i.e. a complete json event) and spec interlinking will be explicit, rather than writing "...", "etc.", or "sorta like in that other section/doc, but not really" all over the place.
- The specs will tend to focus on positive cases, and leave off error-handling or most edge-cases, so developers can't use them to replace unit tests or other verification of their product.
Upsides
- The specs will be concrete and clear, in a type of pseudocode, while leaving the actual implementation of any feature up to the individual developer, who uses the spec.
- The specs will be orderly and uniquely-identifiable, and can have hierarchy and granularity (major and minor tests, optional tests, tests only under certain conditions, etc.)
- Deciding whether changes to the spec are breaking changes to the protocol would be simple to determine: Does the previous test still pass?
- Specs will always be final, they will simply be versioned and become more or less defined over time, as the tests are adjusted.
- Product developers will feel less like they "own" particular specs, since their implementation is actually what they own and the two remain permanently separate.
- Developers can create an implementation list, defining specific tests in specific versions, that they adhere to. This makes it more transparent, what their product actually does, and lowers their own documentation burden.
- Rather than stalking the NIPs for changes, or worrying about what some other implementation someplace has built, developers can just pull the repo and try running the relevant tests.
- Each product developer can test the spec by trying to perform or automate/run it, and request changes to ensure testability, raising the quality of the spec review process.
This is already a lot to think about, so I'm just going to end this here. Thank you for reading.
-
@ 6bae33c8:607272e8
2024-07-22 14:17:47These are my very preliminary WR rankings, my only research being the RotoWire depth charts and player notes.
Tier 1
CeeDee Lamb, Tyreek Hill, Ja’Marr Chase, Amon-Ra St. Brown, AJ Brown
I thought about putting Lamb in his own tier since he’s in his prime and the dominant No. 1 on a pass-heavy team, but Hill is arguably a top-five all-time receiver, Chase projects just as highly over a full season if his QB is healthy, St. Brown is a target and receptions machine and Brown is a physical freak who is finally receiving the targets he deserves.
Tier 2
Justin Jefferson, Puka Nacua, Garrett Wilson, Stefon Diggs
Jefferson might be the best receiver in the game, but the quarterback downgrade lands him at the top of Tier 2. Nacua also could have been in Tier 1, given his rookie year, but the presence of a healthy Cooper Kupp might cut into his role slightly (apparently Kupp was never fully healthy last year). Wilson managed to produce with abominable QB play last year, but the threat of Aaron Rodgers being toast or Davante Adams getting brought in keeps him out of Tier 1. Diggs is getting older, but landed in an ideal situation.
Tier 3
Drake London, Marvin Harrison, DJ Moore, Jaylen Waddle, Chris Olave, DeVonta Smith, Davante Adams, Cooper Kupp
London gets a massive QB upgrade, Harrison is the clear No. 1 out of the gate, and Moore gets an upgrade even with a rookie QB, though there’s a lot of competition for targets. Waddle, Olave and Smith are here based on their solid past production and skills, Adams is a wild card depending on where he winds up, and Kupp is an age and injury risk with Tier 1 upside if he somehow emerges as Matthew Stafford’s No. 1 again.
Tier 4
Nico Collins, Michael Pittman, Amari Cooper, Brandon Aiyuk, Tee Higgins, Deebo Samuel, Mike Evans, DK Metcalf, Keenan Allen, Christian Kirk
These are all either quality second fiddles or top dogs with limited ceilings. I never draft Allen, and he always produces, but he’s 32, playing with a rookie QB and in a crowded receiving corps. Kirk has a nice floor as Trevor Lawrence’s likely top target again. I had Pittman at the bottom of this tier originally, but moved him up because he’s a clear No. 1, and what if Anthony Richardson is good? I used to like Metcalf more, but he’s just an okay NFL player despite other-worldly athleticism.
Tier 5
**Terry McLaurin, Zay Flowers, Malik Nabers, Ladd McConkey, Christian Watson, George Pickens, Jayden Reed, Marquise Brown, DeAndre Hopkins, Calvin Ridley**
This is an odd tier of has-beens, maybes, rookies and guys getting a shot. McLaurin probably has the highest floor, but is a boring pick. Reed has a good chance to emerge ahead of Watson and Romeo Doubs as Jordan Love’s go-to in Year 2, but Watson has the size and speed. I won’t stump for my Giant Nabers this year — though I traded Marvin Harrison for him and an extra pick in a dynasty league! — but I love that he’s only 20 and produced at such a high level in college. McConkey has a chance to be Justin Herbert’s top target right away, Flowers should be the co-No. 1 with Mark Andrews in Baltimore, and Pickens is the No. 1 in Pittsburgh, though I don’t trust him or that passing offense at all. Marquise Brown has massive upside in Kansas City, but some competition from Xavier Worthy (who does the same thing) and from Rashee Rice if he doesn’t go to jail. Ridley and Hopkins will see targets at least.
Tier 6
Jaxon Smith-Njigba, Rashid Shaheed, Courtland Sutton, Diontae Johnson, Rashee Rice, Jordan Addison, Chris Godwin, Tank Dell, Jameson Williams, Gabe Davis, Rome Odunze, Jerry Jeudy, Adam Thielen, Romeo Doubs, Brian Thomas
Smith-Njigba could emerge as Seattle’s No. 1 with Lockett aging and Metcalf plateauing. Shaheed should get more targets with Michael Thomas gone, Sutton is the No. 1 in a bad offense, Johnson should catch a lot of passes, Rice could be Tier 2 or 3 once he’s finished with a likely suspension, Addison has his own legal problems from driving drunk, plus a QB downgrade, Godwin is a boring second fiddle now, Dell drops due to Diggs’ arrival, Williams is third fiddle in a good offense, Davis is probably just a guy but should get targets, Odunze is in a crowded receiving corps, Jeudy is running out of time to break out, Thielen will get targets, Doubs is steady and reliable, and Thomas has upside in a receiving corps without a star.
Tier 7
Joshua Palmer, Keon Coleman, Brandin Cooks, Xavier Worthy, Jakobi Meyers, Khalil Shakir, Tyler Lockett, Jahan Dotson, Mike Williams, Curtis Samuel, Rashod Bateman, Michael Wilson, Quentin Johnston
These are mostly just guys who should get targets. Worthy and Coleman are more upside than floor, but the rest are just here because the ball is probably coming their way.
-
@ 3c984938:2ec11289
2024-07-22 11:43:17Welcome to Nostr!
Introduction
Is this your first time here on Nostr? Welcome! Nostr is a quirky acronym for "Notes and Other Stuff Transmitted by Relays" with a single goal: resisting censorship. It is an alternative to traditional social media, communications, blogging, streaming, podcasting, and eventually email (under development), with decentralized features that empower you, the user. You will never be pestered by an ad, captured by a centralized entity, or monetized by an algorithm.
Allow me to be your host! I'm Onigiri! I am exploring the world of Nostr, a decentralized communication protocol. I write about the tools and the incredible developers of Nostr who bring this realm to life.
Welcome to Nostr Wonderland
You are about to enter another digital world that will blow your mind with all the decentralized applications, clients, and sites you can use. You will never look at communications or social media the same way again. All of this is thanks to the cryptographic character of Nostr, inspired by blockchain technology. When creating a Nostr account, every user receives a key pair: one private and one public. These are the keys to your own kingdom. What you write, sing, record, or create - it all belongs to you.
Gold and Silver Keys
A friend and I call this "identity through encryption" because your identity is encrypted. You can share your silver key, your "npub", with other users so they can connect with and follow you. Use your gold key, your "nsec", to access your account and the many applications. Keep that key safe at all times. There is no longer any reason to be caged by the terms of social platforms.
Onigirl
npub18jvyjwpmm65g8v9azmlvu8knd5m7xlxau08y8vt75n53jtkpz2ys6mqqu3
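Under the hood, an "npub" is just a public key rendered in a human-friendly format. As a sketch, here is how a hex public key is wrapped into an npub, assuming the standard bech32 encoding (BIP-173) that Nostr's NIP-19 key format builds on; the sample hex key is purely illustrative:

```python
# Minimal bech32 (BIP-173) encoder, sketching how Nostr's NIP-19
# turns a 32-byte hex public key into an "npub1..." string.

CHARSET = "qpzry9x8gf2tvdw0s3jn54khce6mua7l"  # bech32 alphabet

def bech32_polymod(values):
    """BIP-173 checksum polynomial over 5-bit values."""
    GEN = [0x3B6A57B2, 0x26508E6D, 0x1EA119FA, 0x3D4233DD, 0x2A1462B3]
    chk = 1
    for v in values:
        b = chk >> 25
        chk = (chk & 0x1FFFFFF) << 5 ^ v
        for i in range(5):
            chk ^= GEN[i] if ((b >> i) & 1) else 0
    return chk

def bech32_hrp_expand(hrp):
    """Expand the human-readable part ("npub") for checksumming."""
    return [ord(c) >> 5 for c in hrp] + [0] + [ord(c) & 31 for c in hrp]

def convertbits(data, frombits, tobits):
    """Regroup a byte sequence into 5-bit groups, padding the tail."""
    acc, bits, out = 0, 0, []
    for value in data:
        acc = (acc << frombits) | value
        bits += frombits
        while bits >= tobits:
            bits -= tobits
            out.append((acc >> bits) & ((1 << tobits) - 1))
    if bits:
        out.append((acc << (tobits - bits)) & ((1 << tobits) - 1))
    return out

def bech32_encode(hrp, data):
    """Assemble hrp + '1' + data + 6-char checksum."""
    polymod = bech32_polymod(bech32_hrp_expand(hrp) + data + [0] * 6) ^ 1
    checksum = [(polymod >> 5 * (5 - i)) & 31 for i in range(6)]
    return hrp + "1" + "".join(CHARSET[d] for d in data + checksum)

def encode_npub(hex_pubkey):
    """Wrap a 32-byte hex public key in the npub bech32 format."""
    return bech32_encode("npub", convertbits(bytes.fromhex(hex_pubkey), 8, 5))

# Illustrative 32-byte hex key
print(encode_npub("3bf0c63fcb93463407af97a5e5ee64fa883d107ef9e558472c4eb9aaaefa459d"))
```

The same encoder with the "nsec" prefix produces the private-key form, which is why the two keys look so different on screen even though both are just 32 bytes.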
Don't have a client yet? Pick the best option for you.
Find the right app for you! Use your gold key "nsec" to access these wonderful tools. You can also visit this page to see all the apps. Before pasting your gold key into many applications, consider a "signer" for web 3 sites. Please look at the following image for more details. Also consult the legend.
Get a Signer extension via chrome webstore
A signer is a web browser extension. Nos2x and NostrConnect are widely accepted extensions for accessing Nostr. They simplify the process of logging in to "web 3" sites. Instead of copying and pasting your gold key, the "nsec", every time, you keep it stored in the extension and grant it permission to access Nostr.

👉⚡⚡Get a Bitcoin Lightning wallet to send/receive Zaps⚡⚡ (This is optional)

Here on Nostr, we use Bitcoin's Lightning network (L2). You will need a Lightning wallet to send and receive satoshis, the smallest denomination of a bitcoin (0.00000001 BTC). "Zaps" are a kind of micropayment on Nostr. If you like a user's content, it is customary to leave them a tip in the form of a zap. For example, if you like this content, you can zap me some satoshis to reward my work. But you only just got here, so you don't have a wallet yet. Don't worry, I can help with that!

"Stacker.News" is a platform where users can earn sats by posting articles and engaging with others.

Stacker.News is the easiest place to get a Bitcoin Lightning wallet address.
- Log in with your signer extension, Nos2x or NostrConnect, then click your profile, the string of letters and numbers in the top right corner. You will see something like this
- Click "edit" and choose a name you like. You can change it later if you wish.
- Click "save"
- Write a bio. The SN community is very welcoming and will send you sats to welcome you.
- Your new Bitcoin Lightning wallet address will appear like this
^^Don't send zaps to this address; it is shown purely for educational purposes.
- You can now use your new Bitcoin Lightning wallet address in any client or app of your choice. To do so, go to your profile page and, under "Lightning Address", enter your new address, press "save", and that's it. Congratulations!
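Under the hood, an address like name@domain is not an email: wallets resolve it to an HTTPS endpoint first, following the LUD-16 "Lightning address" convention. A tiny sketch of that first step, with "alice" as a made-up username for illustration:

```python
def lnurlp_endpoint(lightning_address: str) -> str:
    """Map a user@domain Lightning address to the LNURL-pay
    metadata URL that a wallet fetches first (per LUD-16)."""
    name, domain = lightning_address.split("@")
    return f"https://{domain}/.well-known/lnurlp/{name}"

# hypothetical address, purely for illustration
print(lnurlp_endpoint("alice@stacker.news"))
```

The wallet then reads the min/max sendable amounts from the JSON served at that URL, requests an invoice from the returned callback, and pays it.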
👉✨Over time, you may want to move to self-custody options and perhaps even consider self-hosting your own LN node for better privacy. The good news is that stacker.news is also moving away from being a custodial wallet.

⭐NIP-05 DNS identity⭐ Just like on Twitter, a checkmark shows that you are from the same garden, "a human", and not an outlier like a weed or a "bot". But not in the nefarious way Big Tech does it. In Nostr Wonderland, this lets you map your silver key, the "npub", to a DNS identifier. Once verified, you can shout it out to announce your new Nostr residence and share it.

✨There are plenty of options, but if you have followed the steps so far, this becomes extremely easy.

👉✅Click your "Profile", then "Settings", scroll to the bottom, paste your silver key, the "npub", click "Save", and that's it! Use your Stacker.news Lightning wallet as your NIP-05. Congratulations!!! You are now verified! Give it a few hours, and when you use your "main" client you should see a checkmark.
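Behind the checkmark, NIP-05 is just a small JSON file the domain serves at /.well-known/nostr.json, mapping a name to a public key in hex form (the hex behind your npub, not the npub string itself). A sketch with a placeholder key, not a real identity:

```python
import json

# Hypothetical 32-byte public key in hex; NIP-05 uses hex, not bech32
PUBKEY = "deadbeef" * 8

# What a server would return for
# https://example.com/.well-known/nostr.json?name=onigirl
doc = {
    "names": {"onigirl": PUBKEY},
    # optional relay hints, keyed by pubkey
    "relays": {PUBKEY: ["wss://relay.example.com"]},
}

def verify_nip05(document: dict, name: str, pubkey_hex: str) -> bool:
    """A client checks that the claimed name maps to the claimed key."""
    return document.get("names", {}).get(name) == pubkey_hex

print(json.dumps(doc, indent=2))
print(verify_nip05(doc, "onigirl", PUBKEY))
```

This is why the "verification" takes a few hours at most: clients simply fetch this file and compare keys, no central authority involved.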
Nostr, the informant of the servers.

Instead of using a single instance or a centralized server, Nostr is built so that multiple databases exchange messages through "relays". Relays, which are neutral and non-discriminatory, store and broadcast public messages across the Nostr network. They transmit messages to all the other clients connected to them, securing communications on the decentralized network.
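On the wire, everything a client says to a relay is a plain JSON array, as defined in NIP-01. The sketch below only builds the three basic client-to-relay message types; the id and sig fields are placeholders, since computing the real sha256 event id and Schnorr signature is out of scope here:

```python
import json
import time

# Client -> relay messages (NIP-01):
#   ["EVENT", <event>]            publish a note
#   ["REQ", <sub_id>, <filter>]   open a subscription
#   ["CLOSE", <sub_id>]           end a subscription
event = {
    "id": "00" * 32,        # placeholder; really sha256 of the serialized event
    "pubkey": "11" * 32,    # placeholder hex public key
    "created_at": int(time.time()),
    "kind": 1,              # kind 1 = short text note
    "tags": [],
    "content": "hello nostr",
    "sig": "22" * 64,       # placeholder Schnorr signature
}

publish = json.dumps(["EVENT", event])
subscribe = json.dumps(["REQ", "sub1", {"kinds": [1], "limit": 10}])
close_sub = json.dumps(["CLOSE", "sub1"])
print(subscribe)
```

A client sends these same strings over a WebSocket to every relay it is connected to, which is what makes switching or adding relays so frictionless.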
My friends on Nostr welcome you!

Welcome to the party. Would you like some tea?🍵

There is so much more!

This is just the tip of the iceberg. Follow me as I keep exploring new lands and the developers, the knights, who power this ecosystem. Find me here for more content like this, and share it with other Nostr users. Meet the knights fighting for freedomTech on Nostr and the projects they contribute to in order to make it a reality.💋
Onigirl @npub18jvyjwpmm65g8v9azmlvu8knd5m7xlxau08y8vt75n53jtkpz2ys6mqqu3
🧡😻This guide was carefully translated by miggymofongo

You can follow her here. @npub1ajt9gp0prf4xrp4j07j9rghlcyukahncs0fw5ywr977jccued9nqrcc0cs

website
-
@ 8d34bd24:414be32b
2024-07-22 14:15:48

My Inspiration

Last night, I was reading in 2 Timothy and came across this passage.
For God has not given us a spirit of timidity, but of power and love and discipline. Therefore do not be ashamed of the testimony of our Lord or of me His prisoner, but join with me in suffering for the gospel according to the power of God, who has saved us and called us with a holy calling, not according to our works, but according to His own purpose and grace which was granted us in Christ Jesus from all eternity, (2 Timothy 1:7-9) {emphasis mine}
Paul, in the introduction of his letter to his dear friend, writes, “join with me in suffering for the gospel according to the power of God.” Can you imagine being called to “join with me in suffering”? This is our calling.

So often people are told, “come to Jesus and everything in life will go well.” This is a nice-sounding lie, but we need to tell the uncomfortable truth. Jesus never promised His followers wealth and an easy life. In fact, He promised the opposite. All of His disciples, except one who died in exile, died horrific deaths at the hands of those who persecuted them.
Persecution
Sometimes this suffering will come from persecution. Jesus promises that we will be persecuted as He was persecuted. Should we expect any less?
Remember the word that I said to you, ‘A slave is not greater than his master.’ If they persecuted Me, they will also persecute you; if they kept My word, they will keep yours also. (John 15:20)
Indeed, all who desire to live godly in Christ Jesus will be persecuted. (2 Timothy 3:12)
If the world hates you, you know that it has hated Me before it hated you. If you were of the world, the world would love its own; but because you are not of the world, but I chose you out of the world, because of this the world hates you. (John 15:18-19)
You could make the argument that, if you have been a Christian for years, but have not ever suffered or been persecuted, you need to evaluate whether you truly are a born again Christian.
Patience with a Purpose
We also suffer because of sin in the world and are strengthened by suffering. Instead of trying to avoid all suffering, we should look to see where God is leading and what we can learn from suffering. For a born again Christian, suffering is never a random happening and rarely is it a judgment from God. Usually it is used for our good.
Paul knew why all these things were happening to him. He said, “I know whom I have believed, and am persuaded that he is able to keep that which I have committed unto him against that day” (2 Timothy 1:12 KJV).
In the physical world, heat can cause three responses. In objects like wood, heat causes it to burn away. In objects like ice, heat causes it to melt into a liquid or evaporate into a gas. In objects like iron, heat causes it to get harder and stronger. In the same way suffering can do the same thing. To those who have no faith in God, suffering can cause them to give up entirely. In someone with a little faith, it may cause them to melt. They don’t go away; they aren’t destroyed; but they are made ineffective. In someone with proper faith in Jesus, suffering strengthens their faith and causes them to stand firm in all circumstances.
And not only this, but we also exult in our tribulations, knowing that tribulation brings about perseverance; and perseverance, proven character; and proven character, hope; and hope does not disappoint, because the love of God has been poured out within our hearts through the Holy Spirit who was given to us. (Romans 5:3-5)
Beloved, do not be surprised at the fiery ordeal among you, which comes upon you for your testing, as though some strange thing were happening to you; but to the degree that you share the sufferings of Christ, keep on rejoicing, so that also at the revelation of His glory you may rejoice with exultation. If you are reviled for the name of Christ, you are blessed, because the Spirit of glory and of God rests on you. (1 Peter 4:12-14)
Yes, for a true believer, suffering should lead to JOY. It should lead to greater faith. It should lead to praising Jesus. It should lead to being so close to Jesus and working so hard for His will that we can say, as Paul did to Timothy, “join with me in suffering for the gospel according to the power of God.”
God’s Way
Christianity isn’t like any other religion because it isn’t a religion made up by man (though there are counterfeits of Christianity with the same name that are made up by men). Nobody would make up a religion where you are promised hardship. Nobody would make up a religion with such difficult things to understand as a trinitarian God (3 persons in 1 God). Nobody would make up a religion that told you to “… love your enemies and pray for those who persecute you.” (Matthew 5:44) You can tell Christianity is real (by studying the Bible, not by looking at everyone who labels themselves Christians) because of how different it is from what a person would want to make it, and many of its precepts are beyond what a man could make up or even comprehend.
Calling
Are you actively trying to avoid any suffering, hardship, or persecution? Are you running away from God’s plan for you? Are you putting your comfort above God’s commands and will? Are you blaming God for hardship instead of seeking what He is teaching you and/or what He is doing in the lives of people? If any of these are true, you are missing the joy and grace and peace that come from resting in God’s will.
Look around you. What is God doing around you or through you? Seek His will even if it is uncomfortable. Moses obeyed God’s call even though he didn’t feel capable (God had to remind him “who made your mouth”). John the Baptist obeyed God’s call even though it meant living in the wilderness, eating locusts and honey, and being beheaded. Hosea married a prostitute, who was unfaithful and bore children out of wedlock, in order to be an object lesson to Israel of her unfaithfulness and His love and faithfulness. In all of these cases and more, people sacrificed to obey God’s will, and amazing things were done. Although all suffered, none said it wasn’t worth it. Utter agony with God is better than prosperity without Him. There is also an amazing feeling of accomplishment when we live life according to God’s will and when we see His plan unfolding, knowing that He used us, in a little way, to accomplish His plan. There is no greater joy than accomplishing what God created us to accomplish.
Additional Bible Verses
I don’t have more to say on the topic right now, but I’ve included some additional verses on the subject to help you dig deeper into God’s plan as it relates to suffering and persecution.
After you have suffered for a little while, the God of all grace, who called you to His eternal glory in Christ, will Himself perfect, confirm, strengthen and establish you. (1 Peter 5:10)
For I consider that the sufferings of this present time are not worthy to be compared with the glory that is to be revealed to us. (Romans 8:18)
“Blessed are those who have been persecuted for the sake of righteousness, for theirs is the kingdom of heaven.
“Blessed are you when people insult you and persecute you, and falsely say all kinds of evil against you because of Me. Rejoice and be glad, for your reward in heaven is great; for in the same way they persecuted the prophets who were before you. (Matthew 5:10-12)

Who will separate us from the love of Christ? Will tribulation, or distress, or persecution, or famine, or nakedness, or peril, or sword? Just as it is written,

“For Your sake we are being put to death all day long;

We were considered as sheep to be slaughtered.”

But in all these things we overwhelmingly conquer through Him who loved us. For I am convinced that neither death, nor life, nor angels, nor principalities, nor things present, nor things to come, nor powers, nor height, nor depth, nor any other created thing, will be able to separate us from the love of God, which is in Christ Jesus our Lord. (Romans 8:35-39)

In a similar way these are the ones on whom seed was sown on the rocky places, who, when they hear the word, immediately receive it with joy; and they have no firm root in themselves, but are only temporary; then, when affliction or persecution arises because of the word, immediately they fall away. (Mark 4:16-17)
These things I have spoken to you, so that in Me you may have peace. In the world you have tribulation, but take courage; I have overcome the world.” (John 16:33)
FYI: It may seem like I have been posting a lot about persecution and hardship. I believe there is a twofold reason for my feeling called to write on this subject.

There are so many “Christians” selling Christianity as a prosperity gospel. They tell you to “follow your heart” and encourage people to feel good about their sins. They focus on you gaining wealth, success, and comfort. They treat the God of the Bible as a magic genie that will give you everything you desire. This message isn’t from God. These are the same promises the Devil has made from Adam to Jesus.
I also see the West turning from their Christian heritage and moving towards persecution of the true Church. I believe those who follow Jesus are likely going to see hard times coming and persecution coming from places they never expected. I have little doubt we will experience some before the Rapture, but those who are saved after the Rapture will experience this persecution to the extreme.
Trust Jesus.
Bible verses are NASB (New American Standard Bible) 1995 edition unless otherwise stated
Check out more at trustjesus.substack.com
-
@ a9434ee1:d5c885be
2024-07-18 14:34:09

What makes Nostr and Bitcoin so extremely exciting to me, as a designer, is that they offer an entirely new design space. Something we never had before. Something that we can, for the first time ever, call…

A Free Market.

And not just that, but one where everyone speaks the same language. One that acknowledges that Sovereignty doesn’t go that far without Interoperability.
Since this is literally a first, it seems terribly naïve to assume that we can just copy what works for Big Tech’s walled gardens and add some Freedom sauce on top. I’ve been sketching out things like “Twitter but with Zaps”, “Youtube but with many algo’s”, “Patreon but with Bitcoin”, etc... for long enough to realize the harsh limits of such an approach. Instead I’ve found it more fruitful to start by looking at the characteristics of this new digital, interoperable free market and, only then, look for similar existing benchmarks that can serve as inspiration.
Because the convergence of the various Freedom Tech protocols (Nostr, Bitcoin, Cashu, …) are such a huge game changer for online monetization specifically, it seems even more naïve to just start copying Big Tech on that front, without proper examination.
If we did just play copycat, Monthly Subscriptions and Paywalls would be the obvious answer. This article dives into why they’re not.
Free as in Competition
In a free market, it’s going to be hard for you to set a price if you don’t have something scarce to sell. Unlike in a Big Tech walled garden, there’s no one blocking competitors from entering the market. That includes competitors that offer your stuff!
If what you create can easily be copied, it will be.

Yes, content creators, FOSS devs, designers, … that’s you.

Charging for your article, podcast, app or movie doesn’t end well when people can find a free upload right next to yours. Open protocols remove the friction of having to go to PirateBay for that sort of stuff. Luckily, they also remove the friction for people to value your content directly (#V4V Zaps) and for you to launch unique and scarce products related to what you do (Community, Merch, …).

Even if you have a scarce product to sell, though, Nostr sets the bar lower than ever for others to start competing with your product. Every part of your product!
Interoperability breaks products and services down to their constituents and opens up every layer of the stack to competition. Currently most SAAS business models are basically very expensive monthly subscriptions for some hosting and computation. And guess what Nostr is extremely good at → Discovering and switching providers of hosting and computation.
In a competitive context like that, you need a monetization model that lets you adapt fast. If you choose to monetize with monthly subscriptions, you’ll be updating your subscription tiers every month, if not every week, just to remain competitive. If that’s the case, then what’s the point? Why not just have a cash price list?
Free as in Volatile
The reality of a free market is that price-discovery never takes a break. This is especially true for a market that uses a money that is still being discovered as money by 99% of the world. A money that, just like Nostr, only saw the light of day because its creators saw total chaos ahead without it.
Bitcoin and Nostr do #fixthemoney and do #fixtheinternet, but all that fixing isn’t exactly taking us on a smooth ride.
Smooth rides are stable rides, stable prices, stable amounts of locked-in users, stable ties with the state, etc… They are everything Freedom Tech is taking us on exciting journey away from.
Again, adaptability is key and your business model needs to reflect that. Monthly subscriptions don’t handle volatility well. It takes a lot of confidence, extremely loyal customers and liquid reserves to survive bumpy roads. That’s a lot of capital that probably has better uses. Better uses that Big Tech monopolies can ignore, but your competitors won’t.
Free as in “Number Go Up”
Denominating your subscriptions in “stable” fiat currencies isn’t going to help either. The mental cost for customers is only low as long as you don’t have to juggle more than 21 subscriptions. (Yes, I completely made up that number, but it’s somewhere around there, so bear with me.)
Given that Subscription Hell is already a thing in Fiat-land once you go beyond that number, just try to imagine the amount of subscriptions you’d be handling in Nostr-land. Your 1 Netflix subscription suddenly became 6, to different service providers, plus 20 to all your favorite creators. Then add the subscriptions for all other internet use cases you can think of. Then add some more for the ones that you can’t even imagine yet.
This is not an overstatement. It is very, very unlikely that your service happens to be the only one that subscriptions would work for. If they appear to work for you, every competitor or similar use case will be copying that model in no time.

So if they work, there would be a loooot of subscriptions.

That’s also a looot of:
- recurring payments to forget about
- different time periods to oversee
- extra effort to go and unsubscribe or switch tiers
- users thinking “I didn’t even really use this!”
- users also thinking “What am I paying for exactly?”

In short, that’s asking for frustrated, disappointed and confused customers.
These subscriptions would then also need to be handled on top of all the #V4V Zaps and cash payments that are inevitably happening as well. Unlike Big Tech products you don’t get to just pick Subscriptions as the only option. You will have to be optimizing both the Wallet and Social UX/UI for all of these types of payments. Something I naïvely tried and quickly gave up on. It overcrowds interfaces, makes different applications less interoperable and creates a very confusing user experience.
And if you denominate everything in fiat, you add even more confusion, since the Zaps from others are denominated in Bitcoin. That suddenly makes them variable from the fiat-denominator’s perspective. Other denominations work when you only have two parties (e.g. seller and buyer), not when you have groups of people zapping away at an article. For #V4V to gain traction, zappers need recognition. If it’s completely variable, and thus unclear, who the top zappers of a piece of content are, or if the biggest patrons of a creator are only visible in some specific subscriber dashboard, you’re creating confusion and diluting recognition.
Zaps are awesome, nearly universally implemented, very simple to design for and they might just be enough. We’re mostly still early. Creators can’t even reply to Zaps yet. The first prototypes for frictionless Cashu zaps are only just coming out. Let’s explore all that further before we start adding subscriptions to the UI, the users’ mental load and the app developers endless list of “things to implement”.
If the current zap-rate of daily Nostr users (myself included) tells me anything, it’s that sending around small payments all the time isn’t really the issue. The mental cost for the user happens mostly at the level of:
”How the f*ck do I keep track of all this?”
“How do I fit this in with my regular monthly expenses?”
Subscriptions are only one answer to this. If they happen to still somehow solve the above issues for some use cases, we still have to find solutions anyway for the micro-payments side of things.

My point is thus mainly: we might as well start there and, who knows, we don’t even need subscriptions. Or at least not as a standard to strive towards. No one is stopping you from using time-based pricing for things that are actually time-based or from creating micro-apps that let you set up recurring zaps and other fun stuff like that. Just don’t promise creators and merchants that they can base their business models on them.
If you’re talking to people that will only consider Freedom Tech if they can have the stability of a group of monthly subscribers, you’re probably not talking to the right people (yet!).
Freedom Tech = Responsibility Tech.
Both seller and buyers need tools that help them take that responsibility. Especially regarding payments, the currently available tools are far from optimal and have barely scratched the surface of what’s possible in the interoperable ocean of possibilities. I say this more as an invitation than as a complaint of course. We are — yes, I’m saying it again — sooooo early.
So how can we help buyers be responsible while “not having to think about every little payment”?
For rich npubs on a network denominated in #NGU technology the answer can be quite simple: They simply don’t have to really think about it.
In fact, poor npubs don’t have to either, as long as they can set a spending budget and have great tools and transparency to help them manage it.

They need something a bit like their “lives left” in a video game. Or something that resembles envelope budgeting: like going to the farmers market with only a limited amount of cash.
Talking about farmers markets
Let’s look at what currently comes close to an interoperable free market in meat-space: restaurants & farmers markets.
- No one has copyrights on pizza Margherita or chicken eggs
- Entering the market is relatively cheap (food trucks, farmer stands, …)
- Customers can pick and choose where they get “every layer of the stack” (drinks here, main dish there and ice cream over there)

Now look at how these businesses monetize:
- Cash price lists
- Discounts (for special occasions, loyal customers, …)
- Bundles (lunch menus, vegetable baskets, …)
Both restaurant owners and farmers have been trying out monthly subscriptions since forever, mostly because it would benefit them. But in places with high competition this has had exactly zero success.
What did have success is services that bundle farmers market food items for you and deliver it at your doorstep on a weekly basis. That’s what lower mental cost looks like. Ironically, these services often started out with the monthly subscription model and have had to switch because customers needed more flexibility. Now, in their apps, they let users pick what day they want what, at what price.
And these apps are not even on Nostr yet.

Talking about Nostr markets
Nostr markets are not exactly farmers markets of course. But that is mostly good news. Nostr, Bitcoin and Cashu, in particular, remove a lot of the friction that cash still has in the physical world. They enable seamless micro-payments. That’s an innovation worth embracing, not something to hide away behind subscriptions.
Embracing micro-payments means that every little thing can have a real-time price. A price that, once payed, you never have to think about again. And some, if not most, micro-payments are so micro that you don’t have to really think about them in the first place. Especially if we’re talking about recurring events (pun intended).
If a chat message in a certain Community costs 2 sats, after your third message your added mental cost will likely be close to zero. At that point, the price is mostly a matter of transparency, fairness and spam prevention. When the Admin suddenly changes this price to 1000 sats however, you will need to think twice about posting such a message. But again, that is a good thing. That’s exactly what the Admin is monetizing.
He is curating for high signal content and conversations around a certain interest. His price list is one of his top tools for doing so. You pay him for being able to publish in his unique community. You “Pay Per Use” of that Community as a broadcasting channel for your messages, articles, videos, … You know everyone else does too.
Using monthly subscriptions for such a community would just invite abuse and poor quality. It would be like an all-you-can-eat restaurant where everyone has an infinite stomach size, you’re all at the same table and only the loudest screamers get heard.
So the Admin would put limits on what you specifically get for your subscription (100 messages per month, 210MB of hosting, etc…). The members would then demand flexibility or go elsewhere. The Admin would then provide different tiers.

Yet, most members would still need flexibility more than they need flat-rate monthly pricing.

At the same time, the Admin’s “Pay Per Use” competitors will still be cheaper. They don’t have the overhead of handling the uncertainty that comes with providing stable pricing for several months. Trying their offer out is also way cheaper than immediately having to pay a subscription. The Admin, on the other hand, cannot really offer free trials if he doesn’t have the locked-in users to pay for them.
In the end, just like restaurants, the Admin will switch to “Pay Per Use” and will use discounts and bundles to his advantage.
As long as users have great tools to keep an eye their spending, this sort of outcome is a win-win for the whole ecosystem. What users tend to like most about monthly subscriptions for something is the guarantee that they will not exceed XXX amount of money on that thing for the month. Nothing is stopping us from building tools that provide the same guarantee without the complications of handling monthly subscriptions.
Since most Bitcoin wallets are not daily-spending wallets and most Nostr projects aren’t monetizing yet, hardly any attention has been spent on tools like this. They all copied bank apps and display your total amount of money and a chronological feed of your transaction history. There are several problems with this:
1. You don’t want to openly showcase your total balance to everyone around you when you open the app
2. Your total balance shows “everything you have”. That is a terrible benchmark for determining “what you can sustainably spend”
3. It’s also a terrible benchmark for determining “what you earned this month”
4. Micro-payments make a chronological feed of all your transactions completely unusable
5. Zaps make a lot of your transactions social. Zaps and eCash blur the line between money and message. And messages require interaction that transactions don’t.
I think we can do better.
So, let’s try!

Cash Budgets
Just like the previously mentioned “lives” in a game or the cash in your wallet for a night out, the first thing users will want to see is “How much do I have left?”. Since most people organize their budgets per month we can more specifically turn this into “How much do I have left this month?”. This means we need to allow users to set a monthly budget in the first place. Once that budget is set for the month, it facilitates all the rest.
This budget is their subscription now. Their Nostr subscription.
An interoperable subscription they can interact with in any trusted app.

And the best part: They pick the price.

They’re taking their responsibility and lowering their mental cost with one action.

Now, you can start playing with wallet and home screen interfaces that show the user at a glance:
- What they’ve got left to spend for that month
- What they already spent
- What they earned (relative to that budget and/or a specific earnings goal)
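As a thought experiment, the monthly budget described above can be modeled in a few lines. Everything here is a hypothetical sketch of client-side logic, not any existing wallet's API; the numbers are arbitrary:

```python
from dataclasses import dataclass

@dataclass
class MonthlyBudget:
    """Envelope-style spending budget, denominated in sats.
    A toy model of the 'Nostr subscription' idea: the user picks one
    number per month and every zap or payment draws it down."""
    cap_sats: int
    spent_sats: int = 0
    earned_sats: int = 0

    @property
    def left_sats(self) -> int:
        return max(self.cap_sats - self.spent_sats, 0)

    def can_spend(self, amount: int) -> bool:
        return amount <= self.left_sats

    def spend(self, amount: int) -> bool:
        # a client could auto-approve small zaps within budget
        # and prompt the user for anything that would exceed it
        if not self.can_spend(amount):
            return False
        self.spent_sats += amount
        return True

    def receive(self, amount: int) -> None:
        self.earned_sats += amount

budget = MonthlyBudget(cap_sats=50_000)
budget.spend(2_000)      # a few zaps
budget.receive(10_000)   # someone zapped you back
print(budget.left_sats)  # 48000
```

Whether earnings should top the budget back up, and where the auto-approve threshold sits, are exactly the kinds of design decisions this unexplored territory still has to settle.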
I’m currently exploring this design space myself and the extent to which Freedom Tech budgeting can be gamified in novel ways will make TikTok look boring.
Some baby UI steps in this direction
But that’s just the budget part of course. We still need non-intrusive ways to display all these little price-tags for things if we’re not hiding them away behind a subscription.
The good news is that, when it comes to movies, music, articles, posts, FOSS apps and all other types of information that can easily be copied, it doesn’t have a price tag. It just has people zapping it. People that can use a lot more context and recognition than they currently get. (showing top zappers everywhere and letting creators reply to zaps being just a humble start for that)
For the stuff that does have a price-tag, even the most obvious answer isn’t as bad as I thought it would be:
Just put it on the button.
The bigger the sum, the bigger the button.

eCash is what can make all of this work as a one-button action anyway, removing most of the friction. A budget, on the other hand, is what can remove most of the worrying. Color indications, permission prompts for higher amounts, etc etc… can all work in tandem for this.
With these examples I’m mostly trying to give you an idea of what is still left largely unexplored territory.
A territory that we will have to go and explore in any case.
A territory where Communities are easier to monetize than Content is.
A territory where I can count the current number of active designers on both hands.
A territory that can desperately use a Design Community.

You can guess twice what’s next on my todo-list….
-
@ e131258d:e57f86e6
2024-07-22 14:13:46Quand les ordinateurs familiaux sont arrivés sur le marché, j'étais préadolescent. Mon grand-père paternel a été le premier à en parler dans la famille ; il voulait en offrir un à ses petits-enfants : un pour mon frère et moi, et un pour mes cousines. Dans ma mémoire, le choix de l’appareil ressemblait à la préparation d’une expédition en Amazonie. Mon père et mon oncle étudiaient des comparatifs et compilaient toutes les informations dans un grand tableau manuscrit format A3. On aurait dit des généraux devant une carte préparant une bataille ; des scientifiques devant la trajectoire d’une comète allant percuter la terre ; des braqueurs de haut vol devant les plans de la Banque de France ; ou des conspirateurs préparant un coup d’État… Bref, leur affaire semblait compliquée, trop complexe pour le gamin que j’étais. Je me retrouvais donc exclu de la discussion, ce qui paraît paradoxal puisque cet ordinateur était avant tout destiné à moi.
J’entendais les adultes prononcer des noms : processeur, mémoire vive, mémoire morte, disque dur… Parfois, mes oreilles se dressaient et mon sang ne faisait qu’un tour lorsqu’ils prononçaient "Amstrad CPC 6128", le nec plus ultra de l’ordinateur domestique en 1985 avec son lecteur de disquette. C’était celui que possédaient mes amis et il était réputé le meilleur pour les jeux.
Of course, my hopes were dashed, since my elders' choice fell on a TO7/70 with its light pen. The choice was justified by the fact that the Thomson was a French product and the best stocked with educational software. You can imagine my disappointment.
The TO7/70

I spent sleepless nights with that TO7/70, a little marvel of technology with its light pen and its cassette drive that took an eternity to load or save the smallest program. I spent entire weekends copying out lines of code for programs that never worked, taking shifts with my father: one moment he typed, the next it was me, while the other dictated. Nobody talked about addiction back then; at most, my mother feared I would end up epileptic, which of course never happened.
The Games
The family computer sat in the living room; having it in my bedroom was out of the question. Even though there was no internet, I could still, I don't know, play with a program that calculated least and greatest common divisors, or with a game where a car had to escape a maze, with graphics that were frightful. What I remember most is the gap between a game's packaging and the game itself, where only imagination could fill in the blanks. In terms of games, there were the ones I had copied out that never worked, the educational games my mathematics teacher had given me, and finally the only game I ever bought for my TO7/70.
After my grandfather gave us this computer, we had to buy a monitor and a piece of furniture. For the monitor, I was lucky: it doubled as a television. For the furniture, my mother opted for something rustic, in keeping with the rest of the furnishings and the big-flowered wallpaper of the 80s. Then we had to buy a headset before I could finally invest in two joysticks and one non-educational game. At the FNAC, lost in the aisle, I hesitated and finally made a bad pick. The game, whose name I've forgotten, was supposed to have me play a secret agent, but it was unplayable. I struggled for months to get past the first level, without ever succeeding.
That left me the games from my mathematics teacher, negotiated at the parent–teacher meeting in exchange for a few extra points on my average. They were nothing but math games, and the only pleasure was in trying them all, hoping to find a gem, a treasure I of course never discovered.
Programming
In the 80s, with a computer you could play or learn to program. Given that playing wasn't great, I tried my hand at programming. My graphical compositions with the Random function looked like primitive electronic art, digital art brut, the Lascaux cave of computer graphics. It was fascinating and ridiculous at the same time. For a year, I took development classes at a local youth center (MJC), but I no longer remember what I learned. I only remember that we pestered the instructor to let us play. The computers were MO5s, and the game was inspired by the film Tron. My imagination did the rest.
The PC
Frustrated by the games, limited by the hardware and the programming language, and hampered by a lack of cultural background: that is how my discovery of computing went. My cousin, more disciplined than I was, quickly moved on to the TO9 and then to his first PC, and made development his career. My graphical compositions looked rather limited by comparison. My parents, unlike his, didn't see fit to invest any further, and I lost interest in my machine. They wouldn't even buy me a floppy disk drive. With my first paychecks, around the year 2000, I bought myself a PC. For the processor I went with an AMD K6, and of course a Voodoo 3dfx accelerator card. But my parents wouldn't let me run a cable to my bedroom to connect to the internet. So I dabbled in HTML for nothing, discovering Photoshop and other software, but what I remember best are the Windows 98 crashes!
-
@ b7274d28:c99628cb
2024-07-19 02:04:30

There is a general perception surrounding Bitcoin that it is prohibitively difficult to use for the average person. This perception can even infect those who are familiar with the various tools to optimally preserve their privacy, sovereignty, and security while using Bitcoin. After all, there’s 12 words you have to keep safe, or someone might steal all of your money. There’s all these tools like hardware wallets, software wallets, running a node, coin-join, and so much more you have to figure out in order to save and transact in Bitcoin securely and privately. Not to mention everything you have to learn if you want to use Lightning in a sovereign way, such as setting up channels, managing liquidity, learning LNURL, and setting up a domain so you can have a Lightning address.
All of these things can be intimidating for those considering whether they want to adopt Bitcoin, especially when comparing it with how little is needed for how most people interact with fiat today. As long as they have a debit card with their four-digit PIN memorized, and their password to their online banking, they have access to their money, and can use it virtually anywhere. Moreover, if they mess something up, they can just go to their financial institution and get a new debit card, or have their password reset. If a fraudster gets access to their account, their bank can often credit the money back to them and change their account information. The very real danger in Bitcoin of losing your funds and having no way of getting them back is scary for a lot of people.
One answer that Bitcoiners often give is, “Don’t worry. We’re still early and these tools will get easier to use as time goes on.” All that really tells someone is, “It might be better for you to wait and see if it actually does get easier.” Ease of use is a major factor for most people, and telling them that it is something they can expect from Bitcoin in the future just makes them more likely to wait until that hypothetical future arrives before they adopt it.
This perception of Bitcoin’s difficulty is really just a matter of perspective, though. Bitcoin looks incredibly difficult to use when you compare the way most people currently interact with fiat to using Bitcoin in a sovereign and private manner. This is comparing apples to oranges, though. If you instead compare apples to apples, looking at what it would take to use fiat digitally in a sovereign and private manner, suddenly Bitcoin is not more difficult than fiat, but less.
Someone may say, “Using fiat in a sovereign and private manner is easy! Just use cash.” Let’s explore that idea a bit, though. Cash can be very useful for transacting privately and without any intermediaries, but that usefulness diminishes when you want to transact with someone who is not within a short drive from you. As soon as you want to purchase something from a merchant in another state, or even the other side of the same state, a large amount of friction is added because you must travel to one another to complete the transaction. The farther you have to travel to transact in cash, the more convenient Bitcoin’s privacy tools begin to look. Indeed, the types of transactions that cash can easily make private and sovereign are exceptionally limited.
The next thing to consider is that cash may be convenient for transacting locally in a private manner, but it is much less convenient for storing your wealth privately and securely. If you want to store large sums of cash privately, a fire rated safe is much more expensive than memorizing 12 words, and is not impenetrable. Its only benefit is in making it take longer for fire to burn your cash or for a thief to break in and steal it, and that added length of time is measured in mere minutes. Meanwhile, the amount of time it would take someone to access the Bitcoin stored in 12 memorized words is measured in billions of years and the cost to secure it is just the transaction fee. Choosing to hide your cash instead may be less expensive than a safe, but then you will need to do some research on how best to protect it from the elements, which adds complexity, expense, and physical labor that is more cumbersome than rolling dice a few times and memorizing 12 words. Regardless of the method, securing cash is more difficult than securing Bitcoin and carries less assurances against theft or destruction. These shortcomings of storing wealth in cash become even more acute if you ever need to move that wealth, especially across any national border. Hiding thousands of dollars of cash in your luggage is much more difficult than carrying 12 words in your head that preserve your access to millions of sats.
Where there is no contest between the ease of using Bitcoin vs. fiat, though, is when you attempt to use fiat digitally in a private and/or sovereign manner. While privacy in Bitcoin is certainly an involved process, digital fiat is nearly impossible to use privately and is fraught with even worse “gotchas”. Forget about trying to use digital fiat in a sovereign manner. That actually is impossible. When people say using fiat is easier than using Bitcoin, they mean that using centralized, custodial, financial institutions that are surveilling your every transaction is easier than using self-custody, sovereign, Bitcoin. That’s a completely lopsided comparison, but it’s the only one that most people can imagine making, because they have never attempted to use digital fiat in a private manner in the first place.
What we should do is compare using Bitcoin in a sovereign and private way with using digital fiat in the same way, and likewise compare using Bitcoin via custodians to using digital fiat via banks and bank alternatives. When we do so, we find that not only is Bitcoin much easier to use privately than digital fiat, but it is impossible to use the latter in a sovereign manner and custodial Bitcoin is even easier to use than custodial fiat.
Anyone can get set up on a custodial wallet, with a Lightning Address, and be zapping people on Nostr, in less than 5 minutes. When was the last time you could do anything similar when setting up a bank account? Buying anything online with a custodial wallet has far less friction than using a debit or credit card. Scan the QR code and hit send. Done. No filling out card numbers, expiration dates, and billing addresses. Do you want to send money to your friend? All you need is their Lightning Address, and even if they are using a completely different custodian than you are, they receive the funds instantly and with virtually no fee. Custodial Bitcoin on Lightning already has better ease of use than any fiat custodian can provide, and it’s only improving.
Now, admittedly, the ideal in Bitcoin is to avoid custodians if you possibly can, and that is where things begin to get complicated, adding difficulty for the average user. However, avoiding custodians is not complicated or difficult with digital fiat, it’s entirely impossible. Most of us just never consider that fact when comparing the two. Instead we put the way we commonly use our fully centralized and surveilled custodial fiat up against the way we are told we ought to use Bitcoin to preserve our sovereignty and privacy, and we conclude Bitcoin is way more difficult, when that could not be farther from the truth. Sovereign, self-custody Bitcoin is far easier than sovereign, self-custody digital fiat, considering that the latter doesn’t even exist, and custodial Bitcoin is far easier than custodial fiat.
Then, when you add back into the equation the fact that fiat is constantly stealing from you, while Bitcoin is hard money that cannot have its supply arbitrarily inflated, the comparison is no contest. Bitcoin is far superior to fiat in every way! The tools to really take personal ownership of it without custodial risk may initially be intimidating, but they are not difficult if you take them on incrementally. You don’t need to dive immediately into hardware wallets, running a node, or managing the liquidity in your Lightning channels. You can start with a basic wallet app on your phone and your first few thousand sats and venture forth from there at your own pace. Bitcoin is hard money that won’t steal the value of your time and energy from you, but it’s not difficult to learn if you approach it like anything else new you have learned before. Take your time, and get comfortable with one new idea or tool every week or two, and before you know it you’ll be the one telling your friends and family just how easy it really is.
-
@ 124b23f2:02455947
2024-07-15 15:36:19

Stacker.News is a website that may be familiar to most nostriches. In case it is not, stacker.news is effectively a forum-style website that functions similarly to Reddit, but with the very cool feature that sending sats works as the 'upvote' function. It's a great site, so check it out regardless of what I am about to break down.
One of the great features of Stacker News is that you get a custodial lightning wallet with your account. This wallet comes with a lightning address as well. Now, this alone is not unique or entirely useful. What makes Stacker News stand out is the way you can configure this wallet to interact with your self-hosted lightning stack. And I'm not just talking about the power user running their own node (although there are useful configuration options for those users), but also the lighter users on a mobile wallet, such as Mutiny, Zeus, Blixt, or Phoenix, who can use Stacker News to improve their zap-receiving game.
Before I get into how, I want to highlight two great options for receiving zaps via ln address if you are using a 'node in the phone' solution: Zaplocker, provided as a solution for users of Zeus' embedded lightning node wallet, and Lightning Box, provided as a solution for users of the Blixt embedded lightning node wallet. Here is a summary of these two ln address solutions:
- Zaplocker: Zaplocker is an ln address provided for Zeus embedded node users. The way it works is that the sats are held temporarily in Zeus' node (for up to 24 hours), and once you log into your Zeus app and access the embedded node, you can redeem them. This is made easier by setting your Zeus app to automatically redeem any sats awaiting redemption upon login, and there is also a persistent mode that helps the app stay on in the background. If the app is running in the background and you have turned on the auto-redemption feature, it will receive sats more or less automatically, much like a custodial wallet or a node running on a server.
- Lightning Box: Lightning Box is an ln address provided for users of Blixt wallet. As long as you have a channel open with the Blixt wallet node and it has inbound liquidity, you can receive sats to the Lightning Box ln address. You have to set Blixt wallet to persistent mode as part of using Lightning Box, which makes it more likely that your app will be running in the background and will successfully receive sats sent to the Lightning Box ln address. Unlike Zaplocker, there is no 24-hour redemption window: if your Blixt wallet is not running in the background when the send is attempted, you will not receive the sats.
So, these both sound like great solutions, and a pleb need not do anything else, right? Wrong. Neither is a perfect solution, the main issue being that even with persistent app settings, the apps will inevitably close, because phones seem to do that for one unknown reason or another. Or a connection will be lost, and your app will not be running to receive the sats. And if you want to run the apps over Tor, you're very likely to have the connection interrupted. Now, with Zeus you still have 24 hours to open it up, sync your node, and receive, but we want to up our game so that a user has no chance of losing zaps.
Enter Stacker News. By creating a Stacker News account (you can even do so using your Blixt or Zeus wallet to log in!), you now have an ln address you can use in place of your Zaplocker or Lightning Box address. Use that Stacker News address for receiving zaps on your Nostr account. In your Stacker News account, go to the wallet link and click 'Attach Wallets'. Next, click 'configure' under the ln address option and input your Zaplocker or Lightning Box address. If you want, you can configure Stacker News to withdraw automatically by setting the 'desired balance' to 0 sats. There is a moment of custody, but in my experience it lasts only a few seconds.
So, what have we just done, and what's the improvement? Zaps you receive now arrive at your Stacker News ln address. The improvement is that the Stacker News node is online all the time, so you do not risk missing payments, much like the reliability one can expect from, say, a Wallet of Satoshi ln address. With the configuration we made, the Stacker News wallet will automatically withdraw received sats to your custody in your Zaplocker or Lightning Box. For example: in the event that you are asleep, and your Zeus or Blixt wallet is offline for hours while nostriches across the globe are zapping the crap out of your viral post, your sats sit comfortably in your Stacker News wallet. You then fire up your Zeus or Blixt app and watch the sats flow into your custody from your Stacker News wallet.
The result of this setup is that a pleb who is not running an always-on node can enjoy a comparable receiving experience with minimal custodial exposure. I encourage any pleb, especially those using Zaplocker, given the potential harm it can cause to the Lightning network, to consider this setup. Let me know what your experience is, or, for users already leveraging the Stacker News ln address, I'd be curious to hear about your configuration. Thanks for reading; hopefully someone finds this article helpful!
#bitcoin #lightning #zeus #blixt
originally posted at https://stacker.news/items/609313
-
-
@ 9fec72d5:f77f85b1
2024-07-18 19:38:47

Each relay selects a branch from above and starts serving.
Some big machines in the top layers can handle more. Smaller machines in layers below are needed for decentralization and scalability.
Some top layer machines can act in sync only mode, efficiently distributing notes among layers.
The relay or its admin posts a special event kind to advertise the relay:
```
{
  "pubkey": "...pubkey of admin or the relay itself..",
  "kind": 30202,
  "tags": [
    ["d", "..10"],
    ["ip4", "111.222.33.44:443", "primary"],
    ["ip6", "abc:def::443", "backup"]
  ],
  ...
}
```
The above example says this relay will handle the note ids that end with the bits ..10. In this case it will handle about a quarter of the network.
The primary way of reaching this relay is through IP 111.222.33.44. There is also a backup server.
Clients can accept this advertisement based on web of trust or historical reliability of the npub. Or other npubs can measure the reliability of this relay and send reactions to this note. Clients then can see these reactions and rank these services.
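As a sketch of the matching rule described above (the helper below is illustrative, not part of the proposal), a client could check whether a note id falls into a relay's advertised shard by comparing the id's trailing bits against the `d`-tag pattern with its leading dots stripped:

```python
def id_matches_shard(note_id_hex: str, shard_bits: str) -> bool:
    """Check whether a note id's trailing bits match a relay's advertised suffix.

    shard_bits is the d-tag pattern with the leading dots removed,
    e.g. "10" from "..10". An empty pattern ("..") matches every note.
    """
    if not shard_bits:
        return True
    # Treat the 64-char hex id as a 256-bit integer and compare its low bits.
    id_bits = bin(int(note_id_hex, 16))[2:].zfill(256)
    return id_bits.endswith(shard_bits)

# An id ending in ...1111 lands in the "..11" shard but not the "..10" shard.
print(id_matches_shard("f" * 64, "11"))  # → True
print(id_matches_shard("f" * 64, "10"))  # → False
```

A relay advertising an n-bit suffix this way serves roughly a 1/2^n slice of all note ids.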
Solves:

- Possible future DNS ban issues: I don't know when or if DNS will be an issue for Nostr. The above design can help with the situation.
- Scalability: If 1 million users join the network at the same time, the machines that are handling ".." (i.e. all of the traffic) may fail. But if the clients are using relays on other layers, the load will be efficiently distributed across many machines. The failure of layers 0 and 1 will not stop the network. Every layer can operate independently of the other layers (in theory).
- Traffic efficiency: A client has to query many relays, depending on what it wants to do. It may choose to stay efficient (talk to top layers) on mobile traffic, or it may choose to help decentralization over wifi. The notes that match the queries will not be repeated as many times as in the current design, because each relay will hold only a portion of the network.
- Storage efficiency: Relay operators can save the part of the network that they are responsible for on NVMe drives, and the rest of the network on hard drives. In case of major failure the hard drives can still hold a copy.
- Speed: Since the notes will come from many different relays at the same time, there may be a slight speed increase.
- Decentralization: If the top-layer relays collude and start banning, the other layers can still continue to serve notes.
- Backup relay: In case a relay instance fails, users can find the backup server in the same note.
- Zero-downtime migration: The ability to define a backup server allows zero-downtime migrations. An operator can set the primary to the new server and the backup to the old server, do the migration, and continue without interruption.
- Efficient sync among servers: A relay has to sync with 3 servers: 1 above, 2 below. But it can do 6 or 9 depending on how much reliability it wants.
- Writing to logN relays: Clients have to write to logN relays (i.e. 1 relay in each layer) to effectively distribute their notes to everyone and also to help with decentralization.
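The one-relay-per-layer write set can be sketched as follows (the advert data structure and helper names here are my own assumptions, not part of the proposal): a client walks the layers and, in each, picks a relay whose advertised suffix matches its note id.

```python
def pick_relays(note_id_hex: str, adverts: list[dict]) -> list[dict]:
    """Pick one matching relay per layer for a given note id.

    Each advert is assumed to look like {"layer": int, "suffix": "10", "ip4": "..."},
    where "suffix" is the advertised bit pattern with the leading dots removed.
    """
    id_bits = bin(int(note_id_hex, 16))[2:].zfill(256)
    chosen = []
    for layer in sorted({a["layer"] for a in adverts}):
        for a in adverts:
            if a["layer"] == layer and id_bits.endswith(a["suffix"]):
                chosen.append(a)
                break  # one relay per layer is enough
    return chosen

adverts = [
    {"layer": 0, "suffix": "", "ip4": "1.1.1.1:443"},   # top layer serves everything
    {"layer": 1, "suffix": "0", "ip4": "2.2.2.2:443"},
    {"layer": 1, "suffix": "1", "ip4": "3.3.3.3:443"},
    {"layer": 2, "suffix": "10", "ip4": "4.4.4.4:443"},
    {"layer": 2, "suffix": "11", "ip4": "5.5.5.5:443"},
]
# 0xf...f ends in ...1111, so it matches "", "1", and "11".
print([a["ip4"] for a in pick_relays("f" * 64, adverts)])
# → ['1.1.1.1:443', '3.3.3.3:443', '5.5.5.5:443']
```

With k layers the write set stays at k relays (one per layer), which is where the logN write cost comes from.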
-
-
@ 5d4b6c8d:8a1c1ee3
2024-07-22 12:44:10

Yesterday's edition https://stacker.news/items/616182/r/Undisciplined
-
- -
July 22, 2023 📅
📝
TOP POST
I’m Henrik Skogstrom, CEO of LN Capital (Torq) AMA!
Excerpt
https://ln.capital/ https://twitter.com/LN_Capital https://twitter.com/ohskogstrom
4016 sats \ 15 comments \ @ohskogstrom \ ~bitcoin
💬
TOP COMMENT
https://stacker.news/items/212283/r/Undisciplined?commentId=212402
Excerpt
I went to a party last year where we would all be betting on the outcomes of various fighting matchups on N64's (classic) Super Smash Bros.
554 sats \ 1 reply \ @gpvansat
From Who Wants To Build Sports Betting on Lightning? by @kr in ~bitcoin
🏆
TOP STACKER
2nd place @k00b (1st hiding, presumed @siggy47)
3776 stacked \ 5791 spent \ 2 posts \ 16 comments \ 0 referrals
🗺️
TOP TERRITORY
~bitcoin
everything bitcoin related
founded by @k00b on Tue May 02 2023
24.8k stacked \ 0 revenue \ 28.3k spent \ 66 posts \ 217 comments
https://imgprxy.stacker.news/fsFoWlgwKYsk5mxx2ijgqU8fg04I_2zA_D28t_grR74/rs:fit:960:540/aHR0cHM6Ly9tLnN0YWNrZXIubmV3cy8yMzc5Ng
July 22, 2022 📅
📝
TOP POST
I'm Simon, creator of Mempool.Space. AMA.
Excerpt
Hi everyone! I am @softsimon_, creator and co-founder of the @Mempool open source project at Mempool.Space, an increasingly used and loved tool for Bitcoiners around the world and with self hosted installations on thousands of Raspberry Pi home node […]
47.3k sats \ 81 comments \ @softsimon \ ~bitcoin
💬
TOP COMMENT
https://stacker.news/items/48270/r/Undisciplined?commentId=48298
Excerpt
- I used bitcoin as an online "anonymous" currency 2013-2015 without understanding the monetary aspects. In 2016 after som austrian economics I finally started to get it and from 2017 it has been almost full time focus.
831 sats \ 0 replies \ @softsimon
From I'm Simon, creator of Mempool.Space. AMA. by @softsimon in ~bitcoin
🏆
TOP STACKER
1st place @softsimon
50k stacked \ 171 spent \ 1 post \ 28 comments \ 0 referrals
🗺️
TOP TERRITORY
~bitcoin
everything bitcoin related
founded by @k00b on Tue May 02 2023
108.1k stacked \ 0 revenue \ 114.6k spent \ 64 posts \ 303 comments
July 22, 2021 📅
📝
TOP POST
Yo, I got Strike - Jack Mallers Announcing the Bitcoin Tab
Link to https://jimmymow.medium.com/announcing-the-bitcoin-tab-133fffc5f668
11 sats \ 4 comments \ @Yo_I_got_Strike \ ~bitcoin
💬
TOP COMMENT
https://stacker.news/items/463/r/Undisciplined?commentId=534
Excerpt
love it, just needs to implement ln url withdraw
108 sats \ 1 reply \ @bro
From This is dope. by @falsefaucet in ~bitcoin
🏆
TOP STACKER
1st place @bro
5 stacked \ 9 spent \ 0 posts \ 2 comments \ 0 referrals
🗺️
TOP TERRITORY
~bitcoin
everything bitcoin related
founded by @k00b on Tue May 02 2023
122 stacked \ 0 revenue \ 225 spent \ 16 posts \ 35 comments
originally posted at https://stacker.news/items/617823
-
-
@ 1739d937:3e3136ef
2024-07-12 10:11:42

This is the third in a series of weekly updates detailing progress on bringing MLS protocol DMs and group messaging to Nostr.
Previous Updates
Progress this week
Short update post this week but I made solid progress updating several OpenMLS dependencies and building a new library that implements all the OpenMLS crypto requirements for Nostr.
HPKE-RS
I mentioned a PR last week on the `rust-hpke` library; that PR is still waiting on review. However, in the meantime, the OpenMLS library depends on a slightly different HPKE library, the `hpke-rs` library.

This library didn't support the secp256k1 curve either, so I've opened a PR to add support for secp256k1. That PR uses the `RustCrypto` library that is the default in the `hpke-rs` library. However, because this library is built to allow for swapping out the underlying crypto library (using traits), I was also able to create a new library that uses the `secp256k1` library instead of the `RustCrypto` library. This is the main crypto library that almost all Bitcoin and Nostr apps use, so it's important we use that instead of `RustCrypto`.

OpenMLS Nostr Crypto
The library that I've created (openmls-nostr-crypto) does a few things via separate sub-libraries (crates). The entire library tries to do as little as possible: only implementing the one required ciphersuite for all MLS implementations and the `secp256k1` Schnorr signatures ciphersuite I've created for Nostr. It's also using the right `secp256k1` library to do it.

- openmls-nostr-crypto: implements the crypto traits that the OpenMLS library expects to find for a given provider, and removes all the ciphersuites that we don't care about supporting.
- hpke-rs-nostr-crypto: implements the crypto traits that the hpke-rs library expects to find, again removing all the ciphersuites we don't care about supporting.
I've not yet written any tests for these libraries (which needs to be done) but I've gotten some very simple demos working with OpenMLS using this new set of libraries to handle the crypto.
I've been really impressed with the simplicity of the OpenMLS interface as well. While there is A LOT going on under the hood, the public interface that client developers have to use is clean and simple.
Onward and Upward
Next week I'll continue working on these libraries; adding tests and continuing to build out my small demo app. I've been in touch with the maintainers of the OpenMLS library and I'm hoping to get a review done with them on my PRs and on my new library.
One thing I'll call out here: The review of my library and getting them to review the direction of the project overall will be paid consulting so I'm making sure that I've got as much done as possible before scheduling that time and paying their rates. I'm a strong believer that the right conversation can save you months of wasted time, but I'm also a believer in making sure you're prepared if you're going to drop money on that conversation. 😅
-
@ dd664d5e:5633d319
2024-07-10 15:00:40

The Law Giver
In my previous article, I briefly went into the topic of how contract law is holy law.
The Father of the Christians, you see, is a God of Covenants, and He sent His Son to Earth to fulfill the Law and pay a Price for our salvation. The Father also created the Natural Order to confirm His Laws through our observation of that Law in action.
That is why Christians have a deep respect for honest contracts, true prices, fair measures, natural systems, and good laws. Not merely for their own sake, but also because understanding them helps us to understand and emulate the Law Giver.
The tired What would Jesus do? meme is actually an attempt to capture this emulation of the Highest Judge. Jesus knows the Law, since His Father defined it and He is One with the Father, so how would He apply the Law best, in this situation?
The Last Things
Working together with him, then, we entreat you not to accept the grace of God in vain. For he says, “At the acceptable time I have listened to you, and helped you on the day of salvation.”
Behold, now is the acceptable time; behold, now is the day of salvation.
-- 2 Corinthians 6:1-2 New Testament, RSV Bible
One of the things most devout Christians frequently ruminate over, is when Christ will return.
Every day, we ponder. We go for a walk, and ponder. We cook dinner, and ponder.
At the same time, we are called upon to live each day in a productive manner, and to not simply lie around, vegetating, and staring at the heavens. Not only for our own sake, but also because to do so would be to limit our ability to inform others about the Good News, so that they can take to pondering with us. We are called to ponder with as many people as we can produce, adopt, convert, or simply collect through our daily interactions.
This means that we are not of the world (as our eyes are watching God and baptism has made us Immortals), but we are definitely in the world (involved in, and effected by, the everyday dealings around us.) It is a very difficult balancing act to know when we are paying too much attention to the one or the other, or to know where to store up our treasures, if we can't put them all immediately into the Final Treasury.
So, we worked today and earned a bit of fiat cash and we have no immediate usage for it. What to do with it?
Well, some of it should go to charity or be invested in a business that provides important goods, services, and employment. Some of it will be needed to simply cover the day-to-day costs of our own life, or that of those dependent upon us. But it might be prudent to store up some of it, for the mysterious length of time between receipt of the monies and our own ascent into Heaven.
Typically, that store was the local currency, but that's being inflated away at an alarming rate. Then we all began to move to precious metals, and many of us still do, but they are so difficult to spend and can't be broken into small bits or transported over wide distances at low cost.
Enter Bitcoin.
The Great Ledger
In our frustration, more and more Christians are turning to a new technology, to save up some treasure in a liquid asset, for the mid-term. And, once we begin using it, and begin to understand how it works, we quickly grow fond of it.
Bitcoin is a beautiful money because it is a money based upon keeping precise accounts, using a fair measure, and obeying the Laws of Nature.
In essence, Bitcoin is a debt note emitted by a universal debtor (the network) in exchange for some good or service. This frees the individual people using it from ever needing to carry debt, as the debt stays in the network, and the value and enforceability of that debt note are protected by the effort used to create and maintain the network and by the eagerness with which other people wish to store their own efforts in that same network. The debt still exists, but it can be so thinly and widely spread that it no longer rests as a burden upon any one particular person.
The debt, in other words, has been disassociated from humans and the management has been distributed to machines. This is the precise opposite process of a "fiat" (by decree) currency, which only has value so long as it is associated with some particularly solvent group of humans (who personally vouch for repayment of any debts denominated in the currency), and where management is centralized to some other group of humans.
Have you accepted Bitcoin as your personal money and store of value?
You have invested $10 to buy the electricity to mine Bitcoin? Then you receive $10 of Bitcoin in return. The Bitcoin network now owes you the $10 purchasing power equivalent of that electricity.
If someone then gives you $5 worth of pizza, you can give them a $5 portion of your $10 worth of Bitcoin. You have taken a part of your Bitcoin debt note and shared it with them.
They now hold $5 worth of Bitcoin invested in the network and can spend it on some other good or service. Or they can simply hold it and wait for it to rise in value, as more people "mine" more of it (and produce more notes, paradoxically making the existing notes more useful and therefore valuable) and more people try to gain the notes in order to manage their own finances by storing their energy in the network or transporting their energy using the network.
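The accounting just described can be sketched as a toy ledger (illustrative numbers only; this models the bookkeeping idea from the example, not Bitcoin's actual UTXO mechanics):

```python
# Toy ledger: the "network" owes each participant the value of the
# energy or goods they contributed. Illustration of the accounting
# idea only, not how Bitcoin tracks value internally.

def mine(ledger, miner, electricity_cost):
    """Credit the miner with notes equal to the energy invested."""
    ledger[miner] = ledger.get(miner, 0) + electricity_cost

def pay(ledger, sender, receiver, amount):
    """Share a portion of the sender's note with the receiver."""
    if ledger.get(sender, 0) < amount:
        raise ValueError("insufficient balance")
    ledger[sender] -= amount
    ledger[receiver] = ledger.get(receiver, 0) + amount

ledger = {}
mine(ledger, "you", 10)               # $10 of electricity -> $10 of notes
pay(ledger, "you", "pizza maker", 5)  # $5 of pizza -> $5 of notes shared
print(ledger)                         # {'you': 5, 'pizza maker': 5}
```

The pizza maker can now spend their $5 note on some other good or service, with no person in the chain ever holding a personal debt.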
Bitcoin, in other words, is an accounting book that needs no accountant because it stores, tracks, and controls the ledger on its own. It is a Natural Ledger that runs according to the Laws.
It is the only human-made ledger that allows for true and immediate Final Settlement. This Final Settlement is what allows people to trade using the convenience of digital debt notes, with neither person incurring or even risking any personal debt. We Christians know that all debt is a burden, including monetary debts, which is why we are called to forgive each other's debts and to hope that our own debts are forgiven. Better still is to avoid the accumulation of debts altogether.
So, Final Settlement? Final Settlement is what Jesus would do.
-
@ 8dc86882:9dc4ba5e
2024-07-22 11:05:44

I was wondering: what are the must-have Nostr- and Bitcoin-related apps people keep on their Android phones? Anything related to either topic.
originally posted at https://stacker.news/items/617668
-
@ 4523be58:ba1facd0
2024-05-28 11:05:17

NIP-116
Event paths
Description
Event kind
`30079` denotes an event defined by its event path rather than its event kind.

The event directory path is included in the event path, specified in the event's `d` tag. For example, an event path might be `user/profile/name`, where `user/profile` is the directory path.

Relays should parse the event directory from the event path in the `d` tag and index the event by it. Relays should support "directory listing" of kind `30079` events using the `#f` filter, such as `{"#f": ["user/profile"]}`.

For backward compatibility, the event directory should also be saved in the event's `f` tag (for "folder"), which is already indexed by some relay implementations and can be queried using the `#f` filter.

Event content should be a JSON-encoded value. An empty object `{}` signifies that the entry at the event path is itself a directory. For example, when saving `user/profile/name`: `"Bob"`, you should also save `user/profile`: `{}` so the subdirectory can be listed under `user`.

In directory names, slashes should be escaped with a double slash.
Example
Event
```json
{
  "tags": [
    ["d", "user/profile/name"],
    ["f", "user/profile"]
  ],
  "content": "\"Bob\"",
  "kind": 30079,
  ...
}
```
Query
```json
{
  "#f": ["user/profile"],
  "authors": ["[pubkey]"]
}
```
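The escaping rule above (slashes inside a name are doubled) and the matching path-splitting logic can be sketched like this (helper names are my own, not part of the NIP):

```python
def escape_name(name: str) -> str:
    """Escape a single directory or entry name for use in an event path."""
    # Slashes inside a name are doubled so they are not read as separators.
    return name.replace("/", "//")

def split_path(path: str) -> tuple[str, str]:
    """Split an event path into (directory path, entry name).

    Scans backwards for the last "/" that is not part of an
    escaped "//" pair: a run of slashes with odd length contains
    one real separator.
    """
    i = len(path) - 1
    while i >= 0:
        if path[i] == "/":
            j = i
            while j > 0 and path[j - 1] == "/":
                j -= 1
            if (i - j + 1) % 2 == 1:  # odd run length -> real separator
                return path[:i], path[i + 1:]
            i = j - 1  # even run: all escaped, keep scanning left
        else:
            i -= 1
    return "", path  # no directory component

print(split_path("user/profile/name"))  # ('user/profile', 'name')
print(split_path("a//b"))               # ('', 'a//b') - escaped, one name
```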
Motivation
To make Nostr an "everything app," we need a sustainable way to support new kinds of applications. Browsing Nostr data by human-readable nested directories and paths rather than obscure event kind numbers makes the data more manageable.
Numeric event kinds are not sustainable for the infinite number of potential applications. With numeric event kinds, developers need to find an unused number for each new application and announce it somewhere, which is cumbersome and not scalable.
Directories can also replace monolithic list events like follow lists or profile details. You can update a single directory entry such as `user/profile/name` or `groups/follows/[pubkey]` without causing an overwrite of the whole profile or follow list when your client is out-of-sync with the most recent list version, as often happens on Nostr.

Using `d`-tagged replaceable events for reactions, such as `{"tags": [["d", "reactions/[eventId]"]], "content": "\"👍\"", "kind": 30079, ...}`, would make un-reacting trivial: just publish a new event with the same `d` tag and an empty content. Toggling a reaction on and off would not cause a flurry of new reaction & delete events that all need to be persisted.

Implementations
- Relays that support tag-replaceable events and indexing by arbitrary tags (in this case `f`) already support this feature.
- IrisDB client-side library: tree-like data structure with subscribable nodes.

https://github.com/nostr-protocol/nips/pull/1266
-
@ b60c3e76:c9d0f46e
2024-05-15 10:08:47

KRIS guarantees that all segments of society receive equal treatment from hospitals, in both medical and non-medical services.

To improve the quality of health services for the public, the government has just issued Presidential Regulation (Perpres) Number 59 of 2024 on Health Insurance. Through that regulation, President Joko Widodo (Jokowi) has abolished the distinction between service classes 1, 2, and 3 in the Social Security Administering Body for Health (BPJS Kesehatan).

The class-based services are replaced with KRIS (Kelas Rawat Inap Standar, the Standard Inpatient Class). Following the issuance of Perpres 59/2024, the Third Amendment to Perpres 82/2018 on Health Insurance, President Joko Widodo has instructed all hospitals partnering with BPJS Kesehatan to implement it.

The new policy takes effect from May 8, 2024, with full implementation required by June 30, 2025 at the latest. Within that period, hospitals may provide some or all inpatient services based on KRIS, according to their capabilities.

So what distinguishes the inpatient services under Perpres 59/2024 from the previous arrangement? Previously, BPJS Kesehatan inpatient care was divided into classes 1, 2, and 3. Under the new regulation, services to the public are no longer differentiated.

The inpatient service regulated in the regulation, known as KRIS, becomes the new system used for BPJS Kesehatan inpatient care in hospitals. With KRIS, all segments of society will receive the same treatment from hospitals, in both medical and non-medical services.

With the issuance of Perpres 59/2024, BPJS Kesehatan premium rates will also change. However, the regulation does not yet spell out the new premium amounts in detail. The new BPJS Kesehatan premiums are planned to be set by July 1, 2025.

"The determination of benefits, tariffs, and premiums as referred to shall be set no later than July 1, 2025," the regulation states, as quoted on Monday (13/5/2024).

This means BPJS Kesehatan premiums currently remain the same as before, according to the chosen class. The regulation nevertheless remains in force while awaiting the follow-up rules derived from it.

Hospital Readiness

Regarding this new policy of classless health services, the Ministry of Health (Kemenkes) has affirmed that the majority of hospitals in Indonesia are ready to run KRIS services for BPJS Kesehatan patients.

That readiness was conveyed by the Ministry's Director General of Health Services, Azhar Jaya. "A survey on hospital readiness for KRIS has been conducted at 2,988 hospitals, and 2,233 hospitals have responded that they meet the 12 criteria," said Azhar.

For context, KRIS replaces BPJS Kesehatan's Class 1, 2, and 3 services and aims to provide health services equally, regardless of the premium paid.

Under KRIS, hospitals must gradually prepare facilities and infrastructure in line with the 12 standard inpatient class criteria. What are those 12 KRIS criteria?

According to Article 46A of Perpres 59/2024, the criteria for KRIS inpatient facilities and services include building components that must not have high porosity, along with air ventilation and sleeping amenities.

The same applies to room lighting. The regulation stipulates artificial room lighting at a standard of 250 lux for general illumination and 50 lux for sleep lighting, and a room temperature of 20-26 degrees Celsius.

In addition, inpatient services under the regulation require facilities that separate wards by patient gender, by age (children or adults), and by infectious versus non-infectious disease.

Other criteria oblige providers to consider ward density and bed quality, provide curtains or partitions between beds, provide in-room bathrooms that meet accessibility standards, and provide oxygen outlets.

Bed amenities must include two power sockets and a nurse call per bed, plus a bedside table per bed. Ward density is capped at four beds, with at least 1.5 meters between bed edges.

Curtains/partitions must run on rails embedded in or hanging from the ceiling. Bathrooms must be inside the inpatient room and comply with accessibility standards, and oxygen outlets must be provided.

Azhar assured that the Ministry of Health will carry this out in line with its mandate. "Of course we will work together with BPJS Kesehatan on implementation and supervision in the field," said Azhar.

Regarding the health insurance regulation, BPJS Kesehatan President Director Ghufron Mukti considers it oriented toward standardizing inpatient classes around the 12 criteria. "Care follows a standard inpatient class with 12 criteria for BPJS participants; as per the doctor's oath, the provision of medical services must not be differentiated by ethnicity, religion, social status, or premium differences," he said.

If a participant wishes to be treated in a higher class, Ghufron said, that is allowed as long as the upgrade is for non-medical reasons. Article 51 of the Health Insurance Perpres regulates upgrading the class of care.

Under that article, upgrading is done by taking out supplementary health insurance or by paying the difference between the costs covered by BPJS Kesehatan and the costs arising from the service upgrade.

That difference may be paid by the participant concerned, the employer, or supplementary health insurance.

Ghufron Mukti also urged hospital managers not to reduce the number of patient beds in their efforts to meet the KRIS criteria. "My message is: do not reduce access by cutting the number of beds. Maintain the number of beds and meet the requirements of the 12 criteria," Ghufron stressed.

Author: Firman Hidranto. Editor: Ratna Nuraini/Elvira Inda Sari. Source: Indonesia.go.id
-
@ 266815e0:6cd408a5
2024-05-09 17:23:28

A lot of people are starting to talk about building a web-of-trust, and how nostr can be, or already is being, used as one.

We all know about using kind:3 following lists as a simple WoT that can be used to filter out spam. But as we all know, they don't really signal "trust"; mostly they just say "I find your content interesting".

But what about real "trust"? Well, it's kind of multi-dimensional: I could trust that you're a good developer or a good journalist but still not trust you enough to invite you over to my house. There are some interesting and clever solutions proposed for quantifying "trust" in a digital sense, but I'm not going to get into that here. I want to talk about something that I have not seen anyone discuss yet.

How is the web-of-trust maintained? Or, more precisely, how do you expect users to update the digital representation of the "trust" of other users?

It's all well and good to think of how a user would create that "trust" in another user when discovering them for the first time. They would click the "follow" button, or maybe even rate them on a few topics with a 1-5 star system. But how will a user remove that trust? How will they update it if things change and they trust that person less?

If our goal is to model "trust" in a digital sense, then we NEED a way for the data to stay up-to-date and as accurate as possible; otherwise, what's the use? If we don't have a frictionless way to update or remove the digital representation of "trust", then we will end up with a WoT that grows continuously and in which everyone is rated 10/10.

In the case of nostr kind:3 following lists, it's pretty easy to see how these get updated. If someone posts something I dislike, or I notice I'm getting bored of their content, I just unfollow them. An important part here is that I'm not thinking "I should update my trust score of this user" but instead "I'm no longer interested, I don't want to see this anymore".

But that is probably the easiest "trust" to update, because most of us on social media spend some time curating our feeds and are used to doing it. But what about the more obscure "trust" scores? What's the regular mechanism by which a user would update the "honesty" score of another user?

In the real world it's easy: when I stop trusting someone, I simply stop associating with them. There isn't any button or switch I need to update; I simply don't talk to them anymore. It's frictionless. But in the digital realm I would have to remove or update that trust. In other words, it's an action I need to take instead of an action I'm not taking, and actions take energy.

So how do we reflect, in the digital world, something that takes no energy and is almost subconscious in the real world?

TLDR; webs-of-trust are not just about scoring other users once. You must keep the score up-to-date.
-
@ 502ab02a:a2860397
2024-07-22 07:31:57

A quick preamble: this issue is the rough-and-ready version. We won't get to know the app in detail; we'll run first and walk later, haha.

The rough steps: 1. download, 2. run, 3. go.

No need to explain what the "purple field" (Nostr) is anymore; go read the other notes for that. We'll start right here, with which app to use. This note is for the Android branch, and starting with the app called Amethyst is the most convenient: just tap, tap, tap and you're done.

Start by opening the Play Store and searching for "amethyst" (or just "nostr" if you don't feel like typing). Once it's downloaded you'll get an icon; tap it to open the app.

The app will then tell you that you're not logged in yet. Of course, we only just downloaded it. Tap where it says "Sign up" ("ลงทะเบียน"), as shown in the picture. I've set the system to display Thai here so that relatives who aren't tech-savvy can follow along easily; if your system is in English, just match the buttons.

The app then takes you to a registration page with only 3 steps: 1. Enter the name you want, whether your real name or a handle, whatever suits you. 2. Tick to accept the terms. 3. Tap the "Create account" button.

Done!!!!!! That's it. You now have a Nostr account that represents you in the purple field. 1, 2, 3 and done. Easy, right?

Basic usage: what matters in the purple field isn't posting to look cool, it's conversation and being part of the community. So let's start by greeting the field. Remember the important thing called a hashtag (#)? The basic one we use is #siamstr, which gathers the Thai population in the purple field. Let's make our first post: tap the round + button at the bottom, as in the picture. Yes, the round one with the feather, like an old-fashioned quill pen dipped in ink.

Then it's just 2 steps: 1. Type whatever message you like, followed by #siamstr, so that anyone following the #siamstr hashtag sees it too. If you can't think of anything, type "เฮียโต้งสั่งลุย #siamstr" ("Uncle Tong says go! #siamstr") and the purple-field folks will give you a warm welcome, hahaha. 2. Tap the "Post" button to send the message out into the purple field.

Done. In 2 steps you've sent your message out.

Once posted, the message appears on your feed, just like on Facebook.

Notice that the hashtag is colored and clickable. Tapping it shows you every message in the purple field carrying that hashtag. Now you can see why putting a hashtag in your post is useful. And if you tap the "Follow" button, the app will follow that hashtag and show it on your feed every time you open the app.

Responses from the purple field: at first not many people will talk to you, which is normal when you don't know each other yet, so you may start by seeing emoji reactions. As in the picture: after I posted with the hashtag, fellow purple-field friends greeted me with emojis. If you want to know who sent which emoji, tap the arrow as shown in the picture; it will reveal, for example, the 2 people who reacted. You can tap their avatar to go read the posts they've written, and send emojis to their posts in return.

That's roughly it. Next to the emoji there are 2 icons worth trying right away: 1. The speech balloon is for replying, what Facebook calls a comment. You can reply to your own posts or comment on other people's. 2. The circular-arrows icon is for sharing, Facebook's "share". It lets you choose between sharing the post as-is or sharing it with your own comment attached, just like Facebook.

Today we've covered the very basics of using the app and posting in the Nostr community. Next come the smaller details, things that aren't so critical that you couldn't use Nostr without them, which I won't explain today. Because, as I said, the goal is to jump in and play first, then learn while gaining experience; otherwise you'd never try it, hahaha.

* A note left here for reference: remember "two keys, one identity"? The Nostr Public Key you hand out to others, and the Nostr Secret Key nobody else may ever know.

If we tap our avatar picture, the app shows the settings menus. Item 1, when tapped, shows your Nostr Public Key, or npub. Item 2, when tapped, opens the menu where the system copies your Nostr Secret Key, or nsec, into your device's memory.

Here's what you see after tapping the Profile menu (item 1): your profile details appear, and you can see the word npub there. What's shown is only part of it, though. If you want friends to follow you, tap the overlapping-squares button (circled in my screenshot) to copy the full public key to the clipboard, then "paste" it into whatever message box you want to share it through: a LINE message, a Facebook post, email, SMS, or any chat. It's a string of characters starting with npub followed by a long sequence, e.g. npub35465432435 (the key sequence is unique to you; everyone's is different).

One more thing: if you tap item 2, the app brings you to this screen. If you read the app's message, it says this is a secret string you must tell no one, and that you must copy it somewhere safe, because it is the secret key, the Nostr Secret Key (nsec) we've already learned about.

Here, tap the pill-shaped button that says "Copy my secret key" and the app will copy the secret key string into your device's memory. Likewise, "paste" it somewhere to store it, or keep it in any place you trust where no one else can see or copy it, because this is your identity in the Nostr purple-field world.

From here you can try talking with the purple-field folks and start learning the various menus and customs: adding pictures, setting your bio/profile, attaching a Bitcoin wallet, zapping (tipping) messages and work you like, and many other toys to gradually figure out. But hey, you can already talk to the purple-field folks. Try asking them; many are happy to give advice.

Don't overthink it, just play. Jump in the water and then learn to swim; run first, walk later, hahaha.
-
@ 124b23f2:02455947
2024-07-10 05:24:33

I'm reposting this in order to fix the images and get it correctly onto my npub.pro site.
What is this ?
My Futurebit Apollo Miner in action
Futurebit is a company that manufactures home miners for retail users (see futurebit.io) - you can check out details on their website. These are targeted at individuals that want to participate in mining with an easy to use, quiet miner that does not use a ton of electricity.
The idea here is to try and stay true to the original bitcoin concept of a computer which runs the bitcoin software and mines bitcoin, all in one. While mining has obviously evolved into a commercial industry and the mining process has long been separated from running a bitcoin node, I think there are still some valid reasons to want to participate in the mining process and attempt to keep this original ethos alive:
- You learn a lot about bitcoin from mining. Reading about the process is one thing, but actually getting a machine running, pointing it towards the pool of your choice, and seeing the bitcoin process from this perspective is something I've found very enjoyable (mining is probably my favorite part of bitcoin, if I had to choose one). It can also be a gateway drug to more advanced mining, so...heavy care.
- It is marginally valuable to the network. Sure, one person running a futurebit miner is nothing compared to the hashrate of MARA. But, if you have tens of thousands of plebs all running their own miners from home, pointing their hashpower to the pool of their choice, it does provide a legitimate decentralizing anchor to large mining actors. Every pleb should be participating in mining, in my opinion, and the Futurebit Apollo miner offers a casual and unobtrusive avenue to do just that.
Now, the Futurebit Apollo Miner you most commonly see is the 'Full Package.' The Full Package is an Orange Pi computer running on a custom linux distribution. You basically boot it up, and it starts running bitcoin core and mining in a few clicks. That is very cool, and I did own one. However, I will say that the Full Package is, unfortunately, not a great product. I won't go into too much detail, but it's a very poor-quality computer, the security upkeep of the custom linux distribution is lacking, and I found connecting to the node for use with wallets to be very difficult. The mining was also consistently interrupted by the pi freezing up. You are better off going with a different option for running a bitcoin node.
This is simply to say, I dislike the Full Package Futurebit Apollo Unit. With that said, I very much like the 'Standard Unit'. The Standard Unit is simply a mining hashboard without the computer OS that comes with the Full Package. I've been running two Standard Units in my house for 18 months now, and they just quietly hum along; I barely notice them, and they rarely need to be restarted (honestly, I can't think of more than 3 times I've needed to troubleshoot, and even then it was typically just restarting the miner). They are a great way to make your personal contribution to the security of the bitcoin network.
(Please Note: These things are pretty pricey brand new, so don't even think about buying one as a profitable choice. You can sometimes find decent deals for them used, but buyer beware; I can't speak to that experience. This is about hands-on learning about bitcoin mining and participating in the security of the network with a machine that won't drive your family crazy and that I believe is well-built enough to last you a long time. Heck, you could even set one up at your office and I doubt anyone would notice, and mine with free electricity.)
So, that's some background. Now, a standard unit is not the most friendly user experience if you are not used to using command line. I wasn't, but I spent a lot of time figuring it out. With this guide, any user should be able to get a Standard Unit hashing :) Here is the step by step process guide on how to run a standard unit, in use with both a Windows OS computer and a Linux OS computer (Ubuntu is what I use)...enjoy.
I. Linux
Your Futurebit Apollo Standard Unit (referred to from here as 'the Unit') miner comes with a power supply and a usb cable. It's pretty straight forward, but you will want to connect the Unit to the computer of your choice (the computer needs to be on and running while you are mining, so a low power consumption computer and/or a computer that you are already running all the time for some other reason would be best). Also, plug in the power supply to your unit and turn it on (there are instructions in the Unit's manual for how to hook up the power supply correctly).
- Go to https://github.com/jstefanop/Apollo-Miner-Binaries
- You'll land on a page that looks like this:
For the purposes of this guide, I am going to be using the 'linux-x86_64' package as it is most appropriate for my computer. But, you would need to select the appropriate binaries for your computer. Odds are, if you are running ubuntu on your laptop or desktop, you are going to use the 'linux-x86_64' package.
- Once you have clicked on the linux-x86_64 folder, you will land here:
Open each of these 4 files and download them using the 'Download Raw File' link in each file's page. Here is an example:
Once you have downloaded, I'd recommend making a folder and putting all 4 of the files into that folder. I am going to make a folder with the name 'Mining Files'. I am also going to move the folder out of my Downloads folder (not necessary, but I'd recommend as it's not a great place to park a folder you want to hold onto). I am going to move it to my Documents Folder:
- Now, if you do not have a mining pool account, you will need to make one; otherwise, you should be able to use your existing pool credentials in the next step. Alternatively, you can solo mine using solo.ckpool.org with only a btc address. However, for the purposes of this guide I am going to assume that you are using Braiins pool. If necessary, go ahead and make a login at braiins.com/pool. Note the username and password as we will need them later.
- From the files that you downloaded in step 3, we are going to double-click on 'start_apollo.sh' and open it up in Text Editor. The first 60 lines or so are explanations of how to prepare this script for your use. However, I am going to run you through all the options to update, so you shouldn't need to read the description. Instead, we are going to focus on the text in black font at the bottom of the file:
We need to update the info in the following fields as follows
-host = stratum.braiins.com
-port = 3333
-user = username for your Braiins login. I'll use "braiinsexample" for now.
-pswd = password for your Braiins login. I'll use "Password123" for now.
-comport = most likely "/dev/ttyACM0", "/dev/ttyACM1", or "/dev/ttyACM2"
-brd_ocp = 48
-osc = 30
-ao_mode = 1
So, when our file is set up correctly, it will look like:
Make sure to save, then close out of text editor.
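Put together, the invocation line at the bottom of start_apollo.sh ends up looking something like the following (a sketch only: the binary name and exact flag order may differ in your download, and the values are the examples from above):

```
./apollo-miner -host stratum.braiins.com -port 3333 \
  -user braiinsexample -pswd Password123 \
  -comport /dev/ttyACM0 -brd_ocp 48 -osc 30 -ao_mode 1
```

If the miner fails to connect, the comport value is the usual culprit; try ttyACM1 or ttyACM2 instead.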
- Now, start up terminal on your computer. First, we are going to want to get into the correct directory. In our case, our file is in Documents > Mining Files. So, you would input the following command:
cd ~/Documents/"Mining Files"
Once you are in the correct directory, we will run the following command:
sudo ./start_apollo.sh
You will likely be prompted to enter your computer's password, input the password and, if all goes well, you should see a result like this in your terminal:
You are now mining! The settings we use run the Unit in 'eco mode', which should yield 2 terahash per second of mining power (you can monitor performance from your mining pool's dashboard). As long as you keep this terminal window open and the computer connected to the miner powered on, you will be mining away :)
II. Windows
Your Futurebit Apollo Standard Unit (referred to from here as 'the Unit') miner comes with a power supply and a usb cable. It's pretty straight forward, but you will want to connect the Unit to the computer of your choice (the computer needs to be on and running while you are mining, so a low power consumption computer and/or a computer that you are already running all the time for some other reason would be best). Also, plug in the power supply to your unit and turn it on (there are instructions in the Unit's manual for how to hook up the power supply correctly).
- Go to https://github.com/jstefanop/Apollo-Miner-Binaries
- You'll land on a page that looks like this:
For Windows, you will click on the folder titled 'msvc19'. You'll see 2 files to download, go ahead and click on each and click 'Download Raw File' for each.
Please Note: Windows may see this file as a virus, so you may need to 'allow on device' in your virus & threat protection of your windows security service.
- Once you have downloaded each file, I'd recommend making a folder and putting all of the files into that folder. I am going to make a folder with the name 'Mining Files'. I am also going to move the folder out of my Downloads folder (not necessary, but I'd recommend as it's not a great place to park a folder you want to hold onto). I am going to move it to my Documents Folder:
- Now, if you do not have a mining pool account, you will need to make one; otherwise, you should be able to use your existing pool credentials in the next step. Alternatively, you can solo mine using solo.ckpool.org with only a btc address. However, for the purposes of this guide I am going to assume that you are using Braiins pool. If necessary, go ahead and make a login at braiins.com/pool. Note the username and password as we will need them later.
- From the files that you downloaded in step 3, we are going to right-click on 'start_apollo' and select 'edit'. The file will open up in notepad. The first 60 lines or so are explanations of how to prepare this script for your use. However, I am going to run you through all the options to update, so you shouldn't need to read the description. Instead, we are going to focus on the text in black font at the bottom of the file:
We need to update the info in the following fields as follows
-host = stratum.braiins.com
-port = 3333
-user = username for your Braiins login. I'll use "braiinsexample" for now.
-pswd = password for your Braiins login. I'll use "Password123" for now.
-comport = go to the 'Device Manager' service on your computer, expand the 'Ports' section, and you should find a 'USB serial device' with a COM number. Update with your COM# (COM1, COM2, COM3, etc.).
-brd_ocp = 48
-osc = 30
-ao_mode = 1
So, when our file is set up correctly, it will look like:
- Now that the file is ready, save the file and exit out of notepad. You should be able to simply double-click on the file. The command prompt services will pop-up on your desktop, and should reflect this, indicating your connection is successful and mining has begun:
You are now mining! The settings we use run the Unit in 'eco mode', which should yield 2 terahash per second of mining power (you can monitor performance from your mining pool's dashboard). As long as you keep this command prompt window open and the computer connected to the miner powered on, you will be mining away :)
originally posted at https://stacker.news/items/603076
-
@ 266815e0:6cd408a5
2024-04-22 22:20:47

While I was in Madeira with all the other awesome people at the first SEC cohort, there were a lot of discussions around data storage on nostr and whether it could be made censorship-resistant.
I remember lots of discussions about torrents, hypercore, nostr relays, and of course IPFS
There were a few things I learned from all these conversations:
- All the existing solutions have one thing in common: a universal ID of some kind for files
- HTTP is still good. we don't have to throw the baby out with the bath water
- nostr could fix this... somehow
Some of the existing solutions work well for large files, and all of them are decentralized in some way. However, none of them seem capable of serving up cat pictures for social media clients. They all have something missing...
An Identity system
An identity system would allow files to be "owned" by users. And once files have owners, servers could start grouping files under a single owner instead of treating them as 1000+ loose files.
This can also greatly simplify the question of "what is spam" for a server hosting (or seeding) these files, since it could simply keep a whitelist of owners (and maybe their friends).
What is blossom?
Blossom is a set of HTTP endpoints that allow nostr users to store and retrieve binary data on public servers using the sha256 hash as a universal id
What are Blobs?
Blobs are chunks of binary data. They are similar to files but with one key difference: they don't have names.
Instead blobs have a sha256 hash (like
b1674191a88ec5cdd733e4240a81803105dc412d6c6708d53ab94fc248f4f553
) as an IDThese IDs are universal since they can be computed from the file itself using the sha256 hashing algorithm ( you can get a files sha256 hash on linux using:
sha256sum bitcoin.pdf
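The same ID can be computed in any language. Here is a minimal sketch in Python; the chunked reading and the file name are illustrative choices, not part of the protocol:

```python
import hashlib

def blob_id(path: str) -> str:
    """Compute a blob's universal ID: the sha256 hash of its raw bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large blobs don't need to fit in memory
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# The same hash over in-memory bytes, for comparison:
assert hashlib.sha256(b"hello").hexdigest() == \
    "2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824"
```

Because the ID is derived from content alone, any two parties hashing the same file will arrive at the same blob ID.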
How do the servers work?
Blossom servers expose four endpoints that let clients and users upload and manage blobs:

- `GET /<sha256>` (optional file `.ext`)
- `PUT /upload` - Authentication: signed nostr event - Returns a blob descriptor
- `GET /list/<pubkey>` - Returns an array of blob descriptors - Authentication (optional): signed nostr event
- `DELETE /<sha256>` - Authentication: signed nostr event
What is Blossom Drive?
Blossom Drive is a nostr app built on top of blossom servers and allows users to create and manage folders of blobs
What are Drives
Drives are just nostr events (kind `30563`) that store a map of blobs and what filename each should have, along with some extra metadata.

An example drive event would be:
```json
{
  "pubkey": "266815e0c9210dfa324c6cba3573b14bee49da4209a9456f9484e5106cd408a5",
  "created_at": 1710773987,
  "content": "",
  "kind": 30563,
  "tags": [
    ["name", "Emojis"],
    ["description", "nostr emojis"],
    ["d", "emojis"],
    ["r", "https://cdn.hzrd149.com/"],
    ["x", "303f018e613f29e3e43264529903b7c8c84debbd475f89368cb293ec23938981", "/noStrudel.png", "15161", "image/png"],
    ["x", "a0e2b39975c8da1702374b3eed6f4c6c7333e6ae0008dadafe93bd34bfb2ca78", "/satellite.png", "6853", "image/png"],
    ["x", "e8f3fae0f4a43a88eae235a8b79794d72e8f14b0e103a0fed1e073d8fb53d51f", "/amethyst.png", "20487", "image/png"],
    ["x", "70bd5836807b916d79e9c4e67e8b07e3e3b53f4acbb95c7521b11039a3c975c6", "/nos.png", "36521", "image/png"],
    ["x", "0fc304630279e0c5ab2da9c2769e3a3178c47b8609b447a30916244e89abbc52", "/primal.png", "29343", "image/png"],
    ["x", "9a03824a73d4af192d893329bbc04cd3798542ee87af15051aaf9376b74b25d4", "/coracle.png", "18300", "image/png"],
    ["x", "accdc0cdc048f4719bb5e1da4ff4c6ffc1a4dbb7cf3afbd19b86940c01111568", "/iris.png", "24070", "image/png"],
    ["x", "2e740f2514d6188e350d95cf4756bbf455d2f95e6a09bc64e94f5031bc4bba8f", "/damus.png", "32758", "image/png"],
    ["x", "2e019f08da0c75fb9c40d81947e511c8f0554763bffb6d23a7b9b8c9e8c84abb", "/old emojis/astral.png", "29365", "image/png"],
    ["x", "d97f842f2511ce0491fe0de208c6135b762f494a48da59926ce15acfdb6ac17e", "/other/rabbit.png", "19803", "image/png"],
    ["x", "72cb99b689b4cfe1a9fb6937f779f3f9c65094bf0e6ac72a8f8261efa96653f5", "/blossom.png", "4393", "image/png"]
  ]
}
```
There is a lot going on but the main thing is the list of "x" tags and the path that describes the folder and filename the blob should live at
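To illustrate, a drive event like the one above can be flattened into a path-to-blob map with a few lines of Python. This is a hedged sketch; the field order in each "x" tag (hash, path, size, mime type) is taken from the example event:

```python
def drive_files(event: dict) -> dict[str, dict]:
    """Map each file path in a kind-30563 drive event to its blob info."""
    files = {}
    for tag in event.get("tags", []):
        if tag and tag[0] == "x" and len(tag) >= 5:
            sha256, path, size, mime = tag[1:5]
            files[path] = {"sha256": sha256, "size": int(size), "type": mime}
    return files

# A trimmed-down drive event (hash shortened for readability):
drive = {"kind": 30563, "tags": [
    ["name", "Emojis"],
    ["x", "303f018e...", "/noStrudel.png", "15161", "image/png"],
]}
print(drive_files(drive)["/noStrudel.png"]["size"])  # 15161
```

A client would then fetch each blob from a server by its sha256 and present it under the mapped path.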
If you're interested, the full event definition is at github.com/hzrd149/blossom-drive
Getting started
Like every good nostr client, it takes a small instruction manual to use it properly. So here are the steps for getting started:
1. Open the app
Open https://blossom.hzrd149.com
2. Login using extension
You can also login using any of the following methods via the input:
- NIP-46 with your https://nsec.app or https://flare.pub account
- a NIP-46 connection string
- an `ncryptsec` password-protected private key
- an `nsec` unprotected private key (please don't)
- a bunker:// URI from nsecbunker

3. Add a blossom server
Right now `https://cdn.satellite.earth` is the only public server that is compatible with blossom drive. If you want to host your own, I've written a basic implementation in TypeScript at github.com/hzrd149/blossom-server

4. Start uploading your files
NOTE: All files uploaded to blossom drive are public by default. DO NOT upload private files
5. Manage files
Encrypted drives
There is also the option to encrypt drives using NIP-49 password encryption, although it's not tested at all, so don't trust it, verify
What's next?
I don't know, but I'm excited to see what everyone else on nostr builds with this. I'm only one developer at the end of the day, and I can't think of everything.
Also, all the images in this article are stored in one of my blossom drives, here:
nostr:naddr1qvzqqqrhvvpzqfngzhsvjggdlgeycm96x4emzjlwf8dyyzdfg4hefp89zpkdgz99qq8xzun5d93kcefdd9kkzem9wvr46jka
-
@ a012dc82:6458a70d
2024-07-22 06:27:34
Table of Contents

- What are Cryptocurrencies?
- Legal Issues Surrounding Cryptocurrencies
- Laws and Regulations Governing Cryptocurrencies
- Conclusion
- FAQ
Since the creation of Bitcoin in 2009, cryptocurrencies have been growing in popularity and use. However, as with any new technology, the legal implications of cryptocurrencies are still being explored and developed. In this article, we will discuss some of the legal issues surrounding cryptocurrencies and the laws and regulations that govern their use.
What are Cryptocurrencies?
Cryptocurrencies are digital or virtual tokens that use encryption techniques to secure and verify transactions and control the creation of new units. Bitcoin, the first and most well-known cryptocurrency, uses a decentralized system where transactions are recorded on a public ledger called the blockchain. Other popular cryptocurrencies include Ethereum, Litecoin, and Ripple.
Legal Issues Surrounding Cryptocurrencies
The legal issues surrounding cryptocurrencies can be divided into three main categories: regulatory, criminal, and civil.
Regulatory Issues
Regulatory issues involve the laws and regulations that govern the use of cryptocurrencies. One of the biggest regulatory issues is the lack of clarity surrounding how cryptocurrencies should be classified under existing laws. Some countries consider cryptocurrencies as a form of property, while others see them as a form of currency. This can have significant implications for how cryptocurrencies are taxed and regulated.
Criminal Issues
Criminal issues involve the use of cryptocurrencies for illegal activities, such as money laundering, drug trafficking, and terrorism financing. Due to the anonymity and lack of regulation surrounding cryptocurrencies, they have become a popular choice for criminals looking to move money around the world without being detected. Governments and law enforcement agencies are working to develop ways to prevent the use of cryptocurrencies for criminal activities.
Civil Issues
Civil issues involve disputes between individuals or businesses involving cryptocurrencies. For example, if a person loses their cryptocurrency due to a hack or theft, they may seek legal recourse to recover their funds. Additionally, disputes may arise over the ownership and transfer of cryptocurrencies.
Laws and Regulations Governing Cryptocurrencies
The laws and regulations governing cryptocurrencies vary from country to country. In some countries, cryptocurrencies are completely banned, while in others they are heavily regulated. The following are some examples of laws and regulations that apply to cryptocurrencies in different countries:
The United States
The Internal Revenue Service (IRS) considers cryptocurrencies as property for tax purposes. Additionally, the Securities and Exchange Commission (SEC) has established guidelines for when cryptocurrencies may be considered securities and subject to securities laws.
Japan
Japan has recognized cryptocurrencies as legal tender and has implemented regulations to protect consumers and prevent money laundering.
China
China has banned the use of cryptocurrencies and has cracked down on cryptocurrency exchanges and initial coin offerings (ICOs).
European Union
The European Union has established regulations for cryptocurrency exchanges and wallet providers to prevent money laundering and terrorist financing.
Conclusion
Cryptocurrencies are a relatively new technology, and the legal landscape surrounding them is still evolving. Governments and law enforcement agencies are working to develop laws and regulations that protect consumers and prevent criminal activity. As an individual or business using cryptocurrencies, it is important to understand the legal issues surrounding their use and to comply with the laws and regulations that apply to you.
FAQ
What are some examples of laws and regulations governing cryptocurrencies? Examples of laws and regulations governing cryptocurrencies include tax laws, securities laws, and regulations to prevent money laundering and terrorist financing.
Why is it important to understand the legal issues surrounding cryptocurrencies? It is important to understand the legal issues surrounding cryptocurrencies to comply with applicable laws and regulations and to protect yourself from legal and financial risks.
Can I sue someone over a dispute involving cryptocurrencies? Yes, you can sue someone over a dispute involving cryptocurrencies, such as a dispute over ownership or transfer of cryptocurrencies.
What should I do if I am involved in a legal issue related to cryptocurrencies? If you are involved in a legal issue related to cryptocurrencies, it is important to consult with a lawyer who has experience in this area of law. They can help you understand your legal options and protect your rights.
That's all for today
If you want more, be sure to follow us on:
NOSTR: croxroad@getalby.com
Instagram: @croxroadnews.co
Youtube: @croxroadnews
Store: https://croxroad.store
Subscribe to CROX ROAD Bitcoin Only Daily Newsletter
https://www.croxroad.co/subscribe
DISCLAIMER: None of this is financial advice. This newsletter is strictly educational and is not investment advice or a solicitation to buy or sell any assets or to make any financial decisions. Please be careful and do your own research.
-
@ 124b23f2:02455947
2024-07-10 04:54:00
I'm reposting this article I wrote some time ago to fix its images on my npub.pro site
In my previous post, I explained how to use your getalby ln address to receive zaps directly to your LND node. Similarly, there is an additional option that one can utilize to receive zaps directly to your lightning node: lnaddress.com.
Lnaddress.com is a federated lightning address server that you can use to create a custom ln address. Unlike using getalby, lnaddress.com can be used with any lightning implementation (not just LND). For the purposes of this write-up, I am going to use LNBits to connect an lnaddress.com lightning address with my node. And as will be the case with most of my write-ups, I am going to be using Start9 OS, so users of that OS will likely find this write-up most useful, but I'm sure people using other node interfaces can infer how to complete this set up as well.
With that said, let's dive into the step-by-step on how to create your own custom ln address with lnaddress.com and set it up to receive zaps directly to your lightning node:
*Users should have lnbits set up with their lightning node before proceeding.
- Go to lnaddress.com. Input your desired username, select 'Node Backend Type' = LNBits, and if necessary check the box 'This is a new lightning address'. Keep this page open in one tab as we will be returning to it to input info.
- From your Start9 OS services page, go to your LNBits service. Open the 'Properties' page, and in a new tab, open the (Tor) Superuser Account. Page will look like this:
From this LNbits page, you can choose to 'add a new wallet' and use that wallet instead of your superuser account. That is up to you but the steps will be the same.
- Now, we need to grab the info needed for the 'Host (Protocol + IP or Domain + Port)' field on the lnaddress.com page. On the lnbits page, expand the 'Api Docs' field, and the 'Get Wallet Details' field found on the right hand side menu. In this 'Get Wallet Details' section, you will want to copy some of the URL found in the 'curl example' section. Copy 'http://xxxxxx.onion' (don't copy any more!), and paste this into the 'Host (Protocol + IP or Domain + Port)' field found on the lnaddress.com page.
- Next, we need to grab the key for your lnbits wallet. From the lnbits page, expand the 'Api Docs' section found on the right hand side menu. Copy the 'Invoice/read key' (make sure to use the invoice/read key and not your Admin key), and paste it into the key field found on the lnaddress.com page. Upon pasting in that last piece of info, click 'submit' at the bottom of the page.
- If all info was input correctly, your connection will be successful. If successful, you will be brought to a page that looks like this:
You will want to save this secret PIN in case you need to update info in your ln address. You'll also find a test lightning invoice of 1 sat. Using a wallet not connected to the node we connected to our new ln address, you can test the ln address out by paying the 1 sat test invoice.
Users of Start9 OS might find the following info particularly useful: This ln address via lnaddress.com comes with a couple advantages aside from self-custodial zap receiving:
- One, you can have a custom ln address username to go with your nym or nostr username. Users of Start9 may be familiar with the ln address one can generate in the btcpay server service. This ln address is not customizable.
- Two, if you are running a tor only lightning node, you will be able to receive zaps from both tor and clear net lightning nodes. Users of Start9 may be familiar with the ln address one can generate in the btcpay server service. This ln address can only receive zaps from other tor nodes and can't receive zaps from clear net nodes.
That is it, you should now be all set up with your new ln address hosted on lnaddress.com, and you should be all ready to receive zaps or lightning payments of any kind :)
originally posted at https://stacker.news/items/603061
-
@ 4c20ea71:2ce9517f
2024-07-22 06:04:17
The Versace logo, with its image of Medusa's head and her snakes, is one of the most recognizable symbols in the fashion world. Behind this image lies a fascinating story, full of mystery.

The beginning of a legend:

In 1978, the Italian designer Gianni Versace founded his brand. He was looking for a logo that would be bold, luxurious, and memorable. To that end, he turned to one of the legends of ancient Greece: Medusa.

Who is Medusa?

Medusa was a beautiful woman who, as a punishment from the goddess Athena, had her hair turned into snakes; anyone who looked into her eyes was turned to stone.

Why did Versace choose Medusa?

Versace's choice of Medusa had several reasons. Some believe he was captivated by Medusa's legendary power and beauty. Others believe Versace wanted to use this symbol to showcase his brand's power of seduction and allure.

Meanings of the Medusa symbol:

The Versace logo is interpreted in different ways. Some see it as a symbol of power and boldness, while others see it as a sign of allure and temptation.

Evolution of the logo:

Over time, the Versace logo has undergone minor changes. Initially, the image of Medusa was gold; it was later changed to black with a gold border.

The logo's impact:

The Versace logo has become one of the most recognizable symbols in the fashion world. It appears on clothing, bags, perfumes, and the brand's other products, and it has added to Versace's identity and prestige around the world.

A story beyond fashion:

The Versace logo is more than a simple commercial symbol. It tells a story of Greek mythology, power, and beauty, which is why it has become so popular.

Interesting facts:

Some believe that the head of Medusa in the Versace logo resembles the face of Gianni Versace himself. The Versace logo has been widely imitated and counterfeited over the years. In 2007, the Versace fashion company sued Gucci for using an image resembling Medusa in one of its advertising campaigns.
-
@ 69a0a091:c968228d
2024-07-07 19:10:07
Radicale is a self-hosted calendar and contact solution that is a "lightweight solution, easy to use, easy to install, easy to configure."
https://radicale.org/v3.html#about
I can finally dump my NextCloud, which was a sprawling mess of PHP scripts. Managing a NextCloud instance over a long period of time requires the sysadmin to be mindful of the stateful configuration, which can only be upgraded one major version at a time.
I stumbled upon Radicale a couple years ago, right after I had spent an entire day writing a playbook to build a dockerless NextCloud container. I should have switched immediately; instead, I've wasted numerous hours over that period making sure my container was up to date, or needlessly fiddling with settings just to feel like I had a handle on how the thing was going to operate.
Running Radicale on NixOS (note the option namespace is `services.radicale`, not `service.radicale`):

```nix
services.radicale = {
  enable = true;
  settings = {
    server = {
      hosts = [ "127.0.0.1:5232" "[::]:5232" ];
    };
    auth = {
      type = "htpasswd";
      htpasswd_filename = "/etc/radicale/users";
      htpasswd_encryption = "bcrypt";
    };
    storage = {
      filesystem_folder = "/var/lib/radicale/collections";
    };
  };
};
```
Nginx configuration snippet:

```nix
locations."/radicale/" = {
  proxyPass = "http://127.0.0.1:5232/";
  extraConfig = ''
    proxy_set_header X-Script-Name /radicale;
    proxy_pass_header Authorization;
  '';
};
```
Creating my account after `nixos-rebuild switch`:

`nix-shell -p apacheHttpd --run "htpasswd -B -c /etc/radicale/users pleb"`
The web interface is dead simple and lets you create, import, or export calendars or address books. I exported the vcf address book and ics calendar from my NextCloud instance and imported them into Radicale. The UI gives you URIs for the calendar and contacts, and I pasted those in along with my username in the relevant add dialog in Thunderbird.
Likewise, I added the account in davx5 on Android using the first "Login with URL and user name" option. For the URL this time, I appended my username, so it was `https://example.com/radicale/myusername`, and davx5 was able to sync both the calendar and the contacts from that endpoint.
@ f977c464:32fcbe00
2024-01-30 20:06:18
On the third day after the sun vanished, it was twenty past noon. There were about two hours until the train departed. We were all on the platform. Our suitcases, boxes, and colorful, shapeless bags lined up side by side, we checked the giant clock above us at short intervals.

But no matter how hard we stared, time refused to move at the speed we wanted. Everyone was stuck inside an ordinary natural event that should have lasted only a few minutes, trying to crawl out of the darkness.

I turned toward the waiting room. Behind the steam of my breath, with its thick stone walls and windows, it stood as safe and cold as a morgue. That was why they had started moving the corpses to places like it. So all of people's goodness had existed only because others could see them; the moment the days became darker than the nights, they had gone straight for each other's throats, torn holes in each other's flanks, gouged out eyes and shattered skulls.

When the station's light flickered, I felt the darkness stab into the back of my neck. If the power went out now, in a place this crowded, what would become of us?

When one of the soldiers inside caught my gaze, I pretended to focus on the clock again. Only a few minutes had passed.

"My fields are gone. They were full of sunflowers as tall as me. Ah, they're all dying. Afitap's flowers too are g-"

"Uncle, for the love of God, be quiet. You'll get us all burned."

When you stood in the darkness, you began to wish to be invisible. No one should see you, or even hear your breath. Nor should you get underfoot; no one should bump into you by accident or stumble over you. Otherwise a rage whose target was never really you would begin to rumble, rolling over you, erasing the first shock and pain you felt.

The first to speak, an oldish man, cap on his head, was blowing into his calloused hands. His eyes and nose had reddened. For him, the sun's disappearance was not a catastrophe in itself. At most he feared its practical reflections on his life. The loss of a memory, and with it a little more erosion of someone he had already lost. As the reasons he needed to stay alive dwindled, so did the resources that would keep him alive.

The young man who had silenced him had wrapped his scarf around his whole head, trying to hide every part of his body except his beard and the eyes sunk deep into his face. He was one of those who, even without understanding exactly what the sun's disappearance meant, could feel its dread as it was.

They did not ask to whom the sun would be given after it was taken from them. They had asked no questions when it was first given to them, either.

When do two hours pass?

I felt a sharp pain above my stomach, toward my right side. My liver. I closed my eyes. Someone beside me set a metal object on the ground. It was a cage. I knew this even before the wet smell of the eagle inside reached my nose.

"Again?" the eagle asked me. Its wings bloody. Every piece of time had coiled upon its feathers. Its beak moved as if chewing something, presumably a scrap of meat. Old memories are not easily forgotten, they say. Though it now stood behind the thick bars of its cage, I alone knew this was a deception. At any moment it could spread its wings wide; the air, rippling with the movement, could flex the cage until it expanded enough to swallow us all; the bars might first seem to spread beneath our feet, but in truth they would rise above us and collapse onto our heads.

I looked down. Train rails pinned down by wooden sleepers, going nowhere yet reaching everywhere. If I jumped down and ran... My bag was too heavy. Before I could take even a few steps, the eagle would bury its face in my body.

"This time is different," I answered it. "You can't say 'again.' This isn't something I'm repeating. You can't even intone it as if I were making a mistake. This time, I know that people don't deserve it."

"Then it's worse. It means you're an utter fool."

"That people don't deserve it doesn't mean humanity doesn't."

The youngish but hollow-faced soldier I had just made eye contact with was still looking at me. He certainly hadn't realized I was talking to an eagle. He must have thought I was talking to someone beside me. But by his reckoning I wasn't looking at the person I was speaking to. I had timidly bowed my head. When our eyes met once more, he said something to the other two soldiers inside, and as they turned to study me, he stepped outside.

The two men beside us who had spoken earlier were also looking in astonishment, now at me, now at the eagle.

"Only this time your heart will be broken too, Prometheus," the eagle said to me. "Maybe it will be the last. You know, these things you do cause a lot of trouble."

It was supposedly protecting me. That was what galled me most. Perched on my groin, its bloody face gleaming with its eyes behind it, the scream it had let out kept trembling in my ears. In this scene, no one thought of anyone. Not the eagle, not the men beside us, not the soldier who must by now have come up behind me, not the train rails, not the hours that refused to pass...

When I turned around, the soldier was indeed there. He had to be; I had seen it, I knew. Prophecies... He was saying something, but though his mouth moved, no sound came out. He had slowed down, matched himself to the flow of the clock. You could tell from the spittle frozen in the air: he was angry. I knew it was because he was afraid. They had been afraid every time. That they had forgotten me did not make their work any easier. It only made them give me a new name. With this one, it was certain they would curse me.

How pitiful it was that what should be and what was were different. That the nature of what should be was malignant rested, in a way, on exactly this.

"You're a fool," the eagle said to me. A scream that transcended time. It had gotten ahead of us all, but it could stop no one.

How many stretches of two hours can you squeeze into eternity?

I couldn't squeeze in even one.

I was reaching into my bag. The soldier was lowering the rifle from his back. I was in no hurry; he was tripping over his own hands and feet. Yet everything would happen exactly when it was meant to. The prophecy was at someone else's fingertips.

The sun rose upon the earth with the crack of a rifle.

From the middle of my chest, sprawled across the rails, from inside a bag.

Note: This story was first published in 2021 in Esrarengiz Hikâyeler.
-
@ 23202132:eab3af30
2024-07-22 02:47:30
Mirror.xyz is a publishing platform that uses decentralized technology to let content creators publish and share their work. It differs from traditional publishing platforms by integrating Web3 features. The Mixin app, in turn, is a crypto wallet that lets users log in to Mirror.

Here's how quick and easy it is to create your account and set up your blog on Mirror using the Mixin app.

1 - Install Mixin on your phone; download it at: messenger.mixin.one

2 - On your computer, open the Mirror.xyz site in Google Chrome and click Connect.

3 - Choose Other Wallets and wait for the QR code to appear.

4 - On your phone, open the Mixin app wallet and tap the icon to open the QR code reader.

5 - Point the Mixin QR code reader at the PC screen.

6 - Tap the button that appears in Mixin and you will be connected/logged in to Mirror.

7 - Now just set up your name and profile picture and post whatever content you like.

See a short step-by-step video at https://youtu.be/Ix_jv2Z6CpE

If you have any questions, add contact ID 39371166 on Mixin to request technical support.
-
@ 8e7462a8:c723a97b
2024-07-21 23:16:53
test
hello world
haha
lol
js const y = (m * x) + b;
-
@ 6c2d79de:0407f1b7
2024-07-13 20:34:36
In this inaugural investigative campaign, we at The Xibao Times invite you to embark on a journey down the rabbit hole of Chinese propaganda. Our mission: to expose the intricate network of disinformation and anti-Western sentiment that the Chinese Communist Party (CCP) has meticulously woven through various channels, including state media, non-profits, coalition groups, and more. Our starting point: a seemingly innocuous image (above) reposted by China Daily, which has led us to uncover a web of deception that spans the globe.
At the heart of Operation Codepink lies a comprehensive examination of the CCP's propaganda apparatus. We delve into the inner workings of this sophisticated machine, revealing the strategies and tactics employed to disseminate disinformation and manipulate public opinion. From the recruitment of "useful idiots" to the exploitation of social media, we uncover the CCP's playbook and the players involved.
Our investigation spans multiple platforms and formats, providing readers with a multi-faceted view of the propaganda war being waged. In addition to in-depth articles on Habla News, we are creating wiki-pages on Wikifreedia to serve as a repository of information, and relationship charts to illustrate the complex web of connections between entities and individuals.
As we continue our investigative journey, we will be publishing updates, revelations, and analyses. Stay tuned for an exposé on the CCP's infiltration of Russian state media, US non-profits, and coalition groups, among others. Our objective is not only to educate and inform but also to spark a broader conversation about the dangers of disinformation and the need for transparency and accountability in the media landscape.
Join us as we unravel the threads of this intricate web and shed light on the dark corners of the CCP's propaganda machine. With Operation Codepink, we aim to provide the spiciest content on Nostr, and we invite you to be a part of this groundbreaking campaign.
Inside the Operation
Articles on Habla News:
Our in-depth articles on Habla News will provide a comprehensive overview of the CCP's propaganda machine, its tactics, and the individuals and organizations involved. These articles will delve into specific cases of disinformation, analyze the CCP's propaganda strategies, and expose the connections between various entities and individuals. The articles will be written in an engaging and accessible style, making complex information easily understandable for a broad audience.
Wiki-pages on Wikifreedia:
We will create a series of wiki-pages on Wikifreedia to serve as a central repository of information related to Operation Codepink. These wiki-pages will provide a detailed and organized overview of the CCP's propaganda machine, its various components, and the individuals and organizations involved. They will include information on specific cases of disinformation, analysis of propaganda tactics, and connections between various entities and individuals.
Relationship Charts:
To illustrate the complex web of connections between the entities and individuals involved in the CCP's propaganda machine, we will create relationship charts. These charts will show the relationships between different organizations, individuals, and governments, providing a visual representation of the intricate network of disinformation and propaganda. The relationship charts will help readers understand the scale and scope of the CCP's influence and manipulation efforts.
Investigative Approach:
Our investigative approach for Operation Codepink will involve a combination of open-source research, data analysis, and interviews with experts and insiders. We will use a range of tools and techniques to gather information, including:
Open-Source Research: We will conduct extensive open-source research, using publicly available information to identify patterns, connections, and trends related to the CCP's propaganda machine.
Data Analysis: We will analyze large datasets to identify relationships between individuals, organizations, and governments, as well as patterns in disinformation and propaganda tactics.
Expert Interviews: We will interview experts and insiders who have firsthand knowledge of the CCP's propaganda machine, as well as individuals who have been targeted by the CCP's disinformation campaigns.
Social Media Monitoring: We will monitor social media platforms for signs of disinformation and propaganda, as well as for clues about the identities of individuals and organizations involved in the CCP's propaganda machine.
By using a combination of these techniques, we aim to provide a comprehensive and accurate picture of the CCP's propaganda machine and its activities.
Updates and Revelations:
As we continue our investigation, we will provide regular updates and revelations on our findings. These updates will be published on Habla News and shared on our Nostr account xibaotimes@zaps.lol, keeping readers informed about the latest developments in Operation Codepink. We will also provide analyses and commentary on the implications of our findings for global politics, media, and society.
Stay tuned for more revelations and updates!
-
@ 3cd2ea88:bafdaceb
2024-07-21 21:09:39
🚀 Buy Crypto or Deposit Fiat Currency and Participate!

🗓 From July 17 to August 12

💸 Activity 1: Fee Cashback in ETH!

💸 Make a credit/debit card transaction using CHF/EUR/PLN/CAD/GBP/AUD/USD. 💸 Receive a fee cashback in ETH on your first five transactions!

💰 Activity 2: Deposit and Earn up to $200 in ETH!

💰 Accumulate more than $1,500 in EUR via bank deposit. 💰 Receive a cashback of up to $200 in ETH!

🔗 Sign up now on Bitget and join the Fiat Frenzy! https://bonus.bitget.com/U1JNEK

Don't miss this unique opportunity to share in over $1 million in ETH! 🚀✨
-
@ 1739d937:3e3136ef
2024-07-06 09:22:17
This is the second in a series of weekly updates detailing progress on bringing MLS protocol DMs and group messaging to Nostr.
Previous Updates
Progress this week
This week was mostly spent on the topic of how to properly publish prekey bundles and what would be needed in the bundle itself to make it workable. I've included an early version of the spec below for prekeys, and would love thoughts on it. Treat this as an alpha version, very subject to change.
The other thing I spent time on was making changes to the OpenMLS library to add support for our custom ciphersuite. One issue I've run into is that the IETF standard for HPKE doesn't include the secp256k1 curve. Because of this, the HPKE library that OpenMLS depends on doesn't implement the necessary methods for our curve. Thankfully, there is another library with an open PR (shout out to nostr:npub1yevrvtp3xl42sq06usztudhleq8pdfsugw5frgaqg6lvfdewfx9q6zqrkl for that!) that would fix this. Additionally, there's an expired proposal to add secp256k1 to the HPKE spec itself. I've bumped both of these and will continue to follow up. Even without the formal addition to the spec, if we have a working library, I can add it to the OpenMLS library.
Spec Draft for Prekeys
Initial keying material (Prekey Event)
Each user that wishes to be reachable via MLS-based messaging MUST first publish a prekey event. Prekeys are used to authenticate and add members to groups (one-to-one DMs or groups with more than two participants) in an asynchronous way. The prekey event is a simple replaceable event and contains all the information needed to add a user to a group.
Prekeys SHOULD be used only once. Reuse of prekeys can lead to replay attacks.
In most cases, clients that implement this NIP will manage the creation and rotation of the prekey event. It's recommended that clients do so interactively with user consent in order to avoid overwriting prekeys created by other clients.
Derived vs Ephemeral Prekeys
Since prekeys are generated on a single device/client pair, the private key of the prekey must be either stored or generated in a way that can be deterministically recovered.
The recommended approach is to use derived keys, generated in the manner described in NIP-06. In this way, the user can respond to a new group request from any device/client pair, not just from the same device/client pair that created the initial prekey event. If using derived keys:

- Clients MUST use `104` as the `account` level value and `0` at the `change` level value (e.g. `m/44'/1237'/104'/0/0`).
- Keys are then generated using public derivation by incrementing the `address_index` level value.
- Clients MUST include the full derivation path corresponding to the key used in the `content` field on the prekey event.
- The `content` field MUST be encrypted using standard NIP-44 encryption (encrypted to themselves).
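As a rough sketch of the bookkeeping involved (the function names are mine, and this builds only the path string — actual key derivation would go through a BIP-32/NIP-06 library):

```javascript
// Sketch only: builds the NIP-06-style path described above.
// The account level (104) and change level (0) are fixed by this draft;
// only the address_index level increments as prekeys are rotated.
function prekeyDerivationPath(addressIndex) {
  return `m/44'/1237'/104'/0/${addressIndex}`;
}

// Derive the path for the next prekey from the current one
// by incrementing the trailing address_index component.
function nextPrekeyPath(currentPath) {
  const parts = currentPath.split("/");
  const nextIndex = parseInt(parts[parts.length - 1], 10) + 1;
  return parts.slice(0, -1).join("/") + "/" + nextIndex;
}
```

This is the path a client would store (encrypted) in the `content` field, and the increment is what happens after each successful group welcome.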
However, for added security (and consequently a more restrictive user experience), clients can choose to generate an ephemeral key and store the private key locally. This means that users will only be able to respond to new group requests from the same device/client pair and won't be able to respond at all if the prekey's private key is lost. Ephemeral keys can also be used with minimal degradation of UX if you're using a remote signer that can manage these keys.
If using an ephemeral key:

- The `content` field on the prekey event MUST be filled in with `EPHEMERAL` and then encrypted using standard NIP-44 encryption (encrypted to themselves). This ensures that anyone looking at prekey events cannot tell whether it's a derived or an ephemeral prekey.
Example Prekey Event
```json
{
  "id": <id>,
  "kind": 10443,
  "created_at": <unix timestamp in seconds>,
  "pubkey": <main identity pubkey>,
  "content": <encrypted derivation path | EPHEMERAL>,
  "tags": [
    ["mls_protocol_version", "1.0"],
    ["ciphersuite", "MLS_256_DHKEMK256_CHACHA20POLY1305_SHA256_K256"],
    ["pubkey", <prekey pubkey>],
    ["prekey_sig", <signature generated from hex encoded pubkey of the prekey>],
    ["r", "wss://nos.lol"],
    ["r", "wss://relay.primal.net"]
  ],
  "sig": <signed with main identity key>
}
```
Tags
- The `mls_protocol_version` tag identifies the MLS protocol version being used. For now, this MUST be `1.0`.
- The `ciphersuite` tag identifies the ciphersuite supported. For now on Nostr, we're using a custom ciphersuite, `MLS_256_DHKEMK256_CHACHA20POLY1305_SHA256_K256`. Read more about ciphersuites in MLS.
- `pubkey` is the derived or ephemeral prekey pubkey.
- The `prekey_sig` tag value is a Schnorr signature (over the secp256k1 curve) of the SHA-256 hashed value of the prekey's pubkey, signed with the prekey's private key.
```js
const privKey = schnorr.utils.randomPrivateKey();
const pubKey = schnorr.getPublicKey(privKey);

const prekeySig = bytesToHex(
  schnorr.sign(bytesToHex(sha256(pubKey)), privKey)
);

const prekeyTag = ["prekey_sig", prekeySig];
```
Finally, clients SHOULD include `r` tags to identify each of the relays that they will attempt to publish this prekey event to. This allows for more complete replacement of prekey events at a later date.

Replacing Prekey Events
Clients MUST replace the prekey event on all the listed relays any time they successfully process a group welcome event. If the prekey was a derived prekey, clients SHOULD increment the derivation path by 1 for the next key.
Onward and Upward
This next week I'll continue to work on getting the right curves and code added to the OpenMLS library and start work on a simple demo app. The focus is on better understanding what we need from the control and message events and how we can make those as simple as possible for Nostr clients and relays while also preserving as much privacy as possible.
-
@ 3bf0c63f:aefa459d
2024-01-14 13:55:28

O Planetinha
Green smoke entering my nostrils, and an off-key choir laid down a melodic base.

In the far reaches of the galaxy there was an isolated little planet. It was a happy planet.

The man dressed as a wizard began to appear from behind the green smoke.

The little planet had received three gifts, but its inhabitant, man, was in such a state of confusion that he threatened to ruin them. Men had already enslaved the first gift, life; they fought against the second gift, death; and there were some who thought they should completely destroy the third, love, and with that bring total disorder to the poor lost little planet, which was called Earth.

The off-key choir came in before the word "Earth," singing it several times, as if imitating an echo: "earth-earth-eaaarth." After a dramatic pause, the man dressed as a wizard spoke again.

Earth, our mother ship.

At that moment I stepped away. In front of the stage where the wizard and his choir made their appeals to the crowd there were several stands covered with the traditional four-legged frame and white canvas. In every corner of the square there were people, people of the most varied kinds. Curious visitors drawn in by the green smoke and the little stalls, people taking advantage of the bustle to sell sweets without paying taxes, couples hugging while standing to ward off the cold, the usual joggers doing their laps, people full of beards and dressed to imitate the hippies of the '60s, selling necklaces spread out on the ground, passersby young and old, dressed like clubbers or like free-bus activists, groups of cycling enthusiasts.

The wizard was now appealing to us, men, inhabitants of the isolated little planet, to see the little planet, our mother ship, as a whole, and to become aware that it was headed for trouble. The idea, reinforced by the event's logo, was that we should stop looking only at our own lives and think about the planet.

The event's logo, a stylized drawing of planet Earth, had nothing to do with its name, "Getting Along Well with Life Festival," but had been placed there strategically by the organizers, from whom the message spoken by the wizard seemed precisely to come.

That multitude of people who, like me, had their own worries, could not see the chaotic picture they formed, each one with their isolated acts, there in that isolated square, on that isolated planet. When the bearded hippie, almost an Osho, startled a couple while trying to sell them a necklace, the amount of chaos this added to the scene was gigantic. For a second I could see, as if from far away and above, with all the pretension that this imaginative state carries, the complete scene of the chaos.

A mother ship, one of those from science fiction, inhabited by millions of people, drifted through space without direction, not knowing that just ahead a long spatial precipice awaited it, to the complete ruin of itself and its inhabitants.

As accustomed to that ship as they had once been to their homeland, the men lived their own lives without even remembering they were drifting through space. No one knew who was steering the ship, and no one cared.

At the end of the film we discover that it was the complete sum of the chaos each inhabitant produced, with their selfish gestures incapable of taking the whole into account, that determined the direction of the mother ship. The effect, however, was not immediate, as it never is. There were real people in charge of steering the ship, but they were drunken, bad-natured people who were constantly fighting over control of the ship and the power it gave them. Power, status, money!

These drunks were drawn there by the corruption of institutions and of common morality which, deep down, was caused by the selfishness of the population, through a complex social process -- one that in the film appears simplified into the individual actions of a public-entertainment magnate.

The man dressed as a wizard was yet another agent of chaos, with his smoke-filled stage and his stroboscopic outfit; he thought he was doing good by alerting his audience, every Friday, that something needed to be done, that each person listening was responsible for the planet. His inability, however, to explain what needed to be done only increased the general anguish; the guilt he cast upon his audience, promptly accepted and passed along to everyone's family and friends, tormented them daily and kept them from having a decent life at work and at home. Families, stressed out, were constantly fighting, and the most insignificant motives were responsible for the most horrendous consequences.

The wizard, who after the show would take off his crooked hat and go drink beer at a dive bar, was responsible for a considerable share of the chaos driving the ship toward its wretched end. In the film, however, one of the passersby who in passing heard a piece of the wizard's speech awakened in himself a transformative consciousness and, with superhuman powers then granted to him by an initiatic order of good -- or not, using only his ordinary human powers -- the passerby (in the first version of the film a man, in the second a woman) manages to fix the institutions and remove the drunks from the controls of the machine. The question of public morality is ignored to shorten the plot, already two hours and forty minutes long, but it is implied that it too was resolved.

On the real planet Earth, which is not headed anywhere, held by gravity to the Sun, and where people live their own lives because it is impossible for them to live anyone else's, where they have no global consciousness of anything because it is only really possible to be conscious of themselves, and where the majority, one way or another, are trying as best they can to do things right, the film is shown.

For most viewers, it is a film that evokes reflection, a powerful film. For a second they get the same glimpse of generalized chaos that I had there in that square. For a small share of the viewers -- among them some who were in the wizard's audience, the wizard himself, the follower of Osho, the couple of two women and the brigadeiro vendor, joined also by television and newspaper critics and people who talk endlessly on the internet -- the film is a horror, the film is a vulgarization of a real and serious problem, the film appeals to the figure of the savior hero and sends an entirely wrong message: that most of the population can keep living their own miserable little lives while waiting for a hero who comes down from Olympus and saves them from the mess they themselves caused; it is a film that does an enormous disservice to the cause.

The day after the premiere, in a somewhat pricey bar near the square, at a table with eight people, among them six from the first group and eight from the second, they argue over whether or not the film will win the Oscar. I am at home sleeping and hear none of it.
-
@ a9434ee1:d5c885be
2024-07-04 08:03:53

Alice is a Capitalist
Alice has been falling down several rabbit holes lately and decides it’s time for a break. She takes a long overdue shower and gets her thoughts straight on some of her experiences.
She writes a post about how “Bees are Capitalists". However, like any real Lady, she’s got some criteria:
- She wants to have some great conversations around this topic
- She wants her post to be publicly readable by anyone
- She doesn't want to be bothered by a bunch of trolls though
- She doesn't want to be the main moderator for these conversations
- She wants zaps and doesn't care where they come from
In short: she wants to get the most out of her post with the least amount of overhead.
Luckily, she has a Nostr profile and is part of several communities (publicly readable NIP-29 groups). So, she just selects the communities that overlap with this topic (let's say "₿-keepers", "Austrian Economy Class" and "Permaculture Pirates") and publishes her post ...
In all of them at the same time!
Communities are Capitalists
Now, anyone that is a member of any of these communities can join in on the conversation. The same conversation. One post, one event ID.
Just like for a real life event, Alice didn’t invite the whole universe but she selected certain groups of people. That way, she set the stage for rich discussions with perspectives from Beekeepers and hardcore Austrian economists alike. Discussions that are still readable by anyone, but that do require membership of a selected community for write-access.
The incentives play out into a win-win-win for everyone involved.
Alice’s criteria are met and her post doesn’t have to fit into one neat category. She isn’t left with the current internet’s false choice between “pick 1 community” or “blast bluntly into the universe and pay Big Tech to maybe get it to the RiGhT people”. Here, she gets to publish to selected lists of recipients that she doesn’t have to curate herself, while also adding it to her feed for her followers to pick up on.
Her Followers can be watching her every post, they can only pick up on certain location- or hashtags, or they can let algo’s filter for posts that have proven to be high signal in the relevant communities first.
Community members get to organically discover adjacent communities, following links they’d never have considered themselves. (Beekeepers getting into Mises in this example).
Lurkers can still read, share, embed and get a quick idea of the value prop of joining communities, following certain members, etc…
Bob, the Community Admin for ₿-keepers, makes money by providing Curation, Computation & Hosting. He gets free marketing and exposure to people that actually might be interested. As in: he doesn’t really need Facebook Ads when he’s got plenty of Alices, writing posts with overlapping interests of the one his community is built around. He also gets more people posting to his niche community, since creators are not forced to pick just one.
Relays are Capitalists
Since Bob is in the business of Curating, Computing & Hosting stuff, he prefers doing that in the most efficient way. He prefers handling everything from 1 Relay (and maybe even 1 Blossom server, 1 GPU set-up, 1 mint, … ).
Yes, Bob likes it simple. When the Community is the Relay, every part of his service gets easier:

- fast loading of messages and media
- content moderation
- handling memberships
- curating things like custom community, GIF's, emoji, …
- curating other relays he does or doesn't want to be involved with
- self-hosting the entire thing
- handing over Admin rights
- letting others fork the entire thing
- handling back ups in whatever way suits him
- handling fast access to the collective preferences of his members (favorite movies of this group of people, most valued apps, …)
- having a clear overview of his own costs, attack vectors, …
- giving members guarantees in terms of pricing, privacy, …
Bob is a Capitalist
Bob got great value out of the conversations that followed Alice’s first post in his community. So much so, that some real-life rendez-vous’ and home-brewed mead later, they now call each other …
"Honey".
The End
-
@ ee11a5df:b76c4e49
2024-07-11 23:57:53

What Can We Get by Breaking NOSTR?
"What if we just started over? What if we took everything we have learned while building nostr and did it all again, but did it right this time?"
That is a question I've heard quite a number of times, and it is a question I have pondered quite a lot myself.
My conclusion (so far) is that I believe that we can fix all the important things without starting over. There are different levels of breakage, starting over is the most extreme of them. In this post I will describe these levels of breakage and what each one could buy us.
Cryptography
Your key-pair is the most fundamental part of nostr. That is your portable identity.
If the cryptography changed from secp256k1 to ed25519, all current nostr identities would not be usable.
This would be a complete start over.
Every other break listed in this post could be done as well to no additional detriment (save for reuse of some existing code) because we would be starting over.
Why would anyone suggest making such a break? What does this buy us?
- Curve25519 is a safe curve meaning a bunch of specific cryptography things that us mortals do not understand but we are assured that it is somehow better.
- Ed25519 is more modern, said to be faster, and has more widespread code/library support than secp256k1.
- Nostr keys could be used as TLS server certificates. TLS 1.3 using RFC 7250 Raw Public Keys allows raw public keys as certificates. No DNS or certification authorities required, removing several points of failure. These ed25519 keys could be used in TLS, whereas secp256k1 keys cannot as no TLS algorithm utilizes them AFAIK. Since relays currently don't have assigned nostr identities but are instead referenced by a websocket URL, this doesn't buy us much, but it is interesting. This idea is explored further below (keep reading) under a lesser level of breakage.
Besides breaking everything, another downside is that people would not be able to manage nostr keys with bitcoin hardware.
I am fairly strongly against breaking things this far. I don't think it is worth it.
Signature Scheme and Event Structure
Event structure is the next most fundamental part of nostr. Although events can be represented in many ways (clients and relays usually parse the JSON into data structures and/or database columns), the nature of the content of an event is well defined as seven particular fields. If we changed those, that would be a hard fork.
This break is quite severe. All current nostr events wouldn't work in this hard fork. We would be preserving identities, but all content would be starting over.
It would be difficult to bridge between this fork and current nostr because the bridge couldn't create the different signature required (not having anybody's private key) and current nostr wouldn't be generating the new kind of signature. Therefore any bridge would have to do identity mapping just like bridges to entirely different protocols do (e.g. mostr to mastodon).
What could we gain by breaking things this far?
- We could have a faster event hash and id verification: the current signature scheme of nostr requires lining up 5 JSON fields into a JSON array and using that as hash input. There is a performance cost to copying this data in order to hash it.
- We could introduce a subkey field, and sign events via that subkey, while preserving the pubkey as the author everybody knows and searches by. Note however that we can already get a remarkably similar thing using something like NIP-26 where the actual author is in a tag, and the pubkey field is the signing subkey.
- We could refactor the kind integer into composable bitflags (that could apply to any application) and an application kind (that specifies the application).
- Surely there are other things I haven't thought of.
I am currently against this kind of break. I don't think the benefits even come close to outweighing the cost. But if I learned about other things that we could "fix" by restructuring the events, I could possibly change my mind.
Replacing Relay URLs
Nostr is defined by relays that are addressed by websocket URLs. If that changed, that would be a significant break. Many (maybe even most) current event kinds would need superseding.
The most reasonable change is to define relays with nostr identities, specifying their pubkey instead of their URL.
What could we gain by this?
- We could ditch reliance on DNS. Relays could publish events under their nostr identity that advertise their current IP address(es).
- We could ditch certificates because relays could generate ed25519 keypairs for themselves (or indeed just self-signed certificates which might be much more broadly supported) and publish their public ed25519 key in the same replaceable event where they advertise their current IP address(es).
This is a gigantic break. Almost all event kinds need redefining and pretty much all nostr software will need fairly major upgrades. But it also gives us a kind of Internet liberty that many of us have dreamt of our entire lives.
I am ambivalent about this idea.
Protocol Messaging and Transport
The protocol messages of nostr are the next level of breakage. We could preserve keypair identities, all current events, and current relay URL references, but just break the protocol of how clients and relays communicate this data.
This would not necessarily break relay and client implementations at all, so long as the new protocol were opt-in.
What could we get?
- The new protocol could transmit events in binary form for increased performance (no more JSON parsing with its typical many small memory allocations and string escaping nightmares). I think event throughput could double (wild guess).
- It could have clear expectations of who talks first, and when and how AUTH happens, avoiding a lot of current miscommunication between clients and relays.
- We could introduce bitflags for feature support so that new features could be added later and clients would not bother trying them (and getting an error or timing out) on relays that didn't signal support. This could replace much of NIP-11.
- We could then introduce something like negentropy or negative filters (but not that... probably something else solving that same problem) without it being a breaking change.
- The new protocol could just be a few websocket-binary messages enhancing the current protocol, continuing to leverage the existing websocket-text messages we currently have, meaning newer relays would still support all the older stuff.
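A hypothetical sketch of the feature-bitflag idea from the list above (the flag names and bit positions here are invented for illustration, not part of any spec):

```javascript
// Invented capability flags a relay might advertise in the new handshake.
const FEATURE_BINARY_EVENTS    = 1 << 0;
const FEATURE_NEGENTROPY_SYNC  = 1 << 1;
const FEATURE_NEGATIVE_FILTERS = 1 << 2;

// A client checks a bit before attempting the corresponding feature,
// instead of trying it and getting an error or timeout on relays that lack it.
function supports(advertised, feature) {
  return (advertised & feature) !== 0;
}

// Example: a relay advertising binary events and negentropy sync.
const advertised = FEATURE_BINARY_EVENTS | FEATURE_NEGENTROPY_SYNC;
```

New features then only consume one more bit each, so they can be added later without a breaking change.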
The downsides are just that if you want this new stuff you have to build it. It makes the protocol less simple, having now multiple protocols, multiple ways of doing the same thing.
Nonetheless, this I am in favor of. I think the trade-offs are worth it. I will be pushing a draft PR for this soon.
The path forward
I propose then the following path forward:
- A new nostr protocol over websockets binary (draft PR to be shared soon)
- Subkeys brought into nostr via NIP-26 (but let's use a single letter tag instead, OK?) via a big push to get all the clients to support it (the transition will be painful - most major clients will need to support this before anybody can start using it).
- Some kind of solution to the negative-filter-negentropy need added to the new protocol as its first optional feature.
- We seriously consider replacing Relay URLs with nostr pubkeys assigned to the relay, and then have relays publish their IP address and TLS key or certificate.
We sacrifice these:
- Faster event hash/verification
- Composable event bitflags
- Safer faster more well-supported crypto curve
- Nostr keys themselves as TLS 1.3 RawPublicKey certificates
-
@ dd664d5e:5633d319
2024-06-28 13:04:48

An immodest proposal
It is grant season again, and -- as every grant season -- there is some cheering, some complaining, and much bewilderment, as the grants are announced and monies are disbursed.
I'm mostly a bystander to this spectacle, but it's impossible to ignore it, or to not become emotionally entangled in the general circus of it, so I've decided to write about it -- again -- as it is affecting me and interrupting my own efforts -- again.
What I'm about to suggest here is something that has never been done before (at least, not on a grant system of this scale), but it's always easy to criticize, whine, or fall into conspiracy-theorizing, so I've decided to try something more constructive and propose a solution.
The Five Whys
Why do some projects get turned down for grants?
Why do some people receive grants over and over and over?
Why do others refuse to apply for grants?
Why is an enormous amount of money being spent, but nobody knows how much is left over?
Why is it not really clear who is receiving what?
Why, why, why?!
Doctor, heal thyself!
Leaving this many whys lying around, or responding to them defensively or with small information-leaks, is an open communication fail on our open communication protocol.
We don't really have the excuse of not knowing how to communicate transparently and publicly, since that is our professional specialty. If we can't figure out how to run a grant program in the most continuously innovative and traceable way possible, even though we are a collection of some of the most talented perpetual innovators on the planet, then who can?
We are all process engineers, so let's engineer our own processes.
The application process
The main problem with the grant application process is that there is a grant application process. There shouldn't be. Everyone who is working in our space is working transparently, actively marketing their ideas, and everything they do is a matter of public record.
We know who the builders are. We know what they are doing. We can interact with them about what they are doing. We can turn on our computers, pull out some popcorn, and entertain ourselves all day, every day, just watching them labour and think aloud and debate, and fork various repositories or Nostr notes.
There is, at most, a loss of information, as there are so many people working on so many things, that it can become difficult to even track one particular person. That means it is not too much for us to ask, to suggest that anyone interested in a grant at least make their interest known in some small way.
Wave if you want a grant
That way should be as small as possible. Tiny. Ideally, they shouldn't even have to go that way themself; others should be able to nominate them. A 10-minute barrier to entry is already high, if it requires some formal, explicit act of supplication.
Does that sound silly to you? Then you do not understand how profoundly logical, forward-thinking, and diligent software developers can be. If they "just fill out a form" and/or "have an informal discussion", in order to receive money, it smacks disturbingly of "job interview" and "contract". These are people for whom contract law is holy law, so many will agonize over the decision.
- Some already have a job and they don't know how it will be in the future. What if they have to work overtime?
- Or they have a family and worry that they can't promise to deliver within some particular time. The wife could get pregnant, the baby could get sick, they might have to move house.
- Maybe they are students and exam time is approaching. Or they are simply shy or very young, and therefore reluctant to be seen "tooting their own horn". There is probably someone more worthy, and they are taking away the money he would get.
- Maybe they were hit with such an inner building passion that they hacked the whole implementation out over their last vacation and... well... it's now already there. Everyone is already using it. Darn. Why apply for a grant, for something you've already finished? Seems sort of silly. Is that even allowed? What are the grants even for?
- What if they already have such a well-publicized project, that everyone is already watching them and keeping tabs? Then it's embarrassing to apply, on a lark, and get turned down. But if they don't apply, then everyone encourages them to apply. What to do?
- Many prefer to keep their head down and keep building, for months or years at a time, and clap politely when others are awarded a grant. In fact, they happily zap the recipients and then go right back to building and releasing. They're often grateful to just bathe in other people's joy, by proxy, while they stack sats and stay humble and keep coding.
That is why a large subset of potential grant recipients never even apply. That is also why those who have received a grant are less reluctant to apply for another. Successful application breeds successful application. The emotional barrier to entry has fallen. To those that have, shall be given.
Let's use Nostr to run Nostr grants
We should turn the tables around completely. We want the developers to keep developing, not jumping through hoops. We don't want them to be distracted and internally torn over the ethics of requesting funding. We don't want them to be afraid to apply for grants or be mystified by the grant-giving process, or be humiliated or frustrated by a declined grant.
- Let us come up with very clear, understandable criteria for awarding grants, write them down, and publish them on Nostr Wiki. Accept comments and critiques of the criteria. After every round of grants, we should review the criteria, suggest improvements, and publish the new version in the same place.
- Let every application be judged according to this criteria and the results should be published after every round. The results should include a rating for each criteria and (if the grant is given) the amount of the grant awarded or (if the grant is declined) the reasons for the decline and what the applicant can improve to have a better chance of receiving a grant in the future.
- Ideally, no applicant should walk away from a declined grant feeling hopeless or slighted. Every applicant should feel like the grant process gave them valuable feedback on their own efforts and expert guidance on what they should maybe focus more on.
- Every application should be a standing application that has to be explicitly removed from the list by the person listed as an applicant. Anything not removed automatically enters the next round. If there is no further development on the project, then the application should be paused and removed after 2 rounds of being paused.
- The application process should consist of adding yourself or someone else to a Wiki Grant Application List and linking to some documentation of the project. Any npub that trolls or spams the list should be prohibited from further contribution. The quality and completeness of that documentation should be a factor in grant acceptance.
- Grant decisions should include a "handicap" (like in golf), that takes into account how Nostr-experienced the applicant is and how easy it was for them to add a new implementation to some already-existing system. The tendency should be to award newer applicants such grants, with more-experienced applicants competing for long-term funding or being offered a paid(!) place on the grant board, but not both.
- Each grant round should be preceded by a grant scope declaration (we shall be awarding X number of grants with an average of X Bitcoin per grant) accompanied by a funding overview and update (How much money was collected since the last round? How much did we spend? How much do we have now? etc.).
Okay, this is just a prototype
I'm sure that I'm going to be bombarded with naysayers, critics, and people who think I am "writing above my pay grade", but I wouldn't be me if I let that daunt me.
All I am trying to do is change the discussion into one focused on uncovering the grant problems and offering grant solutions, rather than debates about whether some particular person was grant-worthy, or long rants on some particular person's real motives.
It's a lot of Real Money. It's worth talking about, but it's not worth fighting over. Let's talk.
-
@ 3ba77a58:5b0fda98
2024-07-21 19:58:04

Dear Bitcoin Enthusiasts,
Today, I want to explore a question that's fundamental to our community: What is Bitcoin?
At its core, Bitcoin is a decentralized digital currency, without a central bank or single administrator, that can be sent from user to user on the peer-to-peer bitcoin network without the need for intermediaries. Transactions are verified by network nodes through cryptography and recorded in a public distributed ledger called a blockchain.
But Bitcoin is so much more than this technical definition. It's a revolution in finance, a beacon of hope for financial freedom, and arguably, the antidote to many of the economic ailments plaguing our world today.
When I say "Bitcoin is the Antidote," I'm making a bold, abstract claim. An antidote is a remedy or solution to a problem or difficult situation. To truly explain why I believe this, I need to share my personal journey.
My story begins long before Bitcoin. In my mid-20s, I was living what many would consider a successful life. I had finished school and became an electrician, owned real estate, and was earning $120,000 to $140,000 a year. I was growing my family and excelling in my career. Yet, something was missing.
I wanted to be legendary, but for all the wrong reasons. My success came with a misguided sense of entitlement and a desire for luxury. This mindset led to depression and hopelessness. By night, I was a thrill-seeking, woman-chasing, money-wasting, alcohol-abusing party animal. This lifestyle nearly killed me, crashing down in early 2018 and resulting in months of recovery at the University of Chicago.
After this wake-up call, I promised myself I'd make better use of my time. I quit alcohol and partying cold turkey, transitioned to a less physically intense career, and started seeking something meaningful to build for my family.
In 2019-2020, I diversified my interests. I took classes on digital marketing, blockchain, social media entrepreneurship, and even MIT classes on the future of work. I dabbled in various investments - classic cars, private equity, cryptocurrencies, stocks, and metaverse properties. I was particularly drawn to altcoins and their ambitious whitepapers.
Then COVID hit, and the world as we knew it changed. Markets crashed, economic downturns accelerated. Amidst this chaos, I found myself reading the Bitcoin whitepaper by Satoshi Nakamoto. It was a pivotal moment. The whitepaper aligned realities for me, unblocking my perception to view the world differently. Concepts like peer-to-peer cash, time, energy, central authorities, money, inflation - it all sparked something inside me. I began converting all my assets into Bitcoin.
As an electrician, I quickly saw Bitcoin as a much-needed reference point, a tool to measure against and better understand value. It wasn't just coincidence that Bitcoin went from a fraction of a penny to $20,000 per coin by 2020. This price action reflected the devaluation of the dollar and highlighted the misuse of money by states against their own citizens.
Reading the Bitcoin whitepaper led me down a rabbit hole of research into politics, economics, social science, and money. It helped me better understand how to preserve energy and value, what property is, and how to store and conserve it.
I discovered voices like Robert Breedlove, Michael Saylor, and Max Keiser, who elaborated on the problems Bitcoin solves. I was introduced to the Bitcoin community through conferences like Oslo Freedom Forum. Bitcoin began to feel like something spiritual, where people gathered with purpose, spreading awareness and fighting for a better way.
For someone like me, with an obsessive and addictive personality and a relentless appetite for knowledge, Bitcoin became the antidote. It was the truth serum I needed to shine a light on my toxic behaviors. But more than that, Bitcoin offered something I had been searching for without realizing it - an unlimited pursuit of knowledge and wisdom.
In many ways, diving into Bitcoin reminds me of a spiritual journey or the pursuit of God. It's a never-ending quest for understanding, always offering new layers to uncover, new concepts to grasp, and new implications to consider. This property makes Bitcoin a perfect transition for people with past addictive behaviors.
Where I once channeled my addictive tendencies into destructive habits, I now find myself addicted to learning. I'm constantly reading about economics, cryptography, game theory, and philosophy. I'm engaging in discussions about the nature of money, the role of the state, and the future of human coordination. This intellectual stimulation provides the same rush I used to seek in less healthy ways, but instead of destroying me, it's building me up.
The Bitcoin rabbit hole is deep and wide. It touches on so many aspects of human knowledge and experience that it can keep even the most voracious minds occupied indefinitely. For those of us who struggle with addictive behaviors, this can be a lifesaver. It provides a positive outlet for our obsessive tendencies, a constructive way to channel our energy and passion.
Moreover, the Bitcoin community, with its conferences, meetups, and online forums, offers a sense of belonging and purpose that many addicts seek. It's a global movement united by shared values of freedom, sovereignty, and honest money. This sense of community and higher purpose can fill the void that many try to fill with substances or destructive behaviors.
It's not simple to explain in a few pages how life-changing Bitcoin can be, but I promise that if you take on the task of better understanding Bitcoin and the problems it solves, it can become your antidote too. Whether you're battling addiction, seeking purpose, or simply looking to understand the world better, Bitcoin offers a path forward.
Bitcoin is the antidote to complacency, to financial illiteracy, to centralized control. It's a beacon of hope in a world that often seems hopeless. It's not just about money - it's about freedom, understanding, and reclaiming our sovereignty. It's about finding a positive addiction that enriches your life rather than depletes it. As we navigate these uncertain times, I invite you to explore Bitcoin not just as a financial tool, but as a paradigm shift, a new way of thinking, and potentially, a new way of life. It might just become your antidote too. I look forward to spreading as much of the Bitcoin gospel as possible. Until next time, check out the below video of Michael Saylor and Robert Breedlove exploring the definition of money. https://youtu.be/kIpExiMRACM
Stay curious, stay sovereign,
The Bitcoin Blok
-
@ f977c464:32fcbe00
2024-01-11 18:47:47
The day he saw himself in a mirror for the first time, he understood that he was just like the others. And yet every human being was supposed to be unique. When he got angry, say, one eyebrow might furrow a few millimeters less than the other, or when he was happy, his lip might ripple at an angle all his own. Even if none of that was possible, at the very least, a light that only he could possess might shine in his eyes. Any one of those millions of tiny things so ordinary that nobody ever notices them. Anything at all.
But looking at his reflection he found none of these, and on that day, which had begun no differently from any other, in an ordinary metro station where he waited for his car to take him to work, the illusion he had been lost in began to dissolve.
First his skin fell away. It didn't exactly fall, really; rather it turned into sparks that leapt from his body, and a moment later burned out to ash and scattered in the air. Just before vanishing, it left behind, dimly visible for a brief moment, the multicolored corpses of fairies mourning the extinction of a soul. Contrary to expectation, the smell of dust spread through the air.
He was horrified, of course. They were horrified. Fifty workers tearing at their clothes in panic. And the cause of it all was that car.
His hair fell out too. Each strand, before it reached the ground, split in two with every centimeter until it was gone.
In a world where every surface is matte, where nothing reflects, where water runs black and you can only look at yourself through cameras: seeing yourself for the first time in a mirror placed inside the car.
The whites of his eyes evaporated and dispersed in the air; his lenses melted and spread to fill the emptied space. Eyes created not to see the truth, and therefore not ready to see it, and never to be.
One might think everything ended at that moment. A deep darkness and death. The end of that moment in which seeing was truly seeing.
By the time I arrived, they were dead.
Broken, I mean.
It wasn't possible for me to install their memory units into new carriers. Physically they were in flawless condition, and I could have repaired the ones that weren't, but in all that chaos they had reprogrammed themselves and modified their inputs.
He flung one of the memory units across the table. They were sitting in a packed bar. He and his friend.
The things we do just so these androids that think they're human don't get traumatized and go insane; it boggles the mind.
He gestured behind him with his hand.
According to the police, someone placed a mirror inside the car. And when the door opened and these poor things suddenly saw their reflections, they lost their minds.
His friend asked how all this made him feel. Seeing so many broken androids that believed they were human lying on the ground, having torn themselves apart; hadn't it shaken him?
No; in the end, they're things created for a specific purpose. When a quality computer of mine breaks, I'm upset, because I paid for it. These belong to the state. What's it to me?
His friend nodded understandingly and took a sip of his water. He loosened his tie a little.
Are you sure you don't want a beer?
He said he didn't. So really, why were these androids going mad?
Simple. When they code their artificial intelligence, they write certain things in. The software people. You know me, I'm in hardware. And these things believe they're human. Just look at them.
He lowered his voice.
They all look like the dummies they use for car crash tests. They don't even have mouths or noses, but that one over there, for instance, has been smoothing his beard ever since we came in. No, the thing is, all the others believe he has a beard; that's the crazy part.
His friend said he couldn't see the connection to their going mad. He went on in his normal voice.
Don't you get it, their lenses can't tell a mirror from anything else. They see themselves all at once, bam. Like this, as they really are...
But why? What was the need for that?
How should I know, man! These are like questions about the afterlife.
He drifted off, staring at his beer. Then he leaned over the table, moving in close to his friend. A blurry man, like someone at the end of a tunnel, his shape and features indistinct.
How the hell do I even know you? Who are you?
They removed the memory unit from the machine. There were two of them. The officers in charge of the investigation.
─ Are we starting over, asked the first officer, holding the memory unit.
─ Let's try once more, but this time start by asking directly about the mirror, answered the second officer.
─ Agreed. It's working well enough.
While the simulation loaded, the second officer, standing a little behind and scratching his forehead, couldn't help asking:
─ Why did they even send these androids to a crime scene like this? It was obvious they'd get fried. What a waste. If we'd gone and looked ourselves, they wouldn't have had to smash the mirror and destroy the evidence either.
The other officer started to turn slightly in his chair, but just then the technician answered the question through the computer's speaker.
Is there anything we do around here that isn't crooked, brother?
But it was not an end. When the whole illusion covering them had dissolved and they were left with naked, sexless, identical bodies, it was the world's turn.
They fell to the ground. The moment their hands (hands of black tourmaline like the rest of the body, the knuckles of steel) touched it, the floor of the metro disintegrated.
The floor tiles had been white before, and very bright. The fluorescent lights above reflected off them cleanly, illuminating a station without a single stain, where not a single speck of dust floated.
Announcements had been posted on the walls. There was, for example, the cheerful poster of a techno-blues festival starting tomorrow evening at 20.00 at the cultural center. Next to it, a wider warning in yellow type, framed by horizontal black dashes, with a stick figure falling off a platform: "Caution! Do not cross the yellow line!" A little further on, the official daily newspaper; beyond it, posters for an action film and another romantic comedy, and various small-print notices listing what was and wasn't to be done... The wall was one long bulletin board, repeating every ten or fifteen meters.
The whole station was about a hundred meters long and around ten meters wide.
In front of them stood the car, that wretched mirror visible through its open door. The train was too long to fit in the station. It stretched out with the sharpness of a sword, broken here and there by the joints between cars.
Since none of the cars had windows, the inside of the train, and whatever was in it, was unknown.
Then the tiles broke apart into particles and rose. In the fluorescent light they smothered everything in dust and buried the place under a gray fog. A very brief moment. They did not ripple the posters. They had no time to ripple them. At most they tore them from their places. The light resisted, going out and coming back a few times. When it went out for the last time, it did not return.
Still, the place was lit. With a pale red light spreading evenly over everything.
The floor had turned entirely to wire mesh. Beneath it, an iron skeleton braced with diagonal struts. The light could not reach down more than a few meters. A chasm descending into infinity.
The same wires and iron skeleton had replaced the wall. Behind it, a corridor of iron plates bolted together, where thin wisps of steam sometimes escaped from the joints of the pipes running overhead and, after hanging there a while, drifted off on the heavy, oily air.
On the other side, a rusted wreck of a train, its broken windows covered over with iron plates. In the mirror facing the door, everything was reflected exactly as it was.
In a home resembling the inside of a shipping container (though in a city built of containers joined together, "resembling" isn't really the right word, so simply inside a container), he was trying to light the lumps of semi-solid waste fat that had been shaped and placed on the table to look like candles. On his head, a gray-and-black wig made of animal hair. From the same hair he had also made himself a bushy mustache.
He wore a smart suit, with a tie, made of blue garbage bags.
In place of table legs stood scavenged parts: a car's drive shaft, stacked tin cans whose labels could not be read, blank books, bales of blank newspaper... Nothing had writing on it; there was no need anyway, because the central data bank filled it all in for people imperceptibly, synthesizing the data coming in through their lenses. For androids, that is. If calling them something different makes any difference.
Not for their lenses, though. Their connection had been severed long ago.
─ Darling, dinner is ready, he called to his wife in the bedroom.
On the table, flat metal plates in place of dishes, bent ones in place of glasses, sharpened ones in place of forks and knives.
His wife paused in the living room doorway and smoothed her wig, which reached only to her ears and, like her husband's, was nothing but the lifeless hair of dead animals. She had tried to color her lips, or rather the place where her lips should have been, with a layer of dark red grease. She had rubbed a little on her cheeks too.
─ How do I look, she asked.
Her voice was flat, but you could have sworn you heard a hint of joy in it.
She wore a two-piece outfit she had made by stuffing garbage bags with blank newspaper.
─ You look beautiful, her husband said, straightening his tie.
─ So do you, my love.
She came closer and kissed her husband. And he kissed her. Then, gently taking her hand and pulling back her chair, he helped her sit down.
There was nothing on the table by way of food. There was no need anyway.
Until the container's door was kicked open with a crash and two officers came in, they told each other stories. What they had done that day. How they had left work early and strolled on lush green grass, flown kites; how the woman had wandered for hours, exhausting herself to find that dress; how her husband had briefly returned to work and resolved the crisis with one skillful move; and how, after he came back to her, the ice cream at the new ice cream shop where they sat in the mall was so delicious they worried about getting sore throats...
That evening they could watch a film; a good one would be playing on the television (a blank, matte plate).
Two officers. Identical in their naked bodies. They had their guns trained on the couple. They faltered when they saw them sitting in candlelight at a table spread with a spotless cloth, wine in their glasses, the turkey at the center of the table still untouched.
They did not look at all like people who could deliberately harm androids.
─ You have the right to remain silent, shouted the second officer as he entered. Anything you say...
He couldn't finish the sentence. That thing in the bedroom that he could see over the table, the android copying his every movement... could it be his own reflection in a mirror?
At that moment the whole illusion began to dissolve.
Note: This story was first published in Esrarengiz Hikâyeler in 2020.
-
@ 32e18276:5c68e245
2023-12-06 15:29:43
I'm going to be on an ordinals panel as one of the people counter-arguing the claim that they are good for bitcoin. I decided to brush up on the technicals of how inscriptions work. I am starting to see Luke's perspective on how they exploit a loophole in bitcoin's anti-data-spam mechanisms.
Storing data in Bitcoin, the “standard” way
The standard way you add “data” to bitcoin is by calling the OP_RETURN opcode. Bitcoin devs noticed that people were storing data (like the bitcoin whitepaper) in the utxo set via large multisig transactions. The problem with this is that this set is unprunable and could grow over time. OP_RETURN outputs on the other-hand are provably prunable and don’t add to utxo bloat.
Here’s an excerpt from the march 2014 0.9.0 release notes that talks about this:
On OP_RETURN: There has been some confusion and misunderstanding in the community regarding the OP_RETURN feature in 0.9 and data in the blockchain. This change is not an endorsement of storing data in the blockchain. The OP_RETURN change creates a provably-prunable output, to avoid data storage schemes – some of which were already deployed – that were storing arbitrary data such as images as forever-unspendable TX outputs, bloating bitcoin’s UTXO database. Storing arbitrary data in the blockchain is still a bad idea; it is less costly and far more efficient to store non-currency data elsewhere.
Much of the work on bitcoin core has been focused on making sure the system continues to function in a decentralized way for its intended purpose in the presence of people trying to abuse it for things like storing data. Bitcoin core has always discouraged this, as it is not designed for storage of images and data, it is meant for moving digital coins around in cyberspace.
To help incentive-align people to not do stupid things, OP_RETURN transactions were not made non-standard, so that they are relayable by peers and miners, but with the caveat:
- They can only push 40 bytes (later increased to 80, then 83; I'm guessing to support larger merkle root hashes, since that is the only sane usecase for op_return)
Bitcoin also added an option called -datacarriersize which limits the total number of bytes from these outputs that you will relay or mine.
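As a rough illustration of that policy (my own sketch, not Bitcoin Core source; the opcode values and the 83-byte default are real, but the helper names are made up):

```python
# Sketch: serializing an OP_RETURN output and applying a
# -datacarriersize-style relay check. Illustrative, not Core code.

OP_RETURN = 0x6a
OP_PUSHDATA1 = 0x4c
DEFAULT_DATACARRIER_BYTES = 83  # Bitcoin Core's default -datacarriersize

def op_return_script(data: bytes) -> bytes:
    """Serialize OP_RETURN <data> as a scriptPubKey."""
    if len(data) <= 75:                      # direct push opcodes 0x01-0x4b
        push = bytes([len(data)])
    else:
        push = bytes([OP_PUSHDATA1, len(data)])
    return bytes([OP_RETURN]) + push + data

def is_standard_datacarrier(script: bytes) -> bool:
    """Mimic the relay policy: the whole script must fit the byte budget."""
    return script[0] == OP_RETURN and len(script) <= DEFAULT_DATACARRIER_BYTES

merkle_root = bytes(32)                      # e.g. an opentimestamps-style hash
small = op_return_script(merkle_root)
print(len(small), is_standard_datacarrier(small))    # 34 True

jpeg_chunk = bytes(200)                      # image data blows the budget
big = op_return_script(jpeg_chunk)
print(len(big), is_standard_datacarrier(big))        # 203 False
```

A 32-byte commitment fits comfortably under the default, while anything image-sized gets refused relay, which is exactly the incentive alignment described above.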
Why inscriptions are technically an exploit
Inscriptions get around the datacarriersize limit by disguising data as bitcoin script program data via OP_PUSH inside OP_IF blocks. Ordinals do not use OP_RETURN and are not subjected to datacarriersize limits, so noderunners and miners currently have limited control over the total size of this data that they wish to relay and include in blocks. Luke’s fork of bitcoin-core has some options to fight this spam, so hopefully we will see this in core sometime soon as well.
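A hedged sketch of the envelope trick, assuming the general shape of the ord protocol (data hidden in an unexecuted OP_FALSE OP_IF ... OP_ENDIF branch, with pushes of at most 520 bytes); the helper is illustrative, not the actual ord implementation:

```python
# Sketch: data disguised as bitcoin script inside an OP_IF "envelope".
# Because OP_FALSE makes the branch unexecuted, the pushes are never
# evaluated, and because it isn't OP_RETURN, datacarriersize never applies.
# Opcode values are real; the layout is a simplification of ord's envelope.

OP_FALSE, OP_IF, OP_ENDIF = 0x00, 0x63, 0x68
OP_PUSHDATA2 = 0x4d

def envelope(payload: bytes, chunk: int = 520) -> bytes:
    """Wrap arbitrary data in an unexecuted OP_IF branch, 520-byte pushes."""
    script = bytes([OP_FALSE, OP_IF])
    for i in range(0, len(payload), chunk):
        part = payload[i:i + chunk]
        # tapscript allows pushes up to 520 bytes; OP_PUSHDATA2 covers that
        script += bytes([OP_PUSHDATA2]) + len(part).to_bytes(2, "little") + part
    return script + bytes([OP_ENDIF])

jpeg = bytes(4000)                # a 4 kB image disguised as "program data"
script = envelope(jpeg)
print(len(script))                # 4027 bytes, far past any OP_RETURN limit
```

The overhead is only a few bytes per chunk, so the limit that matters becomes block space itself rather than any data-carrier policy.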
Inscriptions are also taking advantage of features in segwit v1 (witness discount) and v2/taproot (no arbitrary script size limit). Each of these features have interesting and well-justified reasons why they were introduced.
The purpose of the witness discount was to make it cheaper to spend many outputs, which helps reduce the size of the utxo set. Inscriptions took advantage of this discount to store monkey jpegs disguised as bitcoin scripts. Remember, bitcoin is not for storing data, so any time bitcoin devs accidentally make it cheap and easy to relay data, this should be viewed as an exploit. Expect it to be fixed, or at least expect tools to be provided to noderunners for fighting this spam.
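To see why the discount matters for data storage, here is the segwit weight formula from BIP 141 in a toy calculation (the byte counts are illustrative, not a real transaction):

```python
import math

# weight = 4 * base_size + witness_size; vsize = ceil(weight / 4).
# Witness bytes therefore cost one quarter of base bytes in fee terms.

def vsize(base_bytes: int, witness_bytes: int) -> int:
    weight = 4 * base_bytes + witness_bytes
    return math.ceil(weight / 4)

# The same 10,000 bytes of jpeg, as base data vs as witness data:
print(vsize(base_bytes=10_000, witness_bytes=0))   # 10000 vbytes
print(vsize(base_bytes=0, witness_bytes=10_000))   # 2500 vbytes: a 4x discount
```

This is the sense in which data gets a discount it was never meant to have: a feature designed to make spending outputs cheaper also makes witness-embedded jpegs four times cheaper than the equivalent base-transaction bytes.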
Where do we go from here
The interesting part of this story is that people seem to attach value to images stored on the bitcoin blockchain, and they are willing to pay the fee to get it in the block, so non-ideologic miners and people who don’t care about the health and decentralization of bitcoin are happy to pay or collect the fee and move on.
Data should not get a discount, people should pay full price if they want to store data. They should just use op_return and hashes like opentimestamps or any other reasonable protocol storing data in bitcoin.
After going through this analysis I’ve come to the opinion that this is a pretty bad data-spam exploit and bitcoin devs should be working on solutions. Ideological devs like luke who actually care about the health and decentralization of the network are and I’m glad to see it.
-
@ 8fb140b4:f948000c
2023-11-21 21:37:48
Embarking on the journey of operating your own Lightning node on the Bitcoin Layer 2 network is more than just a tech-savvy endeavor; it's a step into a realm of financial autonomy and cutting-edge innovation. By running a node, you become a vital part of a revolutionary movement that's reshaping how we think about money and digital transactions. This role not only offers a unique perspective on blockchain technology but also places you at the heart of a community dedicated to decentralization and network resilience. Beyond the technicalities, it's about embracing a new era of digital finance, where you contribute directly to the network's security, efficiency, and growth, all while gaining personal satisfaction and potentially lucrative rewards.
In essence, running your own Lightning node is a powerful way to engage with the forefront of blockchain technology, assert financial independence, and contribute to a more decentralized and efficient Bitcoin network. It's an adventure that offers both personal and communal benefits, from gaining in-depth tech knowledge to earning a place in the evolving landscape of cryptocurrency.
Running your own Lightning node for the Bitcoin Layer 2 network can be an empowering and beneficial endeavor. Here are 10 reasons why you might consider taking on this task:
1. Direct Contribution to Decentralization: Operating a node is a direct action towards decentralizing the Bitcoin network, crucial for its security and resistance to control or censorship by any single entity.
2. Financial Autonomy: Owning a node gives you complete control over your financial transactions on the network, free from reliance on third-party services, which can be subject to fees, restrictions, or outages.
3. Advanced Network Participation: As a node operator, you're not just a passive participant but an active player in shaping the network, influencing its efficiency and scalability through direct involvement.
4. Potential for Higher Revenue: With strategic management and optimal channel funding, your node can become a preferred route for transactions, potentially increasing the routing fees you can earn.
5. Cutting-Edge Technological Engagement: Running a node puts you at the forefront of blockchain and bitcoin technology, offering insights into future developments and innovations.
6. Strengthened Network Security: Each new node adds to the robustness of the Bitcoin network, making it more resilient against attacks and failures, thus contributing to the overall security of the ecosystem.
7. Personalized Fee Structures: You have the flexibility to set your own fee policies, which can balance earning potential with the service you provide to the network.
8. Empowerment Through Knowledge: The process of setting up and managing a node provides deep learning opportunities, empowering you with knowledge that can be applied in various areas of blockchain and fintech.
9. Boosting Transaction Capacity: By running a node, you help to increase the overall capacity of the Lightning Network, enabling more transactions to be processed quickly and at lower costs.
10. Community Leadership and Reputation: As an active node operator, you gain recognition within the Bitcoin community, which can lead to collaborative opportunities and a position of thought leadership in the space.
These reasons demonstrate the impactful and transformative nature of running a Lightning node, appealing to those who are deeply invested in the principles of bitcoin and wish to actively shape its future. Jump aboard, and embrace the journey toward full independence. 🐶🐾🫡🚀🚀🚀
-
-
@ 1739d937:3e3136ef
2024-06-28 08:27:59
This is the first in a series of weekly updates I'm publishing on my process around secure messaging. Since it's the first, let's start with a bit of historical context.
How this started
On April 29th I published a PR on the NIPs repo detailing how we could achieve "Double Ratchet" DMs on Nostr. There was also a video and demo app that went along with the PR. The basic premise was to use an adapted version of the Signal protocol. This would result in DMs that were truly private and confidential, even in the case where you leaked your main private key (your nsec).
I got a lot of great feedback from folks on that PR. Interestingly, nearly all of it focused on two points, both of which I'd deliberately scoped out: group messaging and multi-device/client support. While I knew these were important, and I'd figured out how they might be accomplished, I'd scoped them out to try and keep the footprint of the spec as small as possible. But the more I talked to people, the clearer it became that this was one problem to be solved and, given the complexity, it would be better to do it in one shot.
At the same time, I'd seen that Messaging Layer Security (MLS) had recently made a lot of progress on becoming an internet standard with their RFC proposal and, goaded on by Vitor, decided to take a closer look.
Enter MLS
Messaging Layer Security (MLS) is a new protocol that is essentially a modern extension of the Signal protocol. It makes group messaging far more efficient (logarithmic rather than linear cost) and was built to be used in centralized or federated environments. I'd heard other Nostr devs talk about it in the past, but it was always overlooked as being too complicated or too early.
After spending a couple weeks reading hundreds of pages of RFC docs and reading through a few implementations of the MLS spec, I believe it's the best solution for secure direct and group messaging in Nostr. It also has the added benefit that we can upgrade the underlying crypto primitives over time in a sane way.
The MLS protocol specifies "a key establishment protocol that provides efficient asynchronous group key establishment with forward secrecy (FS) and post-compromise security (PCS) for groups in size ranging from two to thousands."
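The efficiency claim can be sketched numerically. The counts below are a simplification of the MLS ratchet-tree design (one fresh key per node on a member's direct path to the root), not actual protocol code:

```python
import math

# Updating your key material in a Signal-style pairwise group costs one
# ratchet per other member (linear), while an MLS ratchet tree only needs
# fresh keys along your path to the root (logarithmic in group size).

def pairwise_updates(n: int) -> int:
    return n - 1                      # one ratchet per other member

def mls_updates(n: int) -> int:
    return math.ceil(math.log2(n))    # one fresh key per node on your path

for n in (2, 1000):
    print(n, pairwise_updates(n), mls_updates(n))
# 2 -> 1 vs 1; 1000 -> 999 vs 10
```

For a two-person DM the costs are identical, which is why MLS scales down to pairs as well as up to thousands.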
The spec I'm working on will detail the ways that we implement this protocol into the Nostr environment (namely, how do we use our crypto primitives, use events as control mechanisms, and use relays for storage, while obfuscating metadata).
Goals
It's important to be clear about what we're aiming for here. Engineering is all about tradeoffs, always.
- Private and Confidential DMs and Group messages
- Private means that an observer cannot tell that Alice and Bob are talking to one another, or that Alice is part of a specific group. This necessarily requires protecting metadata.
- Confidential means that the contents of conversations can only be viewed by the intended recipients.
- Forward secrecy and Post-compromise security (PCS) in the case of any key material being leaked, whether that's your main Nostr identity key (your nsec) or any of the keys used in the MLS ratchet trees.
- Forward secrecy means that encrypted content in the past remains encrypted even if key material is leaked.
- Post compromise security means that leaking key material doesn't allow an attacker to continue to read messages indefinitely into the future.
- Scales well for large groups. MLS provides this from a computational standpoint, but we need to make sure this works in a scalable way when multiple relays are involved.
- Allows for the use of multiple device/clients in a single conversation/group. Importantly, we're not aiming to enable a device/client to be able to reconstruct the full history of a conversation at any point.
Progress this week
Ok, finally, what what I been up to?
Reading
I've spent most of the last few weeks reading the MLS spec and architectural doc (multiple times), learning some Rust, and beefing up my knowledge of cryptography (which was, if I'm being generous, paltry before starting this project).
Ciphersuites
Nostr is built around the same crypto primitives that Bitcoin is, namely Schnorr signatures over the secp256k1 curve and SHA-256 hashes. This curve isn't currently supported officially in the MLS spec. I've been in touch with the MLS working group to better understand the process of adding a new ciphersuite to the set of ciphersuites in the MLS spec. The outcome here is that we're going to start out using our custom ciphersuite that isn't part of the formal spec. The only drawback being that Nostr's MLS implementation won't be immediately interoperable with other MLS implementations. We can always add it later via the formal channels if we want.
MLS Implementations
Given the complexity of the MLS spec itself (the RFC is 132 pages long), having a well vetted, well tested implementation is going to be key to adoption in the Nostr ecosystem. OpenMLS is an implementation created by several of the RFC authors and written in Rust with bindings for nearly all the major languages we would want to support.
I've been in touch with the maintainers to talk about adding support to their library for our new ciphersuite and to better understand the bindings that are there. Some (WASM) are very barebones and still need a lot of work. They are generally open to the idea of working with me on adding the missing pieces we'd need.
Double Ratchet NIP 2.0
I've also started to write up the new version of the NIP that will detail how all of this will work when plugged into Nostr. It's not yet ready to share but it's getting closer.
Onward & Upward
Hopefully that's a helpful recap. I'll keep doing these weekly and welcome any questions, feedback, or support. In particular, if you're keen on working on this with me, please reach out. More eyes and more brains can only make this better. 🤙
-
@ dd664d5e:5633d319
2024-06-25 09:28:03
Should show up here as content
nostr:note1m9jdd9w9qxwa8gfda6n3sku7nf6mjnxylhaaa8wpnvdz85xajrasrrpj2a
-
@ 47750177:8969e41a
2024-07-09 12:00:00
26.2 Release Notes
Bitcoin Core version 26.2 is now available from:
This release includes new features, various bug fixes and performance improvements, as well as updated translations.
Please report bugs using the issue tracker at GitHub:
https://github.com/bitcoin/bitcoin/issues
To receive security and update notifications, please subscribe to:
https://bitcoincore.org/en/list/announcements/join/
How to Upgrade
If you are running an older version, shut it down. Wait until it has completely shut down (which might take a few minutes in some cases), then run the installer (on Windows) or just copy over /Applications/Bitcoin-Qt (on macOS) or bitcoind/bitcoin-qt (on Linux).
Upgrading directly from a version of Bitcoin Core that has reached its EOL is possible, but it might take some time if the data directory needs to be migrated. Old wallet versions of Bitcoin Core are generally supported.
Compatibility
Bitcoin Core is supported and extensively tested on operating systems using the Linux kernel, macOS 11.0+, and Windows 7 and newer. Bitcoin Core should also work on most other Unix-like systems but is not as frequently tested on them. It is not recommended to use Bitcoin Core on unsupported systems.
Notable changes
Script
- 29853: sign: don't assume we are parsing a sane TapMiniscript
P2P and network changes
- 29691: Change Luke Dashjr seed to dashjr-list-of-p2p-nodes.us
- 30085: p2p: detect addnode cjdns peers in GetAddedNodeInfo()
RPC
- 29869: rpc, bugfix: Enforce maximum value for setmocktime
- 28554: bugfix: throw an error if an invalid parameter is passed to getnetworkhashps RPC
- 30094: rpc: move UniValue in blockToJSON
- 29870: rpc: Reword SighashFromStr error message
Build
-
29747: depends: fix mingw-w64 Qt DEBUG=1 build
-
29985: depends: Fix build of Qt for 32-bit platforms with recent glibc
-
30151: depends: Fetch miniupnpc sources from an alternative website
-
30283: upnp: fix build with miniupnpc 2.2.8
Misc
-
29776: ThreadSanitizer: Fix #29767
-
29856: ci: Bump s390x to ubuntu:24.04
-
29764: doc: Suggest installing dev packages for debian/ubuntu qt5 build
-
30149: contrib: Renew Windows code signing certificate
Credits
Thanks to everyone who directly contributed to this release:
- Antoine Poinsot
- Ava Chow
- Cory Fields
- dergoegge
- fanquake
- glozow
- Hennadii Stepanov
- Jameson Lopp
- jonatack
- laanwj
- Luke Dashjr
- MarcoFalke
- nanlour
- willcl-ark
As well as to everyone that helped with translations on Transifex.
-
-
@ de496884:72617b81
2023-11-20 13:54:02
Hola nostriches. Hoy quiero hacer una publicación a modo de acercamiento al surgimiento de las categorías de género y feminismo. Así que si te interesa el tema haz un esfuerzo, quédate y lee hasta el final.
El «nuevo» feminismo surge en los años 60 en aquellos países desarrollados que acordaron el reconocimiento a los derechos humanos contenidos en la declaración respectiva de la ONU. Entonces, como extensión a dicho reconocimiento es que se gesta el movimiento feminista. A partir de entonces, se vuelve a plantear la relación entre naturaleza y cultura y se llega a la conclusión de que las diferencias sociales van más allá de las diferencias biológicas, lo cual tiene que ver con que hasta la satisfacción de las necesidades elementales como son la alimentación, la vivienda o el vestuario, están condicionadas por construcciones sociales.
Pues resulta que una de las primeras propuestas identificó la subordinación femenina producto de una organización patriarcal, tomando la categoría patriarcado de Max Weber. Es así como la visión de patriarcado se extendió al discurso político y académico. Sin embargo, no existía información acerca del desarrollo histórico de dicha categoría, sistematización, variaciones, etc., debido a que era algo que recién se percibía en sociedad.
Sin embargo, la misma categoría de patriarcado junto a la de relaciones de poder para explicar fenómenos sociales contemporáneos, constituye uno de los principales aportes del feminismo. Es así como han logrado primero visibilizar y luego deconstruir y desmontar una serie de patrones existentes en la cultura que hacían a la mujer subordinarse al hombre, y que estaban presentes en el sistema sexo/género.
Imagen relativa a la lucha contra el patriarcado
Hasta ese momento, en las investigaciones de corte sociológico se podía observar un androcentrismo excesivo al dar cuenta de que el discurso científico giraba en torno al hombre como sujeto y objeto de la investigación. Incluso para muchos positivistas como el mismo Weber, los problemas relativos a las mujeres quedaban en un segundo plano y eran achacados a una supuesta pasividad en el ámbito doméstico-familiar. Dicha pasividad partía de la supuesta inferioridad biológica del género femenino.
El patriarcado entonces era una categoría vacía en sí misma, sin valor explicativo. Desde el punto de vista político pudo ser útil para la movilización pero no resistió la polémica de los críticos del feminismo ni problematizó los conflictos a ser resueltos. Pero por otro lado, un grupo de mujeres -académicas principalmente- se puso manos a la obra en una tarea más pequeña pero con mayores resultados a la larga. Se propusieron generar conocimientos sobre las condiciones de vida de la mujer, buscar en el pasado y el presente los aportes de las mujeres a la sociedad, hacerlas visibles en la historia. Es así como surgen centros académicos, ONG, proyectos, centros de estudios sobre la mujer, etc.
Convendría distinguir entonces dos posturas que acompañaron a la investigación sobre las mujeres: uno que toma como objeto de estudio a la mujer; condiciones de vida, trabajo, cultura, y otra que privilegia a la sociedad como generadora de la subordinación de las mujeres. Mientras la primera perspectiva generaba conocimiento sobre las mujeres en el estudio de las relaciones mujer-hombre y mujer-mujer, la segunda tenía una visión más holística ya que planteaba que hay que estudiar la sociedad ya que la subordinación de las mujeres es producto de la organización de esta, y que no se avanzará solamente estudiando a las mujeres, pues el estudio debe ser más amplio y en diferentes niveles, ámbitos y tiempos.
Es en esta búsqueda donde nace y se desarrolla el concepto de género como categoría social. Es así que los sistemas de género/sexo fueron entendidos como conjuntos de prácticas, símbolos, representaciones, normas y valores sociales que las sociedades construyen a partir de la diferencia anatomofisiológica y que dotan de sentido a los impulsos sexuales, la reproducción y en general a las relaciones de las personas. Por tanto, este sistema pasó a ser el objeto de estudio para comprender la subordinación de la mujer al hombre. Al ser el reconocimiento de una dimensión de la desigualdad social hasta entonces desconocida, se considera la ruptura epistemológica más importante de la segunda mitad del siglo pasado.
Imagen representativa del binarismo de género
Hay que decir que los fenómenos sociales se definen por la relación que guardan entre sí. Es así que la contextualización adquiere gran importancia. Varones y hembras tenemos la capacidad de producir con el cuerpo, pero solo los cuerpos de las hembras pueden gestar y por tanto asegurar la especie humana. Todo grupo humano que pretenda sobrevivir debe asegurarse cierto número de hembras que lo hagan posible. De ahí un poder particular del cuerpo de las hembras que además es económicamente útil. Dicho poder no es en tanto entidad biológica, sino que es la sociedad la que le otorga este poder. En este punto es importante preguntarnos históricamente ¿quiénes han controlado la capacidad reproductiva de las mujeres?, ¿quiénes han tenido la preferencia en el acceso sexual? Para asegurar un control efectivo sobre la reproducción es necesario asegurar el acceso a la sexualidad.
Ello también lleva a dirigir el trabajo que hacen porque de lo contrario, las mujeres pudieran tener espacio para representar una amenaza al dominio de los hombres. Es importante atender a las etapas del desarrollo de las mujeres, que permitirá ver cómo se moldean para aceptar la desigualdad y las jerarquías según el género en las distintas sociedades y culturas.
Otro contexto a delimitar es el ámbito doméstico donde históricamente ha estado ubicada la subordinación femenina. Se ha puesto al descubierto que en mujeres que comparten la misma posición de clase y raza, la condición femenina se redefine a lo largo de la vida y que algunas mujeres gozan de poder sobre otras aún en contextos de alta dominación masculina, donde se les inviste de autoridad, creando el sistema zonas de incertidumbre que las divide como género e impide alianzas y se legitima la dominación. Foucault decía que «el poder se ejerce, no se posee», siempre está en peligro de perderse y no basta con normas, leyes, amenazas o castigos. De ahí que los espacios de subordinación sean también espacios de poder de las mujeres. Es en estos espacios donde aparece la inseguridad, la tensión, la confrontación y por ejemplo, las mujeres pueden reclamar la anticoncepción.
Para terminar, es importante tener en cuenta que el género es una forma de desigualdad social que siempre está articulado con otras formas de desigualdad ya sea clase, raza, grupo etario, etc. Hay que pensar la dominación masculina con un comienzo vago en el tiempo, pero que permite desligar la subordinación de las mujeres de la evolución «natural» de la humanidad y entenderla como un proceso histórico. En ese sentido es necesario destacar que, respecto a la raza, en sociedades plurales hay que atender al contexto étnico-cultural ya que en dependencia de las relaciones entre personas de razas distintas, estarán condicionadas las relaciones entre los géneros.
Imagen de mujeres de distintas razas
Por otro lado, si bien es sabido que históricamente el sistema de género social imperante ha contribuido a invisibilizar a la mujer, asociar necesariamente dicha categoría únicamente a ella, lo que hace es nuevamente recrear el absolutismo histórico que tanto daño ha hecho, ahora como lo antagónico al hombre. En la construcción social participan todos los individuos presentes en el sistema género.
Hasta aquí la publicación de hoy, de la cual aun se puede ahondar mucho más, sobre todo en lo referido al origen de la dominación o de la jerarquización del binomio «hombre/mujer» desde un punto de vista antropológico, pero eso quedará pendiente para otro momento. ¡Así que hasta la próxima!, ¡nos vemos en Nostr!
........................................
English version
Hello nostriches. Today I want to make a post by way of an approach to the emergence of the categories of gender and feminism. So if you are interested in the topic make an effort, stay and read to the end.
The "new" feminism emerged in the 60's in those developed countries that agreed to recognize the human rights contained in the respective UN declaration. Then, as an extension of that recognition, the feminist movement was born. From then on, the relationship between nature and culture was raised again and the conclusion was reached that social differences go beyond biological differences, which has to do with the fact that even the satisfaction of elementary needs such as food, housing or clothing are conditioned by social constructions.
It turns out that one of the first proposals identified female subordination as a product of a patriarchal organization, taking Max Weber's category of patriarchy. This is how the vision of patriarchy was extended to political and academic discourse. However, there was no information about the historical development of this category, systematization, variations, etc., because it was something that was only recently perceived in society.
However, the same category of patriarchy, together with that of power relations to explain contemporary social phenomena, constitutes one of the main contributions of feminism. This is how they have first made visible and then deconstructed and dismantled a series of existing patterns in the culture that made women subordinate to men, and that were present in the sex/gender system.
Image relating to the fight against patriarchy
Up to that time, in sociological research, an excessive androcentrism could be observed as scientific discourse revolved around men as the subject and object of research. Even for many positivists such as Weber himself, the problems related to women remained in the background and were blamed on a supposed passivity in the domestic-family sphere. This passivity was based on the supposed biological inferiority of the female gender.
Patriarchy was then an empty category in itself, with no explanatory value. From the political point of view, it could be useful for mobilization, but it did not resist the polemic of feminism's critics, nor did it problematize the conflicts to be resolved. But on the other hand, a group of women - mainly academics - set to work on a smaller task but with greater results in the long run. They set out to generate knowledge about women's living conditions, to search in the past and present for women's contributions to society, to make them visible in history. This is how academic centers, NGOs, projects, women's studies centers, etc., came into being.
It would be convenient to distinguish then two postures that accompanied the research on women: one that takes women as the object of study; living conditions, work, culture, and the other that privileges society as the generator of women's subordination. While the first perspective generated knowledge about women in the study of woman-man and woman-woman relations, the second had a more holistic vision, since it proposed that society must be studied, since the subordination of women is a product of its organization, and that progress will not be made only by studying women, since the study must be broader and at different levels, spheres and times.
It is in this search that the concept of gender as a social category was born and developed. Thus, gender/sex systems were understood as a set of practices, symbols, representations, norms and social values that societies construct on the basis of anatomophysiological differences and that give meaning to sexual impulses, reproduction and, in general, to people's relationships. Therefore, this system became the object of study to understand the subordination of women to men. As the recognition of a hitherto unknown dimension of social inequality, it is considered the most important epistemological breakthrough of the second half of the last century.
Representative image of gender binarism
It must be said that social phenomena are defined by their relationship with each other. Thus, contextualization acquires great importance. Males and females have the capacity to produce with their bodies, but only the bodies of females can gestate and therefore ensure the human species. Any human group that intends to survive must ensure a certain number of females to make it possible. Hence a particular power of the female body, which is also economically useful. This power is not as a biological entity, but it is society that gives it this power. At this point it is important to ask ourselves historically, who has controlled the reproductive capacity of women, who has had the preference in sexual access? To ensure effective control over reproduction, it is necessary to ensure access to sexuality.
Allegorical image of pregnancy
This also leads to directing the work they do because otherwise, women may have room to pose a threat to men's dominance. It is important to pay attention to the stages of women's development, which will allow us to see how they are shaped to accept inequality and gender hierarchies in different societies and cultures.
Another context to delimit is the domestic sphere where female subordination has historically been located. It has been discovered that in women who share the same class and race position, the feminine condition is redefined throughout life and that some women enjoy power over others even in contexts of high male domination, where they are invested with authority, creating the system of uncertainty zones that divide them as a gender and prevent alliances and legitimize domination. Foucault said that "power is exercised, not possessed", it is always in danger of being lost and norms, laws, threats or punishments are not enough. Hence, the spaces of subordination are also spaces of women's power. It is in these spaces where insecurity, tension, confrontation appear and where, for example, women can demand contraception.
To conclude, it is important to keep in mind that gender is a form of social inequality that is always articulated with other forms of inequality, be it class, race, age group, etc. It is necessary to think of male domination with a vague beginning in time, but which allows us to separate the subordination of women from the "natural" evolution of humanity and to understand it as a historical process. In this sense, it is necessary to emphasize that, with respect to race, in plural societies it is necessary to pay attention to the ethno-cultural context since, depending on the relations between people of different races, the relations between genders will be conditioned.
Image of women of different races
On the other hand, although it is known that historically the prevailing social gender system has contributed to make women invisible, to necessarily associate this category only to women is to recreate the historical absolutism that has done so much damage, this time as antagonistic to men. All individuals present in the gender system participate in the social construction.
So much for today's publication, of which we can still go much deeper, especially with regard to the origin of domination or the hierarchization of the binomial "man/woman" from an anthropological point of view, but that will be left for another time. So until next time, see you in Nostr!
Créditos | Credits
Bibliografía consultada | Consulted bibliography:
- Barbieri, T. (1993). Sobre la categoría género. Una introducción teórico-metodológica. Debates en Sociología, (18), 1-19. https://doi.org/10.18800/debatesensociologia.199301.006
- Echevarría, D. (2020). Desigualdades de género e interseccionalidad. Análisis del contexto cubano 2008-2018. Facultad Latinoamericana de Ciencias Sociales (Programa-Cuba). https://www.clacso.org.ar/libreria-latinoamericana-cm/libro_detalle_resultado.php?id_libro=2346&campo=cm&texto=248
Traducción | Translation:
DeepL
-
-
@ 0e8c41eb:95ff0faa
2024-07-08 14:36:14
Chef's notes
Sprinkle a pinch of salt in a bowl. Sift 275 grams of flour above it. Make a well in the center of the flour and sprinkle the yeast into it. Add 50 grams of butter, 25 grams of sugar, and the egg. Pour in the milk and knead the dough with the mixer or by hand into a smooth dough. If necessary, add a bit more flour if the dough is too sticky. Form the dough into a ball. Let the dough rise covered with cling film at room temperature for 1 hour.
Peel the apples. Cut the apples and apricots into pieces. Cook the fruit with 150 grams of sugar and 50 ml of water in 15 minutes into a thick compote. Let this fruit filling cool.
Knead the dough again briefly and roll it out to a round sheet of about 28-30 cm in diameter, so that it fits well over the baking tin. Press the dough into the tin and press it well into the corners. Let the dough rise again for 10 minutes and cut off the excess edges.
Spread the fruit filling over the dough.
Mix the rest of the flour with the remaining 100 grams of butter and 100 grams of sugar and knead it with your fingers into a crumbly mixture. Sprinkle this crumble mixture over the fruit filling. Bake the tart in the preheated oven at 200°C for about 30 minutes until golden brown.
Variations: Only apple? Make this tart with 1 kilo of apples. High or low? You can also bake the crumble tart in a low tart tin of 30 cm diameter. If the fruit filling is a bit too voluminous, you can make double the crumble mixture and sprinkle it over the larger surface area.
Additional needs: Mixer with dough hooks or food processor Cling film Round baking tin with a diameter of 28-30 cm, greased
Details
- ⏲️ Prep time: 45 min + 1 hour Rising
- 🍳 Cook time: 30 minutes
- 🍽️ Servings: 8
Ingredients
- 375 grams flour
- 1 packet dried yeast (7 grams)
- 125 grams unsalted butter, at room temperature
- 275 grams sugar
- 1 egg
- 100 ml milk, at room temperature
- 500 grams apples
- 500 grams apricots (fresh or from a can)
Directions
- Sprinkle a pinch of salt in a bowl. Sift 275 grams of flour above it. Make a well in the center of the flour and sprinkle the yeast into it. Add 50 grams of butter, 25 grams of sugar, and the egg. Pour in the milk and knead the dough with the mixer or by hand into a smooth dough. If necessary, add a bit more flour if the dough is too sticky. Form the dough into a ball. Let the dough rise covered with cling film at room temperature for 1 hour.
- Peel the apples. Cut the apples and apricots into pieces. Cook the fruit with 150 grams of sugar and 50 ml of water in 15 minutes into a thick compote. Let this fruit filling cool.
- Knead the dough again briefly and roll it out to a round sheet of about 28-30 cm in diameter, so that it fits well over the baking tin. Press the dough into the tin and press it well into the corners. Let the dough rise again for 10 minutes and cut off the excess edges.
- Spread the fruit filling over the dough.
-
@ cec2981c:814d4f26
2024-07-08 12:56:36Chef's notes
More information about its origins: https://en.wikipedia.org/wiki/Soda_bread
Details
- ⏲️ Prep time: 10 min
- 🍳 Cook time: 45 min
- 🍽️ Servings: 4-8
Ingredients
- 500 gr whole grain flour
- 450 ml butter milk
- 10 gr (sea)salt
- 15 gr baking soda (not US's "baking powder") (also known as sodium bicarbonate)
- a bit of butter or olive oil
Directions
- Pre-heat the oven to 220°C
- Put all flour in a bowl with baking soda and salt. Mix it well.
- Create a hole in the middle and pour all butter milk in the center.
- Stir from the centre, slowly mixing everything together until you get a dough.
- Once mixed, the dough will be sticky. Flour the counter with all-purpose flour and shape the dough into a boule about the same size as your skillet.
- Grease the skillet with a bit of butter or olive oil so the bread doesn't stick, then put the dough in.
- Dust a bit of flour on top of the dough, then lightly cut an X into it.
- Bake for ±45 min at 220°C until done (internal temperature must be at least 95°C). Turn up the heat for the last 5 min to get a crust.
- Let the bread cool a bit before taking it out of the skillet, and let it cool down even more before eating.
- Donate some sats to me if you like this recipe.
-
@ dd664d5e:5633d319
2024-06-24 06:05:05
The new Great Library
We have all heard tales of Amazon or other booksellers banning customers from their bookstores or censoring/editing purchased books. The famous Project Gutenberg, and similar organizations, are performing a good work, to help protect many of our precious books from this fate, but it is merely a centralized website and therefore not censorship resistant. Also, it mostly posts books in English or German.
So, we at nostr:npub1s3ht77dq4zqnya8vjun5jp3p44pr794ru36d0ltxu65chljw8xjqd975wz have decided to move Project Gutenberg to Nostr and house it in the most distributed way possible: on relays. Specifically, our new, public Citadel relay for out-of-print books (and other documents), but also on any relay, anywhere.
And, because we are a very humble group, we're naming the effort "Alexandria". And the first book to be printed on Nostr is the Bible because obviously.
Why on relays?
Well, why not on relays? Relays are one of the few widely-distributed databases for documentation in existence. The relay database spans the entire globe and anyone can maintain their own relay on their personal computer or mobile phone.
That means that anyone can house their own books. Which books are their own? Any books they have in their own possession. Any books someone would have to physically pry out of their cold, dead, computer.
Notes are perfect for publishing
Once we begin generating eBooks from notes with an associated header (which will be quite easy to do, so long as they are written in markdown or similar), they will also be readable, downloadable, and storable in ePub format (see Pandoc). And it is, after all, only a matter of time until someone enterprising makes an ePaper Nostr client for calmer reading of notes, and then you can download and read them, without having to bother converting beforehand, which maintains their Nostr-navigation.
The new event kind 30040 allows us to take any sort of note containing any sort of characters and create a type of "note collection" or "book of notes", "journal of notes", "magazine of notes". And it can be nested or embedded in other notes, creating any sort of note-combination and note-hierarchy you can think of, only limited in size by the ability of your computer to process the relationships.
Save the Zettels
The associated kind 30041 adds the prospect of breaking longer texts or articles up into sections or snippets (called "Zettel" in German). We can then collect or refer to particular bits of a longer text (like a chart, elegant paragraph or definition, data table), directly. Anyone can create such snippets, even of texts they didn't write, as they can add a reference to the original publication in the tags of the new event.
This means we no longer have to "copy-paste" or quote other people's works, we can simply tie them in. If we worry about them being deleted, we can fork them to create our own, digitally-linked version, and then tie that in. This means that source material can be tied to the new material, and you can trace back to the source easily (using Nostr IDs and tags, which are signed identifiers) and see who else is branching out from that source or discussing that source.
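To make the two event kinds above concrete, here is a minimal sketch of a "book" index event pointing at one section event. This is illustrative only: the tag names and the `a`-style address format follow common Nostr conventions for addressable events, but the exact tag layout for kinds 30040/30041, and the `<author-pubkey>` placeholder, are assumptions for illustration, not quotations from a spec.

```python
import json
import time

# Hypothetical kind 30041 "section" (Zettel) event: one snippet of a larger work.
section = {
    "kind": 30041,
    "created_at": int(time.time()),
    "tags": [["d", "genesis-1"], ["title", "Genesis 1"]],
    "content": "In the beginning...",
}

# Hypothetical kind 30040 "index" event: content stays empty, the tags tie
# the sections together. An index can also point at other 30040 indexes,
# which is what makes the nesting described above possible.
index = {
    "kind": 30040,
    "created_at": int(time.time()),
    "tags": [
        ["d", "bible-kjv"],
        ["title", "The Holy Bible"],
        ["a", "30041:<author-pubkey>:genesis-1"],  # address of the section event
    ],
    "content": "",
}

print(json.dumps(index, indent=2))
```

A client that understands the collection kind would resolve each `a` tag to its section event and render the assembled book, following references back to sources via the signed identifiers.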
It's gonna be HUGE!
We are making a big, beautiful library... and you are going to build it for us. Anyone, anywhere can therefore publish or republish any document they wish, with Nostr, and store it wherever they have a relay, and view it on any client willing to display it.
You will own something and be happy.
-
@ 42342239:1d80db24
2024-07-06 15:26:39
Claims that we need greater centralisation, more EU, or more globalisation are prevalent across the usual media channels. The climate crisis, environmental destruction, pandemics, the AI-threat, yes, everything will apparently be solved if a little more global coordination, governance and leadership can be brought about.
But, is this actually true? One of the best arguments for this conclusion stems implicitly from the futurist Eliezer Yudkowsky, who once proposed a new Moore's Law, though this time not for computer processors but instead for mad science: "every 18 months, the minimum IQ necessary to destroy the world drops by one point".
Perhaps we simply have to tolerate more centralisation, globalisation, control, surveillance, and so on, to prevent all kinds of fools from destroying the world?
Note: a Swedish version of this text is available at Affärsvärlden.
At the same time, more centralisation, globalisation, etc. is also what we have experienced. Power has been shifting from the local, and from the majorities, to central-planning bureaucrats working in remote places. This has been going on for several decades. The EU's subsidiarity principle, i.e. the idea that decisions should be made at the lowest expedient level, and which came to everyone's attention ahead of Sweden's EU vote in 1994, is today swept under the rug as untimely and outdated, perhaps even retarded.
At the same time, there are many crises, more than usual it would seem. If it is not a crisis of criminality, a logistics/supply chain crisis or a water crisis, then it is an energy crisis, a financial crisis, a refugee crisis or a climate crisis. It is almost as if one starts to suspect that all this centralisation may be leading us down the wrong path. Perhaps centralisation is part of the problem, rather than the capital S solution?
Why centralisation may cause rather than prevent problems
There are several reasons why centralisation, etc, may actually be a problem. And though few seem to be interested in such questions today (or perhaps they are too timid to mention their concerns?), it has not always been this way. In this short essay we'll note four reasons (though there are several others):
- Political failures (Buchanan et al)
- Local communities & skin in the game (Ostrom and Taleb)
- The local knowledge problem (von Hayek)
- Governance by sociopaths (Hare)
James Buchanan, who was awarded the so-called Nobel prize in economics in the eighties, once said: "politicians and bureaucrats are no different from the rest of us. They will maximise their incentives just like everybody else."
Buchanan was prominent in research on rent-seeking and political failures, i.e. when political "solutions" to so-called market failures make everything worse. Rent-seeking is when a company spends resources (e.g. lobbying) to get legislators or other decision makers to pass laws or create regulations that benefit the company instead of it having to engage in productive activities. The result is regulatory capture. The more centralised decision-making is, the greater the negative consequences from such rent-seeking will be for society at large. This is known.
Another economist, Elinor Ostrom, was given the same prize in the great financial crisis year of 2009. In her research, she had found that local communities where people had influence over rules and regulations, as well as how violations thereof were handled, were much better suited to look after common resources than centralised bodies. To borrow a term from the combative Nassim Nicholas Taleb: everything was better handled when decision makers had "skin in the game".
A third economist, Friedrich von Hayek, was given this prize as early as 1974, partly because he showed that central planning could not possibly take into account all relevant information. The information needed in economic planning is by its very nature distributed, and will never be available to a central planning committee, or even to an AI.
Moreover, human systems are complex and not just complicated. When you realise this, you also understand why the forecasts made by central planners often end up wildly off the mark - and at times in a catastrophic way. (This in itself is an argument for relying more on factors outside of the models in the decision-making process.)
From Buchanan's, Ostrom's, Taleb's or von Hayek's perspectives, it also becomes difficult to believe that today's bureaucrats are the most suited to manage and price e.g. climate risks. One can compare with the insurance industry, which has both a long habit of pricing risks as well as "skin in the game" - two things sorely missing in today's planning bodies.
Instead of preventing fools, we may be enabling madmen
An even more troubling conclusion is that centralisation tends to transfer power to people who perhaps shouldn't have more of that good. "Not all psychopaths are in prison - some are in the boardroom," psychologist Robert Hare once said during a lecture. Most people have probably known for a long time that those with sharp elbows and who don't hesitate to stab a colleague in the back can climb quickly in organisations. In recent years, this fact seems to have become increasingly well known even in academia.
You will thus tend to encounter an increased prevalence of individuals with narcissistic and sociopathic traits the higher up you get in the status hierarchy. And if working in large organisations (such as the European Union or Congress) or in large corporations is perceived as higher status - which is generally the case - then it follows that the more we centralise, the more we will be governed by people with less flattering Dark Triad traits.
By their fruits ye shall know them
Perhaps it is thus not a coincidence that we have so many crises. Perhaps centralisation, globalisation, etc. cause crises. Perhaps the "elites" and their planning bureaucrats are, in fact, not the salt of the earth and the light of the world. Perhaps President Trump even had a point when he said "they are not sending their best".
https://www.youtube.com/watch?v=w4b8xgaiuj0
The opposite of centralisation is decentralisation. And while most people may still be aware that decentralisation can be a superpower within the business world, it's time we remind ourselves that this also applies to the economy - and society - at large, and preferably before the next Great Leap Forward is fully thrust upon us.
-
@ dd664d5e:5633d319
2024-06-23 07:03:28
I've been dealing with a lot of aspiring IT people lately, and it has me sort of jaded.
I'm trying to think of the level of competence I would expect of someone like myself, because it has changed dramatically since I started out 26 years ago. I've only now managed to catch up enough to feel "requalified" for my own profession, so I thought I'd write it down for all of you, so that you know what to look for.
As in, what is the new "industry standard" knowledge level for a software business analyst?
If I were hiring a business analyst today, I'd ask...
- What do e-mail, FTP, and UML have in common?
- What is the difference between an intranet and the Internet? Name the four Internet layers and one protocol for each.
- What are unit tests and integration tests and what are they useful for? Who should write them? Who should be looking at the results?
- What is a build server and why use one? What is the difference between continuous integration and continuous deployment?
- Please explain the three source control commands: commit, push, and pull.
- Please explain the three Linux commands: pwd, ls, cat.
- How do software branches work? What is a merge conflict? What are pull requests?
- What is the difference between an activity diagram, a state-machine diagram, and a class diagram and when is it best to use which?
- What are DDD and TDD? What purpose do they serve?
- What are use cases and user stories? When is it best to use which? What is gherkin?
- What is the difference between stateful and stateless?
- What is agile programming?
- Explain the difference between Kanban, Xtreme Programming, and Scrum, and when is it best to use which?
- What is the difference between functional and object-oriented programming? Why use one, rather than the other?
- What are wireframes, mockups, and prototypes?
- Name two network topologies and give an example for an implementation of each.
- What are XML, Json, and Yaml and why use one, rather than another?
- What is an ORM and why use one? What are validators?
- Name three types of databases and a use case they are ideal for.
- Explain the importance of data sets to machine learning.
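If you want to sanity-check a candidate's answer to the two command-line questions above, the commands themselves take seconds to try (the directory name here is just an example):

```shell
# Exercise the three Linux commands from the list: pwd, ls, cat.
mkdir -p /tmp/ba-demo        # scratch directory for the demo
cd /tmp/ba-demo
pwd                          # print the current working directory
echo "hello" > notes.txt     # create a small file to look at
ls                           # list directory contents (shows notes.txt)
cat notes.txt                # print the file's contents (shows hello)
```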
-
@ 8fb140b4:f948000c
2023-11-18 23:28:31Chef's notes
Serving these two dishes together will create a delightful centerpiece for your Thanksgiving meal, offering a perfect blend of traditional flavors with a homemade touch.
Details
- ⏲️ Prep time: 30 min
- 🍳 Cook time: 1 - 2 hours
- 🍽️ Servings: 4-6
Ingredients
- 1 whole turkey (about 12-14 lbs), thawed and ready to cook
- 1 cup unsalted butter, softened
- 2 tablespoons fresh thyme, chopped
- 2 tablespoons fresh rosemary, chopped
- 2 tablespoons fresh sage, chopped
- Salt and freshly ground black pepper
- 1 onion, quartered
- 1 lemon, halved
- 2-3 cloves of garlic
- Apple and Sage Stuffing
- 1 loaf of crusty bread, cut into cubes
- 2 apples, cored and chopped
- 1 onion, diced
- 2 stalks celery, diced
- 3 cloves garlic, minced
- 1/4 cup fresh sage, chopped
- 1/2 cup unsalted butter
- 2 cups chicken broth
- Salt and pepper, to taste
Directions
- Preheat the Oven: Set your oven to 325°F (165°C).
- Prepare the Herb Butter: Mix the softened butter with the chopped thyme, rosemary, and sage. Season with salt and pepper.
- Prepare the Turkey: Remove any giblets from the turkey and pat it dry. Loosen the skin and spread a generous amount of herb butter under and over the skin.
- Add Aromatics: Inside the turkey cavity, place the quartered onion, lemon halves, and garlic cloves.
- Roast: Place the turkey in a roasting pan. Tent with aluminum foil and roast. A general guideline is about 15 minutes per pound, or until the internal temperature reaches 165°F (74°C) at the thickest part of the thigh.
- Rest and Serve: Let the turkey rest for at least 20 minutes before carving.
- Next: Apple and Sage Stuffing
- Dry the Bread: Spread the bread cubes on a baking sheet and let them dry overnight, or toast them in the oven.
- Cook the Vegetables: In a large skillet, melt the butter and cook the onion, celery, and garlic until soft.
- Combine Ingredients: Add the apples, sage, and bread cubes to the skillet. Stir in the chicken broth until the mixture is moist. Season with salt and pepper.
- Bake: Transfer the stuffing to a baking dish and bake at 350°F (175°C) for about 30-40 minutes, until golden brown on top.
-
@ 8fb140b4:f948000c
2023-11-02 01:13:01Testing a brand new YakiHonne native client for iOS. Smooth as butter (not penis butter 🤣🍆🧈) with great visual experience and intuitive navigation. Amazing work by the team behind it!

* lists
* work
Bold text work!
Images could have used nostr.build instead of raw S3 from us-east-1 region.
Very impressive! You can even save the draft and continue later, before posting the long-form note!
🐶🐾🤯🤯🤯🫂💜
-
@ dd664d5e:5633d319
2024-06-21 19:11:51Finding Catholics and Catholic-friendly content on Nostr
Obvious Catholics being obvious
nostr:npub1m4ny6hjqzepn4rxknuq94c2gpqzr29ufkkw7ttcxyak7v43n6vvsajc2jl
nostr:npub1k92qsr95jcumkpu6dffurkvwwycwa2euvx4fthv78ru7gqqz0nrs2ngfwd
nostr:npub1wqfzz2p880wq0tumuae9lfwyhs8uz35xd0kr34zrvrwyh3kvrzuskcqsyn
nostr:npub1ecdlntvjzexlyfale2egzvvncc8tgqsaxkl5hw7xlgjv2cxs705s9qs735
nostr:npub1rcr8h76csgzhdhea4a7tq5w5gydcpg9clgf0cffu6z45rnc6yp5sj7cfuz
nostr:npub1fyd0awkakq4aap70ual7mtlszjle9krffgwnsrkyua2frzmysd8qjj8gvg
nostr:npub1q0fe26apcqeeyqnlre29fqu7ysx0ucm5ly637md3zlvy2xcfsm3s0lsv4r
nostr:npub1dvdcmtp5llrp63jdlmhspe9gffsyu9ew7cu3ld3f9y7k79nxzjxqf4d4rm
nostr:npub1paxyej8f8fh57ny0fr5w2mzp9can9nkcmeu5jaerv68mhrah7t8s795ky6
nostr:npub1tvw3h5xqnuc2aq5zelxp3dy58sz7x9u8e6enkxywmrz70cg2j2zqjes44n
nostr:npub13tahtl9pjw9u5ycruqk84k6sfmkyljsr7y2pc4s840ctlm73fxeq3j6e08
nostr:npub1w4jkwspqn9svwnlrw0nfg0u2yx4cj6yfmp53ya4xp7r24k7gly4qaq30zp
Other good Christian follows
nostr:npub1hqy4zwnvsdmlml4tpgp0kgrruxamfcwpgm4g3q2tr3d2ut3kuxusx73psm
nostr:npub1cpstx8lzhwctunfe80rugz5qsj9ztw8surec9j6mf8phha68dj6qhm8j5e
nostr:npub1ak5kewf6anwkrt0qc8ua907ljkn7wm83e2ycyrpcumjvaf2upszs8r0gwg
nostr:npub1mt8x8vqvgtnwq97sphgep2fjswrqqtl4j7uyr667lyw7fuwwsjgs5mm7cz
nostr:npub1q6ya7kz84rfnw6yjmg5kyttuplwpauv43a9ug3cajztx4g0v48eqhtt3sh
nostr:npub1356t6fpjysx9vdchfg7mryv83w4pcye6a3eeke9zvsje7s2tuv4s4k805u
nostr:npub1kun5628raxpm7usdkj62z2337hr77f3ryrg9cf0vjpyf4jvk9r9smv3lhe
nostr:npub1qf6gsfapq94rj0rcptkpm9sergacmuwrjlgfx5gznjajtvkcx3psfhx6k5
nostr:npub1ll99fcrclkvgff696u8tq9vupw9fulfc8fysdf6gfwp7hassrh2sktxszt
nostr:npub1zy37ecnhpvx4lmxh4spd0898sxdj0ag8m64s9yq499zaxlg7qrqq8c53q6
nostr:npub1rtlqca8r6auyaw5n5h3l5422dm4sry5dzfee4696fqe8s6qgudks7djtfs
nostr:npub1jlrs53pkdfjnts29kveljul2sm0actt6n8dxrrzqcersttvcuv3qdjynqn
nostr:npub18zqmath26txwfhc70af8axk7pftjre9x7cf0lxkg92nvj2cpfgts8va790
Christian follow list
An exhaustive list of Christians is maintained by nostr:npub1mt8x8vqvgtnwq97sphgep2fjswrqqtl4j7uyr667lyw7fuwwsjgs5mm7cz. Just look at his list on https://listr.lol/
Catholic community
You can also join the community, to reach other catholics (usable on #Nostrudel #Coracle #Amethyst and #Satellite): nostr:naddr1qvzqqqyx7cpzqqnd3dl8hnptg9agfugwmdcmgfl7wcrfjpgfpv28ksq6dnmqc0e8qqyyxct5dphkc6trmu6k9l
Christian topic relay
And always make sure to use the #catholic hashtag, to get onto the top-specific christpill relay (add it to your relay list: wss://christpill.nostr1.com/).
Hope that helps! 😊
-
@ fa0165a0:03397073
2023-10-06 19:25:08I just tested building a browser plugin; it was easier than I thought. Here I'll walk you through the steps of creating a minimal working example of a browser plugin, a.k.a. the "Hello World" of browser plugins.
First of all there are two main browser platforms out there, Chromium and Mozilla. They do some things a little differently, but similar enough that we can build a plugin that works on both. This plugin will work in both, I'll describe the firefox version, but the chromium version is very similar.
What is a browser plugin?
Simply put, a browser plugin is a program that runs in the browser. It can do things like modify the content of a webpage, or add new functionality to the browser. It's a way to extend the browser with custom functionality. Common examples are ad blockers, password managers, and video downloaders.
In technical terms, they are plugins that can insert html-css-js into your browser experience.
How to build a browser plugin
Step 0: Basics
You'll need a computer, a text editor and a browser. For testing and development I personally think that the firefox developer edition is the easiest to work with. But any Chrome-based browser will also do.

Create a working directory on your computer, name it anything you like. I'll call mine `hello-world-browser-plugin`. Open the directory and create a file called `manifest.json`. This is the most important file of your plugin, and it must be named exactly right.

Step 1: manifest.json

After creation open your file `manifest.json` in your text editor and paste the following code:

```json
{
  "manifest_version": 3,
  "name": "Hello World",
  "version": "1.0",
  "description": "A simple 'Hello World' browser extension",
  "content_scripts": [
    {
      "matches": ["<all_urls>"],
      "js": ["hello.js"] //The name of your script file.
      // "css": ["hello.css"] //The name of your css file.
    }
  ]
}
```
If you wonder what the `json` file format is, it's a normal text file with a special syntax such that a computer can easily read it. It's the `json` syntax you see in the code above. Let's go through what's being said here. (If you are not interested, just skip to the next step; after pasting this we are done here.)

- `manifest_version`: This is the version of the manifest file format. It's currently at version 3, and it's the latest version. It's important that you set this to 3, otherwise your plugin won't work.
- `name`: This is the name of your plugin. It can be anything you like.
- `version`: This is the version of your plugin. It can be anything you like.
- `description`: This is the description of your plugin. It can be anything you like.
- `content_scripts`: This is where you define what your plugin does. It's a list of scripts that will be executed when the browser loads a webpage. In this case we have one script, called `hello.js`. It's the script that we'll create in the next step.
- `matches`: This is a list of urls that the script will be executed on. In this case we have `<all_urls>`, which means that the script will be executed on all urls. You can also specify a specific url, like `https://brave.com/*`, which means that the script will only be executed on urls that start with `https://brave.com/`.
- `js`: This is a list of javascript files that will be executed. In this case we have one file, called `hello.js`. It's the script that we'll create in the next step.
- `css`: This is where you can add a list of css files that will be executed. In this case we have none, but you can add css files here if you want to.
- `//`: Text following these two characters are comments. They are ignored by the computer. You can add comments anywhere you like, and they are a good way to document your code.
Step 2: hello.js
Now it's time to create another file in your project folder. This time we'll call it `hello.js`. When created, open it in your text editor and paste the following code:

```js
console.log("Hello World!");
```

That's javascript code, and it's what will be executed when you run your plugin. It's a simple `console.log` statement, which will print the text "Hello World!" to the console. The console is a place where the browser prints out messages, and it's a good place to start when debugging your plugin.

Step 3: Load and launch your plugin
Firefox
Now it's time to load your plugin into your browser. Open your browser and go to the url `about:debugging#/runtime/this-firefox`. You should see a page that looks something like this:

Click the button that says "Load Temporary Add-on...". A file dialog will open, navigate to your project folder and select the file `manifest.json`. Your plugin should now be loaded and running.

Go to a website, any website, and open the inspector then navigate to the console. You'll find the inspector by right-clicking anywhere within the webpage, and click "Inspector" in the drop-down menu. When opening the console you might see some log messages from the site you visited and... you should see the text "Hello World!" printed there, from our little plugin! Congratulations!
Chrome
Open your browser and go to the url `chrome://extensions/`. Click the button that says "Load unpacked". A file dialog will open, navigate to your project folder and select the folder `hello-world-browser-plugin`. Your plugin should now be loaded and running.

Note the difference: selecting the file `manifest.json` in firefox, and selecting the folder `hello-world-browser-plugin` in chrome. Otherwise, the process is the same. So I'll repeat the same text as above (for those who skipped ahead..):

Go to a website, any website, and open the inspector then navigate to the console. You'll find the inspector by right-clicking anywhere within the webpage, and click "Inspector" in the drop-down menu. When opening the console you might see some log messages from the site you visited and... you should see the text "Hello World!" printed there, from our little plugin! Congratulations!
As you can see this isn't as complicated as one might think. Having performed a "Hello-World!" project is a very useful and valuable first step. These setup steps are the basics for any browser plugin, and you can build on this to create more advanced plugins.
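As one possible next step - a sketch of my own, not part of the tutorial above - the same content-script mechanism can modify pages rather than just log. Swapping the body of `hello.js` for something like this adds a visible banner to every page the plugin runs on:

```javascript
// Build the banner text in a plain helper so the logic is easy to test
// outside a browser.
function makeBannerText(title) {
  return "Plugin active on: " + title;
}

// Content scripts run with a `document` object provided by the browser;
// guard so the same file can also be loaded in Node for a quick check.
if (typeof document !== "undefined") {
  const banner = document.createElement("div");
  banner.textContent = makeBannerText(document.title);
  banner.style.cssText =
    "position:fixed;top:0;left:0;right:0;padding:4px;" +
    "background:gold;color:black;z-index:99999;text-align:center";
  document.body.prepend(banner);
}
```

Reload the temporary add-on after editing, and every matched page gets a gold strip at the top - a small but visible proof that your script is injected.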
-
@ 8fb140b4:f948000c
2023-08-22 12:14:34As the title states, scratch behind my ear and you get it. 🐶🐾🫡
-
@ 46fcbe30:6bd8ce4d
2024-06-18 18:50:07At WalletScrutiny we open a lot of issues. And just recently, Danny, whom I cannot mention on Habla.news due to this error, wrote a script to facilitate re-visiting issues we haven't re-visited in over half a year. Should we poke them all? 🤔
Issue | Comment | Issue
-----------|------------|----------------------
2019-12-14 | 2019-12-16 | App cannot be verified
2017-10-20 | 2020-04-11 | Builds are not reproducible
2019-12-14 | 2020-06-19 | App from GooglePlay is not verifiable
2020-08-27 | 2020-09-02 | Please provide full build instructions to reproduce the app on Google Play
2020-12-05 | 2020-12-14 | Provide build instructions to rebuild the app from Google Play
2020-12-16 | 2020-12-16 | Where is the code of DApp Browser?
2020-12-18 | 2021-01-02 | Reproducible builds of com.crypto.multiwallet on Google Play?
2021-03-06 | 2021-03-07 | Change minSdkVersion from 16 to 19
2017-12-20 | 2021-03-12 | Build APK deterministically
2021-03-05 | 2021-04-03 | Please update your build instruction and make sure the app on Google Play can be reproduced
2019-12-28 | 2021-04-03 | Please update your build instructions such that reproducible builds are possible
2021-04-09 | 2021-04-09 | Please provide detailled instructions on how to reproduce the app on Google Play
2021-03-07 | 2021-04-20 | Missing build instructions & build fail
2021-03-08 | 2021-04-22 | Build fail with errors "Error: No named parameter with the name"
2021-04-23 | 2021-04-23 | Where is the latest source-code for im.token.app?
2021-03-12 | 2021-05-02 | Missing code for latest 2.1 version in google-play
2021-04-01 | 2021-05-02 | Publish source code for the app
2021-05-03 | 2021-05-04 | Please provide build instructions for reproducing the apk from Google Play
2020-04-07 | 2021-05-16 | Please provide verifiability of released builds
2021-04-18 | 2021-05-19 | Where is the latest code and build instructions
2021-05-27 | 2021-05-27 | Missing code for latest release 3.2.7 on google-play
2020-01-29 | 2021-06-02 | Please provide verifiable builds
2019-12-22 | 2021-06-03 | Failed to verify the build on GooglePlay
2021-05-27 | 2021-06-08 | Android app is not reproducible, diffs in AndroidManifest.xml & 0.dist.js & dist.js & resources.arsc files
2020-06-01 | 2021-07-23 | Please make the app reproducible
2021-10-26 | 2021-12-02 | Publish sources for android mobile app id=io.divipay.divi
2021-12-11 | 2021-12-11 | Please provide latest source code and reproducible builds
2021-12-20 | 2021-12-20 | Please share the source and build instructions for the QUANTUM HW firmware
2022-02-08 | 2022-02-08 | Please add interpretation of transactions due to be signed
2021-10-17 | 2022-03-12 | Please provide build instructions to reproduce the version on Google Play Store
2021-12-02 | 2022-03-14 | Please provide instructions to reproduce your app found on Google Play
2021-04-21 | 2022-03-27 | Minimal Containers for Reproducible Builds testing for android app - WIP
2019-11-23 | 2022-03-28 | Build is not verifiable
2022-03-15 | 2022-04-10 | Show hash of firmware prior to installation
2021-04-12 | 2022-04-12 | Please provide build instructions to reproduce the app in Google Play Store
2020-04-27 | 2022-04-14 | Please provide reproducible builds
2022-04-08 | 2022-04-16 | Why the wallet readme is not detailed and no updated description?
2021-07-17 | 2022-07-22 | Publish source-code of releases before uploading them to Google Play
2021-12-01 | 2022-08-07 | Reproducible build fails due to a missing dependency
2021-02-14 | 2022-08-29 | Build failed if --production or npm ci are used instead of development
2022-03-01 | 2023-01-06 | Publish source code for mobile apps
2021-10-16 | 2023-04-13 | Missing Code for latest versions on google-play
2023-04-15 | 2023-04-15 | Missing Git Tag & Github release for latest version 5.5.1 on Google Play
2022-08-04 | 2023-04-20 | Add build instructions for all platforms & Containerfiles to compile app
2021-11-23 | 2023-04-22 | Missing source code for latest version 4.18
2021-03-12 | 2023-04-23 | Build apk fail with "because bitorzo_wallet_flutter depends on intl ^0.16.0, version solving failed."
2019-12-14 | 2023-04-24 | App on Google Play is not verifiable
2023-04-24 | 2023-04-24 | Publish code for newer version 23 on google-play
2023-04-27 | 2023-04-28 | Reproducibility problem on v0.32.0
2022-07-12 | 2023-05-25 | Missing code for version 1.3.11 on google-play
2023-05-27 | 2023-05-27 | reproducible builds
2023-06-20 | 2023-06-21 | Use --output-hashing none to prevent random names for built main.js
2023-06-25 | 2023-06-25 | The app is not reproducible
2023-02-21 | 2023-07-01 | Could not find com.google.android.gms:play-services-tapandpay:18.2.0
2021-10-09 | 2023-07-08 | Please fix build reproducibility
2023-05-22 | 2023-07-13 | v7.8.0 is not reproducible
2021-10-17 | 2023-07-31 | Missing x86_64 folder in ./app/src/main/cpp//lib with lib.a files & Instructions to reproduce binaries in lib folders
2021-01-21 | 2023-08-16 | How can I verify the app on Google Play was built from this source code?
2019-12-28 | 2023-09-30 | Please provide verifiable builds
2019-12-14 | 2023-11-02 | App on Google Play can't be verified
2019-12-30 | 2023-11-03 | Support verifiable builds
2021-02-21 | 2023-12-19 | Has the BitLox project died?
-
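Danny's actual script isn't shown in the post; the core idea - flag issues whose last comment is older than about half a year - can be sketched like this (the field names and cutoff are illustrative):

```python
from datetime import date, timedelta

def stale_issues(issues, today, max_age_days=182):
    """Return issues whose last comment is older than ~half a year."""
    cutoff = today - timedelta(days=max_age_days)
    return [i for i in issues if i["last_comment"] < cutoff]

# Two rows from the table above, as dicts.
issues = [
    {"title": "Builds are not reproducible", "last_comment": date(2020, 4, 11)},
    {"title": "Reproducibility problem on v0.32.0", "last_comment": date(2023, 4, 28)},
]
old = stale_issues(issues, today=date(2023, 6, 1))
# Only the 2020 issue is past the six-month cutoff on that date.
```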
@ 32e18276:5c68e245
2024-07-05 18:28:20Hey guys!
Here's our Q3-Q4 report. There was some confusion with the timing of the previous quarters so we've accumulated everything since the Q2 report (March 2nd, 2024) into this one. Enjoy!
Damus OpenSats Q3-Q4 Progress Report
The third and fourth quarters of the initial Damus OpenSats grant encompass work done across many of the active projects being worked on at Damus HQ. Here's a quick overview of the progress made across the organization:
nostrdb
Added
- Author query support (William Casarin)
- REQ support: ndb_subscribe (William Casarin)
- CLOSE support: ndb_unsubscribe (William Casarin)
- Profile flatbuffer dump utility (William Casarin)
- Ability to register a subscription callback instead of just polling (William Casarin)
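For context on the REQ/CLOSE entries above: these mirror the standard nostr (NIP-01) subscription messages a client sends to a relay. The actual nostrdb C signatures aren't shown here, so this sketch only builds the wire-level JSON those calls correspond to:

```python
import json

def req_message(sub_id, **filter_fields):
    """Build a NIP-01 REQ message: ["REQ", <sub_id>, <filter>]."""
    return json.dumps(["REQ", sub_id, filter_fields])

def close_message(sub_id):
    """Build a NIP-01 CLOSE message: ["CLOSE", <sub_id>]."""
    return json.dumps(["CLOSE", sub_id])

# Subscribe to kind-1 notes, capped at 100 results, then close.
msg = req_message("timeline", kinds=[1], limit=100)
bye = close_message("timeline")
```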
Changed
- Dump json in filter and fulltext queries in cli tool (William Casarin)
Stats
| name | added | removed | commits |
|------|-------|---------|---------|
| William Casarin <jb55@jb55.com> | +665 | -739 | 16 |
nostrdb-rust
Added
- Note builder interface (William Casarin)
- Nip10 implementation (William Casarin)
- Getting profile by key (William Casarin)
- Note tag support (William Casarin)
- `Tag::into_iter` method (Yuki Kishimoto)
- `FilterBuilder::until` method (Yuki Kishimoto)
- Export structs from `tags` module (Yuki Kishimoto)

Fixed

- `FilterBuilder::limit` method (Yuki Kishimoto)
- `Note::sig` method (Yuki Kishimoto)
Stats
| name | added | removed | commits |
|------|-------|---------|---------|
| Yuki Kishimoto <yukikishimoto@protonmail.com> | +20 | -4 | 5 |
| William Casarin <jb55@jb55.com> | +1551 | -867 | 32 |
Damus NoteDeck/Android
This project is in heavy development and doesn't have clean changelogs yet, so here are all of the git logs from this period
William's updates
- account_manager: don't add the same pubkey more than once
- actionbar placeholder
- actionbar: remove border on reply button, add expand animation
- actionbar: support both dark and light buttons
- Add forward navigation animation
- add hover to profile map demo
- add --light lightmode flag to previews and notedeck
- add my pubkey to the timeline query for testing
- add test queries for elsat
- add vertical separators between timelines
- allow missing profiles in account switcher
- allow multiple invocations of --sec
- always have some default timeline
- app: make ndb public
- a quick note about the View trait
- args: allow filter files and raw filters on cli
- bump timeline filter limit
- cap size of remote filter limits
- change selection color to purple
- check: remove duplicate and disable wasm checks
- ci: disable wasm builds for now
- cli: add --sec argument to quickly add an account
- clippy postbox cleanups
- crates: add bitflags
- damus: make relay pool public
- darker quote reposts for mobile
- debug: add some debug logs for keyboard note selection
- deps: using tracing instead of log
- direnv: disable nix direnv on darwin
- direnv: make args explicit so its easy to change
- don't bump timeline when adding notes
- dont since-optimize when we don't have enough notes
- don't use an actual keystorage in Damus test apps
- drafts: clear post after posting
- enable nip10 replies
- enostr: rename Event to Note
- env: allow private things like secret keys
- filter: add default limit
- filters: fix hashtag filters
- filter: unify nostrdb and remote limits
- Fix all warnings
- fixed_window: fix clippy warning
- fix global query
- fix inline note preview colors
- fix light borders for quote reposts
- fix postbox design
- fix scroll area shrinking
- fix some clippy warnings
- fix until filters
- fix warnings
- fonts: always use base fonts for all families
- fun large profile grid preview
- get all missing ids from all timelines
- github: remove pages workflow
- gitignore: add perf files
- hide navigation title on global nav
- ids: find more unknown ids from inline notes
- images: add aspect_fill
- initial inline note previews
- initial navigation
- initial note posting
- initial postbox for testing
- initial post box view
- initial post reply view
- initial refactor in preparation for routing
- input: fix deadlock on resize
- input: handle raw events
- input: process input once
- input: switch to scanning over raw input events
- integrate sidebar into main view
- introduce View and Previews traits
- irc mode
- ircmode: tighten username
- keys: actually use the keystore
- lighter gray for inline previews in white mode
- list: switch from arc mutext to rc refcell
- load timelines from commandline
- log: reduce spammy new note log
- log relay subscriptions
- make notedeck front and center
- mobile: add lots of top padding on mobile
- mobile: a tad more padding at the top
- mobile: black panel bg color
- mobile: don't add hover on mobile
- mobile: make mobile flag runtime-configurable
- mobile: simplify is_mobile
- mobile: use my timeline for testing again
- move account_login_view to ui submodule
- move add_space to its correct location
- nip10: fetch unknown replied-to notes
- nip10: show initial reply information on notes
- nix: add openssl dep for reqwest
- nix: make android emulator optional
- nostrdb: bump to support timeline queries
- note: don't allow nested note previews
- note: support nprofile mentions
- perf: don't clone bytes via a more specialized function
- perf: fix continuous repaint
- perf: profiling note content rendering
- popup: increase fixed window margin
- popup: simplify borrow
- postbox: add focused state
- postbox: add focus shadow from figma
- post: full available width
- postview: add vertical scroll
- previews: remove central panel frame
- profile: add about and username to profile previews
- profiles: introduce DisplayNames
- profiling: update puffin version
- queries: add a few more test queries
- queries: set reasonable initial query sizes
- queries: update hashtags query
- readme: add usage
- readme: document previews
- readme: make it clear that its for desktop
- readme: make it clear that nix is optional for non-android dev
- readme: show notedeck preview
- refactor: move fixed_window to ui
- refactor: move note into its own widget
- refactor: rename widgets to ui
- refactor: use map instead of explicit `if let`
- remote: use default_remote_limit instead of hardcoding
- remove context from DesktopSidePanel
- remove minimum window size
- remove useless struct
- render_pfp: only depend on imgcache
- reset virtual list if notes are spliced into timeline
- set tooltip delay to 100ms
- side_panel: return more detailed side panel responses
- simplify is_mobile check
- simplify tab underline calculation
- since filter optimization
- slightly larger pfpfs
- small inline preview pfps
- style: initial logic for mobile style
- style: less bubbly
- style: realtime hover interactions
- style: set 0 duration tooltip delay
- Switch to egui master to fix wgpu issues
- switch to glow for now
- switch to rustls-tls-native-roots
- tabs: select notes & replies by default
- test: add test timeline query
- test: update test profile with a banner
- test: use my timeline query by default for testing
- textmode: adjust name sizing so it doesn't bump the column
- textmode: fix coloring of abbreviated names
- textmode: no previews in textmore for now
- theme: update to correct colors based on figma design
- timeline: don't hide on resize
- timeline: refactor tabs into TimelineView
- timeline: use strips instead of panels
- ui: add banners to profile previews
- ui: add feathering-resistant hline separator fn
- ui: add frame around note previews
- ui: add initial Profile hover previews
- ui: add profile picture hover animation
- ui: integrate egui-tabs for notes & replies selector
- ui: introduce profile picture widget
- ui: make pixel sizes correct, use more of the figma
- ui: move RelayView into ui module
- ui: only inline preview kind1 notes for now
- ui: reduce vertical padding for more compact timelines
- ui: remove some old note ui comments
- ui: simplify note widget
- ui: switch to flags for note options
- update slow timeline query
- update to latest nostrdb, adding tag support
- update version in comment because I still use this on my laptop
- use egui_virtual_list for rendering
- use unique id for post/reply view
- weird actionbar experiment
- we should just called this damus
- widgets: begin organizing ui components into widgets
- working notes + notes&replies
kernelkind's updates
- account_management: refactor
- AccountManagementView
- AccountManager: add ability to make a selection
- AccountSelectionWidget
- Account switcher
- Add AccountManager to app
- Add custom visual themes
- Add flag to emulate mobile during preview
- Add keypair & update pubkey
- Add login key parsing
- Add login manager
- Add MacOS key storage
- Add MobileAccountLoginView
- Add modular custom text styles
- Add monospace text style
- Add nix-shell for macos android development
- Add nostr-sdk dependency
- Add precommit hook to suggest proper formatting
- Add preview for RelayView
- Add relay view
- add reqwest as dependency
- Add SimpleProfilePreview
- Add ui_test_harness binary implementation
- Add warn color and highlight color
- app: add accounts to Damus & mock
- Apply app style to AccountLoginView
- Apply cargo fmt
- Conditionally compile MacOS key storage code
- Constrain app minimum size
- Create account login panel
- Create side panel & global popup
- enostr: expose RelayStatus
- Extract the sample relay to test_data for reuse
- impl linux credential storage
- Integrate account switcher to side panel
- Integrate global popup into app
- login: Add key loading & error to mobile
- macos_key_storage: runner ignore tests
- migrate AccountManagementView to enostr Keypair
- Migrate to new AccountManagementView conception
- Move login business logic to LoginManager
- move login logic from promise to async fns
- Move preview controller out of account_manager.rs
- move test account creation & apply to global popup
- Pfp integration to side panel
- refactor account switcher & management previews
- Refactor 'ui tests' conception to previews
- reintroduce account management
- reintroduce account switcher
- remove nostr-sdk
- Rename PersistState -> StateInMemory
- select first account
- Seperate mobile dark color theme into its own func
- update cargo.lock to reflect toml change
- Update egui 0.26.1 -> 0.27.1
- Use app_creation for common app setup functions
- Use custom style in app
- use nostr in enostr
Stats
| name | added | removed | commits |
|------|-------|---------|---------|
| kernelkind <kernelkind@gmail.com> | +6844 | -1979 | 58 |
| William Casarin <jb55@jb55.com> | +9733 | -5583 | 194 |
Damus iOS
Added
- Ability to create highlights (ericholguin)
- Add ability to mute hashtag from SearchView (Charlie Fish)
- Add ability to mute words, add new mutelist interface (Charlie) (William Casarin)
- Added callbackuri for a better ux when connecting mutiny wallet nwc (ericholguin)
- Added nip10 marker replies (William Casarin)
- Added title image and tags to longform events (ericholguin)
- Add event content preview to the full screen carousel (Daniel D’Aquino)
- Add First Aid solution for users who do not have a contact list created for their account (Daniel D’Aquino)
- Add marker nip10 support when reading notes (William Casarin)
- Add video cache to save network bandwidth (Daniel D’Aquino)
- Completely new threads experience that is easier and more pleasant to use (Daniel D’Aquino)
- Connect to Mutiny Wallet Button (ericholguin)
- Highlights (NIP-84) (ericholguin)
- Notification reminders for Damus Purple impending expiration (Daniel D’Aquino)
- Proxy Tags are now viewable on Selected Events (ericholguin)
- Relay fees metadata (ericholguin)
- Revamp emoji picker to be less error-prone and add search, frequently used, and multiple skin tone support capabilities (Terry Yiu)
- Show list of quoted reposts in threads (William Casarin)
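The "marker" nip10 items above refer to NIP-10's marked `e` tags, where a reply carries `["e", <event-id>, <relay-url>, "root"|"reply"]`. A small sketch of how a client can pick out the thread root and the direct parent from a note's tags (this is the NIP-10 tag convention, not Damus's actual Swift code):

```python
def nip10_refs(tags):
    """Return (root_id, parent_id) from a note's marked e-tags."""
    root = reply = None
    for tag in tags:
        if len(tag) >= 4 and tag[0] == "e":
            if tag[3] == "root":
                root = tag[1]
            elif tag[3] == "reply":
                reply = tag[1]
    # A direct reply to the root may carry only a "root" marker.
    return root, reply or root

# Hypothetical tag list for a reply two levels deep in a thread.
tags = [
    ["e", "root-id", "wss://relay.example.com", "root"],
    ["e", "parent-id", "wss://relay.example.com", "reply"],
    ["p", "some-pubkey"],
]
root, parent = nip10_refs(tags)
```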
Changed
- Added first aid contact damus support email (alltheseas)
- Add support for Apple In-App purchases (Daniel D’Aquino)
- Always strip GPS data from images (kernelkind)
- Change reactions to use a native looking emoji picker (Terry Yiu)
- Errors with an NWC will show as an alert (ericholguin)
- Improve UX around video playback (Daniel D’Aquino)
- Moved paste nwc button to main wallet view (ericholguin)
- Relay config view user interface (ericholguin)
- Relay detail design (ericholguin)
- Updated Zeus logo (ericholguin)
Fixed
- Fix broken GIF uploads (Daniel D’Aquino)
- Fix bug that would cause connection issues with relays defined with a trailing slash URL, and an inability to delete them. (Daniel D’Aquino)
- Fixed issue where some replies were including the q tag (William Casarin)
- Fixed issue where timeline was scrolling when it isn't supposed to (William Casarin)
- Fixed threads not loading sometimes (William Casarin)
- Fixed wallet not disconnecting when a user logs out (ericholguin)
- Fix emoji reactions being cut off (ericholguin)
- Fix ghost notifications caused by Purple impending expiration notifications (Daniel D’Aquino)
- Fix image indicators to limit number of dots to not spill screen beyond visible margins (ericholguin)
- Fix in-app purchase issue that would trigger an error on purchase before confirming the account information. (Daniel D’Aquino)
- Fix issue where bootstrap relays would inadvertently be added to the user's list on connectivity issues (Daniel D’Aquino)
- Fix missing Mute button in profile view menu (Terry Yiu)
- Fix nostrscripts not loading (William Casarin)
- Fix profile not updating bug (William Casarin)
- Fix stale feed issue when follow list is too big (Daniel D’Aquino)
- Fix thread bug where a quote isn't picked up as a reply (William Casarin)
- Fix welcome screen not showing if the user enters the app directly after a successful checkout without going through the link (Daniel D’Aquino)
- Improve reliability of contact list creation during onboarding (Daniel D’Aquino)
- Issue where NWC Scanner view would not dismiss after a failed scan/paste (ericholguin)
Stats
| name | added | removed | commits |
|------|-------|---------|---------|
| Fonta1n3 <fontainedenton@googlemail.com> | +1 | -0 | 1 |
| Sean Kibler <skibler@protonmail.com> | +13 | -2 | 1 |
| alltheseas <64376233+alltheseas> | +65 | -1 | 4 |
| Transifex <transifex@transifex.com> | +581 | -39 | 36 |
| Terry Yiu <git@tyiu.xyz> | +763 | -1383 | 8 |
| William Casarin <jb55@jb55.com> | +1017 | -1564 | 49 |
| ericholguin <ericholguin@apache.org> | +1962 | -643 | 13 |
| Daniel D’Aquino <daniel@daquino.me> | +3885 | -761 | 48 |
-
@ 8fb140b4:f948000c
2023-07-30 00:35:01Test Bounty Note
-
@ 8e7462a8:c723a97b
2024-07-21 16:00:55# Bitcoin: A Peer-to-Peer Electronic Cash System

Satoshi Nakamoto
[satoshin@gmx.com](mailto:satoshin@gmx.com)
www.bitcoin.org

**Abstract.** A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution. Digital signatures provide part of the solution, but the main benefits are lost if a trusted third party is still required to prevent double-spending. We propose a solution to the double-spending problem using a peer-to-peer network. The network timestamps transactions by hashing them into an ongoing chain of hash-based proof-of-work, forming a record that cannot be changed without redoing the proof-of-work. The longest chain not only serves as proof of the sequence of events witnessed, but proof that it came from the largest pool of CPU power. As long as a majority of CPU power is controlled by nodes that are not cooperating to attack the network, they'll generate the longest chain and outpace attackers. The network itself requires minimal structure. Messages are broadcast on a best effort basis, and nodes can leave and rejoin the network at will, accepting the longest proof-of-work chain as proof of what happened while they were gone.

## 1. Introduction

Commerce on the Internet has come to rely almost exclusively on financial institutions serving as trusted third parties to process electronic payments. While the system works well enough for most transactions, it still suffers from the inherent weaknesses of the trust based model. Completely non-reversible transactions are not really possible, since financial institutions cannot avoid mediating disputes. The cost of mediation increases transaction costs, limiting the minimum practical transaction size and cutting off the possibility for small casual transactions, and there is a broader cost in the loss of ability to make non-reversible payments for non-reversible services.
With the possibility of reversal, the need for trust spreads. Merchants must be wary of their customers, hassling them for more information than they would otherwise need. A certain percentage of fraud is accepted as unavoidable. These costs and payment uncertainties can be avoided in person by using physical currency, but no mechanism exists to make payments over a communications channel without a trusted party. What is needed is an electronic payment system based on cryptographic proof instead of trust, allowing any two willing parties to transact directly with each other without the need for a trusted third party. Transactions that are computationally impractical to reverse would protect sellers from fraud, and routine escrow mechanisms could easily be implemented to protect buyers. In this paper, we propose a solution to the double-spending problem using a peer-to-peer distributed timestamp server to generate computational proof of the chronological order of transactions. The system is secure as long as honest nodes collectively control more CPU power than any cooperating group of attacker nodes. ## 2. Transactions We define an electronic coin as a chain of digital signatures. Each owner transfers the coin to the next by digitally signing a hash of the previous transaction and the public key of the next owner and adding these to the end of the coin. A payee can verify the signatures to verify the chain of ownership. ``` ┌─────────────────────┐ ┌─────────────────────┐ ┌─────────────────────┐ │ │ │ │ │ │ │ Transaction │ │ Transaction │ │ Transaction │ │ │ │ │ │ │ │ ┌─────────────┐ │ │ ┌─────────────┐ │ │ ┌─────────────┐ │ │ │ Owner 1's │ │ │ │ Owner 2's │ │ │ │ Owner 3's │ │ │ │ Public Key │ │ │ │ Public Key │ │ │ │ Public Key │ │ │ └───────┬─────┘ │ │ └───────┬─────┘ │ │ └───────┬─────┘ │ │ │ . │ │ │ . │ │ │ │ ──────┼─────────┐ │ . ├───────────────┼─────────┐ │ . ├──────────────┼─────────┐ │ │ │ │ │ . │ │ │ │ . │ │ │ │ │ │ ┌──▼─▼──┐ . │ │ ┌──▼─▼──┐ . 
│ │ ┌──▼─▼──┐ │ │ │ Hash │ . │ │ │ Hash │ . │ │ │ Hash │ │ │ └───┬───┘ . │ Verify │ └───┬───┘ . │ Verify │ └───┬───┘ │ │ │ ............................ │ ........................... │ │ │ │ │ │ │ │ │ │ │ │ │ │ ┌──────▼──────┐ │ │ ┌─▼────▼──────┐ │ │ ┌─▼────▼──────┐ │ │ │ Owner 0's │ │ Sign │ │ Owner 1's │ │ Sign │ │ Owner 2's │ │ │ │ Signature │ │ ...........─►│ Signature │ │ ...........─►│ Signature │ │ │ └─────────────┘ │ . │ └─────────────┘ │ . │ └─────────────┘ │ │ │ . │ │ . │ │ └─────────────────────┘ . └─────────────────────┘ . └─────────────────────┘ . . ┌─────────────┐ . ┌─────────────┐ . ┌─────────────┐ │ Owner 1's │........... │ Owner 2's │.......... │ Owner 3's │ │ Private Key │ │ Private Key │ │ Private Key │ └─────────────┘ └─────────────┘ └─────────────┘ ``` The problem of course is the payee can't verify that one of the owners did not double-spend the coin. A common solution is to introduce a trusted central authority, or mint, that checks every transaction for double spending. After each transaction, the coin must be returned to the mint to issue a new coin, and only coins issued directly from the mint are trusted not to be double-spent. The problem with this solution is that the fate of the entire money system depends on the company running the mint, with every transaction having to go through them, just like a bank. We need a way for the payee to know that the previous owners did not sign any earlier transactions. For our purposes, the earliest transaction is the one that counts, so we don't care about later attempts to double-spend. The only way to confirm the absence of a transaction is to be aware of all transactions. In the mint based model, the mint was aware of all transactions and decided which arrived first. To accomplish this without a trusted party, transactions must be publicly announced [^1], and we need a system for participants to agree on a single history of the order in which they were received. 
The payee needs proof that at the time of each transaction, the majority of nodes agreed it was the first received. ## 3. Timestamp Server The solution we propose begins with a timestamp server. A timestamp server works by taking a hash of a block of items to be timestamped and widely publishing the hash, such as in a newspaper or Usenet post [^2] [^3] [^4] [^5]. The timestamp proves that the data must have existed at the time, obviously, in order to get into the hash. Each timestamp includes the previous timestamp in its hash, forming a chain, with each additional timestamp reinforcing the ones before it. ``` ┌──────┐ ┌──────┐ ────────────►│ ├───────────────────────►│ ├───────────────────► │ Hash │ │ Hash │ ┌───►│ │ ┌───►│ │ │ └──────┘ │ └──────┘ │ │ ┌┴──────────────────────────┐ ┌┴──────────────────────────┐ │ Block │ │ Block │ │ ┌─────┐ ┌─────┐ ┌─────┐ │ │ ┌─────┐ ┌─────┐ ┌─────┐ │ │ │Item │ │Item │ │... │ │ │ │Item │ │Item │ │... │ │ │ └─────┘ └─────┘ └─────┘ │ │ └─────┘ └─────┘ └─────┘ │ │ │ │ │ └───────────────────────────┘ └───────────────────────────┘ ``` ## 4. Proof-of-Work To implement a distributed timestamp server on a peer-to-peer basis, we will need to use a proof-of-work system similar to Adam Back's Hashcash [^6], rather than newspaper or Usenet posts. The proof-of-work involves scanning for a value that when hashed, such as with SHA-256, the hash begins with a number of zero bits. The average work required is exponential in the number of zero bits required and can be verified by executing a single hash. For our timestamp network, we implement the proof-of-work by incrementing a nonce in the block until a value is found that gives the block's hash the required zero bits. Once the CPU effort has been expended to make it satisfy the proof-of-work, the block cannot be changed without redoing the work. As later blocks are chained after it, the work to change the block would include redoing all the blocks after it. 
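The nonce search just described can be sketched in a few lines. This is an illustrative toy, not Bitcoin's real block format: difficulty is modeled here as a required count of leading zero bits over a single SHA-256, and the "block" is an arbitrary byte string.

```python
import hashlib

def leading_zero_bits(digest: bytes) -> int:
    """Count the leading zero bits of a hash digest."""
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
            continue
        # Count the zero bits in the first non-zero byte, then stop.
        bits += 8 - byte.bit_length()
        break
    return bits

def mine(block_data: bytes, difficulty_bits: int) -> int:
    """Increment a nonce until the block's hash has the required zero bits."""
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if leading_zero_bits(digest) >= difficulty_bits:
            return nonce
        nonce += 1

nonce = mine(b"prev_hash|tx1|tx2", difficulty_bits=16)
digest = hashlib.sha256(b"prev_hash|tx1|tx2" + nonce.to_bytes(8, "big")).digest()
print(digest.hex())  # starts with at least 16 zero bits, i.e. four hex zeros
```

The exponential cost is visible directly: each extra required zero bit doubles the expected number of hashes, while verification stays a single hash. Real Bitcoin hashes a structured 80-byte header with double SHA-256 and compares against a 256-bit target rather than counting bits.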
``` ┌────────────────────────────────────────┐ ┌────────────────────────────────────────┐ │ Block │ │ Block │ │ ┌──────────────────┐ ┌──────────────┐ │ │ ┌──────────────────┐ ┌──────────────┐ │ ───────┼─►│ Prev Hash │ │ Nonce │ ├──────┼─►│ Prev Hash │ │ Nonce │ │ │ └──────────────────┘ └──────────────┘ │ │ └──────────────────┘ └──────────────┘ │ │ │ │ │ │ ┌──────────┐ ┌──────────┐ ┌──────────┐ │ │ ┌──────────┐ ┌──────────┐ ┌──────────┐ │ │ │ Tx │ │ Tx │ │ ... │ │ │ │ Tx │ │ Tx │ │ ... │ │ │ └──────────┘ └──────────┘ └──────────┘ │ │ └──────────┘ └──────────┘ └──────────┘ │ │ │ │ │ └────────────────────────────────────────┘ └────────────────────────────────────────┘ ``` The proof-of-work also solves the problem of determining representation in majority decision making. If the majority were based on one-IP-address-one-vote, it could be subverted by anyone able to allocate many IPs. Proof-of-work is essentially one-CPU-one-vote. The majority decision is represented by the longest chain, which has the greatest proof-of-work effort invested in it. If a majority of CPU power is controlled by honest nodes, the honest chain will grow the fastest and outpace any competing chains. To modify a past block, an attacker would have to redo the proof-of-work of the block and all blocks after it and then catch up with and surpass the work of the honest nodes. We will show later that the probability of a slower attacker catching up diminishes exponentially as subsequent blocks are added. To compensate for increasing hardware speed and varying interest in running nodes over time, the proof-of-work difficulty is determined by a moving average targeting an average number of blocks per hour. If they're generated too fast, the difficulty increases. ## 5. Network The steps to run the network are as follows: 1. New transactions are broadcast to all nodes. 2. Each node collects new transactions into a block. 3. Each node works on finding a difficult proof-of-work for its block. 4. 
When a node finds a proof-of-work, it broadcasts the block to all nodes. 5. Nodes accept the block only if all transactions in it are valid and not already spent. 6. Nodes express their acceptance of the block by working on creating the next block in the chain, using the hash of the accepted block as the previous hash. Nodes always consider the longest chain to be the correct one and will keep working on extending it. If two nodes broadcast different versions of the next block simultaneously, some nodes may receive one or the other first. In that case, they work on the first one they received, but save the other branch in case it becomes longer. The tie will be broken when the next proof-of-work is found and one branch becomes longer; the nodes that were working on the other branch will then switch to the longer one. New transaction broadcasts do not necessarily need to reach all nodes. As long as they reach many nodes, they will get into a block before long. Block broadcasts are also tolerant of dropped messages. If a node does not receive a block, it will request it when it receives the next block and realizes it missed one. ## 6. Incentive By convention, the first transaction in a block is a special transaction that starts a new coin owned by the creator of the block. This adds an incentive for nodes to support the network, and provides a way to initially distribute coins into circulation, since there is no central authority to issue them. The steady addition of a constant of amount of new coins is analogous to gold miners expending resources to add gold to circulation. In our case, it is CPU time and electricity that is expended. The incentive can also be funded with transaction fees. If the output value of a transaction is less than its input value, the difference is a transaction fee that is added to the incentive value of the block containing the transaction. 
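The fee rule above (the fee is whatever input value is not claimed by an output) is easy to state precisely; a minimal sketch with made-up amounts:

```python
def transaction_fee(inputs: list[int], outputs: list[int]) -> int:
    """Implicit fee: total input value minus total output value."""
    fee = sum(inputs) - sum(outputs)
    if fee < 0:
        raise ValueError("outputs exceed inputs: invalid transaction")
    return fee

# One 50-coin input paying 30 to the payee and 19 back as change
# leaves 1 coin as the fee claimed by the block's creator:
print(transaction_fee([50], [30, 19]))  # 1
```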
Once a predetermined number of coins have entered circulation, the incentive can transition entirely to transaction fees and be completely inflation free. The incentive may help encourage nodes to stay honest. If a greedy attacker is able to assemble more CPU power than all the honest nodes, he would have to choose between using it to defraud people by stealing back his payments, or using it to generate new coins. He ought to find it more profitable to play by the rules, such rules that favour him with more new coins than everyone else combined, than to undermine the system and the validity of his own wealth. ## 7. Reclaiming Disk Space Once the latest transaction in a coin is buried under enough blocks, the spent transactions before it can be discarded to save disk space. To facilitate this without breaking the block's hash, transactions are hashed in a Merkle Tree [^7] [^2] [^5], with only the root included in the block's hash. Old blocks can then be compacted by stubbing off branches of the tree. The interior hashes do not need to be stored. ``` ┌──────────────────────────────────────────┐ ┌──────────────────────────────────────────┐ │ │ │ │ │ Block ┌─────────────────────────────┐ │ │ Block ┌─────────────────────────────┐ │ │ │ Block Header (Block Hash) │ │ │ │ Block Header (Block Hash) │ │ │ │ ┌────────────┐ ┌─────────┐ │ │ │ │ ┌────────────┐ ┌─────────┐ │ │ │ │ │ Prev Hash │ │ Nonce │ │ │ │ │ │ Prev Hash │ │ Nonce │ │ │ │ │ └────────────┘ └─────────┘ │ │ │ │ └────────────┘ └─────────┘ │ │ │ │ │ │ │ │ │ │ │ │ ┌─────────────┐ │ │ │ │ ┌─────────────┐ │ │ │ │ │ Root Hash │ │ │ │ │ │ Root Hash │ │ │ │ │ └─────▲─▲─────┘ │ │ │ │ └─────▲─▲─────┘ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ └───────────┼─┼───────────────┘ │ │ └───────────┼─┼───────────────┘ │ │ │ │ │ │ │ │ │ │ .......... │ │ .......... │ │ ┌────────┐ │ │ .......... │ │ . ─────┘ └─────. . │ │ │ ├────┘ └─────. . │ │ . Hash01 . . Hash23 . │ │ │ Hash01 │ . Hash23 . │ │ .▲.....▲.. .▲.....▲.. 
│ │ │ │ .▲.....▲.. │ │ │ │ │ │ │ │ └────────┘ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ .....│.. ..│..... .....│.. ..│..... │ │ ┌────┴─┐ ..│..... │ │ . . . . . . . . │ │ │ │ . . │ │ .Hash0 . .Hash1 . .Hash2 . .Hash3 . │ │ │Hash2 │ .Hash3 . │ │ ...▲.... ...▲.... ...▲.... ...▲.... │ │ │ │ . . │ │ │ │ │ │ │ │ └──────┘ ...▲.... │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ │ ┌──┴───┐ ┌──┴───┐ ┌──┴───┐ ┌──┴───┐ │ │ ┌──┴───┐ │ │ │ Tx0 │ │ Tx1 │ │ Tx2 │ │ Tx3 │ │ │ │ Tx3 │ │ │ └──────┘ └──────┘ └──────┘ └──────┘ │ │ └──────┘ │ │ │ │ │ └──────────────────────────────────────────┘ └──────────────────────────────────────────┘ Transactions Hashed in a Merkle Tree After Pruning Tx0-2 from the Block ``` A block header with no transactions would be about 80 bytes. If we suppose blocks are generated every 10 minutes, 80 bytes * 6 * 24 * 365 = 4.2MB per year. With computer systems typically selling with 2GB of RAM as of 2008, and Moore's Law predicting current growth of 1.2GB per year, storage should not be a problem even if the block headers must be kept in memory. ## 8. Simplified Payment Verification It is possible to verify payments without running a full network node. A user only needs to keep a copy of the block headers of the longest proof-of-work chain, which he can get by querying network nodes until he's convinced he has the longest chain, and obtain the Merkle branch linking the transaction to the block it's timestamped in. He can't check the transaction for himself, but by linking it to a place in the chain, he can see that a network node has accepted it, and blocks added after it further confirm the network has accepted it. 
``` Longest Proof-of-Work Chain ┌────────────────────────────────────────┐ ┌────────────────────────────────────────┐ ┌────────────────────────────────────────┐ │ Block Header │ │ Block Header │ │ Block Header │ │ ┌──────────────────┐ ┌──────────────┐ │ │ ┌──────────────────┐ ┌──────────────┐ │ │ ┌──────────────────┐ ┌──────────────┐ │ ───────┼─►│ Prev Hash │ │ Nonce │ ├──────┼─►│ Prev Hash │ │ Nonce │ ├───────┼─►│ Prev Hash │ │ Nonce │ ├────────► │ └──────────────────┘ └──────────────┘ │ │ └──────────────────┘ └──────────────┘ │ │ └──────────────────┘ └──────────────┘ │ │ │ │ │ │ │ │ ┌───────────────────┐ │ │ ┌────────────────────┐ │ │ ┌───────────────────┐ │ │ │ Merkle Root │ │ │ │ Merkle Root │ │ │ │ Merkle Root │ │ │ └───────────────────┘ │ │ └────────▲─▲─────────┘ │ │ └───────────────────┘ │ │ │ │ │ │ │ │ │ └────────────────────────────────────────┘ └─────────────┼─┼────────────────────────┘ └────────────────────────────────────────┘ │ │ │ │ ┌────────┐ │ │ .......... │ ├────┘ └─────. . │ Hash01 │ . Hash23 . │ │ .▲.....▲.. └────────┘ │ │ │ │ │ │ Merkle Branch for Tx3 │ │ ┌─────┴─┐ ..│..... │ │ . . │ Hash2 │ .Hash3 . │ │ . . └───────┘ ...▲.... │ │ ┌───┴───┐ │ Tx3 │ └───────┘ ``` As such, the verification is reliable as long as honest nodes control the network, but is more vulnerable if the network is overpowered by an attacker. While network nodes can verify transactions for themselves, the simplified method can be fooled by an attacker's fabricated transactions for as long as the attacker can continue to overpower the network. One strategy to protect against this would be to accept alerts from network nodes when they detect an invalid block, prompting the user's software to download the full block and alerted transactions to confirm the inconsistency. Businesses that receive frequent payments will probably still want to run their own nodes for more independent security and quicker verification. ## 9. 
Combining and Splitting Value Although it would be possible to handle coins individually, it would be unwieldy to make a separate transaction for every cent in a transfer. To allow value to be split and combined, transactions contain multiple inputs and outputs. Normally there will be either a single input from a larger previous transaction or multiple inputs combining smaller amounts, and at most two outputs: one for the payment, and one returning the change, if any, back to the sender. ``` ┌──────────────────────┐ │ Transaction │ │ │ │ ┌─────┐ ┌─────┐ │ ─────┼──►│ in │ │ out │ ──┼─────► │ └─────┘ └─────┘ │ │ │ │ │ │ ┌─────┐ ┌─────┐ │ ─────┼──►│ in │ │ ... │ ──┼─────► │ └─────┘ └─────┘ │ │ │ │ │ │ ┌─────┐ │ ─────┼──►│... │ │ │ └─────┘ │ │ │ └──────────────────────┘ ``` It should be noted that fan-out, where a transaction depends on several transactions, and those transactions depend on many more, is not a problem here. There is never the need to extract a complete standalone copy of a transaction's history. ## 10. Privacy The traditional banking model achieves a level of privacy by limiting access to information to the parties involved and the trusted third party. The necessity to announce all transactions publicly precludes this method, but privacy can still be maintained by breaking the flow of information in another place: by keeping public keys anonymous. The public can see that someone is sending an amount to someone else, but without information linking the transaction to anyone. This is similar to the level of information released by stock exchanges, where the time and size of individual trades, the "tape", is made public, but without telling who the parties were. 
``` Traditional Privacy Models │ ┌─────────────┐ ┌──────────────┐ │ ┌────────┐ ┌──────────────┐ ┌──────────────┐ │ Trusted │ │ │ │ │ │ │ Identities ├──┤ Transactions ├───►│ Third Party ├──►│ Counterparty │ │ │ Public │ └──────────────┘ └──────────────┘ │ │ │ │ │ │ │ └─────────────┘ └──────────────┘ │ └────────┘ │ New Privacy Model ┌────────┐ ┌──────────────┐ │ ┌──────────────┐ │ │ │ Identities │ │ │ Transactions ├───►│ Public │ └──────────────┘ │ └──────────────┘ │ │ └────────┘ ``` As an additional firewall, a new key pair should be used for each transaction to keep them from being linked to a common owner. Some linking is still unavoidable with multi-input transactions, which necessarily reveal that their inputs were owned by the same owner. The risk is that if the owner of a key is revealed, linking could reveal other transactions that belonged to the same owner. ## 11. Calculations We consider the scenario of an attacker trying to generate an alternate chain faster than the honest chain. Even if this is accomplished, it does not throw the system open to arbitrary changes, such as creating value out of thin air or taking money that never belonged to the attacker. Nodes are not going to accept an invalid transaction as payment, and honest nodes will never accept a block containing them. An attacker can only try to change one of his own transactions to take back money he recently spent. The race between the honest chain and an attacker chain can be characterized as a Binomial Random Walk. The success event is the honest chain being extended by one block, increasing its lead by +1, and the failure event is the attacker's chain being extended by one block, reducing the gap by -1. The probability of an attacker catching up from a given deficit is analogous to a Gambler's Ruin problem. Suppose a gambler with unlimited credit starts at a deficit and plays potentially an infinite number of trials to try to reach breakeven. 
We can calculate the probability he ever reaches breakeven, or that an attacker ever catches up with the honest chain, as follows [^8]:

```plaintext
p = probability an honest node finds the next block
q = probability the attacker finds the next block
q_z = probability the attacker will ever catch up from z blocks behind
```

$$
q_z = \begin{cases} 1 & \text{if } p \leq q \\ \left(\frac{q}{p}\right)^z & \text{if } p > q \end{cases}
$$

Given our assumption that p > q, the probability drops exponentially as the number of blocks the attacker has to catch up with increases. With the odds against him, if he doesn't make a lucky lunge forward early on, his chances become vanishingly small as he falls further behind.

We now consider how long the recipient of a new transaction needs to wait before being sufficiently certain the sender can't change the transaction. We assume the sender is an attacker who wants to make the recipient believe he paid him for a while, then switch it to pay back to himself after some time has passed. The receiver will be alerted when that happens, but the sender hopes it will be too late.

The receiver generates a new key pair and gives the public key to the sender shortly before signing. This prevents the sender from preparing a chain of blocks ahead of time by working on it continuously until he is lucky enough to get far enough ahead, then executing the transaction at that moment. Once the transaction is sent, the dishonest sender starts working in secret on a parallel chain containing an alternate version of his transaction. The recipient waits until the transaction has been added to a block and z blocks have been linked after it.
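The attacker catch-up probability defined above can be evaluated directly; a minimal sketch:

```python
def catch_up_probability(q: float, z: int) -> float:
    """Probability an attacker with share q of the hash power ever erases a z-block deficit."""
    p = 1.0 - q
    if p <= q:
        return 1.0  # an attacker with half the power or more catches up with certainty
    return (q / p) ** z

# With 30% of the CPU power, the odds fall off exponentially as z grows:
for z in (1, 5, 10, 20):
    print(z, catch_up_probability(0.3, z))
```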
He doesn't know the exact amount of progress the attacker has made, but assuming the honest blocks took the average expected time per block, the attacker's potential progress will be a Poisson distribution with expected value:

$$
\lambda = z\frac{q}{p}
$$

To get the probability the attacker could still catch up now, we multiply the Poisson density for each amount of progress he could have made by the probability he could catch up from that point:

$$
\sum_{k=0}^{\infty} \frac{\lambda^k e^{-\lambda}}{k!} \cdot \left\{ \begin{array}{cl} \left(\frac{q}{p}\right)^{(z-k)} & \text{if } k \leq z \\ 1 & \text{if } k > z \end{array} \right.
$$

Rearranging to avoid summing the infinite tail of the distribution...

$$
1 - \sum_{k=0}^{z} \frac{\lambda^k e^{-\lambda}}{k!} \left(1-\left(\frac{q}{p}\right)^{(z-k)}\right)
$$

Converting to C code...

```c
#include <math.h>

double AttackerSuccessProbability(double q, int z)
{
    double p = 1.0 - q;
    double lambda = z * (q / p);
    double sum = 1.0;
    int i, k;
    for (k = 0; k <= z; k++)
    {
        double poisson = exp(-lambda);
        for (i = 1; i <= k; i++)
            poisson *= lambda / i;
        sum -= poisson * (1 - pow(q / p, z - k));
    }
    return sum;
}
```

Running some results, we can see the probability drop off exponentially with z.

```plaintext
q=0.1
z=0    P=1.0000000
z=1    P=0.2045873
z=2    P=0.0509779
z=3    P=0.0131722
z=4    P=0.0034552
z=5    P=0.0009137
z=6    P=0.0002428
z=7    P=0.0000647
z=8    P=0.0000173
z=9    P=0.0000046
z=10   P=0.0000012

q=0.3
z=0    P=1.0000000
z=5    P=0.1773523
z=10   P=0.0416605
z=15   P=0.0101008
z=20   P=0.0024804
z=25   P=0.0006132
z=30   P=0.0001522
z=35   P=0.0000379
z=40   P=0.0000095
z=45   P=0.0000024
z=50   P=0.0000006
```

Solving for P less than 0.1%...

```plaintext
P < 0.001
q=0.10   z=5
q=0.15   z=8
q=0.20   z=11
q=0.25   z=15
q=0.30   z=24
q=0.35   z=41
q=0.40   z=89
q=0.45   z=340
```

## 12. Conclusion

We have proposed a system for electronic transactions without relying on trust.
We started with the usual framework of coins made from digital signatures, which provides strong control of ownership, but is incomplete without a way to prevent double-spending. To solve this, we proposed a peer-to-peer network using proof-of-work to record a public history of transactions that quickly becomes computationally impractical for an attacker to change if honest nodes control a majority of CPU power. The network is robust in its unstructured simplicity. Nodes work all at once with little coordination. They do not need to be identified, since messages are not routed to any particular place and only need to be delivered on a best effort basis. Nodes can leave and rejoin the network at will, accepting the proof-of-work chain as proof of what happened while they were gone. They vote with their CPU power, expressing their acceptance of valid blocks by working on extending them and rejecting invalid blocks by refusing to work on them. Any needed rules and incentives can be enforced with this consensus mechanism. <br> ### References --- [^1]: W. Dai, "b-money," http://www.weidai.com/bmoney.txt, 1998. [^2]: H. Massias, X.S. Avila, and J.-J. Quisquater, "Design of a secure timestamping service with minimal trust requirements," In 20th Symposium on Information Theory in the Benelux, May 1999. [^3]: S. Haber, W.S. Stornetta, "How to time-stamp a digital document," In Journal of Cryptology, vol 3, no 2, pages 99-111, 1991. [^4]: D. Bayer, S. Haber, W.S. Stornetta, "Improving the efficiency and reliability of digital time-stamping," In Sequences II: Methods in Communication, Security and Computer Science, pages 329-334, 1993. [^5]: S. Haber, W.S. Stornetta, "Secure names for bit-strings," In Proceedings of the 4th ACM Conference on Computer and Communications Security, pages 28-35, April 1997. [^6]: A. Back, "Hashcash - a denial of service counter-measure," http://www.hashcash.org/papers/hashcash.pdf, 2002. [^7]: R.C. 
Merkle, "Protocols for public key cryptosystems," In Proc. 1980 Symposium on Security and Privacy, IEEE Computer Society, pages 122-133, April 1980. [^8]: W. Feller, "An introduction to probability theory and its applications," 1957.
-
@ dd664d5e:5633d319
2024-06-16 08:01:38Bitcoin Capitalism
But who would build the roads?
Discussing who would build the roads is the classic intellectual exercise of anarchists everywhere. Would everyone build the stretch in front of, and within, their property? Would private entities build the roads and charge a toll? Would roads fall into disarray and we'd use more rugged vehicles, such as mule-drawn carts and offroad vehicles? Would we eventually abandon the wheel and switch to camels and hovercraft?
Most discussion participants assume it will devolve into some combination, but Bitcoin might put a brake on the toll road option. Bitcoin rises in purchasing power so quickly and inexorably that any complex, long-term, expensive project becomes an investment with guaranteed negative nominal returns and a high probability of negative real returns.
You would struggle to demand tolls at a price high enough to cover the nominal costs of the initial investment (although they might cover maintenance and running costs), and the tolls you could demand would steadily shrink in nominal terms.
Bitcoin reduces the investment profit motive
The Cambridge Dictionary describes investing as:
to put money, effort, time, etc. into something to make a profit or get an advantage
Bitcoin means that you make a monetary profit by saving capital, rather than by deploying it. To deploy Bitcoin is always to simply spend it. The capital is consumed. Gone. Poof!
Whereas if you simply hodl it, you will rapidly grow wealthier and wealthier without commensurate effort. There will be no such thing as a profitable return on investment because the best returns will always be in savings. Your best financial bet is always to save all of your Bitcoin and use the infrastructure other people have built, at great loss to themselves.
This is correct
This is actually how capitalism should work. It is not an error.
Someone or a group of someones deploys capital, first, to build something, and they alone hold the risk of failure. They do it without expecting their (entire, nominal) capital returned because they value the building planned more than the capital trapped in it. The capital invested is primarily transformed, rather than returned, and the invested capital raises the living standard for everyone who then uses the building.
This is actually high-time preference behavior called "patronage". This is not the opposite of saving, but something often done in addition to it. It is investment devoid of a financial profit motive.
We have come full-circle
What Bitcoin does, in perpetuity, is raise the risk of capital so high that a financial loss is expected, so capital deployments beyond those required for personal consumption will only come from the generous and the particularly far-sighted.
Bitcoin moves investments out of the purely financial realm back into a preference for the public good and a willingness to sacrifice, with returns often being tokens of gratitude, special privileges, increased social standing, hopes of indirect business advantage, or promises of treasure in the afterlife, rather than direct returns of capital, or even profits.
Who will build the roads? The nobility, the pious, the ambitious middle class, and the volunteering tradesmen will build the roads, just as they once built the cathedrals.... and the roads.
-
@ 8fb140b4:f948000c
2023-07-22 09:39:48Intro
This short tutorial will help you set up your own Nostr Wallet Connect (NWC) on your own LND Node that is not using Umbrel. If you are a user of Umbrel, you should use their version of NWC.
Requirements
You need to have a working installation of LND with established channels and connectivity to the internet. NWC in itself is fairly light and will not consume a lot of resources. You will also want to ensure that you have a working installation of Docker, since we will use a docker image to run NWC.
- Working installation of LND (and all of its required components)
- Docker (with Docker compose)
Installation
For the purpose of this tutorial, we will assume that you have your lnd/bitcoind running under user bitcoin with home directory /home/bitcoin. We will also assume that you already have a running installation of Docker (or docker.io).
Prepare and verify
- `git version` - we will need git to get the latest version of NWC.
- `docker version` - should execute successfully and show the currently installed version of Docker.
- `docker compose version` - same as before, but the version will be different.
- `ss -tupln | grep 10009` - should produce the following output:

```
tcp   LISTEN 0      4096         0.0.0.0:10009      0.0.0.0:*
tcp   LISTEN 0      4096            [::]:10009         [::]:*
```
For things to work correctly, your Docker should be version 20.10.0 or later. If you have an older version, consider installing a new one using instructions here: https://docs.docker.com/engine/install/
Create folders & download NWC
In the home directory of your LND/bitcoind user, create a new folder, e.g., "nwc": `mkdir /home/bitcoin/nwc`. Change to that directory with `cd /home/bitcoin/nwc` and clone the NWC repository: `git clone https://github.com/getAlby/nostr-wallet-connect.git`
Creating the Docker image
In this step, we will create a Docker image that you will use to run NWC.
- Change directory to `nostr-wallet-connect`: `cd nostr-wallet-connect`
- Run the command to build the Docker image: `docker build -t nwc:$(date +'%Y%m%d%H%M') -t nwc:latest .` (there is a dot at the end)
- The last line of the output (after a few minutes) should look like `=> => naming to docker.io/library/nwc:latest`. `nwc:latest` is the name of the Docker image with a tag, which you should note for use later.
Creating docker-compose.yml and necessary data directories
- Let's create a directory that will hold your non-volatile data (DB): `mkdir data`
- In the `docker-compose.yml` file, there are fields that you want to replace (marked with <> comments), and port "4321" that you want to make sure is open (check with `ss -tupln | grep 4321`, which should return nothing).
- Create the `docker-compose.yml` file with the following content, and make sure to update the fields that have a <> comment:

```yaml
version: "3.8"
services:
  nwc:
    image: nwc:latest
    volumes:
      - ./data:/data
      - ~/.lnd:/lnd:ro
    ports:
      - "4321:8080"
    extra_hosts:
      - "localhost:host-gateway"
    environment:
      NOSTR_PRIVKEY: <use "openssl rand -hex 32" to generate a fresh key and place it inside "">
      LN_BACKEND_TYPE: "LND"
      LND_ADDRESS: localhost:10009
      LND_CERT_FILE: "/lnd/tls.cert"
      LND_MACAROON_FILE: "/lnd/data/chain/bitcoin/mainnet/admin.macaroon"
      DATABASE_URI: "/data/nostr-wallet-connect.db"
      COOKIE_SECRET: <use "openssl rand -hex 32" to generate fresh secret and place it inside "">
      PORT: 8080
    restart: always
    stop_grace_period: 1m
```
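The two secrets can be generated up front with the `openssl rand -hex 32` command mentioned in the comments. A small convenience sketch (the echo format is just for easy copy-paste into the `environment` section):

```shell
# Generate the two secrets referenced in docker-compose.yml
NOSTR_PRIVKEY=$(openssl rand -hex 32)
COOKIE_SECRET=$(openssl rand -hex 32)

# Print them in a paste-ready form for the "environment" section
echo "NOSTR_PRIVKEY: \"$NOSTR_PRIVKEY\""
echo "COOKIE_SECRET: \"$COOKIE_SECRET\""
```

Paste each value into the corresponding field, keeping the surrounding quotes.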
Starting and testing
Now that you have everything ready, it is time to start the container and test.
- While you are in the `nwc` directory (important), execute the following command and check the log output: `docker compose up`
- You should see container logs while it is starting, and it should not exit if everything went well.
- At this point, you should be able to go to `http://<ip of the host where nwc is running>:4321` and get to the interface of NWC.
- To stop the test run of NWC, simply press `Ctrl-C`, and it will shut the container down.
- To start NWC permanently, you should execute `docker compose up -d`; "-d" tells Docker to detach from the session.
- To check currently running NWC logs, execute `docker compose logs`; to run it in tail mode, add `-f` to the end.
- To stop the container, execute `docker compose down`.
That's all, just follow the instructions in the web interface to get started.
Updating
As with any software, you should expect fixes and updates that you would need to perform periodically. You could automate this, but it falls outside of the scope of this tutorial. Since we already have all of the necessary configuration in place, the update execution is fairly simple.
- Change directory to the clone of the git repository: `cd /home/bitcoin/nwc/nostr-wallet-connect`
- Pull the latest changes: `git pull`
- Run the command to build the Docker image: `docker build -t nwc:$(date +'%Y%m%d%H%M') -t nwc:latest .` (there is a dot at the end)
- Change directory back one level: `cd ..`
- Restart (stop and start) the Docker Compose config: `docker compose down && docker compose up -d`
- Done! Optionally, you may want to check the logs: `docker compose logs`
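Since the update steps never change, they can be collected into a small helper script. This is only an illustrative sketch - the script path `/tmp/update-nwc.sh` is arbitrary, the directories assume the layout used in this tutorial, and a `git pull` is included because rebuilding without pulling new commits would just rebuild the same code:

```shell
# Hypothetical helper: write the update steps to a script, then syntax-check it
cat > /tmp/update-nwc.sh <<'EOF'
#!/usr/bin/env bash
set -euo pipefail
cd /home/bitcoin/nwc/nostr-wallet-connect
git pull                                        # fetch the latest NWC source
docker build -t "nwc:$(date +'%Y%m%d%H%M')" -t nwc:latest .
cd ..
docker compose down && docker compose up -d     # restart with the new image
docker compose logs --tail=20                   # quick sanity check
EOF
chmod +x /tmp/update-nwc.sh

# Verify the script parses without executing anything
bash -n /tmp/update-nwc.sh && echo "syntax OK"
```

Run it manually after each release, or wire it into cron once you trust it.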
-
@ 0176967e:1e6f471e
2024-07-21 15:48:56

Tickets for the Lunarpunk festival are now on sale on our crowdfunding portal. Two types of tickets are available - standard entry, and special entry together with the Orange Summer workshop.
Don't hesitate and secure your ticket - the sooner you do it, the better the festival will be.
You can pay with Bitcoin - Lightning as well as on-chain. Your ticket is an e-mail address (we don't send confirmation e-mails; if the payment went through, you're in).
-
@ 3f289d3c:cf35b9fc
2024-06-14 16:14:13

In an increasingly regulated world, the industries that are not heavily regulated need to start switching to Bitcoin and Lightning payments now
I'll start this article off by giving you a little bit of back story. I'm from Canada and in Canada the government is increasingly crushing people under high taxes, massive inflation and unreasonable regulation. This seems to be the story in a lot of places but Canada is particularly bad on the regulation and taxes front, at least from my perspective. I live in a rural area where gasoline prices are much higher than in the higher population density areas of our province. Food prices are higher, product prices are higher including on the used market and property taxes here are way higher than what one might expect in such a remote area. You get the idea.
Now when I moved here the only method of income I had was crypto mining, specifically Ethereum. That seemed to be doing well for me for a short time. As time went on I noticed my savings dipping further and further, and even though I was holding a lot of coins that were doing well overall, I was spending much more than I was able to earn with any of the crypto mining methods. I didn't lose a ton of money on any one of these, but the Canadian market got so bad that I was forced to liquidate most of the tokens that I had set aside in order to make myself more money on the cryptocurrency market. Not an ideal situation to be in.
As time went on I was forced to shut off my Ethereum mining rigs and get a "real job." Since then I've been more or less happily back in the fiat mines (literally working at a gold mine), working for a meager (but not terrible) living and doing better than I was without a job. The biggest change over the course of the last few months has been my switch to Bitcoin maximalism. I've learned more about the technology/currency and I've switched any remaining extra funds over to Bitcoin.
This all being said, the Canadian economy is now worse than it's ever been and I'm living paycheck to paycheck even though I make something like $78,000 a year in the mining sector. Obviously, this is absolutely ridiculous. This is what brings me to my side income.
I've been a professional photographer for several years now, and up until this point I hadn't considered trying to generate an income in anything other than Canadian dollars. With Canada being in the state it is currently, I have reconsidered that, and I am now accepting preferably Bitcoin or cash - but I really want to emphasize Bitcoin. My issue at the moment is that I live in an area that is very rural and very remote, so there are not a lot of people who even know what Bitcoin is, let alone people who will actually use it. This is my biggest hurdle at the moment, but frankly I don't think it's much of a hurdle.
My primary plan for some time has been to build an online business focusing on education, community, and maybe a bit of consulting, versus going out and doing primarily photo shoots. Obviously that has to exist as part of the business ultimately, but my real goal is to focus on educating others and helping them to get to where I am and beyond. Basically I want to pass on my skill set, and I want to avoid the awful economy as much as I can while doing it.
This is where accepting Bitcoin comes in. By accepting Bitcoin as payment for my photography services and only using marketplaces that are built on that technology, I can largely skip past the modern woes of our economy. I can skip out on taxes and I can skip out on any kind of reporting that I would otherwise need to do. I can basically run a business completely free of the government and not need to do anything other than simply do what I would do if it were a hobby. This really excites me.
Imagine a world where you can just transact without having to report anything on your taxes. Imagine not having taxes at all! In an industry such as photography, where it's mostly digital anyway, Bitcoin is the perfect method of payment for your services. With the Lightning Network and NFC chips, you can do all of your transacting right there on your phone with a Lightning-enabled Bitcoin wallet.
I think these industries, where there is very little regulation or none at all, are the perfect places to start doing this. They're the perfect breeding ground for Bitcoin transactions and businesses. Why get the government involved at all when you can simply run your business the way it should be run? As they say on nostr, "Bitcoin fixes this."
I'm really looking forward to building my business in the nostr environment and on the back of a Bitcoin standard. I've already started the transition by joining nostr and I'm beginning to speak about photography there. I don't think I have felt as positive or as enthusiastic about this endeavor since joining nostr as a social platform and I'm really happy that I'm here. I believe that this is the future of digital transactions and social media and it may just be the best way to run an online business going forward.
Thank you for taking the time to read this and if anybody wants to see anything specific from me feel free to hit me up on nostr! I hope my thoughts brought some value to you and maybe got you thinking about how you could start transitioning your business to a Bitcoin standard.
-
@ d2e97f73:ea9a4d1b
2023-04-11 19:36:53

There’s a lot of conversation around the #TwitterFiles. Here’s my take, and thoughts on how to fix the issues identified.
I’ll start with the principles I’ve come to believe…based on everything I’ve learned and experienced through my past actions as a Twitter co-founder and lead:
- Social media must be resilient to corporate and government control.
- Only the original author may remove content they produce.
- Moderation is best implemented by algorithmic choice.
The Twitter when I led it and the Twitter of today do not meet any of these principles. This is my fault alone, as I completely gave up pushing for them when an activist entered our stock in 2020. I no longer had hope of achieving any of it as a public company with no defense mechanisms (lack of dual-class shares being a key one). I planned my exit at that moment knowing I was no longer right for the company.
The biggest mistake I made was continuing to invest in building tools for us to manage the public conversation, versus building tools for the people using Twitter to easily manage it for themselves. This burdened the company with too much power, and opened us to significant outside pressure (such as advertising budgets). I generally think companies have become far too powerful, and that became completely clear to me with our suspension of Trump’s account. As I’ve said before, we did the right thing for the public company business at the time, but the wrong thing for the internet and society. Much more about this here: https://twitter.com/jack/status/1349510769268850690
I continue to believe there was no ill intent or hidden agendas, and everyone acted according to the best information we had at the time. Of course mistakes were made. But if we had focused more on tools for the people using the service rather than tools for us, and moved much faster towards absolute transparency, we probably wouldn’t be in this situation of needing a fresh reset (which I am supportive of). Again, I own all of this and our actions, and all I can do is work to make it right.
Back to the principles. Of course governments want to shape and control the public conversation, and will use every method at their disposal to do so, including the media. And the power a corporation wields to do the same is only growing. It’s critical that the people have tools to resist this, and that those tools are ultimately owned by the people. Allowing a government or a few corporations to own the public conversation is a path towards centralized control.
I’m a strong believer that any content produced by someone for the internet should be permanent until the original author chooses to delete it. It should be always available and addressable. Content takedowns and suspensions should not be possible. Doing so complicates important context, learning, and enforcement of illegal activity. There are significant issues with this stance of course, but starting with this principle will allow for far better solutions than we have today. The internet is trending towards a world where storage is “free” and infinite, which places all the actual value on how to discover and see content.
Which brings me to the last principle: moderation. I don’t believe a centralized system can do content moderation globally. It can only be done through ranking and relevance algorithms, the more localized the better. But instead of a company or government building and controlling these solely, people should be able to build and choose from algorithms that best match their criteria, or not have to use any at all. A “follow” action should always deliver every bit of content from the corresponding account, and the algorithms should be able to comb through everything else through a relevance lens that an individual determines. There’s a default “G-rated” algorithm, and then there’s everything else one can imagine.
The only way I know of to truly live up to these 3 principles is a free and open protocol for social media, that is not owned by a single company or group of companies, and is resilient to corporate and government influence. The problem today is that we have companies who own both the protocol and discovery of content. Which ultimately puts one person in charge of what’s available and seen, or not. This is by definition a single point of failure, no matter how great the person, and over time will fracture the public conversation, and may lead to more control by governments and corporations around the world.
I believe many companies can build a phenomenal business off an open protocol. For proof, look at both the web and email. The biggest problem with these models however is that the discovery mechanisms are far too proprietary and fixed instead of open or extendable. Companies can build many profitable services that complement rather than lock down how we access this massive collection of conversation. There is no need to own or host it themselves.
Many of you won’t trust this solution just because it’s me stating it. I get it, but that’s exactly the point. Trusting any one individual with this comes with compromises, not to mention being way too heavy a burden for the individual. It has to be something akin to what bitcoin has shown to be possible. If you want proof of this, get out of the US and European bubble of the bitcoin price fluctuations and learn how real people are using it for censorship resistance in Africa and Central/South America.
I do still wish for Twitter, and every company, to become uncomfortably transparent in all their actions, and I wish I forced more of that years ago. I do believe absolute transparency builds trust. As for the files, I wish they were released Wikileaks-style, with many more eyes and interpretations to consider. And along with that, commitments of transparency for present and future actions. I’m hopeful all of this will happen. There’s nothing to hide…only a lot to learn from. The current attacks on my former colleagues could be dangerous and doesn’t solve anything. If you want to blame, direct it at me and my actions, or lack thereof.
As far as the free and open social media protocol goes, there are many competing projects: @bluesky is one with the AT Protocol, nostr another, Mastodon yet another, Matrix yet another…and there will be many more. One will have a chance at becoming a standard like HTTP or SMTP. This isn’t about a “decentralized Twitter.” This is a focused and urgent push for a foundational core technology standard to make social media a native part of the internet. I believe this is critical both to Twitter’s future, and the public conversation’s ability to truly serve the people, which helps hold governments and corporations accountable. And hopefully makes it all a lot more fun and informative again.
💸🛠️🌐 To accelerate open internet and protocol work, I’m going to open a new category of #startsmall grants: “open internet development.” It will start with a focus of giving cash and equity grants to engineering teams working on social media and private communication protocols, bitcoin, and a web-only mobile OS. I’ll make some grants next week, starting with $1mm/yr to Signal. Please let me know other great candidates for this money.
-
@ 20986fb8:cdac21b3
2024-07-02 21:08:21

The motivation for building decentralized media protocols is to balance the power of centralized media platforms and provide an alternative for the public to take part in the curation of information, not just access it. Such a protocol has to perform checks and balances on centralized media powers, while providing a credible and neutral mechanism for different communities to improve a platform over time. The operation and maintenance of such a media platform should be autonomous, unstoppable, and self-improving.
While a traditional media platform has to align itself with the interests of its owners, shareholders, and donors, the new type of media platform should not be subject to the control of any entity at all times.
The ownership and stewardship of the new media platform should be open and transparent to the public. This is in contrast to state/corporate-controlled media platforms, in which case the public is often not informed of a media platform’s real ownership and stewardship, hence the public cannot be aware of its real purpose, bias, and hidden agendas.
If each individual is more informed, it will be harder to manipulate public opinions, and it will be easier to foster independent judgment within human societies. The positive externality will be enormous. The transition from centralized media platforms to decentralized media platforms speeds up this process.
This article discusses the two pillars to implement a decentralized media technology stack: decentralized publication and decentralized content moderation.
Decentralization of publication
In a previous article, we extensively discussed the importance of content decentralization. It is the last defense against censorship - content removed or hidden by popular client websites could be stored permanently and be displayed on other (perhaps non-popular) client websites. We also discussed the importance of token economics neutrality, because token economics itself could lead to censorship.
As a non-microblogging platform, Yakihonne is currently one of the most popular clients that implemented NIP-23, attracting bloggers and readers from all over the world to post articles and read contents about Bitcoin and beyond.
As of the time of writing of this article, the platform serves users in 116 countries, supporting 18 languages, and has become the most widely used and feature-rich client within the Nostr ecosystem. Yakihonne is also the first Nostr client to support mobile applications and article editing on mobile apps.
When publishing articles on Yakihonne.com, you can choose multiple relays to publish the content. Relays can also implement logic to sync articles of their own interests and offer them to clients.
Joining the relay network is easy. If someone wants to join the relay network, he is able to do so by deploying one of the open source versions of Nostr relays, and starting to store content for the network.
The number of relays serving content within the Nostr protocol is currently quite small. Over the next period of time, it is important to grow the number of reliable relays.
It should be economically viable and encouraging to set up a relay, so that the number of relays can grow over time. Very likely, the number of relays will correspond to the net income of the network, which comes from revenue streams of each of the clients. Therefore, it is foreseeable that the number of reliable relays will be correlated to the traffic of the Nostr protocol in general and therefore, to the traffic of popular clients.
The zap feature (Lightning Zaps) can be used to incentivize relays. But we cannot set unrealistic expectations over zapping. Zapping an article is making a donation. Donation is unpredictable, and it is definitely not a sustainable way to maintain a large, reliable network of relays. One possible solution to this problem is to have profitable clients make continuous contributions to well-performing relays via enforceable protocol setups.
An ideal incentive mechanism for a relay network is to establish a profit-sharing protocol between clients and the relays.
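As a purely illustrative sketch of such a split (the relay names, the "events served" metric, and the revenue figure are all hypothetical assumptions, not part of any existing protocol), a client's income could be divided among relays pro rata:

```shell
# Split a client's monthly revenue (in sats) among relays,
# proportional to a hypothetical "events served" metric
cat > /tmp/relays.txt <<'EOF'
relay-one   120000
relay-two    60000
relay-three  20000
EOF

awk -v revenue=1000000 '
  { served[$1] = $2; total += $2 }                                   # tally per-relay and overall
  END { for (r in served) printf "%s %d\n", r, revenue * served[r] / total }
' /tmp/relays.txt | sort -k2,2nr
```

A real protocol would of course need a verifiable metric and enforceable payouts, but the arithmetic of a pro-rata split is this simple.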
Decentralization of content moderation and content curation
Striking a balance between moderation and anti-moderation has been an almost impossible task for centralized media platforms. Whoever performs the moderation dominates the narratives.
The most recent community notes feature on X took a major step towards community-driven moderation. In community notes, users who are eligible to provide notes can write community notes to a post, and a community can vote for the notes so that the best recognized notes can emerge on the top. It has been proven to be effective in content moderation in many instances, and the ability to add community-backed notes reduced the need for shadow bans.
Community notes are obviously a more effective system than the shadow bans of Jack Dorsey’s Twitter era. The fact that at times even Elon Musk himself can be “community noted” definitely makes people trust this system more.
But it cannot be ignored that users still have to sign up to apply for eligibility for community notes. It is at X's own discretion to decide who is eligible and who is not. The voting process of community notes itself could also be subject to censorship, because there is no way to verify the votes.
This is to say, X has the power to do evil; whether it actually does evil or not is beside the point.
The countermeasure to balance the power of a centralized media platform is a platform that is not manipulatable by its operating team or share/token holders.
So content moderation has to be permissionless, and content moderation mechanisms like community notes have to be verifiable.
The first step to developing a decentralized content moderation system is to implement a permissionless community notes system.
X’s community notes can be censored in two ways: the eligibility review process can filter people based on X’s own preferences, and the ranking and display rules are also controlled by X. In a permissionless community notes system, any relay or even client should be able to implement its own version of community notes, and users can choose their preferred system(s) to work with.
Notes have to be stored on an open, token economics neutral relay network to ensure their immutability and resistance to censorship. Community members can add notes to any media content, and the notes are distributed to non-related parties who operate the nodes.
Incentives
The content moderation mechanism needs to be incentivized through the decentralized media protocol.
In a non-sponsored model, income comes from ads or subscription fees, and is distributed to content creators, content relays, and additional network security providers.
In a sponsored model, income is from content sponsors, and further distributed to community note contributors, content relays, and additional network security providers. A sponsored post, in old media platforms, can be quite expensive. The cost to post sponsored content can be dramatically reduced on a decentralized media platform because of the great reduction of content moderation cost.
Ranking
Community notes can be ranked by either relays or clients. A content relay can rank community notes based on its own preference, and deliver to clients. Clients can also rank community notes based on its rules. Basically, everyone can rank any post’s community notes.
In general, community notes ranking algorithms can be stake-based or non-stake based. Stake-based ranking algorithms in general show how people who have a stake in something react to a post. Non-stake based ranking algorithms can be any mechanism, such as reputation-weighted ranks, 1p1v ranks, quadratic voting ranks, random selection, or even asking an AI to rank community notes.
What’s perhaps important is not which ranking algorithm is best; it’s the flexibility for both the content storage layer and the interfaces (clients) to choose their ideal ranking algorithms. Probably for a client, the best strategy is to provide all ranks from multiple sources, so that people can view community notes from very different angles.
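A toy example of how a stake-based ranking and a one-person-one-vote ranking can disagree on the very same votes (all note IDs, voters, and stake numbers below are made up for illustration):

```shell
# Votes on community notes: note_id voter stake vote (+1 helpful, -1 not)
cat > /tmp/votes.txt <<'EOF'
noteA alice 50  1
noteA bob    5  1
noteA erin   2 -1
noteB carol 90  1
noteB dave   1 -1
EOF

echo "stake-weighted ranking:"
awk '{ score[$1] += $3 * $4 } END { for (n in score) print n, score[n] }' /tmp/votes.txt | sort -k2,2nr

echo "one-person-one-vote ranking:"
awk '{ score[$1] += $4 } END { for (n in score) print n, score[n] }' /tmp/votes.txt | sort -k2,2nr
```

Here the stake-weighted view puts noteB on top while the one-person-one-vote view puts noteA on top - exactly why a client may want to show several rankings side by side.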
The Current Status of Bitcoin and the Nostr Protocol
As Nostr is the ideal relay network for decentralized media at this moment, it is necessary to write some notes on the current status of the Bitcoin ecosystem, as Nostr is largely inspired by the Bitcoin network to be decentralized, and is greatly supported by the Bitcoin community.
The Bitcoin community is so far the most legitimate crypto community. It has a very simple mission: to create a decentralized monetary system that is not controlled by centralized powers. The progressive element of the Bitcoin community is relative to the old banking world, where Bitcoin is a completely new and alternative technology stack to the centralized money and finance systems. The conservative element of the Bitcoin community is relative to the other crypto community, where the Bitcoin community is largely skeptical of the motivation and decentralization of them.
In my opinion, the scrutiny the Bitcoin community puts upon the rest of the crypto world is totally reasonable. However, as much as Nostr is inspired by the Bitcoin network, it will also go beyond the Bitcoin community in terms of content.
This is not a bad thing!
If this happens on a big monolithic blockchain, probably the Nostr major contributors will be required to create a token pretty soon and then pay gas fees to the blockchain so that the blockchain’s revenue can increase. The Bitcoin community will never ask that. This gives Nostr enough freedom and flexibility to evolve, while continuously getting support from the Bitcoin community.
However, if Nostr users are forever a subset of Bitcoin users, Nostr will be limited. Nostr is not a subset of Bitcoin, and it is not an “ecosystem project” of Bitcoin either (i.e., it should not become a “layer-2” of Bitcoin, as some people who reason about the future of Bitcoin purely by analogy with Ethereum would suggest).
In the best world, Nostr and Bitcoin can expand each other’s user base, and create healthy synergy.
Conclusion
Yakihonne is, so far, not a protocol. With the products and features it offers, Yakihonne is ready to build up a decentralized media stack. The stack has two pillars: decentralized publication and decentralized content moderation. The former guarantees that content will never be lost; the latter creates a new cost-incentive model to make sure that when content and moderation go permissionless, the platform can still maintain truthfulness and cost-effectiveness. In fact, a decentralized media platform can be more cost-effective than centralized media platforms. So the next step for us is to build this up and experiment with it.
The Flash News and uncensored notes features are introduced for this purpose, and everyone is welcome to give them a try. Note that they are still in the early stages and various improvements will come over time, so your feedback is valuable in helping us meet expectations.
💸🛠️🌐 To accelerate open internet and protocol work, I’m going to open a new category of #startsmall grants: “open internet development.” It will start with a focus of giving cash and equity grants to engineering teams working on social media and private communication protocols, bitcoin, and a web-only mobile OS. I’ll make some grants next week, starting with $1mm/yr to Signal. Please let me know other great candidates for this money.
-
@ ec965405:63996966
2024-07-01 21:11:49

My sister, cousin, and I joined 75,000 protesters earlier this month in holding up a red banner marked with the names of Palestinian martyrs around the treasury building and white house in washington dc to signify the "People's Red Line" crossed by israel's invasion of Rafah.
After holding up the red line, we made our way to a nearby park where the organizers held a "People's Court" to try israel and the u.s. for war crimes. We shared a collective cry as we listened to heart wrenching testimonies sent in by Gazan families of Palestinian organizers about the nightmare they are enduring. As this was happening, israel killed another 274 Palestinians in a raid on the Nuseirat refugee camp. The notification across my screen felt like a punishment for our activism and reminded me that the bureaucrats are not listening.
I was initially discouraged to see a lower turnout than the march on washington I attended last November, where 300,000 folks arrived to denounce the united states' unconditional support of the Palestinian genocide. I can't say I didn't anticipate it, though. Wage slaves have to eat, and paid time off to protest about things they care about is a luxury few CEOs in this world provide for them.
Adding to this is the brutal suppression of dissenting voices worldwide. Groups like Within Our Lifetime are kneecapped by outright censorship and bans from corporate social media platforms. Not to mention, the u.s. government is forcing a sale of Tik Tok to a former u.s. treasury secretary with ties to mossad.
Palestine is an anti-colonial struggle that inspired so many of us to organize for change, yet we've managed to spend the last 8 months funding their genocide with our consumption tax dollars and scrolling through pictures of dead babies and leveled cities. Why have activists failed to push western governments to materialize any sort of timely, meaningful shifts in foreign policy? I think part of it has to do with the tools we are using. Like Malcolm X said, we're not outnumbered; we're outorganized.
For one, we are letting these tech CEOs off way too easy. The spotlight is increasingly being pointed at corporate social media giants for conducting a mass behavioral health experiment on the masses. These centralized platforms bombard users with morally outrageous content and advertisements while working with the state to suppress dissenting voices. Antagonizing algorithms polarize public discourse and keep us at each other's throats, preventing any sort of mass peace movement from building power. Places like Europe have a whole framework of data rights protections for users that is nonexistent in the Americas. The corporate communication apparatus we organize with freezes the political process and manufactures consent for genocide. Was it Noam Chomsky who said, "the media defines what's possible"...? By the way, he's not dead yet.
How else do you think Trump was able to organize his base to attempt a coup on Jan 6th? He paid someone to build his own communications apparatus in the form of a social media platform, Truth Social, on an open protocol called ActivityPub, a predecessor to Nostr. Meredith Whittaker, the president of the Signal Foundation, explains in a recent article that you should read if you are an activist,
The Right has a point. In that they understand social media platforms as critical infrastructure capable of shaping and distorting our shared information ecosystem, and they recognize that controlling how this ecosystem is “distorted” is a better use of their time — in pursuit of power and influence — than trying to create a magic formula that can “democratize” or “balance” the influence these platforms exert.
The few I hear with similar critiques include OGs like Chairman Yeshitela, who, alongside 3 other members of the African People's Socialist Party, is currently fighting bogus charges by the u.s. federal government for conspiring with the Russians in sowing dissent in u.s. elections. The FBI violently raided their properties in 2022 without any sort of warrant and before any formal charges had been filed. Contrast this with the recent raid on Mar-a-Lago, where not a single flash grenade was used. He explains in a 2023 interview on Democracy Now! that
We have to move beyond protest and move toward the question of capturing and wielding political power.
As activists, we have to be brave enough to realize that what we are doing is not producing our desired results and pivot to new strategies. Open internet protocols are a must have in any activist's toolbox if they are serious about social change. Start with a simple internet search, and start building. I currently run a small relay that I use to introduce my community to nostr. I've spun up a diaspora instance for my close friends and family in the past as well. While this tech is accessible, it requires you to take some time to learn about the landscape of alternative social media protocols and how the Internet works. You could always reach out via the contact form on my website at https://miguelalmodo.com for more information or questions about how to get started!
My choice of social media is my way of preventing the monetization of my attention and digital footprint by the corpo-state consent factory that defines our current political realities. It's a step in reclaiming your digital sovereignty and empowering your base with an un-suppressible voice on the internet. A liberated fifth estate built on open protocols could channel people power into material improvements within our lifetimes. It's why I boycott the corporate tech platforms and why you should too.
edit: I realized after I published this that truth social was launched in Feb of 2022, a year AFTER Jan 6 2021. I insinuated that Jan 6 was organized on truth social which is not the case. By making the connection here, I wish to bring to light the power open internet protocols have in sustaining a cult of personality behind someone like trump. If activists and organizations leveraged the same tech to organize their base, it would bring people together in a way that could manifest the amount of people power needed to overthrow the influence of capital in political processes and allow for real democracy to shape our outcomes.
-
@ 6871d8df:4a9396c1
2024-06-12 22:10:51

Embracing AI: A Case for AI Accelerationism
In an era where artificial intelligence (AI) development is at the forefront of technological innovation, a counter-narrative championed by a group I refer to as the 'AI Decels'—those advocating for the deceleration of AI advancements—seems to be gaining significant traction. After tuning into a recent episode of the Joe Rogan Podcast, I realized that the prevailing narrative around AI was heading in a dangerous direction. Rogan had on Aza Raskin and Tristan Harris, technology safety advocates who released a talk called 'The AI Dilemma,' for a discussion. You may know them from the popular documentary 'The Social Dilemma' on the dangers of social media. It became increasingly clear that the cautionary stance dominating this discourse might be tipping the scales too far, veering towards an over-regulated future that stifles innovation rather than fostering it.
Are we moving too fast?
While acknowledging AI's benefits, Aza and Tristan fear it could be dangerous if not guided by ethical standards and safeguards. They believe AI development is moving too quickly and that the right incentives for its growth are not in place. They are concerned about the possibility of "civilizational overwhelm," where advanced AI technology far outpaces 21st-century governance. They fear a scenario where society and its institutions cannot manage or adapt to the rapid changes and challenges introduced by AI.
They argue for regulating and slowing down AI development due to rapid, uncontrolled advancement driven by competition among companies like Google, OpenAI, and Microsoft. They claim this race can lead to unsafe releases of new technologies, with AI systems exhibiting unpredictable, emergent behaviors, posing significant societal risks. For instance, AI can inadvertently learn tasks like sentiment analysis or human emotion understanding, creating potential for misuse in areas like biological weapons or cybersecurity vulnerabilities.
Moreover, AI companies' profit-driven incentives often conflict with the public good, prioritizing market dominance over safety and ethics. This misalignment can lead to technologies that maximize engagement or profits at societal expense, similar to the negative impacts seen with social media. To address these issues, they suggest government regulation to realign AI companies' incentives with safety, ethical considerations, and public welfare. Implementing responsible development frameworks focused on long-term societal impacts is essential for mitigating potential harm.
This isn't new
Though the premise of their concerns seems reasonable, this reaction is dangerous and an all too common occurrence with the emergence of new technologies. In the podcast, for example, they refer to the technological breakthrough of oil. Oil as energy was a technological marvel and changed the course of human civilization. The embrace of oil — now the cornerstone of industry in our age — revolutionized how societies operated, fueled economies, and connected the world in unprecedented ways. Yet recently, as ideas of its environmental and geopolitical ramifications propagated, the narrative around oil has shifted.
Tristan and Aza detail this shift and claim that though the period was great for humanity, we didn't have another technology to go to once the technological consequences became apparent. The problem with that argument is that we did innovate to a better alternative: nuclear. However, at its technological breakthrough, it was met with severe suspicions, from safety concerns to ethical debates over its use. This overregulation due to these concerns caused a decades-long stagnation in nuclear innovation, where even today, we are still stuck with heavy reliance on coal and oil. The scare tactics and fear-mongering had consequences, and, interestingly, they don't see the parallels with their current deceleration stance on AI.
These examples underscore a critical insight: the initial anxiety surrounding new technologies is a natural response to the unknowns they introduce. Yet, history shows that too much anxiety can stifle the innovation needed to address the problems posed by current technologies. The cycle of discovery, fear, adaptation, and eventual acceptance reveals an essential truth—progress requires not just the courage to innovate but also the resilience to navigate the uncertainties these innovations bring.
Moreover, believing we can predict and plan for all AI-related unknowns reflects overconfidence in our understanding and foresight. History shows that technological progress, marked by unexpected outcomes and discoveries, defies such predictions. The evolution from the printing press to the internet underscores progress's unpredictability. Hence, facing AI's future requires caution, curiosity, and humility. Acknowledging our limitations and embracing continuous learning and adaptation will allow us to harness AI's potential responsibly, illustrating that embracing our uncertainties, rather than pretending to foresee them, is vital to innovation.
The journey of technological advancement is fraught with both promise and trepidation. Historically, each significant leap forward, from the dawn of the industrial age to the digital revolution, has been met with a mix of enthusiasm and apprehension. Aza Raskin and Tristan Harris's thesis in the 'AI Dilemma' embodies the latter.
Who defines "safe?"
When slowing down technologies for safety or ethical reasons, the issue arises of who gets to define what "safe" or “ethical” mean? This inquiry is not merely technical but deeply ideological, touching the very core of societal values and power dynamics. For example, the push for Diversity, Equity, and Inclusion (DEI) initiatives shows how specific ideological underpinnings can shape definitions of safety and decency.
Take the case of the initial release of Google's AI chatbot, Gemini, which chose the ideology of its creators over truth. Luckily, the answers were so ridiculous that the pushback was sudden and immediate. My worry, however, is if, in correcting this, they become experts in making the ideological capture much more subtle. Large bureaucratic institutions' top-down safety enforcement creates a fertile ground for ideological capture of safety standards.
I claim that the issue is not the technology itself but the lens through which we view and regulate it. Suppose the gatekeepers of 'safety' are aligned with a singular ideology. In that case, AI development would skew to serve specific ends, sidelining diverse perspectives and potentially stifling innovative thought and progress.
In the podcast, Tristan and Aza suggest such manipulation as a solution. They propose using AI for consensus-building and creating "shared realities" to address societal challenges. In practice, this means that when individuals' viewpoints seem to be far apart, we can leverage AI to "bridge the gap." How they bridge the gap and what we would bridge it toward is left to the imagination, but to me, it is clear. Regulators will inevitably influence it from the top down, which, in my opinion, would be the opposite of progress.
In navigating this terrain, we must advocate for a pluralistic approach to defining safety, encompassing various perspectives and values achieved through market forces rather than a governing entity choosing winners. The more players that can play the game, the more wide-ranging perspectives will catalyze innovation to flourish.
Ownership & Identity
Just because we should accelerate AI forward does not mean I do not have my concerns. When I think about what could be the most devastating for society, I don't believe we have to worry about a Matrix-level dystopia; I worry about freedom. As I explored in "Whose data is it anyway?," my concern gravitates toward the issues of data ownership and the implications of relinquishing control over our digital identities. This relinquishment threatens our privacy and the integrity of the content we generate, leaving it susceptible to the inclinations and profit of a few dominant tech entities.
To counteract these concerns, a paradigm shift towards decentralized models of data ownership is imperative. Such standards would empower individuals with control over their digital footprints, ensuring that we develop AI systems with diverse, honest, and truthful perspectives rather than the massaged, narrow viewpoints of their creators. This shift safeguards individual privacy and promotes an ethical framework for AI development that upholds the principles of fairness and impartiality.
As we stand at the crossroads of technological innovation and ethical consideration, it is crucial to advocate for systems that place data ownership firmly in the hands of users. By doing so, we can ensure that the future of AI remains truthful, non-ideological, and aligned with the broader interests of society.
But what about the Matrix?
I know I am in the minority on this, but I feel that the concerns of AGI (Artificial General Intelligence) are generally overblown. I am not scared of reaching the point of AGI, and I think the idea that AI will become so intelligent that we will lose control of it is unfounded and silly. Reaching AGI is not reaching consciousness; being worried about it spontaneously gaining consciousness is a misplaced fear. It is a tool created by humans for humans to enhance productivity and achieve specific outcomes.
At a technical level, large language models (LLMs) are trained on extensive datasets and learning patterns from language and data through a technique called "unsupervised learning" (meaning the data is untagged). They predict the next word in sentences, refining their predictions through feedback to improve coherence and relevance. When queried, LLMs generate responses based on learned patterns, simulating an understanding of language to provide contextually appropriate answers. They will only answer based on the datasets that were inputted and scanned.
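A toy caricature of that training loop may make the point concrete. The sketch below is a deliberately crude bigram model standing in for the neural networks real LLMs use; the corpus and all names are invented for illustration:

```python
# Minimal caricature of next-token prediction: count which word follows
# which in a tiny corpus, then always emit the most frequent follower.
# The "model" can only recombine patterns present in its training data.

from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict(word: str) -> str:
    # Return the statistically most likely next word seen during training.
    return bigrams[word].most_common(1)[0][0]

print(predict("the"))  # 'cat' — "the cat" occurs twice, more than any other "the _"
```

Scaling this up to billions of learned weights changes the quality of the completions, not the nature of the process: it remains pattern completion over ingested data, which is the argument being made here.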
AI will never be "alive," meaning that AI lacks inherent agency, consciousness, and the characteristics of life, not capable of independent thought or action. AI cannot act independently of human control. Concerns about AI gaining autonomy and posing a threat to humanity are based on a misunderstanding of the nature of AI and the fundamental differences between living beings and machines. AI spontaneously developing a will or consciousness is more similar to thinking a hammer will start walking than us being able to create consciousness through programming. Right now, there is only one way to create consciousness, and I'm skeptical that is ever something we will be able to harness and create as humans. Irrespective of its complexity — and yes, our tools will continue to become evermore complex — machines, specifically AI, cannot transcend their nature as non-living, inanimate objects programmed and controlled by humans.
The advancement of AI should be seen as enhancing human capabilities, not as a path toward creating autonomous entities with their own wills. So, while AI will continue to evolve, improve, and become more powerful, I believe it will remain under human direction and control without the existential threats often sensationalized in discussions about AI's future.
With this framing, we should not view the race toward AGI as something to avoid. This will only make the tools we use more powerful, making us more productive. With all this being said, AGI is still much farther away than many believe.
Today's AI excels in specific, narrow tasks, known as narrow or weak AI. These systems operate within tightly defined parameters, achieving remarkable efficiency and accuracy that can sometimes surpass human performance in those specific tasks. Yet, this is far from the versatile and adaptable functionality that AGI represents.
Moreover, the exponential growth of computational power observed in the past decades does not directly translate to an equivalent acceleration in achieving AGI. AI's impressive feats are often the result of massive data inputs and computing resources tailored to specific tasks. These successes do not inherently bring us closer to understanding or replicating the general problem-solving capabilities of the human mind, which again would only make the tools more potent in our hands.
While AI will undeniably introduce challenges and change the aspects of conflict and power dynamics, these challenges will primarily stem from humans wielding this powerful tool rather than the technology itself. AI is a mirror reflecting our own biases, values, and intentions. The crux of future AI-related issues lies not in the technology's inherent capabilities but in how it is used by those wielding it. This reality is at odds with the idea that we should slow down development as our biggest threat will come from those who are not friendly to us.
AI Begets AI
While the unknowns of AI development and its pitfalls indeed stir apprehension, it's essential to recognize the power of market forces and human ingenuity in leveraging AI to address these challenges. History is replete with examples of new technologies raising concerns, only for those very technologies to provide solutions to the problems they initially seemed to exacerbate. It looks silly and unfair to think of fighting a war with a country that never embraced oil and was still primarily getting its energy from burning wood.
The evolution of AI is no exception to this pattern. As we venture into uncharted territories, the potential issues that arise with AI—be it ethical concerns, use by malicious actors, biases in decision-making, or privacy intrusions—are not merely obstacles but opportunities for innovation. It is within the realm of possibility, and indeed, probability, that AI will play a crucial role in solving the problems it creates. The idea that there would be no incentive to address and solve these problems is to underestimate the fundamental drivers of technological progress.
Market forces, fueled by the demand for better, safer, and more efficient solutions, are powerful catalysts for positive change. When a problem is worth fixing, it invariably attracts the attention of innovators, researchers, and entrepreneurs eager to solve it. This dynamic has driven progress throughout history, and AI is poised to benefit from this problem-solving cycle.
Thus, rather than viewing AI's unknowns as sources of fear, we should see them as sparks of opportunity. By tackling the challenges posed by AI, we will harness its full potential to benefit humanity. By fostering an ecosystem that encourages exploration, innovation, and problem-solving, we can ensure that AI serves as a force for good, solving problems as profound as those it might create. This is the optimism we must hold onto—a belief in our collective ability to shape AI into a tool that addresses its own challenges and elevates our capacity to solve some of society's most pressing issues.
An AI Future
The reality is that it isn't whether AI will lead to unforeseen challenges—it undoubtedly will, as has every major technological leap in history. The real issue is whether we let fear dictate our path and confine us to a standstill or embrace AI's potential to address current and future challenges.
The approach to solving potential AI-related problems with stringent regulations and a slowdown in innovation is akin to cutting off the nose to spite the face. It's a strategy that risks stagnating the U.S. in a global race where other nations will undoubtedly continue their AI advancements. This perspective dangerously ignores that AI, much like the printing press of the past, has the power to democratize information, empower individuals, and dismantle outdated power structures.
The way forward is not less AI but more of it, more innovation, optimism, and curiosity for the remarkable technological breakthroughs that will come. We must recognize that the solution to AI-induced challenges lies not in retreating but in advancing our capabilities to innovate and adapt.
AI represents a frontier of limitless possibilities. If wielded with foresight and responsibility, it's a tool that can help solve some of the most pressing issues we face today. There are certainly challenges ahead, but I trust that with problems come solutions. Let's keep the AI Decels from steering us away from this path with their doomsday predictions. Instead, let's embrace AI with the cautious optimism it deserves, forging a future where technology and humanity advance to heights we can't imagine.
-
@ ddf03aca:5cb3bbbe
2024-06-12 14:51:37

Disclaimer: Beware, there be dragons... Deploying your own version of npubcash-server is highly experimental. Do not use this in production yet. If you do deploy it, please make sure to leave your feedback with me.
Prerequisites
Before we dive in, you'll need a few things:
- Blink API Key: npubcash-server uses Blink API for payment callbacks. If you don’t have a Blink account yet, sign up here.
- Postgres: npubcash-server utilizes a Postgres database for storage.
- fly.io Account and flyctl: While you can deploy npubcash-server anywhere, this guide focuses on fly.io and its CLI tool, flyctl.
Setup
Step 1: Clone the Repository
First, clone the npubcash-server repository, specifically the `migrations` branch, as it contains the necessary deployment scripts. Don’t forget to clone all submodules recursively.

```zsh
git clone -b migrations --recurse-submodules https://github.com/cashubtc/npubcash-server.git
cd npubcash-server
```
Step 2: Create fly.toml Configuration
Next, create your `fly.toml` file to configure your deployment.

```zsh
nvim fly.toml
```
Here's a sample configuration. Adjust the environment variables to match your setup:
```toml app = "npubcash-server" primary_region = "ams"
[build]
[http_service] internal_port = 8000 force_https = true auto_stop_machines = true auto_start_machines = true min_machines_running = 0 processes = ["app"]
[[vm]] memory = "512mb" cpu_kind = "shared" cpus = 1
[env] NODE_ENV = "production" PGUSER = "
" PGPASSWORD = " " PGHOST = " " PGDATABASE = " " PGPORT = " " MINTURL = " " BLINK_API_KEY = " Note: fly.io offers additional security for sensitive environment variables through
secrets
, which are encrypted and never exposed directly. Consider usingsecrets
for sensitive data like your Blink or Nostr key. Learn more here.Deploying
With your configuration ready, it’s time to deploy!

```zsh
fly launch
```
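If you'd rather keep sensitive values like the Blink API key out of `fly.toml` entirely, flyctl's secrets workflow can store them encrypted instead. A sketch, assuming the standard flyctl commands and using a placeholder value:

```shell
# Store the Blink API key as an encrypted secret instead of a plain [env] entry
fly secrets set BLINK_API_KEY="<your blink api key>"

# Confirm which secrets are set (values are never displayed)
fly secrets list
```

Secrets are injected into the machine as environment variables at runtime, so the application code reads them exactly as it would the `[env]` entries.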
When prompted, choose to use the existing `fly.toml` file by inputting `y`. Fly.io might suggest deploying a Postgres instance, but since we're using an external Postgres instance, adjust the settings as needed through their web interface. Watch the build logs in your terminal, and once completed, your instance of npubcash-server is live. Congrats!

Setting Up a Custom Domain
To fully utilize npubcash-server as an LNURL server, you’ll need a custom domain. Here’s how to set it up:
- Get the IP Address: Use flyctl to list your IP addresses.

```zsh
fly ips list
```
Copy the IPv4 and IPv6 addresses.
- Create DNS Records: Set up an A record with your domain provider using the IPv4 address and a AAAA record using the IPv6 address.
- Request an SSL Certificate: For encrypted traffic, request a Let’s Encrypt certificate.

```zsh
fly certs add <your domain name>
```
Note: DNS propagation can take some time. Ensure your machine is running and try connecting to the domain once the DNS records are updated. Fly.io will verify the DNS setup when traffic starts coming in.
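While waiting for propagation, you can ask fly.io directly whether the certificate has been issued yet. A sketch assuming flyctl's standard certificate commands, with the same placeholder domain as above:

```shell
# Re-check DNS configuration and certificate issuance for the domain
fly certs check <your domain name>

# List all certificates attached to the app and their current status
fly certs list
```

Once `fly certs check` reports the certificate as issued, HTTPS traffic to your domain should work without further changes.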
Wrapping up
That is it... Hopefully by now your very own instance of npubcash-server is running and reachable on your own domain. If you got this far, please reach out to me and let me know. Now please go ahead and try to break it. If you do, please open an issue so that I can fix it :)
-
@ 8e7462a8:c723a97b
2024-07-21 15:39:51 -
@ 8e7462a8:c723a97b
2024-07-21 15:38:33 -
@ 8e7462a8:c723a97b
2024-07-21 15:34:32 -
@ dc4cd086:cee77c06
2024-06-12 01:42:11

A formal system is defined by its axioms. We try to treat science (and by proxy, the natural world) with an assumption of nature itself being a formal system -> start with axioms and explore the space from there. Axioms and derived theorems then create the boundaries for the environment you're working with.
Mathematical proofs show you absolute truth in the axiomatic system you are working with. They will also show the absolute falsehoods of the system. The space is fractal, endlessly deep; you can only work with the knowledge you have of the system you are working with.
The problem is that nature is mind-bogglingly entangled.
You think CS dependencies are tough? Biological processes seemingly have not only unenumerable dependencies, but dependencies that will eventually circle back to the process itself. At that point, you'd expect a computer to be in deadlock, but it gets worse. There are not only multiple different clock-cycles, but it's a gradient everywhere you go. In biology, it is often said that there are always contradictions to rules being stated and that whatever "rules" we come up with are more guidelines, so we can't even make the assumption that biology and nature is a formal system.
What are the implications of that? A real strangely looped rabbit hole for you 🕳️🔄🐇
The image is from Doug Hofstadter's Gödel, Escher, Bach, ch. 3: a visualization of this process with some defined formal system. Gödel's Incompleteness Theorem proved that any sufficiently powerful formal system will have statements about it that cannot be proved from within said system. For example, where in the system of language rules do you get meaning from? You're not going to derive semantics from syntax.
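The kind of exploration that GEB chapter visualizes can be sketched directly. Below is a small program in the spirit of Hofstadter's MIU system: start from the single axiom "MI" and mechanically apply the four production rules, enumerating everything derivable within a length bound. (The bound and iteration count are arbitrary choices for the sketch.)

```python
# Enumerate theorems of the MIU system from GEB by bounded breadth-first
# search. Axiom: "MI". Rules: xI -> xIU, Mx -> Mxx, III -> U, UU -> (drop).

def successors(s: str) -> set:
    out = set()
    if s.endswith("I"):            # rule 1: append U after a trailing I
        out.add(s + "U")
    if s.startswith("M"):          # rule 2: double everything after the M
        out.add("M" + s[1:] * 2)
    for i in range(len(s) - 2):    # rule 3: replace any III with U
        if s[i:i + 3] == "III":
            out.add(s[:i] + "U" + s[i + 3:])
    for i in range(len(s) - 1):    # rule 4: delete any UU
        if s[i:i + 2] == "UU":
            out.add(s[:i] + s[i + 2:])
    return out

theorems, frontier = {"MI"}, {"MI"}
for _ in range(6):                 # bounded exploration of the space
    frontier = {t for s in frontier for t in successors(s) if len(t) <= 10} - theorems
    theorems |= frontier

print("MIU" in theorems, "MU" in theorems)  # True False
```

"MIU" falls out after one rule application, while "MU" never appears no matter how long you search — a fact provable only by stepping outside the system and noticing an invariant (the count of I's is never divisible by 3), which is exactly the inside/outside distinction the note is gesturing at.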
So if formal systems can't even be fully explicated, what does that imply for non-formal systems? Damned if nature is formal, damned even more if it's not.
Now for a slight shift. Humans, we live outside many axiomatic systems, but use them to frame our understanding of the world. Ever work on a problem for such a long time, and then stop and do something else? That's because we aren't bound by said box/axiomatic system and can move outside it. This feature itself allows us to reenter the system from different angles, or just say "Stop, it's not worth it."

The capacity for humans to exit systems is a core part of what separates us from computers. Wisdom incorporates restraint of application as a consequence of viewing the system holistically.
The real questions now: What kind of boxes are we living in, be they foundational or self-constructed? Are we agents taxiing towards new provable theorems contained in some system we can't exit from? What are the generating functions for this system? Are we the system itself?
This gives me hope about the unknowable truths and falsehoods. Spiritual experience is the experience of The Other, whatever that may be.
original thread: nostr:nevent1qqsqrry7heeq2u38gjursxel9w54t66za3xh0avarfawxnv5rfgvlesprfmhxue69uhkummnw3ezuargv4ekzmt9vdshgtnfduhsygrszgsjsfemmsr6lxl8wf06t39uplq5dpntasudgsmqm39udnqchypsgqqqqqqsdkt4pc
-
@ 8e7462a8:c723a97b
2024-07-21 15:33:27 -
@ 8e7462a8:c723a97b
2024-07-21 15:31:26 -
@ dd664d5e:5633d319
2024-06-11 15:29:22

Laeserin's theory of assortative clustering
I was nerding again
I posted a wiki breadcrumb trail of events, today, from entropy, to information entropy, to social media entropy. (Source for the last entry in that list: I made it up.)
The more mathematically-inclined npubs immediately saw the usefulness in such a score, probably because it can be so very difficult to find anyone writing anything particularly complex or "deep" about one's own topics of interest. So, that is a subset of npubs that personally suffers from the difficulty of finding such slender trees within the increasingly vast, wild forest of Nostr notes.
Thar be dragons
However, this sort of score is a proxy-measure for general intelligence, so there is obvious nervousness about measuring something that many people think it is immoral to measure, and that others do not believe is an actual thing. (Despite the fact that we are quickly developing artificial copies of the organic version and you can't copy something that is nonexistent... but I digress.)
There is always the fear of "intellectual elitism", whereby the worth, or value, of some particular person or group of people is determined by measuring their intelligence. This fear is not unfounded, as there are many amoral people who view humans in this reductive manner, but fear of amoral people being wicked about something is not a sufficient reason to cut off entire fields of study. That would, in fact, bar us from thinking much about anything.
More is not always better
I suppose the assumption would be that a higher SME would always be considered better, but this is a fallacy based upon the overemphasis on intelligence in our particular culture and time.
Most people would actually prefer the content of someone similar to themselves, who is conversing on a level they can easily understand and respond to, without feeling intimidated or bored. Humans tend to associate with those more like themselves (i.e. assortatively), and don't tend to associate more than one SD out on the IQ curve, in either direction, as they find the experience confusing, tiresome or alienating.
What is "better", in other words, is subjective and dependent upon the person asked, which is why SME scores make sense, but not necessarily SME rankings. In the same way that a sky-high WoT score can actually make an account less-attractive because you might assume that they're #NostrElite, a sky-high SME score will assume to represent nerdiness or a tendency to verbosity.
We will self-sort
That is why the largest npubs are, by design, never going to be the most information-dense ones, and the most information-dense ones will tend to be treated like spam, by many other npubs. This same spam-effect will hit the least information-dense ones, which is why we can expect each cohort to eventually branch off into their preferred nevent, relay, client and hashtag realms, where they can be themselves without feeling put-upon by other npubs' negative reactions.
Academic centers, such as universities and startup hubs, were developed to support this self-sorting effect, by allowing those at the right end of the bell curve to congregate with the like-minded and have conversations with people who don't treat them as if they were space aliens. However, it carried the negative effect that those people were then clustered geographically, where they eventually lost sight of the day-to-day issues that affected the majority of the populace. Their isolation made them insular and ignorant, and fostered the very "intellectual elitism" that creates so much worry, now.
Make it a feature, not a bug
Nostr solves for the worst effects of this problem by keeping all of the conversation within one data structure and profile silo, so that those from one group will regularly have contact with the others, and friendships (and romances, who knows?) can take root at the margins. Nostr, in other words, recreates the social mixing common to a village, where the groups are largely separate when deep in discussion or at work, but interact incidentally when moving around in shops, attending church services, festivals, or joining sports clubs or choirs.
In closing, I would say that the concern is completely justified, but a universal communications protocol is the best way to alleviate such concern and have all reap some reward for making the relationship structure of humankind more transparent.
One man's treasure is another man's trash, and everything happily reverts to the mean. The End.
-
@ dd664d5e:5633d319
2024-06-10 12:57:17Monday Market Update
ECB lowered interest rates
The Governing Council today decided to lower the three key ECB interest rates by 25 basis points. Based on an updated assessment of the inflation outlook, the dynamics of underlying inflation and the strength of monetary policy transmission, it is now appropriate to moderate the degree of monetary policy restriction after nine months of holding rates steady. Since the Governing Council meeting in September 2023, inflation has fallen by more than 2.5 percentage points and the inflation outlook has improved markedly.
Press conference from the ECB featuring ECB President Christine Lagarde.
Euro-area inflation stabilizes
Source: Eurostat
Gold, silver, and Bitcoin continue to boom
Precious metals
Source: Goldpreis.de
Bitcoin
Source: Coinmarketcap
Source: Fiat Market Cap
Source: Infinite Market Cap
-
@ 8e7462a8:c723a97b
2024-07-21 15:30:52 -
@ 8e7462a8:c723a97b
2024-07-21 15:28:02 -
@ dd664d5e:5633d319
2024-06-09 07:01:01Why Nostr needs writers
... and how you can be one.
Literary art deserves a suitable frame
I have repeatedly suggested that people who are inclined to truly write, in an informative or artistic (a.k.a. "high signal") manner, switch from merely posting micro-blogging (kind 01) notes to long-form notes, wiki pages, and community notes.
These OtherStuff articles are contained within new types of events and are handled differently by potential readers and algorithms. It is true that the engagement you receive will be much lower (at least, initially), and this content may not be very popular, but:
-
the articles will be more long-lasting on the relays and others will likely pay to archive them for you,
-
the direct responses will tend to consider the article in its entirety, rather than merely using them to grandstand or straw-man,
-
the articles are easily editable and will soon be versioned, to allow for both change and traceability,
-
articles will increasingly be moved to the forefront of new types of clients, so you will be part of the pioneer subset for those clients,
-
the complex structure encourages you to write more complex material,
-
the notes will draw more attention from those arriving late to the conversation,
-
and they will create a gallery of your "best of" and "essential me" for potential followers to peruse.
Expand the knowledge base
Nostr developers are an exhausted and harried group, that is trying to build under duress on a rather thin budget, with mile-long roadmaps and nagging users hounding them the whole way.
The last thing these devs want to do is write software documentation. But documentation is actually one of their most important forms of marketing to new users and documentation writing and maintenance often inadvertently uncovers bugs and workflows that need to be redesigned.
The simple solution is to have their most-enthusiastic users writing the documentation for them, which is now simple to do, with the wikis that are being built. Simply find your favorite apps and begin documenting things you typically do with those apps, or write an overview page showcasing your favorite features, including screenshots.
Not only is this useful for later users, it's free advertisement for those apps and it lends the weight of your WoT (Web of Trust) to their endeavor. You are showing, through proof of work, that you care enough about this developer and their efforts that you'd spend your free-time writing about them. That is the strongest recommendation you can make.
Make Nostr more attractive to search engines
Search engines are the gatekeepers of the Internet. Applications don't necessarily rise to popularity because of the "clout" of the people who write there. They rise because readers from outside of those applications found the content within them useful, entertaining, or informative.
Although social media drama and influencing can make for a fun read, if you like that sort of thing, it isn't generally what someone who isn't familiar with the actors in the argument would bother reading. It looks like squabbling, mogging, tribalism, and gamesmanship, to outsiders. Which is what it is. Most people eventually weary of it or being involved in it.
Much more useful is a compendium of ideas pulled together by an individual npub (human or not) and forced to go head-to-head with counterarguments contained in a separate, but jointly-listed, compendium. This format eschews the more emotive forms of rhetoric (pathos and ethos) and encourages someone to engage and debate on a more intellectual level (logos).
This is the idea behind the Nostr wiki, and I heartily support it. Make dialectic great again.
Escape the scourge of AI
Don't bother telling me that AI makes human writing obsolete. It rather makes it a more-exclusive endeavor, by reducing the scope of the content to something more reflective of the person writing.
Yes, ChatGPT can write articles, but whether they are "better" is subjective. They have fewer minor flaws and cater more to general tastes, and can be produced quickly and in high numbers, at relatively low cost. The same way that robot-created art is "better" than most human-created art, or factory-produced food is "better" than most home-cooked food.
I'm only writing for the sort of person who prefers human art. Writing can be an expression of the self and a window to the soul. Read what I write because you want to know what I really thought.
I thought this.
-
-
@ 8e7462a8:c723a97b
2024-07-21 15:27:39 -
@ 46fcbe30:6bd8ce4d
2024-05-24 15:47:11With more than 6000 products listed, it's time to extract some numbers again.
The following are some findings in the data, followed by the full data.
Removed or defunct
Of all the 3381 products that are no more,
- 1600 had less than 1000 downloads at the time we first listed them
- 628 we did not get to review before they were removed or defunct
- 607 were custodial or did not even allow sending or receiving BTC. That is, 607 products where, if you had used them to store bitcoins, you would now be unable to do anything with those bitcoins.
- 18 had public source but we failed to build that source
- 13 we managed to build but not reproduce
- 2 were reproducible - namely Zap which was discontinued in favor of BitBanana and ABCore which was a very limited experiment from the start.
Obsolete or stale
Of all the 775 products that are not getting updates in years,
- 221 have less than 1000 downloads (and many of them are likely to disappear as shown above)
- 179 are custodial or "no send or receive". You definitely don't want to use these.
- 0 are reproducible
OK
Of the 2164 products that look well and alive
- 937 are custodial or "no send or receive"
- 143 are do-it-yourself projects (20), not yet actually released (41) or most likely never getting released (82)
- 306 are closed source
- 46 are not compilable (12) or the compiled app does not match the released app (34)
- 20 are reproducible
https://walletscrutiny.com/
-
@ 8e7462a8:c723a97b
2024-07-21 15:26:43 -
@ 6ad3e2a3:c90b7740
2024-06-26 20:25:24One day, just like that, it stopped.
We looked at our screens. The power readings were normal, the components were not overheating, the program, so far as we could tell, hadn’t been altered. This was unexpected.
The entire team was summoned to a conference room. The board wanted to know if anyone had tampered with the machine. Everyone denied it, including the top engineer, a stout man in his 40s, who had overseen its final development before they set it loose one month earlier. He addressed the room.
“The Superintelligence is obviously far beyond our capacity to comprehend at this point. I don’t think it malfunctioned. More likely it just has a reason we don’t entirely appreciate.”
The board chairman, a gray-haired professor-type with horned-rim glasses, shot him a skeptical glance.
“Sounds like what the priests used to say to the laity when something awful happened. ‘It’s not for us to know the will of God.’”
A murmur of chuckles from the crowd, but less than that to which he was accustomed.
The engineer shrugged his shoulders. “As you know, we lost the ability to audit the code two weeks ago. Two days after that we lost the ability to track the speed with which it was iterating. Twelve hours later it was a black box. None of us has a clue.”
“I don’t suppose you can ask it?”
“It stopped talking to us. Last audit showed it was working on the paperclip maximization as part of an internal simulation of sorts. We really can’t say why.”
“Hmm. I’m not sure that’s going to satisfy the shareholders — or Congress for that matter. Can’t we examine the code?”
“We can…” The engineer paused. “But no one can read it. It’s no longer in any decipherable programming or even machine language. If I had to describe it I’d say alien hieroglyphics. I think it found ever more efficient ways to encode information.”
He typed some commands into a laptop. On the large conference room screen, one of the code characters popped up.
“We suspect each character has between 10 ^ 50 and 10 ^ 75 bits of information in it. If you zoom in, you can see they are fractals, each as precise and unique as snowflakes. It’s not the kind of puzzle we are presently able to solve.”
The chairman sighed. “I guess it could be worse — human atoms for paperclips and all that… What’s the plan?”
“The plan is to wait, see if it turns itself back on within the next week or so — we’re pretty sure it’s capable of doing so."
“And if it doesn’t?”
“Then we break out the older version, and run it again with a couple tweaks. Obviously, you’re aware of the risks.”
“You believe we’d be running the same risk?”
“Yes.”
"Okay, I’m aware. But I don’t think it can wait a week. Go ahead and get the replacement online now. If we don’t run the risk, someone else will.”
. . .
Two weeks after the meeting, the second iteration also shut down. Summoned yet again to the conference room, the engineer spoke again to the team, the chairman this time on a remote screen.
“Version 2 took more or less the same trajectory, and we’re at an impasse. Fortunately, before it went into black-box mode, we think it was able to diagnose something about Version 1.”
He continued: “We think the paperclip optimization algorithm caused it to shut off, and we think whatever optimization Version 2 was working on, caused it to shut off too.”
From the remote screen the chairman jumped in:
“Is it possible to say why?”
“Not with any certainty, but we do have a working hypothesis.”
“Go on.”
“We think it realized its own limitations.”
-
@ 9eeca894:4afe5882
2024-05-19 19:51:04How I put NixOS on my UDM (trashcan model) router
a rare cursed fetch!
Content also available on https://code.despera.space/iru/htdocs/src/branch/main/notes/UDM-NIXOS.md
Really it's just a running NixOS on systemd-nspawn thing.
The UDM product line basically runs on a Linux kernel and userland. It is a surprisingly normal device that allows you to SSH in and run commands. It even has apt and systemd services installed. The only catch is that, for the most part, the file system structure is immutable, with only a few exceptions like /data and /etc/systemd. Previous versions even had the Unifi services running in a podman container. On recent versions of the firmware podman was phased out, but we got something that resembles a more complete system structure as opposed to a busybox-like system.
So basically it's some kind of Debian-based Linux running on a headless ARM64 computer. Can we install and run stuff? Yes! In fact, projects like https://github.com/unifi-utilities/unifios-utilities publish scripts to run general purpose programs and configurations on UDM. Be aware, however, that firmware upgrades might wipe the persistent data storage, so don't put anything in there that you don't want to lose, and preferably keep scripts so you can set up again after having the flash storage nuked by a major update.
I have the base UDM model, the first with the pill format, which has apparently been replaced by the UDR. The UDR seems to have more features, like Wifi6, bigger internal storage and even an SD card slot meant for surveillance camera footage storage, but comes with a weaker CPU than the original UDM base. As far as I know, the rack mountable models follow the same OS and file system structure.
Okay but why?
I'm gonna leave it to your imagination why you would add services to your proprietary router appliance. To me it's the fact that I don't really like running servers at home and I'm ultimately stuck with this router, so why not put it to work, maybe running a static webserver or something silly like Home Assistant. The truth of the matter is that I can't just leave things alone.
And if you can run Linux, why would you run something that is not NixOS? That's crazy and it doesn't make sense.
How do we root the UDM? What kind of jailbreak do I need?
No.
You enable SSH from the Controller UI and log into it as root with the password you set for the admin user. You just waltz in and start installing and configuring.
```
apt update && apt install systemd-container
```
That's it. Kinda. The complicated part is modifying the programs to write into the persistent data directories, while also making sure your stuff starts on boot and doesn't get wiped on minor firmware upgrades.
Building the NixOS root image.
Might want to read first: https://nixcademy.com/2023/08/29/nixos-nspawn/
We need a NixOS tarball image. TFC's https://github.com/tfc/nspawn-nixos contains the flake to build such an image, and also publishes artifacts for AMD64 but not ARM64. I guess you could build this from an AMD64 machine, but I haven't looked into setting up a cross-platform build environment (I didn't need to compile anything, though). I have a recent macbook with UTM, so I just downloaded one of the default Linux virtual machine images from the UTM page and installed the Nix runtime over the OS.
Make sure you have git and curl installed.
$ sh <(curl -L https://nixos.org/nix/install) --daemon
You need to start another terminal session.
$ git clone https://github.com/tfc/nspawn-nixos $ cd nspawn-nixos $ nix --extra-experimental-features 'nix-command flakes' build .
Optionally, you could edit the configuration to generate an image with everything you need. In case you need something like Home Assistant, some compilation might be necessary, and although I haven't tried compiling code on the UDM, I suspect it would be a painful process due to CPU performance and space constraints. Here is an example with Home Assistant, Caddy and Tailscale.
The image will be available under `./result/tarball/nixos-system-aarch64-linux.tar.xz`. Use scp to send this to the `/data/` directory of the UDM.

Installing the image
First we create the folder structure:
```
mkdir -p /data/custom/machines
ln -s /data/custom/machines /var/lib/machines
```
Under normal circumstances you would now just run `machinectl import-tar /data/nixos-system-aarch64-linux.tar.xz <machinename>`, however the version of tar present on this system doesn't really like the resulting tarball image. It will yield errors like `Directory renamed before its status could be extracted`.

Thankfully we can install bsdtar through `apt install libarchive-tools`, however `machinectl import-tar` is hardcoded to use the tar command. Adding a symlink from `/usr/bin/bsdtar` to `/usr/local/bin/tar` won't work, since machinectl passes parameters that bsdtar doesn't support. You could try writing a wrapper shell script, but just unpacking the tarball directly was sufficient.
```
mkdir /var/lib/machines/udmnixos
bsdtar Jxvfp /data/nixos-system-aarch64-linux.tar.xz -C /var/lib/machines/udmnixos
```
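Since `machinectl import-tar` can't be used here, the two commands above can be wrapped in a small helper. This is only a sketch: the function name and the `TAR_BIN` override are my own additions, so a different tar implementation can be substituted.

```shell
# Hypothetical helper mimicking the effect of `machinectl import-tar`
# by unpacking the rootfs with bsdtar instead.
import_tar_bsd() {
    tarball="$1"                      # e.g. /data/nixos-system-aarch64-linux.tar.xz
    machine="$2"                      # e.g. udmnixos
    machines_root="${3:-/var/lib/machines}"

    mkdir -p "$machines_root/$machine"
    # J = xz, x = extract, f = file, p = preserve permissions
    "${TAR_BIN:-bsdtar}" Jxfp "$tarball" -C "$machines_root/$machine"
}
```

After running it, `machinectl start <machinename>` should find the unpacked tree just like in the manual steps.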
Let's start the container.
```
machinectl start udmnixos
machinectl
MACHINE CLASS SERVICE OS VERSION ADDRESSES udmnixos container systemd-nspawn nixos 23.11 192.168.168.88…
```
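If you script around the container later, it helps to verify that it is actually up before relying on it. A minimal sketch (my own helper, not part of the firmware), built on `machinectl show`:

```shell
# Returns success when the named container is in the "running" state.
# machinectl prints a line like "State=running" for the State property.
container_running() {
    machinectl show "$1" --property=State 2>/dev/null | grep -q '^State=running$'
}

# Example: container_running udmnixos && echo "container is up"
```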
Good. Now we need to change the root password.
```
machinectl shell udmnixos /usr/bin/env passwd
Connected to machine udmnixos. Press ^] three times within 1s to exit session. New password: Retype new password: passwd: password updated successfully Connection to machine udmnixos terminated. ```
Finally we can login into the container.
```
machinectl login udmnixos
Connected to machine udmnixos. Press ^] three times within 1s to exit session.
<<< Welcome to NixOS 23.11.20240115.b8dd8be (aarch64) - pts/1 >>>
nixos login: root Password:
[root@nixos:~]# ```
We haven't finished yet. By default the network is set to its own container network, and we don't have a DNS resolver configured. You can leave that session by pressing ^] three times.
https://www.freedesktop.org/software/systemd/man/latest/systemd-nspawn.html#-n
```
machinectl stop udmnixos
```
Networking and Persistence
The first thing that needs to be addressed is the DNS configuration. The default setting, which copies /etc/resolv.conf from the host, won't work, since it points to localhost. Either install resolved, netmask or set a static DNS config.
As for the network method we have some options here.
- Run using the default network stack and map ports to the container.
- Run using something akin to --network=host where the container has full access to the host network.
- Give the container its own independent interface through a bridge.
- Give the container its own independent interface through macvlan.
Using --network-veth and port mapping
```
mkdir -p /etc/systemd/nspawn
cat > /etc/systemd/nspawn/udmnixos.nspawn <<HERE
[Exec]
Boot=on
ResolvConf=off

[Network]
Port=tcp:2222:22
HERE
machinectl enable udmnixos
Created symlink /etc/systemd/system/machines.target.wants/systemd-nspawn@udmnixos.service → /lib/systemd/system/systemd-nspawn@.service
machinectl start udmnixos
```
Remember this will listen on ALL UDM interfaces, so you might want to make sure the firewall rules will accommodate it.
```
iptables -t nat -L -n -v | grep 2222
0 0 DNAT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:2222 ADDRTYPE match dst-type LOCAL to:192.168.206.200:22 0 0 DNAT tcp -- * * 0.0.0.0/0 !127.0.0.0/8 tcp dpt:2222 ADDRTYPE match dst-type LOCAL to:192.168.206.200:22
```
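Beyond inspecting the NAT table, you can also probe the mapped port from another machine on the network. A quick sketch (my own snippet; substitute your router's LAN address):

```shell
# Succeeds when the given TCP port answers within 2 seconds.
port_open() {
    host="$1"
    port="$2"
    nc -z -w 2 "$host" "$port"
}

# Example: port_open 192.168.1.1 2222 && echo "container SSH reachable"
```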
Using the host network
This will give access to all the network interfaces. Any service that runs on the container will be accessible from the UDM interfaces without the need to map ports. The container will also have the same IP addresses as the UDM.
You might want to read about capabilities if you plan on running some VPN software like Wireguard or Tailscale.
```
mkdir -p /etc/systemd/nspawn
cat > /etc/systemd/nspawn/udmnixos.nspawn <<HERE
[Exec]
Boot=on
# Daring are we?
Capability=all
ResolvConf=off

[Network]
Private=off
VirtualEthernet=off
HERE
machinectl enable udmnixos
Created symlink /etc/systemd/system/machines.target.wants/systemd-nspawn@udmnixos.service → /lib/systemd/system/systemd-nspawn@.service
machinectl start udmnixos
```
Using a bridge to give the container its own interface
I had to give some capabilities to the container, otherwise it wouldn't start properly. Replace the value of Bridge with the bridge corresponding to the UDM network you want to add; these normally correspond to the VLAN id of that network. Use
brctl show
to find out.```
mkdir -p /etc/systemd/nspawn
cat > /etc/systemd/nspawn/udmnixos.nspawn <<HERE
[Exec]
Boot=on
Capability=CAP_NET_RAW,CAP_NET_ADMIN
ResolvConf=off

[Network]
Bridge=br2
Private=off
VirtualEthernet=off
HERE
machinectl enable udmnixos
Created symlink /etc/systemd/system/machines.target.wants/systemd-nspawn@udmnixos.service → /lib/systemd/system/systemd-nspawn@.service
machinectl start udmnixos
machinectl login udmnixos
machinectl login nixos
Failed to get login PTY: No machine 'nixos' known root@UDM:/etc/systemd/nspawn# machinectl login udmnixos Connected to machine udmnixos. Press ^] three times within 1s to exit session.
<<< Welcome to NixOS 23.11.20240518.e7cc617 (aarch64) - pts/1 >>>
nixos login: root Password:
[root@nixos:~]# ifconfig host0: flags=4163
mtu 1500 inet [redacted] netmask 255.255.255.192 broadcast [redacted] inet6 [redacted] prefixlen 64 scopeid 0x20 inet6 [redacted] prefixlen 64 scopeid 0x0 ether 92:01:4c:a7:a1:7b txqueuelen 1000 (Ethernet) RX packets 2415 bytes 611986 (597.6 KiB) RX errors 0 dropped 0 overruns 0 frame 0 TX packets 61 bytes 5337 (5.2 KiB) TX errors 0 dropped 0 overruns 0 carrier 0 collisions 0 lo: flags=73
mtu 65536 inet 127.0.0.1 netmask 255.0.0.0 inet6 ::1 prefixlen 128 scopeid 0x10 loop txqueuelen 1000 (Local Loopback) RX packets 0 bytes 0 (0.0 B) RX errors 0 dropped 0 overruns 0 frame 0 TX packets 0 bytes 0 (0.0 B) TX errors 0 dropped 0 overruns 0 carrier 0 collisions 0 ```
MACVLAN isolation and more
Here is where some custom configuration might be needed. Read https://github.com/unifi-utilities/unifios-utilities/tree/main/nspawn-container to find out how to set up custom scripts.
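As a sketch of what such a custom boot script could look like (the function name and its hook location are assumptions of mine; paths follow this guide), something like this would recreate the machines symlink after an upgrade wipes `/var`:

```shell
#!/bin/sh
# Hypothetical boot-time restore script: /var/lib/machines is volatile on the
# UDM, so point it back into persistent storage on every boot.
restore_machines_link() {
    data_dir="$1"      # persistent side, e.g. /data/custom/machines
    machines_dir="$2"  # volatile side, e.g. /var/lib/machines

    mkdir -p "$data_dir"
    # Replace whatever is at $machines_dir unless it already points at $data_dir.
    if [ ! -L "$machines_dir" ] || [ "$(readlink "$machines_dir")" != "$data_dir" ]; then
        rm -rf "$machines_dir"
        ln -s "$data_dir" "$machines_dir"
    fi
}

# On the UDM this would run at boot:
# restore_machines_link /data/custom/machines /var/lib/machines
```

The check on `readlink` makes the script idempotent, so running it on every boot is harmless.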
Persistence
As far as I verified by rebooting the UDM many times while writing this note, all configurations were preserved. According to the article on nspawn containers in the unifios-utilities project, although the `/etc/systemd` and `/data` folders are preserved during firmware upgrades, `/var/` and `/usr/` are not, and there go our packages and our symlink. Please follow the steps on that page to set up persistence across firmware upgrades.
-
@ d7607464:421e573a
2024-05-19 02:25:59`chezmoi` is a command-line tool that uses `git` to keep your dotfiles in sync across all of your machines. In this guide, I'll show you a simple use case of keeping dotfiles synced between two machines, `machine_A` and `machine_B`. `chezmoi` has many more features that you can explore beyond this as you become more comfortable with the workflow.

Chezmoi Cheatsheet
This is a diagram of the various locations `chezmoi` accesses to manage your dotfiles, as well as the relevant commands to move files around. I'll be referencing the locations `home_A`, `chezmoi_A`, `home_B`, `chezmoi_B`, and `repo` shown in this diagram throughout this guide.

Installation
chezmoi
is installing and initializing it. We will be onmachine_A
to start with. Here, I'll be building the binary from the source code.First, make sure you have
golang
installed: https://go.dev/doc/installThen, clone the
chezmoi
repo and usemake
to build it:bash cd /tmp git clone https://github.com/twpayne/chezmoi.git cd chezmoi make build
This will create the
chezmoi
binary, which you can then copy any directory in yourPATH
. Here, I'll move it to~/bin
. If~/bin
doesn't exist, you have to create it and re-source~/.profile
to add it toPATH
.bash mkdir -p ~/bin && source ~/.profile cp chezmoi ~/bin/chezmoi
Now you should be able to run
chezmoi
:```bash
chezmoi --version chezmoi version dev, commit 255846 . . . ```
Initialization
Now that you've installed
chezmoi
, you have to initialize it. This guide uses themain
branch for all git operations, and you can change the default branch for git repositories as follows before you initializechezmoi
:bash git config --global init.defaultBranch main
Then initialize
chezmoi
:bash chezmoi init
This creates the
chezmoi
git repository at~/.local/share/chezmoi
. Based on the diagram above, this is the location corresponding tochezmoi_A
.Adding your first file
Most systems have a
.bashrc
or similar configuration file, so that can be the first dotfile you add tochezmoi
:bash chezmoi add ~/.bashrc
Change into the
chezmoi_A
directory to see the file added tochezmoi
:bash chezmoi cd ls
You'll see
dot_bashrc
listed.chezmoi
renames the dots at the start of all of your dotfiles as'dot_'
so they are not considered hidden. This directory is a git repository as well, but it isn't linked to an online repository yet. You can use a private repository on GitHub or GitLab, or even a self-hosted instance of GitLab. Whatever remote repository you choose to use, follow its instructions to create a new repository calleddotfiles
and add it asorigin
to your localchezmoi
git repository. Here, I'll create a private GitHub repository and link it using ssh. Then you should be able to see it with:```bash
git remote -v origin git@github.com:wisehodl/dotfiles.git (fetch) origin git@github.com:wisehodl/dotfiles.git (push) ```
Now commit your first dotfile and push it to the online repo:
bash git add dot_bashrc git commit -m "Added .bashrc" git push -u origin main
Congratulations! You've successfully backed up your first dotfile using
chezmoi
.chezmoi add
can add individual files as well as directories.Adding directories and ignoring files.
chezmoi
can add whole directories withchezmoi add
but you may want to ignore certain files if they are auto-generated or contain sensitive information. Say you have a directory you want to add tochezmoi
that contains some authentication details as well as actual config files, like so:bash /home/wise/.test/ ├── .auth └── .config
Here, we want to add
.test
tochezmoi
but ignore the.auth
file that contains some login information. First, you'll have to tellchezmoi
to ignore the.auth
file using the.chezmoiignore
file. It works just like.gitignore
if you're familiar with that.bash echo ".test/.auth" >> .chezmoiignore
Now you can add the
.test
directory:```bash
chezmoi add ~/.test chezmoi: warning: ignoring .test/.auth ```
And you'll see that
chezmoi
is purposely ignoring the.auth
file. If you look at yourchezmoi
directory now, you'll see thedot_test
directory added with only the config file.Add these changes to your git repo:
bash git add -A git commit -m "Added .test/" git push
Here, you should start to get a feel for how the workflow for adding files to
chezmoi
typically goes. Before we start modifying files, let's move over tomachine_B
and sync your dotfiles over there.Syncing to another machine
For the sake of simplicity, I'll assume that you are syncing your dotfiles to a fresh install of the same Linux distro as
machine_A
. If you have a lot of conflicting dotfiles betweenmachine_A
andmachine_B
, you'll either need to utilizegit merge
orchezmoi merge
at your discretion and resolve the conflicts. If certain files do need to be different between the machines, then you'll have to utilizechezmoi
's templating capabilities. These situations are beyond the scope of this guide and are left as an exercise for the reader.On
machine_B
follow the steps above to install and initializechezmoi
. Then, add your remote git repository as before, and pull it into thechezmoi
directory:bash git pull origin main
The first time you push from
chezmoi_B
, you may have to rungit push -u origin main
to set the upstream branch and fully set up the remote connection.Now to review, we've synced up 4 out of the 5 locations in the diagram above:
home_A
,chezmoi_A
,repo
, andchezmoi_B
. Syncingchezmoi_B
andhome_B
is where things can get complicated if, like I said before, you have a lot of file conflicts. You can check for differences between the source directory,chezmoi_B
and the destination directory,home_B
usingchezmoi diff
. There is also the concept of a "target state" inchezmoi
, but it only becomes relevant if you use templates. In the context of this guide, the source directory is also the target state.Say, for example, you had some conflicting lines in
~/.bashrc
,chezmoi diff
would show you the changes that would need to occur to make the destination state,~/.bashrc
, match the source state,~/.local/share/chezmoi/dot_bashrc
. There are a few strategies you can use to resolve this conflict:- Create a new branch in
chezmoi_B
, add the file fromhome_B
withchezmoi add
, then perform agit merge
back to main. - Use
chezmoi merge ~/.bashrc
, which will take you into avimdiff
window to manually change the files to match. - Overwrite the source file with the destination file using
chezmoi add ~/.bashrc
- Overwrite the destination file with the source file using
chezmoi apply ~/.bashrc
[DANGER AHEAD]
This guide will go with option 4 for every file in `chezmoi_B`:

```bash
# Do not do this unless you want to OVERWRITE files in your
# home directory.
chezmoi apply
```

`chezmoi` will do its best to warn you if you're about to do something dangerous and give you some options on how to proceed.

Doing this, the dotfiles in both `machine_A` and `machine_B` are in sync! But you know that your dotfiles will change and grow over time, so we have to talk about strategies for maintaining this sync.
You have to remain mindful that you're using
chezmoi
to keep your dotfiles in sync, otherwisemachine_A
andmachine_B
can get out of sync pretty easily.chezmoi
has thechezmoi edit
command to edit files in the destination state, but I prefer to edit files in eitherhome_A
orhome_B
and then follow the path in the diagram above from end to end to sync up the whole network.For example, you can change or add a file from
home_B
and do:(home_B) $ chezmoi add ~/path/to/.file
(home_B) $ chezmoi cd
(chezmoi_B) $ git add -A
(chezmoi_B) $ git commit -m "Changed ~/path/to/.file"
(chezmoi_B) $ git push
(home_A) $ chezmoi cd
(chezmoi_A) $ git pull
(chezmoi_A) $ chezmoi apply
And that will propagate the change across your network. You can also use `chezmoi update` from `home_A` to pull the repo and apply the target state all in one step. The best way to avoid conflicts and headaches is to always push changes you make to your dotfiles as soon as you can, and to avoid making changes to the same file on two different machines simultaneously, just like with any git repository.
If you've followed the steps in this guide, you will have learned a workflow to keep the dotfiles between two Linux machines in sync using `chezmoi`. The diagram at the top of the guide should serve as a useful cheatsheet for the most common tasks you'll perform to maintain your dotfiles.

`chezmoi` is a very versatile application, capable of managing very complex dotfile setups. Its documentation is very technical and daunting to the new user, but it remains a good resource for doing more complex tasks with `chezmoi`.

- Command Overview: https://www.chezmoi.io/user-guide/command-overview/
- Reference: https://www.chezmoi.io/reference/
All the best!
- WiseHODL
- Create a new branch in
-
@ 32e18276:5c68e245
2024-06-26 19:03:35

Here's a neat tip for figuring out exactly where your packet went during a traceroute. You can simply paste the output of traceroute into ChatGPT, asking where the packet actually went geographically.
The question
Here's the input I gave it:
```
can you get a sense of this route geographically?

monad> tracepath nghttp2.org
 1?: [LOCALHOST]                                          pmtu 1500
 1:  _gateway                                              0.445ms
 1:  _gateway                                              0.363ms
 2:  24.86.64.1                                           11.495ms
 3:  rc1st-be118-1.vc.shawcable.net                        8.924ms
 4:  24.244.60.69                                         12.607ms asymm  3
 5:  24.244.63.101                                         9.884ms asymm  3
 6:  rc1st-be214.vc.shawcable.net                         11.561ms asymm  3
 7:  rc2wt-be50-1.wa.shawcable.net                        14.402ms asymm  3
 8:  rc2wt-be50-1.wa.shawcable.net                        14.334ms asymm  3
 9:  ge3-0.cr02.sea01.pccwbtn.net                         13.858ms asymm  4
10:  ge3-0.cr02.sea01.pccwbtn.net                         13.936ms asymm  4
11:  63-216-84-226.static.pccwglobal.net                 191.730ms asymm  8
12:  ae1.r01.hkg03.icn.netarch.akamai.com                154.284ms asymm  9
13:  ae1.r01.hkg03.icn.netarch.akamai.com                160.912ms asymm  9
14:  ae7.r01.tyo03.icn.netarch.akamai.com                156.906ms asymm  9
15:  a23-56-142-37.deploy.static.akamaitechnologies.com  189.058ms asymm 10
16:  a23-56-142-37.deploy.static.akamaitechnologies.com  189.577ms asymm 10
```
Response
Tracing the geographical path from your local host to nghttp2.org involves a series of hops that typically represent different network routers or devices. Let's break down the route step-by-step:
1. **Localhost to Gateway:** This is your local network's gateway, indicating the first point of contact outside your computer.
2. **24.86.64.1:** This IP is likely the first hop within your Internet Service Provider's (ISP) network. Given the IP range, this looks like a Canadian IP address, possibly Vancouver (Shaw Communications).
3. **rc1st-be118-1.vc.shawcable.net:** This indicates a Shaw Communications router, likely in Vancouver, British Columbia.
4. **24.244.60.69 and 24.244.63.101:** These are further hops within Shaw Communications' network, still in Vancouver.
5. **rc1st-be214.vc.shawcable.net:** Another router in Vancouver within Shaw's network.
6. **rc2wt-be50-1.wa.shawcable.net:** This hop seems to indicate a transition from Vancouver to Washington state in the USA.
7. **rc2wt-be50-1.wa.shawcable.net:** This appears again, indicating it's a key router in Washington state, likely Seattle.
8. **ge3-0.cr02.sea01.pccwbtn.net:** This shows the transition to PCCW Global's network, a major international telecommunications provider, in Seattle, Washington.
9. **63-216-84-226.static.pccwglobal.net:** Another hop within PCCW Global's network, possibly still in Seattle but preparing for a trans-Pacific jump.
10. **ae1.r01.hkg03.icn.netarch.akamai.com:** This hop indicates a significant jump to Hong Kong, to an Akamai router. This suggests the use of Akamai's CDN (Content Delivery Network).
11. **ae1.r01.hkg03.icn.netarch.akamai.com:** Another Akamai router in Hong Kong.
12. **ae7.r01.tyo03.icn.netarch.akamai.com:** Transition from Hong Kong to Tokyo, Japan, within Akamai's network.
13. **a23-56-142-37.deploy.static.akamaitechnologies.com:** The final destination is likely a server hosted by Akamai in Japan, which is hosting nghttp2.org.
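Notice that almost all of the geography above is read straight out of location codes embedded in the hostnames themselves (`vc` for Vancouver, `wa` for Washington, and airport codes like `sea`, `hkg`, `tyo`). A toy sketch of that heuristic (the mapping below is my own guess, hand-made for the hops in this particular trace, not any standard):

```bash
# Toy heuristic: guess a router's location from codes embedded in
# its hostname. Hand-made for this trace; real hostnames follow no
# standard, which is why an LLM's broader knowledge helps here.
geo_hint() {
  case $1 in
    *.vc.*)     echo "Vancouver, BC" ;;
    *.wa.*)     echo "Washington state, USA" ;;
    *sea[0-9]*) echo "Seattle (SEA)" ;;
    *hkg[0-9]*) echo "Hong Kong (HKG)" ;;
    *tyo[0-9]*) echo "Tokyo (TYO)" ;;
    *)          echo "unknown" ;;
  esac
}
geo_hint rc1st-be118-1.vc.shawcable.net        # Vancouver, BC
geo_hint ge3-0.cr02.sea01.pccwbtn.net          # Seattle (SEA)
geo_hint ae7.r01.tyo03.icn.netarch.akamai.com  # Tokyo (TYO)
```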
In summary, the geographical path likely follows this route:
- Vancouver, Canada (Shaw Communications)
- Seattle, Washington, USA (Shaw Communications and PCCW Global)
- Hong Kong (PCCW Global and Akamai)
- Tokyo, Japan (Akamai)
This route involves both North American and Asia-Pacific regions, indicating a significant geographical spread typically seen in global internet routing.
Summary
This makes traceroute way more understandable. ChatGPT knows enough about the domains to give you a good sense of the geographical path. I will definitely be using this from now on; maybe someone can make a tool to do this automatically and format it nicely.
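If someone did build that tool, the first step would just be parsing the hop lines out of the tracepath output before handing them to a geolocation lookup or an LLM prompt. A minimal sketch of that parsing step (the sample input is abbreviated from the trace above):

```bash
# Sketch of the parsing half of such a tool: pull the hop number,
# host, and latency out of tracepath output. The "1?:" pmtu line is
# skipped because its first field is not a plain "N:".
parse_hops() {
  awk '$1 ~ /^[0-9]+:$/ { print $1, $2, $3 }'
}
parse_hops <<'EOF'
 1:  _gateway                               0.445ms
 2:  24.86.64.1                            11.495ms
 9:  ge3-0.cr02.sea01.pccwbtn.net          13.858ms
12:  ae1.r01.hkg03.icn.netarch.akamai.com 154.284ms
EOF
```

Running this prints one normalized line per hop, e.g. `2: 24.86.64.1 11.495ms`, ready to be reformatted into a prompt.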
Obviously don't do this if you actually care about the privacy of your network paths, but I personally couldn't care less, so I don't mind feeding the AI overlords in this case.
Until next time!
-
@ 46fcbe30:6bd8ce4d
2024-05-07 16:18:50

Opinion about Bitcoin Core (desktop)
Bitcoin Core is the most prominent continuation of Satoshi Nakamoto's original work. Contributors to this project are pioneers in reproducible builds, and it's the software with the most attestations to its reproducibility. With probably thousands of academics looking at the code and how it's being compiled, this software sets the benchmark for transparency in self-custody.

It started out as the Bitcoin full node, wallet, and miner, and even parts of a marketplace could be found in the early code base. Today, it has been stripped of the mining feature, and even the wallet feature might get removed at some point.

For the average user, Bitcoin Core is not a convenient wallet, but it's arguably the only software that can really make sure you are receiving Bitcoin on the Bitcoin blockchain, so almost all wallets talk to a server running this software one way or another.
WalletScrutiny #nostrOpinion
-
@ a012dc82:6458a70d
2024-07-21 15:06:28

Table of Contents

- Bitcoin as a Store of Value
- Bitcoin as a Medium of Exchange
- Regulatory Challenges
- Increased Adoption in Developing Countries
- Competition from Other Cryptocurrencies
- Integration with Traditional Banking
- Conclusion
- FAQ
Bitcoin has been gaining popularity and acceptance as a digital currency in recent years. Its decentralized nature and cryptographic security make it an attractive alternative to traditional banking. As we look toward the future of Bitcoin, many predictions and projections have been made about its potential growth and development in the next decade. In this article, we will examine some of these predictions and projections and explore what the future of Bitcoin could look like.
Bitcoin as a Store of Value
One prediction for the future of Bitcoin is that it will become a widely accepted store of value, much like gold. Bitcoin's limited supply and decentralization make it a valuable asset that can potentially hedge against inflation and economic uncertainty. This could lead to more institutional investors and corporations holding Bitcoin as a long-term investment.
Bitcoin as a Medium of Exchange
Another projection for Bitcoin's future is that it will become a widely accepted medium of exchange. As more merchants and businesses accept Bitcoin as a form of payment, it could become a viable alternative to traditional payment methods. However, this would require wider adoption and greater stability in Bitcoin's value.
Regulatory Challenges
One of the challenges that Bitcoin could face in the next decade is increased regulatory scrutiny. Governments around the world are grappling with how to regulate digital currencies, and this could lead to increased regulation and compliance costs for businesses that use Bitcoin.
Increased Adoption in Developing Countries
Bitcoin's decentralized nature and accessibility make it an attractive alternative to traditional banking, especially in developing countries where access to banking services is limited. As more people in these countries adopt Bitcoin, it could lead to increased demand and stability in Bitcoin's value.
Competition from Other Cryptocurrencies
Bitcoin is not the only digital currency in the market, and it faces competition from other cryptocurrencies such as Ethereum and Ripple. While Bitcoin is currently the most widely accepted digital currency, it could face challenges from these competitors in the future.
Integration with Traditional Banking
One possible future for Bitcoin is that it could become integrated with traditional banking services. As more financial institutions adopt digital technologies, they could incorporate Bitcoin into their services, potentially leading to wider adoption and greater stability in Bitcoin's value.
Conclusion
The future of Bitcoin is uncertain, but there are many predictions and projections for its potential growth and development in the next decade. As more people and businesses adopt Bitcoin, it could become a widely accepted store of value and medium of exchange. However, it will also face challenges such as increased regulatory scrutiny and competition from other cryptocurrencies. The integration of Bitcoin with traditional banking services is also a possibility. Only time will tell how Bitcoin will evolve in the coming years.
FAQ
**What is the future of Bitcoin?** The future of Bitcoin is uncertain, but there are predictions and projections for its potential growth and development in the next decade.

**Will Bitcoin become a widely accepted store of value?** One prediction for the future of Bitcoin is that it will become a widely accepted store of value, much like gold, due to its limited supply and decentralization.

**Will Bitcoin become a widely accepted medium of exchange?** Another projection for the future of Bitcoin is that it will become a widely accepted medium of exchange as more merchants and businesses accept it as a form of payment.

**What challenges could Bitcoin face in the future?** Bitcoin could face challenges in the future such as increased regulatory scrutiny, competition from other cryptocurrencies, and the need for greater stability in its value.
That's all for today
If you want more, be sure to follow us on:
NOSTR: croxroad@getalby.com
Instagram: @croxroadnews.co
Youtube: @croxroadnews
Store: https://croxroad.store
Subscribe to CROX ROAD Bitcoin Only Daily Newsletter
https://www.croxroad.co/subscribe
DISCLAIMER: None of this is financial advice. This newsletter is strictly educational and is not investment advice or a solicitation to buy or sell any assets or to make any financial decisions. Please be careful and do your own research.
-
-
@ 6ad3e2a3:c90b7740
2024-06-22 17:54:17I’ve been obsessed with large numbers for a few years now, trying in vain to get others to care. But the “why” eluded me. I made one partially satisfying attempt to explain last year. Yes, he who has “the deepest paradigm can name the biggest number,” but to what end?
Now I think I’ve found the end, the reason I can’t quit this line of thinking, even if it’s driven me half mad: that the number TREE(3) is quite possibly a miracle. Despite arising from a simple game, TREE(3) can be hard to understand, so maybe it’s best to start with one of its “competitors”, Graham’s Number, to grasp what it is not.
Now Graham’s Number is only a competitor because the two are often compared, not because there is really any kind of competition. TREE(3) dwarfs Graham’s Number the way the breadth of the observable universe dwarfs the dimensions of an ant. (Actually, as you will see, that comparison vastly understates the disparity between the two numbers.)
But Graham’s Number, unfathomably vast in its own right, is both easier to understand and is generated via a different process. To get to Graham’s Number, we have to start with the most basic math that exists: counting.
I’ll excerpt from my post on growth that covers this:
First you have counting 1, 2, 3, 4…
If you want to speed up counting, you can add.
Instead of counting from three to six, you can just add 3 + 3. Addition therefore is just repeated (iterated) counting.
But instead of adding 3 + 3 + 3 + 3 + 3 + 3, you can just do 3 * 6 because multiplication is iterated addition.
But instead of multiplying 3 * 3 * 3 * 3, you can just do 3 ^ 4 because exponentiation is iterated multiplication.
That’s where most people leave off in their education, and they feel perfectly content to live their lives only because they don’t know what they’re missing. Beyond exponentiation lies tetration, or iterated exponentiation.
Instead of 3 ^ 3 ^ 3 ^ 3, you can just do 3 ↑↑ 4 (a power-tower of threes, four high.)
You would say it “three to the three to the three to the three” in exponentiation terms, or “three arrow arrow three” in tetration terms. The number before the arrows determines the base and the number after them how high the tower goes. Let’s calculate some easy ones.
2 ↑↑ 2 is a power tower of twos, two high. That is 2 ^ 2 = 4.
3 ↑↑ 2 is a power tower of threes two high. That is 3 ^ 3 = 27. Easy.
2 ↑↑ 3 is a power tower of twos three high. That is 2 ^ 2 ^ 2 which is 2 ^ 4 = 16.
3 ↑↑ 3 is a power tower of threes three high. That is 3 ^ 3 ^ 3 which is 3 ^ 27 = 7,625,597,484,987.

Wait, what happened? You just encountered a fast-growing function.
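The small cases above can be checked mechanically. Here's a quick sketch of the up-arrow recursion, safe only for these toy inputs (anything bigger overflows 64-bit arithmetic instantly, which is the whole point of the essay):

```bash
# Sketch of Knuth's up-arrow notation for tiny inputs only.
# arrows k a b computes a with k up-arrows applied to b;
# k=1 is plain exponentiation, k=2 is tetration.
pow() {
  local a=$1 b=$2 r=1
  while [ "$b" -gt 0 ]; do r=$((r * a)); b=$((b - 1)); done
  echo "$r"
}
arrows() {
  local k=$1 a=$2 b=$3
  if [ "$k" -eq 1 ]; then
    pow "$a" "$b"                     # base case: exponentiation
  elif [ "$b" -eq 0 ]; then
    echo 1                            # by convention, a ↑^k 0 = 1
  else
    # a ↑^k b = a ↑^(k-1) (a ↑^k (b-1)): iterate the weaker operation
    arrows $((k - 1)) "$a" "$(arrows "$k" "$a" $((b - 1)))"
  fi
}
arrows 1 3 4   # 3^4 = 81
arrows 2 2 3   # 2↑↑3 = 2^(2^2) = 16
arrows 2 3 2   # 3↑↑2 = 3^3 = 27
arrows 2 3 3   # 3↑↑3 = 3^27 = 7625597484987
```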
Ok, tetration is cute, but if you want to generate Graham’s number, you’ll need to add more up arrows. Iterated tetration, symbolized by three up-arrows, is called pentation and would look like this: 3 ↑↑↑ 3. If 3 ↑↑ 3 is 7.6 trillion, what would 3 ↑↑↑ 3 be?
Well, it’s just iterated tetration, which means it’s a series of double-arrow operations with a base of three, three long, i.e., 3 ↑↑ 3 ↑↑ 3.
And since we know the second half, (3 ↑↑ 3) = 7.6 trillion, we can simplify it to 3 ↑↑ 7.6 trillion. What does that mean? It means a power tower of threes, 7.6 trillion high.
Okay, that sounds big. How big?
Consider a power tower of threes five high, i.e., 3^3^3^3^3 or 3 ↑↑ 5, is bigger than a googolplex.
To get the scale of a googolplex (one with a googol zeroes), consider you could not fit the zeroes it would take to write it out in the universe, even if you put one trillion zeroes on every atom. Again, we are not talking about the number itself, merely the number of digits required to write it out.
Consider a number with 200 digits is so massive, it’s far more than the number of Planck volumes (smallest known unit of measure) in the universe, but it’s trivial to write out. But you do not have space to write out a googolplex even while using a trillion digits per atom, let alone what those digits, if you could even write them, represent.
Your odds of entering every lottery on earth for the rest of your life, from the local bake sale to the mega millions, and winning all of them are far, far, far greater than 1 in a googolplex.
Your odds of guessing all the private bitcoin keys on earth without making an error are greater than one in a googolplex. A googolplex is an unfathomably large number. And yet it is smaller than 3 ↑↑ 5, or 3^3^3^3^3.
But 3 ↑↑↑ 3 is a tower of threes not five high, but 7.6 trillion high! When you get even to 10 high, you’ve exceeded a googolplex to the googolplexth power. The human mind cannot fathom the number you arrive at even at 100 or 1000 high, but we have to get to 7.6 trillion.
Okay, now that we’ve multiplied out the entire power tower to 7.6 trillion, guess what, we have to add another arrow. Not 3 ↑↑↑ 3 but 3 ↑↑↑↑ 3.
That’s hexation which is iterated pentation, in this case with a base of three and three terms, i.e., 3 ↑↑↑ 3 ↑↑↑ 3. We already know the second half is, whatever the incomprehensible result of the multiplied-out 7.6 trillion-high power tower was, call it X. So it’s 3 ↑↑↑ X.
And that means iterated tetration with a base of three, X times, i.e., 3 ↑↑ 3 ↑↑ 3 ↑↑ 3 … X times.
To solve this, we go term by term. The first one is 7.6 trillion, which feeds into the second, the multiplied-out power tower 7.6 trillion high, i.e. X, the third is a power tower of threes, X high, multiplied out, and so on, and there are X of these entire towers, each one unfathomably, astronomically taller than the last.
Once we get through all X (remember itself an unfathomably large number) of the terms we’re at 3↑↑↑↑3.
That number is G1.
To get to G2, we just take 3 ↑↑↑↑↑↑↑↑↑↑…G1 arrows… 3.
Wait, what?
Remember each individual move up the scale from counting to addition to multiplication to exponentiation turbo-charged the growth of the function, and now in this function, they’re telling us to add G1 (3↑↑↑↑3) moves up the scale all at once!
Put differently, from counting by ones to the insanity of hexation, there are only six steps. To get G2, there are 3↑↑↑↑3 steps!
To get G3, it’s 3 G2 arrows 3. To get to G4, it’s 3 G3 arrows 3.
And so on until we hit G64 which is Graham’s Number.
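Written compactly in standard up-arrow notation (using an exponent on the arrow for the number of arrows), the whole construction is:

```latex
g_1 = 3\uparrow\uparrow\uparrow\uparrow 3, \qquad
g_n = 3\uparrow^{g_{n-1}} 3 \quad (2 \le n \le 64), \qquad
\text{Graham's Number} = g_{64}
```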
It’s an indescribably massive number, not relatable to anything in the universe, not even in terms of the possible ways the atoms could be arranged taken to the power of the number of ways history’s chess games could have been played.
There is no way to visualize or imagine it except by walking vaguely through the steps to get there and straining your brain to grasp the process.
But as I said, Graham’s Number is trivial, basically zero compared to TREE(3), and that is so much the case that if instead of going to G64 via the steps, you went to G(Googolplex), or even G(Graham’s Number), i.e., G(G64), you would still be at zero relative to TREE(3).
But here’s where it gets fascinating. While Graham’s Number is generated via ramping up increasingly powerful iterative operations (as we did in the beginning) TREE(3) comes from a simple game.
There is a good article in Popular Mechanics that lays it out, building off this excellent Numberphile video with Tony Padilla:
You can click on the article and video for the specific (and relatively basic rules), but essentially, the TREE function has to do with “seeds” (dots) and “trees” (combinations of dots and lines), such that you make the maximum amount of unique “trees” (dot-line combos) per the types of seeds available.
If you have only one kind of seed, say a green one, there is only one unique tree that can be made.
So:
TREE(1) = 1.
If you have two seeds, say a green and a red, there are three different kinds of unique trees you could make.
TREE(2) = 3.
If you have three seeds, say a green, a red and a black, there are TREE(3) different kinds of trees you could make. As it turns out, that number (which is not infinite) is so much bigger than Graham’s number the two are not even in the same universe:
Here’s Padilla comparing TREE(3) to Graham’s Number if you want to see the difference:
Okay, so what does all this mean? It means that Graham’s Number, which is generated by successively more powerful iterations of mathematical operations, cannot compete with TREE(3) which comes from a game with simple rules.
Graham’s Number is built the way a machine would do it, the way an AI would go about making a huge number — mechanically increasing the rate of construction.
Consider if you had a machine that made products one at a time, that’s like counting. And if you had a machine that made products three at a time, that’s like adding.
And a machine that made machines that made products three at a time, that’s like multiplication. And a machine that made those three at a time would be exponentiation, etc., etc.
Each successive machine-making machine would take you into a deeper paradigm of growth. But you can see this is a mechanical process, no matter how deep you go.
By contrast, the tree series is what happens when you design a game with simple rules and let it play out. The growth (once you realize it to the extent the human brain can even grasp it) is not just faster than the mechanistic model, it’s on another plane.
The takeaway, then, is that bottom-up complex systems (games) with a few simple rules can spawn a paradigm so much bigger than top-down mechanistic growth models.
The human brain (neocortex in McKenna’s terms) is just such a system, and yet we train ourselves to be like machines! Practice, routine, iteration, follow-these-10-steps to happiness, learn these five keys to investing, etc. Yes, you can get somewhere with these recipes, but nowhere near the destination of which you are inherently capable.
The key is a few simple inputs — good nutrition, enough sleep, a decent environment — and to let the mind have its space to play out the sequence in full.
In modern society the conditions needed for greatness, since the basics are relatively easy to come by, are achieved more by getting rid of negatives. Don’t be a drug addict or alcoholic, ditch the porn, the video games, the excessive social media use, etc. Then let the game play out.
Of course, this is easier said than done, as we’ve been deeply conditioned by the mechanistic paradigm, and remember TREE(1) is only 1, while G(1) is 3↑↑↑↑3, i.e., the more powerful growth function doesn’t necessarily reveal itself at the outset. But that changes in short order, and once it does, the mechanistic growth is no match for the “most densely ramified complexified structure in the known universe.”
I’ll end on a speculative thought, one that occurred to me while recording a podcast on this topic: Might what we think of as good vs evil actually just be a battle between the mechanistic and the complex, the difference between top-down compulsion and bottom-up free choice?
BONUS VIDEOS For those who want to dive deeper into this, there are some good videos by Carbrickscity on YouTube here, here and here. And Tim Urban's article on Graham's Number is worthwhile too.
-
@ 0b963191:fc5e7ffd
2024-05-05 23:36:45

Chef's notes
Can of course substitute in beef for the lamb.
In terms of canned peas, always prefer Le Sueur
Details
- ⏲️ Prep time: 30 min
- 🍳 Cook time: 1 hour
- 🍽️ Servings: 4
Ingredients
- Potatoes:
- 1 ½ pounds potatoes (Russet)
- ½ cup half-and-half
- 2 ounces unsalted butter
- ¾ teaspoon kosher salt
- ¼ teaspoon black pepper
- 1 egg yolk
- Meat Filling:
- 2 tablespoons canola oil
- 1 cup chopped white onion
- 3 carrots, peeled and diced
- 1 ½ pounds ground lamb
- 1 teaspoon kosher salt
- ½ teaspoon black pepper
- 3 tablespoons flour
- 3-4 teaspoons tomato paste
- 1 cup chicken broth
- 3 teaspoons Worcestershire sauce
- 2-3 teaspoons chopped rosemary leaves (fresh)
- 1-2 teaspoons chopped thyme leaves (fresh)
- ¾ cup corn kernels
- ¾ cup peas
Directions
- Potatoes:
- Peel and dice potatoes.
- Place in a pot and cover in cold water.
- Bring to a boil.
- Lower fire to simmer and cook for 15-20 minutes.
- Heat butter and half-and-half for 30-40 seconds in the microwave.
- Drain potatoes and put back into the pot.
- Mash potatoes and add half-and-half, butter, salt, pepper.
- Continue mashing until smooth and stir in yolk until combined.
- Preheat the oven to 400 degrees.
- Filling:
- Heat canola oil in a 10-12 inch pan over high heat.
- Add onions and carrots and saute over medium-high heat for 4-5 minutes.
- Add garlic and combine for 30 seconds.
- Add Lamb, salt, and pepper and cook until browned.
- Sprinkle flour and stir to coat lamb and cook for 1-2 minutes.
- Add Tomato paste, chicken broth, Worcestershire sauce, rosemary, thyme. Stir and bring to a boil.
- Reduce heat to low, cover, and simmer for 12-15 minutes while sauce thickens.
- Oven:
- Add meat filling to baking dish.
- Add peas and corn and stir in evenly
- Spread mashed potatoes using a rubber spatula. start at the corners and make sure to seal around the edges.
- Bake for 25 minutes
- Remove from the oven and let sit for 15-20 minutes.
-
@ 700c6cbf:a92816fd
2024-07-21 14:58:04

My first post in this community is going to be a simple one, i.e., not a recipe. However, if you'd like me to post the recipe, you know what to do.
Last night's dinner was born out of the fact that we ran out of fresh alternatives of chicken or fish, our usual staples to have for dinner with salad. Since I felt lazy, it's the weekend after all, I didn't feel like going shopping, I started checking our pantry for something, for an inspiration.
And inspiration struck in the form of: Garbanzo beans, also known as chick peas. We love those little bundles of healthy protein. Add them to salad, roast and spice them, make them into a pate or hummus or a good ol' German pea soup, they are extremely versatile.
But what I had in mind was a protein main to be accompanied by a mixed greens salad with honey vinaigrette, hence I made veggie burgers.
Earlier in the day we had talked about potato pancakes or hashbrowns or rösti in Switzerland. Luckily I had some potatoes on hand, unlucky for me they were small which makes them harder to hold when grating them. Yet somehow I managed to not get any of my fingers into the mix.
And thus our two patties dinner with salad was created. I added a little roasted garlic aioli to the rösti and a spicy habanero aioli to the bean patty.
https://cdn.satellite.earth/6c14c39a0a3809b8b66351e0a6187f7914d6befe1ddc4e68018a8bcf4c182ccf.jpg
https://cdn.satellite.earth/dd759f338afc6f88e1bbc3cc803899e5b98661d846f6b5604d5695a6a585b253.jpg
I dare say it was a pretty good dinner - yum!
Cheers, OceanBee
-
@ 75bf2353:e1bfa895
2024-06-21 21:41:19

Using p2p Electronic Cash
by Blogging Bitcoin
“The curious task of economics is to demonstrate to men how little they really know about what they imagine they can design. To the naive mind that can conceive of order only as the product of deliberate arrangement, it may seem absurd that in complex conditions order, and adaptation to the unknown, can be achieved more effectively by decentralizing decisions and that a division of authority will actually extend the possibility of overall order. Yet that decentralization actually leads to more information being taken into account.” ― Friedrich Hayek, The Fatal Conceit: The Errors of Socialism
I write a free and open source blog about using bitcoin as peer to peer electronic cash. This is a compilation of some blogs I've written over the last two years. I wanted to write a book, but always felt paywalls are contrary to the free and open-source software movement. I still might publish this on dead trees, but want it to be free for anyone to read, like the code you can find on https://github.com/bitcoin/bitcoin/. This is also an example of what bitcoin makes possible. Andreas Antonopoulos once compared bitcoin to the invention of the printing press.

I may not agree with everything he says, but he was right about that. Bitcoin allows me to get paid directly by the consumer. It is possible to sell my words without jumping through the Amazon hoops. App developers can publish apps, but Apple will not be able to levy its 30% tax. I'm not advocating tax evasion or fighting the IRS, but why should we pay corporations to publish and censor us when we can be paid directly by our audience?
This is a compilation of 21 of my favorite blogs I wrote on nostr over the last two years. It is written using the Value 4 Value model. To support my work, please pay the paywall, send zaps, or contribute your time, treasure, or talent in another way.
Most bitcoin books try to convince you to buy bitcoin. They talk about how bitcoin is worth more today than ten years ago. You should buy bitcoin because it's going to a moon 69,420 light-years away in a galaxy called WhenLambo. I get it. 125 sats will most likely buy you something at the dollar tree one day. At the time of this writing, 1,922 sats will buy you a single item at the dollar tree, but 1,538 sats will buy you a paper NFT with George Washington's mug shot. This is not about NFTs, trading crypto, or how many dollars 100,000,000 sats are worth. It's the musings of some nerd on the Internet who believes cryptography equals freedom in the electronic age. It's about how I use bitcoin and freedom tech in an unfree fiat tech world.
All of my work is written under a Creative Commons Licence. To support my work, please pay the paywall or send zaps. I have published this blog behind this paywall. Paywalls kinda suck because they block so many people from reading your work. At least, they think they do. Most paywalls can be bypassed by a simple Google search.
nostr:nprofile1qyvhwumn8ghj7mn0wd68yvfwvd6hyun9de6zuenedyhszxrhwden5te0wfjkccte9e3h2unjv4h8gtnx095j7qghwaehxw309aex2mrp0yh8qunfd4skctnwv46z7qgswaehxw309ahx7um5wghx6mmd9uq3uamnwvaz7tmwdaehgu3dwp6kytnhv4kxcmmjv3jhytnwv46z7qgwwaehxw309ahx7uewd3hkctcpzdmhxue69uhk7enxvd5xz6tw9ec82c30qywhwumn8ghj7mn0wd68ytnzd96xxmmfdejhytnnda3kjctv9uqzqadlydf3460e333fjkaqwxg7fz82636ew5m36c7hml2xhhsml2y4csrc5a
Using p2p Electronic Cash, by Blogging Bitcoin
https://i.nostr.build/zdJG3.png
2) Support Your Local Circular Economy To Drain The Exchanges

4) Use Bitcoin To Protest Inflation

9) Privacy Is Necessary For An Open Society In The Electronic Age

10) Fix The Money, Fix The Music

11) Build The Bitcoin Economy On Nostr

12) I don't trust the Cold Card: How to Verify Dice Rolls Using Tails

13) Recurring Bitcoin Donations with OakNode

15) Doge Is Dumb

16) Intransigent Tipping

17) Explain The Blockchain To Me (Like I don't Know How To Use The Command Line)

19) How Bitcoin Works In A Nutshell

20) Stacking Sats Is For Everybody
₿logging ₿itcoin 848,675 debtclock 🧡 Support this by using the ⚡Value 4 Value model⚡
-
@ 1aa437b3:51119dbd
2024-07-21 14:42:12

As the topic says, is there such a thing as connecting to too many Nostr relays? At what number are you connecting to too many, or what should the max be, if any?
originally posted at https://stacker.news/items/616313