The COVID-19 pandemic, rampant inflation and regional conflicts directly influenced Bitcoin’s (BTC) drop in value over the past two years. However, 2024 promises to be a resurgent period, according to Blockstream CEO Adam Back.
The cryptographer, who pioneered the proof-of-work algorithm applied in Bitcoin’s protocol, tells Cointelegraph that the preeminent cryptocurrency is trailing below the historical price trend line of previous mining reward-halving events.
“Biblical” events hurt Bitcoin
Back weighed in on the potential price action of Bitcoin as the next halving, which will see Bitcoin miners’ block reward cut from 6.25 BTC to 3.125 BTC, looms in April 2024. Block reward halvings are programmatically hardwired into Bitcoin’s code, taking place after every 210,000 blocks are mined.
Bitcoin’s supply issuance is hardwired into its protocol, with BTC mining rewards halving every 210,000 blocks. Source: bitcoinblockhalf.com
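For readers who want the arithmetic behind the chart, here is a minimal Python sketch of the subsidy schedule. The 50 BTC initial reward and 210,000-block interval are protocol constants, but the function itself is illustrative and simplified (Bitcoin Core works in integer satoshis with a bit shift):

```python
# Illustrative sketch of Bitcoin's block-subsidy schedule.
# Simplified to floats; Bitcoin Core uses integer satoshis and a right shift.

INITIAL_SUBSIDY_BTC = 50.0   # reward when the chain launched in 2009
HALVING_INTERVAL = 210_000   # blocks between halvings

def block_subsidy(height: int) -> float:
    """Return the mining subsidy (in BTC) at a given block height."""
    halvings = height // HALVING_INTERVAL
    return INITIAL_SUBSIDY_BTC / (2 ** halvings)

# The fourth halving, at block 840,000 (expected April 2024),
# cuts the reward from 6.25 BTC to 3.125 BTC:
print(block_subsidy(839_999))  # 6.25
print(block_subsidy(840_000))  # 3.125
```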
Back says that the overlaid averages of the previous market cycles and halvings indicate that Bitcoin’s relative value is trailing behind widely accepted projections. Multiple events have played a role in driving the price of BTC down, which has also been seen across conventional financial markets:
“The last few years were like biblical pestilence and plague. There was COVID-19, quantitative easing, and wars affecting power prices. Inflation running up, people, companies are going bankrupt.”
The impact has keenly affected markets and portfolio management, according to Back. Investment managers have had to manage risk and losses over the past few years, which has necessitated the sale of more liquid assets.
“They have to come up with cash and sometimes they’ll sell the good stuff because it’s liquid and Bitcoin is super liquid. It used to happen with gold and I think that’s a factor for Bitcoin in the last couple of years,” Back explains.
Bitcoin would have hit $100,000 already
As 2023 comes to a close, many of the macro events Back cited have wound down, while more industry-specific failures have also been resolved. This has been reflected in Bitcoin’s recent price surge from Nov. 2023 onwards.
“The wave of the contagion, the companies that went bankrupt because they were exposed to Three Arrows Capital, Celsius, BlockFi and FTX – that’s mostly done. We don’t think there are many more big surprises in store,” Back said.
Earlier this year, the Blockstream CEO predicted that Bitcoin would hit $100,000 in the following market cycle, and he referred back to this point. He believes BTC would have hit that mark already if not for the factors highlighted in conversation with Cointelegraph.
Back also referred to the Bitcoin “stock-to-flow” model created by pseudonymous former institutional investor PlanB as a reference point for the potential upside for Bitcoin in 2024.
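The model’s core input is simple to compute: the stock-to-flow ratio divides existing supply (stock) by annual new issuance (flow). The rough Python illustration below uses approximate post-2024-halving figures of our own choosing, not PlanB’s dataset or his fitted regression:

```python
# Rough illustration of the stock-to-flow ratio underlying PlanB's model.
# Figures are approximate early-2024 values, not PlanB's published data.

BLOCKS_PER_YEAR = 52_560          # ~144 blocks/day * 365 days
subsidy_btc = 3.125               # block reward after the April 2024 halving
circulating_supply = 19_700_000   # approximate BTC supply in early 2024

annual_flow = subsidy_btc * BLOCKS_PER_YEAR     # new coins minted per year
stock_to_flow = circulating_supply / annual_flow

print(f"Annual issuance: {annual_flow:,.0f} BTC")   # ~164,250 BTC
print(f"Stock-to-flow ratio: {stock_to_flow:.0f}")  # ~120
```

The higher the ratio, the scarcer the asset relative to its production; each halving roughly doubles it, which is the mechanism the model’s bullish projections rest on.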
Back explains that PlanB’s model and heuristics suggest that savvy Bitcoin investors historically bought BTC six months before a halving event and sold into significant surges in price that have occurred in the 18 months following the drop in mining rewards:
“People thought it was a bit of a crazy assertion that we might get to $100,000 pre-halving because I said it when the price was around $20,000.”
He adds that Bitcoin’s price hitting $44,000 multiple times in Dec. 2023 suggests that his prior prediction might not be so far-fetched.
People asking me if we changed odds. No, we still holding line at 90% odds of approval by Jan 10 (aka this cycle), the same odds we’ve had for months (before it was cool/safe). What we watching for now: more amended/final filings to roll in and clarity on in-kind vs cash creates https://t.co/uiWgfxOfzz
Senior ETF analysts Eric Balchunas and James Seyffart have tipped the pending spot Bitcoin ETF applications to get the green light in early 2024. Galaxy Digital co-founder Michael Novogratz has also predicted mass inflows of institutional investment into the BTC-backed products, a point Back echoes:
“I think Bitcoin could get to $100,000 even before the ETF and before the halving. But I certainly think the ETF shouldn’t be undervalued in its influence.”
A key reason cited by the Bitcoin advocate is that whole segments of traditional markets, including major fund managers like BlackRock and Fidelity, are simply not allowed to invest directly into assets like Bitcoin.
“If they’re managing a mutual fund they have rules, either externally imposed or as part of their fund, that they can only buy things like public stocks and ETFs. They can’t buy into startups, they can’t buy precious metals physically. They can’t do any of that stuff,” Back highlights.
This remains a pertinent reason why a spot Bitcoin ETF could drive major capital inflows into the space. Back adds that the investment vehicle opens access to Bitcoin exposure for many types of funds, particularly in the U.S., that are more inclined to do so through Fidelity or BlackRock than with a cryptocurrency exchange.
TikTok and Instagram have been accused of targeting teenagers with suicide and self-harm content – at a higher rate than two years ago.
The Molly Rose Foundation – set up by Ian Russell, whose 14-year-old daughter Molly took her own life after viewing harmful content on social media – commissioned analysis of hundreds of posts on the platforms, using accounts set up as those of a 15-year-old girl based in the UK.
The charity claimed videos recommended by algorithms on the For You pages continued to feature a “tsunami” of clips containing “suicide, self-harm and intense depression” to under-16s who have previously engaged with similar material.
One in 10 of the harmful posts had been liked at least a million times. The average number of likes was 226,000, the researchers said.
Mr Russell told Sky News the results were “horrifying” and showed online safety laws are not fit for purpose.
Image: Molly Russell died in 2017. Pic: Molly Rose Foundation
‘This is happening on PM’s watch’
He said: “It is staggering that eight years after Molly’s death, incredibly harmful suicide, self-harm, and depression content like she saw is still pervasive across social media.
“Ofcom’s recent child safety codes do not match the sheer scale of harm being suggested to vulnerable users and ultimately do little to prevent more deaths like Molly’s.
“The situation has got worse rather than better, despite the actions of governments and regulators and people like me. The report shows that if you strayed into the rabbit hole of harmful suicide self-injury content, it’s almost inescapable.
“For over a year, this entirely preventable harm has been happening on the prime minister’s watch and where Ofcom have been timid it is time for him to be strong and bring forward strengthened, life-saving legislation without delay.”
Image: Ian Russell says children are viewing ‘industrial levels’ of self-harm content
After Molly’s death in 2017, a coroner ruled she had been suffering from depression, and the material she had viewed online contributed to her death “in a more than minimal way”.
Researchers at Bright Data looked at 300 Instagram Reels and 242 TikToks to determine if they “promoted and glorified suicide and self-harm”, referenced ideation or methods, or “themes of intense hopelessness, misery, and despair”.
Instagram
The Molly Rose Foundation claimed Instagram “continues to algorithmically recommend appallingly high volumes of harmful material”.
The researchers said 97% of the videos recommended on Instagram Reels for the account of a teenage girl, who had previously looked at this content, were judged to be harmful.
Some 44% actively referenced suicide and self-harm, they said. They also claimed harmful content was sent in emails containing recommended content for users.
A spokesperson for Meta, which owns Instagram, said: “We disagree with the assertions of this report and the limited methodology behind it.
“Tens of millions of teens are now in Instagram Teen Accounts, which offer built-in protections that limit who can contact them, the content they see, and the time they spend on Instagram.
“We continue to use automated technology to remove content encouraging suicide and self-injury, with 99% proactively actioned before being reported to us. We developed Teen Accounts to help protect teens online and continue to work tirelessly to do just that.”
TikTok
TikTok was accused of recommending “an almost uninterrupted supply of harmful material”, with 96% of the videos judged to be harmful, the report said.
Over half (55%) of the For You posts were found to be suicide and self-harm related, with a single search yielding posts promoting suicidal behaviour, dangerous stunts and challenges, it was claimed.
The number of problematic hashtags had increased since 2023, with many shared on highly followed accounts that compiled ‘playlists’ of harmful content, the report alleged.
A TikTok spokesperson said: “Teen accounts on TikTok have 50+ features and settings designed to help them safely express themselves, discover and learn, and parents can further customise 20+ content and privacy settings through Family Pairing.
“With over 99% of violative content proactively removed by TikTok, the findings don’t reflect the real experience of people on our platform, which the report admits.”
According to TikTok, it does not allow content showing or promoting suicide and self-harm, and says that banned hashtags lead users to support helplines.
‘A brutal reality’
Both platforms allow young users to give negative feedback on harmful content recommended to them. But the researchers found they can also give positive feedback on this content, in which case it will keep being sent to them for the next 30 days.
Technology Secretary Peter Kyle said: “These figures show a brutal reality – for far too long, tech companies have stood by as the internet fed vile content to children, devastating young lives and even tearing some families to pieces.
“But companies can no longer pretend not to see. The Online Safety Act, which came into effect earlier this year, requires platforms to protect all users from illegal content and children from the most harmful content, like promoting or encouraging suicide and self-harm. 45 sites are already under investigation.”
An Ofcom spokesperson said: “Since this research was carried out, our new measures to protect children online have come into force.
“These will make a meaningful difference to children – helping to prevent exposure to the most harmful content, including suicide and self-harm material. And for the first time, services will be required by law to tame toxic algorithms.
“Tech firms that don’t comply with the protection measures set out in our codes can expect enforcement action.”
Image: Peter Kyle has said opponents of the Online Safety Act are on the side of predators. Pic: PA
‘A snapshot of rock bottom’
A separate report out today from the Children’s Commissioner found the proportion of children who have seen pornography online has risen in the past two years – also driven by algorithms.
Rachel de Souza described the content young people are seeing as “violent, extreme and degrading”, and often illegal, and said her office’s findings must be seen as a “snapshot of what rock bottom looks like”.
More than half (58%) of respondents to the survey said that, as children, they had seen pornography involving strangulation, while 44% reported seeing a depiction of rape – specifically of someone who was asleep.
The survey of 1,020 people aged between 16 and 21 found that they were on average aged 13 when they first saw pornography. More than a quarter (27%) said they were 11, and some reported being six or younger.
Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273-TALK.