• 0 Posts
  • 18 Comments
Joined 1 year ago
Cake day: November 4th, 2023




  • Even though the costs of some AAA games (certainly not all) have gone up because of team sizes and labor hours, so has the volume of sales. Publishers have made more and more profit while the average price of AAA games has stayed about the same for a long time.

    Selling in the hundreds of thousands was considered really good decades ago; now top titles sell in the tens of millions.

    Publishers aren’t having problems with profitability. Quite the opposite: they’ve been buying up large swaths of development houses and IPs, then dismantling them when they have a single flop.

    EA’s gross profit was $1.6B in 2010, $3.03B in 2014, and $5.8B over the past twelve months, according to macrotrends.

    “But the current trends are unsustainable”

    The current trend in profitability is up, not down, and it isn’t a minor trend or a minor increase either.

    Major publisher profitability has vastly increased in spite of stagnant game prices. They don’t have to raise prices to keep growing; it’s simply that the market will bear higher prices at even greater profitability, so they raise them.
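    As a sanity check on the figures above (taking the macrotrends numbers exactly as quoted; the trailing-twelve-month figure has no stated year, so I only give the overall multiple for it):

    ```python
    # Growth implied by the EA gross-profit figures quoted above
    # ($B, per macrotrends as cited in the comment).

    def cagr(start, end, years):
        """Compound annual growth rate between two values."""
        return (end / start) ** (1 / years) - 1

    ea_2010, ea_2014, ea_ttm = 1.6, 3.03, 5.8

    # 2010 -> 2014 works out to roughly 17% per year, compounded.
    print(f"2010-2014 CAGR: {cagr(ea_2010, ea_2014, 4):.1%}")
    # The trailing-twelve-month figure is ~3.6x the 2010 number.
    print(f"2010 -> TTM multiple: {ea_ttm / ea_2010:.1f}x")
    ```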



  • I was trying to find the old Level 3 blog post but couldn’t; as I recall, it basically said that Comcast needed to upgrade its infrastructure and never did, and that Comcast saw Netflix as the cash cow that would essentially pay for it. As a Comcast customer, I see it as charging the customer twice: first for the Internet service to reach the content, and again when Netflix passes that extra cost on to you (and to everyone else who isn’t a Comcast customer).

    You’re right on about CDNs and edge ingress/egress PoPs. They also keep things cheaper for the likes of Netflix/Amazon/etc. in the long run, with the added benefit of more availability.



  • The problem historically isn’t that streaming services are paying for fast lanes, but that they have had to pay not to be throttled below normal traffic. In other words, they have to pay more just to be treated like other traffic.

    Even crazier, remember that there are actual peering agreements between networks like Cogent, Level 3, Comcast, Hurricane Electric, AT&T, etc. What drew the spotlight was that Comcast bypassed its peering agreement with Level 3, went directly to Level 3’s end customer (Netflix), and told them they’d be specifically throttled if they didn’t pay a premium, which also undermined Level 3’s peering agreement with Comcast.

    Peering agreements are basically “I’ll route your traffic if you route my traffic,” and that’s how the Internet works.
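    In routing-policy terms, that handshake usually shows up as an export rule: over a settlement-free peering link, an AS advertises only its own and its customers’ prefixes, never routes it learned from other peers or providers, since carrying those would amount to free transit. A toy sketch of that rule (the prefixes and the simplified model are mine, for illustration only):

    ```python
    # Toy sketch of a settlement-free peering export policy.
    # An AS tags each route by how it learned it, then filters
    # what it advertises based on the relationship with the neighbor.

    def export_routes(routes, relationship):
        """Return the prefixes an AS will advertise to a neighbor.

        routes: dict mapping prefix -> how we learned it
                ("own", "customer", "peer", or "provider")
        relationship: the neighbor's relationship to us
                ("customer", "peer", or "provider")
        """
        if relationship == "customer":
            # Customers pay us for transit, so they hear everything.
            return set(routes)
        # Peers and providers only hear our own and customer routes:
        # "I'll route your traffic if you route my traffic,"
        # not free transit to the rest of the Internet.
        return {p for p, src in routes.items() if src in ("own", "customer")}

    routes = {
        "198.51.100.0/24": "own",       # our own prefix
        "203.0.113.0/24": "customer",   # learned from a paying customer
        "192.0.2.0/24": "peer",         # learned from another peer
    }
    print(sorted(export_routes(routes, "peer")))
    # -> ['198.51.100.0/24', '203.0.113.0/24']  (the peer route is withheld)
    ```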





  • “They are also the only RCS supplier on Android. A random messaging app can’t simply add RCS messaging functionality.”

    You are correct that an app can’t directly implement RCS, but it can support it. RCS is implemented by the carrier, not by Google or by any particular text application.

    RCS is an open standard that any carrier can implement to replace SMS/MMS. The only special thing Google does on top of RCS is provide E2E encryption, via its own servers, for handling messaging. E2E isn’t part of RCS, though it should be, IMO. Regardless, Google doesn’t ‘own’ the Android implementation, because RCS isn’t part of Android; Android can merely support the carrier’s implementation of RCS.





  • Right now the closest we have to that is running Ampere clusters. I say that because it will be some years before any phone CPU/GPU can effectively run a decent AI model. I don’t doubt there will be some sort of marketing about ‘boosting’ AI via your phone’s CPU/GPU, but it won’t be much more than a marketing ploy.

    It is far more likely that this work will continue to be offloaded to the cloud. There is much more market motivation to keep putting your data in the cloud than to keep it off of it.


  • It’s already here. I run AI models on my GPU, trained on data from various sources, for both search/GPT-like chat and for images. You can basically point and click your way through this with GPT4All, which integrates a chat client and lets you select from popular AI models without really knowing how to do anything or touch the CLI. It basically gives you a ChatGPT experience offline, using your GPU if it has enough VRAM for the model you pick, or your CPU if it doesn’t. I don’t think it does images, but there are other projects out there that make generating them on your own hardware just as simple.
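    The GPU-if-it-fits behavior is essentially a check of model size (plus working memory) against available VRAM. A hypothetical sketch of that fallback (the function, overhead factor, and sizes are invented for illustration; this is not GPT4All’s actual API):

    ```python
    # Hypothetical device-selection logic illustrating the
    # "GPU if enough VRAM, otherwise CPU" fallback described above.
    # The names and numbers are made up for illustration.

    def pick_device(model_size_gb, vram_gb, overhead=1.2):
        """Choose where to run inference: GPU if the model (plus some
        working-memory overhead) fits in VRAM, else fall back to CPU."""
        return "gpu" if model_size_gb * overhead <= vram_gb else "cpu"

    # A 7B-parameter model quantized to 4 bits is roughly 4 GB on disk.
    print(pick_device(4.0, 8.0))   # fits an 8 GB card -> "gpu"
    print(pick_device(4.0, 4.0))   # ~4.8 GB needed > 4 GB -> "cpu"
    ```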




  • The problem is that ad networks and ad placements have been bad actors in the consumer space. Malware has been delivered through ads time and time again, along with fake ads that lead to malware, and when that happens, suddenly the content creator/website/whatever ‘isn’t responsible’ for it. Then there’s the issue of ads being placed everywhere, slowing down websites, and, even worse, getting in the way: auto-playing audio and video, videos scrolling over the content you’re trying to read, and so on.

    As a consumer, I should not, and ethically do not, need to worry about someone else’s business model. If that model fails simply because I don’t allow something it depends on to traverse my network, then it is on them to figure it out. If the ads get in the way of the content, then I just want to consume the content anyway.

    Some news websites use Ad Admiral or whatever it’s called, and I haven’t bothered trying to bypass their adblock wall. I simply consume the content elsewhere.

    If ads were ever used responsibly, or if there were some compromise consumers wouldn’t mind, there’d probably be a lot less ad-blocker usage. It’s like anything else: as long as installing an ad blocker takes less effort than putting up with the ads, ad blockers will stay popular.

    I was around before ad blockers were popular, and even before pop-up blockers existed. Ads kept getting worse, which is why ad blockers became more popular and more sophisticated. The Internet had ads for years before ad blockers were the norm.