Quantum computing could be useful faster than anyone expected zdnet.com/article/quantum…
πŸ‘︎ 2k
πŸ“°︎ r/tech
πŸ’¬︎
πŸ‘€︎ u/MichaelTen
πŸ“…︎ May 06 2021
🚨︎ report
Intel says it has solved a key bottleneck in quantum computing - The breakthrough could lead to tightly integrated quantum chips. engadget.com/intel-ends-q…
πŸ‘︎ 2k
πŸ“°︎ r/Futurology
πŸ’¬︎
πŸ‘€︎ u/izumi3682
πŸ“…︎ May 12 2021
🚨︎ report
With all due respect to Nepo, we all know who would have won the Candidates if only his algorithm had finished computing
πŸ‘︎ 3k
πŸ“°︎ r/AnarchyChess
πŸ’¬︎
πŸ‘€︎ u/martincheckmate
πŸ“…︎ May 01 2021
🚨︎ report
A quick trick for computing eigenvalues | Essence of linear algebra, chapter 15 youtube.com/watch?v=e50Bj…
πŸ‘︎ 990
πŸ“°︎ r/math
πŸ’¬︎
πŸ‘€︎ u/mohamez
πŸ“…︎ May 07 2021
🚨︎ report
NVIDIA Announces CPU for Giant AI and High Performance Computing Workloads

https://nvidianews.nvidia.com/news/nvidia-announces-cpu-for-giant-ai-and-high-performance-computing-workloads

β€œNVIDIA today announced its first data center CPU, an Arm-based processor that will deliver 10x the performance of today’s fastest servers on the most complex AI and high performance computing workloads.

The result of more than 10,000 engineering years of work, the NVIDIA Graceβ„’ CPU is designed to address the computing requirements for the world’s most advanced applications β€” including natural language processing, recommender systems and AI supercomputing β€” that analyze enormous datasets requiring both ultra-fast compute performance and massive memory. It combines energy-efficient Arm CPU cores with an innovative low-power memory subsystem to deliver high performance with great efficiency.

β€œLeading-edge AI and data science are pushing today’s computer architecture beyond its limits – processing unthinkable amounts of data,” said Jensen Huang, founder and CEO of NVIDIA. β€œUsing licensed Arm IP, NVIDIA has designed Grace as a CPU specifically for giant-scale AI and HPC. Coupled with the GPU and DPU, Grace gives us the third foundational technology for computing, and the ability to re-architect the data center to advance AI. NVIDIA is now a three-chip company.”

Shares are up 4% on the news; wonder what that means for the ARM deal.

Edit: https://www.globenewswire.com/news-release/2021/04/12/2208550/0/en/NVIDIA-Announces-First-Quarter-Fiscal-2022-Revenue-Tracking-Above-Outlook.html

Raised guidance too wow

πŸ‘︎ 2k
πŸ“°︎ r/investing
πŸ’¬︎
πŸ‘€︎ u/day_bowbow
πŸ“…︎ Apr 12 2021
🚨︎ report
Goldman Sachs predicts quantum computing 5 years away from use in markets ft.com/content/bbff5dfd-c…
πŸ‘︎ 363
πŸ“°︎ r/finance
πŸ’¬︎
πŸ‘€︎ u/Forgottenmudder
πŸ“…︎ Apr 29 2021
🚨︎ report
Suddenly on Christmas you get a PC made of pulsating flesh, blood and bone, with all the normal PC ports. It has 1000 times more computing power than your current PC, but you have to feed it a rat once a month. How would you react to that?
πŸ‘︎ 75k
πŸ“°︎ r/AskReddit
πŸ’¬︎
πŸ‘€︎ u/Kohrack
πŸ“…︎ Dec 04 2020
🚨︎ report
Was watching a documentary on chess computing when I saw this stray AMD logo residing inside the HiTech chess computer
πŸ‘︎ 3k
πŸ“°︎ r/Amd
πŸ’¬︎
πŸ“…︎ Apr 22 2021
🚨︎ report
The etymology of general computing terms (featuring avatar, boot, cookie, spam and wiki)
πŸ‘︎ 21k
πŸ“°︎ r/coolguides
πŸ’¬︎
πŸ‘€︎ u/TheStrangeRoots
πŸ“…︎ Feb 22 2021
🚨︎ report
Source Review - Militaries regularly compensate for the rotation of the planet Earth (the Coriolis Effect) when computing ballistic firing solutions, therefore the Earth rotates. Flat Earthers lament.

So, during a debate on another thread I encountered a flat Earther making the following claim:

>No military take curvature or spin into account for any targeting. This is a very old argument.Seems you are just regurgitating the same old arguments. Everything in your comment has not only been disproven, it has been absolutely destroyed in reality.

This surprised me, as it is extremely well-understood that militaries do take these factors into account. That individual has elected not to review any of this evidence (no surprise), but I felt it would go to waste languishing ten comments deep in a flat Earth thread, so I reprise it here.

I submit that in fact there is a strong body of evidence that militaries do have to account for Earth's rotation when aiming ballistic weaponry at long ranges.

I will break this down into evidence sections for organisation... this is only a fraction of the evidence out there, but I have to stop somewhere :)

Primary Military Sources

>CORRECTIONS TO RANGE/AZIMUTH TO COMPENSATE FOR THE ROTATION OF THE EARTH.

There is also explicit discussion of it throughout the text, for example on p. 399:

>compensate for the effects of ballistic wind direction and speed and for rotation of the earth throughout the firing unit’s area of responsibility. These corrections are combined with the position constants determined in the concurrent met by solving a subsequent met in each 800-mil sector or selected 800-mil sectors. (See Figure 11-32.)

>*Rotation of Earth.* Although the rotation of the earth is a natural phenomenon, it is treated as a nonstandard condition for simplicity in the construction of firing tables. The magnitude and direction of projectile displacement from the target owing to rotational effects are functions of azimuth of fire, time of flight, projectile velocity, and relative position of piece and target with respect to the Equator.
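The correction the manual describes can be roughed out numerically. A minimal sketch, assuming a flat trajectory and the standard first-order deflection d = Ω·v·t²·sin(latitude); the example velocity and flight time are illustrative, not taken from any firing table:

```python
import math

def coriolis_drift(v, t, lat_deg, omega=7.292e-5):
    """First-order horizontal Coriolis deflection (metres) for a
    flat-trajectory projectile: d = omega * v * t**2 * sin(latitude).
    v: average projectile velocity (m/s), t: time of flight (s),
    omega: Earth's rotation rate (rad/s)."""
    return omega * v * t**2 * math.sin(math.radians(lat_deg))

# Illustrative example: 800 m/s average velocity, 40 s flight, latitude 45 N
print(round(coriolis_drift(800, 40, 45), 1))  # 66.0 metres, deflecting right in the N hemisphere
```

Real firing tables fold this into combined azimuth/range corrections along with wind and drift, but even this crude estimate shows the effect is far too large to ignore at long range.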

... keep reading on reddit ➑

πŸ‘︎ 222
πŸ“°︎ r/conspiracyNOPOL
πŸ’¬︎
πŸ‘€︎ u/Aurazor
πŸ“…︎ Mar 29 2021
🚨︎ report
Quantum computing papers
πŸ‘︎ 611
πŸ“°︎ r/xkcd
πŸ’¬︎
πŸ‘€︎ u/AlrikBunseheimer
πŸ“…︎ May 02 2021
🚨︎ report
[D] ML and Quantum Computing

Hi reddit,

I am not entirely sure if this is the right place to ask this, but here I go. I recently got an opportunity to work on a quantum computing team. The entire purpose of the team is to find applications of quantum computing in our firm. It sounds like an amazing opportunity, but I am having a lot of reservations. Would love and appreciate your insights.

I come from a CS engineering background with a master's in AI/ML; my full-time position at the moment is that of a data scientist working on clustering models. The quantum computing opportunity is available due to the formation of a new team which has a lot of vacancies at the moment.

Although the prospect of quantum AI/ML sounds hella interesting (although very niche), quantum computing itself seems like an extreme departure from my formal education, with its heavy emphasis on physics. This is making me anxious.

How viable is this opportunity for my career? How are the job prospects in the domain (for someone with my background)? Would it help me create a niche for myself in data science? I can admit shamelessly that I'd like a career which pays the big bucks while keeping me interested enough.

Would love your opinions

EDIT: I am very thankful for this amazing community. Thank you for all your responses. Holy shit, that's a lot to take in.

πŸ‘︎ 61
πŸ“°︎ r/MachineLearning
πŸ’¬︎
πŸ‘€︎ u/ml_abler
πŸ“…︎ May 10 2021
🚨︎ report
IBM just solved this quantum computing problem 120 times faster than previously possible. Big Blue has now released Qiskit Runtime, which enables a significant acceleration of quantum calculations carried out over the cloud. zdnet.com/article/ibm-jus…
πŸ‘︎ 171
πŸ“°︎ r/Futurology
πŸ’¬︎
πŸ‘€︎ u/izumi3682
πŸ“…︎ May 11 2021
🚨︎ report
Goldman Sachs predicts quantum computing 5 years away from use in markets ft.com/content/bbff5dfd-c…
πŸ‘︎ 286
πŸ“°︎ r/singularity
πŸ’¬︎
πŸ‘€︎ u/JackFisherBooks
πŸ“…︎ May 03 2021
🚨︎ report
How does quantum computing affect CS theory?

I've read very little about quantum computing, tbh, but from what I've read so far, it doesn't seem to bring anything new to the table except faster processing: more hashes. It doesn't seem to change the way we approach problems or build proofs, since we have to deal with qubits as a 0 or a 1 in the end. Is that actually true, or is my knowledge of the topic too shallow?
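For what it's worth, the "0 or 1 in the end" part is true, but what's new is what happens before readout: amplitudes can interfere. A tiny NumPy sketch (a classical simulation, not a real qubit) of a Hadamard gate illustrates this:

```python
import numpy as np

# One qubit as a 2-component complex state vector.
ket0 = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0                      # equal superposition (|0> + |1>)/sqrt(2)
probs = np.abs(state) ** 2            # Born rule: readout is still a classical bit
print(probs)                          # [0.5 0.5]

state = H @ state                     # but amplitudes interfere before readout:
print(np.abs(state) ** 2)             # [1. 0.] -- applying H twice returns |0> deterministically
```

Algorithms like Grover's and Shor's arrange that interference so wrong answers cancel, which is why BQP is widely conjectured to differ from classical polynomial time. That is a genuinely new question for CS theory, not just faster hashing.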

πŸ‘︎ 142
πŸ“°︎ r/csMajors
πŸ’¬︎
πŸ‘€︎ u/Vanilla_mice
πŸ“…︎ May 02 2021
🚨︎ report
As I am doing my PhD in Medical Image Computing, I designed a LEGO Radiology set to inspire the next generation of scientists. If you find my design valuable please support it on LEGO Ideas :) reddit.com/gallery/mtcdwi
πŸ‘︎ 439
πŸ“°︎ r/PhD
πŸ’¬︎
πŸ‘€︎ u/kobalt93
πŸ“…︎ Apr 18 2021
🚨︎ report
Solarpunk Computing

Using tech available to us right now, what would a solarpunk laptop computer look like?

A few thoughts of mine would include the following features.

  • An e-paper screen for low power usage and ease of use in a wide variety of environments.

  • A raspberry Pi as the brains (again low power consumption and a high return on computing muscle per energy unit) booting off an SD card.

  • A high quality mechanical (buckling spring) USB keyboard with a USB hub built into it that can accept several USB thumb drives for storage memory made available when the keyboard is plugged in, with an accessible port for an external hard drive to back it up easily.

  • A fairly beefy capacitor bank for power smoothing and rapid energy storage.

  • A photovoltaic panel for charging said capacitor bank, deployed nearby, to keep this rig off the electric grid.

  • Some basic weatherproofing so a rainstorm doesn't short it out. Yes, it won't charge during said storm, but if it's like the weather up here, it just has to outlast the storm or enter a low-power mode and hibernate until sunlight is available again.

It won't satisfy a hardcore gamer or a demanding corporate exec, but it would meet the average person's needs: portable, lightweight, and when you need more computing power, you use it as a terminal to access a beefier system elsewhere on the internet.
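As a rough sanity check on the power budget (every number below is an assumption pulled from typical spec sheets, not a measurement):

```python
# Back-of-envelope energy budget for the rig above. All figures are
# assumptions, not measurements.
LOAD_W = 4.0          # assumed Raspberry Pi + e-paper average draw (watts)
PANEL_W = 20.0        # assumed panel output in decent sun (watts)
SUN_HOURS = 4.0       # assumed usable sun hours per day
CAP_BANK_WH = 30.0    # assumed usable storage in the capacitor bank (watt-hours)

harvest_wh = PANEL_W * SUN_HOURS    # energy in per day: 80.0 Wh
use_wh = LOAD_W * 24                # energy out per day if always on: 96.0 Wh
print(harvest_wh, use_wh)           # always-on doesn't quite close the budget
print(CAP_BANK_WH / LOAD_W)         # 7.5 hours of dark-time runtime on the cap bank
```

Under these assumed numbers, always-on operation runs a small daily deficit, which is exactly why the hibernate-through-bad-weather mode matters.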

πŸ‘︎ 128
πŸ“°︎ r/solarpunk
πŸ’¬︎
πŸ‘€︎ u/ArenYashar
πŸ“…︎ Apr 19 2021
🚨︎ report
Tried to create a GIF of Cloud Computing. P.S. I am not the original creator of the particular design. I used Illustrator for shapes and After Effects for animation. Design idea from: www.wallpaperaccess.com. v.redd.it/24k0c2fcouy61
πŸ‘︎ 145
πŸ“°︎ r/AfterEffects
πŸ’¬︎
πŸ‘€︎ u/skalbin97
πŸ“…︎ May 13 2021
🚨︎ report
β€˜Brain-like device’ mimics human learning in major computing breakthrough independent.co.uk/life-st…
πŸ‘︎ 88
πŸ“°︎ r/technology
πŸ’¬︎
πŸ‘€︎ u/Pessimist2020
πŸ“…︎ Apr 30 2021
🚨︎ report
COMPUTING.GIF
πŸ‘︎ 307
πŸ“°︎ r/VaporwaveArt
πŸ’¬︎
πŸ‘€︎ u/Elficidium
πŸ“…︎ May 08 2021
🚨︎ report
Is access to computing education at a University Level restricted?

Recently the MAS Chief spoke about Singapore having to rely on foreign manpower for our IT sector, because there is a lack of local workers for that sector.

However as of right now, the IGP for all 4 of NUS School of Computing courses is AAA/A, with the poly route GPA requirement being min 3.75. (10th percentile)

Source: https://www.nus.edu.sg/oam/undergraduate-programmes/indicative-grade-profile-(igp)

This seems a bit excessive. If indeed the future of the Singaporean economy is tech-dependent (in both the narrower IT sector and the broader Smart Nation sense), then why is computing education restricted to such a narrow band of top performers? Our universities are not short on budget for facilities and instructors, and the training for computing is not so difficult that only the very best Singaporeans are capable of it (if A Level grades are even representative of the forms of intelligence relevant for tech).

So it appears to be a bottleneck. If we look at undergraduate enrollment, it is increasing, but at a sluggish pace. I have also heard from NUS senior management that even these modest increases came only after quite persistent pressure from senior management.

(Google NUS Undergraduate Enrollment for source, link very long)

If we Google for the NUS graduate salary survey, we can also see that computing graduates earn significantly above the average market rate of most other graduates.

So from the MAS comments and the salaries of fresh computing graduates, we know there is strong demand. From the high IGP, we also know the School of Computing is oversubscribed. So the bottleneck can only be the universities, which, due to their rather monopolistic position, are not letting Singaporean labour be trained to meet market demand. Why they would choose to do so is beyond me.

However unless we expand our tech training (and certification) and make it more accessible to those willing and able to learn, the market will eventually pressure the government to let in foreign alternatives (which it is already doing). Given the incumbent party's well founded (but not entirely holistically conceived) intention to upgrade our national economy, they cannot resist the market pressure forever.

There is no reason an AAB/B student with the willingness to learn and an eagerness to earn a good living should not be allowed to be trained in the tech skills necessary to improve the productivity of our economy.

πŸ‘︎ 31
πŸ“°︎ r/singapore
πŸ’¬︎
πŸ“…︎ May 07 2021
🚨︎ report
Computing question on AI, please help someone! Thanks.
πŸ‘︎ 125
πŸ“°︎ r/GCSE
πŸ’¬︎
πŸ‘€︎ u/Lolmaojeez
πŸ“…︎ May 02 2021
🚨︎ report
So... Yall Did NOT Like Distributed Computing CS7210

I was looking to be annoyed, so I was trawling OMSCentral for some trolls and, lo and behold, I found it~! Who is going out of their way to ruin the hrs/week average, skewing it towards 100 hours a week? We all know there aren't even 100 hours in a single week, so the joke is on you, trolls!

And then..... it was another... and another ... so many 100 hrs/week..

Jokes aside: I would love to hear some more trash talk on this course and have this thread be a place for venting about (not attacking) it.

πŸ‘︎ 33
πŸ“°︎ r/OMSCS
πŸ’¬︎
πŸ‘€︎ u/p_h_a_e_d_r_u_s
πŸ“…︎ May 12 2021
🚨︎ report
TIL of Smart Dust, a miniature sensing chip with an autonomous power supply, computing and wireless communication in a space that is typically only a few millimeters in volume. With such a small size, these devices can stay suspended in an environment just like a particle of dust. theneweconomy.com/technol…
πŸ‘︎ 72
πŸ“°︎ r/todayilearned
πŸ’¬︎
πŸ‘€︎ u/sundog925
πŸ“…︎ May 04 2021
🚨︎ report
NVIDIA Announces CPU for Giant AI and High Performance Computing Workloads nvidianews.nvidia.com/new…
πŸ‘︎ 83
πŸ“°︎ r/AMD_Stock
πŸ’¬︎
πŸ‘€︎ u/robmafia
πŸ“…︎ Apr 12 2021
🚨︎ report
Computing in the 90's VS computing in 2018
πŸ‘︎ 19k
πŸ“°︎ r/ProgrammerHumor
πŸ’¬︎
πŸ‘€︎ u/leapin09
πŸ“…︎ Jan 24 2021
🚨︎ report
ELI5: how does quantum computing work, and how do you even make a quantum computer?

I’ve been curious about quantum physics for a while, specifically in computing. How does a quantum computer use superpositions to compute things? Aren’t they random?

πŸ‘︎ 146
πŸ“°︎ r/QuantumPhysics
πŸ’¬︎
πŸ‘€︎ u/A-very-gay-boo
πŸ“…︎ Apr 24 2021
🚨︎ report
White House launches AI website!!!! β€œThis is a resource that will enable researchers from all over the country to have access to both the computing and the data that they need in order to do cutting edge research” artificialintelligence-ne…
πŸ‘︎ 123
πŸ“°︎ r/artificial
πŸ’¬︎
πŸ‘€︎ u/abbumm
πŸ“…︎ May 06 2021
🚨︎ report
β€˜Brain-like device’ mimics human learning in major computing breakthrough - US, Chinese team of scientists have developed a device modelled on the human brain that can learn by association in the same way as Pavlov’s dog.Device can β€˜directly interface with living tissue’ for next-gen bioelectronics independent.co.uk/life-st…
πŸ‘︎ 238
πŸ“°︎ r/singularity
πŸ’¬︎
πŸ‘€︎ u/QuantumThinkology
πŸ“…︎ Apr 30 2021
🚨︎ report
Does anyone else think that cryptocurrencies are becoming a threat to FOSS and local computing?

With crypto mining creating an insatiable demand for computing resources...

  • Formerly free build services for FOSS software are becoming paid, because people are uploading fake build systems that mine cryptocurrency instead of compiling legitimate free software...

  • First GPU, and now HDD and SSD prices are through the roof, but only for regular people, not the cloud or prebuilt PC makers, at least until their long-term contracts expire...

  • DIY PC building has taken a hit due to the prices rising, and I figure most people won't be modifying prebuilt PCs for fear of voiding the warranty, whether justified or not...

  • Local storage is taking a hit as well... If worst comes to worst, regular people will no longer be able to afford much more than they absolutely need... Really bad news for hobbyist digital librarians and P2P... Thanks, crypto miners, for stealing our memories!

  • Gamers are warming up to cloud gaming and consoles (with consoles obviously not being free software friendly), now that it only makes sense to own a decent PC GPU if you run it 24/7 to mine on it or rent it, which most people won't do because of the noise and/or residential electricity prices...

  • Will the hard drive shortage affect Linux and other open-source software mirrors? IMO the open-source software distribution model should be moving ASAP from .tar.xz packages to a file-based model similar to Git, Borg, and Restic, where duplicate files can be shared between different projects or different versions of the same project...
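That last point, the Git/Borg/Restic-style model, boils down to content addressing: store each unique file once under its hash, and let every package version reference it. A toy sketch (package and file names are invented for illustration):

```python
import hashlib

class ContentStore:
    """Toy content-addressed store: identical files across any number of
    packages or versions are stored once, keyed by their SHA-256 digest."""
    def __init__(self):
        self.blobs = {}        # digest -> file bytes (stored once)
        self.manifests = {}    # package name -> {path: digest}

    def add_package(self, name, files):
        manifest = {}
        for path, data in files.items():
            digest = hashlib.sha256(data).hexdigest()
            self.blobs.setdefault(digest, data)   # dedup happens here
            manifest[path] = digest
        self.manifests[name] = manifest

store = ContentStore()
store.add_package("foo-1.0", {"README": b"docs", "lib.so": b"build one"})
store.add_package("foo-1.1", {"README": b"docs", "lib.so": b"build two"})
print(len(store.blobs))   # 3 -- the unchanged README is stored only once
```

Two package versions with four files between them need only three stored blobs; at mirror scale, that kind of file-level sharing is exactly the storage saving the bullet is asking for.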

Disclaimer: I might be wrong, and I really hope I am, so please correct me, I'll edit the post...

πŸ‘︎ 43
πŸ“°︎ r/freesoftware
πŸ’¬︎
πŸ‘€︎ u/lamefun
πŸ“…︎ May 05 2021
🚨︎ report
Activision Blizzard CEO says advancements in cloud computing, AR, VR, and other technologies means we are "rapidly progressing" towards a Ready Player One-style metaverse gamespot.com/articles/act…
πŸ‘︎ 78
πŸ“°︎ r/Futurology
πŸ’¬︎
πŸ‘€︎ u/playertariat
πŸ“…︎ Apr 29 2021
🚨︎ report
Are we underestimating the threat of quantum computing?

Having spent the day looking at the current state of the quantum computing industry, I am left with the feeling that Bitcoin and other blockchains seem dangerously unprepared for the possible disintegration of public-key encryption (Bitcoin actually signs with ECDSA rather than RSA, but both are vulnerable to Shor's algorithm). That would mean private keys mean nothing, and bitcoin could be spent freely by anyone with access to such computing power.

As far as I can tell, the only ledger that deals with this head-on is QRL (Quantum Resistant Ledger); otherwise it seems to be a ridiculed non-issue for most hodlers.

The answers I have gotten when asking this are usually something along the lines of:

"Quantum computing is like cold fusion; the technology of the future that never actually happens." I disagree. https://www.mckinsey.com/business-functions/mckinsey-digital/our-insights/tech-forward/the-current-state-of-quantum-computing-between-hype-and-revolution

"If RSA encryption breaks, Bitcoin will be the last of our problems." I disagree; bitcoin seems like a great target to take advantage of without destroying the world.

"Someone is already working on this threat, don't worry about it, it will be fixed, just hodl, buy the dip, etc." Who? What exactly is being done?

"It's just FUD, leave me alone." Ok, fine, I will ask someone who actually thinks about this instead.

Is there any documentation of what is actually being done to prepare bitcoin (and others) for the seemingly inevitable destruction of RSA encryption?

Edit: forgot to add the link I actually meant to post: https://www.technologyreview.com/2019/05/30/65724/how-a-quantum-computer-could-break-2048-bit-rsa-encryption-in-8-hours/

πŸ‘︎ 64
πŸ“°︎ r/Bitcoin
πŸ’¬︎
πŸ‘€︎ u/h4v3anic3d4y
πŸ“…︎ Apr 08 2021
🚨︎ report
TIL a significant portion of the computer science community once believed Douglas Engelbart was "a crackpot." But then, on 12/9/1968, he demonstrated to critics and the world almost all the fundamental elements of modern computing. When he finished, they described him as "dealing lightning with both hands." en.wikipedia.org/wiki/The…
πŸ‘︎ 840
πŸ“°︎ r/todayilearned
πŸ’¬︎
πŸ‘€︎ u/WhileFalseRepeat
πŸ“…︎ Mar 30 2021
🚨︎ report
From my stats prof, as a computing major, I died inside
πŸ‘︎ 390
πŸ“°︎ r/badcode
πŸ’¬︎
πŸ‘€︎ u/kwokyto
πŸ“…︎ Apr 09 2021
🚨︎ report
New to cloud computing. Looking for a cloud-based Windows instance from which I can live stream desktop software to YouTube 24 hours a day without stopping. Most affordable option?

Sounds odd, I know. Some important details:

  • The content software is fairly simple. No more taxing than a typical web app.
  • Fast internet/network - interactivity with viewers is important, so latency should be minimal (this might be standard, I'm not sure)

Best options from Azure or otherwise?

Edit: cloud computing might not be the terminology that's best suited to what I'm looking for. Essentially I'm just trying to look into options for a dependable remote Windows instance that I don't have to personally maintain.

πŸ‘︎ 10
πŸ“°︎ r/AZURE
πŸ’¬︎
πŸ‘€︎ u/roleparadise
πŸ“…︎ May 13 2021
🚨︎ report
[Uni] NUS Master of Computing | Ask Me Anything!

Since data science, machine learning and AI are all the rage right now, I figured that a lot of people would be curious about the different ways to break into this field and what a career in data science entails. Also, with the COVID-19 pandemic still looming in the background, many of you (including my close friends) might be considering taking up a graduate programme or various online certifications as an investment in your career. However, with so many choices available, it is indeed a daunting decision.

I'm a current student in the NUS MComp programme and would be more than happy to share some of my experiences in the programme and at the university. Feel free to ask me anything (or DM me)!

Cheers and stay safe.

πŸ‘︎ 33
πŸ“°︎ r/SGExams
πŸ’¬︎
πŸ‘€︎ u/titlwayh55
πŸ“…︎ May 10 2021
🚨︎ report
eProcessor is a project that will create an open-source RISC-V core for High Performance Computing (HPC) hpcwire.com/off-the-wire/…
πŸ‘︎ 220
πŸ“°︎ r/linux
πŸ’¬︎
πŸ‘€︎ u/wiki_me
πŸ“…︎ Apr 19 2021
🚨︎ report
Cloud Computing Magic

So I thought of this as kind of a sideline to my main magic system, but now that I think on it, it could be quite interesting by itself.

The way I have it working is that nobody "knows" magic, but when they use it, they log onto the "cloud" where all the magic and spellbooks and the wisdom of generations is stored.

Casting a spell is akin to performing a Google search for "light a guy on fire" etc. The most popular version of that spell comes up, which is ideally the most effective, or the most reliable way of using whatever elemental energy you've channelled. You blast a dude, upvote the spell, and it becomes just a little easier for the next mage to use that same spell.

More niche spells take longer to search for, and so are harder to cast quickly in battle, for example. Healing magic might take a while because there are only a few ways to light a guy on fire, but a lot of different injuries that need different spells to heal. The spellbooks in my world don't tell you how to cast a spell, but list effects of spells, good ones to try to search for, and ideal use cases.

You can of course create new spells, or, more likely, find ones that are defunct and slightly change the way they're cast. Even the popular ones change over time, as each mage's personal version of "light a guy on fire" gets added to the cloud. Mages who can regularly pull a cool spell from the "deep web" are known as wizards of great ability. Who knows: if they cast their custom version of a spell often enough, it might become a fad, and therefore much more powerful.
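If it helps to pin the mechanics down, the cloud could be modelled as a popularity-ranked spell registry where every cast is also an upvote. A toy sketch (spell names and vote counts are invented):

```python
# Toy model of the magic cloud: spells are keyed by their effect,
# ranked by upvotes, and every successful cast nudges the ranking.
spells = {
    "light a guy on fire": [{"name": "Ignis Vulgaris", "votes": 9001},
                            {"name": "Deepweb Pyre", "votes": 3}],
}

def cast(effect):
    """Return the most-upvoted version of a spell for this effect and
    upvote it, making it 'a little easier for the next mage'."""
    best = max(spells[effect], key=lambda s: s["votes"])
    best["votes"] += 1
    return best["name"]

print(cast("light a guy on fire"))   # Ignis Vulgaris
```

The rich-get-richer feedback loop is the interesting worldbuilding lever: popular spells entrench themselves, while a wizard's obscure "deep web" find only becomes powerful if it picks up casts.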

Let me know your thoughts on this.

Edit - This makes for my first award ever on Reddit. I'm touched.

πŸ‘︎ 80
πŸ“°︎ r/magicbuilding
πŸ’¬︎
πŸ‘€︎ u/crazydave11
πŸ“…︎ Apr 30 2021
🚨︎ report
Is stadia too frugal with computing power for RE8?

I have tried the RE8 demo on Stadia and on my gaming PC, which has a GTX 1080. Even though the GTX 1080 is only capable of about 9 TFLOPS, its image is significantly better than Stadia's, which should be capable of 10-11 TFLOPS.

Yes, I know the GTX 1080 is a beefy card, but it's getting fairly old now.

I have tried comparing other games and the difference isn't usually so jarring.

I love stadia and buy many games on it (since I travel a lot it's just more convenient), but I think I might skip RE8 on stadia since the PC version looks so damn good.

Anyone else feel the same way? Is Google able to select how many TFLOPS it assigns per virtual machine, and if so, is it being too frugal?

πŸ‘︎ 12
πŸ“°︎ r/Stadia
πŸ’¬︎
πŸ‘€︎ u/Lynks0
πŸ“…︎ May 05 2021
🚨︎ report
TIL the Girl Scouts had a "Computing Fun" badge in the 1980s that featured the binary code 00111 10011 on its patch. This translated to 7 and 19, and when mapped onto the order of the alphabet stands for G S, the Girl Scouts. fastcompany.com/3064842/w…
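The decoding described in the title is easy to check with a few lines (the 1 → A mapping is exactly the badge's scheme as described):

```python
def decode_badge(bits):
    """Map space-separated 5-bit binary groups to letters, 1 -> A, 2 -> B, ..."""
    return "".join(chr(ord("A") + int(group, 2) - 1) for group in bits.split())

print(decode_badge("00111 10011"))  # GS -- 7 -> G, 19 -> S
```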
πŸ‘︎ 3k
πŸ“°︎ r/todayilearned
πŸ’¬︎
πŸ‘€︎ u/Mike_ZzZzZ
πŸ“…︎ Mar 14 2021
🚨︎ report
USPS gets ahead of missing packages with AI edge computing federalnewsnetwork.com/ar…
πŸ‘︎ 48
πŸ“°︎ r/technology
πŸ’¬︎
πŸ‘€︎ u/thinkB4WeSpeak
πŸ“…︎ May 10 2021
🚨︎ report
Retro computing/vintage gaming youtuber starter pack
πŸ‘︎ 216
πŸ“°︎ r/starterpacks
πŸ’¬︎
πŸ‘€︎ u/SN74HC04
πŸ“…︎ Apr 29 2021
🚨︎ report
The amount of computing power needed to fake a CGI Moon landing is significantly more than the amount used to actually land on the Moon.
πŸ‘︎ 70
πŸ“°︎ r/Showerthoughts
πŸ’¬︎
πŸ‘€︎ u/Justryan95
πŸ“…︎ May 03 2021
🚨︎ report
Humble Book Bundle: Azure Cloud Computing by Springer humblebundle.com/books/az…
πŸ‘︎ 44
πŸ“°︎ r/humblebundles
πŸ’¬︎
πŸ‘€︎ u/Torque-A
πŸ“…︎ May 10 2021
🚨︎ report
AMD CEO: 'our goal is to bring the best of computing to the industry' finance.yahoo.com/video/a…
πŸ‘︎ 104
πŸ“°︎ r/AMD_Stock
πŸ’¬︎
πŸ‘€︎ u/AMD_winning
πŸ“…︎ Apr 29 2021
🚨︎ report
IBM unveils 2-nanometer chip technology for faster computing reuters.com/article/ibm-s…
πŸ‘︎ 29
πŸ“°︎ r/AMD_Stock
πŸ’¬︎
πŸ‘€︎ u/AMD_winning
πŸ“…︎ May 06 2021
🚨︎ report
Is it believable that Distributed Computing needs so much time?

OMSCentral shows the average workload for DC is 78 hours a week. Is that possible? That means about 11 hours every single day. I don't think anyone can spend that much time. Or maybe people just randomly write down the hours; I do see many people write 100 hours.

πŸ‘︎ 42
πŸ“°︎ r/OMSCS
πŸ’¬︎
πŸ“…︎ Apr 18 2021
🚨︎ report
Northwestern and University of Hong Kong develop computing device that emulates human brain dailynorthwestern.com/202…
πŸ‘︎ 151
πŸ“°︎ r/technews
πŸ’¬︎
πŸ‘€︎ u/QuantumThinkology
πŸ“…︎ May 04 2021
🚨︎ report
The former Information Computing Centre in PanevΔ—ΕΎys, Lithuania. Built in 1987, designed by ArΕ«nas BlΕ«Ε‘ius. reddit.com/gallery/n8x2da
πŸ‘︎ 300
πŸ’¬︎
πŸ“…︎ May 10 2021
🚨︎ report
