“NVIDIA today announced its first data center CPU, an Arm-based processor that will deliver 10x the performance of today’s fastest servers on the most complex AI and high performance computing workloads.
The result of more than 10,000 engineering years of work, the NVIDIA Grace™ CPU is designed to address the computing requirements for the world’s most advanced applications — including natural language processing, recommender systems and AI supercomputing — that analyze enormous datasets requiring both ultra-fast compute performance and massive memory. It combines energy-efficient Arm CPU cores with an innovative low-power memory subsystem to deliver high performance with great efficiency.
“Leading-edge AI and data science are pushing today’s computer architecture beyond its limits – processing unthinkable amounts of data,” said Jensen Huang, founder and CEO of NVIDIA. “Using licensed Arm IP, NVIDIA has designed Grace as a CPU specifically for giant-scale AI and HPC. Coupled with the GPU and DPU, Grace gives us the third foundational technology for computing, and the ability to re-architect the data center to advance AI. NVIDIA is now a three-chip company.”
Shares are up 4% on the news. Wonder what that means for the Arm deal.
Raised guidance too, wow.
So, during a debate on another thread I encountered a flat Earther making the following claim:
>No military take curvature or spin into account for any targeting. This is a very old argument. Seems you are just regurgitating the same old arguments. Everything in your comment has not only been disproven, it has been absolutely destroyed in reality.
This surprised me, as it is extremely well-understood that militaries do take these factors into account. That individual has elected not to review any of this evidence (no surprise), but I felt it would go to waste languishing ten comments deep in a flat Earth thread, so I reprise it here.
I submit that in fact there is a strong body of evidence that militaries do have to account for Earth's rotation when aiming ballistic weaponry at long ranges.
I will break this down into evidence sections for organisation... this is only a fraction of the evidence out there, but I have to stop somewhere :)
>CORRECTIONS TO RANGE/AZIMUTH TO COMPENSATE FOR THE ROTATION OF THE EARTH.
There is also explicit discussion of it throughout the text, for example here on p. 399:
>compensate for the effects of ballistic wind direction and speed and for rotation of the earth throughout the firing unit’s area of responsibility. These corrections are combined with the position constants determined in the concurrent met by solving a subsequent met in each 800-mil sector or selected 800-mil sectors. (See Figure 11-32.)
>*Rotation of Earth.* Although the rotation of the earth is a natural phenomenon, it is treated as a nonstandard condition for simplicity in the construction of firing tables. The magnitude and direction of projectile displacement from the target owing to rotational effects are functions of azimuth of fire, time of flight, projectile velocity, and relative position of piece and target with respect to the Equato...
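For intuition about why firing tables include these corrections, the flat-trajectory approximation of Coriolis drift can be sketched in a few lines. The formula d = Ω·v·t²·sin(latitude) is the standard first-order estimate; the sample numbers below are illustrative and not taken from any firing table:

```python
import math

OMEGA = 7.292e-5  # Earth's angular velocity, rad/s

def coriolis_drift(v_mps: float, t_flight_s: float, lat_deg: float) -> float:
    """Approximate lateral Coriolis deflection (metres) for a roughly
    flat trajectory: d = Omega * v * t^2 * sin(latitude).
    The deflection is to the right in the northern hemisphere,
    to the left in the southern."""
    return OMEGA * v_mps * t_flight_s**2 * math.sin(math.radians(lat_deg))

# Illustrative shell: 800 m/s average velocity, 40 s time of flight,
# fired at 45 degrees latitude.
drift = coriolis_drift(800, 40, 45)
print(f"{drift:.0f} m")  # roughly 66 m of deflection over the flight
```

Tens of metres of lateral error over a long flight is far more than acceptable artillery dispersion, which is exactly why the manuals treat rotation as a condition to correct for.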
I am not entirely sure if this is the right place to ask this, but here I go. I recently got an opportunity to work on a quantum computing team. The entire purpose of the team is to find applications of quantum computing in our firm. It sounds like an amazing opportunity, but I am having a lot of reservations. Would love and appreciate your insights.
I come from a CS engineering background with a master's in AI/ML; my full-time position at the moment is as a data scientist working on clustering models. The quantum computing opportunity arose from the formation of a new team, which has a lot of vacancies at the moment.
Although the prospect of quantum AI/ML sounds hella interesting (although very niche), quantum computing itself seems like an extreme departure from my formal education, with its heavy emphasis on physics. This is making me anxious.
How viable is this opportunity for my career? What are the job prospects in the domain (for someone with my background)? Would it help me carve out a niche for myself in data science? I can admit shamelessly that I'd like a career that pays the big bucks while keeping me interested.
Would love your opinions
EDIT: I am very thankful for this amazing community. Thank you for all your responses. Holy shit, that's a lot to take in.
I've read very little about quantum computing tbh, but from what I've read so far, it doesn't seem to bring anything new to the table except faster processing (more hashes). It doesn't seem to change the way we approach problems or build proofs, since we have to deal with qubits as a 0 or a 1 in the end. Is that actually true, or is my knowledge of the topic too shallow?
Using tech available to us right now, what would a solarpunk laptop computer look like?
A few thoughts of mine would include the following features.
An e-paper screen for low power usage and ease of use in a wide variety of environments.
A Raspberry Pi as the brains (again, low power consumption and a high return of computing muscle per energy unit), booting off an SD card.
A high-quality mechanical (buckling-spring) USB keyboard with a built-in USB hub that can accept several USB thumb drives for storage, made available when the keyboard is plugged in, plus an accessible port for an external hard drive for easy backups.
A fairly beefy capacitor bank for power smoothing and rapid energy storage.
A photovoltaic panel for charging said capacitor bank, deployed nearby, to keep this rig off the electric grid.
Some basic weatherproofing so a rainstorm doesn't short it out. Yes, it won't charge during said storm, but if the weather is like it is up here, it just has to outlast the storm, or enter a low-power mode and hibernate until sunlight is available again.
It won't satisfy a hardcore gamer or a demanding corporate exec, but it would meet the average person's needs. Portable, lightweight, and when you need more computing power, you use it as a terminal to access a beefier system elsewhere on the internet.
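A quick back-of-envelope check that the capacitor bank idea holds up: the usable energy between two voltages is E = ½C(V²max − V²min). All numbers below (bank size, voltages, load) are illustrative guesses, not measurements of any real hardware:

```python
def runtime_hours(capacitance_f, v_max, v_min, load_w):
    """Usable energy between v_max and v_min: E = 1/2 * C * (v_max^2 - v_min^2).
    Returns how long that energy runs a constant load, in hours."""
    usable_j = 0.5 * capacitance_f * (v_max**2 - v_min**2)
    return usable_j / load_w / 3600

# Hypothetical bank: 500 F effective at 12 V, usable down to 6 V,
# with a ~1.5 W draw (Pi Zero plus a mostly idle e-paper display).
print(round(runtime_hours(500, 12.0, 6.0, 1.5), 1))  # -> 5.0 hours
```

Roughly five hours of rainstorm autonomy from a modest supercap bank, which suggests the hibernate-through-bad-weather plan is plausible for a low-power rig like this.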
Recently the MAS Chief spoke about Singapore having to rely on foreign manpower for our IT sector, because there is a lack of local workers for that sector.
However, as of right now, the IGP (Indicative Grade Profile) for all 4 of the NUS School of Computing's courses is AAA/A, with the poly-route GPA requirement being a minimum of 3.75 (10th percentile).
This seems a bit excessive. If indeed the future of the Singaporean economy is tech-dependent (in both the narrower IT sector and the broader Smart Nation sense), then why is computing education restricted to such a narrow band of top performers? Our universities are not short on budget for facilities and instructors, and the training for computing is not so difficult that only the very best Singaporeans are capable of it (if A Level grades are even representative of the forms of intelligence relevant for tech).
So it appears to be a bottleneck. If we look at undergraduate enrolment, while it is increasing, the pace is still sluggish. I have also heard from NUS senior management that these modest increases came only after quite persistent pressure from them.
(Google NUS Undergraduate Enrollment for source, link very long)
If we Google the NUS graduate salary survey, we can also see that computing graduates earn significantly more than graduates of most other courses.
So from the MAS comments and the salaries of fresh computing graduates, we know there is strong demand. From the high IGP, we also know the School of Computing is oversubscribed. So the bottleneck can only be the universities, which, thanks to their rather monopolistic position, are not letting Singaporean labour be trained to meet market demand. Why they would choose to do so is beyond me.
However, unless we expand our tech training (and certification) and make it more accessible to those willing and able to learn, the market will eventually pressure the government to let in foreign alternatives (which it is already doing). Given the incumbent party's well-founded (but not entirely holistically conceived) intention to upgrade our national economy, it cannot resist the market pressure forever.
There is no reason an AAB/B student with a willingness to learn and an eagerness to earn a good living should not be allowed to train in the tech skills necessary to improve the productivity of our economy.
I was looking to be annoyed, so I was trawling OMSCentral for some trolls, and lo and behold, I found one! Who is going out of their way to ruin the hrs/week stat by skewing it towards 100 hours a week? We all know nobody has 100 free hours in a week, so the joke is on you, trolls!
And then... it was another... and another... so many 100 hrs/week entries...
Jokes aside: I'd love to hear some more trash talk about this course and have this thread be a venting space (not an attack) aimed at it.
I’ve been curious about quantum physics for a while, specifically in computing. How does a quantum computer use superpositions to compute things? Aren’t they random?
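The short answer is that a single measurement is random, but computation happens in the amplitudes before measurement, where interference can make outcomes certain. A toy stdlib-only simulation of one qubit makes the point: one Hadamard gate gives a 50/50 coin flip, but a second Hadamard makes the amplitudes interfere and the result becomes deterministic again.

```python
import math

# Hadamard gate as a 2x2 matrix acting on a qubit's two amplitudes.
H = [[1/math.sqrt(2),  1/math.sqrt(2)],
     [1/math.sqrt(2), -1/math.sqrt(2)]]

def apply(gate, state):
    """Multiply a 2x2 gate into a 2-amplitude state vector."""
    return [gate[0][0]*state[0] + gate[0][1]*state[1],
            gate[1][0]*state[0] + gate[1][1]*state[1]]

state = [1.0, 0.0]            # qubit starts in |0>
state = apply(H, state)       # equal superposition of |0> and |1>
probs = [abs(a)**2 for a in state]
print([round(p, 6) for p in probs])  # -> [0.5, 0.5]: measuring HERE is random

state = apply(H, state)       # second Hadamard: the |1> paths cancel out
probs = [abs(a)**2 for a in state]
print([round(p, 6) for p in probs])  # -> [1.0, 0.0]: |0> with certainty
```

Quantum algorithms are built around that second effect: gates are arranged so wrong answers interfere destructively and right answers constructively, which is why the final measurement is useful rather than just noise.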
With crypto mining creating an insatiable demand for computing resources...
Formerly free build services for FOSS software are becoming paid, because people are uploading fake build systems that mine cryptocurrency instead of compiling legitimate free software...
First GPU prices, and now HDD and SSD prices, are through the roof, but only for regular people, not the cloud providers or prebuilt PC makers, at least until their long-term contracts expire...
DIY PC building has taken a hit due to the prices rising, and I figure most people won't be modifying prebuilt PCs for fear of voiding the warranty, whether justified or not...
Local storage is taking a hit as well... If worst comes to worst, regular people will no longer be able to afford much more than they absolutely need... Really bad news for hobbyist digital librarians and P2P... Thanks, crypto miners, for stealing our memories!
Gamers are warming up to cloud gaming and consoles (with consoles obviously not being free-software friendly), now that it only makes sense to own a decent PC GPU if you run it 24/7 to mine on it or rent it out, which most people won't do because of the noise and/or residential electricity prices...
Will the hard drive shortage affect Linux and other open-source software mirrors? IMO the open-source software distribution model should move ASAP from .tar.xz packages to a file-based model similar to Git, Borg, and Restic, where duplicate files can be shared between different projects or different versions of the same project...
Disclaimer: I might be wrong, and I really hope I am, so please correct me, I'll edit the post...
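The dedup model being suggested is content addressing: key every file by its hash so identical files across projects or versions are stored (and mirrored) exactly once. A minimal sketch, with a made-up `ContentStore` class and toy package contents purely for illustration:

```python
import hashlib

class ContentStore:
    """Toy content-addressed store: files are keyed by their SHA-256,
    so identical files shared across packages are stored only once."""
    def __init__(self):
        self.blobs = {}      # sha256 hex digest -> file bytes
        self.manifests = {}  # package name -> {path: digest}

    def add_package(self, name, files):
        manifest = {}
        for path, data in files.items():
            digest = hashlib.sha256(data).hexdigest()
            self.blobs.setdefault(digest, data)  # dedup happens here
            manifest[path] = digest
        self.manifests[name] = manifest

store = ContentStore()
store.add_package("proj-1.0", {"README": b"hello", "lib.so": b"\x7fELF v1"})
store.add_package("proj-1.1", {"README": b"hello", "lib.so": b"\x7fELF v2"})
# README is identical between versions, so only 3 unique blobs exist:
print(len(store.blobs))  # -> 3
```

Git, Borg, and Restic all work on this principle (Borg and Restic additionally chunk files so even partially changed files dedup), which is why the post's comparison to them is apt.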
Having spent the day looking at the current state of the quantum computer industry, I am left with a feeling that Bitcoin and other blockchains seem dangerously unprepared for the possible disintegration of public-key cryptography like RSA (Bitcoin actually signs with ECDSA, which Shor's algorithm breaks just as thoroughly). That would mean private keys mean nothing, and bitcoin could be spent freely by anyone with access to such computing power.
As far as I can tell, the only ledger that deals with this head-on is QRL (Quantum Resistant Ledger); otherwise it seems to be a ridiculed non-issue for most hodlers.
The answers I have gotten when asking this are usually something along the lines of:
"Quantum computing is like cold fusion; the technology of the future that never actually happens." I disagree. https://www.mckinsey.com/business-functions/mckinsey-digital/our-insights/tech-forward/the-current-state-of-quantum-computing-between-hype-and-revolution
"If RSA encryption breaks, Bitcoin will be the least of our problems." I disagree; Bitcoin seems like a great target to exploit without destroying the world.
"Someone is already working on this threat, don't worry about it, it will be fixed, just hodl, buy the dip, etc." Who? What exactly is being done?
"It's just FUD, leave me alone." Ok, fine, I will ask someone who actually thinks about this instead.
Is there any documentation of what is actually being done to prepare bitcoin (and others) for the seemingly inevitable destruction of RSA encryption?
Edit: forgot to add the link I actually meant to post: https://www.technologyreview.com/2019/05/30/65724/how-a-quantum-computer-could-break-2048-bit-rsa-encryption-in-8-hours/
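For what it's worth, the post-quantum signature schemes that projects like QRL use (XMSS) are built from hash-based one-time signatures, whose security rests only on hash preimage resistance, which Shor's algorithm does not touch. A minimal sketch of the classic Lamport one-time signature (this is an educational toy, not production crypto; each key pair must sign at most one message):

```python
import hashlib, secrets

def keygen():
    """256 pairs of random secrets; the public key is their hashes."""
    sk = [[secrets.token_bytes(32), secrets.token_bytes(32)] for _ in range(256)]
    pk = [[hashlib.sha256(x).digest() for x in pair] for pair in sk]
    return sk, pk

def _bits(msg: bytes):
    digest = hashlib.sha256(msg).digest()
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(msg: bytes, sk):
    """Reveal one secret per bit of the message hash."""
    return [sk[i][b] for i, b in enumerate(_bits(msg))]

def verify(msg: bytes, sig, pk) -> bool:
    """Each revealed secret must hash to the matching public-key entry."""
    return all(hashlib.sha256(sig[i]).digest() == pk[i][b]
               for i, b in enumerate(_bits(msg)))

sk, pk = keygen()
sig = sign(b"send 1 BTC to Alice", sk)
print(verify(b"send 1 BTC to Alice", sig, pk))    # -> True
print(verify(b"send 1 BTC to Mallory", sig, pk))  # -> False
```

The trade-offs (large signatures, one use per key) are exactly what schemes like XMSS and SPHINCS+ engineer around, so "what is being done" has at least one concrete, well-studied answer.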
Sounds odd, I know. Some important details:
Best options from Azure or otherwise?
Edit: cloud computing might not be the terminology that's best suited to what I'm looking for. Essentially I'm just trying to look into options for a dependable remote Windows instance that I don't have to personally maintain.
Since data science, machine learning and AI are all the rage right now, I figured a lot of people would be curious about the different ways to break into this field and what a career in data science entails. Also, with the COVID-19 pandemic still looming in the background, many of you (including my close friends) might be considering a graduate programme or various online certifications as an investment in your career. However, with so many choices available, it is indeed a daunting decision.
I'm a current student in the NUS MComp programme and would be more than happy to share some of my experiences in the programme and also the university. Feel free to ask me anything (or DM me)!
Cheers and stay safe.
So I thought of this as kind of a sideline to my main magic system, but now I think on it, it could be quite interesting by itself.
The way I have it working is that nobody "knows" magic, but when they use it, they log onto the "cloud" where all the magic and spellbooks and the wisdom of generations is stored.
Casting a spell is akin to performing a Google search for "light a guy on fire" etc. The most popular version of that spell comes up, which is ideally the most effective, or the most reliable way of using whatever elemental energy you've channelled. You blast a dude, upvote the spell, and it becomes just a little easier for the next mage to use that same spell.
More niche spells take longer to search for, and so are harder to cast quickly in battle, for example. Healing magic might take a while because there are only a few ways to light a guy on fire, but a lot of different injuries that need different spells to heal. The spellbooks in my world don't tell you how to cast a spell, but list effects of spells, good ones to try to search for, and ideal use cases.
You can of course create new spells, or more likely find ones that are defunct and slightly change the way they're cast. Even the popular ones change over time, as each mage's personal version of "light a guy on fire" gets added to the cloud. Mages who can regularly pull a cool spell from the "deep web" are known as wizards of great ability. Who knows, if they cast their custom version of a spell often enough it might become a fad, and therefore much more powerful.
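For fun, the system as described is basically popularity-ranked search with a feedback loop, and it sketches neatly (class name, ranking formula, and numbers all invented to match the description, not anything canonical):

```python
import math

class SpellCloud:
    """Toy model of the magic cloud: casting is a search ranked by
    upvotes, and popular spells resolve faster."""
    def __init__(self):
        self.spells = {}  # effect -> {variant name: upvotes}

    def upload(self, effect, variant):
        self.spells.setdefault(effect, {}).setdefault(variant, 0)

    def cast(self, effect):
        variants = self.spells.get(effect)
        if not variants:
            return None, float("inf")       # nothing in the cloud to find
        variant, votes = max(variants.items(), key=lambda kv: kv[1])
        variants[variant] += 1              # casting a spell upvotes it
        search_time = 10 / (1 + math.log1p(votes))  # popular = faster
        return variant, search_time

cloud = SpellCloud()
cloud.upload("light a guy on fire", "classic fireball")
for _ in range(100):                        # a century of battle-mages
    cloud.cast("light a guy on fire")
variant, t = cloud.cast("light a guy on fire")
print(variant, round(t, 2))                 # the fad spell now casts quickly
```

The logarithm gives diminishing returns, which matches the worldbuilding: the hundredth upvote helps a spell far less than the first few, so niche healing spells stay slow to find while "light a guy on fire" stays snappy.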
Let me know your thoughts on this.
Edit - This makes for my first award ever on Reddit. I'm touched.
I have tried the RE8 demo on Stadia and on my gaming PC, which has a GTX 1080. Even though the GTX 1080 is capable of only 9 TFLOPS, its image is significantly better than the Stadia image, which should be backed by 10-11 TFLOPS.
Yes, I know the GTX 1080 is a beefy card, but it's getting fairly old now.
I have tried comparing other games and the difference isn't usually so jarring.
I love stadia and buy many games on it (since I travel a lot it's just more convenient), but I think I might skip RE8 on stadia since the PC version looks so damn good.
Anyone else feel the same way? Is Google able to select how many TFLOPS they assign per virtual machine, and if so, are they being too frugal with it?
OMSCentral shows the average workload for DC is 78 hours a week. How is that possible? That would mean more than 11 hours every single day. I don't think anyone can spend that much time. Or people just randomly write down hours; I do see many people writing 100 hours.
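A handful of 100-hour troll entries is enough to drag an average far above the typical student's experience, which is why the median is the better number to look at. A quick sketch with invented numbers (not real OMSCentral data):

```python
from statistics import mean, median

# Hypothetical self-reported hrs/week: mostly 12-25, plus three
# troll "100" entries. All numbers invented for illustration.
reports = [12, 15, 15, 18, 20, 20, 22, 25, 100, 100, 100]

print(round(mean(reports), 1))  # -> 40.6  (dragged up by the trolls)
print(median(reports))          # -> 20    (barely moves)
```

So a 78-hour average is entirely consistent with a course that most people finish in 15-25 hours a week plus a few joke reviews; check the median or the distribution before panicking.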