I have recently learned about hashing and the different types of hashing algorithms. While researching SHA-1, I came across the SHAttered project by Google, where they showcased a collision in SHA-1. All the headlines about this event mention an insanely high amount of computation, but they obviously didn't wait 6500 years for this, so how was it done? Did they use a lot of CPU servers, and is the 6500 years of CPU computation how long a single CPU would take? What CPU is that? I'm just so confused, sorry.
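The headline figure is aggregate CPU time, not wall-clock time: the same total work spread across many machines in parallel finishes far sooner. A toy calculation makes the point (the core count below is made up purely for illustration; it is not Google's actual machine count):

```python
cpu_years = 6500              # aggregate single-CPU-core time quoted in the headlines
cores = 78000                 # hypothetical number of CPU cores running in parallel (illustrative)

wall_clock_years = cpu_years / cores
wall_clock_months = wall_clock_years * 12
print(f"wall-clock time: about {wall_clock_months:.1f} month(s)")
```

So "6500 CPU-years" is a statement about total energy and effort, not about anyone waiting millennia.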
Book suggestions or PDF links to help me find out how cryptocurrencies are invented / generate backing / operate computationally / play out theoretically, overall... etc.
I want to see literal examples of this being done... pretty big ask, I suppose, as no one that informed is going to be chiming in here, I'm sure. But if anyone has any good educational material to suggest on the subject of founding or originating a cryptocurrency, it seems to be hard to come by online.
I'm rather confused here; it was my understanding that quantum computers simply allow specific things to be done much faster than traditional computers, but that a traditional Turing machine could still calculate them given enough time and space.
But the article on quantum simulators states that:
>A universal quantum simulator is a quantum computer proposed by Yuri Manin in 1980 and Richard Feynman in 1982. Feynman showed that a classical Turing machine would not be able to simulate a quantum effect, while his hypothetical universal quantum computer would be able to mimic needed quantum effect.
This seems to mean that there are calculations a quantum computer can do that a classical computer simply can't, regardless of time and space.
I don't really get how this can be possible. How can there be something a classical computer cannot calculate but a quantum computer can? Or how can a quantum computer calculate something that isn't calculable?
What is an example of these computations?
Assuming that human brains also take part in the simulation via thinking and reacting to VR, how much computational power would be needed to run virtual reality, to simulate (not necessarily perfectly) the world, animals, plants, whatever?
Based on Dominic's recent interview with OKEX: https://www.youtube.com/watch?v=-MCo94FLvf8
He explains how cycles are a stablecoin backed by the SDR (1 SDR ≈ $1.44 USD) and will always maintain a stable value: people who need computational power for their smart contracts / canisters will buy cheaper cycles on the open market, which will inevitably push the price back toward the 1.44 SDR peg. Therefore, assuming the Internet Computer achieves scale and use, cycles could become a stablecoin vastly superior to Tether/USDC, because the underlying backing behind the cycles currency is computational power, which arguably holds more value than Tether, which is supposedly backed by USD and other physical assets?
Correct me if I'm wrong, but computational power holds immense value in today's world, and it could be truly revolutionary to have a currency that is backed by computation rather than traditional assets or a centralized currency. It's essentially a decentralized stablecoin?
I think this is a relatively unknown aspect of the IC at the moment. I feel like it could be revolutionary though? Is my thinking correct?
I mostly understand algorithms, but I would like to formally learn the basics of the model of computation for algorithms.
Sorry if this is not the right place; I also posted on r/compsci, where it got removed.
I'm a computer science student, and even though I'm interested in these topics, it's hard to see the applications.
I recently got accepted into Mathematics of Computation (my second choice of major) at UCLA and L&S CS at Berkeley. I have always wanted to go to UCLA and right now I do not know what to do. If you guys were me what would you do? Please list out the PROS and CONS too!
Hi, I'm wondering how much added overhead there is when it comes to assigning CSS properties. One example is the text-transform property. If you have some text you wish to display as uppercase, you can assign text-transform: uppercase to the element it's contained in.
Or you could store the text already converted to uppercase, so that all you need to do is retrieve it.
How does this work on a lower level? If you assign this property, upon the page rendering, what exactly happens? Does the browser look at every character in this element and subsequently cast it to uppercase, and can this be a blocking operation?
If you theoretically loaded a very large number of characters into this element, would there be a performance boost if you removed that CSS property and merely retrieved the stored uppercase text?
Does CSS ever need to be optimized in terms of computation speed to get those millisecond or microsecond faster page renders?
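The trade-off being described (recompute the uppercase text on every use vs. store it pre-uppercased) can be illustrated outside the browser. This is a generic Python timing sketch, not a statement about how any particular rendering engine implements text-transform:

```python
import timeit

text = "lorem ipsum dolor sit amet " * 10_000   # a large chunk of text

cached = text.upper()                            # precomputed once, up front

# Recomputing is O(n) in the text length on every call;
# retrieving the cached copy is just a lookup.
t_recompute = timeit.timeit(lambda: text.upper(), number=200)
t_retrieve = timeit.timeit(lambda: cached, number=200)

print(f"recompute: {t_recompute:.4f}s  retrieve: {t_retrieve:.6f}s")
```

In a real browser the picture is more complicated: text-transform is applied during text layout/shaping and engines cache aggressively, so profiling an actual page (e.g. with the browser's performance tools) is the only reliable way to know whether it matters.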
AnyDice can't compute this within its time limit. Neither can Troll.
output [highest 5 of 10d[explode d10]]
With the right algorithm, this can be done in under 5 milliseconds.
Probably only of interest to programmers; the code is still quite a way from being production-ready.
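The fast exact algorithm isn't shown here, but the pool is easy to sanity-check with a Monte Carlo simulation. This Python sketch (my own, not the author's code) estimates the mean of `highest 5 of 10d[explode d10]`:

```python
import random

def explode_d10(rng, max_chain=50):
    """Roll one exploding d10: a 10 explodes (roll again and add), up to max_chain rolls."""
    total = 0
    for _ in range(max_chain):
        r = rng.randint(1, 10)
        total += r
        if r < 10:
            break
    return total

def highest_5_of_10(rng):
    """Sum of the highest 5 results out of 10 exploding d10s."""
    pool = sorted((explode_d10(rng) for _ in range(10)), reverse=True)
    return sum(pool[:5])

rng = random.Random(0)         # fixed seed for reproducibility
n = 100_000
mean = sum(highest_5_of_10(rng) for _ in range(n)) / n
print(f"estimated mean of highest 5 of 10 exploding d10: {mean:.2f}")
```

This only estimates moments of the distribution, of course; getting the exact probability of every outcome in milliseconds is what needs the clever algorithm.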
I took a gap semester due to COVID and missed the Theory of Computation class, which is required for graduation. Unfortunately, my school only offers it once every two years, so I need to take a similar class somewhere else to get the credit. Any suggestions for an online class that teaches Theory of Computation and can be transferred for college credit?
(To the admins: sorry if I'm creating another post about this, but it's been days and no one is replying to my comment in the Weekly Discussion Thread.)
I just want to know if there is a place where I can see a detailed explanation or the exact computation of the Grade and Medal Awarding we get after games. I want to know which factors (pushing, KDA, hero damage, damage taken) weigh more.
I don't know if I should create a separate Suggestion post asking the devs to include this information in the tutorial section. It would also help players identify errors or give suggestions to improve the grading system.
What are your top sources for learning Theory of Computation? If someone could help me with sources, I would be glad.
Hello! I'm trying to follow the Neural Computation course at MIT (https://ocw.mit.edu/courses/brain-and-cognitive-sciences/9-40-introduction-to-neural-computation-spring-2018/index.htm), but as a psychology student I'm struggling with the math and the programming. Not discouraged though, I just need some assistance!
There's something entirely wrong with either the program I've written or the values I've calculated, so here are all the steps I've done (you can skip to the code; the first part is just calculating values. I added it in case the error was made at this early stage, even though I checked it several times).
Calculating values (more physics than code):
Given a "neuron" that is a perfect sphere with a radius of 0.06 mm, a specific membrane capacitance (cm: c for capacitance, m for membrane) of 10 nF/mm^(2), and a specific membrane conductance (gm: g for conductance) of 1 μS/mm^(2), calculate the total membrane conductance and total membrane capacitance.
I get these values: total membrane conductance = 0.045 μS
total membrane capacitance = 0.45 nF
and from the conductance I calculate the resistance: 1/(0.045 μS) = 22.22 MΩ
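These values can be cross-checked with a few lines of Python. The sphere's surface area is 4πr², and the unit bookkeeping works out directly: 1/μS = MΩ, and MΩ·nF = ms.

```python
import math

r = 0.06                   # radius in mm
area = 4 * math.pi * r**2  # sphere surface area in mm^2, ~0.0452
cm = 10.0                  # specific membrane capacitance in nF/mm^2
gm = 1.0                   # specific membrane conductance in uS/mm^2

C = cm * area              # total capacitance in nF, ~0.45
G = gm * area              # total conductance in uS, ~0.045
R = 1 / G                  # membrane resistance: 1/uS = MOhm, ~22
tau = R * C                # time constant: MOhm * nF = ms; equals cm/gm = 10 ms

print(f"C={C:.3f} nF, G={G:.4f} uS, R={R:.1f} MOhm, tau={tau:.1f} ms")
```

Note that the time constant τ = RC = cm/gm is independent of the cell's size, which is a handy consistency check on the units.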
The lecturer emphasizes:
a. You set the cell’s capacitance, resistance, and resting potential (using the values calculated and defined above).
b. You set the initial condition V0 (i.e. V(t=0)) to V0 = Vrest and the current injected Ie to 100 pA starting at 100ms and finishing at 200 ms.
c. Your code updates V(t) at every time step using the exponential Euler method.
C = 0.45; % Capacitance in nF
R = 22.22; % Resistance in MegaOhm (1/0.045 uS = 22.22 MOhm)
Vrest = -70; % Leakage current reversal potential in mV
% Integration parameters
dt = 0.1; % integration time-step in ms
Tdur = 1000; % simulation total time in ms
V0 = Vrest; % initial condition in mV
k = ceil(Tdur/dt); % total number of iterations
V = zeros(1,k+1); % Voltage vector in mV
V(1)=V0; % assigned to the first element of array V, the initial condition V0
% time vector
t = dt.*(0:k); % time vector in ms
% Current pulse parameters
Tstart = 100; % current pulse start time in ms
Tstop = 200; % current pulse stop time in ms
Iamplitude = 0.1; % current pulse amplitude in nA
I = zeros(1,k+1); % current vector in nA
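For comparison, here is a minimal sketch of steps (a)-(c) in Python rather than MATLAB (my own translation, using C = 0.45 nF and R = 22.22 MΩ, the reciprocal of the 0.045 μS total conductance; with nF, MΩ, mV, ms, and nA, the time constant τ = RC comes out directly in ms):

```python
import math

# Parameters (units: nF, MOhm, mV, ms, nA)
C, R, Vrest = 0.45, 22.22, -70.0
dt, Tdur = 0.1, 1000.0
Tstart, Tstop, Iamp = 100.0, 200.0, 0.1   # 100 pA pulse from 100 ms to 200 ms
tau = R * C                                # membrane time constant, ~10 ms

k = math.ceil(Tdur / dt)
V = [Vrest] * (k + 1)                      # voltage trace in mV, V[0] = V0 = Vrest
for i in range(k):
    t = i * dt
    Ie = Iamp if Tstart <= t < Tstop else 0.0
    Vinf = Vrest + Ie * R                  # steady-state voltage for this input
    # exponential Euler step: exact for constant input over one time step
    V[i + 1] = Vinf + (V[i] - Vinf) * math.exp(-dt / tau)

print(max(V))   # peaks near Vrest + Iamp*R = -67.78 mV
```

If the MATLAB version behaves differently, the resistance value and its units are the first thing to check, since R sets both the steady-state depolarization (Iamp·R) and the time constant.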
Bitcoin and other proof-of-work blockchains are criticised for high energy consumption, and as the mining difficulty increases, it is only going to get worse. Would it be possible to lower the computational cost by sharing partial results? I mean, if I am looking for a specific hash, it would be good to know which nonces I don't need to try because somebody else already tried them. Or is the probability of computing the same hash twice, across all participants, near zero?
Edit: nonce is the word, not seed
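On the last point: miners effectively never duplicate work, because each miner hashes a different block template (their coinbase transaction / extranonce differs), so the same nonce produces completely unrelated hashes for each of them. A simplified Python sketch (double SHA-256 as in Bitcoin, but not the real 80-byte header layout):

```python
import hashlib

def pow_hash(header: bytes, nonce: int) -> bytes:
    """Bitcoin-style double SHA-256 over header || nonce (simplified layout)."""
    data = header + nonce.to_bytes(4, "little")
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

# Two miners with slightly different block templates (different coinbase/extranonce):
miner_a = b"block-template-with-extranonce-A"
miner_b = b"block-template-with-extranonce-B"

# The same nonce gives unrelated hashes for each miner, so knowing that
# nonce 42 failed for miner A tells miner B nothing about nonce 42.
print(pow_hash(miner_a, 42).hex())
print(pow_hash(miner_b, 42).hex())
```

So a shared list of "already tried" nonces would be useless: each entry only rules out that nonce for the one exact template it was tried against.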
There is a debate about whether calculator apps such as Wolfram Alpha or Symbolab, or programming languages (e.g. MATLAB, Maple, or C/C++), should be treated the same way as programmable calculators on take-home or open-book engineering exams. In some math and engineering sequences, professors have encouraged these tools for solving complex equations which cannot be solved by hand or with a scientific calculator. But many other professors believe that those calculator apps, and even programming languages, are an obstruction to the learning objectives, and consider using them a dishonest practice. We do need to use a lot of programming languages, though, especially in computer science, as well as in computational mathematics, computational physics, and engineering computation courses. In my opinion, MATLAB and Maple are very useful for solving mathematics and engineering problems. Even though both of those languages are difficult to learn at first, once you learn Maple and MATLAB, you can solve any calculus, diff eq, or engineering mechanics problem.
Unless otherwise stated in the syllabus, I think those calculator/programming apps are acceptable tools for homework and open-note/book/calculator exams. Most professors allow any calculator on open-calculator exams. If you have any doubt, please feel free to either email the professor directly or ask in this forum. I will do my best to answer your comments within 24 hours.
Hi guys :D
I'm a rising freshman, and I was hoping any past or present biomedical computation majors could share their thoughts on their general experience.
Also, is there any way to see reviews from students who have taken certain classes before? I know about Carta, and it's super helpful, but, just in case, is there anything else?
You can pretty much stop reading here if you're feeling lazy.
More specific stuff:
I'm not super set on biocomputation. So, would doing something broader, like majoring in MCS and minoring in humbio be a better approach?
I'm a little scared that this major is not versatile for job-finding after college (like I'd be stuck doing a certain kind of job). What do you guys think?
Is this major harder or easier than your average STEM major?
I'm tenuously considering pre-med, would this major be helpful if I were to go down that path?
Does anybody know a good introductory book on the Theory of Computation? No buzz-book recommendations (like recommending Axler to someone studying linear algebra for the first time). I'm a rising sophomore in CS, so basically a peasant in terms of knowledge. Please be considerate.
Then I saw that every class is full and waitlists are not even offered. Has anyone had a similar experience?
Looking for help with a Computation Theory and Automata test this Friday from 5-7 EST. It is a computer science course based on discrete math. It is a two-hour test and will require knowledge of the topics below.
Sets, sequences, functions and relations; alphabets, strings, and languages; finite automata and non-determinism.
Willing to pay well for someone knowledgeable on the subject. Since it's an online test, a 2-hour commitment is required. Thank you.
Hi! How is the grade computed in UST SHS? Does it follow DepEd's grading system, i.e. transmuted grades? In my school right now, we use the actual grade, so 7/10 = 70%. I think under DepEd's standard grading system this actual grade would translate to 84%.
I'm an incoming freshman at Concordia who's going to take Comp Arts. I would like to ask for assistance in choosing the right courses for my first year of studies.
I was looking at the contents of this major, and it seems to be everything I like.
How marketable is this major though for someone wanting to work in the data science industry?
I would be declaring this as a second major next to Cognitive Sciences ML
Does anyone have feedback on Carl McTague for CS Logic and Comp? Also, how would one "prepare" beforehand this summer and look into what we learn during the semester, to make sure I have a foundational understanding of the class?
Hi, I've been trying to pin down the meanings and distinctions of these words (also for "calculate" and "compute"), but checking dictionaries and their semantic paths, the words reference each other in a circular way.