**Date:** 24/08/2023

**Host**: David Ding, Business Innovation Advisor at Callaghan Innovation

**Guest:** Dr. Vladimir Bubanja, Distinguished Scientist at the Measurement Standards Laboratory of New Zealand and Associate Investigator at the Dodd-Walls Centre for Photonic and Quantum Technologies.

**Video Length:** 1:09:08

## Transcript

**David Ding:** Kia ora koutou, welcome everyone. Today we've got our special guest, Dr. Vladimir Bubanja, a distinguished scientist specializing in quantum metrology at Callaghan Innovation. The topic today is the quantum web, and it's going to be perceived through a Web3 lens: Web3 and beyond.

So the format for today is a presentation to begin with. Feel free to plonk any questions you have in the chat, and we'll have some Q&A for about half an hour at the end. Okay, so I'll hand over to you. Just start sharing your screen and we'll crack into it.

**Vladimir Bubanja:** Okay, thanks everyone for coming, and thank you David and Kevin for organizing this presentation. It is my pleasure to share with this community some of our work. I work at the Measurement Standards Laboratory of New Zealand. We are a business unit of Callaghan Innovation, and we are also the national metrology institute.

I'm also a member of the Dodd-Walls Centre for Photonic and Quantum Technologies, which is one of ten Centres of Research Excellence in New Zealand. I will talk about some aspects of quantum information, which is considered one of the main scientific and technological developments at present.

**Vladimir Bubanja:** And this is the outline of my talk.

During the 20th century, there were two important and revolutionary developments in science and technology. One was the development of quantum mechanics, and the other the development of information theory. Quantum mechanics has changed the way we understand the natural world around us, and it has led to a range of products that we use in everyday life.

Similarly, the development of information theory has changed the way we think about information, and it has led to the widespread use of computers that perform tasks previously thought possible only for humans. Around the beginning of the 21st century, there came a convergence of these two revolutions: people realized that information is physical. That is, information is always stored in some physical system, whether it is written on a piece of paper, stored in a computer memory, or held in nerve cells.

And at a fundamental level, all physical systems are described by quantum mechanics. From that there emerged a new science, quantum information science, which has led to the development of quantum technologies. These are emerging disruptive technologies that are expected to revolutionize computing, communication, sensing, and various other industries.

So I will first briefly talk about quantum mechanics, just say a couple of words about quantum mechanics and its applications to accurate measurements. This is the field of work of MSL. With regard to information theory, I will briefly reflect on information security. This is essential for the development of Web3.

And then I will talk a little bit about quantum information and the development of new quantum devices. It is expected that these devices will be connected by an ultra-secure network, in what is envisioned to be the quantum internet.

**Vladimir Bubanja:** So I'll start with a little bit of history. At the beginning of the 20th century, scientists at the national metrology institute in Germany measured what is called black-body radiation. The black body is sketched here; it can be simply a sphere with a small opening, so any radiation that falls onto that opening gets absorbed by the sphere.

That can be, for example, a microwave oven with a small opening at the door. What they measured was the radiation coming out of that black body, and they plotted their results here. On the vertical scale is the intensity of radiation, and on the horizontal is the wavelength. That radiation depends on temperature, so there are three plots here for three different temperatures.

By increasing the temperature, the intensity of radiation also increases. They compared their experimental results, shown in orange, to the existing theoretical formula, the so-called Wien law, and got good agreement at short wavelengths, where the two curves fall on top of each other. But they noticed a discrepancy at longer wavelengths, which they couldn't explain in terms of classical physics.

So we call the physics before the 20th century classical, and starting with the beginning of the 20th, modern physics. They couldn't explain this, so they consulted Max Planck, who was a theoretical physicist in Germany, and he introduced a radically new idea: that energy is quantized. Instead of being continuous, he proposed, or assumed, that it consists of packets.

And so the energy of one such packet is given by the product of a constant h, which we now call Planck's constant, and the frequency f of the radiation: E = hf. Starting with that assumption, he derived a formula that fit the experimental results. A couple of years later, Albert Einstein provided a physical interpretation of these packets.
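The relation E = hf is simple enough to check numerically; here is a minimal sketch (the frequency value is just an illustrative choice, not a number from the talk):

```python
# Energy of one quantum (photon), E = h * f.
h = 6.62607015e-34  # Planck's constant in J s, exact by definition in the current SI

def photon_energy(frequency_hz):
    """Return the energy in joules of a single photon at the given frequency."""
    return h * frequency_hz

# Green light has a frequency of roughly 540 THz:
print(photon_energy(540e12))  # about 3.6e-19 joules per photon
```

Individual packets are tiny, which is why energy looks continuous in everyday life.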

So, for example, light consists of particles: photons. In that way, he was able to explain another effect, called the photoelectric effect, and both Planck and Einstein got Nobel Prizes in physics for their work. Planck also thought about units of measurement, and he made the statement that with the help of fundamental constants it is possible to define units of physical quantities.

Such units would retain their values throughout time and across civilizations, even extraterrestrial and nonhuman ones. And, in fact, our present international system of units is based on a set of fundamental constants, one of which is the Planck constant. These were the very beginnings of the development of quantum mechanics. Quantum mechanics has since been developed into the most accurate theory in the history of science. It describes the properties of matter from the atomic and subatomic scale up to large objects such as stars. The properties of small systems, subatomic particles, are for example studied at accelerators.

And here is an aerial image of the largest such accelerator in the world. It's called the Large Hadron Collider. On this aerial image is superimposed a circle that indicates the tunnel underground; the circumference is about 27 kilometers, and it is about a hundred meters underground. The accelerator, using electromagnetic fields, accelerates nuclei of atoms, or just two protons, in opposite directions. They are made to collide, and from these collisions there are bursts of so-called elementary particles. These elementary particles are listed in this table, which, together with the associated theory, is called the Standard Model of elementary particles. Going to large objects such as stars, quantum mechanics also describes how nuclei of hydrogen collide, forming helium, and in this nuclear fusion energy is released.

Stars have their life cycles. Some of them explode in so-called supernova explosions, which create all the other elements; for example, all the other elements on Earth were created that way. And so quantum mechanics provides not only a qualitative but also a quantitative description of these phenomena.

And that quantitative description is extremely accurate. Just to demonstrate that point, I show here two numbers. They are values of the anomalous magnetic moment of the electron, which is just one property of electrons. What I wanted to show is that the top number is obtained by starting from the principles of quantum mechanics and calculating that quantity.

And the number below is obtained by actual experimental measurement. I just want to show this extreme agreement between all these digits. The numbers in brackets are the uncertainty in the last couple of digits, but here we can see extraordinary agreement; in fact, there is no other area of science where we have such agreement between understanding and experiment.

And using that knowledge, it was possible to develop a range of products that we use. For example, from the understanding of the properties of semiconductors, it was possible to invent the transistor. The transistor is the basic element of electronic chips, and these chips are used in computers, cell phones, cars, airplanes, etc. Similarly, understanding of the interactions between light and matter led to the invention of the laser, and lasers are used for everything from cutting steel to eye operations. Then we have nuclear magnetic resonance and other medical instruments. In a word, everything that we call high tech was made possible by the development of quantum mechanics.

And so, starting from detailed considerations of a quite obscure phenomenon, black-body radiation, we now have quantum mechanics, which, according to some estimates, underpins more than 60% of the world economy. The development of novel quantum technologies, which is in progress now, is also expected to have a major impact on the world economy.

**Vladimir Bubanja:** With regard to applications of quantum mechanics to measurements, there is this so-called metrology pyramid. Metrology is the science of measurement, and this pyramid summarizes the range of activities in metrology. At the base of the pyramid are all the measurements that we encounter in everyday life, such as measurements of time, or the speed of cars, or temperature, or blood pressure in medical clinics. All these measurements are done with instruments that are calibrated with higher-accuracy instruments. These are in turn calibrated with even higher-accuracy instruments, and this so-called traceability chain leads to primary standards.

Primary standards are held at national metrology institutes (that's MSL in the case of New Zealand), and more or less every country in the world has such an institute. Primary standards are the most accurate instruments, and as such, there are no other instruments they can be calibrated against. The only thing they can be compared to are the fundamental theories of physics.

And so at the top of this pyramid is the international system of units, which is based on these fundamental theories of physics, and this is the logo of the current system. In the inner circle is a set of exactly defined values of fundamental constants, one of which is the Planck constant, which I mentioned on the previous slide.

An example of a cut through this pyramid, one traceability chain, is... ah, yes, I just wanted to mention that Planck also commented that if we define the international system of units based on fundamental constants, it can be communicated even to non-human civilizations. And in fact, when the Voyager spacecraft was launched about half a century ago, it had a gold-plated record on board, with instructions on how to play it. There were photos of people from around the world, sounds and photos of life on Earth, et cetera, and descriptions of where Earth is in the Solar System. And the units used there were based on fundamental constants.

So an example of a traceability chain: all houses and businesses in New Zealand use electrical power, and power meters, used for example by Transpower, are calibrated at MSL against our electrical power standard. This instrument is calibrated in terms of more accurate instruments called thermal converters, and they are calibrated in terms of quantum standards.

So here are two electronic chips: on the left is the quantum standard of voltage, and on the right is the quantum standard of resistance. If we buy, say, a three-volt battery in a supermarket and measure the output, typically it will be between 2.9 and 3.1 volts or so, but here you can see the accuracy of the quantum standard of voltage. This is a 10-volt chip, and you can see the extraordinary accuracy; similar holds for the quantum standard of resistance. This chip consists of a series array of so-called superconducting Josephson junctions. Superconductors are materials that have no resistance to the flow of current. And this is a sketch of one such device.

In green are superconductors, and between them is an insulating layer through which electrons tunnel. I mention this device because it is used for the quantum standard of voltage, but the same device is used for superconducting quantum computers, which I will mention later. Another example of a traceability chain is the measurement of mass. In a supermarket we measure the weight of food, and milligram quantities of pharmaceuticals are weighed in pharmacies. These measurements are done with instruments that are traceable to the primary standard of mass. This is an image from NIST, the national metrology institute in the US, but MSL is currently building a similar instrument. It's called the Kibble balance, and it is a very accurate weighing balance.

And the principle of operation of this balance is also based on these two quantum standards. So that was a little bit about the development of quantum mechanics and some applications in measurements.
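As an aside, the quantized voltage from those superconducting junctions follows the Josephson relation V = n·f·h/(2e): each junction driven by microwaves at frequency f contributes a voltage that depends only on fundamental constants. A minimal sketch (the drive frequency and junction count below are illustrative choices, not MSL's actual parameters):

```python
# Josephson voltage standard: an array of n junctions driven at microwave
# frequency f develops a quantized voltage V = n * f * h / (2e).
h = 6.62607015e-34   # Planck's constant, J s (exact in the current SI)
e = 1.602176634e-19  # elementary charge, C (exact in the current SI)

def josephson_voltage(n_junctions, freq_hz):
    """Total voltage of a series array of Josephson junctions on one quantized step."""
    return n_junctions * freq_hz * h / (2 * e)

# Illustrative numbers: a ~70 GHz drive and an array sized to reach about 10 V.
print(josephson_voltage(69100, 70e9))  # close to 10 volts
```

Because h and e are now exactly defined, a voltage produced this way is tied directly to the SI, which is what makes it a primary standard.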

**Vladimir Bubanja:** With regard to information security, the organization is somewhat similar to this pyramid organization of measurements in metrology.

Namely, government organizations and private companies have their own information security policies. These policies reference the national information security manual, which in the case of New Zealand is published by the GCSB. This manual references the commercial suite of algorithms published by the NSA, and each of these algorithms is described in more detail in special publications issued by NIST.

Just a couple of paragraphs from this manual: they describe that it is intended for government as well as the public and private sectors, and they say that with the increasing speed of computers, some cryptographic protocols are becoming increasingly vulnerable. They mention that RSA, for example, is expected to be deprecated in various protocols in favour of elliptic curve cryptography.

But as I will mention on the next slides, elliptic curve cryptography is also expected to become obsolete soon. So they recommend that agencies and companies keep track of these forthcoming changes. Here is this suite of algorithms, where the various protocols are listed, such as the Advanced Encryption Standard, Diffie-Hellman, RSA, elliptic curve cryptography, digital signature algorithms, et cetera. All of them are specified in special publications, such as the Federal Information Processing Standards issued by NIST.

**Vladimir Bubanja:** So I would say a couple of words about public key cryptography. Historically, in the early seventies, NIST, that is the National Institute of Standards and Technology, the national metrology institute in the U.S., the analog of MSL in New Zealand (at the time they were called the National Bureau of Standards), published a request for proposals for a standardized cipher. In 1977 the Data Encryption Standard was adopted, and in 2001 the Advanced Encryption Standard, which is still in use today. Then, due to the developments in quantum computing, in 2016 NIST initiated a process for the standardization of post-quantum, sometimes called quantum-resistant, cryptography.

With regard to public key cryptography, this is a simplified sketch. If we have two parties, say Alice and Bob, who want to communicate over a public channel, the procedure briefly goes something like this. Suppose Bob wants to receive a message from Alice. He publishes a public key; this is just a number that is accessible to anyone. Say they communicate on the internet: it can in principle be accessed by anyone on the internet. Alice then wants to send some message. This is again a number, written in binary, just a sequence of zeros and ones; whether it is a video or an email, it is always represented as a sequence of zeros and ones. She performs a mathematical operation that uses this public key and obtains a result, indicated here as E(M). That is the encrypted message, just another binary sequence, and she sends it over the public channel. Then Bob uses a private key, known only to him, which is another number, and performs another operation, D(E), which decrypts the message, so he recovers the original message that Alice wanted to send him.

Because this is a public channel, like, say, an optical fiber, in principle there could be eavesdroppers who copy this encrypted message, but it is not feasible for them to decrypt it without knowledge of the private key, which only Bob has.

For this to work, one needs so-called one-way functions. These are functions that are easy to calculate one way, but difficult to invert. It is easy for Alice to calculate E(M), that can be done quickly, but knowing E(M), without knowledge of the private key, it should be difficult to invert and recover M. Essentially, at present two such schemes are used. One is the RSA algorithm, and the other is elliptic curve cryptography, and I'll just say a couple of words about them. With regard to RSA, there are a couple of steps involved. I will not go through all the steps, but will just emphasize the basics.

In order to use this scheme, Bob starts with two prime numbers, p and q. These are numbers that are divisible only by one and themselves. He calculates the product of these two numbers, n = p times q, and publishes this number. This is the public key. Anyone can have access to this number, and Alice uses it to encrypt her message and send it to Bob.

For any eavesdropper to decrypt the message, they need to know the prime factors of this composite number, and that step is difficult. Multiplying p and q, on the other hand, can be done very quickly on present-day classical computers. I show small numbers here; in practice, the numbers are much larger.
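The whole scheme can be sketched with deliberately tiny numbers (a textbook-sized toy example; real RSA keys use primes hundreds of digits long):

```python
# Toy RSA with tiny primes; real keys use primes hundreds of digits long.
p, q = 61, 53
n = p * q                 # public modulus: 3233
phi = (p - 1) * (q - 1)   # 3120, used to derive the private key
e = 17                    # public exponent, chosen coprime with phi
d = pow(e, -1, phi)       # private exponent: modular inverse of e (Python 3.8+)

m = 65                    # Alice's message, represented as a number
c = pow(m, e, n)          # encryption with the public key (n, e)
recovered = pow(c, d, n)  # decryption with Bob's private key d
print(m, c, recovered)    # recovered equals the original message
```

An eavesdropper who sees only n and c must factor n back into 61 × 53 to reconstruct d; trivial at this size, but with no known efficient classical algorithm at real key sizes.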

With regard to elliptic curve cryptography: first of all, elliptic curves are mathematical objects defined by this equation. There are just two parameters, a and b, and by choosing these two parameters we get different shapes of curves. There are infinitely many elliptic curves.

And so multiplying even huge numbers can be done very quickly on present-day computers, but there is no efficient algorithm to factor a composite number; for the numbers presently used in cryptography, that would take a thousand years or longer.

On any such elliptic curve, one can define an operation called point addition, and that's done in the following way. If we have two points P and Q, we draw a line through them; where that line intersects the elliptic curve at a third point, we reflect it over the horizontal axis, and because of the symmetry that reflected point is also on the elliptic curve. That point is called P plus Q. In cryptography, instead of elliptic curves defined over the real numbers, elliptic curves defined over so-called Galois fields are used, where the coordinates are just integers. It's a little less easy to see, but this set of points here is an elliptic curve defined over a particular Galois field, and the same operation still applies: having two points, one can define their sum in the same way as before, with the same equation, only using modular arithmetic. So here one can easily add points. Starting from P, one can use the same point and define P plus P, or keep adding P to obtain nP. That can be done very quickly on present-day computers, but knowing the point P and the final point nP, there is no efficient way to find how many times we added P to obtain the final point. That is a hard problem; it's called the discrete logarithm problem.
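The point-addition rule over a Galois field can be written out directly. Here is a minimal sketch on a toy curve y² = x³ + 7 over GF(17), the same equation shape as Bitcoin's curve but with a tiny prime; the prime 17 and the point (1, 5) are illustrative choices, not a real cryptographic curve:

```python
# Point addition on a toy elliptic curve y^2 = x^3 + a*x + b over GF(p).
p, a, b = 17, 0, 7

def ec_add(P, Q):
    """Add two curve points (chord-and-tangent rule, in modular arithmetic)."""
    if P is None: return Q          # None plays the role of the point at infinity
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                 # P + (-P) = point at infinity
    if P == Q:
        s = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (s * s - x1 - x2) % p
    return (x3, (s * (x1 - x3) - y1) % p)

def scalar_mult(k, P):
    """Compute k*P by repeated point addition (double-and-add)."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P, k = ec_add(P, P), k >> 1
    return R

G = (1, 5)                 # a point on this curve: 5^2 = 1^3 + 7 (mod 17)
print(scalar_mult(5, G))   # → (12, 16)
```

Computing nP this way is fast even for astronomically large n (double-and-add needs only about log₂ n steps), but recovering n from P and nP is the discrete logarithm problem.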

**Vladimir Bubanja:** So basically in cryptography there are these two problems. One is factoring, which is RSA, and the other is this discrete logarithm problem. They cannot be solved efficiently; at least, nobody knows how to solve them on classical computers. On the other hand, quantum computers can easily break both.

And just to mention here, I said there are infinitely many elliptic curves. There is actually this website, of the Standards for Efficient Cryptography Group. They list a number of elliptic curves that are used in cryptography, and I just show one of them here. It is defined by these parameters, and it's a so-called Koblitz curve that is used, for example, by Bitcoin and most of the cryptocurrencies.

Some of them use another one, but this is just an example. So, there are these commonly used cryptographic algorithms: RSA, elliptic curve cryptography, digital signature algorithms. There is an efficient algorithm, Shor's algorithm, that can be run on a quantum computer to break all of these.

With regard to the Advanced Encryption Standard and these hash functions, there is also an algorithm, Grover's algorithm, to be run on a quantum computer, but it is not as efficient as in the previous case. There, the speedup is exponential, while here it goes only as a square root. So it is generally thought that larger key sizes can be used as protection against a large quantum computer. The large quantum computer still doesn't exist, as I will explain later, but the algorithms exist.

And so, for that reason, NIST initiated in 2016, as I mentioned, this post-quantum cryptography standardization. Last year they selected candidates, and now these are being considered; once the decision is made, the process of replacing currently used encryption protocols will start. I just want to mention here that three of these four candidates use what is called lattice cryptography. If we have two vectors, say b1 and b2, then by adding or subtracting integer multiples of them one can define a lattice of points; if these two vectors are not perpendicular, it would look like this, for two dimensions. The difficult problem, the analog of factoring and the discrete logarithm problem that I mentioned, is finding the shortest vector given this set of points. In two dimensions, as I show here, that would be easy to find, but in cryptography multidimensional lattices are used.
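In two dimensions the shortest-vector problem can literally be solved by exhaustive search; a minimal sketch (the basis vectors are an arbitrary illustrative choice). Lattice-based post-quantum schemes rely on the same search becoming infeasible in hundreds of dimensions:

```python
# Brute-force the shortest nonzero vector of a 2-D lattice spanned by b1 and b2.
import itertools
import math

b1, b2 = (201, 37), (1648, 297)   # a deliberately "skewed" basis

def shortest_vector(b1, b2, bound=50):
    """Search all integer combinations i*b1 + j*b2 with |i|, |j| <= bound."""
    best, best_len = None, math.inf
    for i, j in itertools.product(range(-bound, bound + 1), repeat=2):
        if (i, j) == (0, 0):
            continue                          # the zero vector doesn't count
        v = (i * b1[0] + j * b2[0], i * b1[1] + j * b2[1])
        length = math.hypot(*v)
        if length < best_len:
            best, best_len = v, length
    return best, best_len

v, length = shortest_vector(b1, b2)
print(v, round(length, 2))   # far shorter than either basis vector
```

Here a few thousand candidates suffice; in, say, 500 dimensions the same search space explodes combinatorially, and no known classical or quantum algorithm finds the shortest vector efficiently.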

So, as I mentioned, quantum computers can break the present cryptographic protocols. On the other hand, there is a field called quantum cryptography, which uses properties of quantum systems to secure communications. I will also mention that later.

**Vladimir Bubanja:** Right. So with regard to quantum information: first of all, people have for a long time communicated at a distance, using for example smoke signals, et cetera. Then in the 19th century electrical signals started to be used, and so came the invention of telegraphy and telephony. And at the beginning of the 20th century, there was a kind of predecessor of the internet.

We can say that: telegraph stations around the world were connected. Then, with regard to connecting computers, in 1969 there was ARPANET, where they connected two distant computers, one at UCLA in Los Angeles and the other at the Stanford Research Institute, and the operators were actually talking on the phone, discussing what was on the screen.

They wanted to send just one word, "login", but when they got to the third letter, the system crashed. So that was the beginning. Then gradually they connected more and more computers, and various protocols were developed. Some of the important protocols were actually developed to transfer the large amounts of data collected at the Large Hadron Collider, the largest accelerator in the world, which I have mentioned. So there was a development of technologies, usually summarized as Web 1, 2, and 3. There are no strict definitions of these, but roughly speaking, in the early 90s mainly academic institutions were connected, and the web consisted mainly of static webpages.

Then from the 2000s more streaming was included, blogging, and so on. One summary is: Web 1 was read; Web 2 is read and write; and Web 3 is read, write, and own. And then there are further developments, which I will mention, going towards the quantum internet.

**Vladimir Bubanja:** Just a couple of words about the history of the development of computers. The first known computer, an analog computer, is the so-called Antikythera mechanism, which is shown here with a replica below. This instrument was surprisingly complicated. It is from around 100 B.C., and it was used to calculate the positions of the planets, and to decide when to hold the Olympic Games.

Then, surprisingly, for maybe more than a thousand years there was nothing of similar complexity, until people started to make clocks and clock mechanisms. In the middle of the 19th century, Charles Babbage made a plan for the so-called Difference Engine and also another one called the Analytical Engine, which was a digital mechanical computer.

And actually part of that original engine was transferred to New Zealand, I think by his grandson, and later it was moved; now it's in a museum in Australia. Then, during the Second World War, electronic digital computers were developed. This is, for example, ENIAC, one such computer. It was implemented with vacuum tubes, and these tubes would blow after a couple of hours of continuous work.

So they needed to be replaced. Then in 1947, physicists working at Bell Labs in the U.S. invented the transistor, and about a decade later the integrated circuit was developed. That has given this exponential rise in the speed of computing. The development of transistors went something like this.

Originally, transistors were planar devices. What is shown in gray here are electrodes called the source and drain. Between them is the electronic channel, with this so-called gate electrode on top. By applying a voltage to this gate electrode, one can control the passage of electrons underneath.

So it is just like a tap of water: it can be opened or closed. By applying some voltage, there are two states, off and on, which represent the two binary states, zero and one, a digital representation. Then, with miniaturization, these electrodes were made to stick out of the plane of the chip.

And so that's called FinFET, Fin Field Effect Transistor. And now these electrodes are getting thinner and thinner and they're approaching a single atomic layer. And then further projection is that these will become just one dimensional. So from two dimensional going to one dimension, and it's not sketched here, but the next step is these will become just points, zero dimensional systems.

And here is a cross-section of such a FinFET. Even with the current technology, the 7 nm and 5 nm technologies, the number of atoms across the electronic channel is about a dozen, so displacing even one of these atoms changes the threshold voltage. That changes the operation of the transistor, and on a chip there are tens of billions of these transistors, and all of them have to operate reliably.

So this is, in general, the most advanced technology that there is, and there's only one company in the world that can make the latest, most advanced chips: the Taiwan Semiconductor Manufacturing Company. And this is a little bit about our work. This is an atomic force microscope image of a single-electron memory.

In the middle of this device there is a small box; I sketched it here as this square. By operating this device we can put one electron into that box, which represents state one. When we remove it, the box is empty, and that represents state zero. So here, using one electron, one can represent one bit, and that's the ultimate limit in the miniaturization of electronics. I mentioned that with miniaturization these materials are going down to single atomic layers, and these are some examples of the materials that we have investigated.

The currently used material for chips is silicon, and when we get to a single atomic layer, that's called silicene. Another material is graphene. Silicene consists of atoms of silicon, and graphene of atoms of carbon. When graphene is folded seamlessly into a cylinder, we obtain a so-called carbon nanotube, shown here. And this is a chip we have developed with colleagues from Japan.

These vertical lines are gold electrodes, and between them are carbon nanotubes. We have demonstrated transistor operation, with a one-dimensional electronic channel. We also considered sensing applications of these devices. We were interested in sensing individual molecules of water, and various groups around the world are working on similar sensors.

The goal is to detect minute amounts of some dangerous substances, such as poisonous gases or explosives, et cetera. So I mentioned the three-dimensional case, that single-electron memory, then two-dimensional materials, and carbon nanotubes, which are one-dimensional. And then there are zero-dimensional systems; they are called quantum dots.

An example of how they are made: there is some conducting material, shown in light blue, and between them, in dark blue, is semiconducting material. Because of the small size, it is possible to controllably add one, two, three, et cetera, electrons, and in that way one can form artificial atoms. One advantage of these artificial atoms over natural ones, when used in devices, is that in artificial atoms one can adjust the energy separation of the electron levels, while in natural atoms they are fixed.

Just to summarize this discussion with regard to materials science and also the development of devices: materials science has traditionally been heavily empirically based. For example, making some material like the famous Damascus steel took maybe hundreds of years. People tried starting with something like iron, added various elements like nickel or carbon, and then changed the percentages and the process of heating and cooling until they got some desirable properties, such as a material that is strong and yet elastic. However, this process is very slow, and as a consequence, humanity is now familiar with only a very small number of materials, relative to the huge number of possibilities.

Computational techniques have considerably advanced in recent years, and so this experimentation is now done on supercomputers. The procedure is that one can start, for example, by choosing some elements, atoms from the periodic table, then combine them to form materials and investigate their properties on a computer, or also assemble some devices.

These can be zero- to three-dimensional. Then one tests those devices or materials in the laboratory, and finally makes products. In our case, we use the New Zealand supercomputer for this work.

These small quantum devices have the property of superposition, which is useful for computing and communications.

**Vladimir Bubanja:** So classical bits can be in a state zero or one, while quantum bits, or qubits, can also be in a superposition. They can be in state zero or one, but they can also be in these two states simultaneously. And they can be in these two states in various proportions: they can be, say, 99% zero and 1% one, or they can be in the two states in equal proportion. Usually a qubit is represented by this so-called Bloch sphere, and every point on this sphere represents a state of the qubit.
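The measurement statistics of such a superposition can be mimicked with an ordinary random number generator; here is a minimal sketch (not a real quantum simulation, just the Born-rule probabilities for a single qubit):

```python
# A qubit state |psi> = alpha|0> + beta|1>: measuring it yields 0 with
# probability |alpha|^2 and 1 with probability |beta|^2.
import math
import random

alpha = math.sqrt(0.5)   # equal superposition: 50% zero...
beta = math.sqrt(0.5)    # ...and 50% one
assert abs(alpha**2 + beta**2 - 1) < 1e-12   # amplitudes must be normalized

def measure():
    """One simulated measurement, collapsing the state to 0 or 1."""
    return 0 if random.random() < alpha**2 else 1

counts = [0, 0]
for _ in range(10_000):
    counts[measure()] += 1
print(counts)  # roughly [5000, 5000]
```

Changing alpha and beta (keeping them normalized) shifts the proportions, which corresponds to moving the point around the Bloch sphere.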

And this parallelism, being in two states at the same time, gives an enormous advantage for computing and also for communications. I have talked a little about the history of classical computing: development went from analog computers to mechanical, digital, electronic, et cetera, and finally settled on silicon-based transistors, devices that can switch very quickly between two states. For the last 50 years or so that technology has been improved. Quantum computing is still in its early stages, and there are various approaches to building a quantum computer. What one needs is just this qubit, which is a two-state quantum system.

So any two-state quantum system can serve as a qubit, and the questions are: how easy is it to combine such systems, how easy is it to control them, how quickly can one operate them, et cetera. For that reason there are a variety of approaches. One approach uses quantum dots; I mentioned these quantum dots before, and just the two lowest energy states in a quantum dot can serve as a qubit.

Then I mentioned superconducting junctions when I was talking about the quantum standard of voltage. These devices can be combined into, for example, superconducting transmon qubits. Then there are ion traps: when we take one or several electrons away from an atom, it becomes charged, so these are ions.

Using electromagnetic fields, one can trap them and perform operations, so that is another approach. Then there are diamond vacancies, lattices of neutral atoms, or photons. All of these are approaches to building quantum computers. Currently the most advanced are the superconducting quantum computers, but each approach has some advantages and disadvantages.

For example, superconducting quantum computers build on the established technology that was used for the quantum standard of voltage; on the other hand, they require low temperatures, millikelvin temperatures below minus 270 degrees Celsius, while those based on photons can operate at room temperature.

However, photons are more difficult to manipulate than the superconducting devices operating on electrons, so people are still pursuing various options.

**Vladimir Bubanja:** So, I have mentioned that there is a quantum algorithm, to be run on a quantum computer, for factoring, and another algorithm to break elliptic curve cryptography. This was the first experiment that demonstrated the so-called Shor's algorithm for factoring. In this experiment they used nuclear magnetic resonance and a small quantum computer of seven qubits.

They factored the small number 15, which for this purpose needs no computer at all, but they followed the algorithm and wanted to give a proof of principle. The main difficulty with quantum computers is that these qubits are very fragile. For example, for superconducting qubits, thermal photons from the surrounding equipment or some electronic noise can easily disturb them, so they change state. One needs to take extraordinary care to protect these qubits, and yet one also needs to access them and perform operations on them.
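The classical skeleton of Shor's algorithm for N = 15 can be sketched as follows. The quantum computer's job is the period-finding step, which is done here by brute force purely for illustration:

```python
from math import gcd

# Classical outline of Shor's algorithm for N = 15. The quantum part
# finds the period r of f(x) = a^x mod N; here we find it by brute
# force, which is only feasible because N is tiny.
N, a = 15, 7  # a is a chosen base coprime to N

r = 1
while pow(a, r, N) != 1:
    r += 1

# r is even here, so a^(r/2) - 1 and a^(r/2) + 1 share nontrivial
# factors with N.
p = gcd(pow(a, r // 2) - 1, N)
q = gcd(pow(a, r // 2) + 1, N)
print(r, p, q)  # r = 4, factors 3 and 5
```

For cryptographically sized N, no known classical method finds r efficiently, which is exactly where the quantum speedup lies.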

In addition to the qubits on which one operates, it is also necessary to perform error correction, and there are various approaches with regard to how many extra, so-called ancilla qubits are needed to obtain one logical qubit that performs the operations.

**Vladimir Bubanja:** With regard to error correction, I have mentioned the Voyager spacecraft, which has already entered interstellar space and still communicates with Earth. These are weak signals, and in classical communications one way to correct errors is to use redundancy. For example, if one wants to transmit the digit zero, one can simply transmit that same value three times: zero, zero, zero.

If an error occurs in transmission, then with large probability only one of the three will be flipped, and one can use a majority rule to decide what the intended digit was. So if there are two zeros and just one one, then one decides that it was probably zero that was meant to be transmitted.
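The majority-vote repetition code just described can be sketched in a few lines (an illustration, with an assumed per-bit flip probability of 0.1):

```python
from collections import Counter
import random

# Three-bit repetition code: send each bit three times and decode by
# majority vote. With per-bit flip probability p, a decoding error
# needs at least two flips, so the error rate drops from p to roughly
# 3*p^2 - 2*p^3.

def encode(bit):
    return [bit] * 3

def decode(received):
    return Counter(received).most_common(1)[0][0]

def noisy(bits, p, rng):
    # XOR each bit with 1 with probability p (a bit flip).
    return [b ^ (rng.random() < p) for b in bits]

rng = random.Random(0)
p = 0.1
trials = 100_000
errors = sum(decode(noisy(encode(0), p, rng)) != 0 for _ in range(trials))
print(errors / trials)  # close to 3*p**2 - 2*p**3 = 0.028
```

This is the scheme that the no-cloning theorem forbids copying over to qubits directly, as the talk notes next.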

However, correcting qubits is much more difficult. First of all, this principle cannot be used in quantum computing: there is something called the no-cloning theorem. Given the state of a qubit, one cannot simply copy it and produce, say, two or three qubits in the same state; that is not possible. And I make this analogy: in the Voyager case there are weak signals coming from far away; in the case of superconducting quantum computing, there is a chip at the bottom of the structure, which is placed in a dewar and submerged in liquid helium, so at very low temperatures.

Then there are connecting wires to classical computers outside, et cetera, and these can all cause some interference. So that redundancy procedure is not available, but there are other procedures for correction, for example for bit-flip correction. And I have shown that a qubit's state is a point on a sphere.

So there can be phase errors: not only bit flips but various other errors. There are procedures for these too, but they require additional qubits, and that increases the difficulty. That is the major engineering challenge of going from present-day quantum computers to the size of, say, a million qubits, which is what is needed to break the existing cryptographic protocols. It is an engineering challenge, but there are a number of groups working in this field.

**Vladimir Bubanja:** I have mentioned that one application of quantum computers is breaking cryptographic protocols, but they have a variety of other potential and useful applications, such as prototyping materials, simulating large quantum systems that cannot be handled efficiently on supercomputers, developing various medications, improving solar panels and batteries, various optimization problems, applications in finance, et cetera.

**Vladimir Bubanja:** So, I have mentioned that quantum computers can be used to break cryptographic protocols. There is quantum cryptography, which can be used to communicate securely. And in classical cryptography there is the so-called one-time pad procedure, which is an information-theoretically secure procedure.

That is, one starts with some message, adds a random key, and transmits this sum of the message plus the random key. Because the key is random, if somebody intercepts the transmission, they cannot recover the original message. However, the key has to be used only once; that is why it is called a one-time pad.

If the key is used repeatedly, then there are techniques that can be used to recover the message; for example, in a given language, letters appear with certain frequencies in text, et cetera. To use this procedure, the parties originally need to be in the same place, so that they can share a pile of these keys before they separate and communicate. But with present-day communications over the internet, at a distance, between computers, that is not practical. So there is the problem called the key distribution problem, and that is solved by quantum key distribution.
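The one-time pad described above fits in a few lines; this sketch XORs the message with a fresh random key of the same length (XOR is the binary version of "adding" the key):

```python
import secrets

# One-time pad over bytes: ciphertext = message XOR key. Security
# requires the key to be truly random, as long as the message, and
# never reused.

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))  # fresh random key, same length

ciphertext = xor_bytes(message, key)
recovered = xor_bytes(ciphertext, key)   # XOR with the key again undoes it
print(recovered)  # b'attack at dawn'
```

Reusing `key` for a second message is exactly the mistake that lets frequency analysis back in, which is why key distribution is the hard part.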

So, by sending, for example, photons used as qubits, one can protect against eavesdropping, because, as I have mentioned, qubits are very fragile: if an eavesdropper attempts to copy or measure the state of a qubit, the communicating parties can detect that. These physical properties of qubits serve as the security of the communication. Current cryptography, by contrast, uses mathematical functions that are considered difficult to invert on present-day computers.
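A toy simulation of BB84, a standard quantum key distribution protocol (the talk does not name a specific protocol), shows how matching measurement bases yield a shared sifted key. Everything here is classical simulation with no eavesdropper; the variable names are illustrative, not a real photonics API:

```python
import random

# Toy BB84 sketch: Alice encodes random bits in random bases
# (0 = rectilinear, 1 = diagonal); Bob measures in random bases.
# Where the bases happen to match, their bits agree and form the key.

rng = random.Random(42)
n = 32

alice_bits  = [rng.randint(0, 1) for _ in range(n)]
alice_bases = [rng.randint(0, 1) for _ in range(n)]
bob_bases   = [rng.randint(0, 1) for _ in range(n)]

# Measuring in the wrong basis yields a uniformly random result.
bob_bits = [bit if ab == bb else rng.randint(0, 1)
            for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Publicly compare bases (never the bits) and keep matching positions.
keep = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
key_alice = [alice_bits[i] for i in keep]
key_bob = [bob_bits[i] for i in keep]
assert key_alice == key_bob  # identical sifted keys
```

An eavesdropper who measures in the wrong basis disturbs the state, so comparing a sample of the sifted key reveals the intrusion; that detection step is omitted from this sketch.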

But just as this classical cryptography was used for maybe 50 years before an algorithm came along that can break it on a different type of machine, on quantum computers, the procedure called post-quantum cryptography, or quantum-resistant cryptography, relies only on the present state of knowledge: the assumption that it will be difficult, and that nobody knows how, to solve these lattice cryptography problems even on a quantum computer. There is no guarantee that it is not possible; it is simply not known. Quantum cryptography, by contrast, uses the fundamental physical properties of qubits to ensure security. It has some disadvantages: it is slower than presently used cryptography, so it is currently used perhaps only in limited applications.

An example of this quantum key distribution is shown in this experiment: there is an original message that is encoded, it looks like this, and they used quantum dots, which I mentioned before.

**Vladimir Bubanja:** A couple of words about general trends in this space. There are relatively large public investments. For example, the UK started its national quantum strategy about 10 years ago by investing a billion; this year they have more than doubled that with new investment. The US has a similar National Quantum Initiative. The European Union has three large, long-term flagship research projects: one on quantum technologies, another on graphene, the single atomic layer of carbon that I had mentioned, and the third on the brain.

In addition to this European Commission level, individual countries have similar-scale investments, for example Germany, the Netherlands, France, et cetera. China may actually be the largest investor: just one institute has an investment of 10 billion, and there are multiple streams and programs. And similarly for other countries.

There are also private investments by large companies working on quantum computing and quantum communications, and there are more than 200 startups: some small, some medium-sized, some developing hardware and others software for these quantum technologies.

This is an image of the first commercial quantum computer; it was made by a Canadian company, D-Wave. And estimates project that in perhaps several decades this sector will be worth over a trillion US dollars.

**Vladimir Bubanja:** So, I have mentioned these various quantum devices, such as quantum computers and quantum sensors, and the vision is to connect them by ultra-secure quantum communication links into something that is projected to be the quantum internet.

And so in conclusion, we are in the midst of a quantum information revolution with unimaginably exciting promises. Thank you very much for your attention.

**David Ding:** Okay. Thank you for that. I've just got a few questions actually. One second.

**David Ding:** So, you know, you talk about how energy is quantized. And I think part of the problem with quantum cryptography is that if something's quantized, it can be measured. There's a startup in Australia currently who are using human neurons and computer chips, and one of the challenges they have is that it's imprecise, and so it's non-quantized.

So, is there a possibility for cryptography solutions to come out of that kind of technology?

**Vladimir Bubanja:** Yeah, it sounds interesting. I'm not familiar with that particular company; I know another one in Australia, but I'm not exactly sure what they are doing. The issue with quantum communications, though, is not so much this discretization.

One can be in some energy state, but that state still has various possible phases, and disrupting that phase is why qubits are fragile. With regard to the discreteness of the state, usually one is interested in only two states. So maybe in their system it slips into a third state or something like that. Also, with instrumentation, if the separation between levels is not large enough, they may have the issue that they interpret it as not discretized because they cannot distinguish which state it is in, or the third state is close to the second one; issues like that can happen. It is usually necessary to have two well-defined states so that one can tell which state the system is in, but it also has to be a suitable system that can be put into a superposition, in both states at once.

So one needs to choose such systems appropriately.

**David Ding:** Yeah. And so with, with, when something is in a superposition, are those positions still deterministic?

**Vladimir Bubanja:** Yeah, when it is in a superposition, one can evolve the system deterministically, yes. One can perform some operation and say, I want to move more this way or that way, using suitable equipment.

So it can be evolved deterministically.

**David Ding:** Okay, interesting. So I've got a question from Jeff here. Actually, if you just exit your screen share, it'll be a slightly better view. So Jeff's got a question. What did you make of the LK99 drama?

Okay. So, does that make any sense to you? What did you make of the LK99 drama?

**Vladimir Bubanja:** LK99. Actually, I'm not familiar. Can you maybe explain that acronym?

**Jeff:** Yeah, the Lee Kim superconducting room temperature material. What did you make of that saga coming out of Korea? And do you think that there's opportunity there to refine this?

**Vladimir Bubanja:** Yeah, now I can see the chat questions. So can you please explain that again, there was a leak in superconducting. Can you please explain that? Repeat the question

**Jeff:** yeah, did you think that there was promise in that, in that research direction?

**Vladimir Bubanja:** Huh, yeah. I don't remember that particular paper, but certainly there are developments. Actually, here at Callaghan Innovation there was work done, and the spinoff company HTS-110 was formed around high-temperature superconductors. These are still low temperatures, but certainly there has been considerable advancement in that direction.

And actually, there are a number of materials that can be superconducting at room temperature, but they require very high pressures. At very high pressures, a number of elements become superconducting, and so the goal is to find materials that don't require these high pressures.

And, yeah, I don't remember that particular paper now, but certainly over the years people have increased these so-called critical temperatures. In fact, quantum computers are projected to be used to investigate and find new materials for these applications.

Supercomputers are, for example, much better than desktop machines at performing some studies of materials, but they still cannot truly, realistically simulate quantum materials. So for a range of applications, such as discovering room-temperature superconductors, people have hoped that this will be one useful application of quantum computers.

And there is a class of quantum computers people call quantum simulators, which can be used to simulate various materials and other systems quantum mechanically. Yeah.

**David Ding:** Okay. Thanks for that. So Paul's got a question. I understand that programming quantum computer systems requires very different approaches to those that we understand.

Is there any movement to create standards?

**Vladimir Bubanja:** Yes, that's right. They look somewhat different, and operations on these qubits are different from, for example, flipping bits. In terms of standards, there are various websites that gradually introduce how to draw these diagrams, which are more or less standard; I would say it is not an official standard nowadays, but it is widely used. I'm not sure if at some stage there will be, but probably there will be some kind of standardization of that. There is, for example, standardization in terms of the devices being developed: people are planning to introduce standards for these quantum communications.

That means definitions in the protocols of the properties of these systems; one of those properties is entanglement, so there will be some standardization of what is meant by entanglement measures and so on, and in software as well. Yeah, I think it's a good point there.

There will be some standardization. Not at this moment, but I think there will be.

**David Ding:** Okay, so I think the thing that a lot of people will be thinking about is: are quantum computers going to break Web3? What are the risks, and, you know, how do we mitigate those?

**Vladimir Bubanja:** Yeah. So Web3, as far as I know, will be based on this cryptography that is used; for example, Bitcoin, all cryptocurrencies, blockchain technologies, all of that will be based on what will be this post-quantum cryptography that is being introduced. The term "post-quantum" is a little misleading.

It sounds like "after quantum" or something like that. In fact, the better term is maybe "quantum resistant", because these schemes might turn out to be breakable, but at present nobody knows algorithms to break them, even on quantum computers. So it is generally believed that this lattice cryptography, for example, is what will be used.

These processes actually take several years to replace all the existing cryptographic protocols, but it is believed that these will be safe to use even when quantum computers come online. What is currently happening, in general, is that although people cannot break the existing protocols, there are, for example, nation states that are using what is called "harvest now, decrypt later".

It is easy to copy and store a lot of communications now, and then once quantum computers become of sufficient size, they can be decrypted. Some communications, like diplomatic communications, are meant to be secure 30 years from now, or even forever, and for that reason people are in this process of standardizing post-quantum, or quantum-resistant, cryptography. So blockchain and Web3 will use this post-quantum cryptography, which is now being standardized.

**David Ding:** So this technology is symbiotic. It's not going to replace anything like Web3 or blockchain, no?

**Vladimir Bubanja:** Yeah.

**David Ding:** So what does the next evolution look like once there is, you know, a production-ready version of what you're calling the quantum internet?

**Vladimir Bubanja:** Yeah, so it is hoped that if one has these large-scale quantum computers, they can be used for many useful applications: designing new products, or, as we mentioned, finding high-temperature or room-temperature superconductors; using quantum sensors that can detect dangerous substances or just monitor temperature around the world, et cetera.

And all these communications can be connected using those quantum key distribution protocols, which cannot be intercepted. There can still be classical computers used for the tasks they are used for now; some of those will be done more efficiently on quantum computers, but the classical ones can still operate.

So it will be like an enlargement, another layer on top of, say, Web3, for other functions.

**David Ding:** Yep. In the Web3 space it's quite common, people are trying to get the space in between verifiable transactions as small as possible, you know, a nano-transaction. And when I think about the quantum factor, it kind of makes that space potentially unrecognizable, but there's a capacity problem with that. Does the quantum internet solve the capacity problem of verifiable transactions?

**Vladimir Bubanja:** Yeah. So there are various things that people think about for these verifiable transactions. For example, in cryptocurrency there are proof of work and proof of stake. Actually, I will just share the screen here.

So there is this area of random number generation, which is used in various protocols on the internet and is also required in verifying transactions, et cetera. People are thinking of adding another scheme, which is called proof of randomness. In general, random numbers have an important role in cryptography.

I just want to mention the development of quantum random number generation. Random number generators can be software based or hardware based. Currently, computers use some mathematical function; these are called pseudo-random number generators. There is a mathematical function that outputs numbers which, to an observer who doesn't know the function and its input, look random.

But they are actually deterministic: if one starts with the same seed, the same output always comes out. The numbers merely look difficult to predict if the observer doesn't know the function, et cetera. There are also hardware generators that are used for various applications.
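A minimal pseudo-random generator makes that determinism concrete. This sketch is a linear congruential generator with the classic Numerical Recipes constants; same seed in, identical "random" stream out:

```python
# A minimal linear congruential generator (LCG): a deterministic
# recurrence x -> (a*x + c) mod m whose outputs look random but
# repeat exactly from the same seed.

def lcg(seed, n, a=1664525, c=1013904223, m=2**32):
    out = []
    x = seed
    for _ in range(n):
        x = (a * x + c) % m
        out.append(x)
    return out

# Same seed -> identical stream: pseudo-random, not truly random.
print(lcg(42, 3) == lcg(42, 3))  # True
```

Real cryptographic PRNGs are far more sophisticated, but they share this property: an observer who learns the internal state can predict every future output.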

Again, to an observer who doesn't know the details of the process, things like a coin toss, roulette, or temperature variations look random, but in fact everything in classical physics is deterministic: if we know the exact initial position of the coin and the force with which we flip it, one can actually calculate what the outcome will be.

In quantum mechanics, on the other hand, there is a non-deterministic part in the measurement. And I just want to mention here that with the Internet of Things and many things connecting, there have been a number of attacks on random number generation. For example, it was known that Bitcoin on Android was possible to hack.

Then Sony had some PlayStation hacks, and there are a number of known attacks on random numbers. And this is another one: I mentioned elliptic curves; they can be used to generate random numbers, and this is also used for what you asked about, verifying transactions.

There was a well-known, so-called elliptic curve deterministic random bit generator that was published by NIST, and it had a backdoor. This received a lot of media attention a couple of years ago, because it was actually required for those who need certification under the Federal Information Processing Standards.

So there are known attacks on random numbers. I have mentioned that qubits can be in a superposition, simultaneously in states 0 and 1, and measurement of that is stochastic, so it cannot be predicted. If the qubit is here on the equator, an equal composition of 0 and 1, and one measures it, the result will be 0 or 1, but which one cannot be predicted.
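A classical program can mimic, though never reproduce, those measurement statistics. This sketch samples 0 or 1 with equal probability, standing in for repeated measurements of an equator-state qubit; in real QRNG hardware the unpredictability comes from the physical measurement itself, not from a software generator:

```python
import random

# Simulated measurement statistics of a qubit on the Bloch-sphere
# equator: each measurement yields 0 or 1 with probability 1/2.
# NOTE: a classical RNG only imitates the statistics; it cannot supply
# the fundamental unpredictability of a quantum measurement.

p0 = 0.5  # probability of outcome 0 for an equal superposition
rng = random.Random()  # software stand-in for the physical source

trials = 100_000
bits = [0 if rng.random() < p0 else 1 for _ in range(trials)]
print(sum(bits) / trials)  # close to 0.5
```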

So that is the stochastic part. And actually MSL, in the light standards area, has started developing a random number generator based on photons. In any case, that is related to the area you asked about, verifying transactions.

**David Ding:** Yeah, very interesting. I think, you know, because at the end of the day, you can select something that's nondeterministic, but there has to be a deterministic outcome.

And I think what we're seeing in Web3 now is, rather than people trying to make something totally trusted, it's producing a risk score, you know, out of 10, in each moment. So this transaction is 9 out of 10 trusted, and in each moment it produces another score. If it gets down to a 3 there could be an issue, but then it comes down to the whole system having to be, you know, trusted.

So are you doing any work on trustless systems? Like, systems that don't need to be trusted? Or is it more just to do with the cryptography?

**Vladimir Bubanja:** Yeah, we are not actually working in that area, but I think it's an important area, these systems that you describe. For us, we are concentrating on developing devices that are very accurate and sit at the top of the metrology pyramid,

to ensure the accuracy of measurements in New Zealand. These devices rest on the same quantum principles as, for example, qubits and the things that have these other applications, but we concentrate on these devices and their principles. Actually, there is a question here: can New Zealand become world leading in quantum tech?

It's a good question. Currently there is the Dodd-Walls Centre for Photonic and Quantum Technologies, which is a kind of centre where people like us from MSL and the universities, different groups, are working in this area, and some of the work produced there is, in general, world leading, and people are interested in scaling this up.

There are various ideas and projects for commercializing discoveries that have been made, so hopefully it's a bright future.

**David Ding:** So let's say there are some builders watching this and they want to solve some cryptography problems; they've got some hard problems to solve. Is that something that Callaghan can do?

Can they come to us with a problem in cryptography, and can we just solve it?

**Vladimir Bubanja:** Maybe it depends on the exact type of problem. I think there would be some people interested in working on this generally, and probably there are varieties of approaches and problems. So certainly, we are always happy to discuss, and even if we are not working in the particular area, we probably know who does. We are always open to discussing with people.

**David Ding:** Okay, that sounds good. So if there are no more questions, we'll wind it up there. Thanks so much, Dr. Bubanja, we really appreciate your time, and thanks everyone for attending. See you at the next one.

Cheers for now. Thank you. Bye.