Learn some best practices for engineering projects that have huge amounts of data. Data analytics tools are crucial for project success! Listen in on today’s EEs Talk Tech electrical engineering podcast.
It seems most large labs have a go-to data person. You know, the one who had to upgrade his PC so it could handle insanely complex Excel pivot tables? In large electrical engineering R&D labs, measurement data can often be inaccessible and unreliable.
2:00 – A hobbyist in the garage may still have a lot of data. But, because it’s a one-person team, the data is much easier to handle.
Medium and large size teams generate a lot of data. There are a lot of prototypes, tests, etc.
3:25 – The best teams manage their data efficiently. They are able to make quick, informed decisions.
4:25 – A manager told Brad, “I would rather re-make the measurements because I don’t trust the data that we have.”
6:00 – Separate the properties from the measurements. Separate the data from the metadata. Separating data from production lines, prototype units, etc. helps us at Keysight make good engineering decisions.
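To make that concrete, here’s a minimal Python sketch of keeping the metadata about a test separate from the measurement samples themselves. The class names and fields are hypothetical, not from the episode:

```python
from dataclasses import dataclass, field

@dataclass
class Measurement:
    samples: list  # the raw measurement data itself

@dataclass
class MeasurementRecord:
    # Metadata: properties of the test, kept apart from the bulk data
    prototype_id: str
    production_line: str
    instrument: str
    measurement: Measurement = field(default_factory=lambda: Measurement([]))

rec = MeasurementRecord("proto-7", "line-A", "scope-01",
                        Measurement([0.1, 0.2, 0.15]))
# Queries over metadata never need to touch the bulk samples:
print(rec.production_line)  # "line-A"
```

Because the properties live apart from the samples, you can filter and compare across production lines or prototype units without loading the raw data.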
9:30 – Data analytics helps for analyzing simulation data before tape out of a chip.
10:30 – It’s common to have multiple IT people managing a specific project.
11:00 – Engineering companies should use a data analytics tool that is data and domain agnostic.
11:45 – Many teams have an engineer or two that manage data for their teams. Often, it’s the team lead. They often get buried in data analytics instead of engineering and analysis work. It’s a bad investment to have engineers doing IT work.
14:00 – A lot of high speed serial standards have workshops and plugfests, where companies test their products to make sure they are interoperable and see how they stack up against their competitors.
15:30 – We plan to capture industry-wide data and let people see how their project stacks up against the industry as a whole.
16:45 – On the design side, it’s important to see how the design team’s simulation results stack up against the validation team’s empirical results.
18:00 – Data analytics is crucial for manufacturing. About 10% of our R&D tests make it to manufacturing. And, manufacturing has a different set of data and metrics.
19:00 – Do people get hired/fired based on data? In one situation, there was a lack of data being shared that ended up costing the company over $1M and 6 months of time-to-market.
Learn about radar basics and get a peek into the world of aerospace electronic warfare. Hosted by Daniel Bogdanoff and Mike Hoffman, EEs Talk Tech is a twice-monthly engineering podcast discussing tech trends and industry news from an electrical engineer’s perspective.
Phil Gresock, Keysight’s Radar Lead, sits down with us to discuss the basics of radar and give us a peek into the world of aerospace electronic warfare.
00:20 Adaptive cruise control for cars works really well.
1:00 the history of radar – the original radar display was an oscilloscope in WWII. (radar test equipment)
2:00 The rumor that carrots are good for your eyesight was a British misinformation campaign.
2:58 The British had the “chain home radar system” all along the coast that pointed to their western front. They wanted early warning radar because they had limited defensive forces. By knowing what was coming, they could allocate defenses appropriately.
3:50 How does radar work? You send out a pulse that is modulated on a carrier frequency. If that pulse gets reflected back, we can do some math and work out how far away something is.
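The math here is simple: the pulse travels out and back at the speed of light, so the range is half the round-trip distance. A quick sketch (the round-trip time is illustrative):

```python
C = 299_792_458.0  # speed of light, m/s

def radar_range(round_trip_seconds: float) -> float:
    """Range = c * t / 2 (the pulse travels out and back)."""
    return C * round_trip_seconds / 2.0

# A 1 ms round trip corresponds to roughly 150 km:
print(radar_range(1e-3))
```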
4:30 Typically, there’s a specific frequency used. For long range radar, like search and early warning radar, a lower frequency is used.
5:15 What does a modern radar system look like?
It depends on the application. Early warning systems are often anchored on old oil rigs. The rigs have a radome installed on them.
6:25 How does radar detect something so small and so far away? A lot of it depends on the frequencies and processing techniques you use.
6:40 There are some radar techniques you can use, for example bouncing off of the sea, the earth, the troposphere.
7:15 Radar also has some navigational benefits. For example, wind shear flying into Breckenridge airport. A change in medium is measurable.
8:10 Radars also get installed on missiles to do some last-minute corrections.
8:35 Ultimately, the goal of radar is to detect something. You’re trying to figure out range, azimuth, elevation, velocity, etc.
Different target sizes and ranges require different pulse widths, different frequencies, etc.
Azimuth is easy to determine because you know what direction your radar is pointing.
To detect velocity with radar you can use doppler shift.
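As a rough sketch of the Doppler relationship (the carrier frequency and shift below are illustrative, not from the episode):

```python
C = 299_792_458.0  # speed of light, m/s

def doppler_velocity(f_doppler_hz: float, carrier_hz: float) -> float:
    """Radial velocity from Doppler shift: v = f_d * wavelength / 2."""
    wavelength = C / carrier_hz
    return f_doppler_hz * wavelength / 2.0

# e.g. a 2 kHz shift on a 10 GHz radar -> ~30 m/s closing speed
print(doppler_velocity(2e3, 10e9))
```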
10:30 Radar cross section analysis gives even more information.
11:00 There are spheres in space for radar calibration. You can send pulses to the sphere and measure what you get back.
Radar calibration sphere in low earth orbit:
http://www.dtic.mil/docs/citations/ADA532032 (for full paper, click the “full text” link)
11:40 There are also reflectors on the moon so you can use laser telescopes to measure the reflection.
Mirrors on the moon:
12:30 NASA put reflectors in space.
12:58 So, you send a pulse out and get a return signal, but there was a scattering effect. There are libraries for what a return pulse for different objects looks like so you can identify what you are looking at.
14:00 Radar counter intelligence techniques.
First, you have to know you are being painted by radar. Military jets have a number of antennas all around them. And, you generally know what radars are being used in a theater of operation, so there will be a warning that lets you know you are being painted by a certain type of radar.
15:30 Get Daniel on a fighter jet
16:05 How do you stop your radar from being detected or interfered with? There are a few techniques.
Radar frequency hopping is changing the frequency used from pulse to pulse.
Radar frequency modulation changes the modulation pulse to pulse – phase shifts, amplitude changes, frequency chirps, etc.
This helps avoid detection, get better performance, or reduce susceptibility to jamming.
If you know how your radar responds to different signals, you have a lot of flexibility in what signal you use.
How do you spoof a radar? You have to know what is incident upon you and know how that will act over time. You can send out pulses advanced or lagging in time or with different Doppler shifts to give misinformation to the receiver.
You can also drown out the radar’s pulses so that your transmitted pulses get read instead of your actual reflections.
You have to have an intimate understanding of the radar you’re trying to defeat, a good system to handle that quickly, and good knowledge that something is actually happening.
We need radar profile flash cards.
Radar peak powers range anywhere from kilowatts to megawatts.
21:10 A recent US Navy ship had a new hull design, and it has to turn on a beacon because it produces so few radar reflections.
We sit down with Phil Gresock to talk about the basics of RF for “DC plebeians.” Learn about RF designs, radio frequencies, RADAR, GPS, and RF terms you need to know in today’s electrical engineering podcast!
RF stands for radio frequency
00:40 Phil Gresock was an RF application engineer
1:15 Everything is time domain, but a lot of RF testing tools end up being frequency domain oriented
2:15 Think about radio, for example. A tall radio tower isn’t actually one big antenna!
7:00 Communication is just one use case. RADAR also is an RF application.
8:10 The principles between RF and DC or digital use models are very similar, but the words we use tend to be different.
Bandwidth for oscilloscopes means DC to a frequency, but for RF it means the analysis bandwidth around a center frequency
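A tiny sketch of the two conventions (function names and values are illustrative):

```python
def scope_band(bandwidth_hz):
    """Oscilloscope 'bandwidth': DC up to the stated frequency."""
    return (0.0, bandwidth_hz)

def rf_band(center_hz, analysis_bw_hz):
    """RF 'bandwidth': an analysis span centered on the carrier."""
    return (center_hz - analysis_bw_hz / 2, center_hz + analysis_bw_hz / 2)

print(scope_band(1e9))        # a "1 GHz" scope covers 0 Hz to 1 GHz
print(rf_band(2.4e9, 100e6))  # 100 MHz of analysis bandwidth at 2.4 GHz
```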
9:22 Cellular and FCC allocation chart will talk about different “channels.”
Channel in the RF world refers to frequency ranges, but in the DC domain it typically refers to a specific input.
10:25 Basic RF block diagram:
First, there’s an input from an FPGA or data creating device. Then, the signal gets mixed with a local oscillator (LO). That then connects to a transmission medium, like a fiber optic cable or through the air.
Cable TV is an RF signal that is cabled, not wireless.
Then, the transmitted signal connects to an RF downconverter, which is basically another mixer, and that gets fed into a processing block.
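Conceptually, an ideal mixer just multiplies the signal by the local oscillator, which produces sum and difference frequencies. A small Python sketch of that identity (the frequencies are illustrative, not from the episode):

```python
import math

f_sig, f_lo, fs = 1e6, 10e6, 100e6  # 1 MHz signal, 10 MHz LO, 100 MS/s
n = 1000
# Multiplying two cosines: cos(a)*cos(b) = 0.5*cos(a-b) + 0.5*cos(a+b)
mixed = [math.cos(2*math.pi*f_sig*t/fs) * math.cos(2*math.pi*f_lo*t/fs)
         for t in range(n)]

# Check one sample against the sum/difference identity (9 MHz and 11 MHz):
t = 123
expected = 0.5*math.cos(2*math.pi*(f_lo - f_sig)*t/fs) \
         + 0.5*math.cos(2*math.pi*(f_lo + f_sig)*t/fs)
assert abs(mixed[t] - expected) < 1e-9
print("mixer output contains 9 MHz and 11 MHz components")
```

The downconverter works the same way in reverse: mixing the received signal with an LO shifts it back down to a frequency the processing block can handle.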
Wide bandgap semiconductors, like Gallium Nitride (GaN) and Silicon Carbide (SiC) are shaping the future of power electronics by boosting power efficiency and reducing physical footprint. Server farms, alternative energy sources, and electrical grids will all be affected! Mike Hoffman and Daniel Bogdanoff sit down with Kenny Johnson to discuss in today’s electrical engineering podcast.
3:00 What is a wide bandgap semiconductor? GaN (Gallium Nitride) devices and SiC (Silicon Carbide) can switch on and off much faster than typical silicon power devices. Wide bandgap semiconductors also have better thermal conductivity. And, wide bandgap semiconductors have a significantly lower drain-source resistance (R-on).
For switch mode power supplies, the transistor switch time is the key source of inefficiency. So, switching faster makes things more efficient.
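A back-of-the-envelope sketch of that relationship, using a common first-order switching-loss approximation. All the component values below are made up for illustration, not taken from the episode:

```python
def switching_loss_w(v_ds, i_d, t_rise, t_fall, f_sw):
    """First-order hard-switching loss: P_sw ~ 0.5 * V * I * (t_r + t_f) * f_sw."""
    return 0.5 * v_ds * i_d * (t_rise + t_fall) * f_sw

# 400 V, 10 A, switching at 100 kHz:
si  = switching_loss_w(400, 10, 50e-9, 50e-9, 100e3)  # slower silicon FET
gan = switching_loss_w(400, 10, 5e-9, 5e-9, 100e3)    # ~10x faster GaN part
print(si, gan)  # the faster transitions cut the loss by the same factor
```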
4:00 They will also reduce the size of power electronics.
6:30 Wide bandgap semiconductors have a very fast rise time, which can cause EMI and RFI problems. The high switching speed also means they can’t handle much parasitic inductance. So, today’s IC packaging technology isn’t ideal.
8:30 Wide bandgap semiconductors are enabling the smart grid. The smart grid essentially means only powering things that are in use, and turning power off completely when they aren’t.
9:35 Wide bandgap semiconductors will probably be integrated into server farms before they are used in power grid distribution or in homes.
10:20 Google uses a lot of power: an estimated 2.3 TWh (terawatt-hours) per year.
NYT article: http://www.nytimes.com/2011/09/09/technology/google-details-and-defends-its-use-of-electricity.html
It’s estimated Google has 900,000 servers, and that accounts for maybe 1% of the world’s servers.
So, they are willing to put in the investment to work out the details of this technology.
11:50 The US Department of Energy wants people to get an advanced degree in power electronics. Countries want to have technology leadership in this area.
13:00 Wide bandgap semiconductors are also very important for wind farms and other alternative forms of energy.
Having a solid switch mode power supply means that you don’t have to have extra capacity.
USA Dept of Energy: if wide bandgap semiconductors took over industrial motor systems, it would save an enormous amount of energy.
14:45 A huge percentage of the world’s power is consumed by electrical pumps.
16:20 Kenny’s oldest son works for a company that goes around and shows companies how to recover energy costs.
There aren’t many tools available for measuring wide bandgap semiconductor power electronics.
19:30 Utilities and servers are the two main industries that will initially adopt wide bandgap semiconductors
20:35 When will this technology get implemented in the real world? There are parts available today, but it probably won’t be viable for roughly 2-5 years.
21:00 Devices with fast switching are beneficial, but have their own set of problems. The faster a device switches, the more EMI and RFI you have to deal with.
Spread spectrum clocking is a technique used to pass EMI compliance.
24:00 Band gaps of different materials:
Diamond: 5.5 eV
Gallium Nitride (GaN): 3.4 eV
Silicon Carbide (SiC): 3.3 eV
Learn about parallel computing, the rise of heterogeneous processing (also known as hybrid processing), and the prospect of quantum engineering as a field of study in today’s EEs Talk Tech electrical engineering podcast!
Parallel computing used to be a way of sharing tasks between processor cores.
When processor clock rates stopped increasing, the response of the microprocessor companies was to increase the number of cores on a chip to increase throughput.
But now, specialized processing elements have become more popular.
A GPU is a good example of this. A GPU is very different from an x86 or ARM processor and is tuned for a different type of processing.
GPUs are very good at matrix math and vector math. Originally, they were designed to process pixels. They use a lot of floating point math because the math behind how a pixel value is computed is very complex.
A GPU is very useful if you have a number of identical operations you have to calculate at the same time.
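As a conceptual sketch (plain Python standing in for a GPU kernel), each element of a SAXPY-style operation is independent, so a GPU could compute all of them simultaneously:

```python
def saxpy(a, xs, ys):
    """y = a*x + y elementwise. There is no dependency between iterations,
    so every element could be computed at the same time."""
    return [a * x + y for x, y in zip(xs, ys)]

print(saxpy(2.0, [1, 2, 3], [10, 20, 30]))  # [12.0, 24.0, 36.0]
```

Contrast this with a running sum, where each step depends on the previous one; that linear dependency chain is the kind of workload a traditional core handles better.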
GPUs used to be external daughter cards, but in the last year or two GPU manufacturers have started releasing low power parts suitable for embedded applications. These include several traditional cores and a GPU.
So, now you can build embedded systems that take advantage of machine learning algorithms that would have traditionally required too much processing power and too much thermal power.
This is an example of a heterogeneous processor (AMD) or hybrid processor. A heterogeneous processor contains cores of different types, and a software architect figures out which types of workloads are processed by which type of core.
Professor Andrew Chen has predicted that this will increase in popularity because it’s become difficult to take advantage of shrinking the semiconductor feature size.
This year or next year, we will start to see heterogeneous processors with multiple types of cores.
Traditional processors are tuned for algorithms on integer and floating point operations where there isn’t an advantage to doing more than one thing at a time. The dependency chain is very linear.
A GPU is good at doing multiple computations at the same time so it can be useful when there aren’t tight dependency chains.
Neither processor is very good at doing real-time processing. If you have real time constraints – the latency between an ADC and the “answer” returned by the system must be short – there is a lot of computing required right now. So, a new type of digital hardware is required. Right now, ASICs and FPGAs tend to fill that gap, as we’ve discussed in the All about ASICs podcast.
Quantum cores (like we discussed in the what is quantum computing podcast) are something that we could see on processor boards at some point. Dedicated quantum computers that can exceed the performance of traditional computers will likely be introduced within the next 50 years, perhaps as soon as the next 10 or 15.
To be a consumer product, a quantum computer would have to be a solid state device, but such a device is purely speculative at this point in time.
Quantum computing is reinventing how processing happens. And, quantum computers are going to tackle very different types of problems than conventional computers.
There is a catalog on the web of problems and algorithms that would run substantially better on a quantum computer than on a traditional computer.
People are creating algorithms for computers that don’t even exist yet.
The Economist estimated that total global spending on quantum computing research is over $1 billion per year. A huge portion of that interest is driven by the promise of these algorithms and papers.
Quantum computers will not completely replace typical processors.
Lee’s opinion is that the quantum computing industry is still very speculative, but the upsides are so great that neither the incumbent large computing companies nor the industrialized countries want to be left behind if it does take off.
The promise of quantum computing is beyond just the commercial industry, it’s international and inter-industry. You can find long whitepapers from all sorts of different governments laying out a quantum computing research strategy. There’s also a lot of venture capitalists investing in quantum computing.
Is this research and development public, or is there a lot of proprietary information out there? It’s a mixture: many of the startups and companies are open sourcing software components and claim to have “bits of physics” working (quantum bits, or qubits), but they are definitely keeping trade secrets.
19:50 Quantum communication means space lasers.
Engineering with quantum effects has promise as an industry. One can send photons with entangled states. The Chinese government has a satellite that can generate these photons and send them to base stations. If anyone intercepts them, it’s detectable because the wave function collapses too soon.
Quantum sensing promises to develop accelerometers and gyroscopes that are orders of magnitude more sensitive than what’s commercially available today.
Quantum engineering could become a new field. Electrical engineering was born roughly 140 years ago, electronics roughly 70 years ago, and computer science was born out of math and electrical engineering. It’s possible that the birth of quantum engineering will be dated to some point in the last 5 years or the next 5.
Lee’s favorite quantum state is the Bell state. It’s the equal probability state between 1 and 0, among other interesting properties. The Bell state encapsulates a lot of the quantum weirdness in one snippet of math.
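Written out, one common Bell state (standard notation, not from the episode) is:

```latex
\[
  \lvert \Phi^{+} \rangle \;=\; \frac{1}{\sqrt{2}}\bigl(\lvert 00 \rangle + \lvert 11 \rangle\bigr),
  \qquad
  P(00) = P(11) = \Bigl|\tfrac{1}{\sqrt{2}}\Bigr|^{2} = \tfrac{1}{2}.
\]
```

Measuring it gives 00 or 11 with equal probability, and the two qubits are entangled: measuring one instantly determines the other.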
00:40 Lee talks about how to crack RSA and Shor’s algorithm (wikipedia)
00:50 The history of quantum computing (wiki). The first person to propose it was Richard Feynman in the early 1980s. There was some interest, but it died out.
In the 1990s, Peter Shor published a paper pointing out that if you could build a quantum computer with certain operational properties (machine code instructions), then you could find one factor of a number no matter how long it is.
Much of the security we use every day relies on both the RSA public key system and the Diffie-Hellman key exchange algorithm.
HTTPS connections use the Diffie-Hellman key exchange algorithm. RSA is sometimes jokingly expanded as “really secure algorithm,” but it actually stands for its inventors: Rivest, Shamir, and Adleman.
RSA only works if the recipients know each other, but Diffie Hellman works for people who don’t know each other but still want to communicate securely. This is useful because it’s not practical for everyone to have their own RSA keys.
Factoring numbers that are made up of large prime numbers is the basis for RSA. The processing power required for factoring is too large to be practical. People have been working on this for 2500 years.
Shor’s algorithm is theoretically fast enough to break RSA. If you could build a quantum computer with enough quantum bits and operate with a machine language cycle time that is reasonable (us or ms), then it would be possible to factor thousand bit numbers.
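Here’s a classical sketch of the reduction at the heart of Shor’s algorithm: once you know the (even) order r of a modulo N, a gcd computation yields a factor. Below, the order is found by brute force, which is exactly the step that doesn’t scale and that a quantum computer would speed up. The example numbers are illustrative:

```python
from math import gcd

def order(a, n):
    """Smallest r > 0 with a**r = 1 (mod n). Brute force: the slow part."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_via_order(a, n):
    """Shor's classical post-processing: gcd(a**(r/2) +/- 1, n)."""
    r = order(a, n)
    assert r % 2 == 0, "need an even order; try another a"
    return gcd(a**(r // 2) - 1, n), gcd(a**(r // 2) + 1, n)

print(factor_via_order(2, 15))  # (3, 5): the order of 2 mod 15 is 4
```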
Famous professors and famous universities have a huge disparity of opinion as to when a quantum computer of that size could be built. Some say 5-10 years, others say up to 50.
What does a quantum computer look like? It’s easier to describe architecturally than physically. A quantum computer isn’t that much different from a classical computer, it’s simply a co-processor that has to co-exist with current forms of digital electronics.
If you look at Shor’s algorithm, there are a lot of familiar commands, like “if statements” and “for loops.” But, quantum gates, or quantum assembly language operations, are used in the quantum processor. (more about this)
Lee thinks that because a quantum gate operates in time instead of space, the term “gate” isn’t a great name.
What quantum computers exist today? Some have been built, but with only a few quantum bits. The current claim is that people have created quantum computers with up to 21 quantum bits. But, there are potentially a lot of errors and noise. For example, can they actually maintain a proper setup and hold time?
Continuing the Schrodinger’s Cat analogy…
In reality, if you have a piece of physics that you’ve managed to put into a superimposed quantum state, any disturbance of it (photon impact, etc.) will cause it to collapse into an unwanted state or to collapse too early.
So, quantum bits have to be highly isolated from their environments: in vacuums or at extremely cold temperatures (well below 1 kelvin!).
The research companies making big claims about the quantity of bits are not using solid state quantum computers.
The isolation of a quantum computer can’t be perfect, so there’s a limited lifetime for the computation before the probability of getting an error gets too high.
Why do we need a superposition of states, and why does it matter when the superimposed states collapse to one state? If it collapses at the wrong time you’ll get a wrong answer. With Shor’s algorithm it’s easy to check for the right answer: you get either a remainder of 0 or you don’t. If you get 0, the answer is correct. The computation only has to be reliable enough for you to check the answer.
If the probability of getting the right answer is high enough, you can afford to get the wrong answer on occasion.
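A quick illustration of why a checkable answer changes the reliability requirement. The per-run probability below is made up for illustration:

```python
def p_success_after(k, p):
    """Probability of at least one correct (and checkable) answer in k runs."""
    return 1 - (1 - p) ** k

# Even a 40% per-run success rate gives near-certainty after 10 tries:
print(round(p_success_after(10, 0.4), 4))
```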
The probability of the state of a quantum bit isn’t just 50%, so how do you set the probability of the state? It depends on the physical system. You can write to a quantum bit by injecting energy into the system, for example using a very small number of photons as a pulse with a carefully controlled timing and phase.
Keysight helps quantum computer researchers generate and measure pulses with metrological levels of precision.
The pulses have to be very carefully timed and correlated with sub nanosecond accuracy. You need time synchronization between all the bits at once for it to be useful.
What is a quantum bit? Two common kinds of quantum bits are
1. Ions trapped in a vacuum with laser trapping. The ions can’t move because they are held in place by standing waves of laser beams. The vacuum can be at room temperature, but the ions are low temperature because they can’t move.
2. Josephson junctions in tank circuits (a coil and a capacitor) produce oscillations at microwave frequencies. Under the right physical conditions, those can be designed to behave like an abstract two-state quantum system. You just designate zero and one to different states of the system.
Strictly speaking, probabilities are the wrong description; quantum states are described by complex amplitudes.
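A tiny sketch of the distinction (values illustrative): two different amplitudes can give identical probabilities, because the probability is the squared magnitude of the amplitude.

```python
amp0 = complex(1, 0) / 2**0.5   # amplitude of |0>
amp1 = complex(0, 1) / 2**0.5   # amplitude of |1>: same magnitude, different phase

p0, p1 = abs(amp0) ** 2, abs(amp1) ** 2
print(p0, p1)  # both ~0.5: equal probabilities, yet distinct amplitudes
```

The phase information carried by the amplitudes is invisible in the probabilities alone, and it’s exactly what quantum algorithms exploit.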
Pricing a new hardware product in a global economy with regional pricing, psychological factors, and the challenges of pricing in white space. This week’s guest is Brig Asay. Hosted by Daniel Bogdanoff and Mike Hoffman, EEs Talk Tech is a twice-monthly electrical engineering podcast discussing tech trends and industry news from an electrical engineer’s perspective.
Daniel Bogdanoff and Mike Hoffman sit down with Brig Asay to talk about how to price a hardware project. Listen in as they discuss the complexities of pricing a new hardware product in a global economy.
Spreadsheets are the killer of pricing. They compete with your gut feeling.
$10K per GHz of bandwidth is a standard in oscilloscope pricing, but it doesn’t always apply. When we came out with the Infiniium Z-Series, a 63 GHz scope, we knew the market couldn’t support a $630K price.
Laser Netflix delivery, backyard data centers, and how the internet gets delivered to homes and businesses. This week’s podcast guest is optical guru Stefan Loeffler. Hosted by Daniel Bogdanoff and Mike Hoffman, EEs Talk Tech is a twice-monthly engineering podcast discussing tech trends and industry news from an electrical engineer’s perspective.
Laser-delivered Netflix and backyard data centers!
The conversation continues with optical communications guru, Stefan Loeffler. In this episode, Daniel Bogdanoff and Mike Hoffman discuss optical infrastructure today and what the future holds for optics.
Another key spec is how many transistors you can fit in a square millimeter
Metal layers for interconnects are also increasingly important, but they can make the mask sets more expensive
Do we care more about a gate’s footprint or its depth? 4:11
Will Moore’s Law hit a ceiling? 4:29
What about using three dimensional structures? 5:37
Is Moore’s Law just a marketing number? 5:51
What to consider when investing in an ASIC 13:23
What’s the next best alternative to building this ASIC?
With an ASIC, you can often drive lower cost, but you also increase performance and reliability
Is there a return on investment? 14:24
What happens when Moore’s Law hits a dead end with transistors? 14:46
Could we replace electrical with optical? 15:30
Is it possible that there other fundamental devices out there, waiting to be discovered? 16:20
The theoretical fourth device, the memristor 17:00
Will analog design ever die? Mike was told to get into digital design.
How does optical communication work? We sit down with Stefan Loeffler to discuss the basics of optics and its uses for electrical engineering.
Optical communication 101 – learn about the basics of optics! Daniel Bogdanoff and Mike Hoffman interview Stefan Loeffler.
Video Version (YouTube):
Similarities between optical and electrical
Stefan was at OFC
What is optics? 1:21
What is optical communication? 1:30
There’s a sender and a receiver (optical telecommunication)
Usually we use a 9 µm fiber optic cable, but sometimes we use lasers and air as the medium
The transmitter is typically a laser; LEDs don’t work well for optical communication
Optical fiber alignment is challenging, and is often accomplished using robotics
How is optical different from electrical engineering?
Photodiodes act as receivers and use a transimpedance amplifier. It is essentially “electrical in, electrical out” with optical in the middle.
Why do we have optical communication?
A need for long distance communication led to the use of optical.
Communication lines used to follow train tracks, with huts every 80 km where the signals could be regenerated.
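A back-of-the-envelope sketch of why spans were limited: fiber loss accumulates linearly in dB. The 0.2 dB/km figure below is typical for modern single-mode fiber, an assumption rather than something from the episode:

```python
def span_loss_db(length_km, loss_db_per_km=0.2):
    """Total attenuation over a fiber span (loss is linear in dB)."""
    return length_km * loss_db_per_km

loss = span_loss_db(80)           # loss over one 80 km span, in dB
fraction = 10 ** (-loss / 10)     # fraction of optical power remaining
print(loss, round(fraction, 4))   # 16 dB leaves only ~2.5% of the power
```

With only a few percent of the launched power left, the signal needs regeneration (or, in modern systems, optical amplification) before the next span.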