What happened during these few decades was the APPLIED PHYSICS-----how to penetrate buildings----how to create public surveillance-----how to use imagery for medical research---penetrating TISSUE. So, the equipment developed to view the furthest reaches of the universe has in these few decades been turned back on EARTH and civil societies. The military DOD of course does that----tied to DEEP, DEEP, REALLY DEEP STATE.
When I do my DEPOSITION for lawsuits tied to NOSY NEIGHBORS AND THE GANG-----who can SEE THROUGH WALLS----telling me they can see through just about everything...........we will take a look at this optical technology----OPTICAL PHYSICS.
GIVE IT UP SAY NOSY NEIGHBORS ----THERE IS NOWHERE TO HIDE-------WE SEE YOU EVERYWHERE INSIDE YOUR LIVING SPACE......
Cameras That Can See Through Walls!
Seeing through walls is no longer the stuff of science fiction! From light field cameras to super slow motion, we're able to grab more information from our d...
The CCD Chip
This video shows the chip of a CCD (Charge-Coupled Device), the device which registers light information coming from distant astronomical objects and transforms it into an image. It is somewhat similar to the sensors that capture images in common digital cameras. The resolution of the images depends on the number of pixels, or picture elements, that the chip has.
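The pixel-counting idea above can be sketched in a few lines of Python. This is a hypothetical toy simulation, not any instrument's actual readout code: each CCD pixel accumulates photoelectrons during an exposure, photon arrivals follow Poisson statistics, and that is why faint sources (distant galaxies, or dim interiors) need long exposures to produce a clean reading.

```python
import numpy as np

# Toy model of a single CCD pixel (illustrative numbers, not real hardware specs).
rng = np.random.default_rng(0)

def expose(flux_photons_per_sec, exposure_s, quantum_efficiency=0.9):
    """Simulate one pixel readout: expected photoelectrons, Poisson-sampled."""
    expected = flux_photons_per_sec * exposure_s * quantum_efficiency
    return rng.poisson(expected)

# Very low light: a short exposure gives few, noisy counts.
dim = [expose(5, 1.0) for _ in range(3)]
# A 100x longer exposure collects far more photons, improving signal-to-noise.
long_exposure = expose(5, 100.0)
print(dim, long_exposure)
```

The relative scatter of a Poisson count shrinks as the count grows, which is the statistical reason long exposures (or large, sensitive pixels) yield clearer images of faint objects.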
We shared a video titled CAMERAS THAT SEE THROUGH WALLS------marketing commercial products anyone can buy-------and said it has lots to do with imaging the COSMOS. This research is tied to the BIG BANG---INFLATIONARY MODEL, with more and more complex OPTICAL PHYSICS and cameras partnered with telescopes like HUBBLE.
The CCD CHIP shown here, as a product tied to just such telescope imagery equipment, shows how VERY LOW LIGHT-----those PHOTONS inside the building in the last video------can be captured to create a clear image of what is in the COSMOS-----or inside a BUILDING.
So, optical technology driven by SPACE EXPLORATION is now driving SEEING THROUGH ANYTHING------but at least for now through WALLS.
Hubble Smart Homes Applications | Hubble Connected
app.hubbleconnected.com
Hubble Connected provides a high-performance Home Monitoring service with unlimited Cloud storage for a minimal subscription fee
Secure cloud storage & free streaming
24-hour, 7-day or 30-day recording plans.
Life: Made Easier. With this optional service, recorded footage is instantly pushed to your smartphone whenever motion is detected in your home. Clips can be accessed at any time and anywhere, thanks to our secure cloud storage. Selected users can enjoy this feature as part of a free trial.
More Memorable. Captured a memorable day with our CVR service? With Hubble Connected you can download the video, then watch it and share it!
More Secure. Hubble Connected is more than just cameras and sensors; it's about peace of mind. Keep an eye on things while at home or away.
WHAT YOU GET
Complete peace of mind with a Hubble Cloud Subscription
Standard Plan: $99 for 12 months
12 months for the price of 10
7-Day Cloud Video History
Up to 5 Connected Cameras
NEW Daily Video Summary / Baby Sleep Diary
25% off cameras, headphones, speakers, accessories at hubbleconnected.com
FREE HD camera when you subscribe today.
How Hubble Space Telescope Works
by Craig Freudenrich, Ph.D. & Sarah Goddard
Hubble's Scientific Instruments: WFPC2, NICMOS and STIS
A picture of the Eagle Nebula, captured by Hubble's main camera, the WFPC2
Photo courtesy STScI and NASA
By looking at the different wavelengths, or the spectrum of light, of a celestial object, you can discern many of its properties. To do this, HST is equipped with several scientific instruments. Each instrument uses charge-coupled devices (CCDs) rather than photographic film to capture the light. The light detected by the CCDs is turned into digital signals, which are stored in onboard computers and relayed to Earth. The digital data are then transformed into amazing photos. Let's look at how each instrument contributes to those images.
The Wide Field and Planetary Camera 2 (WFPC2) is Hubble's main "eye," or camera. It sees with the help of four CCD chips arranged in an "L" shape to catch the light -- three low-resolution, wide-field CCD chips, plus one high-resolution planetary camera CCD chip. All four chips are exposed simultaneously to the target, and the target image is centered on the desired CCD chip. This eye can see visible and ultraviolet light, and can take images through various filters to make natural color pictures, such as this well-known image of the Eagle nebula.
Often, interstellar gas and dust can block our vision of the visible light from various celestial objects. No problem: Hubble can see the infrared light, or heat, from the objects hidden in the dust and gas. To see this infrared light, HST has three sensitive cameras that make up the Near Infrared Camera and Multi-object Spectrometer (NICMOS).
Besides illuminating a celestial object, the light emanating from that object can also reveal what it's made of. The specific colors tell us what elements are present, and the intensity of each color tells us how much of that element is present. The Space Telescope Imaging Spectrograph (STIS) separates the incoming colors of light much as a prism makes a rainbow.
In addition to describing the chemical composition, the spectrum can convey the temperature, density and motion of a celestial object. If the object is moving, the chemical fingerprint may shift toward the blue end (moving toward us) or the red end (moving away from us) of the spectrum. Unfortunately, the STIS lost power in 2004 and has been inactive ever since.
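The red/blue shift described above follows a simple relation between velocity and observed wavelength. Here is a minimal Python sketch of the non-relativistic Doppler formula; the spectral line and velocities are illustrative examples, not values from the article:

```python
C = 299_792.458  # speed of light, km/s

def observed_wavelength(rest_nm, velocity_km_s):
    """Non-relativistic Doppler shift; velocity > 0 means the source is receding."""
    return rest_nm * (1 + velocity_km_s / C)

h_alpha = 656.28  # rest wavelength of the hydrogen-alpha line, nm
print(observed_wavelength(h_alpha, +3000))   # receding: shifted toward the red
print(observed_wavelength(h_alpha, -3000))   # approaching: shifted toward the blue
```

By measuring how far a known chemical "fingerprint" line has moved from its rest wavelength, an instrument like STIS can solve this relation backwards to recover the object's line-of-sight velocity.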
Keep reading to find out what other scientific instruments Hubble has up its telescopic sleeve.
Real-Life Superpower: 'See' Around Corners with Smartphone Tech
By Charles Q. Choi, Live Science Contributor | October 9, 2017 04:56pm ET
In spy novels and superhero films, the ability to see through walls has always been a handy — not to mention, impressive — trick. And now, this tech could be available to people in real life, with smartphone cameras that can help detect moving objects even if they are hidden around corners, according to a new study.
This futuristic-sounding tech could one day help vehicles see around blind corners, the researchers said.
"We may eventually be able to use this idea to alert drivers to pedestrians or cars that are about to dart out from behind buildings into a driver's path. Perhaps a few seconds of notice could save lives," said study lead author Katie Bouman, an imaging scientist at the Massachusetts Institute of Technology's Computer Science and Artificial Intelligence Laboratory.
"Search and rescue, or helping to understand what is going on behind a wall in a hostage situation, are also potential applications," Bouman added.
Researchers have taken many different approaches in trying to make the "superpower" of seeing around corners a reality. For example, in 2015, researchers showed they could use lasers to see objects around corners by firing light pulses at surfaces near the items. Those surfaces could act like mirrors, scattering the laser pulses onto any hidden objects. By analyzing the light that was reflected off the objects and other surfaces back onto the scanners, researchers could reconstruct the shapes of the hidden items.
Although most strategies for seeing around corners "are really great ideas," they also "usually require complex modeling [or] specialized hardware, or are computationally expensive," Bouman told Live Science. The 2015 study's technique, for example, required both extremely fast lasers and extraordinarily sensitive cameras.
But Bouman and her colleagues' method for seeing around corners simply uses a smartphone camera.
"We use light naturally in the scene and do not have to introduce our own light to probe the hidden scene," Bouman said. "This allows us to use common consumer cameras and not specialized equipment to see around corners."
The new system, known as CornerCameras, analyzes light that is reflected off objects hidden around corners and that falls on the ground within the line of sight of the camera. This light is called the "penumbra."
The system works by analyzing light at the edge of walls, which is impacted by the reflections of objects around the corner from the camera.
Credit: MIT CSAIL
The system analyzes this penumbra over several seconds, stitching together dozens of distinct images, according to the study. This data helps the system measure the speed and trajectory of objects around corners in real time. (It does not see any identifying details about those objects — just the fact that they are moving.)
"I think the biggest surprise was that the system worked well in situations that I would not have expected," Bouman said. "For instance, once, during filming, it started raining. This caused big raindrops to start appearing on the ground, changing the color of the concrete floor."
Because CornerCameras is trying to analyze light signals that are just 0.1 percent of the total brightness of the ground, "I thought these raindrops would wipe out any signal we had," Bouman said. However, CornerCameras analyzes the data of a scene across dozens of images, so "the effect of the raindrops was essentially averaged out."
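The averaging effect Bouman describes can be illustrated with a toy simulation (assumed numbers, not the CornerCameras code): a steady signal that is only about 0.1 percent of the background brightness survives per-frame disturbances like raindrops, because uncorrelated noise shrinks roughly as the square root of the number of frames averaged while the signal does not.

```python
import numpy as np

rng = np.random.default_rng(1)

background = 1.0
signal = 0.001 * background     # ~0.1% of total brightness, as in the article
n_frames = 64                   # "dozens of images"

# Each frame = background + faint signal + transient noise much larger than the signal.
frames = background + signal + rng.normal(0, 0.01, size=n_frames)

single_error = abs(frames[0] - (background + signal))   # one frame: noise dominates
stacked = frames.mean()                                  # averaging suppresses noise
stacked_error = abs(stacked - (background + signal))
print(single_error, stacked_error)
```

With 64 frames the residual noise in the stacked estimate is roughly 1/8 of a single frame's, which is why the raindrops were "essentially averaged out."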
One current limitation of CornerCameras is that it requires a stationary camera that's held very steady. "In many situations, such as in a collision-avoidance system on a car, you do not have the luxury of a stationary camera," Bouman said. The researchers are now focused on getting the system to work first on a moving wheelchair and eventually on a moving car, she said.
Future research will also aim to make CornerCameras work in a variety of lighting situations, or in changing lighting conditions, such as when clouds overhead constantly move in front of the sun. "Getting the system to work in these scenarios would open up the possibility of it being able to be used by a person with a handheld smartphone," Bouman said.
Bouman and her colleagues will detail their findings on Oct. 25 at the International Conference on Computer Vision in Venice, Italy.