Recently, immersive action cam maker Insta360 has been nurturing a new camera-drone venture called Antigravity. The company has now revealed its first model – a foldable aerial explorer capable of capturing stunning 8K 360-degree footage.
“We didn’t set out to make just another drone,” said BC Nie, Antigravity’s Head of Marketing. “With the A1, we aimed to reinvent the flying experience – making it safe, intuitive, expressive, and infinitely creative for everyone.”
Intuitive Controls with Three Flight Modes
The press release offers limited specifics, but the A1 pairs with wireless Vision goggles and a handheld controller. The pistol-grip remote offers intuitive point-and-fly control with buttons, rockers, switches, a knurled wheel, and three modes: C (cinematic), N (normal), and S (sport).
Image Credits: The Antigravity A1 setup includes the 249-g 360-degree camera drone, a point-and-fly grip controller and a pair of FPV goggles with a funky front viewing screen. Antigravity/Insta360
Vision goggles provide an immersive 360° view, with FreeMotion tech and head tracking enabling intuitive, gesture-based flight control.
In practice, this means you can glance left, right, up, or down as the drone soars, creating the sensation of actually riding aboard the mini aircraft.
Goggles with Shared View and External Power
The goggles have a left-eye display for onlookers, with the right eye blocked. Funky antennas protrude from each side, and the headset is powered by an external battery worn on a lanyard.
Image Credits: The Antigravity A1 “redefines what drones can do by combining an immersive flying experience with intuitive controls.” Antigravity/Insta360
The A1’s dual-lens camera system – akin to Insta360’s X Series action cams – enables live streaming and 360-degree capture, with one fisheye lens mounted on top of the fuselage and another underneath. The purpose of two additional front-facing lenses has not been disclosed.
Immersive 8K Capture with Invisible Drone View
Antigravity promises seamless 360° coverage with the drone removed from view, enabling multi-angle playback and creative post-production options.
It also supports exporting multiple viewpoints from a single recording without quality loss, as well as advanced effects such as dynamic camera moves, Tiny Planet shots, and horizon flips.
Image Credits: The Antigravity A1 folds down for transport in the supplied carry case. Antigravity/Insta360
Lightweight Build, Safety Features, and Launch Timeline
Beyond its 249-g weight and safety features, details about the A1 remain scarce. Full specifications will be released closer to launch, with Antigravity confirming availability by January next year at the latest. Pricing is still under wraps.
During final testing, creators and experts can join to shape future products, with selected participants getting a pre-production A1 and a share of a US$20,000 prize.
A preview of what’s in store can be seen in the video below.
Now you see it: Like that of most animals, owl vision has evolved for survival, and owls perceive the world around them very differently to how humans do. Credit: Pixabay
It’s easy to overlook that most animals perceive the world differently from humans. In fact, thanks to their ability to see infrared or ultraviolet light, many animals experience a world that remains entirely hidden from our sight.
Now, researchers have developed hardware and software that make it possible to record footage as though it were captured through the eyes of animals such as honeybees and birds.
It presents a captivating and revealing perspective on nature and animal behavior, with researchers from the University of Sussex and the Hanley Color Lab at George Mason University anticipating a broad range of applications. Recognizing its potential, they have released the software as open-source, inviting everyone from nature documentary producers and ecologists to outdoor enthusiasts and bird-watchers to explore the unique visual realities of these animals.
Unveiling the Dynamic World of Animal Vision
Senior author Daniel Hanley expressed the team’s enduring fascination with how animals perceive the world. While modern techniques in sensory ecology enable insights into how static scenes might appear to animals, crucial decisions often revolve around dynamic elements, such as detecting food or assessing a potential mate. The introduced hardware and software tools are designed to capture and display animal-perceived colors in motion, benefiting both ecologists and filmmakers.
The camera system comprises (1) a UV-sensitive camera and (2) a visible-light camera, plus (3) the modular cage and (4) the enlarging lens within a recessed (see arrow) custom mount. Here, it’s mounted on the commercially available (5) Novoflex BALPRO bellows system. Vasas et al/PLOS Biology/(CC0 1.0)
The composition of our eyes’ photoreceptors, along with biological components like cones and rods, dictates our vision capabilities, including color and depth perception. Some animals, like vampire bats and mosquitoes, can detect infrared (IR) light, while butterflies and certain birds can see ultraviolet (UV) light—both outside the visible spectrum for humans.
This variance in vision poses challenges for understanding animal behavior and assessing our inadvertent impact on their ability to communicate, find food, shelter, or a mate. Current methods, such as spectrophotometry, have limitations—they are time-consuming, dependent on specific lighting conditions, and unable to capture moving images, hindering our ability to comprehend their perspective fully.
Capturing the World Through Animal Eyes
Herein lies the distinction in the researchers’ innovative approach. They have meticulously designed a tool utilizing multispectral photography, capable of capturing light across various wavelengths, including those in the infrared (IR) and ultraviolet (UV) ranges. The camera records videos in four color channels – blue, green, red, and UV – and subsequently processes them to produce footage that simulates the visual experience through the eyes of a specific animal, taking into account our understanding of their eye receptors.
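The per-receptor processing the researchers describe can be pictured as a weighted combination of the camera’s four recorded channels. The sketch below is illustrative only – the sensitivity weights are made-up placeholders, not the team’s published calibration, which integrates measured spectral sensitivity curves for each species:

```python
import numpy as np

# Hypothetical sensitivity weights mapping the camera's four channels
# (UV, blue, green, red) onto a honeybee's three photoreceptor types
# (UV-, blue- and green-sensitive). The numbers are placeholders for
# illustration, not measured values.
SENSITIVITY = np.array([
    [0.90, 0.10, 0.00, 0.00],  # UV receptor
    [0.05, 0.85, 0.10, 0.00],  # blue receptor
    [0.00, 0.10, 0.80, 0.10],  # green receptor
])

def quantum_catches(frame: np.ndarray) -> np.ndarray:
    """Map an (H, W, 4) multispectral frame (UV, B, G, R channels)
    to (H, W, 3) per-receptor quantum-catch estimates."""
    return frame @ SENSITIVITY.T

# Toy 4x4 frame with four channels, values in [0, 1]
frame = np.random.rand(4, 4, 4)
catches = quantum_catches(frame)
print(catches.shape)  # (4, 4, 3)
```

Applying such a transform frame by frame is what turns raw multispectral video into a moving estimate of what a given animal’s receptors would register.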
Video recordings can produce accurate estimates of animal quantum catches specific to their vision spectrum range. In this case, for the honeybee (left) and average ultraviolet-sensitive bird (right) Vasas et al/PLOS Biology/(CC0 1.0)
Separating UV and Visible Light
The team devised a portable 3D-printed device housing a beam splitter that separates ultraviolet (UV) from visible light, each captured by a dedicated camera. The UV-sensitive camera alone does not record a perceptible image, but combined with the visible-light camera, the pair jointly captures high-quality video. Algorithms align the footage and present visuals from the perspective of various animals’ sight, demonstrating an average accuracy of 92%, with some tests exceeding 99%.
However, the hardware is designed to be compatible with commercially available cameras, and the researchers have shared the software as open-source, hoping others will adapt it for their specific wildlife filming requirements.
Despite limitations such as the inability to capture polarized light and a restricted frame rate, making it challenging for fast-moving subjects, the system provides unique insights to enhance our comprehension of animal behavior and guide us in mitigating our impact on the natural world.
And regarding the footage?
The team recorded a museum specimen of a Phoebis philea butterfly using avian receptor noise-limited (RNL) false colors. The researchers pointed out: “Another possible application of the system is the rapid digitization of museum specimens. This butterfly exhibits UV coloration through both pigments and structures. Vivid magenta hues emphasize the areas predominantly reflecting UV light, while purple regions reflect similar amounts of UV and long-wavelength light. The specimen is positioned on a stand and rotated slowly, illustrating the dynamic changes in iridescent colors based on the viewing angle.”
How birds see butterflies
An anti-predator display by a caterpillar as seen in the vision of Apis (bee).
Spectral Challenges and Aposematic Signals
The researchers remarked, “Conceal and reveal displays can present challenges for spectroscopy and standard multispectral photography.” They presented a video featuring a black swallowtail Papilio polyxenes caterpillar exhibiting its osmeteria. The video was rendered in honeybee false colors, where UV, blue, and green quantum catches are represented as blue, green, and red, respectively. The human-perceived yellow osmeteria and yellow spots on the caterpillar’s back, both strongly reflecting in the UV, appear magenta in honeybee false colors (as the robust responses on the honeybee’s UV-sensitive and green-sensitive photoreceptors are depicted as blue and red, respectively). Given that many caterpillar predators perceive UV, this coloration may serve as an effective aposematic signal.
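The honeybee false-color convention described above – UV, blue, and green quantum catches shown as blue, green, and red – amounts to a simple channel remapping once the receptor responses are in hand. A minimal sketch, assuming a 4-channel (UV, B, G, R) frame with values in [0, 1]:

```python
import numpy as np

def honeybee_false_color(frame: np.ndarray) -> np.ndarray:
    """Render a 4-channel (UV, B, G, R) frame in honeybee false colors:
    the bee's UV, blue and green channel responses are displayed as
    blue, green and red, respectively. Illustrative sketch only."""
    uv, blue, green, _red = np.moveaxis(frame, -1, 0)
    # Display order is (R, G, B): bee green -> red, bee blue -> green,
    # bee UV -> blue. A surface strong in both UV and green therefore
    # renders as red + blue, i.e. magenta, like the osmeteria above.
    return np.stack([green, blue, uv], axis=-1)

frame = np.zeros((2, 2, 4))
frame[..., 0] = 1.0  # strong UV reflectance
frame[..., 2] = 1.0  # strong green reflectance
rgb = honeybee_false_color(frame)
print(rgb[0, 0])  # [1. 0. 1.] -> magenta
```

This is why the caterpillar’s UV-reflective yellow markings come out magenta in the bee-vision footage.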
How bees see caterpillars
Bees engaging in foraging and interactions on flowers as observed in Apis vision. The researchers highlighted, “The camera system has the capability to document naturally occurring behaviors in their authentic settings. This is demonstrated through three brief clips showcasing bees foraging (first and second clips) and engaging in a fight (third clip) in their natural environment. The videos are presented in honeybee false colors, with the honeybee’s UV, blue, and green photoreceptor responses depicted as blue, green, and red, respectively.”
How bees see flowers – and other bees
Lastly, an iridescent peacock feather viewed through the eyes of four different animals: its own species (peafowl), humans, honeybees, and dogs.
Varied Perceptions Across Species
To conclude, the researchers clarified, “The camera system is capable of measuring angle-dependent structural colors, including iridescence. This is demonstrated in a video featuring a highly iridescent peacock (Pavo cristatus) feather. The colors in this video are represented as (A) peafowl Pavo cristatus false color, where blue, green, and red quantum catches are shown as blue, green, and red, respectively, and UV is superimposed as magenta. While resembling a standard color video in many aspects, the UV-iridescence (highlighted in the video at approximately five seconds) is observable on the blue-green barbs of the ocellus (“eyespot”). Additional UV iridescence is evident along the perimeter of the ocellus, situated between the outer two green stripes. Intriguingly, the peafowl perceives the iridescence more prominently than (B) humans (standard colors), (C) honeybees, or (D) dogs.”