
 

UNITED STATES
SECURITIES AND EXCHANGE COMMISSION
WASHINGTON, D.C. 20549

 

FORM 8-K

 

CURRENT REPORT

Pursuant to Section 13 or 15(d) of the Securities Exchange Act of 1934

Date of Report (Date of earliest event reported): April 02, 2025

 

 

Palladyne AI Corp.

(Exact name of Registrant as Specified in Its Charter)

 

 

Delaware
(State or Other Jurisdiction of Incorporation)

001-39897
(Commission File Number)

85-2838301
(IRS Employer Identification No.)

 

 

 

 

 

650 South 500 West, Suite 150
Salt Lake City, Utah 84101
(Address of Principal Executive Offices) (Zip Code)

 

Registrant’s Telephone Number, Including Area Code: (888) 927-7296

 

 

(Former Name or Former Address, if Changed Since Last Report)

 

Check the appropriate box below if the Form 8-K filing is intended to simultaneously satisfy the filing obligation of the registrant under any of the following provisions:

☐Written communications pursuant to Rule 425 under the Securities Act (17 CFR 230.425)
☐Soliciting material pursuant to Rule 14a-12 under the Exchange Act (17 CFR 240.14a-12)
☐Pre-commencement communications pursuant to Rule 14d-2(b) under the Exchange Act (17 CFR 240.14d-2(b))
☐Pre-commencement communications pursuant to Rule 13e-4(c) under the Exchange Act (17 CFR 240.13e-4(c))

Securities registered pursuant to Section 12(b) of the Act:


Title of each class | Trading Symbol(s) | Name of each exchange on which registered
Common Stock, par value $0.0001 per share | PDYN | The Nasdaq Stock Market LLC
Redeemable warrants, exercisable for shares of Common Stock at an exercise price of $69.00 per share | PDYNW | The Nasdaq Stock Market LLC

Indicate by check mark whether the registrant is an emerging growth company as defined in Rule 405 of the Securities Act of 1933 (§ 230.405 of this chapter) or Rule 12b-2 of the Securities Exchange Act of 1934 (§ 240.12b-2 of this chapter).

Emerging growth company ☒

If an emerging growth company, indicate by check mark if the registrant has elected not to use the extended transition period for complying with any new or revised financial accounting standards provided pursuant to Section 13(a) of the Exchange Act. ☐

 


 

Item 7.01 Regulation FD Disclosure.

On April 2, 2025, Palladyne AI Corp. (the “Company”) posted on the investor relations page of its website at www.palladyneai.com an investor presentation furnished as Exhibit 99.1 to this Current Report on Form 8-K (the “Investor Deck”) and incorporated herein by reference. This presentation is expected to be used by the Company in connection with certain future presentations to investors and others. The information contained in the Investor Deck is summary information and contains forward-looking statements that are subject to risks and uncertainties, including those set forth in the Company’s filings with the Securities and Exchange Commission (the “SEC”). The information in the Investor Deck is as of April 2, 2025, except for information that is specifically identified as being as of an earlier date. The Company undertakes no obligation to publicly update or revise the information contained in the Investor Deck or this Item 7.01, except as required by law, although it may do so from time to time. Any such updating may be made through the filing of other reports or documents with the SEC, press releases, disclosure on the Company’s website or other means of public disclosure.

The Company announces material information to the public through a variety of means, including filings with the SEC, public conference calls, the Company’s website (https://www.palladyneai.com/), its investor relations website (https://investor.palladyneai.com/), and its news site (https://www.palladyneai.com/press/). The Company uses these channels, as well as its social media, including its X (@PalladyneAI) and LinkedIn accounts (https://www.linkedin.com/company/palladyneaicorp/), to communicate news and developments about the Company, its products and other matters to investors and the public. Therefore, the Company encourages investors, the media, and others interested in the Company to review the information it makes public in these locations, as such information could be deemed to be material information. The information that can be accessed through hyperlinks or website addresses included in this Current Report on Form 8-K and Exhibit 99.1 attached hereto is deemed not to be incorporated in or part of this Current Report on Form 8-K.

The information in this Item 7.01 of this Current Report on Form 8-K and Exhibit 99.1 is being furnished and shall not be deemed to be “filed” for purposes of Section 18 of the Securities Exchange Act of 1934, as amended (the “Exchange Act”), or otherwise subject to the liabilities of that section, and shall not be incorporated by reference into any registration statement or other document filed pursuant to the Securities Act of 1933, as amended, or the Exchange Act, regardless of any general incorporation language contained in such filing, unless the Company specifically states that the information is to be considered “filed” under the Exchange Act or specifically incorporates it by reference into a filing under the Securities Act or the Exchange Act.

 

Item 9.01 Financial Statements and Exhibits.

(d) Exhibits.

 

 

Exhibit Number | Description
99.1 | Investor Presentation.
104 | Cover Page Interactive Data File (formatted as Inline XBRL)

 


SIGNATURES

Pursuant to the requirements of the Securities Exchange Act of 1934, the registrant has duly caused this report to be signed on its behalf by the undersigned hereunto duly authorized.

Palladyne AI Corp.

Dated: April 2, 2025

By: /s/ Stephen Sonne
Name: Stephen Sonne
Title: Chief Legal Officer & Secretary

 


EX-99.1 2 pdyn-ex99_1.htm EX-99.1

Slide 1

Artificial Intelligence that Enables Robots and Drones to Think Like Humans and Perform Real-World Complex Tasks

Exhibit 99.1
April 2025


Slide 2

Disclaimer

This presentation and any related oral statements contain forward-looking statements within the meaning of the Private Securities Litigation Reform Act of 1995 including, but not limited to, statements regarding Palladyne AI’s business strategy, projections of market opportunity, anticipated benefits of its technologies, plans and objectives for future operations and offerings, Palladyne AI’s product development, expected features, benefits and use cases of Palladyne AI’s foundational technology and products, expectations and timing related to commercial product launches, and the potential success of Palladyne AI’s strategy. In some cases, you can identify forward-looking statements by terminology such as “may,” “will,” “should,” “could,” “expect,” “plan,” “anticipate,” “believe,” “estimate,” “predict,” “intend,” “potential,” “would,” “continue,” “ongoing” or the negative of these terms or other comparable terminology. Such forward-looking statements involve risks, uncertainties and assumptions that may cause actual events, results, or performance to differ materially from those indicated by such statements. Certain of these risks and uncertainties are set forth in the sections entitled “Risk Factors” and “Cautionary Note Regarding Forward-Looking Statements” in Palladyne AI’s filings with the Securities and Exchange Commission (the “SEC”) from time to time, which are available, free of charge, at the SEC’s website at www.sec.gov. In addition, statements that “we believe” and similar statements reflect Palladyne AI’s beliefs and opinions on the relevant subject.
These statements are based upon information available to Palladyne AI as of the date of this presentation, and although Palladyne AI believes such information forms a reasonable basis for such statements, such information may be limited or incomplete, and Palladyne AI’s statements should not be read to indicate that Palladyne AI has conducted a thorough inquiry into, or review of, all potentially available relevant information. These statements are inherently uncertain and readers are cautioned not to unduly rely upon these statements. If any of these risks materialize or our assumptions prove incorrect, actual results could differ materially from the results implied by these forward-looking statements. In light of the significant uncertainties in these forward-looking statements, you should not regard these statements as a representation or warranty by Palladyne AI or any other person that Palladyne AI will achieve its objectives and plans in any specified time frame, or at all. Except as required by law, Palladyne AI assumes no obligation and does not intend to update any forward-looking statements or to conform these statements to actual results or changes in Palladyne AI’s expectations. This presentation may also contain estimates and other statistical data made by independent parties and by Palladyne AI relating to market size and growth and other industry data. These data involve a number of assumptions and limitations and are subject to change. You are cautioned not to give undue weight to such estimates. Palladyne AI has not independently verified the statistical and other industry data generated by independent parties and contained in this presentation and, accordingly, cannot guarantee their accuracy or completeness.
In addition, any projections, assumptions and estimates of Palladyne AI’s future performance and the future performance of the markets in which it competes are necessarily subject to a high degree of uncertainty and risk due to a variety of factors. These and other factors could cause results or outcomes to differ materially from those expressed in the estimates made by the independent parties and by Palladyne AI. The products described in this presentation are subject to trade controls, including but not limited to the U.S. Export Administration Regulations (“EAR”) and/or the International Traffic in Arms Regulations (“ITAR”). Information in this presentation is meant for background purposes only, and availability of the products and/or capabilities described herein is subject to U.S. Government authorization. This presentation does not contain any National Security Information, Restricted Data, or other sensitive information subject to disclosure controls under the National Industrial Security Program Operating Manual (“NISPOM,” codified at 32 CFR Part 117). Any use of the term “confidential” in this document is meant to indicate the presence of information of a business-sensitive or proprietary nature; it is not meant to be construed consistent with the term’s definition and associated safeguarding requirements set forth in the NISPOM. By attending or receiving this presentation you acknowledge that you will be solely responsible for your own assessment of the market and our market position and that you will conduct your own analysis and be solely responsible for forming your own view of the potential future performance of our business. Palladyne AI announces material information to the public through a variety of means, including filings with the SEC, public conference calls, Palladyne AI’s website (www.palladyneai.com), its investor relations website (https://investor.palladyneai.com/), and its news site (https://www.palladyneai.com/press/). 
Palladyne AI uses these channels, as well as its social media, including its X (@PalladyneAI) and LinkedIn accounts (https://www.linkedin.com/company/palladyneaicorp/), to communicate with investors and the public news and developments about Palladyne AI, its products and other matters. Therefore, Palladyne AI encourages investors, the media, and others interested in the company to review the information it makes public in these locations, as such information could be deemed to be material information. The information that can be accessed through hyperlinks or website addresses included herein is deemed not to be incorporated in or part of this presentation.


Slide 3

Palladyne AI At-a-Glance

NASDAQ: PDYN
Experience: 30+ years of robotics engineering excellence. Technology team led by a CTO with 25+ years of AI/ML expertise
Team: >70 team members, world-class robotics & AI/ML software engineers
Location: Salt Lake City, UT; innovation and operations
Robotics DNA: 30+ years in robotics and robotics software. Legacy leadership in dexterous mobile robot technology across the aviation, construction, energy, and defense sectors


Slide 4

Palladyne AI: 30+ Years of Innovation and Evolution

Company milestones:
1983: Sarcos spins out of the University of Utah
2007: Raytheon buys Sarcos
2015: Purchase from Raytheon
2021: Sarcos Robotics begins trading publicly
2024: Sarcos becomes Palladyne AI

Palladyne IQ:
2019: Software development begins
5/30/24: Complete MVP
9/30/24: Commercial release
10/1/24: Initial customer trials begin

Palladyne Pilot:
2/17/22: First CLUTCHES contract²
2/2022: Software development begins
5/18/23: CLUTCHES contract extension
9/27/23: STRATFI contract¹
10/8/24: STRATFI Phase 1 complete
12/31/24: Complete MVP
Q1 2025: Commercial release

1. Cybernetic Training for Autonomous Robots Human Augmentation via Generalizable Mobile Autonomous Robot Dexterity (C-H); Contract # FA8571-23-C0042.
2. Collaborative sensing platform for the detection, tracking, and classification of time-critical objects in dynamic adversarial environments; AFRL Contract # FA8750-22-C-1005.


Slide 5

Our Vision: To Automate Tasks Too Complex for Traditional Automation by Enabling Machines to Observe, Learn, Reason & Act Like Humans
- Substantially accelerate speed of programming and training
- Increase agility, task sets, and use cases
- Reduce need for human intervention and oversight
- Reduce cost of standing up and maintaining automation
- For mobile machines, evolve from human-in-the-loop to human-on-the-loop
- Eliminate need for continuous cloud connectivity


Slide 6

Automation of Complex Tasks Has Been Limited for Several Reasons:
- Most industrial robots are highly programmed for a single specific task and cannot process variations in objects, tasks, or the environment
- Programming and implementation of industrial robots have been time-consuming and costly, often yielding an insufficient customer ROI
- Today’s state-of-the-art AI approaches (e.g., LLMs¹ for generative AI) require massive data sets to train models, limiting tasks solely to what is contained in the data sets

1. Large language models.


Slide 7

Physical Intelligence for a Wide Array of Robotic Form Factors¹
Expected to Enable Stationary and Mobile Robotic Platforms to Be Agile and Autonomous, Reduce Human Intervention, and Increase ROI
- Industrial Robots and Cobots (Palladyne IQ)
- Mobile Robots (Palladyne IQ and Pilot)
- Unmanned Aerial Vehicles (Palladyne Pilot)

1. Designed to work with most industrial robots being sold today. According to Proficient Market Insights’ “Global Robot Operating System” report, ROS 1 robots comprised 74% of the total ROS market in 2021 (“Global Robot Operating System (ROS) Market 2022 Size Of $,” globenewswire.com).


Slide 8

Palladyne™ IQ: AI Software Platform for Robotics
Real-Time Closed-Loop Autonomy Software to Enable Machines to Observe, Learn, Reason, and Act Like Humans

Observe: Advanced perception & observation to improve situational awareness
- Perceives environment using a mix of sensor inputs, e.g., vision, LiDAR, radar, acoustic, etc.
- Utilizes multi-modal sensor fusion to make perception more robust to sensor occlusion and noise

Learn: Intelligent machine learning to accelerate onboarding for new & complex tasks
- Robots learn novel or complex combinations of tasks via dynamic reasoning and learning
- Learning occurs with minimal demonstrations (1-5)¹
- Learning model adapts to environments

Reason: Human-like, AI-based reasoning to determine the best course of action without human intervention
- Enables robots to adapt to unexpected events in real time
- Generates real-time motion plans based on situational awareness at the edge

Act: Precise robotic control & completion of tasks
- Completes the task by accurately controlling the manipulator arm, robot, and/or end effector
- Achieves complex combinations of tasks over extended periods of time in a stable, safe, and precise manner
- Real-time perceiving, learning & decision-making occurs at the edge without retraining or cloud connectivity

1. Based on internal testing; actual figures will vary depending on complexity of the task.


Slide 9

Expected Advantages of Palladyne IQ: How Our Approach Differs
- Hardware agnostic¹
- Addresses robotic-specific challenges beyond integration; solves for system stability and pose estimation/end-effector orientation
- Robots are able to plan and execute complex combinations of tasks over extended periods of time, even in dynamic and unstructured environments
- Full-stack, closed-loop autonomy enables adaptability to dynamic changes in the environment or defined task without human intervention or reprogramming
- Uses probabilistic machine learning (ML) techniques to learn the task, accounting for uncertainty and variability
- Dynamic model inference methods require much less training data; robots can learn to generalize with only a few demonstrations (1-5)⁴
- Computational efficiencies gained through use of Palladyne AI’s domain-specific language models
- Complex task-learning capabilities are similar to humans’; in some cases, robots can be trained in significantly less time than with current state-of-the-art approaches³
- Enables edge computing; lower total cost of ownership (TCO) with no need to incur recurring cloud services costs
- Improves system implementation and startup times
- Fuses multi-sensor data inputs together to improve system flexibility & adaptability
- Flexible instructional input options for task model learning (e.g., LLMs, DSLs², motion-capture-based teleoperation, video input, etc.)
- Can provide language-to-motion instructions ideal for edge computing/robotics applications; doesn’t require the cost/latency associated with LLMs requiring connectivity to the cloud

1. Designed to work with most industrial robots being sold today. According to Proficient Market Insights’ “Global Robot Operating System” report, ROS 1 robots comprised 74% of the total ROS market in 2021 (“Global Robot Operating System (ROS) Market 2022 Size Of $,” globenewswire.com).
2. Domain-specific languages.
3. Robotics Transformer 1 & 2 deep learning-based approach, 2022-2023.
4. Based on internal testing; actual figures will vary depending on complexity of the task.


Slide 10

Our AI Approach vs. “Brute-Force” Foundation Models
How the Palladyne AI Approach Differs from the Traditional Foundational Model Approach

Model Size
- Foundational model for robotics: Large, often in the range of hundreds of millions to billions of parameters
- Palladyne AI approach: Compact; a simplified model/method for decision making (i.e., symbolic and modular representations) resulting in smaller models

Required Data Engineering
- Foundational: Extensive pre-processing required to combine multiple types of data (e.g., images, text, actions, robot joint states, end effector, etc.) and effectively align diverse inputs
- Palladyne AI: Minimal; focuses on task- and motion-specific capabilities and leverages domain expertise to provide context to robots (e.g., a technician familiar with a task)

Training Data
- Foundational: Requires massive, diverse datasets with multiple types of labels, such as simulated and real-world robotic tasks
- Palladyne AI: Uses smaller, domain-specific datasets, focusing on specific scenarios and constraints based upon the use case

Computational Requirements
- Foundational: High; requires GPUs and cloud computing resources for model training and use¹ due to the complex, resource-intensive nature of robotics
- Palladyne AI: Moderate; edge computing primarily solves optimization and other complex problems² with GPUs dedicated to perception, and the cloud is not required

Time / Costs
- Foundational: High; persistent cloud compute and a high number of parameters are required for model training, resulting in significant time and cost impact
- Palladyne AI: Low; no cloud compute or access required

Robotic Applicability
- Foundational: Not yet achievable due to the limitation of robotic data, with the focus on generating a vast number of general-purpose task models to be applied across a diverse range of scenarios (e.g., manipulation, navigation, language-guided tasks); not a 1:1 translation for robotics
- Palladyne AI: Robot-specific generalization; focused on complex robotics use cases for enterprise applications that were previously too difficult for robots due to their dynamic nature, which also limits the time and cost of developing AI models

Repeatability
- Foundational: Moderate; exact repeatability is challenged due to reliance on learned generalizations from data (generally varies between 25 and 80%)
- Palladyne AI: High; achieved through planning AI that leverages constraints to deliver more consistent outcomes

Success Rate
- Foundational: Low; requires vast amounts of relevant, labeled data that is difficult and expensive to obtain, and mass groups of people are not incentivized to create relevant data for robotic systems
- Palladyne AI: High for well-defined tasks within the planning model’s domain, limited by the accuracy of constraints and physics

Closed-Loop Autonomy
- Foundational: Smart enough to “see” and act within defined boundaries but not yet fully autonomous in perception, reasoning, motion planning, or task creation to execute actions
- Palladyne AI: Similar to how humans use their senses to achieve a successful outcome, full-stack, closed-loop autonomy at the edge gives robots the ability to observe, learn, reason, and act

1. Inference.
2. Constraint satisfaction problems.


Slide 11

Market Opportunity for Palladyne IQ
- Repetitive tasks currently performed by humans: millions of complex but dangerous or boring jobs
- Existing industrial robot installed base: millions of existing installed industrial robots
- Opportunity to augment existing industrial robots


Slide 12

Product Assembly: Sub-Parts Assembly¹

Tasks: Structured manufacturing line with task variability

Challenges:
- Manufacturing downtime: Changes in assembly parts/tasks can result in costly robot retraining and extended downtime

Opportunity & Expected Benefits:
- Quickly retrain robots to perform new tasks with minimal downtime (low-code/no-code training)
- Quickly adapt to varying tasks on a multi-product assembly line setup
- Run assembly lines with mixed products to meet demand
- Robot automatically adapts how it completes a task based on the object detected in its field of view
- Flexibility & future-proof task planning; extends usability & life of the robot

1. Potential use cases based on discussions with potential customers.


Slide 13

Kitting and Parts Sequencing: Pick/Place/Sort Parts into Assembly Kits/Containers¹

Tasks: Kitting and parts sequencing for complex assemblies

Challenges:
- Difficult to automate: Can require sophisticated planning, human intervention & high programming costs
- Variability in parts: Can lead to inefficiencies and errors, causing delays, rework, and increased costs
- Fluctuating demand: Industries such as consumer electronics and automotive need automation that can quickly adapt to changes in demand

Opportunity & Expected Benefits: Advanced object detection, ML and AI enable the robot to:
- Achieve continuous workflow by dynamically adapting to changes in kitting/sequencing orders
- Recognize and pick/place complex part geometries efficiently, even in variable conditions and dynamic environments
- Quickly and accurately classify parts and determine their optimal sorting location, helping streamline production and enabling parts traceability
- Quickly retrain for new tasks with minimal downtime (low-code/no-code training)
- Reduce overhead costs and increase throughput, providing a faster ROI

1. Potential use cases based on discussions with potential customers.


Slide 14

Heavy Material Handling & Truck Loading¹: Pick & Place of Heavy Materials onto Transport Vehicles

Tasks: Picking pipe and rods from various inventory bins and transferring them to transport vehicles in the loading yard (drill rig, pipe truck, or skid)

Challenges: Variability and unstructured environments make automation difficult and cost-prohibitive
- Material variability: Varying material size, weight, and diameter requires different handling methods and tools (i.e., end effectors)
- Configuration variability: Picking bin and transport vehicle cab configurations can vary, requiring real-time adjustments to how materials are picked, placed, and stacked in the cab
- Picking bin variability: The presentation of materials in the bins can be non-uniform and disorganized; recognizing how to best grasp and lift each material requires precise perception, sophisticated motion planning, and frequent human intervention
- Unstructured environment: Oil & gas facilities are often located in remote locations, making human intervention and reconfiguration of automated systems complex and costly

Opportunity & Expected Benefits: Advanced object detection AI enables robots to:
- Achieve continuous workflow by rapidly adjusting to material type and bin/cab configurations
- Quickly onboard new material types, set up new bin and cab configurations, and train new pick-and-place tasks based on new parameters
- The low-code/no-code task training feature enables loading yard employees to retrain robots to perform new tasks
- Unlike other AI solutions that require constant connectivity to the cloud for access to their AI task models, our edge AI approach allows for real-time path planning adjustments on the device, minimizing potential latencies and the high costs associated with cloud computing
- Automation improves safety, helps reduce overhead/rig worker costs, and increases throughput, providing a faster ROI

1. Potential use cases based on discussions with potential customers.


Slide 15

Surface Preparation: Grit Blasting, Hydro Blasting, Sanding, and Grinding¹

Tasks: Removal of paint, rust, and debris from surfaces using various media blasting and grinding tools:
- Heavy manufacturing: Prepare components, chassis, and heavy machinery for finishing processes
- Structural maintenance & repair: Clean ship hulls, tanks, bridges, and offshore structures to prepare for painting & coating

Challenges:
- Variability in surface contours and shape: Difficult to automate; typically requires delicate handling, manual work, or semi-automation with high degrees of human intervention
- Safety risks: Manual surface preparation exposes human workers to a high risk of injury due to hazardous materials and environments

Opportunity & Expected Benefits: Advanced object detection, ML and AI enable the robot to:
- Manipulate the blast hose and tools accurately by adapting to varying surface conditions in real time
- Achieve a precise and consistent result, reducing the need for rework and human intervention
- Detect and respond quickly to potential hazards, ensuring safer operation and compliance with safety regulations
- Quickly retrain for new tasks with minimal downtime (low-code/no-code training)
- Reduce overhead costs and increase throughput, providing a faster ROI

1. Potential use cases based on discussions with potential customers.


Slide 16

Video: Simulated Autonomous Surface Prep¹
Media blasting, de-painting, sanding, and hydro-blasting: Applicable for manufacturing, construction, and various repair & maintenance workflows
https://www.palladyneai.com/video/sambd

1. Video is from the May 2024 MUA Demonstration showcasing autonomous media blasting and de-painting and teleoperated sanding and hydro-blasting tasks. Approved for Distribution Statement A / Public Release.


Slide 17

Quality Control Inspection: In-Process and Final Quality Checks for Equipment Manufacturing¹

Tasks: Automate manual visual inspections at various assembly stages to ensure quality standards are met

Challenges:
- Labor-intensive & time-consuming process: Complex components like servers require 100+ inspection points for quality checks
- High precision requirements: Inspections demand custom tools and skilled technicians
- Scaling quality inspections: Increasing production volume and model variations make thorough inspections harder
- Consistency: Human inspectors face fatigue and performance variability, affecting quality

Opportunity & Expected Benefits:
- Advanced object detection and edge-based AI/ML enable robots to quickly scan and analyze server panels
- Detect defects faster without relying on cloud-based image recognition
- Quickly retrain robots to perform new tasks with minimal downtime (low-code/no-code training)
- Time savings and improved ROI by increasing yields and reducing human error

1. Potential use cases based on discussions with potential customers.


Slide 18

Palladyne™ Pilot Closed-loop Autonomous Detection, Tracking, and Control for Mobile Machines Solutions Overview


Slide 19

Palladyne Pilot for UAV: Surveillance/Reconnaissance¹

Tasks: Persistent detection, tracking, and classification of targets

Challenges:
- Highly unstructured environment while in flight
- High levels of uncertainty
- Personnel-intensive missions requiring multiple roles and shifts

Opportunity & Expected Benefits: Advanced perception & observation, dynamic learning, and AI enable:
- Enhanced operational effectiveness of tactical UAV missions
- Enhanced situational awareness and real-time decision-making
- Reduced manpower needed to control multiple UAVs, when integrated with UAV autopilot systems

1. Potential use cases based on discussions with potential customers.


Slide 20

Challenges for Current UAV-Based Missions in Military Operations
- Drone-based missions are personnel-intensive and require multiple roles and shifts to maintain situational awareness
- Today, most tactical UAVs require manual 1:1 platform control and one-to-many data coordination
- Real-time decision-making in highly variable and unpredictable conditions requires robust situational awareness and multi-data-stream analysis
- Target detection and tracking challenges:
  - Limited or contested RF availability jeopardizes persistent target-tracking custody
  - Fast-moving targets frequently exit the field of view or are blocked from line of sight
  - Targets are difficult to distinguish with the naked eye (e.g., color, shape, external markings)


Slide 21

Palladyne™ Pilot: Closed-Loop Autonomy for UAVs Cooperative Autonomous Sensing for Enhanced Surveillance and Reconnaissance


Slide 22

Enhanced Situational Awareness: Sensor Control and Multi-Modal Sensor Fusion¹

Multi-Modal Sensor Fusion: Fuses multi-modal data from multiple distributed autonomous sensors and actively controls sensors to maximize detection, tracking, classification and identification performance, and activity prediction. Video, radar, RF, and audio features feed a common decision process (approach: feature-level fusion; moderate dimensionality; complex joint statistics).

Sensor Control: Distributed, cooperative sensor and platform management via Inter-Platform Communication (IPC) transforms UAVs into autonomously collaborating sensing grids that self-orchestrate.

1. Control of sensors only; does not control the UAV/UGV’s flight or navigational functionality.


Slide 23

Autonomously Collaborating Drone Team Maximizes Situational Awareness
- Enables a single operator (on-the-loop) to command and control autonomous drone operations that provide intel on current target tracks while searching for new targets
- Drones control sensors via reinforcement learning to maintain track custody
- Multi-sensor drones exchange low-bandwidth information¹, not high-bandwidth video or raw sensor data¹
- Drones share situational-awareness information (e.g., target locations, motions, and characteristics)
- Drones determine sensor-to-target assignments

1. “Data” refers to raw, unorganized facts or figures, while “information” is low-bandwidth data that has been processed, organized, and interpreted to provide context and meaning, making it useful for decision-making.


Slide 24

Scale Operational Effectiveness of Tactical-Class UAV Missions Through Enhanced Human-Machine Synergy
- Provide a single-source, fully integrated software stack that transforms UAVs into autonomous collaborative assets
- Enable a network of collaborating drones and sensors that self-orchestrate when integrated with an autopilot program
- Maximize operational coverage and information-gathering capabilities while reducing personnel required for UAV mission control
- Enable persistent target-tracking custody of moving or stationary targets, critical to situational awareness
- Able to operate at the edge in limited or contested RF availability


Slide 25