Tesla Autopilot Rear-End Accident: Who Is Liable?

Rear-end crashes are some of the most common accidents on the road. When a Tesla is involved and autopilot is active, the legal questions multiply quickly.

Drivers, passengers, and people hit from behind often walk away confused. The car was braking on its own. The system was engaged. Tesla markets the technology as advanced and responsive. So who is actually responsible?

The answer depends on more factors than most people expect. In many Tesla autopilot rear-end accidents, liability is far more layered than in a standard crash. If you were injured, responsibility may not be as simple as pointing to the driver alone.

What Is Tesla Autopilot and How Does It Work?

Tesla autopilot is a driver-assist system, not a fully autonomous driving program. Even Tesla says the driver must stay alert, keep control of the vehicle, and be ready to intervene at any time.

Autopilot and Tesla full self-driving modes rely on cameras, sensors, and software to help with speed, braking, and lane positioning. When conditions are ideal, the system can reduce driver workload. But when conditions change, the technology may struggle.

Many drivers misunderstand this balance. The name “autopilot” suggests independence. The reality is shared control, and that gap often plays a role in rear-end collisions.

How Rear-End Accidents Typically Happen With Autopilot

Tesla autopilot rear-end accidents tend to follow a few recurring patterns. These crashes are rarely random. In some cases, the vehicle brakes suddenly without an obvious reason. In others, it fails to slow down at all when traffic ahead has stopped. Drivers may rely on the system longer than they should, assuming it will react faster than a human would.

Certain traffic environments also increase risk. Stop-and-go congestion, sudden lane changes, construction zones, and poor visibility can all expose weaknesses in how autopilot interprets what’s happening ahead. When those failures line up with human overreliance, rear-end crashes can happen in an instant.

Common Injuries in Tesla Autopilot Rear-End Accidents

Rear-end collisions often look minor from the outside, but injuries may tell a different story. These crashes can cause lasting harm, even at lower speeds.

Whiplash and Neck Injuries

Whiplash is one of the most common rear-end accident injuries. The sudden forward-and-back motion strains muscles, ligaments, and soft tissue in the neck. Symptoms don’t always show up right away, which can complicate treatment and claims.

Back and Spinal Injuries

Rear-end crashes can damage the spine in ways that don’t heal easily. Herniated discs, nerve compression, and chronic back pain are common complaints. These injuries often interfere with work, sleep, and daily movement.

Head and Brain Injuries

Even without a direct head impact, the force of a rear-end collision can cause concussions or more serious traumatic brain injuries. Confusion, headaches, memory issues, and mood changes are all red flags after a car crash.

Broken Bones and Orthopedic Injuries

Wrists, arms, ribs, and legs often take the brunt of the impact. Bracing for a collision can lead to fractures that require surgery, physical therapy, or long recovery periods.

Psychological Injuries

Crashes involving advanced vehicle technology can leave people shaken long after the physical injuries heal. Anxiety, sleep issues, and symptoms associated with PTSD are common after sudden, unexpected collisions.

Who Can Be Liable in a Tesla Autopilot Rear-End Accident?

Liability depends on how the crash happened and what role the technology played.

The Tesla Driver

Autopilot does not replace driver responsibility. If the driver was distracted, inattentive, or relying on the system beyond its limits, fault may rest there even if autopilot was active.

Tesla (Product Liability Claims)

Tesla may be responsible when a rear-end accident stems from a system failure. This can include defective software, sensor issues, design problems, or failure to properly warn drivers about known limitations.

Other Drivers

In multi-vehicle crashes, another driver’s sudden stop or unsafe maneuver may contribute to the collision. These cases often involve shared responsibility across multiple parties.

Shared Liability

Many Tesla autopilot crashes involve comparative fault. Responsibility may be divided between the driver, Tesla, and others involved. Fault percentages directly affect how much compensation an injured person can recover.
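
As a hypothetical illustration: under a pure comparative fault rule, a person found 20 percent responsible for a crash with $100,000 in damages would typically recover $80,000, because the award is reduced by their share of fault. The exact rules vary by state, and some states bar recovery entirely above a certain fault threshold.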

Proving Liability in an Autopilot-Related Crash

Tesla vehicles store data that can show whether autopilot or full self-driving modes were active and how the vehicle responded before impact. Traffic cameras, dashcam footage, and witness accounts can support or contradict that data.

In serious cases, accident reconstruction experts and technical specialists are often needed to explain what went wrong and why.

What Compensation Can Injured Victims Recover?

A Tesla autopilot rear-end accident can disrupt nearly every part of a person’s life. Compensation may account for medical expenses, lost income, pain and suffering, and other losses tied to the crash.

Why Tesla Autopilot Cases Are More Complex Than Normal Rear-End Accidents

These cases are not routine. They involve corporate defense teams, proprietary vehicle data, and disputes over how software should behave in real-world conditions. Federal and state regulations also play a role, especially as self-driving technology continues to evolve. Early legal action often makes the difference between clarity and confusion later on.

What to Do After a Tesla Autopilot Rear-End Accident

What you do in the days after a crash matters. Medical care should come first, even if symptoms seem minor. Vehicle data should be preserved whenever possible.

Avoid accepting explanations from Tesla or insurance companies at face value. Early narratives tend to favor whoever controls the information.

How a Personal Injury Lawyer Can Help

Tesla autopilot cases require more than standard crash handling. A personal injury lawyer familiar with these claims can investigate system involvement, identify every liable party, and deal directly with Tesla and insurance carriers.

Contact Our Firm for a Free Consultation

At Breit Biniazan, we handle serious injury cases involving complex liability and powerful defendants. Our team has secured over $2 billion for our clients, and we prepare cases with the potential for trial in mind from the start.

If you were injured in a Tesla autopilot rear-end accident, contact Breit Biniazan to discuss what happened and what your options may be. There are no upfront fees, and our team is ready to take on cases that others won’t.

Are Tesla Drivers Still Responsible While Using Autopilot?

After a Tesla crash, one question comes up almost immediately: “Autopilot was on, so does that mean I’m still responsible?”

It’s a fair question. Tesla markets its technology as advanced, intelligent, and capable of handling real-world driving. When something goes wrong, many drivers assume the system should shoulder the blame.

The reality is more complicated, especially if someone was seriously injured. If you were using autopilot when your accident occurred, you need a legal team that knows how to address this complicated area of accountability. Reach out to Breit Biniazan at (855) 659-4457 today.

What Tesla Autopilot Actually Does (and Doesn’t Do)

Despite the name, Tesla autopilot is not autonomous driving. Even Tesla says the driver must stay alert and ready to take control at all times.

From a legal standpoint, that distinction carries real weight.

Autopilot can assist with steering, speed, and braking under certain conditions. It can help manage highway driving and traffic flow. But what it does not do is replace the human driver under current traffic laws. That gap between marketing language and legal reality is where many injury cases begin.

Why Drivers Are Often Blamed First After an Autopilot Crash

In the early stages of a claim, responsibility is usually framed around traditional driving factors. Speed, attention, reaction time, and following distance all come into play, even if autopilot was engaged. 

That initial framing can be misleading, as it doesn’t account for how the vehicle actually behaved or whether the system contributed to the collision.

Many injured drivers assume fault simply because autopilot was on. In reality, that assumption often benefits insurers and manufacturers far more than it reflects what truly happened.

Situations Where Tesla May Share Responsibility

Not every autopilot crash is about driver error. Some accidents raise serious questions about whether the system performed as it should have. Tesla may share responsibility when the technology brakes suddenly without cause, fails to slow for stopped traffic, or otherwise behaves in ways no attentive driver could reasonably anticipate.

These cases shift the focus away from momentary driver judgment and toward system performance, design, and safety testing.

The Role of Tesla Vehicle Data in Determining Fault

One of the biggest differences between Tesla crashes and traditional car accidents is data.

Tesla vehicles record detailed information before, during, and after a collision. That data can show whether autopilot or full self-driving was active, how the car responded, and if the driver intervened at all.

In many cases, this data tells a very different story than witness statements or police assumptions. Accessing it is rarely simple: Tesla and insurance carriers typically do not hand over vehicle data without pressure, and delays can affect what evidence remains available.

Patterns Seen in Real Autopilot Injury Cases

While every crash is different, certain autopilot-related scenarios appear again and again in serious injury claims.

These include rear-end collisions where the Tesla fails to slow down, intersection crashes involving stopped vehicles, and sudden braking events that trigger chain-reaction accidents. Each pattern raises different liability questions and requires a tailored investigation.

What matters is how the vehicle’s autopilot behaved and whether that behavior created danger.

What Injured Drivers and Passengers Should Take Seriously Early On

The first decisions made after an autopilot crash can shape the entire case. Statements given too early, assumptions about fault, or failure to preserve evidence can limit options later. A few realities are worth keeping in mind: early narratives tend to favor whoever controls the information, vehicle data can be overwritten if no one acts quickly, and initial assumptions about fault often benefit insurers more than injured people.

Why Clients Trust Breit Biniazan With Complex Injury Cases

At Breit Biniazan Trial Lawyers, we focus on serious injury cases involving high stakes and powerful defendants. Our team has secured over $2 billion for our clients, and we approach every case with the expectation that it may need to be tried.

That mindset matters in cases involving Tesla autopilot and full self-driving technology. These claims require a firm that understands how to challenge system failures, analyze technical evidence, and push back when blame is unfairly shifted onto injured people.

If you or someone you love was injured in a Tesla crash involving autopilot or full self-driving features, contact the experienced team at Breit Biniazan to talk through what happened and where responsibility may truly lie.

What If You Were Hit by a Self-Driving Car as a Pedestrian or Cyclist?

When a pedestrian or cyclist is hit by a self-driving or driver-assist vehicle, the first question is often: “Who was actually in control of the car?”

These crashes raise different issues than traditional pedestrian or bicycle accidents. Responsibility may depend on what the vehicle’s system was doing, whether the driver was expected to intervene, and how the technology responded in the moments before impact.

Because self-driving systems change how vehicles detect and react to people outside the car, these cases are evaluated differently than standard traffic collisions.

How Self-Driving Cars Are Supposed to Detect Pedestrians and Cyclists

Most self-driving and driver-assist systems are built with one core goal: to detect hazards early and react faster than a human can. That includes recognizing people who are walking or riding bikes.

These systems rely on a combination of cameras, sensors, and software to identify movement, distance, and direction. When everything works as designed, the vehicle should slow down or stop before impact.

The problem is that real streets aren’t controlled environments. Lighting shifts, rain, and glare interfere with cameras; cyclists take up far less space than cars; and pedestrians don’t always walk in straight lines. When detection fails in these everyday conditions, serious injuries can happen.

Why Injuries Are Often More Serious When Pedestrians or Cyclists Are Hit

People outside the vehicle don’t have airbags, seatbelts, or steel frames around them. Even a low-speed impact can cause lasting harm.

In many self-driving pedestrian and cyclist crashes, injuries include broken bones, head injuries, spinal injuries, and deep soft-tissue damage. Some injuries may not fully show up until days later, when swelling and pain have set in.

Cyclists are especially vulnerable because they may be thrown from the bike or dragged after impact. Pedestrians often suffer secondary injuries from hitting the ground or nearby objects after being struck.

Who May Be Responsible When a Self-Driving Car Hits Someone

Responsibility in these crashes is rarely simple. More than one party may be involved, depending on how the crash happened. Potentially responsible parties can include the driver behind the wheel, the vehicle’s manufacturer, and the companies behind the software or components that guided the car.

In many cases, responsibility is shared. That’s common when technology and human decision-making overlap.

How Fault Is Actually Evaluated in These Cases

Unlike traditional pedestrian accidents, fault isn’t based only on what witnesses saw or what the driver says. Self-driving vehicles generate data that can tell a more detailed story.

Investigators often look at how early the system detected a pedestrian or cyclist, whether alerts were issued to the driver, how the vehicle responded, and whether braking or steering occurred before impact. That information can help explain whether the system worked, whether the driver had time to react, or whether something else failed along the way.

Without that data, it becomes much harder to push back against simplified explanations.

Why These Claims Feel More Difficult Than Normal Pedestrian Accidents

Many injured pedestrians and cyclists feel like they’re at a disadvantage from the start. Insurance companies may focus on where the person was standing or riding. Manufacturers may try to avoid involvement altogether. Drivers may claim they trusted the system.

At the same time, the injured person is trying to recover physically and emotionally, often without clear answers.

What makes these cases especially challenging is that digital evidence doesn’t wait. Vehicle data can be overwritten, software can update, and important details can disappear quietly if no one acts early.

What Can Quietly Disappear After a Self-Driving Pedestrian Crash

Modern cars collect short windows of system data, such as what the vehicle detected, how it reacted, and whether the driver was alerted. That information isn’t always saved long-term. In some cases, it can be overwritten the next time the car is driven or updated. When that happens, important details can be lost, including what the system perceived, when it responded, and whether any warning reached the driver.

Once that window closes, there may be no way to reconstruct exactly how the system performed before impact.

How Breit Biniazan Helps Injured Pedestrians and Cyclists

At Breit Biniazan, cases involving pedestrians and cyclists hit by self-driving or driver-assist vehicles are handled with urgency and care. These crashes often cause serious injuries and raise questions that deserve real answers.

With over $2 billion recovered, the firm has the experience and resources to look closely at how self-driving technology performed and whether it failed to protect someone who had no control over the situation.

If you were hit by a self-driving or driver-assist vehicle while walking or riding a bike, Breit Biniazan can help examine what happened and whether the technology involved should be held accountable. Give us a call at (855) 659-4457 or fill out our online contact form, and our team will reach out. 

Can You Sue Tesla or Another Car Manufacturer After a Self-Driving Car Accident?

After a crash involving self-driving technology, most people assume the focus will be on the driver. Sometimes it is. But when a vehicle is steering, braking, or making decisions on its own, the manufacturer doesn’t automatically step out of the picture.

In certain situations, the company that designed or built the technology may be legally responsible for what happened. These cases fall under product liability, and they are very different from ordinary car accident claims.

When a Self-Driving Accident Becomes a Product Liability Case

A self-driving crash may involve product liability when the injury is tied to how the vehicle or its systems were designed, manufactured, or presented to drivers. That can include problems with the system’s design, defects introduced during manufacturing, software updates that change behavior, or marketing that overstates what the technology can do.

Product liability is not about perfection. It’s about whether the technology was reasonably safe for real-world use.

Design Defects in Self-Driving Systems

Some cases are about design rather than mistakes made during manufacturing. In other words, the system may work exactly as intended and still create unsafe situations. Design-related issues can involve decisions like where sensors are placed, how the system prioritizes hazards, or how it handles common driving conditions such as glare, rain, or construction zones.

When a system predictably struggles in everyday situations, that raises questions about whether safety was given enough weight during development. These cases often focus on choices made long before the car ever reached the road.

Manufacturing Problems That Affect Self-Driving Performance

Other cases involve problems that occur during production, not design. These issues may affect only certain vehicles, even if the overall system is sound. Examples may include faulty sensors, improper installation of components, or hardware issues that interfere with how the system processes information. 

Drivers usually have no way to detect these problems until something goes wrong. When a single vehicle or group of vehicles fails in a way others do not, manufacturing defects may become a key issue.

How Software Updates Can Play a Role in Self-Driving Crashes

Self-driving cars rely heavily on software, and that software changes over time. Updates can improve performance, but they can also introduce new risks. Some crashes happen shortly after updates that affect braking behavior, hazard detection, or how the vehicle responds to surrounding traffic.

Software does not age the way mechanical parts do, but changes to code can alter how a car behaves in subtle ways. In product liability cases, it’s often important to understand what version of the software was running at the time of the crash and what changed.

Marketing Claims and How Drivers Rely on Them

Another issue that comes up is how self-driving technology is described to drivers. Names, advertising, and demonstrations can shape expectations.

If drivers are led to believe a system can safely handle more than it actually can, reliance becomes understandable. Product liability cases often examine whether warnings were clear enough and whether limitations were easy to understand in real driving situations.

This isn’t about fine print. It’s about whether an average driver would realistically understand what the system can and cannot do.

Why Suing a Car Manufacturer Is Different From a Normal Accident Claim

Claims against manufacturers follow a different path than claims against other drivers. These cases involve deeper investigation and more resistance. Manufacturers often respond by arguing that the driver misused the system, ignored its warnings, or should have intervened before impact.

Sorting through those claims takes time and careful review of evidence, especially vehicle data that shows what the system was doing before impact.

Cases Involving Tesla and Other Self-Driving Vehicle Companies

Cases involving Tesla often get public attention, but the legal questions are the same across manufacturers. The focus is not on the brand; it’s on whether the technology performed safely and whether drivers were given a fair understanding of its limits.

Each case depends on the specific facts, the vehicles involved, and what went wrong in the moments leading up to the crash.

How Breit Biniazan Approaches Manufacturer Liability

At Breit Biniazan, self-driving accident cases are handled with the understanding that these situations affect real people, not just legal claims. Clients are often dealing with injuries, unanswered questions, and companies that move quickly to protect themselves.

With over $2 billion recovered, the firm has the experience and resources to take on manufacturers when self-driving technology may have played a role in a crash. That means digging into how the vehicle was designed, how the system behaved, and whether safety took a back seat to speed or convenience.

If you were injured in a crash involving autopilot or self-driving features and believe the technology may be part of the reason it happened, Breit Biniazan can help look beyond surface explanations and uncover what actually went wrong. Contact our team today to discuss your next steps.

Who Is Liable If a Self-Driving Car Crashes While Autopilot Is On?

When a crash involves a self-driving or semi-autonomous car, the confusion often hits before the pain does. People expect answers right away, but instead they hear mixed messages. The driver says the car was in control. The insurance company says the driver should have stepped in. The technology company stays quiet.

If you were injured, all of that noise can make it hard to understand what actually matters.

Liability in these car crashes depends on how the system was being used, what the vehicle did in the moments before impact, and whether the driver was realistically able to respond. If you were injured in a self-driving car accident and looking for answers, contact Breit Biniazan at (855) 659-4457 to discuss your next steps.

Is the Driver Always Responsible When Autopilot Is Turned On?

Not always, but the driver is rarely ignored. Most vehicles on the road today are not fully autonomous. Even when autopilot or a similar feature is active, drivers are usually expected to stay alert and be ready to take over at a moment's notice. That expectation plays a big role in how fault is evaluated after a crash.

Insurance companies often begin with the assumption that the driver should have reacted. That means the burden frequently falls on the injured person to show that something more was going on.

How Responsibility Is Evaluated After an Autopilot Crash

There is no checklist that decides fault automatically. Instead, investigators look at a mix of human behavior and system behavior. That may include what the driver was doing, whether the system issued alerts, how the vehicle responded in the moments before impact, and whether any braking or steering occurred.

Sometimes this information supports the driver’s account. Other times, it raises questions that were not obvious at the scene.

When Driver Behavior Becomes the Main Focus

In many cases, insurers focus heavily on what the driver was doing. That’s because it’s easier to argue human error than technology failure. Drivers are more likely to be blamed when they were distracted, ignored the system’s warnings, or relied on autopilot beyond its stated limits.

Even when autopilot is on, these details can shift a large share of responsibility back onto the driver.

Situations Where the Technology Itself May Be Responsible

There are also crashes where the system’s performance cannot be ignored. This may happen when the vehicle brakes suddenly without reason, fails to slow for stopped traffic, or does not react to an obvious hazard.

In these cases, responsibility may extend beyond the driver and involve the company that designed or maintained the system. These claims are more complex, but they are becoming more common, especially as this technology evolves.

Why Self-Driving Car Accident Cases Feel Overwhelming

People injured in these crashes often feel stuck between powerful interests. On one side are insurance companies pushing for a quick conclusion. On the other are technology companies with teams dedicated to protecting their systems. At the same time, the injured person is dealing with medical care, lost income, and uncertainty about the future.

Adding to that stress is the fact that self-driving vehicles store digital data that can quietly disappear. Some information may be overwritten if it is not preserved early, making it harder to understand what the system actually did.

Why Fault Is Often Disputed in Autopilot Accidents

Unlike traditional crashes, there is rarely a clear mistake that everyone agrees on. Instead, there are questions like whether the system detected the hazard in time, whether the driver had a realistic chance to intervene, and whether the software behaved as designed.

These questions don’t have quick answers, and that’s why fault is often contested.

How Breit Biniazan Approaches These Cases

At Breit Biniazan, self-driving accident cases are approached with patience and care. These are not situations where rushing helps anyone, especially the injured person.

With over $2 billion recovered, the firm has handled complex cases where responsibility was not obvious and evidence mattered tremendously. The focus is on understanding what really happened, not accepting the first explanation offered. That means looking closely at both the human and technology sides of the crash and making sure important details are not overlooked.

When autopilot or driver-assist technology is involved, clarity matters. For people dealing with injuries, having someone willing to slow things down and look carefully can make a real difference. Contact Breit Biniazan today to discuss your legal options after an autonomous car crash.

Can Autopilot Accident Victims File a Product Liability Claim?

After a crash involving autopilot, many injured people feel stuck between two explanations. One side says the driver made a mistake. The other says the technology was never meant to drive on its own. For victims, that leaves a lot of unanswered questions.

Some autopilot crashes are not just about driver error. They involve technology that failed, reacted too late, or behaved in a way no reasonable driver could expect. When that happens, a different type of legal claim may apply.

That is where product liability lawsuits come in.

An autopilot product liability claim is not simple, and it is not available in every case. Still, for some injured victims, it may be the most direct path toward accountability and compensation.

What Is a Product Liability Claim in a Car Accident?

A product liability claim focuses on the safety of a product, not just the actions of a person. In a car accident, that means looking at whether the vehicle or one of its systems was defective and caused harm.

This is different from a standard auto accident claim, which usually centers on driver behavior like speed, distraction, or failure to yield. A product liability car accident case asks a different question: Was the vehicle safe to use as designed and marketed?

With advanced driver-assistance systems becoming the norm, that question comes up more often. Software, sensors, and automated features can all affect how a vehicle responds in real traffic. When those systems fail, the crash may be tied to a defective vehicle claim rather than a simple mistake behind the wheel.

Can Autopilot Be Considered a Defective Product?

Yes, under certain circumstances.

Software can qualify as a product when it plays a direct role in how a vehicle operates. Autopilot typically controls braking, steering, and speed decisions, which places responsibility on the manufacturer to ensure the system performs safely in real-world conditions.

Manufacturers often point to warnings and user agreements, but disclaimers do not automatically erase liability. If a system behaves in a way that puts users at risk, or if its limits are not communicated clearly, those warnings may not be enough. 

Types of Product Liability Claims in Autopilot Accidents

Autopilot product liability claims usually fall into one of three categories.

Design Defects

A design defect means the system was unsafe from the start. This may involve poor decision-making logic, lack of safeguards, or failure to account for common driving situations like stopped traffic or construction zones. If the overall design creates risk even when used properly, liability may follow.

Manufacturing Defects

Manufacturing defects involve something going wrong during production. Faulty sensors, camera issues, or hardware failures can cause the system to misread the road or respond incorrectly. These cases focus on whether the specific vehicle or component failed to meet safety standards.

Failure to Warn

Failure to warn claims focus on communication. Marketing that overstates capabilities, unclear instructions, or downplayed limitations can mislead drivers. If people are encouraged to trust technology beyond what it can safely handle, that gap can form the basis of a claim.

Common Injuries in Autopilot Product Liability Cases

Crashes involving advanced vehicle technology often happen suddenly and with little warning. That can lead to serious and unexpected injuries.

Rear-end impacts and high-speed collisions frequently cause whiplash and soft-tissue injuries. The force can also damage the spine, leading to herniated discs, nerve pain, or lasting mobility problems.

Head injuries are another concern. Concussions and traumatic brain injuries may occur even without direct head impact, especially when the body is thrown forward or backward quickly.

Broken bones, joint injuries, and internal trauma are common in more severe crashes. In the most serious cases, autopilot-related failures have been linked to fatal injuries, particularly when the system does not recognize hazards in time.

What Evidence Is Needed to Prove an Autopilot Product Liability Claim?

Autonomous cars store detailed data about how systems behave before and during a crash. Vehicle data and system logs can show whether autopilot was active and how it responded.

Event data recorders, traffic cameras, and dash footage can support that information. Crash reconstruction experts may be needed to explain how the technology contributed to the collision.

Software updates, recalls, and internal manufacturer documents can also play a role. Accessing this information often requires legal pressure and technical expertise.

Who Can Be Held Liable in an Autopilot Product Liability Case?

Product liability cases often involve more than one defendant. Depending on the facts, responsibility may extend beyond the driver. Liable parties may include the vehicle manufacturer, the developers of the autopilot software, and the suppliers of sensors or other components.

Identifying every responsible party matters, especially in serious injury cases where long-term care is involved.

Challenges in Autopilot Product Liability Claims

Manufacturers typically have strong defense teams and access to proprietary data. Disputes often arise over whether the driver should have intervened or whether the system performed as designed.

How Product Liability Claims Differ From Negligence Claims

Negligence claims focus on conduct. Product liability claims focus on safety.

The burden of proof is different, and comparative fault rules may still apply. In many cases, both claims are pursued together. A driver’s actions and a system defect can contribute to the same crash.

What to Do If You Were Injured in an Autopilot-Related Crash

After an autopilot crash, health comes first. Seek medical care right away, even if symptoms seem manageable.

Preserve the vehicle and any available data if possible. Avoid assumptions about fault, especially early on. Technology-related crashes often look different once the facts are fully developed.

How a Product Liability Lawyer Can Help Autopilot Accident Victims

Product liability lawyers focus on investigation and accountability. In autopilot cases, that includes analyzing system behavior, working with experts, and dealing directly with manufacturers and insurers. These cases require time, resources, and experience. The right approach can make a real difference in how much compensation is recovered and how responsibility is assigned.

Our Experience Handling Complex Vehicle Defect and Autopilot Injury Claims

At Breit Biniazan, we handle serious injury cases involving complex liability and advanced technology. Our team has experience investigating vehicle defects, working with technical experts, and preparing cases that demand trial-level attention.

We have secured over $2 billion for our clients, and we understand how early involvement can protect evidence and preserve options. Every case is different, and outcomes depend on the facts, but preparation always matters.

Contact Our Firm for a Free Consultation

If you were injured in a crash involving autopilot technology, contact Breit Biniazan for a confidential case review. There are no upfront fees, and our firm focuses on serious injury cases involving complex vehicle defects and corporate defendants.

We are ready to listen and explain your options.