By understanding how military medicine has evolved and adapted over time, modern practitioners can better appreciate the need to question convention, advance research, and rapidly integrate beneficial new technologies into medical care.
By Christine Miller, DPM, FACCWS, Ashley Finn, and Emily Delzell
Military medicine has always had a lasting effect on civilian practices and has often served as a catalyst for dislodging entrenched surgical dogma and improving medical care and surgical technique. The beginnings of modern triage and evacuation methods date to the Napoleonic wars, for example, while the US Civil War helped bring modern forms of anesthesia into widespread use. In more recent conflicts, the medical needs of wounded soldiers have driven significant advances in prosthetic and orthotic device design, amputee care, and rehabilitation practices.
Here, we review the evolution of selected surgical techniques as well as the challenges of treating lower extremity trauma in the US military from the Revolutionary War through the 21st century conflicts in Iraq and Afghanistan. Although many factors that have impeded progress in combat casualty care no longer plague surgeons or their patients, historical context emphasizes the need for modern practitioners to question convention, advance research, and rapidly integrate beneficial new technologies into medical care.
Problems of infection control
The use of amputation as a treatment for severe lower extremity injuries sustained during war dates to antiquity, though in the ancient world the procedure was most often used to treat gangrene, not battle wounds. In the pre-gunpowder era, swords, spears, and arrows caused most combat wounds, which were often survivable without major surgery.1
The Greek physician Hippocrates (460-370 BCE) said amputation of gangrenous limbs should occur through the gangrenous area rather than above it.2 It wasn’t until more than three centuries later that the Roman medical writer Celsus (50 BCE-25 CE) advocated circular surgical amputation through the healthy tissue above gangrenous areas, as noted by military historian Richard Gabriel: “Celsus was the first physician to suggest amputation through live tissue and to use a rasp to smooth the bone prior to closure.”3
Celsus also described ligating vessels, creating a surgical flap to cover the amputation site, and packing wounds with vinegar-soaked sponges.2
Historians say concern regarding the onset of sepsis influenced ancient treatment protocols significantly. “The Roman practice of removing decayed foreign matter from the wound before and after repeated cleansing helped reduce the rate of tetanus and gangrene, as did loose bandaging, regular bandage changes and the use of surgical clips,” Gabriel wrote.3
However, the Greeks and Romans, notably the gladiatorial and, later, imperial physician Galen (130-200 CE), also made the theory of “laudable” pus—the belief that infection is a necessary and beneficial component of successful healing—a central tenet of wound management.4
Although surgeons reported sporadically on antiseptic practices from the time of the Sumerians and ancient Greeks (Hippocrates advocated pouring wine over wounds, for example) and, over the centuries, some directly disputed the idea that suppuration was essential for wound healing, they were unable to dislodge Galenic doctrine.4,5 In his 2006 Annals of Surgery review of several millennia of combat care, Pruitt labeled this the “tyranny of surgical dogma” but, along with others, noted that part of the problem was a lack of understanding of the origins and transmission of infection.1,4,6
Physicians distinguished between different types of suppuration. Wounds that produced thick creamy pus (i.e., laudable pus) could take months to heal, but patients were often free of systemic infection, and many recovered. A watery, foul-smelling discharge, however, indicated pyemia, which killed about 97% of those infected.1,7
The theory of laudable pus hindered medical practice for almost 2000 years and US military doctors, much like their Roman predecessors, struggled with the notion of amputation and its unfortunate consequences. Before the advent of antiseptic technique, almost all amputation wounds became infected, and overall postamputation mortality was about 50%.6
The primacy of laudable pus finally came to an end in the late 19th century after British surgeon Joseph Lister (1827-1912), building on the germ theory of disease proposed by French microbiologist Louis Pasteur (1822-1895), published his antiseptic operative techniques in 1867. Lister reported that his methods, which involved scrubbing all operating room surfaces with a carbolic acid solution, sterilizing instruments in boiling water, and disinfecting surgeons’ hands, reduced mortality after amputation from 46% to 15%.8
Still, Listerian practice, which required much preoperative preparation, was slow to catch on in the US. Asepsis—the use of sterile operating gowns, caps, and gloves—was not emphasized in American military medicine until well after the Civil War, and took even more time to gain general acceptance.4,6
Revolutionary War
The military surgeons of the Continental Army lacked the training and resources to deal with the growing number of casualties of the Revolutionary War (1775-1783). When fighting broke out, most colonial physicians were products of an apprentice system and lacked standardized medical education.4 There were only two medical schools in the colonies: the Medical Department of the College of Philadelphia (later the University of Pennsylvania) and King’s College in New York City (later Columbia University).
Until the aftermath of the June 1775 Battle of Bunker Hill, where soldiers were subject to disorganized and sometimes incompetent medical care, the Continental Army had no structured medical service.4
The Second Continental Congress tried to address these inadequacies by establishing the Hospital Department for the Army in July 1775, but infighting among its leaders plagued the department—and soldiers—throughout the war, leading to critical shortages of resources and supplies and failed attempts to enforce medical competency among Army surgeons.4
Continental Army surgeon John Jones, MD, unlike many of his contemporaries, had followed his medical apprenticeship with formal training in England and France, earning a medical degree from the University of Reims in 1751. He became the first professor and chair of surgery in the colonies (King’s College, 1767), and worked to alleviate the paucity of formal medical education and guidelines by authoring the first American surgical text and first army manual published in the Americas, Plain Concise Practical Remarks on the Treatment of Wounds and Fractures, which was used extensively by Revolutionary War surgeons.9
Jones served as a surgeon in the British army during the French and Indian War (1754-1763), and this experience, which provided a close-up view of the risks of amputation, including lethal sepsis and shock, likely influenced his focus on limb salvage rather than sacrifice.9 He is widely considered the father of American surgery.
Some techniques Jones advocated were using bark emollient for cautery, debriding foreign bodies with as little soft tissue disruption as possible, and amputating when other treatments failed, though surgeons commonly amputated compound fractures to avoid life-threatening infection.4 Surgeons’ general inclination to limit wound exploration and debridement inevitably led to infection.
Most surgeons began their military service with little combat experience, and surgical technique and equipment varied significantly from site to site. They used large curved knives or saws to accomplish amputation, and anesthesia typically consisted of quantities of rum or wine, a tobacco juice mixture, or, more often, nothing.1 Holding patients still during operations required two or three surgical attendants. A screw-type tourniquet, invented by French surgeon Jean Petit in 1718, helped control bleeding.1
Other treatment options for lower extremity injuries were splinting and removal of “easily reached” bullets (the only kind surgeons were advised to try to extract), which were made from lead and deformed on impact, often causing severe tissue and bone damage.1 Doctors treated burns with a range of topical agents, including wine for superficial burns and hog’s lard for full-thickness burns. Surgeons of this era attempted limited debridement of shrapnel, but if the injury was too complex, amputation remained the solution.10
Benedict Arnold is famously quoted as saying he “would rather die than be a one legged cripple” after being wounded in battle.10 His statement suggests soldiers of this era viewed amputation as a fate worse than death.
Civil War
At the onset of the US Civil War (1861-1865), medical education was still not standardized, and medical departments on both sides were unprepared and understaffed.10 By the conflict’s end, however, a number of important advances had occurred: surgeons, now ranked by skill and experience, commonly used ether or chloroform for anesthesia and bromine to prevent gangrene, and the early lack of organization gave way to more structured medical care.1,4
Civil War soldiers experienced large numbers of gunshot fractures, which highlighted the importance of orthopedic surgery as a critical surgical specialty and drove technical advances in the field, including internal fixation of fractures with metal wire and brass pins as well as the first US military experimentation with external fixation.10
Unfortunately, the introduction of antiseptic practice came too late for Civil War soldiers; the US military would not adopt it until the Spanish-American War. During that five-month conflict in 1898, every soldier carried an occlusive antiseptic dressing in his first aid kit,4 and this innovation, along with the use of hydration for wound shock, halved gunshot-wound mortality from Civil War levels to about 7.4%.1
During the war, both Union and Confederate sides recruited civilians into medical service, and most lacked operative and combat experience or more than a two-year medical education.11 By the war’s end, however, Union medical forces had grown from fewer than 200 to more than 10,000 surgeons. The Confederacy established the Medical Department of the Confederate States of America in 1861, and ended the conflict with more than 4000 surgeons.11
“Amputation was commonly practiced and became the primary surgical skill of the Civil War battlefield surgeon,” wrote Tooker in a paper on Civil War-era medicine and nursing.11 Amputation was the most common surgical procedure for gunshot wounds, which had taken on new destructive force with the introduction of the Minié ball, first used in the Crimean War (1853-1856). These conical lead projectiles deformed on impact, shattering bone and causing massive tissue damage.1
Many battlefield hospitals were filled with wounded soldiers awaiting the saw blade or knife to remove their injured limbs, and about 12% of all battle injuries resulted in major amputation. Overall mortality for lower limb amputation was 33%, increasing to 54% for above knee amputation. Most amputations were done at the injury level to preserve limb length. Some surgeons had training in tissue flap techniques, but most performed circular amputation to control bleeding.1
Performing amputations took a toll on the surgeons. A surgeon from the 121st New York Volunteer Infantry noted, “Every house for mile around is a hospital and I have seen arms, legs, feet, and hands lying in piles rotting in the blazing heat of a southern sky unburied and uncared for, and still the knife went steadily in its work adding to the putrid mess.”12
One innovation to come out of Civil War medicine still in use today was mass casualty management. Union surgeon Jonathan Letterman, MD, built on the experience of renowned French surgeon Baron Dominique-Jean Larrey in the Napoleonic Wars (1803-1815), and devised a triage system to handle the large numbers of battlefield wounded.13 He also standardized ambulance service, and each Union regiment, in theory at least, had three ambulances and a transport cart. In practice, the numbers of wounded often overwhelmed such systems and many injured soldiers lay on battlefields for days.4
Letterman also introduced competence-based ranking of surgeons, mandating that only the three most experienced surgeons in each Union division perform operations.4 On the Confederate side, J.J. Chisolm, MD, the first surgeon appointed to active military service by the fledgling government, also instituted a form of credentialing, recommending senior colleagues evaluate all surgeons before allowing them to amputate. Chisolm also published a military surgery manual in which he argued against the theory of laudable pus and emphasized concepts such as nutrition and nursing care.14
Civil War surgeons also confirmed that primary amputation—amputation performed as soon after injury as possible (after recovery from shock and before infection onset)—was superior to secondary amputation.4 Records from the Confederate Medical Department, for example, reported 30% mortality in patients undergoing primary amputation versus 53% in those who underwent secondary amputation.14
The Civil War proved to be an eye-opening event with regard to the importance of organization and preparedness of wartime military medicine in the US.
World War I
Trench warfare in World War I (1914-1918) led to high numbers of lower extremity wounds, sustained as soldiers ran between trenches while the enemy fired from low elevations. Bullets had again become more lethal. Soldiers began using metal-jacketed bullets in the Spanish-American War, and these projectiles produced greater cavitation and soft tissue damage than earlier bullets.1 Weapons technology would continue to follow this trajectory of increasing damage and lethality throughout all future conflicts.
In the early years of the war, compound lower limb fractures caused by gunshots in trench warfare sparked debate over the traditional splinting practices that delayed surgery, leading to high mortality rates, particularly for open femoral fractures.15
Femoral fractures stranded soldiers on the battlefield, and stretcher-bearers reached them only with difficulty, leaving many lying wounded for days or enduring rough transport, all of which left soldiers particularly vulnerable to gas gangrene and secondary hemorrhage. Australian surgeons in France reported injury-to-treatment times ranging from 36 hours to a week, and averaging three to four days.16 Fracture immobilization during transport was poor, and in the early war years surgeons reported about 80% mortality for soldiers with femoral fractures transported from the field.15
By 1915 medics and stretcher-bearers were routinely trained to apply immobilizing splints, and by 1917 specialized femur wards had been established;16 during this period mortality from all fractures fell to about 12%15 and below 20% for open femoral fractures.10
The nature of the injuries once again highlighted the importance of orthopedic care and, because the US didn’t enter the conflict until 1917, its military leaders had three years to observe and prepare for soldiers’ medical needs. In 1916 the Office of the Surgeon General formed a subsection, the Department of Military Orthopedic Surgery, and in 1917 an Orthopedic Section accompanied the American Expeditionary Force to Europe.10
Wound management in World War I consisted of antiseptics such as iodine and Dakin’s solution (diluted sodium hypochlorite and boric acid). Surgeons in the early war years relied on primary closure, which led to severe infections. These poor results prompted a move to delayed closure with thorough debridement of foreign bodies and loose bone fragments.1 In cases of major trauma, gas gangrene, and arterial injury, amputation remained the mainstay.15
Fracture management relied heavily on traction splints, and US surgeons preferred the modified Liston splint, which was wood with metal bar sections that allowed wound access.15
In the decades prior to WW I, scientists had discovered blood typing17 and x-ray technology,18 and surgeons used these techniques, along with anti-tetanus serums,4 to improve combat care. (X-rays, discovered in 1895, had also been used to localize bullet fragments and fractures during the Spanish-American War.)10
“Official reports and clinical case histories fail to illuminate the individual wretchedness of dying young men—their faces to the wall, whimpering and calling for their mothers; nor the agonizing on the part of the surgeons faced with the dilemma of if, or when, to amputate,” wrote a contemporary surgeon.15
World War II & Korea
During World War II (1939-1945) significant advances in pharmacology shifted the paradigm from splinting and amputation to more skilled surgical care. Transfusion medicine, including a reliable supply of typed blood and blood products, and the advent of antibiotics helped transition surgery from straight amputation of severely injured limbs to successful internal fixation of fractures.20
In this period, surgeons used advanced salvage techniques such as bone grafting. External fixation was used in the first years of the war, but many military surgeons had little experience with the technique, and high infection rates led them to rely on functional casting for long-bone fractures, a combat medicine policy that stayed in place until the early 1970s.1
Many techniques remained largely experimental during the war years because limited antibiotic supplies kept postoperative infection rates high. Although Scottish biologist Alexander Fleming discovered penicillin in 1928, the first purified penicillin, which had broad antibacterial action and low toxicity, wasn’t manufactured until 1942, when it began to enter use in military hospitals.4
In the interwar years, Army officials recognized the need for a permanent medical ancillary organization and established the US Army Medical Service Corps, which has its roots in the Revolutionary War-era Hospital Department for the Army and in the Union ambulance and surgical services of the Civil War.19 By WW II this trauma system provided continuity of care to wounded soldiers as they were transported by vehicle, train, ship, or air, sometimes to distant facilities for advanced care and rehabilitation.4
Auxiliary groups of surgeons now included all types of surgical specialists and anesthesiologists and deployed as mobile teams to all levels of treatment facilities, from field clearing stations to general hospitals.4 Their detailed records documented outcomes and laid a base for evidence-based casualty care.
Amputation protocols also solidified. In 1943 the Surgeon General of the Army issued a directive stating that, in combat conditions, surgeons should use open circular or guillotine amputation and avoid primary suture of extremity injuries, confirming the benefits of delaying closure of contaminated wounds.20 Military surgeons relied on ligation for vascular injuries and rarely attempted vascular repair, though ligation resulted in gangrene and subsequent amputation in about half of those who underwent the procedure.21
In the Korean Conflict (1950-1953), limb salvage was emphasized; rather than ligating injured extremity arteries, surgeons now attempted direct vascular repair and arterial replacement. Rapid evacuation combined with specialized, stable hospital care produced better results than in WW II: the amputation rate for vessels repaired by suture was 13%,4 compared with 36% in WW II soldiers.22
US surgeons used intramedullary fixation for lower extremity injuries for the first time and, though they achieved good results, recommended reserving the procedure for patients who didn’t respond to traction and suspension.10
Treatment for shock also improved along with understanding of its mechanisms, and US soldiers received adequate volumes of resuscitation fluids in the first forward care surgical facilities,23 the Mobile Army Surgical Hospital (MASH) units.24 Helicopter transport of casualties also saw its first use during the Korean Conflict.4
Vietnam
During the Vietnam War (1959-1975), surgical specialists in small medical units close to the combat zone4 could provide specialized, definitive surgical care more rapidly, sometimes within one to two hours of injury, though four to six hours was more common.25 Rapid helicopter transport and specialized surgical care saved the lives of soldiers who in previous conflicts would have died on the battlefield or during transport. The North Vietnamese relied heavily on machine gun fire, booby traps, and mines, and the nature of this weaponry increased rates of major amputation, which rose to 3.4% of all battle injuries from 1.2% in WW I and 1.4% in Korea.26
“Combat injuries to the extremities were treated with wound exploration, debridement, and open packing. Pulsatile lavage was used to treat contaminated war injuries and primary closure was avoided in country,” wrote Schoenfeld in a review of orthopedic surgery in the US Army.10
Vascular repair was attempted to salvage limbs10 and US surgeons in Vietnam pioneered repair of venous battle injuries, which were still typically treated with ligation, and established a registry to track vascular injury and repair outcomes.27
Compared with earlier military conflicts, Vietnam demonstrated the advantages of sophisticated surgical technique combined with rapid, well-organized emergency care.
Iraq & Afghanistan
In the recent conflicts in Afghanistan (Operation Enduring Freedom, 2001-present) and Iraq (Operation Iraqi Freedom, 2003-2010, and its drawdown phase, Operation New Dawn, 2010-2011), extremity injuries have been prevalent and explosive devices their primary cause.28
In these modern-day conflicts explosions cause 87.9% of all injuries,28 and body armor, or personal protective equipment, combined with rapid transport to medical units with advanced surgical capabilities has allowed medical personnel to save the lives of soldiers with critical chest and other injuries that would have been lethal in earlier wars.29 The case fatality rate among soldiers fighting in Iraq and Afghanistan is 9.4%, lower than in all other modern wars, including WW II (19.1%) and Vietnam (15.8%). The extremities remain vulnerable, however, and 70.5% of all injuries sustained in these conflicts have been major limb injuries.28
The rate of major amputations, 5.2% of all serious injuries and 7.4% of all major limb injuries, is similar to earlier conflicts, including Vietnam. From 2001 to 2006 soldiers wounded in Iraq and Afghanistan sustained 3854 lower extremity and 3349 upper extremity injuries classified as serious by forward medical teams. Those with lower extremity injures were almost three times as likely as those with upper extremity trauma to undergo amputation (8.5% vs 3.1%), a finding for which Stansbury et al noted several possible explanations, including the ground-level location of many explosive devices, the massive tissue destruction in limbs close to the blast area, and the higher rates of nonunion and infection among lower versus upper extremity injuries.28
Multiple amputations also have been more common in recent conflicts. About 18% of amputees wounded in Iraq and Afghanistan have more than one extremity amputation, much higher than rates recorded from WW I through Korea, when they ranged from 2% to 8%, but on par with the 18% rate reported during Vietnam.30
There has been an increase in late amputations among soldiers serving in 21st-century conflicts. In 2008 Stansbury et al reported a 5% rate of amputations done more than three months after injury,28 but in 2012 Stinner et al31 reported a 15.2% rate of late amputations (range, 12 weeks-5.5 years).
A possible reason for this rise is the success with which amputees are returning to function with prosthetic devices.31 Prosthetic technology has advanced profoundly over the past 40 years, allowing many amputees to return to high levels of activity. These advances have contributed to a trend in which amputation may be chosen over lengthy reconstructive procedures.
The goal of military medicine has been to give wounded soldiers the best and most efficient care to restore their quality of life after traumatic injuries.32 US military doctors have proposed a new model to achieve this purpose.
The Orthopedic Specialty Care Management Team program, initiated on September 1, 2005, was designed to optimize the orthopedic surgical treatment process for wounded soldiers. The program includes a chief of orthopedic surgery, a chief of podiatry, and a chief of physical therapy.33 This multidisciplinary team evaluates each soldier’s injury and applies the most comprehensive treatment plan that will allow him or her to return to service.
The ability of soldiers to return to duty after being wounded is highly valued. Reviving an organization such as the Invalid Corps may be of use; in the past it served as a means of keeping wounded soldiers active and part of the military.
The Invalid Corps first appeared in the Revolutionary War, when Colonel Lewis Nicola led a small unit of wounded soldiers in noncombat roles such as guard duty and training drills. During the Civil War, the Invalid Corps had nearly 60,000 men in its ranks, organized into a three-battalion structure according to the severity of disability.35 The program was dismantled after the Civil War despite its earlier successes, largely because of malingerers in its ranks. In a 2008 review of military medical disability care, Lande proposed that “…a modernized form of the Invalid Corps would preserve personnel, retain valuable experience, promote self-esteem, and write a new chapter on military disabilities.”34
The development of better functioning prosthetic devices and improved combat and amputee care greatly increases the feasibility of retaining injured soldiers. Return-to-duty rates reflect these advances.
In 2010 Stinner et al35 reported that 16.5% of soldiers who sustained a combat-related amputation between 2001 and 2006 returned to active duty, findings that contrast sharply with a 1995 study that found return-to-duty-rates of 2.3% among soldiers who underwent amputation between 1980 and 1988.36
The complex issue of treating lower extremity war injuries has concerned military doctors and surgeons throughout the ages. At the dawn of this nation, amputation was the mainstay of treatment but carried terrible consequences. Civil War surgeons also relied heavily on amputation as the primary surgical treatment for lower extremity injuries, but they brought about a triage and ambulance system that made care more efficient.
The greatest success in limb salvage began during the World War II era with the arrival of antibiotics, which allowed for infection control and internal fixation. From Vietnam to the present, reliance on amputation as a primary treatment has declined. Amputation is still used in the direst circumstances, however, and gives injured soldiers the opportunity for an improved quality of life with a prosthetic device. Experimental advances in robotics and prosthetics technology suggest the future is promising.
Christine Miller, DPM, FACCWS, is assistant professor and chair of the Department of Podiatric Medicine & Orthopedics, and Ashley Finn is a doctoral candidate, at the Temple University School of Podiatric Medicine in Philadelphia. Emily Delzell is associate editor of Lower Extremity Review.
Acknowledgment: The authors thank Robert Butera, historical interpreter at the Old Barracks Museum in Trenton, NJ, for his help with the Revolutionary War research for this manuscript.
2. Kirk NT. The development of amputation. Bull Med Libr Assoc 1944;32:132-163.
3. Gabriel RA. The best medicine. Military History July 2011.
4. Pruitt BA Jr. Combat casualty care and surgical progress. Ann Surg 2006;243(6):715-729.
5. Miller JT, Rahimi SY, Lee. History of infection control and its contributions to the development of successful brain tumor operations. Neurosurg Focus 2005;18(4):1-5.
6. Alexander JW. The contributions of infection control to a century of surgical progress. Ann Surg 1985; 201(4):423–428.
7. Bollet AJ. Civil War Medicine: Challenges and Triumphs. Tucson, AZ: Galen Press Ltd; 2002.
8. Lister J. On the effects of the antiseptic system of treatment upon the salubrity of a surgical hospital. Lancet 1870;95(2419):40-42.
9. Griesemer AD, Widmann WD, Forde KA, Hardy MA. John Jones, MD: pioneer, patriot, and founder of American surgery. World J Surg 2010;34(4):605-609.
10. Schoenfeld AJ. Orthopedic surgery in the United States Army: a historical review. Mil Med 2011;176(6):689-695.
11. Tooker J. Antietam: aspects of medicine, nursing, and the civil war. Trans Am Clin Climatol Assoc 2007;118:215-223.
12. Nelson JH. Misery holds high carnival. America’s Civil War September 2007:30-39.
13. Givens R. Father of emergency medicine. American History 2009;43(6):17.
14. Cunningham HH. Doctors in Gray: The Confederate Medical Service. Baton Rouge, LA: Louisiana State University Press; 1958.
15. Kirkup J. Fracture care of friend and foe during World War I. ANZ Journal of Surgery 2003;73(6):453-459.
16. Hurley V, Weedon SH. Treatment of fractured femur at base hospital in France. Br J Surg 1918;6(23):351-376.
17. Pruitt BA Jr. Multidisciplinary care and research for burn injury: 1976 presidential address, American Burn Association meeting. J Trauma 1977;17(4):263-269.
18. Cushing H. From a Surgeons Journal 1915–1918. Boston: Little, Brown, 1936:50-51.
20. Rankin FW. Mission accomplished: the task ahead. Ann Surg. 1949;130:289-309.
21. DeBakey ME, Simeone FA. Battle injuries of the arteries in World War II: an analysis of 2,471 cases. Ann Surg 1946;123(4):534-579.
22. Hughes CW. Arterial repair during the Korean War. Ann Surg 1958;147(4):555-561.
23. Pruitt BA Jr. Centennial changes in surgical care and research. Ann Surg 2000;232(3):287-301.
24. Woodard SC. The story of the mobile army surgical hospital. Mil Med 2003;168(7):503-513.
25. Hardaway RM 3rd. Viet Nam wound analysis. J Trauma. 1978;18(9):635-643.
26. Potter BK, Scoville CR. Amputation is not isolated: An overview of the US Army Amputee Patient Care Program and associated amputee injuries. J Am Acad Orthop Surg 2006;14(10 Spec No):S188-S190.
27. Rich NM, Leppäniemi A. Vascular trauma: a 40-year experience with extremity vascular emphasis. Scand J Surg 2002;91:109-125.
29. Nelson T, Clark T, Stedje-Larsen E, et al. Close proximity blast injury patterns from Improvised Explosive Devices in Iraq: A report of 18 cases. J Trauma. 2008;65(1):212-217.
30. Mayfield GW. Vietnam War amputees. In: Burkhalter WE, ed. Surgery in Vietnam: Orthopedic Surgery. Washington, DC: Office of the Surgeon General; 1994:131-153.
32. Gailey R, McFarland LV, Cooper RA, et al. Unilateral lower-limb loss: prosthetic device use and functional outcomes in servicemembers from Vietnam war and OIF/OEF conflicts. J Rehabil Res Dev 2010;47(4):317-331.
33. Johnson AE, Roach C, Jolissaint D. The orthopedic specialty care management team: the McDonald Army Health Center experience. Mil Med 2009;174(7):745-749.
34. Lande RG. Invalid corps. Mil Med 2008;173(6):525-528.
36. Kishbaugh D, Dillingham TR, Howard RS, et al. Amputee soldiers and their return to active duty. Mil Med 1995;160:82-84.