Noun western united states has 1 sense
  1. West, western United States - the region of the United States lying to the west of the Mississippi River
    --1 is a kind of geographical area, geographic area, geographical region, geographic region
    --1 is a part of
     United States, United States of America, America, US, U.S., USA, U.S.A.
    --1 has parts:
     Santa Fe Trail; Southwest, southwestern United States; Northwest, northwestern United States
    --1 has particulars: Wild West
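The four link types shown in this entry (is a kind of, is a part of, has parts, has particulars) are WordNet's hypernym, part-holonym, part-meronym, and hyponym relations. A minimal sketch, using only a hand-built dictionary copied from the entry above (not a real WordNet API), could model them like this:

```python
# Minimal sketch: the relations printed above for "West, western United
# States", stored as a tiny in-memory graph. Keys use WordNet's relation
# names; values are copied from the entry text. This is an illustration,
# not a lookup against the actual WordNet database.
RELATIONS = {
    "hypernym": ["geographical area"],        # "--1 is a kind of"
    "part_holonym": ["United States"],        # "--1 is a part of"
    "part_meronym": [                         # "--1 has parts"
        "Santa Fe Trail",
        "Southwest, southwestern United States",
        "Northwest, northwestern United States",
    ],
    "hyponym": ["Wild West"],                 # "--1 has particulars"
}

def related(relation: str) -> list[str]:
    """Return the synsets linked to the entry by the given relation."""
    return RELATIONS.get(relation, [])

print(related("part_meronym"))
```

With a real WordNet installation, the same links are exposed by lookup tools and libraries; the dictionary above just mirrors what this one entry prints.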