Definition of West coast

1. Noun. The western seaboard of the United States from Washington to southern California.

2. Adjective. Of or relating to the western seaboard of the United States. ¹

¹ Source: wiktionary.com

Lexicographical Neighbors of West Coast

West's syndrome
West-Flanders
West-sider
West Africa
West African
West African fever
West African sleeping sickness
West African trypanosomiasis
West Bank
West Bengal
West Berlin
West Berliner
West Brit
West Briton
West Chadic
West Coast
West Country
West End
West Flanders
West Flemish
West Frisian
West Frisian Islands
West German
West Germanic
West Germanic language
West Germans
West Germany
West Greece
West Highland white terrier
West Hollywood
