Women's American football in the United States
Women's American football in the United States is American football played by women, organized both regionally within the United States and internationally through the IFAF Women's World Championship.
"Women's football in the United States" redirects here. For women's association football in the United States, see Women's soccer in the United States.