
Question:

What happened in Africa?

Answer:


Great Britain and France successfully defeated German forces in Africa and took over Germany's African colonies.