Games are a major testing ground for Artificial Intelligence. Though AI has become proficient at playing games such as Space Invaders, it behaves in a distinctly artificial way, lacking the human-like qualities of a real player. This human element is important in competitive multiplayer games, where a large part of the enjoyment comes from outwitting the strategies of other human players. To address this issue, we investigate a novel AI technique that leverages planning and human demonstrations to create an opponent that exhibits desirable qualities of human play in the context of a fighting game. We introduce the idea of action-deltas, which relate each action performed to the resulting change in the game state. These action-deltas are learned from human demonstrations and are used to help the AI plan strategies for hitting the opponent. We implement a simple fighting game called FG for the AI to compete in and provide it with a human demonstration to learn from. The AI combines action-deltas with other search techniques to emulate human behavior. Lastly, we evaluate the effectiveness of our AI by comparing its similarity score against those of other algorithms and of other demonstrations by the same human player.
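To make the action-delta idea concrete, the following is a minimal sketch, not the paper's implementation: it assumes a one-dimensional position as the game state, averages the state change observed for each action in a demonstration, and greedily plans a sequence of actions toward a goal position. All names here (`learn_action_deltas`, `plan`, the action labels such as `dash_right`) are illustrative and do not appear in the original work.

```python
from collections import defaultdict

def learn_action_deltas(demonstration):
    """Average the state change observed each time an action was performed.

    `demonstration` is a list of (state, action, next_state) tuples,
    here with states as 1-D positions (an illustrative simplification).
    """
    sums = defaultdict(lambda: [0.0, 0])
    for state, action, next_state in demonstration:
        sums[action][0] += next_state - state
        sums[action][1] += 1
    return {action: total / count for action, (total, count) in sums.items()}

def plan(deltas, state, goal, max_steps=20):
    """Greedily choose the action whose learned delta lands closest to the goal."""
    actions = []
    for _ in range(max_steps):
        if abs(goal - state) < 0.5:   # close enough to the goal state
            break
        best = min(deltas, key=lambda a: abs(goal - (state + deltas[a])))
        actions.append(best)
        state += deltas[best]         # predict the outcome using the delta
    return actions

# Hypothetical demonstration data: (position, action, next position)
demo = [(0.0, "step_right", 1.0), (1.0, "step_right", 2.0),
        (2.0, "step_left", 1.0), (1.0, "dash_right", 4.0)]
deltas = learn_action_deltas(demo)
print(plan(deltas, state=0.0, goal=6.0))  # → ['dash_right', 'dash_right']
```

A real fighting game would use a richer state (positions, velocities, animation frames) and a search procedure rather than pure greedy selection, but the core loop is the same: predict the next state by adding the learned delta for a candidate action, then pick actions that move the predicted state toward the target.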