Tkinter is a desktop GUI toolkit driven by mouse and keyboard input, while the iPhone has a touch-based UI. This means some interactions and gestures simply don't map between them. For example, the phone interface has no analogue of right-click or click-and-drag; likewise, the desktop interaction model has no analogue of a long-press or a swipe.
You can try to assign arbitrary mappings between some of these, saying for example that a long-press equals a right-click, but a mapping that works fine in one application may work poorly in another, where the long-press might be needed for a different gesture, such as selecting an item for drag-and-drop. No single mapping works for every application, yet to make anything portable at the framework level you would need fixed mappings.
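To make the long-press-equals-right-click idea concrete, here is a minimal sketch of such a mapping as a timer-based classifier. All the names here (`LongPressDetector`, the event strings, the 0.5 s threshold) are illustrative assumptions, not part of Tkinter or any real toolkit:

```python
LONG_PRESS_SECONDS = 0.5  # assumed threshold separating a tap from a long-press

class LongPressDetector:
    """Classifies a press/release pair as a tap or a long-press,
    then maps each to the desktop action it stands in for."""

    def __init__(self, threshold=LONG_PRESS_SECONDS):
        self.threshold = threshold
        self._pressed_at = None

    def press(self, timestamp):
        # Record when the finger went down.
        self._pressed_at = timestamp

    def release(self, timestamp):
        # Classify based on how long the press was held.
        held = timestamp - self._pressed_at
        self._pressed_at = None
        # This mapping (long-press -> right-click) is exactly the kind of
        # per-application choice described above: another app might need
        # the long-press for drag-and-drop selection instead.
        return "right_click" if held >= self.threshold else "left_click"
```

For example, `d = LongPressDetector(); d.press(0.0); d.release(0.2)` yields `"left_click"`, while a release at 0.8 seconds would yield `"right_click"`. The point is that the classification itself is easy; deciding what the long-press should *mean* is the part that can't be fixed once for every application.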
Then you have the issue that the phone has no concept of floating windows, resizable windows, overlapping windows, floating dialogs and menus, scroll bars, and many other desktop staples. How do you do shift-click or control-click on a phone? Conversely, the phone has conventions of its own, such as swiping from the edge of the screen and multi-touch gestures. There are very good reasons why even Apple's own OS X GUI framework was not ported directly to iOS.
It is possible to write UI toolkits from scratch that are designed to map between the touch and desktop paradigms. The way to do this is to identify a subset of gestures and interactions common to both models and create strict mappings between them at the toolkit level. This gives you portability, but at the cost of sacrificing any interaction that can't be deterministically mapped to the other platform. Unfortunately, Tkinter wasn't designed in terms of sacrificing desktop GUI capabilities to keep a subset portable to phones, so it includes a lot of interactions and display widgets that simply don't port over.
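A toolkit built around a shared subset might be sketched as a fixed translation table like the one below. The gesture names and event strings are hypothetical; the key property is that anything outside the portable subset is deliberately unsupported, which is the trade-off described above:

```python
PORTABLE_GESTURES = {
    # abstract gesture: (desktop event, touch event)
    "activate":     ("left_click",  "tap"),
    "context_menu": ("right_click", "long_press"),
    "scroll":       ("mouse_wheel", "drag"),
}

def native_event(gesture, platform):
    """Translate an abstract gesture into the platform's native event.
    Returns None for gestures outside the portable subset, e.g. a
    swipe-from-edge, which has no fixed desktop analogue."""
    events = PORTABLE_GESTURES.get(gesture)
    if events is None:
        return None
    return events[0] if platform == "desktop" else events[1]
```

An application written only against the abstract gestures is portable by construction; one that reaches for `"swipe_from_edge"` gets `None` and must be redesigned. Tkinter exposes the full desktop event model directly (`<Button-3>`, `<Shift-Button-1>`, and so on), so there is no such enforced subset to port.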