A. Whether to run virtualized applications locally on clients or on presentation servers (such as Remote Desktop Services or Citrix) is always an interesting discussion when those applications work with a large amount of data stored in the datacenter.

Generally, if an application uses a lot of data that lives in the datacenter, hosting the application in the datacenter and serving it via presentation virtualization application publishing results in the least network traffic and the best responsiveness. Running the application locally and fetching all of that data over the network can result in a very poor experience.

You're balancing the data the application will access over the network (when running it locally) against the bandwidth used for mouse and keyboard activity and screen updates (when running it remotely). Obviously, network bandwidth isn't the only consideration. With presentation virtualization, you need servers in the datacenter to run the applications and you aren't taking advantage of the resources on the local client device, so it costs more than simply running the applications on the client devices.
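To make that tradeoff concrete, here's a rough back-of-the-envelope sketch in Python. The figures (how much application data a session pulls from the datacenter, and how much bandwidth a remoting session consumes) are purely illustrative assumptions, not measurements; plug in numbers from your own environment.

```python
# Rough, illustrative comparison of client-to-datacenter traffic for running an
# application locally versus publishing it via presentation virtualization.
# All figures below are assumptions for the sake of the example.

# Assumption: the application reads ~2 GB of datacenter data per user session
app_data_per_session_gb = 2.0

# Assumption: a remoting session (keyboard, mouse, screen updates) averages
# ~100 kbps over a 1-hour session
remote_session_kbps = 100
session_hours = 1

# Run locally: all of the application data crosses the WAN
local_traffic_gb = app_data_per_session_gb

# Run remotely: only the remoting protocol crosses the WAN;
# the application data stays inside the datacenter
remote_traffic_gb = remote_session_kbps * 1000 / 8 * session_hours * 3600 / 1e9

print(f"Run locally:  {local_traffic_gb:.2f} GB over the WAN per session")
print(f"Run remotely: {remote_traffic_gb:.3f} GB over the WAN per session")
```

With these sample numbers the remoting session uses a small fraction of the bandwidth of running the application locally, which is the general pattern the guidance above describes.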

The picture below summarizes the difference. At the top, all of the data used by the application travels between the client and the datacenter, which could be a huge amount of traffic. At the bottom, a presentation virtualization server in the datacenter runs the application, so the application data stays within the datacenter. The only traffic between the client and the datacenter is the mouse, keyboard, and screen updates, which are minimal in most use cases. (And no, my son didn't draw this; I'm trying a new medium.)

Local vs. Remote Running