By Bruce Armstrong
December 1, 2004 12:00 AM EST
First there were the thin clients. Not the Internet thin clients; I'm talking about the mainframe applications with dumb terminals. Everything ran on the server; the client was there only to display output and accept input from the user.
Then PCs came along and a paradigm shift occurred toward thick clients. Really thick clients in fact. Not only was the user interface running locally on the PC, but often so was the data access layer, with the application accessing local database files or perhaps a shared database file on the network. Key reasons for the shift were a better user experience and decentralization of development. PCs also tended to be cheaper and potentially faster than mainframe computing.
The pendulum then swung back toward the middle with the advent of client/server computing. If you're an old dog like me, that's when you first started using PowerBuilder. Client/server was an attempt to recentralize the storage of data (making the same data available to different applications simultaneously) while maintaining the rich user interface.
That was followed by another shift back toward the server with the introduction of distributed computing. Folks began to realize that they not only needed to share the same data between different applications, they needed to share the same business rules as well. So those business rules were also moved to the server side of the equation, just not necessarily to the same server that was hosting the database.
The pendulum then swung nearly back to its original starting point with the dawn of the Internet version of thin clients. Not fully back to the start: the user interface in even the most rudimentary thin client Internet/intranet application is still significantly better than in the mainframe dumb terminal days, but it in turn pales in comparison with what can be accomplished with a thick client. People were beginning to see large operating expenses related to the maintenance and upgrade of PCs. It was not uncommon to find that installing a new application introduced files (e.g., DLLs) that were incompatible with those used by an existing application, or that two different applications required two different versions of a database client layer. Internet/intranet thin client applications would save the day by reducing the cost of deploying an application to the client to near zero.
In a somewhat related scenario, there was also the introduction of mobile devices such as PDAs that could use Internet technology. While the thin client approach was of particular importance for these devices because of their more limited hardware capabilities, there was also the introduction of technology specialized for these devices that allowed them to run in an occasionally connected mode.
Thin clients come at a cost, though, and that cost has fallen primarily on the user experience and performance of the application. The thinner the client, the more work the server has to do and the more round-trips that have to be made between client and server to update the display. As DataFace notes in their "Benefits of Smart Clients" white paper: "A 6-processor web server is outgunned by 100 desktops."
So the pendulum started swinging back again. But just as the Internet thin client applications didn't mean a return to mainframe dumb terminals, the shift back toward thick client applications won't mean a return to client/server or distributed computing. The coming paradigm shift will be toward clients that strike the balance between thick and thin clients by being "smart" clients.
What a "smart client" paradigm attempts to do is combine the advantage of thick clients (a rich user experience) with the advantage of thin clients (ease of deployment). As Paul Kimmel noted in his developer.com article entitled "Smart Clients: Windows Forms Flexibility with Web Application Ease":
A smart client basically is a Windows [...] application that dynamically updates client assemblies....Your customers will get a Windows [...] application with all of the expressivity they will have come to expect, along with an ease of deployment like the Web provides.
A lot of the buzz surrounding smart clients has revolved around Microsoft's toolset. However, as Jonathon Walsh and Joe Misner note in their developer.com article "Smart Clients - A Practitioner's Point of View," smart clients can be developed with other tools:
Many equate Smart Client to a set of Microsoft .NET technologies. While the .NET Framework (Windows Forms) and the .NET Compact Framework provide the ability to develop Smart Clients with ease, other technologies can provide smart client applications by utilizing the same architecture.
Smart clients have a number of key ingredients (see, for example, David Hill's WebLog - Smart Clients for Smart Users or the MSDN Smart Client Definition).
- Uses local resources: That's nothing more than saying that a smart client has a significant component that runs on the local machine. A smart client should be able to provide the same rich user experience as a traditional thick client application.
- Connected: A smart client application, like the client/server, distributed, and Internet applications, does not run solely on the client, but is involved with data interactions with other computers in the network. The primary difference between, let's say, a traditional client/server application and a smart client application is that the smart client will use nonproprietary, standardized protocols for that communication (e.g., Web services).
- Support for both standard and mobile devices: Depending on who is doing the defining, smart clients also include clients capable of running on a variety of devices and platforms, from desktop PCs to PDAs.
- Offline capable: Here's where smart clients break sharply from most of the previous paradigms. Smart clients are more similar to applications written for mobile devices in that they need to be able to continue operating while the client is offline.
- Intelligent deploy and update: Perhaps bearing the most similarity to thin client applications, smart client applications should not be an installation and upgrade nightmare.
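The "connected" ingredient above can be illustrated with a minimal sketch: a smart client talking to a server over a standardized protocol (here, a SOAP envelope of the kind Web services exchange) rather than a proprietary database wire protocol. The service namespace, operation, and canned response below are invented for illustration, not taken from any real service.

```python
# Sketch: a hypothetical smart client calling a "GetOrderStatus" Web
# service operation via SOAP over HTTP. Only the message construction
# and parsing are shown; the HTTP transport is omitted.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "urn:example:orders"  # hypothetical service namespace

def build_request(order_id: str) -> str:
    """Build a SOAP envelope for a hypothetical GetOrderStatus call."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    call = ET.SubElement(body, f"{{{SVC_NS}}}GetOrderStatus")
    ET.SubElement(call, f"{{{SVC_NS}}}OrderId").text = order_id
    return ET.tostring(envelope, encoding="unicode")

def parse_response(xml_text: str) -> str:
    """Pull the status value out of the service's SOAP response."""
    root = ET.fromstring(xml_text)
    return root.find(f".//{{{SVC_NS}}}Status").text

request = build_request("12345")
# A canned response, standing in for what the server would return:
canned = (
    f'<soap:Envelope xmlns:soap="{SOAP_NS}" xmlns:o="{SVC_NS}">'
    "<soap:Body><o:GetOrderStatusResponse>"
    "<o:Status>Shipped</o:Status>"
    "</o:GetOrderStatusResponse></soap:Body></soap:Envelope>"
)
print(parse_response(canned))  # Shipped
```

Because the messages are plain XML over a standard protocol, the same service could be consumed by a .NET Windows Forms client, a PowerBuilder client, or a PDA application alike.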
What about offline capability? Many of the issues associated with thin client applications revolve around the decision of whether to try to maintain state on the server and, if so, how to do so. Maintaining state on the server has a significant negative impact on the scalability of the application. However, if state is not maintained, a loss in connectivity can result in a loss of data. As DataFace notes in their white paper:
The obvious place to store state is on the client. It has plenty of disk space. There is no risk of interference with other users. Storage and retrieval of state can take place in parallel with other activities with invisibly small overhead.
Offline capability means that the user can work with the application locally, perhaps on a mobile or disconnected device, and only needs to go online long enough to finalize the transaction by transmitting the data. In an article in NetWork World, Anthony Norris - the director of commercial and e-commerce technology at FedEx Kinko's in Dallas - noted the importance of this capability for their "File, Print FedEx Kinko's" smart client application:
"That's a huge advantage," Norris says. "So in the case of File, Print FedEx Kinko's, you don't actually have to be connected to the Internet to use the software. You can start your print order, specify how you want it printed, and then queue it up for submission later when you're connected to the Internet. Business users traveling on an airplane won't have any trouble using our software, whereas with Citrix or a Web application, they can't do that. That's tremendously powerful - a huge value."
Sybase has been touting the benefits of "occasionally connected" applications for some time, and their Adaptive Server Anywhere and related replication technologies are well tailored to supporting such functionality.
That leaves "intelligent deploy and update." That's the one part of the "smart client" paradigm that is significantly different than previous technologies. Microsoft's .NET Framework implements this through "no-touch deployment." (A second generation version of this technology called One-Click will be part of the next major version of the .NET Framework.) David Hill describes this technology as follows: