Showing results for tags 'terminal server'.
Found 1 result

  1. Hi!

     Background: I am maintaining a Java application in which I create an HTTP server (using the class org.apache.http.protocol.HttpService). The HTTP server opens a socket on port 6060. An Ajax script running in a browser on the same machine sends an XMLHttpRequest to localhost port 6060; the request reaches the Java application and a response is sent back. This works fine as long as the solution runs on a client machine (a desktop computer): the Java application runs in ONE instance, since there is ONE machine with ONE user, and the application is designed to handle requests from one user only.

     Problem: When the solution runs on a terminal server (through Windows Remote Desktop Connection), each user starts a new instance of the Java application, so if five users are logged in to the terminal server, there are five running instances of the Java application. The first instance to start allocates port 6060 on localhost, but when the following instances try to allocate port 6060 they fail, since the port is already taken by the first instance. I expected this to work, because I thought each user on a terminal server executed in a session of their own, with its own memory and ports, but obviously that is not the case. I have heard, however, that it should be possible to allocate a "virtual" port (i.e. one unique to my session) instead of a physical machine port. Does anyone have a clue how to solve this? I am in urgent need of advice...

     Here are the parts of my Java code where I create the HTTP server:

     ServerSocket mSocket = new ServerSocket(6060);

     HttpParams mHttpParams = new SyncBasicHttpParams();
     mHttpParams
         .setIntParameter(CoreConnectionPNames.SO_TIMEOUT, DEFAULT_TIMOUT)
         .setIntParameter(CoreConnectionPNames.SOCKET_BUFFER_SIZE, DEFAULT_BUFFER_SIZE)
         .setBooleanParameter(CoreConnectionPNames.STALE_CONNECTION_CHECK, false)
         .setBooleanParameter(CoreConnectionPNames.TCP_NODELAY, true)
         .setParameter(CoreProtocolPNames.ORIGIN_SERVER, "HttpComponents/1.1")
         .setParameter(CoreProtocolPNames.HTTP_CONTENT_CHARSET, "UTF-8");

     // Set up the HTTP protocol processor
     HttpProcessor httpproc = new ImmutableHttpProcessor(new HttpResponseInterceptor[] {
         new ResponseDate(),
         new ResponseServer(),
         new ResponseContent(),
         new ResponseConnControl() });

     // Set up request handlers
     HttpRequestHandlerRegistry registry = new HttpRequestHandlerRegistry();
     registry.register("*", new GadgetRequestHandler(mGadgetIntegrator));

     // Set up the HTTP service
     HttpService mService = new HttpService(httpproc,
         new DefaultConnectionReuseStrategy(),
         new DefaultHttpResponseFactory(),
         registry,
         mHttpParams);

     Thanks!
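     A minimal sketch of one possible direction, assuming the only hard requirement is that each user's browser talks to that user's own instance: all terminal-server sessions share the machine's TCP port space, so instead of hard-coding 6060, each instance could probe a small port range (or ask the OS for an ephemeral port by binding to port 0) and then publish the port it actually got to its own Ajax page. The class name PortFinder and the range 6060-6079 below are illustrative assumptions, not part of the original code.

     import java.io.IOException;
     import java.net.ServerSocket;

     public final class PortFinder {

         // Illustrative range; adjust to whatever ports are known to be free for the application.
         private static final int FIRST_PORT = 6060;
         private static final int LAST_PORT  = 6079;

         // Returns a ServerSocket bound to the first free port in the range, so each
         // terminal-server session ends up with its own port. Falls back to port 0,
         // which asks the OS for any free ephemeral port.
         public static ServerSocket openFirstFreePort() throws IOException {
             for (int port = FIRST_PORT; port <= LAST_PORT; port++) {
                 try {
                     return new ServerSocket(port);   // succeeds only if the port is not already taken
                 } catch (IOException alreadyInUse) {
                     // Another instance (another user's session) owns this port; try the next one.
                 }
             }
             return new ServerSocket(0);
         }
     }

     The instance would then call mSocket.getLocalPort() to find out which port it was actually given and hand that number to its own browser session (for example via a file in the user's profile or a parameter in the page URL), since the Ajax script can no longer assume port 6060.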