#include <unix.h>

Public Member Functions

UnixSession (const char *pathname, int size=512, int pri=0, int stack=0)
    Create a Unix domain socket that will be connected to a local server and that will execute under its own thread.

UnixSession (UnixSocket &server, int size=512, int pri=0, int stack=0)
    Create a Unix domain socket from a bound Unix domain server by accepting a pending connection from that server, executing a thread for the accepted connection.

virtual ~UnixSession ()
    Virtual destructor.

Protected Member Functions

int waitConnection (timeout_t timeout=TIMEOUT_INF)
    Normally called from the thread's initial() method; waits for the socket connection to complete when connecting to a remote socket.

void initial (void)
    Establishes the connection when delayed completion is used.
The Unix domain session supports a non-blocking connection scheme that avoids blocking in the constructor by moving completion of the connection into the thread that executes for the session.
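For orientation, a minimal usage sketch under stated assumptions: it requires the GNU Common C++ library (the installed header is often <cc++/unix.h>), it assumes the session thread is launched with the usual Thread start() call, and the class name and socket path are invented for the example.

```cpp
#include <cc++/unix.h>   // install location may differ
#include <iostream>

// Hypothetical session that connects to a local server at construction
// time and runs its protocol in its own thread once the connect completes.
class ClientSession : public ost::UnixSession {
public:
    explicit ClientSession(const char *path) : ost::UnixSession(path) {}

private:
    void run(void) {
        // By the time run() executes, initial() has waited for the
        // delayed connect; setCompletion(true) restores blocking I/O.
        setCompletion(true);
        *this << "hello" << std::endl;   // stream I/O on the session
    }
};

int main() {
    ClientSession session("/tmp/demo.sock");  // example path only
    session.start();                          // launch the session thread
    // ... join or detach as appropriate for the application
    return 0;
}
```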
ost::UnixSession::UnixSession (const char *pathname, int size = 512, int pri = 0, int stack = 0)

Create a Unix domain socket that will be connected to a local server and that will execute under its own thread.

Parameters:
    pathname  path to the socket.
    size      size of the streaming buffer.
    pri       execution priority relative to parent.
    stack     stack allocation needed on some platforms.
ost::UnixSession::UnixSession (UnixSocket &server, int size = 512, int pri = 0, int stack = 0)

Create a Unix domain socket from a bound Unix domain server by accepting a pending connection from that server, executing a thread for the accepted connection.

Parameters:
    server  Unix domain socket to accept a connection from.
    size    size of the streaming buffer.
    pri     execution priority relative to parent.
    stack   stack allocation needed on some platforms.
virtual ost::UnixSession::~UnixSession () [virtual]
Virtual destructor.
int ost::UnixSession::waitConnection (timeout_t timeout = TIMEOUT_INF) [protected]
Normally called from the thread's initial() method; waits for the socket connection to complete when connecting to a remote socket.

One might wish to use setCompletion() to switch the socket back to blocking I/O calls after the connection completes. To implement the session, create a derived class that implements run().

Parameters:
    timeout  time to wait for completion, in milliseconds.
void ost::UnixSession::initial (void) [protected, virtual]
The initial method is used to establish a connection when delayed completion is used. This ensures the constructor returns without having to wait for a connection request to complete.
Reimplemented from ost::Thread.