[omniORB] Strange performance result
Zhao, Jason
jason.zhao at lmco.com
Fri May 19 17:01:37 BST 2006
Serguei,
Performance-wise, 4.0.7 and 4.1.0 are roughly the same; enclosed are the
result numbers, with three runs for each.
Jason
=============================
4.0.7
[root@TG-P echo]# ./OmniClient10
Testing with 128 bytes per message
Sender 12800000 bytes received at 89.8068 Mbits, usecs 1140226
Testing with 256 bytes per message
Sender 25600000 bytes received at 188.47 Mbits, usecs 1086644
Testing with 512 bytes per message
Sender 51200000 bytes received at 250.309 Mbits, usecs 1636379
Testing with 1024 bytes per message
Sender 102400000 bytes received at 431.162 Mbits, usecs 1899983
Testing with 2048 bytes per message
Sender 204800000 bytes received at 588.97 Mbits, usecs 2781804
Testing with 4096 bytes per message
Sender 409600000 bytes received at 675.266 Mbits, usecs 4852608
[root@TG-P echo]# ./OmniClient10
Testing with 128 bytes per message
Sender 12800000 bytes received at 85.7584 Mbits, usecs 1194052
Testing with 256 bytes per message
Sender 25600000 bytes received at 176.609 Mbits, usecs 1159627
Testing with 512 bytes per message
Sender 51200000 bytes received at 276.035 Mbits, usecs 1483872
Testing with 1024 bytes per message
Sender 102400000 bytes received at 442.753 Mbits, usecs 1850240
Testing with 2048 bytes per message
Sender 204800000 bytes received at 594.114 Mbits, usecs 2757722
Testing with 4096 bytes per message
Sender 409600000 bytes received at 672.995 Mbits, usecs 4868983
[root@TG-P echo]# ./OmniClient10
Testing with 128 bytes per message
Sender 12800000 bytes received at 84.7789 Mbits, usecs 1207848
Testing with 256 bytes per message
Sender 25600000 bytes received at 179.189 Mbits, usecs 1142927
Testing with 512 bytes per message
Sender 51200000 bytes received at 289.399 Mbits, usecs 1415349
Testing with 1024 bytes per message
Sender 102400000 bytes received at 429.372 Mbits, usecs 1907904
Testing with 2048 bytes per message
Sender 204800000 bytes received at 553.895 Mbits, usecs 2957959
Testing with 4096 bytes per message
Sender 409600000 bytes received at 674.378 Mbits, usecs 4858995
CPU usage, sampled with top (pay attention to the 9th column, %CPU):
  PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND
925 root 16 0 12280 2152 1936 R 9.0 0.4 0:00.09 OmniClient10
925 root 15 0 12280 2188 1972 S 34.7 0.4 0:00.44 OmniClient10
925 root 15 0 12280 2188 1972 S 40.5 0.4 0:00.85 OmniClient10
925 root 15 0 12280 2188 1972 S 36.6 0.4 0:01.22 OmniClient10
925 root 16 0 12280 2188 1972 R 35.6 0.4 0:01.58 OmniClient10
925 root 15 0 12280 2188 1972 S 37.8 0.4 0:01.96 OmniClient10
925 root 16 0 12280 2192 1972 R 49.3 0.4 0:02.46 OmniClient10
925 root 16 0 12280 2192 1972 R 59.7 0.4 0:03.06 OmniClient10
925 root 16 0 12280 2192 1972 R 50.8 0.4 0:03.57 OmniClient10
925 root 16 0 12280 2200 1972 R 48.7 0.4 0:04.06 OmniClient10
925 root 16 0 12280 2200 1972 R 49.7 0.4 0:04.56 OmniClient10
925 root 16 0 12280 2200 1972 R 41.8 0.4 0:04.98 OmniClient10
925 root 16 0 12280 2200 1972 R 41.2 0.4 0:05.40 OmniClient10
942 root 16 0 12280 2152 1936 R 5.0 0.4 0:00.05 OmniClient10
942 root 15 0 12280 2152 1936 S 32.8 0.4 0:00.38 OmniClient10
942 root 15 0 12280 2188 1972 S 34.7 0.4 0:00.73 OmniClient10
942 root 15 0 12280 2188 1972 S 34.7 0.4 0:01.08 OmniClient10
942 root 15 0 12280 2188 1972 S 32.6 0.4 0:01.41 OmniClient10
942 root 15 0 12280 2188 1972 S 24.8 0.4 0:01.66 OmniClient10
942 root 15 0 12280 2192 1972 S 26.7 0.4 0:01.93 OmniClient10
942 root 16 0 12280 2192 1972 R 44.7 0.4 0:02.38 OmniClient10
942 root 16 0 12280 2192 1972 R 39.2 0.4 0:02.78 OmniClient10
942 root 16 0 12280 2200 1972 R 39.8 0.4 0:03.18 OmniClient10
942 root 16 0 12280 2200 1972 R 47.6 0.4 0:03.66 OmniClient10
942 root 16 0 12280 2200 1972 R 50.3 0.4 0:04.17 OmniClient10
942 root 16 0 12280 2200 1972 R 56.8 0.4 0:04.74 OmniClient10
942 root 16 0 12280 2200 1972 R 51.1 0.4 0:05.26 OmniClient10
10652 root 19 0 28040 1784 1580 S 28.9 0.2 0:00.29 OmniServer
10652 root 19 0 28040 1840 1636 S 92.3 0.2 0:01.23 OmniServer
10652 root 19 0 28040 1840 1636 S 97.7 0.2 0:02.21 OmniServer
10652 root 19 0 28040 1840 1636 S 81.4 0.2 0:03.03 OmniServer
10652 root 19 0 28040 1840 1636 S 97.6 0.2 0:04.02 OmniServer
10652 root 19 0 28040 1840 1636 S 98.6 0.2 0:05.01 OmniServer
10652 root 19 0 28040 1840 1636 S 98.6 0.2 0:06.00 OmniServer
10652 root 19 0 28040 1840 1636 S 97.6 0.2 0:06.98 OmniServer
10652 root 19 0 28040 1840 1636 S 96.7 0.2 0:07.95 OmniServer
10652 root 19 0 28040 1840 1636 S 47.8 0.2 0:08.43 OmniServer
10652 root 19 0 28040 1844 1636 S 42.8 0.2 0:08.86 OmniServer
10652 root 19 0 28040 1844 1636 S 39.7 0.2 0:09.26 OmniServer
10652 root 19 0 28040 1844 1636 S 41.8 0.2 0:09.68 OmniServer
10652 root 19 0 28040 1864 1656 S 34.8 0.2 0:10.03 OmniServer
10652 root 19 0 28040 1884 1676 S 18.9 0.2 0:10.22 OmniServer
10652 root 19 0 28040 1884 1676 S 79.5 0.2 0:11.03 OmniServer
10652 root 19 0 28040 1884 1676 S 97.6 0.2 0:12.01 OmniServer
10652 root 19 0 28040 1884 1676 S 98.6 0.2 0:13.00 OmniServer
10652 root 19 0 28040 1884 1676 S 98.6 0.2 0:13.99 OmniServer
10652 root 19 0 28040 1884 1676 S 98.5 0.2 0:14.98 OmniServer
10652 root 19 0 28040 1884 1676 S 98.6 0.2 0:15.97 OmniServer
10652 root 19 0 28040 1884 1676 S 97.6 0.2 0:16.95 OmniServer
10652 root 19 0 28040 1884 1676 S 98.6 0.2 0:17.94 OmniServer
10652 root 19 0 28040 1884 1676 S 56.8 0.2 0:18.51 OmniServer
10652 root 19 0 28040 1884 1676 S 44.8 0.2 0:18.96 OmniServer
10652 root 19 0 28040 1884 1676 S 49.6 0.2 0:19.46 OmniServer
10652 root 19 0 28040 1884 1676 S 44.4 0.2 0:19.91 OmniServer
10652 root 19 0 28040 1884 1676 S 45.8 0.2 0:20.37 OmniServer
10652 root 19 0 28040 1884 1676 S 1.0 0.2 0:20.38 OmniServer
10652 root 19 0 28040 1884 1676 S 8.9 0.2 0:20.47 OmniServer
10652 root 19 0 28040 1884 1676 S 85.9 0.2 0:21.35 OmniServer
10652 root 19 0 28040 1884 1676 S 93.1 0.2 0:22.29 OmniServer
10652 root 19 0 28040 1884 1676 S 98.6 0.2 0:23.28 OmniServer
10652 root 19 0 28040 1884 1676 S 98.7 0.2 0:24.27 OmniServer
10652 root 19 0 28040 1884 1676 S 98.6 0.2 0:25.26 OmniServer
10652 root 19 0 28040 1884 1676 S 78.6 0.2 0:26.06 OmniServer
10652 root 19 0 28040 1884 1676 S 98.6 0.2 0:27.05 OmniServer
10652 root 19 0 28040 1884 1676 S 97.6 0.2 0:28.03 OmniServer
10652 root 19 0 28040 1884 1676 S 71.7 0.2 0:28.75 OmniServer
10652 root 19 0 28040 1884 1676 S 41.8 0.2 0:29.17 OmniServer
10652 root 19 0 28040 1884 1676 S 47.7 0.2 0:29.65 OmniServer
10652 root 19 0 28040 1884 1676 S 38.8 0.2 0:30.04 OmniServer
10652 root 19 0 28040 1884 1676 S 47.8 0.2 0:30.52 OmniServer
10652 root 19 0 28040 1884 1676 S 11.0 0.2 0:30.63 OmniServer
=============================
4.1.0
[root@TG-P echo]# ./OmniClient10
Testing with 128 bytes per message
Sender 12800000 bytes received at 83.8856 Mbits, usecs 1220710
Testing with 256 bytes per message
Sender 25600000 bytes received at 129.019 Mbits, usecs 1587368
Testing with 512 bytes per message
Sender 51200000 bytes received at 284.193 Mbits, usecs 1441272
Testing with 1024 bytes per message
Sender 102400000 bytes received at 450.92 Mbits, usecs 1816732
Testing with 2048 bytes per message
Sender 204800000 bytes received at 557.493 Mbits, usecs 2938873
Testing with 4096 bytes per message
Sender 409600000 bytes received at 672.151 Mbits, usecs 4875092
[root@TG-P echo]# ./OmniClient10
Testing with 128 bytes per message
Sender 12800000 bytes received at 106.873 Mbits, usecs 958143
Testing with 256 bytes per message
Sender 25600000 bytes received at 177.25 Mbits, usecs 1155430
Testing with 512 bytes per message
Sender 51200000 bytes received at 254.713 Mbits, usecs 1608084
Testing with 1024 bytes per message
Sender 102400000 bytes received at 437.956 Mbits, usecs 1870506
Testing with 2048 bytes per message
Sender 204800000 bytes received at 599.461 Mbits, usecs 2733121
Testing with 4096 bytes per message
Sender 409600000 bytes received at 666.359 Mbits, usecs 4917467
[root@TG-P echo]# ./OmniClient10
Testing with 128 bytes per message
Sender 12800000 bytes received at 86.9961 Mbits, usecs 1177064
Testing with 256 bytes per message
Sender 25600000 bytes received at 191.412 Mbits, usecs 1069943
Testing with 512 bytes per message
Sender 51200000 bytes received at 282.982 Mbits, usecs 1447442
Testing with 1024 bytes per message
Sender 102400000 bytes received at 425.267 Mbits, usecs 1926319
Testing with 2048 bytes per message
Sender 204800000 bytes received at 601.151 Mbits, usecs 2725437
Testing with 4096 bytes per message
Sender 409600000 bytes received at 671.903 Mbits, usecs 4876895
CPU (same top columns as above; the 9th column is %CPU)
26274 root 15 0 12368 2216 2000 S 20.9 0.4 0:00.21 OmniClient10
26274 root 15 0 12368 2252 2036 S 21.8 0.4 0:00.43 OmniClient10
26274 root 15 0 12368 2252 2036 S 25.8 0.4 0:00.69 OmniClient10
26274 root 15 0 12368 2252 2036 S 31.7 0.4 0:01.01 OmniClient10
26274 root 16 0 12368 2256 2036 R 27.7 0.4 0:01.29 OmniClient10
26274 root 16 0 12368 2256 2036 R 24.9 0.4 0:01.54 OmniClient10
26274 root 16 0 12368 2260 2036 R 29.7 0.4 0:01.84 OmniClient10
26274 root 16 0 12368 2260 2036 R 41.3 0.4 0:02.26 OmniClient10
26274 root 16 0 12368 2260 2036 R 42.6 0.4 0:02.69 OmniClient10
26274 root 16 0 12368 2264 2036 R 43.7 0.4 0:03.13 OmniClient10
26274 root 16 0 12368 2264 2036 R 42.8 0.4 0:03.56 OmniClient10
26274 root 15 0 12368 2264 2036 S 47.6 0.4 0:04.04 OmniClient10
26274 root 16 0 12368 2264 2036 R 40.6 0.4 0:04.46 OmniClient10
26274 root 16 0 12368 2264 2036 R 42.8 0.4 0:04.89 OmniClient10
26276 root 15 0 12368 2216 2000 S 24.8 0.4 0:00.25 OmniClient10
26276 root 15 0 12368 2252 2036 S 35.7 0.4 0:00.61 OmniClient10
26276 root 15 0 12368 2252 2036 S 30.7 0.4 0:00.92 OmniClient10
26276 root 15 0 12368 2252 2036 S 26.7 0.4 0:01.19 OmniClient10
26276 root 16 0 12368 2256 2036 R 27.7 0.4 0:01.47 OmniClient10
26276 root 16 0 12368 2260 2036 R 35.6 0.4 0:01.83 OmniClient10
26276 root 16 0 12368 2260 2036 R 39.4 0.4 0:02.23 OmniClient10
26276 root 16 0 12368 2260 2036 R 45.7 0.4 0:02.69 OmniClient10
26276 root 16 0 12368 2264 2036 R 46.6 0.4 0:03.16 OmniClient10
26276 root 16 0 12368 2264 2036 R 47.7 0.4 0:03.64 OmniClient10
26276 root 16 0 12368 2264 2036 R 44.5 0.4 0:04.09 OmniClient10
26276 root 16 0 12368 2264 2036 R 39.4 0.4 0:04.49 OmniClient10
26276 root 16 0 12368 2264 2036 R 43.8 0.4 0:04.93 OmniClient10
26293 root 15 0 12372 2216 2000 S 16.9 0.4 0:00.17 OmniClient10
26293 root 15 0 12372 2252 2036 S 37.6 0.4 0:00.55 OmniClient10
26293 root 16 0 12372 2252 2036 R 37.6 0.4 0:00.93 OmniClient10
26293 root 15 0 12372 2252 2036 S 33.8 0.4 0:01.27 OmniClient10
26293 root 16 0 12372 2256 2036 R 28.6 0.4 0:01.56 OmniClient10
26293 root 16 0 12372 2260 2036 R 30.7 0.4 0:01.87 OmniClient10
26293 root 15 0 12372 2260 2036 S 45.5 0.4 0:02.33 OmniClient10
26293 root 16 0 12372 2260 2036 R 48.3 0.4 0:02.82 OmniClient10
26293 root 16 0 12372 2264 2036 R 49.6 0.4 0:03.32 OmniClient10
26293 root 15 0 12372 2264 2036 S 50.7 0.4 0:03.83 OmniClient10
26293 root 16 0 12372 2264 2036 R 52.8 0.4 0:04.37 OmniClient10
26293 root 16 0 12372 2264 2036 R 46.7 0.4 0:04.84 OmniClient10
26293 root 16 0 12372 2264 2036 R 39.8 0.4 0:05.24 OmniClient10
28377 root 19 0 28076 1932 1720 S 6.0 0.2 0:30.27 OmniServer10
28377 root 19 0 28076 1932 1720 S 89.6 0.2 0:31.17 OmniServer10
28377 root 19 0 28076 1932 1720 S 61.7 0.2 0:31.79 OmniServer10
28377 root 19 0 28076 1932 1720 S 87.0 0.2 0:32.68 OmniServer10
28377 root 19 0 28076 1932 1720 S 99.6 0.2 0:33.68 OmniServer10
28377 root 19 0 28076 1932 1720 S 97.6 0.2 0:34.66 OmniServer10
28377 root 19 0 28076 1932 1720 S 97.6 0.2 0:35.64 OmniServer10
28377 root 19 0 28076 1932 1720 S 81.5 0.2 0:36.46 OmniServer10
28377 root 19 0 28076 1932 1720 S 94.6 0.2 0:37.42 OmniServer10
28377 root 19 0 28076 1932 1720 S 89.6 0.2 0:38.32 OmniServer10
28377 root 19 0 28076 1932 1720 S 44.7 0.2 0:38.77 OmniServer10
28377 root 19 0 28076 1936 1720 S 45.7 0.2 0:39.23 OmniServer10
28377 root 19 0 28076 1936 1720 S 44.8 0.2 0:39.68 OmniServer10
28377 root 19 0 28076 1936 1720 S 45.7 0.2 0:40.14 OmniServer10
28377 root 19 0 28076 1936 1720 S 34.9 0.2 0:40.49 OmniServer10
28377 root 19 0 28076 1940 1720 S 1.0 0.2 0:40.50 OmniServer10
28377 root 19 0 28076 1940 1720 S 98.0 0.2 0:41.49 OmniServer10
28377 root 19 0 28076 1940 1720 S 98.6 0.2 0:42.48 OmniServer10
28377 root 19 0 28076 1940 1720 S 88.6 0.2 0:43.37 OmniServer10
28377 root 19 0 28076 1940 1720 S 88.9 0.2 0:44.28 OmniServer10
28377 root 19 0 28076 1940 1720 S 98.6 0.2 0:45.27 OmniServer10
28377 root 19 0 28076 1940 1720 S 97.6 0.2 0:46.25 OmniServer10
28377 root 19 0 28076 1940 1720 S 97.6 0.2 0:47.23 OmniServer10
28377 root 19 0 28076 1940 1720 S 97.6 0.2 0:48.21 OmniServer10
28377 root 19 0 28076 1944 1720 S 54.8 0.2 0:48.76 OmniServer10
28377 root 19 0 28076 1944 1720 S 42.7 0.2 0:49.19 OmniServer10
28377 root 19 0 28076 1944 1720 S 45.8 0.2 0:49.65 OmniServer10
28377 root 19 0 28076 1944 1720 S 43.8 0.2 0:50.09 OmniServer10
28377 root 19 0 28076 1944 1720 S 45.7 0.2 0:50.55 OmniServer10
28377 root 19 0 28076 1944 1720 S 4.0 0.2 0:50.59 OmniServer10
28377 root 19 0 28076 1944 1720 S 1.0 0.2 0:50.60 OmniServer10
28377 root 19 0 28076 1944 1720 S 79.2 0.2 0:51.40 OmniServer10
28377 root 19 0 28076 1944 1720 S 98.6 0.2 0:52.40 OmniServer10
28377 root 19 0 28076 1944 1720 S 97.6 0.2 0:53.38 OmniServer10
28377 root 19 0 28076 1944 1720 S 97.6 0.2 0:54.36 OmniServer10
28377 root 19 0 28076 1944 1720 S 97.6 0.2 0:55.34 OmniServer10
28377 root 19 0 28076 1944 1720 S 97.5 0.2 0:56.32 OmniServer10
28377 root 19 0 28076 1944 1720 S 98.6 0.2 0:57.31 OmniServer10
28377 root 19 0 28076 1944 1720 S 97.6 0.2 0:58.29 OmniServer10
28377 root 19 0 28076 1944 1720 S 57.7 0.2 0:58.87 OmniServer10
28377 root 19 0 28076 1944 1720 S 42.7 0.2 0:59.30 OmniServer10
28377 root 19 0 28076 1944 1720 S 43.8 0.2 0:59.74 OmniServer10
28377 root 19 0 28076 1944 1720 S 42.8 0.2 1:00.17 OmniServer10
28377 root 19 0 28076 1944 1720 S 41.7 0.2 1:00.59 OmniServer10
28377 root 19 0 28076 1944 1720 S 6.9 0.2 1:00.66 OmniServer10
-----Original Message-----
From: Serguei Kolos [mailto:Serguei.Kolos at cern.ch]
Sent: Friday, May 19, 2006 11:36 AM
To: Zhao, Jason
Cc: omniorb-list at omniorb-support.com
Subject: Re: [omniORB] Strange performance result
Hello
For our project we are still using the latest stable omniORB
version (4.0.7). We plan to move to the 4.1.x branch as soon as it
becomes the mainstream version. We have very strong performance
requirements, and I wonder whether you noticed any difference in
performance between the 4.0.7 and 4.1.0 versions, at least in the tests
that ran stably.
Cheers,
Serguei
Zhao, Jason wrote:
>Please ignore my previous message. We installed the latest release, 4.1.0
>beta 2, on Linux 2.6.12 (no SMP) and ran several tests. This time the
>results are consistent.
>
>Jason
>
>-----Original Message-----
>From: omniorb-list-bounces at omniorb-support.com
>[mailto:omniorb-list-bounces at omniorb-support.com] On Behalf Of Zhao,
>Jason
>Sent: Thursday, May 18, 2006 11:25 AM
>To: omniorb-list at omniorb-support.com
>Subject: [omniORB] Strange performance result
>
>Hi,
>
>I'm sorry if this problem looks complicated.
>
>I used a simple IDL to test omniORB's performance on oneway calls used to
>pass data between two machines. When the bandwidth between the two
>machines is less than 150 Mbps, the performance results are consistent:
>different runs produce similar numbers, and within the same run,
>larger message sizes result in higher application-level throughput. But
>when the bandwidth between the two machines is set higher than
>150 Mbps (using Linux traffic control; the physical interface is gigabit
>Ethernet), the results are unpredictable: they are not consistent across
>multiple runs, and even within the same run, larger message sizes
>sometimes result in lower application-level throughput.
>
>I tested ACE TAO using the same IDL in the same environment, and the
>results are always consistent, both across runs and within the same run.
>Does anyone know what might cause omniORB to behave unpredictably when
>the bandwidth is higher than 150 Mbps (CPU utilization was not very high
>in those cases)?
>
>Thank you.
>
>Jason
>
>
>Configuration:
>
>OMNIORB_4_1_0_BETA_1 snapshot obtained from
>http://omniorb.sourceforge.net/snapshots/omniORB-4.1-latest.tar.gz on
>05/15
>IPv6
>Linux version 2.4.21-4.ELsmp (bhcompile at daffy.perf.redhat.com) (gcc
>version 3.2.3 20030502 (Red Hat Linux 3.2.3-20)) #1 SMP Fri Oct 3
>17:52:56 EDT 2003
>CPU: Intel Xeon 2.66 GHz, 533 MHz FSB, 512K cache
>MEMORY: 2x 512 MB DDR266 PC2100 ECC Reg memory
>
>===============
>Here is the IDL used (from ACE TAO's performance test suite). I tested
>message sizes of 128, 256, 512, 1024, 2048, and 4096 bytes, sent 10000
>messages for each size, and calculated the application-level throughput
>per message size (a sketch of the client-side loop is included after the
>IDL below).
>
>module Test
>{
> /// The data payload
> typedef sequence<octet> Payload;
> struct Message {
> unsigned long message_id;
> Payload the_payload;
> };
>
>  /// Implement a simple interface to receive a lot of data
>  interface Receiver {
> /// Receive a big payload
> oneway void receive_data (in Message the_message);
>
> /// All the data has been sent, print out performance data
> void done ();
> };
>
>  /// Implement a factory to create Receivers
>  interface Receiver_Factory {
> /// Create a new receiver
> Receiver create_receiver ();
>
> /// Shutdown the application
> oneway void shutdown ();
> };
>};
>
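For reference, here is a rough sketch of a client loop that would produce the
"Sender ... bytes received at ... Mbits, usecs ..." lines shown above. It is
only an illustration, not the actual test code: it assumes the C++ stubs that
omniidl generates from the Test module, and the header name, run_one_size and
the receiver reference are made up for the example.

#include <sys/time.h>
#include <iostream>
#include "Test.hh"   // header generated by omniidl from the Test IDL (name assumed)

// Send `iterations` oneway messages of `size` octets each and report the
// application-level throughput, in the same format as the output above.
void run_one_size(Test::Receiver_ptr receiver,
                  CORBA::ULong size, CORBA::ULong iterations)
{
    Test::Message msg;
    msg.the_payload.length(size);            // payload of `size` octets

    timeval start, end;
    gettimeofday(&start, 0);

    for (CORBA::ULong i = 0; i < iterations; ++i) {
        msg.message_id = i;
        receiver->receive_data(msg);         // oneway call: no reply expected
    }
    // done() is a normal twoway call; with the single-connection,
    // single-threaded server configuration shown further down it should
    // only return after the preceding oneway messages have been processed.
    receiver->done();

    gettimeofday(&end, 0);
    double usecs = (end.tv_sec  - start.tv_sec) * 1e6
                 + (end.tv_usec - start.tv_usec);

    unsigned long long total_bytes =
        (unsigned long long)iterations * size;
    double mbits = total_bytes * 8.0 / usecs;  // bits per microsecond == Mbit/s

    std::cout << "Testing with " << size << " bytes per message\n"
              << "Sender " << total_bytes << " bytes received at "
              << mbits << " Mbits, usecs " << (long long)usecs << std::endl;
}

The byte totals in the runs above are simply the number of messages times the
message size, and the Mbit figures are the total bits divided by the elapsed
microseconds.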
>================
>Here is the omniORB configuration file used (comments removed to keep the
>message size down). Basically, I make the server side single-threaded so
>that messages are processed in the order they were sent.
>
>traceLevel = 0
>traceExceptions = 0
>traceInvocations = 0
>traceInvocationReturns = 0
>traceThreadId = 0
>traceTime = 0
>dumpConfiguration = 0
>maxGIOPVersion = 1.2
>giopMaxMsgSize = 2097152 # 2 MBytes.
>strictIIOP = 0
>tcAliasExpand = 0
>useTypeCodeIndirections = 1
>acceptMisalignedTcIndirections = 0
>scanGranularity = 5
>nativeCharCodeSet = ISO-8859-1
>nativeWCharCodeSet = UTF-16
>abortOnInternalError = 0
>abortOnNativeException = 0
>InitRef = NameService=corbaname::[2001:411:2:6:2::]
>DefaultInitRef = corbaloc::[2001:411:2:6:2::]
>clientCallTimeOutPeriod = 0
>clientConnectTimeOutPeriod = 0
>supportPerThreadTimeOut = 0
>outConScanPeriod = 120
>maxGIOPConnectionPerServer = 1
>oneCallPerConnection = 1
>offerBiDirectionalGIOP = 0
>diiThrowsSysExceptions = 0
>verifyObjectExistsAndType = 0
>giopTargetAddressMode = 0
>bootstrapAgentPort = 900
>endPoint = giop:tcp:[2001:411:2:6:2::]:20000
>serverCallTimeOutPeriod = 0
>inConScanPeriod = 180
>threadPerConnectionPolicy = 1
>maxServerThreadPerConnection = 1
>maxServerThreadPoolSize = 100
>threadPerConnectionUpperLimit = 10000
>threadPerConnectionLowerLimit = 9000
>threadPoolWatchConnection = 1
>connectionWatchPeriod = 50000
>connectionWatchImmediate = 0
>acceptBiDirectionalGIOP = 0
>unixTransportDirectory = /tmp/omni-%u
>unixTransportPermission = 0777
>supportCurrent = 0
>copyValuesInLocalCalls = 0
>objectTableSize = 100
>poaHoldRequestTimeout = 0
>poaUniquePersistentSystemIds = 1
>supportBootstrapAgent = 0
>
>_______________________________________________
>omniORB-list mailing list
>omniORB-list at omniorb-support.com
>http://www.omniorb-support.com/mailman/listinfo/omniorb-list
>