[strongSwan] A problem with the size of the filelog

xiaomei xie almacomeon at gmail.com
Sun Apr 7 09:53:27 CEST 2013


Hi all,
I have followed the steps described in the wiki to set up a filelog, but the log file grows so large that the system runs out of memory. For example, when I used a test instrument to establish IPsec tunnels at a rate of 100 per second, the filelog quickly reached 1.5 GB.

The following is my strongswan.conf:
charon
{
    load = openssl random x509 pubkey hmac xcbc stroke kernel-pfkey kernel-netlink eap-radius socket-default dhcp ha
    threads = 1000
    ikesa_table_size = 2048
    ikesa_table_segments = 128
    half_open_timeout = 20
    init_limit_half_open = 1000
    cookie_threshold = 10000
    plugins
    {
        dhcp
        {
            server = 10.2.6.255
        }
    }
    filelog
    {
        /tmp/log/charon.log
        {
            time_format = %b %e %T
            append = no
            default = 1
            flush_line = yes
        }
        stderr
        {
            default = -1
            # prepend connection name, simplifies grepping
            ike_name = yes
        }
    }
    syslog
    {
        daemon
        {
            default = -1
        }
        auth
        {
            default = -1
        }
    }
}
How can I limit the size of the filelog?
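So far the only workaround I can think of is rotating the file with an external tool such as logrotate. The snippet below is only a rough sketch (the 100M size limit and the rotation count are arbitrary values I picked for illustration), and I am not sure whether this is the recommended way to do it:

/tmp/log/charon.log {
    size 100M        # rotate as soon as the log reaches 100 MB
    rotate 5         # keep at most five rotated copies
    compress         # gzip rotated copies to save space
    missingok
    notifempty
    copytruncate     # truncate the file in place so charon can keep writing to it
}

If there is a built-in option in the filelog section to cap the file size, that would of course be preferable.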
Thank you for your help.
Yours,
alma