- Host: Linux snabb2 4.4.31 x86_64 Intel(R) Xeon(R) CPU E5-2620 v3 @ 2.40GHz
- Image: lwaftr/snabb-test
- Pull Request: #1172
- Target Head: 3ac14826c639f54ecf0b8c0fb0e033a0c2ed00f9
- Pull Request Head: e635b96cf8e22e8e3d48048f99351fbd8022aecc
- SNABB_PCI0=
- SNABB_PCI1=
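Both `SNABB_PCI0` and `SNABB_PCI1` are empty in this run, which is why the hardware-dependent NIC tests below report SKIPPED. A minimal sketch of setting them before a run; the PCI addresses here are hypothetical placeholders, substitute the NICs on your host:

```shell
# Hypothetical PCI addresses -- replace with real devices on the test host.
# When these are unset (as in this run), hardware-dependent tests are skipped.
export SNABB_PCI0=0000:01:00.0
export SNABB_PCI1=0000:01:00.1
```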
- Checking test suite:
- TEST core.app
- TEST core.histogram
- TEST core.worker
- TEST core.link
- TEST core.timer
- TEST core.memory
- TEST core.lib
- TEST core.packet
- TEST core.shm
- TEST core.main
- TEST core.counter
- TEST apps.rss.rss
- TEST apps.bridge.mac_table
- TEST apps.intel.intel_app
- SKIPPED testlog/apps.intel.intel_app
- TEST apps.ipv6.fragment
- TEST apps.ipv6.reassemble
- TEST apps.ipv6.nd_light
- TEST apps.vhost.vhost_user
- SKIPPED testlog/apps.vhost.vhost_user
- TEST apps.packet_filter.pcap_filter
- TEST apps.keyed_ipv6_tunnel.tunnel
- TEST apps.ipfix.ipfix
- TEST apps.ipfix.template
- TEST apps.test.synth
- TEST apps.test.match
- TEST apps.virtio_net.virtio_net
- SKIPPED testlog/apps.virtio_net.virtio_net
- TEST apps.rate_limiter.rate_limiter
- TEST apps.lwaftr.lwdebug
- TEST apps.lwaftr.nh_fwd
- TEST apps.lwaftr.V4V6
- TEST apps.lwaftr.ctable_wrapper
- TEST apps.lwaftr.ndp
- TEST apps.lwaftr.binding_table
- TEST apps.lwaftr.rangemap
- TEST apps.pcap.tap
- TEST apps.tap.tap
- SKIPPED testlog/apps.tap.tap
- TEST apps.example.asm
- TEST apps.vlan.vlan
- TEST apps.socket.unix
- TEST apps.socket.raw
- TEST apps.wall.scanner
- TEST apps.wall.l7fw
- TEST apps.ipv4.fragment
- TEST apps.ipv4.reassemble
- TEST apps.ipv4.arp
- TEST program.l2vpn.control_channel
- TEST program.l2vpn.pseudowire
- TEST program.lwaftr.quickcheck.utils
- TEST program.lwaftr.tests.propbased.genyang
- TEST program.snabbnfv.nfvconfig
- TEST program.snabbnfv.neutron2snabb.neutron2snabb_schema
- TEST program.snabbnfv.neutron2snabb.neutron2snabb
- TEST lib.binary_search
- TEST lib.hash.murmur
- TEST lib.hash.siphash
- TEST lib.rrd
- TEST lib.checksum
- TEST lib.stream
- TEST lib.traceprof.traceprof
- TEST lib.xsd_regexp
- TEST lib.maxpc
- TEST lib.lpm.lpm
- TEST lib.lpm.lpm4_248
- TEST lib.lpm.lpm4
- TEST lib.lpm.lpm4_poptrie
- TEST lib.lpm.lpm4_dxr
- TEST lib.lpm.ip6
- TEST lib.lpm.lpm4_trie
- TEST lib.lpm.ip4
- TEST lib.ctable
- ERROR testlog/lib.ctable
- ERROR during tests:
- src/testlog/apps.bridge.mac_table:
- Sep 20 2018 15:55:13 mac_table: resizing from 512 to 2048 hash buckets, new target size 1024 (926 MAC entries, old target size 256, size/bucket overflow: true/true)
- src/testlog/apps.example.asm:
- selftest: asm
- magic number: 0xdeadbeef
- selftest: ok
- src/testlog/apps.intel.intel_app:
- selftest: intel_app
- SNABB_PCI_INTEL[0|1] not set or not suitable.
- EXITCODE: 43
- src/testlog/apps.ipfix.ipfix:
- selftest: apps.ipfix.ipfix
- selftest ok
- src/testlog/apps.ipfix.template:
- selftest: apps.ipfix.template
- selftest ok
- src/testlog/apps.ipv4.arp:
- selftest: arp
- ARP: Resolving '5.6.7.8'
- ARP: '5.6.7.8' resolved (11:22:33:44:55:66)
- selftest ok
- src/testlog/apps.ipv4.fragment:
- selftest: apps.ipv4.fragment
- selftest: ok
- src/testlog/apps.ipv4.reassemble:
- selftest: apps.ipv4.reassemble
- selftest: ok
- src/testlog/apps.ipv6.fragment:
- selftest: apps.ipv6.fragment
- selftest: ok
- src/testlog/apps.ipv6.nd_light:
- Sep 20 2018 15:55:15 nd_light: Sending neighbor solicitation for next-hop 2001:db8::2
- Sep 20 2018 15:55:15 nd_light: Sending neighbor solicitation for next-hop 2001:db8::1
- Sep 20 2018 15:55:15 nd_light: Resolved next-hop 2001:db8::2 to 00:00:00:00:00:02
- Sep 20 2018 15:55:15 nd_light: Resolved next-hop 2001:db8::1 to 00:00:00:00:00:01
- Sep 20 2018 15:55:16 nd_light: Sending neighbor solicitation for next-hop 2001:db8::2
- Sep 20 2018 15:55:16 nd_light: Sending neighbor solicitation for next-hop 2001:db8::1
- src/testlog/apps.ipv6.reassemble:
- selftest: apps.ipv6.reassemble
- selftest: ok
- src/testlog/apps.keyed_ipv6_tunnel.tunnel:
- Keyed IPv6 tunnel selftest
- link report:
- 102 sent on comparator.output -> match.comparator (loss rate: 0%)
- 102 sent on source.output -> tunnel.decapsulated (loss rate: 0%)
- 102 sent on tunnel.decapsulated -> match.rx (loss rate: 0%)
- 102 sent on tunnel.encapsulated -> tunnel.encapsulated (loss rate: 0%)
- apps report:
- match
- run simple one second benchmark ...
- selftest passed
- src/testlog/apps.lwaftr.binding_table:
- selftest: binding_table
- ok
- src/testlog/apps.lwaftr.ctable_wrapper:
- selftest: apps.lwaftr.ctable_wrapper
- selftest: ok
- src/testlog/apps.lwaftr.lwdebug:
- src/testlog/apps.lwaftr.ndp:
- selftest: ndp
- NDP: Resolving '2001:db8::2'
- NDP: Resolving '2001:db8::1'
- NDP: '2001:db8::2' resolved (de:df:dd:a5:63:03)
- NDP: '2001:db8::1' resolved (26:59:60:9e:15:be)
- selftest: ok
- src/testlog/apps.lwaftr.nh_fwd:
- nh_fwd: selftest
- nh_fwd4: cache_refresh_interval set to 0 seconds
- nh_fwd4: cache_refresh_interval set to 0 seconds
- load: time: 0.10s fps: 19 fpGbps: 0.000 fpb: 0 bpp: 57 sleep: 100 us
- nh_fwd4: cache_refresh_interval set to 0 seconds
- nh_fwd4: static next_hop_mac 52:54:00:00:00:02
- load: time: 0.10s fps: 9 fpGbps: 0.000 fpb: 0 bpp: 72 sleep: 100 us
- nh_fwd4: cache_refresh_interval set to 0 seconds
- load: time: 0.10s fps: 9 fpGbps: 0.000 fpb: 0 bpp: 72 sleep: 100 us
- nh_fwd6: cache_refresh_interval set to 0 seconds
- load: time: 0.10s fps: 19 fpGbps: 0.000 fpb: 0 bpp: 86 sleep: 100 us
- nh_fwd6: cache_refresh_interval set to 0 seconds
- load: time: 0.10s fps: 19 fpGbps: 0.000 fpb: 0 bpp: 86 sleep: 100 us
- nh_fwd6: cache_refresh_interval set to 0 seconds
- nh_fwd6: static next_hop_mac 52:54:00:00:00:02
- load: time: 0.10s fps: 9 fpGbps: 0.000 fpb: 0 bpp: 86 sleep: 100 us
- nh_fwd6: cache_refresh_interval set to 0 seconds
- load: time: 0.10s fps: 9 fpGbps: 0.000 fpb: 0 bpp: 86 sleep: 100 us
- src/testlog/apps.lwaftr.rangemap:
- No PMU available: single core cpu affinity required
- lookup: 5.67 ns per iteration (result: 0)
- src/testlog/apps.lwaftr.V4V6:
- V4V6: selftest
- load: time: 0.10s fps: 29 fpGbps: 0.000 fpb: 0 bpp: 71 sleep: 100 us
- OK
- src/testlog/apps.packet_filter.pcap_filter:
- selftest: pcap_filter
- Run for 1 second (stateful = false)...
- link report:
- 406,062 sent on pcap_filter.output -> sink.input (loss rate: 0%)
- 10,896,048 sent on repeater.output -> pcap_filter.input (loss rate: 0%)
- 161 sent on source.output -> repeater.input (loss rate: 0%)
- ok: accepted 3.7267% of inputs (within tolerance)
- Run for 1 second (stateful = true)...
- link report:
- 524,089 sent on pcap_filter.output -> sink.input (loss rate: 0%)
- 7,031,574 sent on repeater.output -> pcap_filter.input (loss rate: 0%)
- 161 sent on source.output -> repeater.input (loss rate: 0%)
- ok: accepted 7.4534% of inputs (within tolerance)
- Run for 1 second (stateful = false)...
- link report:
- 393,185 sent on pcap_filter.output -> sink.input (loss rate: 0%)
- 10,550,472 sent on repeater.output -> pcap_filter.input (loss rate: 0%)
- 161 sent on source.output -> repeater.input (loss rate: 0%)
- ok: accepted 3.7267% of inputs (within tolerance)
- Run for 1 second (stateful = true)...
- link report:
- 628,541 sent on pcap_filter.output -> sink.input (loss rate: 0%)
- 8,432,952 sent on repeater.output -> pcap_filter.input (loss rate: 0%)
- 161 sent on source.output -> repeater.input (loss rate: 0%)
- ok: accepted 7.4534% of inputs (within tolerance)
- selftest: ok
- src/testlog/apps.pcap.tap:
- selftest: apps.pcap.tap
- selftest: ok
- src/testlog/apps.rate_limiter.rate_limiter:
- Rate limiter selftest
- test effective rate, non-busy loop
- load: time: 1.00s fps: 1,477,218 fpGbps: 0.815 fpb: 102 bpp: 60 sleep: 0 us
- load: time: 1.00s fps: 1,475,377 fpGbps: 0.814 fpb: 102 bpp: 60 sleep: 0 us
- load: time: 1.00s fps: 1,474,940 fpGbps: 0.814 fpb: 102 bpp: 60 sleep: 0 us
- load: time: 1.00s fps: 1,476,651 fpGbps: 0.815 fpb: 102 bpp: 60 sleep: 0 us
- load: time: 0.00s fps: 0 fpGbps: 0.000 fpb: NaN bpp: - sleep: 0 us
- configured rate is 200000 bytes per second
- effective rate is 209986 bytes per second
- measure throughput on heavy load...
- elapsed time 0.299328008 seconds
- packets received 10200000 34 Mpps
- configured rate is 1200000000 bytes per second
- effective rate is 1600199126 bytes per second
- throughput is 26 Mpps
- selftest passed
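The throughput figure in the rate_limiter log can be cross-checked from the elapsed time and packet count it reports; a quick sketch of the arithmetic, with the numbers copied from the log above:

```python
# Numbers taken from the rate_limiter heavy-load measurement above.
elapsed_s = 0.299328008      # "elapsed time 0.299328008 seconds"
packets = 10_200_000         # "packets received 10200000"

mpps = packets / elapsed_s / 1e6   # millions of packets per second
print(f"{mpps:.0f} Mpps")          # consistent with the reported 34 Mpps
```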
- src/testlog/apps.rss.rss:
- link report:
- 309,535 sent on rss.default_1 -> sink11.input (loss rate: 0%)
- 306,870 sent on rss.default_2 -> sink12.input (loss rate: 0%)
- 308,034 sent on rss.default_3 -> sink13.input (loss rate: 0%)
- 308,537 sent on rss.default_4 -> sink14.input (loss rate: 0%)
- 154,050 sent on rss.ip6_1 -> sink31.input (loss rate: 0%)
- 155,160 sent on rss.ip6_2 -> sink32.input (loss rate: 0%)
- 154,376 sent on rss.ip6_3 -> sink33.input (loss rate: 0%)
- 153,743 sent on rss.ip6_4 -> sink34.input (loss rate: 0%)
- 154,161 sent on rss.ip_1 -> sink21.input (loss rate: 0%)
- 153,127 sent on rss.ip_2 -> sink22.input (loss rate: 0%)
- 153,984 sent on rss.ip_3 -> sink23.input (loss rate: 0%)
- 154,375 sent on rss.ip_4 -> sink24.input (loss rate: 0%)
- 616,488 sent on source1.output -> rss.input_plain (loss rate: 0%)
- 616,488 sent on source2.output -> vlan.input (loss rate: 0%)
- 616,488 sent on vlan.output -> rss.input_vlan (loss rate: 0%)
- src/testlog/apps.socket.raw:
- link report:
- 1 sent on lo.tx -> match.rx (loss rate: 0%)
- apps report:
- match
- selftest passed
- src/testlog/apps.socket.unix:
- selftest: socket/unix
- link report:
- 1,773 sent on client.tx -> check_client_tx.rx (loss rate: 0%)
- 181,050 sent on say_hello.tx -> client.rx (loss rate: 0%)
- 1,774 sent on server.tx -> server.rx (loss rate: 0%)
- selftest: done
- src/testlog/apps.tap.tap:
- EXITCODE: 43
- src/testlog/apps.test.match:
- load: time: 0.01s fps: 55,148 fpGbps: 0.024 fpb: 307 bpp: 8 sleep: 0 us
- load: time: 0.00s fps: 82,667 fpGbps: 0.036 fpb: 104 bpp: 8 sleep: 0 us
- load: time: 0.00s fps: 147,206 fpGbps: 0.065 fpb: 308 bpp: 9 sleep: 0 us
- src/testlog/apps.test.synth:
- link report:
- 6 sent on reader.output -> match.comparator (loss rate: 0%)
- 102 sent on synth.output -> match.rx (loss rate: 0%)
- apps report:
- match
- src/testlog/apps.vhost.vhost_user:
- selftest: vhost_user
- SNABB_TEST_VHOST_USER_SOCKET was not set
- Test skipped
- EXITCODE: 43
- src/testlog/apps.virtio_net.virtio_net:
- SNABB_TEST_VIRTIO_PCIDEV was not set
- Test skipped
- EXITCODE: 43
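The SKIPPED entries in this run (intel_app, vhost_user, virtio_net, tap) all end with `EXITCODE: 43`, which the harness evidently treats as "test skipped" rather than failure. A minimal sketch of that convention, assuming exit code 43 is the skip marker as this log suggests; the function name here is ours:

```shell
# maybe_skip: bail out with the suite's "skipped" code (43) when a
# required environment variable is missing, mirroring what the
# vhost_user and virtio_net selftests print above.
maybe_skip() {
    if [ -z "$SNABB_TEST_VIRTIO_PCIDEV" ]; then
        echo "SNABB_TEST_VIRTIO_PCIDEV was not set"
        echo "Test skipped"
        return 43
    fi
    return 0
}
```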
- src/testlog/apps.vlan.vlan:
- vlan sent: 14162088
- native sent: 14162088
- trunk received: 28324176
- trunk sent: 14162088
- native received: 14162088
- Successfully tagged/untagged all potential VLAN tags (0-4095)
- src/testlog/apps.wall.l7fw:
- selftest ok
- src/testlog/apps.wall.scanner:
- selftest ok
- src/testlog/core.app:
- Restarting app2 (died at 7923.846949: core/app.lua:704: Push error.)
- Restarting app1 (died at 7923.846949: core/app.lua:698: Pull error.)
- Restarting app2 (died at 7925.847037: core/app.lua:704: Push error.)
- Restarting app1 (died at 7925.847037: core/app.lua:698: Pull error.)
- Restarting app3 (died at 7927.846797: core/app.lua:710: Report error.)
- Restarting app2 (died at 7927.847113: core/app.lua:704: Push error.)
- Restarting app1 (died at 7927.847113: core/app.lua:698: Pull error.)
- selftest: app
- empty -> c1
- c1 -> c1
- c1 -> c2
- c2 -> c1
- c1 -> empty
- c_fail
- apps report:
- app3
- app2 [dead: core/app.lua:704: Push error.]
- app1 [dead: core/app.lua:698: Pull error.]
- apps report:
- app3
- app2 [dead: core/app.lua:704: Push error.]
- app1 [dead: core/app.lua:698: Pull error.]
- src/testlog/core.counter:
- selftest: core.counter
- selftest ok
- src/testlog/core.histogram:
- selftest: histogram
- selftest ok
- src/testlog/core.lib:
- selftest: lib
- Testing equal
- Testing load_string
- Testing load/store_conf
- Testing csum
- Testing hex(un)dump
- Testing ntohl
- Testing parse
- src/testlog/core.link:
- selftest: link
- [mounting /var/run/snabb/hugetlbfs]
- selftest OK
- src/testlog/core.main:
- selftest
- src/testlog/core.memory:
- selftest: memory
- Kernel vm.nr_hugepages: 10000
- Allocating a 2MB HugeTLB:Got 2MB
- Physical address: 0x000e82e00000
- Virtual address: 0x500e82e00000
- Allocating a 2MB HugeTLB:Got 2MB
- Physical address: 0x000e69200000
- Virtual address: 0x500e69200000
- Allocating a 2MB HugeTLB:Got 2MB
- Physical address: 0x000e89e00000
- Virtual address: 0x500e89e00000
- Allocating a 2MB HugeTLB:Got 2MB
- Physical address: 0x000e40000000
- Virtual address: 0x500e40000000
- Kernel vm.nr_hugepages: 10000
- Testing automatic remapping of DMA memory
- Unmapping cdata<char *>: 0x500e82e00000
- Unmapping cdata<char *>: 0x500e69200000
- Unmapping cdata<char *>: 0x500e89e00000
- Unmapping cdata<char *>: 0x500e40000000
- Writing cdata<char *>: 0x500e82e00000
- Writing cdata<char *>: 0x500e69200000
- Writing cdata<char *>: 0x500e89e00000
- Writing cdata<char *>: 0x500e40000000
- Created 4 on-demand memory mappings with SIGSEGV handler.
- HugeTLB page allocation OK.
- src/testlog/core.packet:
- src/testlog/core.shm:
- selftest: shm
- checking resolve..
- checking shared memory..
- create shm/selftest/obj
- checking exists..
- checking many objects..
- 10000 objects created
- 10000 objects unmapped
- selftest ok
- src/testlog/core.timer:
- selftest: timer
- ok (973,855 callbacks in 0.1755 seconds)
- src/testlog/core.worker:
- selftest: worker
- Starting children
- Worker status:
- worker w3: pid=87 alive=true
- worker w1: pid=85 alive=true
- worker w2: pid=86 alive=true
- Stopping children
- Worker status:
- worker w3: pid=87 alive=false
- worker w1: pid=85 alive=false
- worker w2: pid=86 alive=false
- selftest: done
- src/testlog/lib.binary_search:
- selftest: binary_search
- selftest: ok
- src/testlog/lib.checksum:
- selftest: checksum
- avx2: 1000/1000
- sse2: 1000/1000
- selftest: tcp/ipv4
- selftest: ok
- src/testlog/lib.ctable:
- selftest: ctable
- lib/ctable.lua:526: at 3104841: hash check: expected 1391311878==found 0
- Stack Traceback
- ===============
- (1) Lua function 'handler' at file 'core/main.lua:168' (best guess)
- Local variables:
- reason = string: "lib/ctable.lua:526: at 3104841: hash check: expected 1391311878==found 0"
- (*temporary) = C function: print
- (2) global C function 'error'
- (3) Lua upvalue 'fail' at file 'lib/ctable.lua:526'
- Local variables:
- expected = number: 1.39131e+09
- op = string: "=="
- found = number: 0
- what = string: "hash"
- where = string: "at 3104841: "
- (4) Lua local 'expect_eq' at file 'lib/ctable.lua:529'
- Local variables:
- expected = number: 1.39131e+09
- found = number: 0
- what = string: "hash"
- where = number: 3.10484e+06
- (5) Lua method 'selfcheck' at file 'lib/ctable.lua:540'
- Local variables:
- self = table: 0x40cac938 {scale:0.0011641532185404, occupancy_lo:0, entries:cdata<struct 1428 *>: 0x2b6b0c400000 (more...)}
- occupancy = number: 1.24192e+06
- max_displacement = number: 8
- fail = Lua function 'fail' (defined at line 524 of chunk lib/ctable.lua)
- expect_eq = Lua function 'expect' (defined at line 528 of chunk lib/ctable.lua)
- expect_le = Lua function 'expect' (defined at line 531 of chunk lib/ctable.lua)
- prev = number: 2.66704e+09
- (for index) = number: 3.10484e+06
- (for limit) = number: 5.00001e+06
- (for step) = number: 1
- i = number: 3.10484e+06
- hash = number: 0
- (6) Lua field 'selftest' at file 'lib/ctable.lua:632'
- Local variables:
- bnot = C function: builtin#65
- occupancy = number: 2e+06
- params = table: 0x40cac910 {initial_size:5000000, key_type:ctype<unsigned int [1]>, max_occupancy_rate:0.4 (more...)}
- ctab = table: 0x40cac938 {scale:0.0011641532185404, occupancy_lo:0, entries:cdata<struct 1428 *>: 0x2b6b0c400000 (more...)}
- (for index) = number: 1
- (for limit) = number: 2
- (for step) = number: 1
- i = number: 1
- (7) Lua function 'opt' at file 'program/snsh/snsh.lua:31' (best guess)
- Local variables:
- arg = string: "lib.ctable"
- (8) Lua field 'dogetopt' at file 'core/lib.lua:426'
- Local variables:
- args = table: 0x410421b8 {1:-t, 2:lib.ctable}
- actions = table: 0x40befe78 {t:function: 0x40bf4198, q:function: 0x40a57c40, P:function: 0x40a6a828 (more...)}
- opts = string: "hl:p:t:die:j:P:q:"
- long_opts = table: 0x41044728 {jit:j, help:h, program:p, test:t, load:l, package-path:P, debug:d, sigquit:q (more...)}
- opts = table: 0x40a57bd8 {1:t}
- optind = number: 3
- optarg = table: 0x40a57c00 {1:lib.ctable}
- (for generator) = C function: builtin#6
- (for state) = table: 0x40a57bd8 {1:t}
- (for control) = number: 1
- i = number: 1
- v = string: "t"
- (9) Lua field 'run' at file 'program/snsh/snsh.lua:65'
- Local variables:
- parameters = table: 0x410421b8 {1:-t, 2:lib.ctable}
- profiling = boolean: false
- traceprofiling = boolean: false
- start_repl = boolean: false
- noop = boolean: true
- program = nil
- opt = table: 0x40befe78 {t:function: 0x40bf4198, q:function: 0x40a57c40, P:function: 0x40a6a828 (more...)}
- (10) Lua function 'main' at file 'core/main.lua:67' (best guess)
- Local variables:
- program = string: "snsh"
- args = table: 0x410421b8 {1:-t, 2:lib.ctable}
- (11) global C function 'xpcall'
- (12) main chunk of file 'core/main.lua' at line 242
- (13) C function 'require'
- (14) global C function 'pcall'
- (15) main chunk of file 'core/startup.lua' at line 3
- (16) global C function 'require'
- (17) main chunk of [string "require "core.startup""] at line 1
- nil
- EXITCODE: 1
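The only failure in this run is lib.ctable's selfcheck: at slot 3104841 the stored hash is 0 where 1391311878 was expected, i.e. an occupied slot whose hash field no longer matches its key. A rough Python sketch (illustrative only, not Snabb's Lua code; the entry layout is assumed) of the invariant that `selfcheck` enforces:

```python
# Illustrative sketch of the "hash check: expected X==found Y"
# assertion seen in the lib.ctable traceback above.
def selfcheck(entries, hash_fn):
    """entries: list of (key, stored_hash) pairs, or None for empty slots."""
    for i, entry in enumerate(entries):
        if entry is None:
            continue
        key, stored_hash = entry
        expected = hash_fn(key)
        if expected != stored_hash:
            raise AssertionError(
                f"at {i}: hash check: expected {expected}=={stored_hash}")
```

A slot whose stored hash was clobbered to 0, as in the failure above, trips the check; the traceback was produced by `snsh -t lib.ctable` (visible in the `args` locals).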
- src/testlog/lib.hash.murmur:
- Selftest hash MurmurHash3_x86_32
- Passed
- Selftest hash MurmurHash3_x64_128
- Passed
- src/testlog/lib.hash.siphash:
- selftest: ................................................
- selftest ok
- src/testlog/lib.lpm.ip4:
- selftest_parse()
- selftest_masked()
- selftest_get_bit()
- selftest_commonlength()
- PMU not available:
- single core cpu affinity required
- Skipping benchmark.
- src/testlog/lib.lpm.ip6:
- src/testlog/lib.lpm.lpm:
- src/testlog/lib.lpm.lpm4:
- src/testlog/lib.lpm.lpm4_248:
- LPM4_248 15bit keys
- Skipping LPM4:selftest (very specific / excessive runtime)
- In case you are hacking on lib.lpm you might want to enable
- these tests by setting SNABB_LPM4_TEST_INTENSIVE in your
- environment.
- LPM4_248 31bit keys
- Skipping LPM4:selftest (very specific / excessive runtime)
- In case you are hacking on lib.lpm you might want to enable
- these tests by setting SNABB_LPM4_TEST_INTENSIVE in your
- environment.
- src/testlog/lib.lpm.lpm4_dxr:
- Skipping LPM4:selftest (very specific / excessive runtime)
- In case you are hacking on lib.lpm you might want to enable
- these tests by setting SNABB_LPM4_TEST_INTENSIVE in your
- environment.
- src/testlog/lib.lpm.lpm4_poptrie:
- 128.0.0.0/1 2
- 192.0.0.0/2 3
- 224.0.0.0/3 4
- 240.0.0.0/4 5
- 240.128.0.0/10 6
- selftest_get_bits()
- PMU not available:
- single core cpu affinity required
- Skipping benchmark.
- selftest_masks()
- src/testlog/lib.lpm.lpm4_trie:
- Skipping LPM4:selftest (very specific / excessive runtime)
- In case you are hacking on lib.lpm you might want to enable
- these tests by setting SNABB_LPM4_TEST_INTENSIVE in your
- environment.
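As the messages above note, the heavier LPM4 selftests are opt-in via `SNABB_LPM4_TEST_INTENSIVE`. A sketch of enabling them before re-running the suite; the value `1` is arbitrary, since the message implies only the variable's presence matters:

```shell
# Opt in to the intensive LPM4 selftests (skipped by default above).
export SNABB_LPM4_TEST_INTENSIVE=1
```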
- src/testlog/lib.maxpc:
- src/testlog/lib.rrd:
- selftest: lib.rrd
- 1537458960 foo counter average 10 nan
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- 1537458960 foo counter average 10 1.3333333333333
- selftest: ok
- src/testlog/lib.stream:
- selftest: lib.stream
- selftest: ok
- src/testlog/lib.traceprof.traceprof:
- traceprof report (recorded 512/512 samples):
- 62% TRACE 3:LOOP ->loop
- 19% TRACE 4 ->3
- 18% TRACE 3 ->loop
- src/testlog/lib.xsd_regexp:
- src/testlog/program.l2vpn.control_channel:
- src/testlog/program.l2vpn.pseudowire:
- load: time: 1.01s fps: 200 fpGbps: 0.000 fpb: 0 bpp: 173 sleep: 100 us
- src/testlog/program.lwaftr.quickcheck.utils:
- selftest: quickcheck.utils
- OK
- src/testlog/program.lwaftr.tests.propbased.genyang:
- selftest: program.lwaftr.tests.propbased.genyang
- selftest: ok
- src/testlog/program.snabbnfv.neutron2snabb.neutron2snabb:
- selftest: neutron2snabb
- ok: {{direction='ingress', ethertype='IPv6'}}
- => ip6
- ok: {{direction='ingress', ethertype='IPv4'}}
- => (arp or ip)
- ok: {{direction='ingress', ethertype='IPv4', protocol='tcp'}}
- => (arp or (ip and tcp))
- ok: {{direction='ingress', ethertype='IPv4', protocol='udp'}}
- => (arp or (ip and udp))
- ok: {{direction='ingress', ethertype='IPv4', protocol='udp', port_range_min=1000}}
- => (arp or (ip and udp and dst portrange 1000-1000))
- ok: {{direction='ingress', ethertype='IPv4', protocol='udp', port_range_max=2000}}
- => (arp or (ip and udp and dst portrange 2000-2000))
- ok: {{direction='ingress', ethertype='IPv4', protocol='tcp', port_range_min=1000, port_range_max=2000}}
- => (arp or (ip and tcp and dst portrange 1000-2000))
- ok: {{direction='ingress', ethertype='IPv6', protocol='tcp'}, {direction='ingress', ethertype='IPv4', protocol='udp', remote_ip_prefix='10.0.0.0/8'}}
- => ((ip6 and tcp) or (arp or (ip and udp and src net 10.0.0.0/8)))
- selftest ok
- src/testlog/program.snabbnfv.neutron2snabb.neutron2snabb_schema:
- selftest: neutron2snabb_schema
- selftest: ok
- src/testlog/program.snabbnfv.nfvconfig:
- selftest: lib.nfv.config
- testing: program/snabbnfv/test_fixtures/nfvconfig/switch_nic/x
- engine: start_app id0_Virtio
- engine: start_app id0_NIC
- engine: new_link id0_NIC.output -> id0_Virtio.rx
- engine: link_output id0_NIC
- engine: link_input id0_Virtio
- engine: new_link id0_Virtio.tx -> id0_NIC.input
- engine: link_output id0_Virtio
- engine: link_input id0_NIC
- testing: program/snabbnfv/test_fixtures/nfvconfig/switch_filter/x
- engine: unlink_output id0_NIC
- engine: unlink_input id0_Virtio
- engine: free_link id0_NIC.output -> id0_Virtio.rx
- engine: start_app id0_Filter_in
- engine: new_link id0_Filter_in.tx -> id0_Virtio.rx
- engine: link_output id0_Filter_in
- engine: link_input id0_Virtio
- engine: new_link id0_NIC.output -> id0_Filter_in.rx
- engine: link_output id0_NIC
- engine: link_input id0_Filter_in
- load: time: 0.27s fps: 7 fpGbps: 0.000 fpb: 0 bpp: 90 sleep: 100 us
- testing: program/snabbnfv/test_fixtures/nfvconfig/switch_qos/x
- engine: unlink_output id0_Filter_in
- engine: unlink_input id0_Virtio
- engine: free_link id0_Filter_in.tx -> id0_Virtio.rx
- engine: unlink_output id0_NIC
- engine: unlink_input id0_Filter_in
- engine: free_link id0_NIC.output -> id0_Filter_in.rx
- engine: stop_app id0_Filter_in
- engine: new_link id0_NIC.output -> id0_Virtio.rx
- engine: link_output id0_NIC
- engine: link_input id0_Virtio
- load: time: 0.25s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
- testing: program/snabbnfv/test_fixtures/nfvconfig/switch_tunnel/x
- engine: unlink_output id0_Virtio
- engine: unlink_input id0_NIC
- engine: free_link id0_Virtio.tx -> id0_NIC.input
- engine: unlink_output id0_NIC
- engine: unlink_input id0_Virtio
- engine: free_link id0_NIC.output -> id0_Virtio.rx
- engine: start_app id0_Tunnel
- engine: start_app id0_ND
- engine: new_link id0_NIC.output -> id0_ND.south
- engine: link_output id0_NIC
- engine: link_input id0_ND
- engine: new_link id0_Tunnel.decapsulated -> id0_Virtio.rx
- engine: link_output id0_Tunnel
- engine: link_input id0_Virtio
- engine: new_link id0_Virtio.tx -> id0_Tunnel.decapsulated
- engine: link_output id0_Virtio
- engine: link_input id0_Tunnel
- engine: new_link id0_ND.south -> id0_NIC.input
- engine: link_output id0_ND
- engine: link_input id0_NIC
- engine: new_link id0_Tunnel.encapsulated -> id0_ND.north
- engine: link_output id0_Tunnel
- engine: link_input id0_ND
- engine: new_link id0_ND.north -> id0_Tunnel.encapsulated
- engine: link_output id0_ND
- engine: link_input id0_Tunnel
- Sep 20 2018 15:55:44 nd_light: Sending neighbor solicitation for next-hop 2::2
- load: time: 0.26s fps: 3 fpGbps: 0.000 fpb: 0 bpp: 86 sleep: 100 us
- testing: program/snabbnfv/test_fixtures/nfvconfig/scale_up/y
- engine: unlink_output id0_NIC
- engine: unlink_input id0_ND
- engine: free_link id0_NIC.output -> id0_ND.south
- engine: unlink_output id0_Tunnel
- engine: unlink_input id0_Virtio
- engine: free_link id0_Tunnel.decapsulated -> id0_Virtio.rx
- engine: unlink_output id0_Virtio
- engine: unlink_input id0_Tunnel
- engine: free_link id0_Virtio.tx -> id0_Tunnel.decapsulated
- engine: unlink_output id0_ND
- engine: unlink_input id0_NIC
- engine: free_link id0_ND.south -> id0_NIC.input
- engine: unlink_output id0_Tunnel
- engine: unlink_input id0_ND
- engine: free_link id0_Tunnel.encapsulated -> id0_ND.north
- engine: unlink_output id0_ND
- engine: unlink_input id0_Tunnel
- engine: free_link id0_ND.north -> id0_Tunnel.encapsulated
- engine: stop_app id0_Tunnel
- engine: stop_app id0_ND
- engine: start_app id1_NIC
- engine: start_app id1_Virtio
- engine: new_link id1_Virtio.tx -> id1_NIC.input
- engine: link_output id1_Virtio
- engine: link_input id1_NIC
- engine: new_link id0_Virtio.tx -> id0_NIC.input
- engine: link_output id0_Virtio
- engine: link_input id0_NIC
- engine: new_link id0_NIC.output -> id0_Virtio.rx
- engine: link_output id0_NIC
- engine: link_input id0_Virtio
- engine: new_link id1_NIC.output -> id1_Virtio.rx
- engine: link_output id1_NIC
- engine: link_input id1_Virtio
- load: time: 0.27s fps: 7 fpGbps: 0.000 fpb: 0 bpp: 86 sleep: 100 us
- testing: program/snabbnfv/test_fixtures/nfvconfig/scale_up/x
- engine: start_app id63_Virtio
- engine: start_app id40_Virtio
- engine: start_app id10_NIC
- engine: start_app id61_Virtio
- engine: start_app id6_NIC
- engine: start_app id45_NIC
- engine: start_app id34_NIC
- engine: start_app id35_NIC
- engine: start_app id60_Virtio
- engine: start_app id59_Virtio
- engine: start_app id58_Virtio
- engine: start_app id57_Virtio
- engine: start_app id3_Virtio
- engine: start_app id47_Virtio
- engine: start_app id54_Virtio
- engine: start_app id42_NIC
- engine: start_app id38_NIC
- engine: start_app id24_NIC
- engine: start_app id18_Virtio
- engine: start_app id52_Virtio
- engine: start_app id51_NIC
- engine: start_app id51_Virtio
- engine: start_app id50_Virtio
- engine: start_app id49_Virtio
- engine: start_app id48_Virtio
- engine: start_app id55_Virtio
- engine: start_app id45_Virtio
- engine: start_app id28_NIC
- engine: start_app id46_Virtio
- engine: start_app id15_Virtio
- engine: start_app id41_NIC
- engine: start_app id35_Virtio
- engine: start_app id25_NIC
- engine: start_app id44_Virtio
- engine: start_app id43_Virtio
- engine: start_app id42_Virtio
- engine: start_app id32_NIC
- engine: start_app id18_NIC
- engine: start_app id33_Virtio
- engine: start_app id62_Virtio
- engine: start_app id12_Virtio
- engine: start_app id39_Virtio
- engine: start_app id32_Virtio
- engine: start_app id22_Virtio
- engine: start_app id5_Virtio
- engine: start_app id53_NIC
- engine: start_app id60_NIC
- engine: start_app id8_NIC
- engine: start_app id31_NIC
- engine: start_app id46_NIC
- engine: start_app id37_Virtio
- engine: start_app id27_Virtio
- engine: start_app id17_Virtio
- engine: start_app id30_Virtio
- engine: start_app id50_NIC
- engine: start_app id10_Virtio
- engine: start_app id53_Virtio
- engine: start_app id56_NIC
- engine: start_app id12_NIC
- engine: start_app id43_NIC
- engine: start_app id13_Virtio
- engine: start_app id23_Virtio
- engine: start_app id14_Virtio
- engine: start_app id34_Virtio
- engine: start_app id7_Virtio
- engine: start_app id6_Virtio
- engine: start_app id40_NIC
- engine: start_app id63_NIC
- engine: start_app id21_Virtio
- engine: start_app id31_Virtio
- engine: start_app id62_NIC
- engine: start_app id11_Virtio
- engine: start_app id59_NIC
- engine: start_app id13_NIC
- engine: start_app id41_Virtio
- engine: start_app id57_NIC
- engine: start_app id29_Virtio
- engine: start_app id9_Virtio
- engine: start_app id8_Virtio
- engine: start_app id28_Virtio
- engine: start_app id26_Virtio
- engine: start_app id49_NIC
- engine: start_app id4_Virtio
- engine: start_app id4_NIC
- engine: start_app id2_Virtio
- engine: start_app id36_NIC
- engine: start_app id52_NIC
- engine: start_app id25_Virtio
- engine: start_app id47_NIC
- engine: start_app id58_NIC
- engine: start_app id24_Virtio
- engine: start_app id20_Virtio
- engine: start_app id19_Virtio
- engine: start_app id29_NIC
- engine: start_app id37_NIC
- engine: start_app id54_NIC
- engine: start_app id36_Virtio
- engine: start_app id26_NIC
- engine: start_app id16_Virtio
- engine: start_app id39_NIC
- engine: start_app id38_Virtio
- engine: start_app id48_NIC
- engine: start_app id56_Virtio
- engine: start_app id9_NIC
- engine: start_app id21_NIC
- engine: start_app id61_NIC
- engine: start_app id30_NIC
- engine: start_app id44_NIC
- engine: start_app id19_NIC
- engine: start_app id7_NIC
- engine: start_app id3_NIC
- engine: start_app id22_NIC
- engine: start_app id23_NIC
- engine: start_app id14_NIC
- engine: start_app id2_NIC
- engine: start_app id5_NIC
- engine: start_app id20_NIC
- engine: start_app id17_NIC
- engine: start_app id11_NIC
- engine: start_app id55_NIC
- engine: start_app id33_NIC
- engine: start_app id15_NIC
- engine: start_app id16_NIC
- engine: start_app id27_NIC
- engine: new_link id63_Virtio.tx -> id63_NIC.input
- engine: link_output id63_Virtio
- engine: link_input id63_NIC
- engine: new_link id63_NIC.output -> id63_Virtio.rx
- engine: link_output id63_NIC
- engine: link_input id63_Virtio
- engine: new_link id62_Virtio.tx -> id62_NIC.input
- engine: link_output id62_Virtio
- engine: link_input id62_NIC
- engine: new_link id62_NIC.output -> id62_Virtio.rx
- engine: link_output id62_NIC
- engine: link_input id62_Virtio
- engine: new_link id61_Virtio.tx -> id61_NIC.input
- engine: link_output id61_Virtio
- engine: link_input id61_NIC
- engine: new_link id61_NIC.output -> id61_Virtio.rx
- engine: link_output id61_NIC
- engine: link_input id61_Virtio
- engine: new_link id60_Virtio.tx -> id60_NIC.input
- engine: link_output id60_Virtio
- engine: link_input id60_NIC
- engine: new_link id18_NIC.output -> id18_Virtio.rx
- engine: link_output id18_NIC
- engine: link_input id18_Virtio
- engine: new_link id60_NIC.output -> id60_Virtio.rx
- engine: link_output id60_NIC
- engine: link_input id60_Virtio
- engine: new_link id19_NIC.output -> id19_Virtio.rx
- engine: link_output id19_NIC
- engine: link_input id19_Virtio
- engine: new_link id45_Virtio.tx -> id45_NIC.input
- engine: link_output id45_Virtio
- engine: link_input id45_NIC
- engine: new_link id16_NIC.output -> id16_Virtio.rx
- engine: link_output id16_NIC
- engine: link_input id16_Virtio
- engine: new_link id28_NIC.output -> id28_Virtio.rx
- engine: link_output id28_NIC
- engine: link_input id28_Virtio
- engine: new_link id17_NIC.output -> id17_Virtio.rx
- engine: link_output id17_NIC
- engine: link_input id17_Virtio
- engine: new_link id29_NIC.output -> id29_Virtio.rx
- engine: link_output id29_NIC
- engine: link_input id29_Virtio
- engine: new_link id14_NIC.output -> id14_Virtio.rx
- engine: link_output id14_NIC
- engine: link_input id14_Virtio
- engine: new_link id26_NIC.output -> id26_Virtio.rx
- engine: link_output id26_NIC
- engine: link_input id26_Virtio
- engine: new_link id15_NIC.output -> id15_Virtio.rx
- engine: link_output id15_NIC
- engine: link_input id15_Virtio
- engine: new_link id27_NIC.output -> id27_Virtio.rx
- engine: link_output id27_NIC
- engine: link_input id27_Virtio
- engine: new_link id12_NIC.output -> id12_Virtio.rx
- engine: link_output id12_NIC
- engine: link_input id12_Virtio
- engine: new_link id24_NIC.output -> id24_Virtio.rx
- engine: link_output id24_NIC
- engine: link_input id24_Virtio
- engine: new_link id13_NIC.output -> id13_Virtio.rx
- engine: link_output id13_NIC
- engine: link_input id13_Virtio
- engine: new_link id25_NIC.output -> id25_Virtio.rx
- engine: link_output id25_NIC
- engine: link_input id25_Virtio
- engine: new_link id10_NIC.output -> id10_Virtio.rx
- engine: link_output id10_NIC
- engine: link_input id10_Virtio
- engine: new_link id22_NIC.output -> id22_Virtio.rx
- engine: link_output id22_NIC
- engine: link_input id22_Virtio
- engine: new_link id11_NIC.output -> id11_Virtio.rx
- engine: link_output id11_NIC
- engine: link_input id11_Virtio
- engine: new_link id23_NIC.output -> id23_Virtio.rx
- engine: link_output id23_NIC
- engine: link_input id23_Virtio
- engine: new_link id33_NIC.output -> id33_Virtio.rx
- engine: link_output id33_NIC
- engine: link_input id33_Virtio
- engine: new_link id20_NIC.output -> id20_Virtio.rx
- engine: link_output id20_NIC
- engine: link_input id20_Virtio
- engine: new_link id30_NIC.output -> id30_Virtio.rx
- engine: link_output id30_NIC
- engine: link_input id30_Virtio
- engine: new_link id21_NIC.output -> id21_Virtio.rx
- engine: link_output id21_NIC
- engine: link_input id21_Virtio
- engine: new_link id31_NIC.output -> id31_Virtio.rx
- engine: link_output id31_NIC
- engine: link_input id31_Virtio
- engine: new_link id59_NIC.output -> id59_Virtio.rx
- engine: link_output id59_NIC
- engine: link_input id59_Virtio
- engine: new_link id44_Virtio.tx -> id44_NIC.input
- engine: link_output id44_Virtio
- engine: link_input id44_NIC
- engine: new_link id58_NIC.output -> id58_Virtio.rx
- engine: link_output id58_NIC
- engine: link_input id58_Virtio
- engine: new_link id57_Virtio.tx -> id57_NIC.input
- engine: link_output id57_Virtio
- engine: link_input id57_NIC
- engine: new_link id57_NIC.output -> id57_Virtio.rx
- engine: link_output id57_NIC
- engine: link_input id57_Virtio
- engine: new_link id56_Virtio.tx -> id56_NIC.input
- engine: link_output id56_Virtio
- engine: link_input id56_NIC
- engine: new_link id56_NIC.output -> id56_Virtio.rx
- engine: link_output id56_NIC
- engine: link_input id56_Virtio
- engine: new_link id35_Virtio.tx -> id35_NIC.input
- engine: link_output id35_Virtio
- engine: link_input id35_NIC
- engine: new_link id55_NIC.output -> id55_Virtio.rx
- engine: link_output id55_NIC
- engine: link_input id55_Virtio
- engine: new_link id54_Virtio.tx -> id54_NIC.input
- engine: link_output id54_Virtio
- engine: link_input id54_NIC
- engine: new_link id54_NIC.output -> id54_Virtio.rx
- engine: link_output id54_NIC
- engine: link_input id54_Virtio
- engine: new_link id37_Virtio.tx -> id37_NIC.input
- engine: link_output id37_Virtio
- engine: link_input id37_NIC
- engine: new_link id2_NIC.output -> id2_Virtio.rx
- engine: link_output id2_NIC
- engine: link_input id2_Virtio
- engine: new_link id8_Virtio.tx -> id8_NIC.input
- engine: link_output id8_Virtio
- engine: link_input id8_NIC
- engine: new_link id53_NIC.output -> id53_Virtio.rx
- engine: link_output id53_NIC
- engine: link_input id53_Virtio
- engine: new_link id36_Virtio.tx -> id36_NIC.input
- engine: link_output id36_Virtio
- engine: link_input id36_NIC
- engine: new_link id52_NIC.output -> id52_Virtio.rx
- engine: link_output id52_NIC
- engine: link_input id52_Virtio
- engine: new_link id40_NIC.output -> id40_Virtio.rx
- engine: link_output id40_NIC
- engine: link_input id40_Virtio
- engine: new_link id7_Virtio.tx -> id7_NIC.input
- engine: link_output id7_Virtio
- engine: link_input id7_NIC
- engine: new_link id6_Virtio.tx -> id6_NIC.input
- engine: link_output id6_Virtio
- engine: link_input id6_NIC
- engine: new_link id5_Virtio.tx -> id5_NIC.input
- engine: link_output id5_Virtio
- engine: link_input id5_NIC
- engine: new_link id4_Virtio.tx -> id4_NIC.input
- engine: link_output id4_Virtio
- engine: link_input id4_NIC
- engine: new_link id51_NIC.output -> id51_Virtio.rx
- engine: link_output id51_NIC
- engine: link_input id51_Virtio
- engine: new_link id38_Virtio.tx -> id38_NIC.input
- engine: link_output id38_Virtio
- engine: link_input id38_NIC
- engine: new_link id50_NIC.output -> id50_Virtio.rx
- engine: link_output id50_NIC
- engine: link_input id50_Virtio
- engine: new_link id45_NIC.output -> id45_Virtio.rx
- engine: link_output id45_NIC
- engine: link_input id45_Virtio
- engine: new_link id49_NIC.output -> id49_Virtio.rx
- engine: link_output id49_NIC
- engine: link_input id49_Virtio
- engine: new_link id44_NIC.output -> id44_Virtio.rx
- engine: link_output id44_NIC
- engine: link_input id44_Virtio
- engine: new_link id48_NIC.output -> id48_Virtio.rx
- engine: link_output id48_NIC
- engine: link_input id48_Virtio
- engine: new_link id47_Virtio.tx -> id47_NIC.input
- engine: link_output id47_Virtio
- engine: link_input id47_NIC
- engine: new_link id3_Virtio.tx -> id3_NIC.input
- engine: link_output id3_Virtio
- engine: link_input id3_NIC
- engine: new_link id2_Virtio.tx -> id2_NIC.input
- engine: link_output id2_Virtio
- engine: link_input id2_NIC
- engine: new_link id47_NIC.output -> id47_Virtio.rx
- engine: link_output id47_NIC
- engine: link_input id47_Virtio
- engine: new_link id46_Virtio.tx -> id46_NIC.input
- engine: link_output id46_Virtio
- engine: link_input id46_NIC
- engine: new_link id38_NIC.output -> id38_Virtio.rx
- engine: link_output id38_NIC
- engine: link_input id38_Virtio
- engine: new_link id59_Virtio.tx -> id59_NIC.input
- engine: link_output id59_Virtio
- engine: link_input id59_NIC
- engine: new_link id49_Virtio.tx -> id49_NIC.input
- engine: link_output id49_Virtio
- engine: link_input id49_NIC
- engine: new_link id58_Virtio.tx -> id58_NIC.input
- engine: link_output id58_Virtio
- engine: link_input id58_NIC
- engine: new_link id48_Virtio.tx -> id48_NIC.input
- engine: link_output id48_Virtio
- engine: link_input id48_NIC
- engine: new_link id37_NIC.output -> id37_Virtio.rx
- engine: link_output id37_NIC
- engine: link_input id37_Virtio
- engine: new_link id35_NIC.output -> id35_Virtio.rx
- engine: link_output id35_NIC
- engine: link_input id35_Virtio
- engine: new_link id42_Virtio.tx -> id42_NIC.input
- engine: link_output id42_Virtio
- engine: link_input id42_NIC
- engine: new_link id36_NIC.output -> id36_Virtio.rx
- engine: link_output id36_NIC
- engine: link_input id36_Virtio
- engine: new_link id39_NIC.output -> id39_Virtio.rx
- engine: link_output id39_NIC
- engine: link_input id39_Virtio
- engine: new_link id41_NIC.output -> id41_Virtio.rx
- engine: link_output id41_NIC
- engine: link_input id41_Virtio
- engine: new_link id39_Virtio.tx -> id39_NIC.input
- engine: link_output id39_Virtio
- engine: link_input id39_NIC
- engine: new_link id29_Virtio.tx -> id29_NIC.input
- engine: link_output id29_Virtio
- engine: link_input id29_NIC
- engine: new_link id19_Virtio.tx -> id19_NIC.input
- engine: link_output id19_Virtio
- engine: link_input id19_NIC
- engine: new_link id28_Virtio.tx -> id28_NIC.input
- engine: link_output id28_Virtio
- engine: link_input id28_NIC
- engine: new_link id18_Virtio.tx -> id18_NIC.input
- engine: link_output id18_Virtio
- engine: link_input id18_NIC
- engine: new_link id46_NIC.output -> id46_Virtio.rx
- engine: link_output id46_NIC
- engine: link_input id46_Virtio
- engine: new_link id51_Virtio.tx -> id51_NIC.input
- engine: link_output id51_Virtio
- engine: link_input id51_NIC
- engine: new_link id41_Virtio.tx -> id41_NIC.input
- engine: link_output id41_Virtio
- engine: link_input id41_NIC
- engine: new_link id50_Virtio.tx -> id50_NIC.input
- engine: link_output id50_Virtio
- engine: link_input id50_NIC
- engine: new_link id40_Virtio.tx -> id40_NIC.input
- engine: link_output id40_Virtio
- engine: link_input id40_NIC
- engine: new_link id53_Virtio.tx -> id53_NIC.input
- engine: link_output id53_Virtio
- engine: link_input id53_NIC
- engine: new_link id43_Virtio.tx -> id43_NIC.input
- engine: link_output id43_Virtio
- engine: link_input id43_NIC
- engine: new_link id52_Virtio.tx -> id52_NIC.input
- engine: link_output id52_Virtio
- engine: link_input id52_NIC
- engine: new_link id42_NIC.output -> id42_Virtio.rx
- engine: link_output id42_NIC
- engine: link_input id42_Virtio
- engine: new_link id55_Virtio.tx -> id55_NIC.input
- engine: link_output id55_Virtio
- engine: link_input id55_NIC
- engine: new_link id43_NIC.output -> id43_Virtio.rx
- engine: link_output id43_NIC
- engine: link_input id43_Virtio
- engine: new_link id31_Virtio.tx -> id31_NIC.input
- engine: link_output id31_Virtio
- engine: link_input id31_NIC
- engine: new_link id21_Virtio.tx -> id21_NIC.input
- engine: link_output id21_Virtio
- engine: link_input id21_NIC
- engine: new_link id30_Virtio.tx -> id30_NIC.input
- engine: link_output id30_Virtio
- engine: link_input id30_NIC
- engine: new_link id20_Virtio.tx -> id20_NIC.input
- engine: link_output id20_Virtio
- engine: link_input id20_NIC
- engine: new_link id10_Virtio.tx -> id10_NIC.input
- engine: link_output id10_Virtio
- engine: link_input id10_NIC
- engine: new_link id23_Virtio.tx -> id23_NIC.input
- engine: link_output id23_Virtio
- engine: link_input id23_NIC
- engine: new_link id13_Virtio.tx -> id13_NIC.input
- engine: link_output id13_Virtio
- engine: link_input id13_NIC
- engine: new_link id22_Virtio.tx -> id22_NIC.input
- engine: link_output id22_Virtio
- engine: link_input id22_NIC
- engine: new_link id12_Virtio.tx -> id12_NIC.input
- engine: link_output id12_Virtio
- engine: link_input id12_NIC
- engine: new_link id25_Virtio.tx -> id25_NIC.input
- engine: link_output id25_Virtio
- engine: link_input id25_NIC
- engine: new_link id15_Virtio.tx -> id15_NIC.input
- engine: link_output id15_Virtio
- engine: link_input id15_NIC
- engine: new_link id24_Virtio.tx -> id24_NIC.input
- engine: link_output id24_Virtio
- engine: link_input id24_NIC
- engine: new_link id14_Virtio.tx -> id14_NIC.input
- engine: link_output id14_Virtio
- engine: link_input id14_NIC
- engine: new_link id27_Virtio.tx -> id27_NIC.input
- engine: link_output id27_Virtio
- engine: link_input id27_NIC
- engine: new_link id17_Virtio.tx -> id17_NIC.input
- engine: link_output id17_Virtio
- engine: link_input id17_NIC
- engine: new_link id26_Virtio.tx -> id26_NIC.input
- engine: link_output id26_Virtio
- engine: link_input id26_NIC
- engine: new_link id7_NIC.output -> id7_Virtio.rx
- engine: link_output id7_NIC
- engine: link_input id7_Virtio
- engine: new_link id4_NIC.output -> id4_Virtio.rx
- engine: link_output id4_NIC
- engine: link_input id4_Virtio
- engine: new_link id5_NIC.output -> id5_Virtio.rx
- engine: link_output id5_NIC
- engine: link_input id5_Virtio
- engine: new_link id34_Virtio.tx -> id34_NIC.input
- engine: link_output id34_Virtio
- engine: link_input id34_NIC
- engine: new_link id34_NIC.output -> id34_Virtio.rx
- engine: link_output id34_NIC
- engine: link_input id34_Virtio
- engine: new_link id8_NIC.output -> id8_Virtio.rx
- engine: link_output id8_NIC
- engine: link_input id8_Virtio
- engine: new_link id9_NIC.output -> id9_Virtio.rx
- engine: link_output id9_NIC
- engine: link_input id9_Virtio
- engine: new_link id33_Virtio.tx -> id33_NIC.input
- engine: link_output id33_Virtio
- engine: link_input id33_NIC
- engine: new_link id32_Virtio.tx -> id32_NIC.input
- engine: link_output id32_Virtio
- engine: link_input id32_NIC
- engine: new_link id32_NIC.output -> id32_Virtio.rx
- engine: link_output id32_NIC
- engine: link_input id32_Virtio
- engine: new_link id3_NIC.output -> id3_Virtio.rx
- engine: link_output id3_NIC
- engine: link_input id3_Virtio
- engine: new_link id9_Virtio.tx -> id9_NIC.input
- engine: link_output id9_Virtio
- engine: link_input id9_NIC
- engine: new_link id16_Virtio.tx -> id16_NIC.input
- engine: link_output id16_Virtio
- engine: link_input id16_NIC
- engine: new_link id6_NIC.output -> id6_Virtio.rx
- engine: link_output id6_NIC
- engine: link_input id6_Virtio
- engine: new_link id11_Virtio.tx -> id11_NIC.input
- engine: link_output id11_Virtio
- engine: link_input id11_NIC
- load: time: 1.11s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
- testing: program/snabbnfv/test_fixtures/nfvconfig/scale_change/x
- load: time: 0.26s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
- testing: program/snabbnfv/test_fixtures/nfvconfig/scale_change/y
- engine: stop_app id0_NIC
- engine: start_app id0_NIC
- engine: stop_app id1_NIC
- engine: start_app id1_NIC
- engine: stop_app id3_NIC
- engine: start_app id3_NIC
- engine: stop_app id2_NIC
- engine: start_app id2_NIC
- engine: link_output id2_NIC
- engine: link_output id0_NIC
- engine: link_output id1_NIC
- engine: link_input id1_NIC
- engine: link_input id0_NIC
- engine: link_input id3_NIC
- engine: link_input id2_NIC
- engine: link_output id3_NIC
- load: time: 0.28s fps: 29,630 fpGbps: 0.013 fpb: 14 bpp: 0 sleep: 100 us
- Warning: tx_police_gbps is deprecated, use tx_police instead.
- Warning: rx_police_gbps is deprecated, use rx_police instead.