# Lane Approximations

Because curves can be easier to handle than a few thousand pixels

## Mean absolute distance

The original dataset metric, which gives a feeling for the overall accuracy of the detector for each annotated lane marker. Values are mean absolute distances in pixels per marker (l1 and l0 to the left of the vehicle, r0 and r1 to the right); lower is better.
| Name | All | l1 | l0 | r0 | r1 | Comment |
|---|---:|---:|---:|---:|---:|---|
| VSA SP | 17.88 | 32.62 | 8.74 | 9.95 | 25.93 | Trained on train and valid sets |
| VSA SP | 18.47 | 32.84 | 8.57 | 11.56 | 26.51 | Trained on training set only |
| VSA | 19.00 | 34.77 | 8.74 | 10.79 | 27.84 | |
| Simple Mean Baseline | 31.00 | 33.78 | 26.34 | 30.24 | 34.75 | Included in the github repo |
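The metric itself is straightforward. A minimal sketch, assuming each lane is given as x-coordinates sampled at fixed image rows with `-1` marking rows where the lane is not annotated (the representation and the `invalid` sentinel are assumptions; the dataset's official evaluation script should be used for comparable numbers):

```python
import numpy as np

def mean_absolute_distance(pred_xs, gt_xs, invalid=-1):
    """Mean |x_pred - x_gt| in pixels over rows annotated in both lanes."""
    pred = np.asarray(pred_xs, dtype=float)
    gt = np.asarray(gt_xs, dtype=float)
    # Only compare rows where both the prediction and the label exist.
    valid = (pred != invalid) & (gt != invalid)
    if not valid.any():
        return float("nan")
    return float(np.abs(pred[valid] - gt[valid]).mean())
```

Rows missing on either side are simply skipped, so a detector is not penalized here for rows it does not predict; the per-marker columns above are this value averaged over all annotated segments of the respective marker.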

## CULane Metrics

Accuracy metrics for detected lanes, following the CULane protocol: lanes are drawn 30 pixels wide, and a detection counts as correct when its IoU with a ground-truth lane is greater than or equal to 0.5.
| Name | TP | FP | FN | Precision | Recall | F1 | Comment |
|---|---:|---:|---:|---:|---:|---:|---|
| RCLaneDet-L | 71964 | 2385 | 3405 | 0.9679 | 0.9548 | 0.9613 | |
| RCLaneDet-S | 71915 | 2458 | 3454 | 0.9670 | 0.9542 | 0.9605 | |
| RCLaneDet-M | 71941 | 2513 | 3428 | 0.9662 | 0.9545 | 0.9603 | |
| LaneAF | 71793 | 2291 | 3576 | 0.9691 | 0.9526 | 0.9601 | Code: https://github.com/sel118/LaneAF, paper: http://cvrr.ucsd.edu/publications/2021/LaneAF.pdf, 10 fps |
| BézierLaneNet (ResNet-34) | 71191 | 3050 | 4178 | 0.9589 | 0.9446 | 0.9517 | Code and models: https://github.com/voldemortX/pytorch-auto-drive, paper: https://arxiv.org/abs/2203.02431 |
| SCNN VGG16 | 71425 | 3315 | 3944 | 0.9556 | 0.9477 | 0.9516 | Code and models: https://github.com/voldemortX/pytorch-auto-drive, paper: https://ojs.aaai.org/index.php/AAAI/article/view/12301 |
| BézierLaneNet (ResNet-18) | 70946 | 3180 | 4423 | 0.9571 | 0.9413 | 0.9491 | Code and models: https://github.com/voldemortX/pytorch-auto-drive, paper: https://arxiv.org/abs/2203.02431 |
| PointLaneNet Sup | 71460 | 3636 | 3729 | 0.9517 | 0.9505 | 0.9511 | |
| Baseline ERFNet | 71235 | 3673 | 4134 | 0.9510 | 0.9451 | 0.9480 | Code and models: https://github.com/voldemortX/pytorch-auto-drive |
| Baseline VGG16 | 70824 | 3380 | 4545 | 0.9544 | 0.9397 | 0.9470 | Code and models: https://github.com/voldemortX/pytorch-auto-drive, paper: https://ojs.aaai.org/index.php/AAAI/article/view/12301 |
| BGCA Remote | 71135 | 4124 | 4234 | 0.9452 | 0.9438 | 0.9445 | |
| BGCA Local | 71294 | 4550 | 4075 | 0.9400 | 0.9459 | 0.9430 | |
| SCNN ResNet34 | 71141 | 4455 | 4228 | 0.9411 | 0.9439 | 0.9425 | Code and models: https://github.com/voldemortX/pytorch-auto-drive |
| SCNN ERFNet | 71329 | 5050 | 4040 | 0.9339 | 0.9464 | 0.9401 | Code and models: https://github.com/voldemortX/pytorch-auto-drive |
| MSCA Resnet34 | 68474 | 2204 | 6895 | 0.9688 | 0.9085 | 0.9377 | |
| LaneATT (ResNet-34) | 68495 | 2273 | 6874 | 0.9679 | 0.9088 | 0.9374 | Code and models: https://github.com/lucastabelini/LaneATT |
| MSCA Resnet19 | 68493 | 2283 | 6876 | 0.9677 | 0.9088 | 0.9373 | |
| LaneATT (ResNet-122) | 68190 | 2239 | 7179 | 0.9682 | 0.9047 | 0.9354 | Code and models: https://github.com/lucastabelini/LaneATT |
| LaneATT (ResNet-18) | 68012 | 2161 | 7357 | 0.9692 | 0.9024 | 0.9346 | Code and models: https://github.com/lucastabelini/LaneATT |
| Baseline ResNet34 | 71046 | 5667 | 4320 | 0.9261 | 0.9427 | 0.9343 | Code and models: https://github.com/voldemortX/pytorch-auto-drive, paper: https://ojs.aaai.org/index.php/AAAI/article/view/12301 |
| PointLaneNet Ssl | 63794 | 4109 | 11575 | 0.9395 | 0.8464 | 0.8905 | |
| PolyLaneNet | 66272 | 8302 | 9097 | 0.8887 | 0.8793 | 0.8840 | Code and models: https://github.com/lucastabelini/PolyLaneNet |
| PointLaneNet Base | 64953 | 11246 | 10416 | 0.8524 | 0.8618 | 0.8571 | |
| Mean Baseline | 917 | 82799 | 74452 | 0.0110 | 0.0122 | 0.0115 | Not useful as a baseline |
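Given the raw TP/FP/FN counts in the table, the remaining columns follow from the standard definitions (the IoU matching itself is handled by the benchmark's evaluation code). A minimal sketch:

```python
def f1_from_counts(tp, fp, fn):
    """Precision, recall, and F1 from true positive, false positive,
    and false negative counts, guarding against empty denominators."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    # F1 is the harmonic mean of precision and recall.
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1
```

For example, `f1_from_counts(8, 2, 2)` yields precision 0.8, recall 0.8, and F1 0.8.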