2022-5-1: PolyLoss, Subquadratic loss landscapes, Large-scale training on spot instances
dblalock.substack.com
An Extendable, Efficient and Effective Transformer-based Object Detector Proposes an object detection approach in the same vein as DETR. There’s a lot going on here: a particular neck + head structure, three different types of attention, several loss functions, multi-scale feature-map fusion, and more. But the results are good, including tradeoff curves of inference latency vs. AP and thorough ablations of the different components. Worth digging into if you’re trying to push the limits of object detection (or just turn everything ever into a transformer because it lets you use less competitive baselines).