(Moderately cleaned version of the answer I PM'ed you)
My guess is that yes, on average it adds a day, or maybe 5-30%, to your incubation-period bookends. If the original range is 2-14 days with an average of 5, I'd guess 2-19 days with an average of 6. I'd put about a 40% chance that it does hardly anything either way, and a 20% chance that selection effects or some other unexpected mechanism actually mean the incubation period is reduced.
(This differs from JenniferRM's answer only because she was looking at the case where your body immediately fights the virus off successfully; I think that counts as "not getting sick," and you're asking more about what happens in the 5-10% of cases where you don't immediately fight it off.)
If you condition on not fighting it off, I expect the default is that partial antibody production will slow the progression some, but not much. Having some antibodies means your "initial" viral load will be significantly reduced, so it will take a bit longer to replicate up to symptom level. (Or equivalently, the early growth rate will be reduced, which also delays hitting the threshold.)
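Here's a toy pure-exponential sketch of that delay. The threshold, growth rate, and load reductions are all invented numbers, just to show the shape of the effect:

```python
import math

def incubation_days(initial_load, growth_rate_per_day, symptom_threshold):
    """Days for viral load to grow exponentially from initial_load
    up to symptom_threshold."""
    return math.log(symptom_threshold / initial_load) / growth_rate_per_day

# Hypothetical: symptoms at 1e8 virions, load grows 10x per day.
r = math.log(10)
baseline = incubation_days(1e3, r, 1e8)    # no antibodies: ~5.0 days
vaccinated = incubation_days(1e2, r, 1e8)  # antibodies cut starting load 10x: ~6.0 days
```

Cutting the starting load by a factor of k buys ln(k)/r extra days. That's an additive shift, not a multiplicative one: a 10-fold load reduction adds about one day here whether the baseline incubation was 2 days or 14.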
Why not much longer? First, just intuition: it would be a pretty extreme outlier if vaccinated people suddenly had routine 40-day incubations. Second, the growth is still exponential, and there are just many more plausible models in which the vaccine doesn't multiply your incubation period by a constant factor. I'll expand on both below.
Building models of incubation period means trying to determine when symptoms begin during a growing infection. Symptom onset probably isn't triggered strictly by some threshold of viral count, like 10^8 virions; it probably depends partly on that, mixed with other features like "viral generations" and "time". For example, the longer the virus is in your body, the more chance your body has to notice it and mount a non-specific immune response. And if the virus is searching spatially over your body for a weak point where it can replicate faster or cause symptoms, the search time probably depends on the number of generations rather than the raw count, under some assumptions about the distribution of travel distance per generation. So symptom time is probably determined by at least count, generations, and time, and only the first of those is substantially changed by the vaccine, so the vaccine won't affect the incubation period linearly.
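One way to make that concrete: treat symptom onset as a blend of a count-driven term and a time/generations-driven term, where a lower starting load only moves the count term. The weights and numbers below are purely illustrative:

```python
import math

def onset_day(initial_load, growth_rate, count_threshold,
              time_floor_days, weight_on_count):
    """Symptom onset as a blend: partly when viral count crosses a
    threshold, partly a fixed time component (non-specific immune
    response, spatial search over generations). Only the count term
    is shifted by a lower starting load."""
    count_day = math.log(count_threshold / initial_load) / growth_rate
    return weight_on_count * count_day + (1 - weight_on_count) * time_floor_days

r = math.log(10)  # hypothetical 10x growth per day
base = onset_day(1e3, r, 1e8, time_floor_days=5.0, weight_on_count=0.5)
vacc = onset_day(1e2, r, 1e8, time_floor_days=5.0, weight_on_count=0.5)
# A 10x lower starting load adds a full day to the count term,
# but only half a day to the blended onset time.
```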
Another class of model looks at what factors affect the growth rate at a given time or count. As the virus replicates, the antibody:virus ratio goes down (we've conditioned on getting sick, so there's no successful immediate scaling-up of antibodies). Presumably this means there's an "overwhelm effect": your antibodies meaningfully slow things at the beginning, but as the virus replicates, growth quickly asymptotes back to its old default rate. This type of model, or any other that doesn't assume constant viral growth throughout, means the growth rate changes at different stages, so the incubation period won't be linear in the starting parameters. (And in this context, nonlinearities reduce the change: very few plausible factors could produce a superlinear change in incubation period from a reasonable parameter shift.)
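A minimal simulation of that overwhelm picture, assuming the effective growth rate is suppressed while antibodies outnumber virions and asymptotes back to the default rate as the virus swamps them. Every parameter here is invented:

```python
import math

def days_to_threshold(n0, r_max, antibody_scale, threshold, dt=0.01):
    """Simulate viral growth where the effective rate
    r_eff = r_max * n / (n + antibody_scale) is low while antibodies
    dominate and approaches r_max once the virus swamps them."""
    n, t = n0, 0.0
    while n < threshold:
        r_eff = r_max * n / (n + antibody_scale)
        n *= math.exp(r_eff * dt)
        t += dt
    return t

r = math.log(10)  # hypothetical 10x per day default rate
no_vaccine = days_to_threshold(1e3, r, antibody_scale=0, threshold=1e8)
with_vaccine = days_to_threshold(1e3, r, antibody_scale=2e3, threshold=1e8)
# The antibody drag adds time early on, but the delay is bounded:
# once n >> antibody_scale, growth runs at the old default rate.
```

The delay is front-loaded and capped, which is exactly why this family of models predicts a modest additive shift rather than a multiplied incubation period.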
Anyways, I'm not that confident it couldn't be longer than a 30% increase, but I would definitely bet against it. I'd guess 95% it's under that.
The remaining uncertainty I have about my claims above, as mentioned in the summary, is mostly due to the possibility of a strong selection effect in people who get the virus after vaccination. For example, maybe the people who had 14-day incubations were the very healthy ones who definitely won't get COVID after vaccination. Say the vaccine makes everyone "healthier" and moves them down the incubation timeline, so someone who would have gotten symptoms in 2 days now takes 2.5, and someone who would have gotten symptoms in 8 days now takes 10. But what if everyone who would have gotten symptoms in 9+ days no longer catches COVID at all? In that case, the incubation bookends would have moved from 2-14 to 2.5-10, even though every specific person is taking longer. (Obviously there are no sharp cutoffs; this is merely illustrative.) Anyways, selection effects destroy many otherwise-good analyses, and we know they'll affect this one; the question is how much. For now I'm guessing they aren't bigger than the primary slowdown effect, but this is mostly intuition.
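The toy version of that censoring story, using the numbers above (the cohort, slowdown factor, and cutoff are all made up, just to show the mechanism):

```python
# Hypothetical cohort of pre-vaccination incubation periods, in days.
original = [2, 4, 6, 8, 10, 12, 14]

# Assume the vaccine slows every individual down by 25%, but anyone
# whose original incubation was over 8 days no longer catches COVID.
vaccinated = [t * 1.25 for t in original if t <= 8]

print(min(original), max(original))      # bookends: 2 14
print(min(vaccinated), max(vaccinated))  # bookends: 2.5 10.0
```

Measured at the population level, the bookends appear to shrink from 2-14 to 2.5-10 even though every person who still gets sick takes strictly longer than they would have unvaccinated.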