I personally encode with the following settings for x264 in VidCoder (http://vidcoder.net/) for 1080p content. VidCoder is basically a nicer UI on top of HandBrake.
Audio passthru of course and no video filters. Cropping only when needed (e.g. black bars on top or bottom).
My settings are more or less the equivalent of the High 4.1 profile with the veryslow preset, aside from some tweaks I did years ago and don't even remember the exact reasons for anymore.
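For reference, a rough command-line equivalent of that baseline can be sketched with ffmpeg's libx264 encoder. This is my approximation, not the exact VidCoder profile or the author's personal tweaks, and the filenames are placeholders:

```shell
# Rough equivalent of the settings above (an assumption, not the exact profile):
# CRF 18, veryslow preset, High profile, level 4.1, audio/subtitle passthrough.
ffmpeg -i input.mkv \
  -c:v libx264 -crf 18 -preset veryslow -profile:v high -level:v 4.1 \
  -c:a copy -c:s copy \
  output.mkv
```

Cropping (when a title has black bars) would be added as a video filter, e.g. `-vf crop=1920:800:0:140`, with the exact values depending on the source.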
With CRF 18 you should not see any difference, sometimes the size might even be larger than the actual source Blu-ray depending on the video stream itself.
Never encountered a Blu-ray where I had to go lower than CRF 18 personally. Your mileage may of course vary with some titles, and you might even try going as low as CRF 16. If you are worried about differences, maybe settle on CRF 17 and don't look back.
Just give it a try and see what the result is for you. On your 110" screen you are probably already seeing the shortcomings of the transfers, not of the re-encode. At least I can, on a decent projector at this size with 1080p content.
Is there a difference? For sure the bits are different, as it is a re-encode, so it is mathematically different. But from my experience I cannot see it at all. I experimented a bit until I settled on my settings in terms of size/quality for storage.
If you are short on space for many movies, you can go up to CRF 20 or more. But then you will start to see differences in the encodes. Not necessarily a problem, depending on the source material. There are a lot of Blu-rays out there with a ton of bitrate but still piss-poor picture quality or general encoding errors (usually non-mainstream movies).
Takes about 6-8 hours per Blu-ray on my box (4770K). I just let it run overnight. Encoded about 500 Blu-rays over the years for archival and easy retrieval from my media center.
HEVC instead of AVC is a different topic. Come back in a year or two when it comes to x265. Right now it is better at low bitrates, but for 1080p it is not up to good x264 encodes quality-wise. Part of it is the preprocessing, which tends to blur the picture. Same deal as with x264 initially, which also took some time to become generally usable. And if your hardware can handle 10-bit HEVC, it can usually handle 10-bit AVC as well. With 4K sources we can start to argue here, likewise when talking about upscaled encodes.