- We propose IPRF, a novel photorealistic 3D style transfer framework that leverages Intrinsic Image Decomposition (IID) to move beyond simple color-based transformations while preserving structural consistency.
- IPRF preserves the physical realism of content by independently optimizing albedo and shading losses, enabling faithful separation of material and illumination properties.
- We introduce TSI (Tuning-assisted Style Interpolation), a real-time method that enables smooth style transitions and efficient hyperparameter exploration without requiring additional training.
- Extensive benchmarks demonstrate that IPRF outperforms prior methods in balancing photorealism and style fidelity.
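The intuition behind the independent albedo and shading losses can be sketched as follows: an image is assumed to decompose multiplicatively into albedo and shading, and a loss is computed on each component separately. This is a minimal NumPy illustration under that assumption, not the actual IPRF objective; the weights, the simple MSE losses, and the given-shading decomposition are all simplifications for clarity.

```python
import numpy as np

def decompose(image, shading):
    # Intrinsic image model: image = albedo * shading (element-wise).
    # Here the shading map is assumed given (e.g. predicted by an IID network).
    albedo = image / np.clip(shading, 1e-6, None)
    return albedo, shading

def mse(a, b):
    return float(np.mean((a - b) ** 2))

def intrinsic_style_loss(content, content_shading, style_albedo, style_shading,
                         w_albedo=1.0, w_shading=0.5):
    # Optimize albedo and shading losses independently, so material (albedo)
    # and illumination (shading) statistics are matched separately.
    albedo, shading = decompose(content, content_shading)
    return (w_albedo * mse(albedo, style_albedo)
            + w_shading * mse(shading, style_shading))

# Toy example on random "images".
rng = np.random.default_rng(0)
img = rng.uniform(0.1, 1.0, (4, 4, 3))
shd = rng.uniform(0.1, 1.0, (4, 4, 3))
loss = intrinsic_style_loss(img, shd,
                            style_albedo=np.ones_like(img),
                            style_shading=np.ones_like(shd))
```

Because the two terms are separate, reweighting `w_albedo` against `w_shading` trades material change against illumination change without re-decomposing the image.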
Clone the repo:

```shell
git clone https://github.com/OSHMOS/IPRF.git
cd IPRF
```

Install the IPRF requirements using conda and pip:

```shell
conda create -n iprf python=3.9 -y && conda activate iprf
pip install -r requirements.txt
pip install -e . --verbose
```

IPRF supports datasets such as NeRF-LLFF and ARF-Style.
To quickly test the method, download a sample dataset:

```shell
# Place the downloaded data in IPRF/data/
mkdir data && cd data
gdown 1VNaB0Wy1almoXEq44ipw83SERDHMozQB
unzip data.zip && cd ..
```

PIE-Net for Intrinsic Image Decomposition
Download the pre-trained PIE-Net and place the model in both:

```
IPRF/opt/iid_extractor/ckpt/<pre-trained model>
IPRF/controllable/iid_extractor/ckpt/<pre-trained model>
```
IPRF requires two GPUs: one for Intrinsic Image Decomposition (24GB VRAM required) and one for Style Transfer (8GB VRAM is sufficient). If using a single GPU, 32GB VRAM is required.
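When the two stages run on separate GPUs, each process can be pinned to one device via `CUDA_VISIBLE_DEVICES`. The launcher below is a hypothetical sketch, not part of the IPRF scripts; the script names passed to it are placeholders.

```python
import os
import subprocess

def gpu_env(gpu):
    # Build an environment that restricts a child process to a single GPU.
    env = dict(os.environ)
    env["CUDA_VISIBLE_DEVICES"] = str(gpu)
    return env

def launch(script, gpu):
    # Hypothetical launcher: e.g. launch("run_iid.py", 0) for decomposition
    # on GPU 0 and launch("run_style.py", 1) for style transfer on GPU 1.
    # The script names are assumptions, not the repo's actual entry points.
    return subprocess.Popen(["python", script], env=gpu_env(gpu))
```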
Stylize the flower scene:

```shell
# . ./try_llff.sh [scene_name] [style_number]
. ./try_llff.sh flower 14
```

For interpolating between albedo and shading, use Gradio:

```shell
cd IPRF/controllable/
python 3D.py
```

We would like to thank the authors of Plenoxel, ARF-svox2, and PIE-Net for open-sourcing their implementations.
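The albedo-shading interpolation exposed by the controllable app can be approximated by a convex blend of the original and stylized intrinsic components, followed by recomposition. This is a hypothetical NumPy sketch: the weight `t` and per-component linear blending are assumptions for illustration, not the actual TSI implementation.

```python
import numpy as np

def interpolate_styles(albedo_stylized, shading_stylized, albedo, shading, t):
    """Blend between content and stylized intrinsic components.

    t = 0 reproduces the original content; t = 1 is fully stylized.
    Linear per-component blending is an assumption for illustration.
    """
    a = (1.0 - t) * albedo + t * albedo_stylized
    s = (1.0 - t) * shading + t * shading_stylized
    return a * s  # recompose the image from intrinsic components

# Toy example: blending toward flat (all-ones) stylized components.
rng = np.random.default_rng(1)
A = rng.uniform(0.1, 1.0, (2, 2, 3))
S = rng.uniform(0.1, 1.0, (2, 2, 3))
A_sty = np.ones_like(A)
S_sty = np.ones_like(S)
original = interpolate_styles(A_sty, S_sty, A, S, t=0.0)
```

Because no network weights change as `t` varies, the blend can be recomputed per slider step in real time, which matches why TSI needs no additional training.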
